From python-checkins at python.org Fri Nov 1 00:55:45 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 1 Nov 2013 00:55:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319442=3A_warn=5Fe?= =?utf-8?q?xplicit=28=29_does_nothing_when_called_late_during_Python?= Message-ID: <3d9k1s6yM1z7Lk8@mail.python.org> http://hg.python.org/cpython/rev/13a05ed33cf7 changeset: 86816:13a05ed33cf7 user: Victor Stinner date: Fri Nov 01 00:55:30 2013 +0100 summary: Close #19442: warn_explicit() does nothing when called late during Python shutdown After more tests, I now think that it is the safest option. files: Python/_warnings.c | 18 +++++++++--------- 1 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Python/_warnings.c b/Python/_warnings.c --- a/Python/_warnings.c +++ b/Python/_warnings.c @@ -333,6 +333,13 @@ PyObject *action; int rc; + /* module can be None if a warning is emitted late during Python shutdown. + In this case, the Python warnings module was probably unloaded, filters + are no more available to choose as action. It is safer to ignore the + warning and do nothing. */ + if (module == Py_None) + Py_RETURN_NONE; + if (registry && !PyDict_Check(registry) && (registry != Py_None)) { PyErr_SetString(PyExc_TypeError, "'registry' must be a dict"); return NULL; @@ -635,15 +642,8 @@ if (!setup_context(stack_level, &filename, &lineno, &module, ®istry)) return NULL; - if (module != Py_None) { - res = warn_explicit(category, message, filename, lineno, module, registry, - NULL); - } - else { - /* FIXME: emitting warnings at exit does crash Python */ - res = Py_None; - Py_INCREF(res); - } + res = warn_explicit(category, message, filename, lineno, module, registry, + NULL); Py_DECREF(filename); Py_DECREF(registry); Py_DECREF(module); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 01:24:12 2013 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 1 Nov 2013 01:24:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_fix_xmlcharref?= =?utf-8?q?replace_tests_on_wide_build_when_tests_are_loaded_from_=2Epy=5B?= =?utf-8?b?Y29d?= Message-ID: <3d9kfh2LRbz7LjT@mail.python.org> http://hg.python.org/cpython/rev/2d02b7a97e0b changeset: 86817:2d02b7a97e0b branch: 2.7 parent: 86775:e4fe8fcaef0d user: Benjamin Peterson date: Thu Oct 31 20:22:41 2013 -0400 summary: fix xmlcharrefreplace tests on wide build when tests are loaded from .py[co] files. 
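For context, a rough Python 2 sketch of what the xmlcharrefreplace handler produces (illustrative only; the surrogate-pair case is exactly what the patched tests now decide at run time instead of assuming a particular build):

    # Unencodable characters become numeric character references.
    print u'caf\xe9'.encode('ascii', 'xmlcharrefreplace')     # caf&#233;
    print u'\U0001f49d'.encode('ascii', 'xmlcharrefreplace')  # &#128157;
    # On a wide (UCS-4) build u'\ud83d\udc9d' stays two lone surrogates and
    # encodes as two references (&#55357;&#56477;); on a narrow build it is
    # stored as U+1F49D and encodes as a single reference (&#128157;).
    print u'\ud83d\udc9d'.encode('ascii', 'xmlcharrefreplace')
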
files: Lib/test/test_codeccallbacks.py | 4 ++-- Lib/test/test_unicode.py | 4 ++-- Misc/NEWS | 6 ++++++ 3 files changed, 10 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_codeccallbacks.py b/Lib/test/test_codeccallbacks.py --- a/Lib/test/test_codeccallbacks.py +++ b/Lib/test/test_codeccallbacks.py @@ -84,9 +84,9 @@ tests = [(u'\U0001f49d', '💝'), (u'\ud83d', '�'), (u'\udc9d', '�'), - (u'\ud83d\udc9d', '💝' if len(u'\U0001f49d') > 1 else - '��'), ] + if u'\ud83d\udc9d' != u'\U0001f49d': + tests += [(u'\ud83d\udc9d', '��')] for encoding in ['ascii', 'latin1', 'iso-8859-15']: for s, exp in tests: self.assertEqual(s.encode(encoding, 'xmlcharrefreplace'), diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1663,9 +1663,9 @@ tests = [(u'\U0001f49d', '💝'), (u'\ud83d', '�'), (u'\udc9d', '�'), - (u'\ud83d\udc9d', '💝' if len(u'\U0001f49d') > 1 else - '��'), ] + if u'\ud83d\udc9d' != u'\U0001f49d': + tests += [(u'\ud83d\udc9d', '��')] for s, exp in tests: self.assertEqual( unicode_encodedecimal(u"123" + s, "xmlcharrefreplace"), diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,12 @@ - Issue #19426: Fixed the opening of Python source file with specified encoding. +Tests +----- + +- Issue #19457: Fixed xmlcharrefreplace tests on wide build when tests are + loaded from .py[co] files. + What's New in Python 2.7.6 release candidate 1? =============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 01:24:13 2013 From: python-checkins at python.org (benjamin.peterson) Date: Fri, 1 Nov 2013 01:24:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_merge_2=2E7=2E6_release_branch?= Message-ID: <3d9kfj4GWdz7LjZ@mail.python.org> http://hg.python.org/cpython/rev/01087a302721 changeset: 86818:01087a302721 branch: 2.7 parent: 86800:8d5df9602a72 parent: 86817:2d02b7a97e0b user: Benjamin Peterson date: Thu Oct 31 20:23:57 2013 -0400 summary: merge 2.7.6 release branch files: Misc/NEWS | 35 +++++++++++++++++++---------------- 1 files changed, 19 insertions(+), 16 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,26 +15,29 @@ Tests ----- + +Whats' New in Python 2.7.6? +=========================== + +*Release date: 2013-11-02* + +Library +------- + +- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. + +IDLE +---- + +- Issue #19426: Fixed the opening of Python source file with specified encoding. + +Tests +----- + - Issue #19457: Fixed xmlcharrefreplace tests on wide build when tests are loaded from .py[co] files. -Whats' New in Python 2.7.6? -=========================== - -*Release date: 2013-11-02* - -Library -------- - -- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. - -IDLE ----- - -- Issue #19426: Fixed the opening of Python source file with specified encoding. - - What's New in Python 2.7.6 release candidate 1? 
=============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 05:34:47 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 1 Nov 2013 05:34:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319413=3A_Restore_?= =?utf-8?q?pre-3=2E3_reload=28=29_semantics_of_re-finding_modules=2E?= Message-ID: <3d9rCq6wq8z7Ljk@mail.python.org> http://hg.python.org/cpython/rev/88c3a1a3c2ff changeset: 86819:88c3a1a3c2ff parent: 86816:13a05ed33cf7 user: Eric Snow date: Thu Oct 31 22:22:15 2013 -0600 summary: Issue #19413: Restore pre-3.3 reload() semantics of re-finding modules. files: Lib/importlib/__init__.py | 8 +- Lib/importlib/_bootstrap.py | 8 +- Lib/test/test_importlib/test_api.py | 120 + Misc/NEWS | 2 + Python/importlib.h | 1011 +++++++------- 5 files changed, 642 insertions(+), 507 deletions(-) diff --git a/Lib/importlib/__init__.py b/Lib/importlib/__init__.py --- a/Lib/importlib/__init__.py +++ b/Lib/importlib/__init__.py @@ -107,7 +107,7 @@ if not module or not isinstance(module, types.ModuleType): raise TypeError("reload() argument must be module") name = module.__name__ - if name not in sys.modules: + if sys.modules.get(name) is not module: msg = "module {} not in sys.modules" raise ImportError(msg.format(name), name=name) if name in _RELOADING: @@ -118,7 +118,11 @@ if parent_name and parent_name not in sys.modules: msg = "parent {!r} not in sys.modules" raise ImportError(msg.format(parent_name), name=parent_name) - module.__loader__.load_module(name) + loader = _bootstrap._find_module(name, None) + if loader is None: + raise ImportError(_bootstrap._ERR_MSG.format(name), name=name) + module.__loader__ = loader + loader.load_module(name) # The module may have replaced itself in sys.modules! return sys.modules[module.__name__] finally: diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -1510,15 +1510,19 @@ """Find a module's loader.""" if not sys.meta_path: _warnings.warn('sys.meta_path is empty', ImportWarning) + is_reload = name in sys.modules for finder in sys.meta_path: with _ImportLockContext(): loader = finder.find_module(name, path) if loader is not None: # The parent import may have already imported this module. - if name not in sys.modules: + if is_reload or name not in sys.modules: return loader else: - return sys.modules[name].__loader__ + try: + return sys.modules[name].__loader__ + except AttributeError: + return loader else: return None diff --git a/Lib/test/test_importlib/test_api.py b/Lib/test/test_importlib/test_api.py --- a/Lib/test/test_importlib/test_api.py +++ b/Lib/test/test_importlib/test_api.py @@ -1,8 +1,10 @@ from . 
import util frozen_init, source_init = util.import_importlib('importlib') +frozen_util, source_util = util.import_importlib('importlib.util') frozen_machinery, source_machinery = util.import_importlib('importlib.machinery') +import os.path import sys from test import support import types @@ -190,11 +192,129 @@ self.assertEqual(actual.spam, 3) self.assertEqual(reloaded.spam, 3) + def test_reload_missing_loader(self): + with support.CleanImport('types'): + import types + loader = types.__loader__ + del types.__loader__ + reloaded = self.init.reload(types) + + self.assertIs(reloaded, types) + self.assertIs(sys.modules['types'], types) + self.assertEqual(reloaded.__loader__.path, loader.path) + + def test_reload_loader_replaced(self): + with support.CleanImport('types'): + import types + types.__loader__ = None + self.init.invalidate_caches() + reloaded = self.init.reload(types) + + self.assertIsNot(reloaded.__loader__, None) + self.assertIs(reloaded, types) + self.assertIs(sys.modules['types'], types) + + def test_reload_location_changed(self): + name = 'spam' + with support.temp_cwd(None) as cwd: + with util.uncache('spam'): + with support.DirsOnSysPath(cwd): + self.init.invalidate_caches() + path = os.path.join(cwd, name + '.py') + cached = self.util.cache_from_source(path) + expected = {'__name__': name, + '__package__': '', + '__file__': path, + '__cached__': cached, + '__doc__': None, + '__builtins__': __builtins__, + } + support.create_empty_file(path) + module = self.init.import_module(name) + ns = vars(module) + del ns['__initializing__'] + loader = ns.pop('__loader__') + self.assertEqual(loader.path, path) + self.assertEqual(ns, expected) + + self.init.invalidate_caches() + init_path = os.path.join(cwd, name, '__init__.py') + cached = self.util.cache_from_source(init_path) + expected = {'__name__': name, + '__package__': name, + '__file__': init_path, + '__cached__': cached, + '__path__': [os.path.dirname(init_path)], + '__doc__': None, + '__builtins__': __builtins__, + } + os.mkdir(name) + os.rename(path, init_path) + reloaded = self.init.reload(module) + ns = vars(reloaded) + del ns['__initializing__'] + loader = ns.pop('__loader__') + self.assertIs(reloaded, module) + self.assertEqual(loader.path, init_path) + self.assertEqual(ns, expected) + + def test_reload_namespace_changed(self): + self.maxDiff = None + name = 'spam' + with support.temp_cwd(None) as cwd: + with util.uncache('spam'): + with support.DirsOnSysPath(cwd): + self.init.invalidate_caches() + bad_path = os.path.join(cwd, name, '__init.py') + cached = self.util.cache_from_source(bad_path) + expected = {'__name__': name, + '__package__': name, + '__doc__': None, + } + os.mkdir(name) + with open(bad_path, 'w') as init_file: + init_file.write('eggs = None') + module = self.init.import_module(name) + ns = vars(module) + del ns['__initializing__'] + loader = ns.pop('__loader__') + path = ns.pop('__path__') + self.assertEqual(list(path), + [os.path.dirname(bad_path)] * 2) + with self.assertRaises(AttributeError): + # a NamespaceLoader + loader.path + self.assertEqual(ns, expected) + + self.init.invalidate_caches() + init_path = os.path.join(cwd, name, '__init__.py') + cached = self.util.cache_from_source(init_path) + expected = {'__name__': name, + '__package__': name, + '__file__': init_path, + '__cached__': cached, + '__path__': [os.path.dirname(init_path)], + '__doc__': None, + '__builtins__': __builtins__, + 'eggs': None, + } + os.rename(bad_path, init_path) + reloaded = self.init.reload(module) + ns = vars(reloaded) + 
del ns['__initializing__'] + loader = ns.pop('__loader__') + self.assertIs(reloaded, module) + self.assertEqual(loader.path, init_path) + self.assertEqual(ns, expected) + + class Frozen_ReloadTests(ReloadTests, unittest.TestCase): init = frozen_init + util = frozen_util class Source_ReloadTests(ReloadTests, unittest.TestCase): init = source_init + util = source_util class InvalidateCacheTests: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -127,6 +127,8 @@ - Issue #8964: fix platform._sys_version to handle IronPython 2.6+. Patch by Martin Matusiak. +- Issue #19413: Restore pre-3.3 reload() semantics of re-finding modules. + - Issue #18958: Improve error message for json.load(s) while passing a string that starts with a UTF-8 BOM. diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 06:49:42 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 1 Nov 2013 06:49:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319413=3A_Disregar?= =?utf-8?q?d_duplicate_namespace_portions_during_reload_tests=2E?= Message-ID: <3d9stG1mV7z7Ljv@mail.python.org> http://hg.python.org/cpython/rev/78d36d54391c changeset: 86820:78d36d54391c user: Eric Snow date: Thu Oct 31 23:44:31 2013 -0600 summary: Issue #19413: Disregard duplicate namespace portions during reload tests. files: Lib/test/test_importlib/test_api.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_importlib/test_api.py b/Lib/test/test_importlib/test_api.py --- a/Lib/test/test_importlib/test_api.py +++ b/Lib/test/test_importlib/test_api.py @@ -279,8 +279,8 @@ del ns['__initializing__'] loader = ns.pop('__loader__') path = ns.pop('__path__') - self.assertEqual(list(path), - [os.path.dirname(bad_path)] * 2) + self.assertEqual(set(path), + set([os.path.dirname(bad_path)])) with self.assertRaises(AttributeError): # a NamespaceLoader loader.path -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Fri Nov 1 07:34:05 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 01 Nov 2013 07:34:05 +0100 Subject: [Python-checkins] Daily reference leaks (13a05ed33cf7): sum=0 Message-ID: results for 13a05ed33cf7 on branch "default" -------------------------------------------- test_site leaked [0, -2, 2] references, sum=0 test_site leaked [0, -2, 2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogFwlbI4', '-x'] From python-checkins at python.org Fri Nov 1 12:05:59 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 1 Nov 2013 12:05:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454?= Message-ID: <3dB0vC3SflzSCM@mail.python.org> http://hg.python.org/peps/rev/50539da3ea22 changeset: 5241:50539da3ea22 user: Victor Stinner date: Fri Nov 01 12:05:37 2013 +0100 summary: PEP 454 * rename disable/enable/is_enabled() to stop/start/is_tracing() * add disable/enable/is_enabled() which are temporary, and disable() doesn't clear traces * a traceback now always contains a least 1 frame, use (('', 0),) if the traceback cannot read or if the traceback limit is 0 * write more documentation on filter functions files: pep-0454.txt | 87 ++++++++++++++++++++++++++++++++++----- 1 files changed, 75 insertions(+), 12 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ 
b/pep-0454.txt @@ -117,20 +117,23 @@ ``disable()`` function: - Stop tracing Python memory allocations and clear traces of memory - blocks allocated by Python. + Disable temporarily tracing new Python memory allocations, + deallocations are still traced. The change is process-wide, tracing + new Python memory allocations is disabled in all threads. Call + ``enable()`` to reenable tracing new Python memory allocations. - Call ``get_traces()`` or ``take_snapshot()`` function to get traces - before clearing them. + Filters can be used to not trace memory allocations in some files: + use the ``add_filter()`` function. See also ``enable()`` and ``is_enabled()`` functions. ``enable()`` function: - Start tracing Python memory allocations. + Reenable tracing Python memory allocations if was disabled by te + ``disable()`` method. - See also ``disable()`` and ``is_enabled()`` functions. + See also ``is_enabled()`` functions. ``get_traced_memory()`` function: @@ -147,10 +150,50 @@ ``is_enabled()`` function: + ``True`` if the ``tracemalloc`` module is enabled Python memory + allocations, ``False`` if the module is disabled. + + The ``tracemalloc`` module only traces new allocations if + ``is_tracing()`` and ``is_enabled()`` are ``True``. + + See also ``enable()`` and ``disable()`` functions. + + +``is_tracing()`` function: + ``True`` if the ``tracemalloc`` module is tracing Python memory allocations, ``False`` otherwise. - See also ``disable()`` and ``enable()`` functions. + The ``tracemalloc`` module only traces new allocations if + ``is_tracing()`` and ``is_enabled()`` are ``True``. + + See also ``start()`` and ``stop()`` functions. + + +``stop()`` function: + + Stop tracing Python memory allocations and clear traces of memory + blocks allocated by Python. + + The function uninstalls hooks on Python memory allocators, so the + overhead of the module becomes null. + + Call ``get_traces()`` or ``take_snapshot()`` function to get traces + before clearing them. Use ``disable()`` to disable tracing + temporarily. + + See also ``enable()`` and ``is_enabled()`` functions. + + +``start()`` function: + + Start tracing Python memory allocations. + + The function installs hooks on Python memory allocators. These hooks + have important overhead in term of performances and memory usage: + see `Filter functions`_ to limit the overhead. + + See also ``disable()`` and ``is_tracing()`` functions. ``take_snapshot()`` function: @@ -177,8 +220,10 @@ frame, limited to ``get_traceback_limit()`` frames. A frame is a ``(filename: str, lineno: int)`` tuple. -If ``tracemalloc`` failed to get the whole traceback, the traceback may be -empty, truncated or contain ``""`` filename and line number 0. +A traceback contains at least ``1`` frame. If the ``tracemalloc`` module +failed to get a frame, the ``""`` filename and the line number ``0`` +are used. If it failed to get the traceback or if the traceback limit is ``0``, +the traceback is ``(('', 0),)``. Example of a trace: ``(32, (('x.py', 7), ('x.py', 11)))``. The memory block has a size of 32 bytes and was allocated at ``x.py:7``, line called from line @@ -238,6 +283,9 @@ function to measure the overhead and the ``add_filter()`` function to select which memory allocations are traced. + If the limit is set to ``0`` frame, the traceback ``(('', + 0),)`` will be used for all traces. + Use the ``get_traceback_limit()`` function to get the current limit. 
The ``PYTHONTRACEMALLOC`` environment variable and the ``-X`` @@ -248,6 +296,24 @@ Filter functions ---------------- +Tracing all Python memroy allocations has an important overhead on performances +and on the memory usage. + +To limit the overhead, some files can be excluded or tracing can be restricted +to a set of files using filters. Examples: ``add_filter(Filter(True, +subprocess.__file__))`` only traces memory allocations in the ``subprocess`` +module, and ``add_filter(Filter(False, tracemalloc.__file__))`` do not trace +memory allocations in the ``tracemalloc`` module + +By default, there is one exclusive filter to ignore Python memory blocks +allocated by the ``tracemalloc`` module. + +Tracing can be also be disabled temporarily using the ``disable()`` function. + +Use the ``get_tracemalloc_memory()`` function to measure the memory usage. +See also the ``set_traceback_limit()`` function to configure how many +frames are stored. + ``add_filter(filter)`` function: Add a new filter on Python memory allocations, *filter* is a @@ -274,9 +340,6 @@ Get the filters on Python memory allocations. Return a list of ``Filter`` instances. - By default, there is one exclusive filter to ignore Python memory - blocks allocated by the ``tracemalloc`` module. - See also the ``clear_filters()`` function. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 1 14:13:36 2013 From: python-checkins at python.org (tim.golden) Date: Fri, 1 Nov 2013 14:13:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319464_Suppress_co?= =?utf-8?q?mpiler_warnings_during_clean=2E_Patch_by_Zachary_Ware=2E?= Message-ID: <3dB3kS5977z7Ljb@mail.python.org> http://hg.python.org/cpython/rev/599b5200ad51 changeset: 86821:599b5200ad51 user: Tim Golden date: Fri Nov 01 13:12:17 2013 +0000 summary: Issue #19464 Suppress compiler warnings during clean. Patch by Zachary Ware. files: PCbuild/ssl.vcxproj | 16 ++++++++-------- 1 files changed, 8 insertions(+), 8 deletions(-) diff --git a/PCbuild/ssl.vcxproj b/PCbuild/ssl.vcxproj --- a/PCbuild/ssl.vcxproj +++ b/PCbuild/ssl.vcxproj @@ -122,7 +122,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -133,7 +133,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -144,7 +144,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -155,7 +155,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -166,7 +166,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -177,7 +177,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -188,7 +188,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. 
$(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -199,7 +199,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 15:25:19 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 1 Nov 2013 15:25:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Silence_a_compiler_warning?= =?utf-8?q?_about_an_unused_function?= Message-ID: <3dB5KC3fbtz7LjP@mail.python.org> http://hg.python.org/cpython/rev/b59ef5f203da changeset: 86822:b59ef5f203da user: Brett Cannon date: Fri Nov 01 10:25:13 2013 -0400 summary: Silence a compiler warning about an unused function files: Modules/_hashopenssl.c | 56 +++++++++++++++--------------- 1 files changed, 28 insertions(+), 28 deletions(-) diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -63,34 +63,6 @@ DEFINE_CONSTS_FOR_NEW(sha512) #endif -static PyObject * -_setException(PyObject *exc) -{ - unsigned long errcode; - const char *lib, *func, *reason; - - errcode = ERR_peek_last_error(); - if (!errcode) { - PyErr_SetString(exc, "unknown reasons"); - return NULL; - } - ERR_clear_error(); - - lib = ERR_lib_error_string(errcode); - func = ERR_func_error_string(errcode); - reason = ERR_reason_error_string(errcode); - - if (lib && func) { - PyErr_Format(exc, "[%s: %s] %s", lib, func, reason); - } - else if (lib) { - PyErr_Format(exc, "[%s] %s", lib, reason); - } - else { - PyErr_SetString(exc, reason); - } - return NULL; -} static EVPobject * newEVPobject(PyObject *name) @@ -588,6 +560,34 @@ return 1; } +static PyObject * +_setException(PyObject *exc) +{ + unsigned long errcode; + const char *lib, *func, *reason; + + errcode = ERR_peek_last_error(); + if (!errcode) { + PyErr_SetString(exc, "unknown reasons"); + return NULL; + } + ERR_clear_error(); + + lib = ERR_lib_error_string(errcode); + func = ERR_func_error_string(errcode); + reason = ERR_reason_error_string(errcode); + + if (lib && func) { + PyErr_Format(exc, "[%s: %s] %s", lib, func, reason); + } + else if (lib) { + PyErr_Format(exc, "[%s] %s", lib, reason); + } + else { + PyErr_SetString(exc, reason); + } + return NULL; +} PyDoc_STRVAR(pbkdf2_hmac__doc__, "pbkdf2_hmac(hash_name, password, salt, iterations, dklen=None) -> key\n\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 15:38:05 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 1 Nov 2013 15:38:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319410=3A_Put_back?= =?utf-8?q?_in_special-casing_of_=27=27_for?= Message-ID: <3dB5bx5M8sz7LjY@mail.python.org> http://hg.python.org/cpython/rev/17d730d37b2f changeset: 86823:17d730d37b2f user: Brett Cannon date: Fri Nov 01 10:37:57 2013 -0400 summary: Issue #19410: Put back in special-casing of '' for importlib.machinery.FileFinder. While originally moved to stop special-casing '' as PathFinder farther up the typical call chain now uses the cwd in the instance of '', it was deemed an unnecessary risk to breaking subclasses of FileFinder to take the special-casing out. 
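A minimal interactive sketch of the restored behaviour (not part of the patch): constructing a FileFinder from the empty string once again means "the current directory".

    import importlib.machinery as machinery

    loader_details = (machinery.SourceFileLoader, machinery.SOURCE_SUFFIXES)
    finder = machinery.FileFinder('', loader_details)
    print(finder.path)   # '.' with this change, so subclasses relying on it keep working
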
files: Doc/library/importlib.rst | 3 - Doc/whatsnew/3.4.rst | 3 - Lib/importlib/_bootstrap.py | 2 +- Misc/NEWS | 3 + Python/importlib.h | 1586 +++++++++++----------- 5 files changed, 797 insertions(+), 800 deletions(-) diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -753,9 +753,6 @@ .. versionadded:: 3.3 - .. versionchanged:: 3.4 - The empty string is no longer special-cased to be changed into ``'.'``. - .. attribute:: path The path the finder will search in. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -812,6 +812,3 @@ working directory will also now have an absolute path, including when using ``-m`` with the interpreter (this does not influence when the path to a file is specified on the command-line). - -* :class:`importlib.machinery.FileFinder` no longer special-cases the empty string - to be changed to ``'.'``. diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -1375,7 +1375,7 @@ loaders.extend((suffix, loader) for suffix in suffixes) self._loaders = loaders # Base (directory) path - self.path = path + self.path = path or '.' self._path_mtime = -1 self._path_cache = set() self._relaxed_path_cache = set() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #19410: Undo the special-casing removal of '' for + importlib.machinery.FileFinder. + - Issue #19424: Fix the warnings module to accept filename containing surrogate characters. diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 17:56:45 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 1 Nov 2013 17:56:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Ethan_Furman=27s_latest_patch?= =?utf-8?q?_for_Issue_19331=2E?= Message-ID: <3dB8gx3FPGz7LkB@mail.python.org> http://hg.python.org/peps/rev/1a40d4eaa00b changeset: 5242:1a40d4eaa00b user: Barry Warsaw date: Fri Nov 01 12:56:37 2013 -0400 summary: Ethan Furman's latest patch for Issue 19331. files: pep-0008.txt | 16 ++++++++++++++-- 1 files changed, 14 insertions(+), 2 deletions(-) diff --git a/pep-0008.txt b/pep-0008.txt --- a/pep-0008.txt +++ b/pep-0008.txt @@ -580,6 +580,12 @@ standards, but where an existing library has a different style, internal consistency is preferred. +Overriding Principle +-------------------- + +Names that are visible to the user as public parts of the API should +follow conventions that reflect usage rather than implementation. + Descriptive: Naming Styles -------------------------- @@ -676,8 +682,14 @@ Class Names ~~~~~~~~~~~ -Almost without exception, class names use the CapWords convention. -Classes for internal use have a leading underscore in addition. +Class names should normally use the CapWords convention. + +The naming convention for functions may be used instead in cases where +the interface is documented and used primarily as a callable. + +Note that there is a separate convention for builtin names: most builtin +names are single words (or two words run together), with the CapWords +convention used only for exception names and builtin constants. 
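The point of the new _path_stat() indirection is that an experiment only has to swap out one helper. A hypothetical, unsupported sketch that caches stat() results to cut syscalls during start-up; everything here other than _path_stat itself is made up for illustration:

    import functools
    import importlib._bootstrap as _bootstrap

    _real_stat = _bootstrap._path_stat

    @functools.lru_cache(maxsize=1024)
    def _cached_stat(path):
        # Trades correctness for speed: changes on disk after the first stat
        # of a path go unnoticed until the cache is cleared.
        return _real_stat(path)

    _bootstrap._path_stat = _cached_stat
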
Exception Names ~~~~~~~~~~~~~~~ -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 1 19:04:32 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 1 Nov 2013 19:04:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Abstract_out_stat_calls_in?= =?utf-8?q?_importlib_for_easier_experimentation=2E?= Message-ID: <3dBBB85NnZz7Ljn@mail.python.org> http://hg.python.org/cpython/rev/e071977772bf changeset: 86824:e071977772bf user: Brett Cannon date: Fri Nov 01 14:04:24 2013 -0400 summary: Abstract out stat calls in importlib for easier experimentation. files: Lib/importlib/_bootstrap.py | 20 +- Python/importlib.h | 5488 +++++++++++----------- 2 files changed, 2766 insertions(+), 2742 deletions(-) diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -65,10 +65,20 @@ return '', path +def _path_stat(path): + """Stat the path. + + Made a separate function to make it easier to override in experiments + (e.g. cache stat results). + + """ + return _os.stat(path) + + def _path_is_mode_type(path, mode): """Test whether the path is the specified mode type.""" try: - stat_info = _os.stat(path) + stat_info = _path_stat(path) except OSError: return False return (stat_info.st_mode & 0o170000) == mode @@ -458,7 +468,7 @@ def _calc_mode(path): """Calculate the mode permissions for a bytecode file.""" try: - mode = _os.stat(path).st_mode + mode = _path_stat(path).st_mode except OSError: mode = 0o666 # We always ensure write access so we can update cached files @@ -880,7 +890,7 @@ if filepath is None: return None try: - _os.stat(filepath) + _path_stat(filepath) except OSError: return None for loader, suffixes in _get_supported_file_loaders(): @@ -1074,7 +1084,7 @@ def path_stats(self, path): """Return the metadata for the path.""" - st = _os.stat(path) + st = _path_stat(path) return {'mtime': st.st_mtime, 'size': st.st_size} def _cache_bytecode(self, source_path, bytecode_path, data): @@ -1392,7 +1402,7 @@ is_namespace = False tail_module = fullname.rpartition('.')[2] try: - mtime = _os.stat(self.path or _os.getcwd()).st_mtime + mtime = _path_stat(self.path or _os.getcwd()).st_mtime except OSError: mtime = -1 if mtime != self._path_mtime: diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:47 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Pause_accepting?= =?utf-8?q?_whenever_accept=28=29_returns_certain_errors=2E_Fixes?= Message-ID: <3dBGfM0ZRZz7LjT@mail.python.org> http://hg.python.org/cpython/rev/3ffa457c8d7c changeset: 86825:3ffa457c8d7c user: Guido van Rossum date: Fri Nov 01 14:12:50 2013 -0700 summary: asyncio: Pause accepting whenever accept() returns certain errors. Fixes asyncio issue #78. files: Lib/asyncio/constants.py | 5 ++- Lib/asyncio/selector_events.py | 21 +++++++--- Lib/test/test_asyncio/test_base_events.py | 13 +++++- 3 files changed, 30 insertions(+), 9 deletions(-) diff --git a/Lib/asyncio/constants.py b/Lib/asyncio/constants.py --- a/Lib/asyncio/constants.py +++ b/Lib/asyncio/constants.py @@ -1,4 +1,7 @@ """Constants.""" +# After the connection is lost, log warnings after this many write()s. 
+LOG_THRESHOLD_FOR_CONNLOST_WRITES = 5 -LOG_THRESHOLD_FOR_CONNLOST_WRITES = 5 +# Seconds to wait before retrying accept(). +ACCEPT_RETRY_DELAY = 1 diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -5,6 +5,7 @@ """ import collections +import errno import socket try: import ssl @@ -98,15 +99,23 @@ try: conn, addr = sock.accept() conn.setblocking(False) - except (BlockingIOError, InterruptedError): + except (BlockingIOError, InterruptedError, ConnectionAbortedError): pass # False alarm. - except Exception: - # Bad error. Stop serving. - self.remove_reader(sock.fileno()) - sock.close() + except OSError as exc: # There's nowhere to send the error, so just log it. # TODO: Someone will want an error handler for this. - logger.exception('Accept failed') + if exc.errno in (errno.EMFILE, errno.ENFILE, + errno.ENOBUFS, errno.ENOMEM): + # Some platforms (e.g. Linux keep reporting the FD as + # ready, so we remove the read handler temporarily. + # We'll try again in a while. + logger.exception('Accept out of system resource (%s)', exc) + self.remove_reader(sock.fileno()) + self.call_later(constants.ACCEPT_RETRY_DELAY, + self._start_serving, + protocol_factory, sock, ssl, server) + else: + raise # The event loop will catch, log and ignore it. else: if ssl: self._make_ssl_transport( diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -1,5 +1,6 @@ """Tests for base_events.py""" +import errno import logging import socket import time @@ -8,6 +9,7 @@ from test.support import find_unused_port, IPV6_ENABLED from asyncio import base_events +from asyncio import constants from asyncio import events from asyncio import futures from asyncio import protocols @@ -585,11 +587,18 @@ def test_accept_connection_exception(self, m_log): sock = unittest.mock.Mock() sock.fileno.return_value = 10 - sock.accept.side_effect = OSError() + sock.accept.side_effect = OSError(errno.EMFILE, 'Too many open files') + self.loop.remove_reader = unittest.mock.Mock() + self.loop.call_later = unittest.mock.Mock() self.loop._accept_connection(MyProto, sock) - self.assertTrue(sock.close.called) self.assertTrue(m_log.exception.called) + self.assertFalse(sock.close.called) + self.loop.remove_reader.assert_called_with(10) + self.loop.call_later.assert_called_with(constants.ACCEPT_RETRY_DELAY, + # self.loop._start_serving + unittest.mock.ANY, + MyProto, sock, None, None) if __name__ == '__main__': -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:48 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Fold_some_long_?= =?utf-8?q?lines=2E?= Message-ID: <3dBGfN2DWlz7Ljn@mail.python.org> http://hg.python.org/cpython/rev/b89dbb0bf0a8 changeset: 86826:b89dbb0bf0a8 user: Guido van Rossum date: Fri Nov 01 14:13:30 2013 -0700 summary: asyncio: Fold some long lines. 
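Going back to the accept() change in 3ffa457c8d7c above, the retry logic boils down to the following simplified sketch (not the exact event-loop internals):

    import errno

    def _accept_connection(loop, protocol_factory, sock, ssl=None, server=None):
        try:
            conn, addr = sock.accept()
            conn.setblocking(False)
        except (BlockingIOError, InterruptedError, ConnectionAbortedError):
            pass  # False alarm or early disconnect; wait for the next event.
        except OSError as exc:
            if exc.errno in (errno.EMFILE, errno.ENFILE,
                             errno.ENOBUFS, errno.ENOMEM):
                # Out of resources: stop watching the listening socket so the
                # loop does not spin, and retry in ACCEPT_RETRY_DELAY seconds.
                loop.remove_reader(sock.fileno())
                loop.call_later(1, loop._start_serving,
                                protocol_factory, sock, ssl, server)
            else:
                raise  # Anything else is caught and logged by the event loop.
        else:
            protocol_factory()  # normally: wrap conn in a transport here
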
files: Lib/asyncio/selector_events.py | 3 ++- Lib/asyncio/tasks.py | 5 +++-- 2 files changed, 5 insertions(+), 3 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -416,7 +416,8 @@ tulip_log.exception('pause_writing() failed') def _maybe_resume_protocol(self): - if self._protocol_paused and self.get_write_buffer_size() <= self._low_water: + if (self._protocol_paused and + self.get_write_buffer_size() <= self._low_water): self._protocol_paused = False try: self._protocol.resume_writing() diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -62,8 +62,9 @@ code = func.__code__ filename = code.co_filename lineno = code.co_firstlineno - logger.error('Coroutine %r defined at %s:%s was never yielded from', - func.__name__, filename, lineno) + logger.error( + 'Coroutine %r defined at %s:%s was never yielded from', + func.__name__, filename, lineno) def coroutine(func): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:49 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_server=5Fho?= =?utf-8?q?stname_as_create=5Fconnection=28=29_argument=2C_with_secure?= Message-ID: <3dBGfP57tFz7Ljp@mail.python.org> http://hg.python.org/cpython/rev/97bbaf675d61 changeset: 86827:97bbaf675d61 user: Guido van Rossum date: Fri Nov 01 14:16:54 2013 -0700 summary: asyncio: Add server_hostname as create_connection() argument, with secure default. files: Lib/asyncio/base_events.py | 23 ++++- Lib/asyncio/events.py | 2 +- Lib/asyncio/selector_events.py | 4 +- Lib/test/test_asyncio/test_base_events.py | 54 +++++++++++ 4 files changed, 78 insertions(+), 5 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -275,8 +275,27 @@ @tasks.coroutine def create_connection(self, protocol_factory, host=None, port=None, *, ssl=None, family=0, proto=0, flags=0, sock=None, - local_addr=None): + local_addr=None, server_hostname=None): """XXX""" + if server_hostname is not None and not ssl: + raise ValueError('server_hostname is only meaningful with ssl') + + if server_hostname is None and ssl: + # Use host as default for server_hostname. It is an error + # if host is empty or not set, e.g. when an + # already-connected socket was passed or when only a port + # is given. To avoid this error, you can pass + # server_hostname='' -- this will bypass the hostname + # check. (This also means that if host is a numeric + # IP/IPv6 address, we will attempt to verify that exact + # address; this will probably fail, but it is possible to + # create a certificate for a specific IP address, so we + # don't judge it here.) 
+ if not host: + raise ValueError('You must set server_hostname ' + 'when using ssl without a host') + server_hostname = host + if host is not None or port is not None: if sock is not None: raise ValueError( @@ -357,7 +376,7 @@ sslcontext = None if isinstance(ssl, bool) else ssl transport = self._make_ssl_transport( sock, protocol, sslcontext, waiter, - server_side=False, server_hostname=host) + server_side=False, server_hostname=server_hostname) else: transport = self._make_socket_transport(sock, protocol, waiter) diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -172,7 +172,7 @@ def create_connection(self, protocol_factory, host=None, port=None, *, ssl=None, family=0, proto=0, flags=0, sock=None, - local_addr=None): + local_addr=None, server_hostname=None): raise NotImplementedError def create_server(self, protocol_factory, host=None, port=None, *, diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -573,7 +573,7 @@ 'server_side': server_side, 'do_handshake_on_connect': False, } - if server_hostname is not None and not server_side and ssl.HAS_SNI: + if server_hostname and not server_side and ssl.HAS_SNI: wrap_kwargs['server_hostname'] = server_hostname sslsock = sslcontext.wrap_socket(rawsock, **wrap_kwargs) @@ -619,7 +619,7 @@ # Verify hostname if requested. peercert = self._sock.getpeercert() - if (self._server_hostname is not None and + if (self._server_hostname and self._sslcontext.verify_mode != ssl.CERT_NONE): try: ssl.match_hostname(peercert, self._server_hostname) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -444,6 +444,60 @@ self.assertRaises( OSError, self.loop.run_until_complete, coro) + def test_create_connection_server_hostname_default(self): + self.loop.getaddrinfo = unittest.mock.Mock() + def mock_getaddrinfo(*args, **kwds): + f = futures.Future(loop=self.loop) + f.set_result([(socket.AF_INET, socket.SOCK_STREAM, + socket.SOL_TCP, '', ('1.2.3.4', 80))]) + return f + self.loop.getaddrinfo.side_effect = mock_getaddrinfo + self.loop.sock_connect = unittest.mock.Mock() + self.loop.sock_connect.return_value = () + self.loop._make_ssl_transport = unittest.mock.Mock() + def mock_make_ssl_transport(sock, protocol, sslcontext, waiter, **kwds): + waiter.set_result(None) + self.loop._make_ssl_transport.side_effect = mock_make_ssl_transport + ANY = unittest.mock.ANY + # First try the default server_hostname. + self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True) + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='python.org') + # Next try an explicit server_hostname. + self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, + server_hostname='perl.com') + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='perl.com') + # Finally try an explicit empty server_hostname. 
+ self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, + server_hostname='') + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='') + + def test_create_connection_server_hostname_errors(self): + # When not using ssl, server_hostname must be None (but '' is OK). + coro = self.loop.create_connection(MyProto, 'python.org', 80, server_hostname='') + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, 'python.org', 80, server_hostname='python.org') + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + + # When using ssl, server_hostname may be None if host is non-empty. + coro = self.loop.create_connection(MyProto, '', 80, ssl=True) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, None, 80, ssl=True) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, None, None, ssl=True, sock=socket.socket()) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + def test_create_server_empty_host(self): # if host is empty string use None instead host = object() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:51 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Refactor_ssl_tr?= =?utf-8?q?ansport_ready_loop_=28Nikolay_Kim=29=2E?= Message-ID: <3dBGfR273sz7Lk7@mail.python.org> http://hg.python.org/cpython/rev/10a518b26ed1 changeset: 86828:10a518b26ed1 user: Guido van Rossum date: Fri Nov 01 14:18:02 2013 -0700 summary: asyncio: Refactor ssl transport ready loop (Nikolay Kim). files: Lib/asyncio/selector_events.py | 94 +++--- Lib/test/test_asyncio/test_selector_events.py | 134 ++++++--- 2 files changed, 136 insertions(+), 92 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -286,7 +286,7 @@ err = sock.getsockopt(socket.SOL_SOCKET, socket.SO_ERROR) if err != 0: # Jump to the except clause below. 
- raise OSError(err, 'Connect call failed') + raise OSError(err, 'Connect call failed %s' % (address,)) except (BlockingIOError, InterruptedError): self.add_writer(fd, self._sock_connect, fut, True, sock, address) except Exception as exc: @@ -413,7 +413,7 @@ try: self._protocol.pause_writing() except Exception: - tulip_log.exception('pause_writing() failed') + logger.exception('pause_writing() failed') def _maybe_resume_protocol(self): if (self._protocol_paused and @@ -422,7 +422,7 @@ try: self._protocol.resume_writing() except Exception: - tulip_log.exception('resume_writing() failed') + logger.exception('resume_writing() failed') def set_write_buffer_limits(self, high=None, low=None): if high is None: @@ -635,15 +635,16 @@ compression=self._sock.compression(), ) - self._loop.add_reader(self._sock_fd, self._on_ready) - self._loop.add_writer(self._sock_fd, self._on_ready) + self._read_wants_write = False + self._write_wants_read = False + self._loop.add_reader(self._sock_fd, self._read_ready) self._loop.call_soon(self._protocol.connection_made, self) if self._waiter is not None: self._loop.call_soon(self._waiter.set_result, None) def pause_reading(self): # XXX This is a bit icky, given the comment at the top of - # _on_ready(). Is it possible to evoke a deadlock? I don't + # _read_ready(). Is it possible to evoke a deadlock? I don't # know, although it doesn't look like it; write() will still # accept more data for the buffer and eventually the app will # call resume_reading() again, and things will flow again. @@ -658,41 +659,55 @@ self._paused = False if self._closing: return - self._loop.add_reader(self._sock_fd, self._on_ready) + self._loop.add_reader(self._sock_fd, self._read_ready) - def _on_ready(self): - # Because of renegotiations (?), there's no difference between - # readable and writable. We just try both. XXX This may be - # incorrect; we probably need to keep state about what we - # should do next. + def _read_ready(self): + if self._write_wants_read: + self._write_wants_read = False + self._write_ready() - # First try reading. - if not self._closing and not self._paused: - try: - data = self._sock.recv(self.max_size) - except (BlockingIOError, InterruptedError, - ssl.SSLWantReadError, ssl.SSLWantWriteError): - pass - except Exception as exc: - self._fatal_error(exc) + if self._buffer: + self._loop.add_writer(self._sock_fd, self._write_ready) + + try: + data = self._sock.recv(self.max_size) + except (BlockingIOError, InterruptedError, ssl.SSLWantReadError): + pass + except ssl.SSLWantWriteError: + self._read_wants_write = True + self._loop.remove_reader(self._sock_fd) + self._loop.add_writer(self._sock_fd, self._write_ready) + except Exception as exc: + self._fatal_error(exc) + else: + if data: + self._protocol.data_received(data) else: - if data: - self._protocol.data_received(data) - else: - try: - self._protocol.eof_received() - finally: - self.close() + try: + self._protocol.eof_received() + finally: + self.close() - # Now try writing, if there's anything to write. 
+ def _write_ready(self): + if self._read_wants_write: + self._read_wants_write = False + self._read_ready() + + if not (self._paused or self._closing): + self._loop.add_reader(self._sock_fd, self._read_ready) + if self._buffer: data = b''.join(self._buffer) self._buffer.clear() try: n = self._sock.send(data) except (BlockingIOError, InterruptedError, - ssl.SSLWantReadError, ssl.SSLWantWriteError): + ssl.SSLWantWriteError): n = 0 + except ssl.SSLWantReadError: + n = 0 + self._loop.remove_writer(self._sock_fd) + self._write_wants_read = True except Exception as exc: self._loop.remove_writer(self._sock_fd) self._fatal_error(exc) @@ -701,11 +716,12 @@ if n < len(data): self._buffer.append(data[n:]) - self._maybe_resume_protocol() # May append to buffer. + self._maybe_resume_protocol() # May append to buffer. - if self._closing and not self._buffer: + if not self._buffer: self._loop.remove_writer(self._sock_fd) - self._call_connection_lost(None) + if self._closing: + self._call_connection_lost(None) def write(self, data): assert isinstance(data, bytes), repr(type(data)) @@ -718,20 +734,16 @@ self._conn_lost += 1 return - # We could optimize, but the callback can do this for now. + if not self._buffer: + self._loop.add_writer(self._sock_fd, self._write_ready) + + # Add it to the buffer. self._buffer.append(data) self._maybe_pause_protocol() def can_write_eof(self): return False - def close(self): - if self._closing: - return - self._closing = True - self._conn_lost += 1 - self._loop.remove_reader(self._sock_fd) - class _SelectorDatagramTransport(_SelectorTransport): diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1003,8 +1003,7 @@ self.loop, self.sock, self.protocol, self.sslcontext, waiter=waiter) self.assertTrue(self.sslsock.do_handshake.called) - self.loop.assert_reader(1, tr._on_ready) - self.loop.assert_writer(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) test_utils.run_briefly(self.loop) self.assertIsNone(waiter.result()) @@ -1047,13 +1046,13 @@ def test_pause_resume_reading(self): tr = self._make_one() self.assertFalse(tr._paused) - self.loop.assert_reader(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) tr.pause_reading() self.assertTrue(tr._paused) self.assertFalse(1 in self.loop.readers) tr.resume_reading() self.assertFalse(tr._paused) - self.loop.assert_reader(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) def test_write_no_data(self): transport = self._make_one() @@ -1084,140 +1083,173 @@ transport.write(b'data') m_log.warning.assert_called_with('socket.send() raised exception.') - def test_on_ready_recv(self): + def test_read_ready_recv(self): self.sslsock.recv.return_value = b'data' transport = self._make_one() - transport._on_ready() + transport._read_ready() self.assertTrue(self.sslsock.recv.called) self.assertEqual((b'data',), self.protocol.data_received.call_args[0]) - def test_on_ready_recv_eof(self): + def test_read_ready_write_wants_read(self): + self.loop.add_writer = unittest.mock.Mock() + self.sslsock.recv.side_effect = BlockingIOError + transport = self._make_one() + transport._write_wants_read = True + transport._write_ready = unittest.mock.Mock() + transport._buffer.append(b'data') + transport._read_ready() + + self.assertFalse(transport._write_wants_read) + transport._write_ready.assert_called_with() + self.loop.add_writer.assert_called_with( + 
transport._sock_fd, transport._write_ready) + + def test_read_ready_recv_eof(self): self.sslsock.recv.return_value = b'' transport = self._make_one() transport.close = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport.close.assert_called_with() self.protocol.eof_received.assert_called_with() - def test_on_ready_recv_conn_reset(self): + def test_read_ready_recv_conn_reset(self): err = self.sslsock.recv.side_effect = ConnectionResetError() transport = self._make_one() transport._force_close = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport._force_close.assert_called_with(err) - def test_on_ready_recv_retry(self): + def test_read_ready_recv_retry(self): self.sslsock.recv.side_effect = ssl.SSLWantReadError transport = self._make_one() - transport._on_ready() + transport._read_ready() self.assertTrue(self.sslsock.recv.called) self.assertFalse(self.protocol.data_received.called) - self.sslsock.recv.side_effect = ssl.SSLWantWriteError - transport._on_ready() - self.assertFalse(self.protocol.data_received.called) - self.sslsock.recv.side_effect = BlockingIOError - transport._on_ready() + transport._read_ready() self.assertFalse(self.protocol.data_received.called) self.sslsock.recv.side_effect = InterruptedError - transport._on_ready() + transport._read_ready() self.assertFalse(self.protocol.data_received.called) - def test_on_ready_recv_exc(self): + def test_read_ready_recv_write(self): + self.loop.remove_reader = unittest.mock.Mock() + self.loop.add_writer = unittest.mock.Mock() + self.sslsock.recv.side_effect = ssl.SSLWantWriteError + transport = self._make_one() + transport._read_ready() + self.assertFalse(self.protocol.data_received.called) + self.assertTrue(transport._read_wants_write) + + self.loop.remove_reader.assert_called_with(transport._sock_fd) + self.loop.add_writer.assert_called_with( + transport._sock_fd, transport._write_ready) + + def test_read_ready_recv_exc(self): err = self.sslsock.recv.side_effect = OSError() transport = self._make_one() transport._fatal_error = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport._fatal_error.assert_called_with(err) - def test_on_ready_send(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport._buffer = collections.deque([b'data']) - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque(), transport._buffer) self.assertTrue(self.sslsock.send.called) - def test_on_ready_send_none(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_none(self): self.sslsock.send.return_value = 0 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertEqual(collections.deque([b'data1data2']), transport._buffer) - def test_on_ready_send_partial(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertEqual(collections.deque([b'ta1data2']), transport._buffer) - def test_on_ready_send_closing_partial(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def 
test_write_ready_send_closing_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertFalse(self.sslsock.close.called) - def test_on_ready_send_closing(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_closing(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() transport._buffer = collections.deque([b'data']) - transport._on_ready() + transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) - def test_on_ready_send_closing_empty_buffer(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_closing_empty_buffer(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() transport._buffer = collections.deque() - transport._on_ready() + transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) - def test_on_ready_send_retry(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError - + def test_write_ready_send_retry(self): transport = self._make_one() transport._buffer = collections.deque([b'data']) - self.sslsock.send.side_effect = ssl.SSLWantReadError - transport._on_ready() - self.assertTrue(self.sslsock.send.called) - self.assertEqual(collections.deque([b'data']), transport._buffer) - self.sslsock.send.side_effect = ssl.SSLWantWriteError - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque([b'data']), transport._buffer) self.sslsock.send.side_effect = BlockingIOError() - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque([b'data']), transport._buffer) - def test_on_ready_send_exc(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_read(self): + transport = self._make_one() + transport._buffer = collections.deque([b'data']) + + self.loop.remove_writer = unittest.mock.Mock() + self.sslsock.send.side_effect = ssl.SSLWantReadError + transport._write_ready() + self.assertFalse(self.protocol.data_received.called) + self.assertTrue(transport._write_wants_read) + self.loop.remove_writer.assert_called_with(transport._sock_fd) + + def test_write_ready_send_exc(self): err = self.sslsock.send.side_effect = OSError() transport = self._make_one() transport._buffer = collections.deque([b'data']) transport._fatal_error = unittest.mock.Mock() - transport._on_ready() + transport._write_ready() transport._fatal_error.assert_called_with(err) self.assertEqual(collections.deque(), transport._buffer) + def test_write_ready_read_wants_write(self): + self.loop.add_reader = unittest.mock.Mock() + self.sslsock.send.side_effect = BlockingIOError + transport = self._make_one() + transport._read_wants_write = True + transport._read_ready = unittest.mock.Mock() + transport._write_ready() + + self.assertFalse(transport._read_wants_write) + transport._read_ready.assert_called_with() + self.loop.add_reader.assert_called_with( + transport._sock_fd, transport._read_ready) + def test_write_eof(self): tr = self._make_one() self.assertFalse(tr.can_write_eof()) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:52 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:52 +0100 (CET) Subject: 
[Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Document_EventL?= =?utf-8?b?b29wLmNsb3NlKCku?= Message-ID: <3dBGfS3tHRz7Lk6@mail.python.org> http://hg.python.org/cpython/rev/2ceb3fd38280 changeset: 86829:2ceb3fd38280 user: Guido van Rossum date: Fri Nov 01 14:19:04 2013 -0700 summary: asyncio: Document EventLoop.close(). files: Lib/asyncio/base_events.py | 5 +++++ Lib/asyncio/events.py | 13 +++++++++++++ Lib/test/test_asyncio/test_events.py | 2 ++ 3 files changed, 20 insertions(+), 0 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -186,6 +186,11 @@ self.call_soon(_raise_stop_error) def close(self): + """Close the event loop. + + This clears the queues and shuts down the executor, + but does not wait for the executor to finish. + """ self._ready.clear() self._scheduled.clear() executor = self._default_executor diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -137,6 +137,17 @@ """Return whether the event loop is currently running.""" raise NotImplementedError + def close(self): + """Close the loop. + + The loop should not be running. + + This is idempotent and irreversible. + + No other methods should be called after this one. + """ + raise NotImplementedError + # Methods scheduling callbacks. All these return Handles. def call_soon(self, callback, *args): @@ -214,6 +225,8 @@ family=0, proto=0, flags=0): raise NotImplementedError + # Pipes and subprocesses. + def connect_read_pipe(self, protocol_factory, pipe): """Register read pipe in eventloop. diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1472,6 +1472,8 @@ self.assertRaises( NotImplementedError, loop.is_running) self.assertRaises( + NotImplementedError, loop.close) + self.assertRaises( NotImplementedError, loop.call_later, None, None) self.assertRaises( NotImplementedError, loop.call_at, f, f) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:53 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Log_a_warning_w?= =?utf-8?q?hen_eof=5Freceived=28=29_returns_true_and_using_ssl=2E?= Message-ID: <3dBGfT5cS8z7Ljp@mail.python.org> http://hg.python.org/cpython/rev/0c5f2223026f changeset: 86830:0c5f2223026f user: Guido van Rossum date: Fri Nov 01 14:19:35 2013 -0700 summary: asyncio: Log a warning when eof_received() returns true and using ssl. 
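[Editorial aside -- not part of the commit above: a minimal sketch of the
behaviour this warning is about, written against the public asyncio
protocol API of this era; the EchoProtocol name is invented for
illustration only.]

    import asyncio

    class EchoProtocol(asyncio.Protocol):
        def connection_made(self, transport):
            self.transport = transport

        def data_received(self, data):
            self.transport.write(data)

        def eof_received(self):
            # On a plain TCP transport, returning a true value keeps the
            # transport open for writing after the peer half-closes.
            # Over an SSL transport that cannot be honoured (TLS closes
            # the connection once a closure alert arrives), so the
            # transport is closed anyway; this changeset logs a warning
            # instead of silently ignoring the returned value.
            return True

[End of aside; the commit's diff follows.]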
files: Lib/asyncio/selector_events.py | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -684,7 +684,10 @@ self._protocol.data_received(data) else: try: - self._protocol.eof_received() + keep_open = self._protocol.eof_received() + if keep_open: + logger.warning('returning true from eof_received() ' + 'has no effect when using ssl') finally: self.close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:55 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Various_style_n?= =?utf-8?q?its=2E?= Message-ID: <3dBGfW1L2tz7Lk5@mail.python.org> http://hg.python.org/cpython/rev/c793f3347c3c changeset: 86831:c793f3347c3c user: Guido van Rossum date: Fri Nov 01 14:20:55 2013 -0700 summary: asyncio: Various style nits. files: Lib/asyncio/base_events.py | 2 +- Lib/asyncio/windows_events.py | 16 +++++ Lib/asyncio/windows_utils.py | 20 +++--- Lib/test/test_asyncio/test_base_events.py | 30 ++++++--- Lib/test/test_asyncio/test_events.py | 1 - Lib/test/test_asyncio/test_windows_events.py | 2 +- 6 files changed, 48 insertions(+), 23 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -300,7 +300,7 @@ raise ValueError('You must set server_hostname ' 'when using ssl without a host') server_hostname = host - + if host is not None or port is not None: if sock is not None: raise ValueError( diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -138,6 +138,7 @@ @tasks.coroutine def start_serving_pipe(self, protocol_factory, address): server = PipeServer(address) + def loop(f=None): pipe = None try: @@ -160,6 +161,7 @@ pipe.close() else: f.add_done_callback(loop) + self.call_soon(loop) return [server] @@ -209,6 +211,7 @@ ov.WSARecv(conn.fileno(), nbytes, flags) else: ov.ReadFile(conn.fileno(), nbytes) + def finish(trans, key, ov): try: return ov.getresult() @@ -217,6 +220,7 @@ raise ConnectionResetError(*exc.args) else: raise + return self._register(ov, conn, finish) def send(self, conn, buf, flags=0): @@ -226,6 +230,7 @@ ov.WSASend(conn.fileno(), buf, flags) else: ov.WriteFile(conn.fileno(), buf) + def finish(trans, key, ov): try: return ov.getresult() @@ -234,6 +239,7 @@ raise ConnectionResetError(*exc.args) else: raise + return self._register(ov, conn, finish) def accept(self, listener): @@ -241,6 +247,7 @@ conn = self._get_accept_socket(listener.family) ov = _overlapped.Overlapped(NULL) ov.AcceptEx(listener.fileno(), conn.fileno()) + def finish_accept(trans, key, ov): ov.getresult() # Use SO_UPDATE_ACCEPT_CONTEXT so getsockname() etc work. @@ -249,6 +256,7 @@ _overlapped.SO_UPDATE_ACCEPT_CONTEXT, buf) conn.settimeout(listener.gettimeout()) return conn, conn.getpeername() + return self._register(ov, listener, finish_accept) def connect(self, conn, address): @@ -264,26 +272,31 @@ raise ov = _overlapped.Overlapped(NULL) ov.ConnectEx(conn.fileno(), address) + def finish_connect(trans, key, ov): ov.getresult() # Use SO_UPDATE_CONNECT_CONTEXT so getsockname() etc work. 
conn.setsockopt(socket.SOL_SOCKET, _overlapped.SO_UPDATE_CONNECT_CONTEXT, 0) return conn + return self._register(ov, conn, finish_connect) def accept_pipe(self, pipe): self._register_with_iocp(pipe) ov = _overlapped.Overlapped(NULL) ov.ConnectNamedPipe(pipe.fileno()) + def finish(trans, key, ov): ov.getresult() return pipe + return self._register(ov, pipe, finish) def connect_pipe(self, address): ov = _overlapped.Overlapped(NULL) ov.WaitNamedPipeAndConnect(address, self._iocp, ov.address) + def finish(err, handle, ov): # err, handle were arguments passed to PostQueuedCompletionStatus() # in a function run in a thread pool. @@ -296,6 +309,7 @@ raise OSError(0, msg, None, err) else: return windows_utils.PipeHandle(handle) + return self._register(ov, None, finish, wait_for_post=True) def wait_for_handle(self, handle, timeout=None): @@ -432,8 +446,10 @@ self._proc = windows_utils.Popen( args, shell=shell, stdin=stdin, stdout=stdout, stderr=stderr, bufsize=bufsize, **kwargs) + def callback(f): returncode = self._proc.poll() self._process_exited(returncode) + f = self._loop._proactor.wait_for_handle(int(self._proc._handle)) f.add_done_callback(callback) diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -18,18 +18,18 @@ __all__ = ['socketpair', 'pipe', 'Popen', 'PIPE', 'PipeHandle'] -# + # Constants/globals -# + BUFSIZE = 8192 PIPE = subprocess.PIPE STDOUT = subprocess.STDOUT _mmap_counter = itertools.count() -# + # Replacement for socket.socketpair() -# + def socketpair(family=socket.AF_INET, type=socket.SOCK_STREAM, proto=0): """A socket pair usable as a self-pipe, for Windows. @@ -57,9 +57,9 @@ lsock.close() return (ssock, csock) -# + # Replacement for os.pipe() using handles instead of fds -# + def pipe(*, duplex=False, overlapped=(True, True), bufsize=BUFSIZE): """Like os.pipe() but with overlapped support and using handles not fds.""" @@ -105,9 +105,9 @@ _winapi.CloseHandle(h2) raise -# + # Wrapper for a pipe handle -# + class PipeHandle: """Wrapper for an overlapped pipe handle which is vaguely file-object like. @@ -137,9 +137,9 @@ def __exit__(self, t, v, tb): self.close() -# + # Replacement for subprocess.Popen using overlapped pipe handles -# + class Popen(subprocess.Popen): """Replacement for subprocess.Popen using overlapped pipe handles. diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -446,34 +446,41 @@ def test_create_connection_server_hostname_default(self): self.loop.getaddrinfo = unittest.mock.Mock() + def mock_getaddrinfo(*args, **kwds): f = futures.Future(loop=self.loop) f.set_result([(socket.AF_INET, socket.SOCK_STREAM, socket.SOL_TCP, '', ('1.2.3.4', 80))]) return f + self.loop.getaddrinfo.side_effect = mock_getaddrinfo self.loop.sock_connect = unittest.mock.Mock() self.loop.sock_connect.return_value = () self.loop._make_ssl_transport = unittest.mock.Mock() - def mock_make_ssl_transport(sock, protocol, sslcontext, waiter, **kwds): + + def mock_make_ssl_transport(sock, protocol, sslcontext, waiter, + **kwds): waiter.set_result(None) + self.loop._make_ssl_transport.side_effect = mock_make_ssl_transport ANY = unittest.mock.ANY # First try the default server_hostname. 
self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True) self.loop.run_until_complete(coro) - self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, - server_side=False, - server_hostname='python.org') + self.loop._make_ssl_transport.assert_called_with( + ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='python.org') # Next try an explicit server_hostname. self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, server_hostname='perl.com') self.loop.run_until_complete(coro) - self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, - server_side=False, - server_hostname='perl.com') + self.loop._make_ssl_transport.assert_called_with( + ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='perl.com') # Finally try an explicit empty server_hostname. self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, @@ -485,9 +492,11 @@ def test_create_connection_server_hostname_errors(self): # When not using ssl, server_hostname must be None (but '' is OK). - coro = self.loop.create_connection(MyProto, 'python.org', 80, server_hostname='') + coro = self.loop.create_connection(MyProto, 'python.org', 80, + server_hostname='') self.assertRaises(ValueError, self.loop.run_until_complete, coro) - coro = self.loop.create_connection(MyProto, 'python.org', 80, server_hostname='python.org') + coro = self.loop.create_connection(MyProto, 'python.org', 80, + server_hostname='python.org') self.assertRaises(ValueError, self.loop.run_until_complete, coro) # When using ssl, server_hostname may be None if host is non-empty. @@ -495,7 +504,8 @@ self.assertRaises(ValueError, self.loop.run_until_complete, coro) coro = self.loop.create_connection(MyProto, None, 80, ssl=True) self.assertRaises(ValueError, self.loop.run_until_complete, coro) - coro = self.loop.create_connection(MyProto, None, None, ssl=True, sock=socket.socket()) + coro = self.loop.create_connection(MyProto, None, None, + ssl=True, sock=socket.socket()) self.assertRaises(ValueError, self.loop.run_until_complete, coro) def test_create_server_empty_host(self): diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1276,7 +1276,6 @@ def create_event_loop(self): return windows_events.SelectorEventLoop() - class ProactorEventLoopTests(EventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -77,7 +77,7 @@ stream_reader = streams.StreamReader(loop=self.loop) protocol = streams.StreamReaderProtocol(stream_reader) trans, proto = yield from self.loop.create_pipe_connection( - lambda:protocol, ADDRESS) + lambda: protocol, ADDRESS) self.assertIsInstance(trans, transports.Transport) self.assertEqual(protocol, proto) clients.append((stream_reader, trans)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:56 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Better-looking_?= =?utf-8?q?errors_when_ssl_module_cannot_be_imported=2E_In_part_by?= Message-ID: 
<3dBGfX4HDSz7Lk4@mail.python.org> http://hg.python.org/cpython/rev/8d67a5d8cfa4 changeset: 86832:8d67a5d8cfa4 user: Guido van Rossum date: Fri Nov 01 14:22:30 2013 -0700 summary: asyncio: Better-looking errors when ssl module cannot be imported. In part by Arnaud Faure. files: Lib/asyncio/base_events.py | 2 + Lib/asyncio/selector_events.py | 31 ++++++--- Lib/test/test_asyncio/test_selector_events.py | 20 ++++++ 3 files changed, 41 insertions(+), 12 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -466,6 +466,8 @@ ssl=None, reuse_address=None): """XXX""" + if isinstance(ssl, bool): + raise TypeError('ssl argument must be an SSLContext or None') if host is not None or port is not None: if sock is not None: raise ValueError( diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -90,12 +90,13 @@ except (BlockingIOError, InterruptedError): pass - def _start_serving(self, protocol_factory, sock, ssl=None, server=None): + def _start_serving(self, protocol_factory, sock, + sslcontext=None, server=None): self.add_reader(sock.fileno(), self._accept_connection, - protocol_factory, sock, ssl, server) + protocol_factory, sock, sslcontext, server) - def _accept_connection(self, protocol_factory, sock, ssl=None, - server=None): + def _accept_connection(self, protocol_factory, sock, + sslcontext=None, server=None): try: conn, addr = sock.accept() conn.setblocking(False) @@ -113,13 +114,13 @@ self.remove_reader(sock.fileno()) self.call_later(constants.ACCEPT_RETRY_DELAY, self._start_serving, - protocol_factory, sock, ssl, server) + protocol_factory, sock, sslcontext, server) else: raise # The event loop will catch, log and ignore it. else: - if ssl: + if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), ssl, None, + conn, protocol_factory(), sslcontext, None, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( @@ -558,17 +559,23 @@ def __init__(self, loop, rawsock, protocol, sslcontext, waiter=None, server_side=False, server_hostname=None, extra=None, server=None): + if ssl is None: + raise RuntimeError('stdlib ssl module not available') + if server_side: - assert isinstance( - sslcontext, ssl.SSLContext), 'Must pass an SSLContext' + if not sslcontext: + raise ValueError('Server side ssl needs a valid SSLContext') else: - # Client-side may pass ssl=True to use a default context. - # The default is the same as used by urllib. - if sslcontext is None: + if not sslcontext: + # Client side may pass ssl=True to use a default + # context; in that case the sslcontext passed is None. + # The default is the same as used by urllib with + # cadefault=True. 
sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) sslcontext.options |= ssl.OP_NO_SSLv2 sslcontext.set_default_verify_paths() sslcontext.verify_mode = ssl.CERT_REQUIRED + wrap_kwargs = { 'server_side': server_side, 'do_handshake_on_connect': False, diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -43,6 +43,7 @@ self.assertIsInstance( self.loop._make_socket_transport(m, m), _SelectorSocketTransport) + @unittest.skipIf(ssl is None, 'No ssl module') def test_make_ssl_transport(self): m = unittest.mock.Mock() self.loop.add_reader = unittest.mock.Mock() @@ -52,6 +53,16 @@ self.assertIsInstance( self.loop._make_ssl_transport(m, m, m, m), _SelectorSslTransport) + @unittest.mock.patch('asyncio.selector_events.ssl', None) + def test_make_ssl_transport_without_ssl_error(self): + m = unittest.mock.Mock() + self.loop.add_reader = unittest.mock.Mock() + self.loop.add_writer = unittest.mock.Mock() + self.loop.remove_reader = unittest.mock.Mock() + self.loop.remove_writer = unittest.mock.Mock() + with self.assertRaises(RuntimeError): + self.loop._make_ssl_transport(m, m, m, m) + def test_close(self): ssock = self.loop._ssock ssock.fileno.return_value = 7 @@ -1277,6 +1288,15 @@ server_hostname='localhost') +class SelectorSslWithoutSslTransportTests(unittest.TestCase): + + @unittest.mock.patch('asyncio.selector_events.ssl', None) + def test_ssl_transport_requires_ssl_module(self): + Mock = unittest.mock.Mock + with self.assertRaises(RuntimeError): + transport = _SelectorSslTransport(Mock(), Mock(), Mock(), Mock()) + + class SelectorDatagramTransportTests(unittest.TestCase): def setUp(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 22:25:57 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 22:25:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Slight_rearrang?= =?utf-8?q?ement_of_tests_for_server=5Fhostname=3D=2E=2E=2E?= Message-ID: <3dBGfY63xWz7Lk1@mail.python.org> http://hg.python.org/cpython/rev/123804a72a8f changeset: 86833:123804a72a8f user: Guido van Rossum date: Fri Nov 01 14:24:28 2013 -0700 summary: asyncio: Slight rearrangement of tests for server_hostname=... files: Lib/test/test_asyncio/test_base_events.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -444,7 +444,7 @@ self.assertRaises( OSError, self.loop.run_until_complete, coro) - def test_create_connection_server_hostname_default(self): + def test_create_connection_ssl_server_hostname_default(self): self.loop.getaddrinfo = unittest.mock.Mock() def mock_getaddrinfo(*args, **kwds): @@ -490,8 +490,8 @@ server_side=False, server_hostname='') - def test_create_connection_server_hostname_errors(self): - # When not using ssl, server_hostname must be None (but '' is OK). + def test_create_connection_no_ssl_server_hostname_errors(self): + # When not using ssl, server_hostname must be None. 
coro = self.loop.create_connection(MyProto, 'python.org', 80, server_hostname='') self.assertRaises(ValueError, self.loop.run_until_complete, coro) @@ -499,6 +499,7 @@ server_hostname='python.org') self.assertRaises(ValueError, self.loop.run_until_complete, coro) + def test_create_connection_ssl_server_hostname_errors(self): # When using ssl, server_hostname may be None if host is non-empty. coro = self.loop.create_connection(MyProto, '', 80, ssl=True) self.assertRaises(ValueError, self.loop.run_until_complete, coro) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 1 23:14:42 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 1 Nov 2013 23:14:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Resolve_a_few_TBD/TODOs=2E?= Message-ID: <3dBHkp3SVNz7LjZ@mail.python.org> http://hg.python.org/peps/rev/c12ea420a840 changeset: 5243:c12ea420a840 user: Guido van Rossum date: Fri Nov 01 15:14:39 2013 -0700 summary: Resolve a few TBD/TODOs. files: pep-3156.txt | 70 ++++++++++++++++----------------------- 1 files changed, 29 insertions(+), 41 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -740,8 +740,7 @@ ``None``. Note: the name uses ``sendall`` instead of ``send``, to reflect that the semantics and signature of this method echo those of the standard library socket method ``sendall()`` rather than - ``send()``. (TBD: but maybe it would be better to emulate - ``send()`` after all? That would be better for datagram sockets.) + ``send()``. - ``sock_connect(sock, address)``. Connect to the given address. Returns a Future whose result on success will be ``None``. @@ -962,11 +961,16 @@ ``concurrent.futures.Future`` is explicitly mentioned. The supported public API is as follows, indicating the differences with PEP 3148: -- ``cancel()``. If the Future is already done (or cancelled), return - ``False``. Otherwise, change the Future's state to cancelled (this - implies done), schedule the callbacks, and return ``True``. +- ``cancel()``. If the Future is already done (or cancelled), do + nothing and return ``False``. Otherwise, this attempts to cancel + the Future and returns ``True``. If the the cancellation attempt is + successful, eventually the Future's state will change to cancelled + and the callbacks will be scheduled. For regular Futures, + cancellation will always succeed immediately; but for Tasks (see + below) the task may ignore or delay the cancellation attempt. -- ``cancelled()``. Returns ``True`` if the Future was cancelled. +- ``cancelled()``. Returns ``True`` if the Future was successfully + cancelled. - ``done()``. Returns ``True`` if the Future is done. Note that a cancelled Future is considered done too (here and everywhere). @@ -1031,8 +1035,7 @@ - ``TimeoutError``. An alias for ``concurrent.futures.TimeoutError``. May be raised by ``run_until_complete()``. -A Future is associated with the default event loop when it is created. -(TBD: Optionally pass in an alternative event loop instance?) +A Future is associated with an event loop when it is created. A ``asyncio.Future`` object is not acceptable to the ``wait()`` and ``as_completed()`` functions in the ``concurrent.futures`` package. @@ -1045,9 +1048,10 @@ and the Scheduler" below. 
When a Future is garbage-collected, if it has an associated exception -but neither ``result()`` nor ``exception()`` nor ``__iter__()`` has -ever been called (or the latter hasn't raised the exception yet -- -details TBD), the exception is logged. +but neither ``result()`` nor ``exception()`` has ever been called, the +exception is logged. (When a coroutine uses ``yield from`` to wait +for a Future, that Future's ``result()`` method is called once the +coroutine is resumed.) In the future (pun intended) we may unify ``asyncio.Future`` and ``concurrent.futures.Future``, e.g. by adding an ``__iter__()`` method @@ -1171,10 +1175,13 @@ - ``pause_reading()``. Suspend delivery of data to the protocol until a subsequent ``resume_reading()`` call. Between ``pause_reading()`` and ``resume_reading()``, the protocol's ``data_received()`` method - will not be called. This has no effect on ``write()``. + will not be called. - ``resume_reading()``. Restart delivery of data to the protocol via - ``data_received()``. + ``data_received()``. Note that "paused" is a binary state -- + ``pause_reading()`` should only be called when the transport is not + paused, while ``resume_reading()`` should only be called when the + transport is paused. - ``close()``. Sever the connection with the entity at the other end. Any data buffered by ``write()`` will (eventually) be transferred @@ -1187,17 +1194,7 @@ - ``abort()``. Immediately sever the connection. Any data still buffered by the transport is thrown away. Soon, the protocol's ``connection_lost()`` method will be called with ``None`` as - argument. (TBD: Distinguish in the ``connection_lost()`` argument - between ``close()``, ``abort()`` or a close initated by the other - end? Or add a transport method to inquire about this? Glyph's - proposal was to pass different exceptions for this purpose.) - -TBD: Provide flow control the other way -- the transport may need to -suspend the protocol if the amount of data buffered becomes a burden. -Proposal: let the transport call ``protocol.pause_writing()`` and -``protocol.resume_writing()`` if they exist; if they don't exist, the -protocol doesn't support flow control. (Perhaps different names -to avoid confusion between protocols and transports?) + argument. Unidirectional Stream Transports '''''''''''''''''''''''''''''''' @@ -1245,8 +1242,8 @@ transport is closed. The ``connection_refused()`` method is called before ``connection_lost()`` when ``remote_addr`` was given and an explicit negative acknowledgement was received (this is a UDP -feature). (TBD: Do we need the latter? It seems easy enough to -implement this in the protocol if it needs to make the distinction.) +feature). (TBD: Should fix `connection_refused()`` to not close the +transport.) Subprocess Transports ''''''''''''''''''''' @@ -1267,8 +1264,6 @@ of protocol is a bidirectional stream protocol. (There are no unidirectional protocols.) -(TBD: should protocol callbacks be allowed to be coroutines?) - Stream Protocols '''''''''''''''' @@ -1296,6 +1291,9 @@ ``write_eof()`` (or something equivalent). If this returns a false value (including None), the transport will close itself. If it returns a true value, closing the transport is up to the protocol. + However, for SSL/TLS connections this is ignored, because the TLS + standard requires that no more data is sent and the connection is + closed as soon as a "closure alert" is received. The default implementation returns None. 
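[Editorial aside -- not part of the PEP diff: a small sketch of the
Future/``yield from`` interplay described just above, using the
Tulip-era API; the main() coroutine here is invented for illustration.]

    import asyncio

    @asyncio.coroutine
    def main(loop):
        fut = asyncio.Future(loop=loop)
        # Arrange for the Future to be resolved shortly; ``yield from fut``
        # suspends main() until set_result() has run, then evaluates to
        # fut.result() -- which is also what marks an exception (if any)
        # as retrieved, avoiding the logged-on-GC case described above.
        loop.call_later(0.01, fut.set_result, 42)
        value = yield from fut
        print(value)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))
    loop.close()

[End of aside; the diff continues.]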
@@ -1303,8 +1301,7 @@ has detected that the other end has closed the connection cleanly, or has encountered an unexpected error. In the first three cases the argument is ``None``; for an unexpected error, the argument is - the exception that caused the transport to give up. (TBD: Do we - need to distinguish between the first three cases?) + the exception that caused the transport to give up. Here is a chart indicating the order and multiplicity of calls: @@ -1313,8 +1310,7 @@ 3. ``eof_received()`` -- at most once 4. ``connection_lost()`` -- exactly once -TBD: Discuss whether user code needs to do anything to make sure that -protocol and transport aren't garbage-collected prematurely. +TBD: Document ``pause_writing()`` and ``resume_writing()``. Datagram Protocols '''''''''''''''''' @@ -1502,9 +1498,6 @@ The coroutine ``asyncio.sleep(delay)`` returns after a given time delay. -(TBD: Should the optional second argument, ``result``, be part of the -spec?) - Tasks ----- @@ -1565,17 +1558,12 @@ TO DO ===== -- Document pause/resume reading/writing. +- Document pause/resume_writing. - Document subprocess/pipe protocols/transports. - Document locks and queues. -- Describe task cancellation details (and update future cancellation - spec to promise less). - -- Explain that eof_received() shouldn't return True with SSL/TLS. - - Document SIGCHILD handling API (once it lands). - Compare all APIs with the source code to be sure there aren't any -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 1 23:23:26 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 1 Nov 2013 23:23:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=5BPEP_451=5D_Update_the_sign?= =?utf-8?q?ature_of_find=5Fspec=28=29_and_remove_supports=5Freload=28=29?= =?utf-8?q?=2E?= Message-ID: <3dBHwt3L5Zz7LjZ@mail.python.org> http://hg.python.org/peps/rev/845e08302be8 changeset: 5244:845e08302be8 user: Eric Snow date: Fri Nov 01 16:19:06 2013 -0600 summary: [PEP 451] Update the signature of find_spec() and remove supports_reload(). files: pep-0451.txt | 238 +++++++++++++++++++++++++------------- 1 files changed, 156 insertions(+), 82 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -278,9 +278,9 @@ Other API Additions ------------------- -* importlib.find_spec(name, path=None) will work exactly the same as - importlib.find_loader() (which it replaces), but return a spec instead - of a loader. +* importlib.find_spec(name, path=None, existing=None) will work exactly + the same as importlib.find_loader() (which it replaces), but return a + spec instead of a loader. For finders: @@ -295,8 +295,6 @@ over its module execution functionality. * importlib.abc.Loader.create_module(spec) (optional) will return the module to use for loading. -* importlib.abc.Loader.supports_reload(name) (optional) will return True - (the default) if the loader supports reloading the module. For modules: @@ -374,13 +372,13 @@ finders: -* create loader -* create spec +* create/identify a loader that can load the module. +* create the spec for the module. loaders: -* create module (optional) -* execute module +* create the module (optional). +* execute the module. ModuleSpec: @@ -404,11 +402,9 @@ * Implement exec_module() on loaders, if possible. The ModuleSpec factory functions in importlib.util are intended to be -helpful for converting existing finders. from_loader() and -from_file_location() are both straight-forward utilities in this -regard. 
In the case where loaders already expose methods for creating -and preparing modules, ModuleSpec.from_module() may be useful to -the corresponding finder. +helpful for converting existing finders. spec_from_loader() and +spec_from_file_location() are both straight-forward utilities in this +regard. For existing loaders, exec_module() should be a relatively direct conversion from the non-boilerplate portion of load_module(). In some @@ -497,7 +493,7 @@ name = module.__spec__.name except AttributeError: name = module.__name__ - spec = find_spec(name) + spec = find_spec(name, existing=module) if sys.modules.get(name) is not module: raise ImportError @@ -509,8 +505,6 @@ # namespace loader _init_module_attrs(spec, module) return module - if not spec.loader.supports_reload(name): - raise ImportError if spec.parent and spec.parent not in sys.modules: raise ImportError @@ -520,6 +514,19 @@ finally: del _RELOADING[name] +A key point here is the switch to Loader.exec_module() means that +loaders will no longer have an easy way to know at execution time if it +is a reload or not. Before this proposal, they could simply check to +see if the module was already in sys.modules. Now, by the time +exec_module() is called during load (not reload) the import machinery +would already have placed the module in sys.modules. This is part of +the reason why find_spec() has +`the "existing" parameter `_. + +The semantics of reload will remain essentially the same as they exist +already [reload-semantics-fix]_. The impact of this PEP on some kinds +of lazy loading modules was a point of discussion. [lazy_import_concerns]_ + ModuleSpec ========== @@ -627,7 +634,7 @@ * "submodule_search_locations" can be deduced from loader.is_package() and from os.path.dirname(location) if location is a filename. -**from_loader(name, loader, \*, origin=None, is_package=None)** +**spec_from_loader(name, loader, \*, origin=None, is_package=None)** Build a spec with missing information filled in by using loader APIs. @@ -636,45 +643,6 @@ * "submodule_search_locations" can be deduced from loader.is_package() and from os.path.dirname(location) if location is a filename. -**spec_from_module(module, loader=None)** - -Build a spec based on the import-related attributes of an existing -module. The spec attributes are set to the corresponding import- -related module attributes. See the table in `Attributes`_. - -Omitted Attributes and Methods ------------------------------- - -There is no "PathModuleSpec" subclass of ModuleSpec that separates out -has_location, cached, and submodule_search_locations. While that might -make the separation cleaner, module objects don't have that distinction. -ModuleSpec will support both cases equally well. - -While "is_package" would be a simple additional attribute (aliasing -self.submodule_search_locations is not None), it perpetuates the -artificial (and mostly erroneous) distinction between modules and -packages. - -Conceivably, a ModuleSpec.load() method could optionally take a list of -modules with which to interact instead of sys.modules. That -capability is left out of this PEP, but may be pursued separately at -some other time, including relative to PEP 406 (import engine). - -Likewise load() could be leveraged to implement multi-version -imports. While interesting, doing so is outside the scope of this -proposal. - -Others: - -* Add ModuleSpec.submodules (RO-property) - returns possible submodules - relative to the spec. -* Add ModuleSpec.loaded (RO-property) - the module in sys.module, if - any. 
-* Add ModuleSpec.data - a descriptor that wraps the data API of the - spec's loader. -* Also see [cleaner_reload_support]_. - - Backward Compatibility ---------------------- @@ -722,15 +690,16 @@ Finders ------- -Finders are still responsible for creating the loader. That loader will +Finders are still responsible for identifying, an typically creating, +the loader that should be used to load a module. That loader will now be stored in the module spec returned by find_spec() rather than returned directly. As is currently the case without the PEP, if a loader would be costly to create, that loader can be designed to defer the cost until later. -**MetaPathFinder.find_spec(name, path=None)** +**MetaPathFinder.find_spec(name, path=None, existing=None)** -**PathEntryFinder.find_spec(name)** +**PathEntryFinder.find_spec(name, existing=None)** Finders must return ModuleSpec objects when find_spec() is called. This new method replaces find_module() and @@ -745,6 +714,42 @@ added in Python 3.3. However, the extra complexity and a less-than- explicit method name aren't worth it. +The "existing" parameter of find_spec() +--------------------------------------- + +A module object with the same name as the "name" argument (or None, the +default) should be passed in to "exising". This argument allows the +finder to build the module spec with more information than is otherwise +available. This is particularly relevant in identifying the loader to +use. + +Through find_spec() the finder will always identify the loader it +will return in the spec. In the case of reload, at this point the +finder should also decide whether or not the loader supports loading +into the module-to-be-reloaded (which was passed in to find_spec() as +"existing"). This decision may entail consulting with the loader. If +the finder determines that the loader does not support reloading that +module, it should either find another loader or return None (indicating +that it could not "find" the module). This reload decision is important +since, as noted in `How Reloading Will Work`_, loaders will no longer be +able to trivially identify a reload situation on their own. + +Two alternatives were presented to the "existing" parameter: +Loader.supports_reload() and adding "existing" to Loader.exec_module() +instead of find_spec(). supports_reload() was the initial approach to +the reload situation. [supports_reload]_ However, there was some +opposition to the loader-specific, reload-centric approach. +[supports_reload_considered_harmful]_ + +As to "existing" on exec_module(), the loader may need other information +from the existing module (or spec) during reload, more than just "does +this loader support reloading this module", that is no longer available +with the move away from load_module(). A proposal on the table was to +add something like "existing" to exec_module(). [exec_module_existing]_ +However, putting "existing" on find_spec() instead is more in line with +the goals of this PEP. Furthermore, it obviates the need for +supports_reload(). + Namespace Packages ------------------ @@ -791,13 +796,6 @@ module attributes. The fact that load_module() does is a design flaw that this proposal aims to correct. -**Loader.supports_reload(name)** - -In cases where a module should not be reloaded, Loaders should implement -supports_reload() and have it return False. If the method is defined -and returns a false value, importlib.reload() will raise an ImportError. -Otherwise reloading proceeds as normal. 
- Other changes: PEP 420 introduced the optional module_repr() loader method to limit @@ -837,24 +835,27 @@ * importlib.reload() will now make use of the per-module import lock. +Open Issues +=========== + +* In the `Finders`_ section, the PEP specifies returning None (or using +a different loader) when the found loader does not support loading into +an existing module (e.g during reload). An alternative to returning +None would be to raise ImportError with a message like "the loader does +not support reloading the module". This may actually be a better +approach since "could not find a loader" and "the found loader won't +work" are different situations that a single return value (None) may not +sufficiently represent. + + Reference Implementation ======================== -A reference implementation will be available at +A reference implementation is available at http://bugs.python.org/issue18864. - -Open Issues -============== - -\* Impact on some kinds of lazy loading modules. [lazy_import_concerns]_ - -This should not be an issue since the PEP does not change the semantics -of this behavior. - - Implementation Notes -==================== +-------------------- \* The implementation of this PEP needs to be cognizant of its impact on pkgutil (and setuptools). pkgutil has some generic function-based @@ -868,17 +869,90 @@ at ``module.__spec__.name``. +Rejected Additions to the PEP +============================= + +There were a few proposed additions to this proposal that did not fit +well enough into its scope. + +There is no "PathModuleSpec" subclass of ModuleSpec that separates out +has_location, cached, and submodule_search_locations. While that might +make the separation cleaner, module objects don't have that distinction. +ModuleSpec will support both cases equally well. + +While "ModuleSpec.is_package" would be a simple additional attribute +(aliasing self.submodule_search_locations is not None), it perpetuates +the artificial (and mostly erroneous) distinction between modules and +packages. + +Others left out: + +* Add ModuleSpec.submodules (RO-property) - returns possible submodules + relative to the spec. +* Add ModuleSpec.loaded (RO-property) - the module in sys.module, if + any. +* Add ModuleSpec.data - a descriptor that wraps the data API of the + spec's loader. +* Also see [cleaner_reload_support]_. + +The module spec `Factory Functions`_ could be classmethods on +ModuleSpec. However that would expose them on *all* modules via +``__spec__``, which has the potential to unnecessarily confuse +non-advanced Python users. The factory functions have a specific use +case, to support finder authors. See `ModuleSpec Users`_. + +Likewise, several other methods could be added to ModuleSpec that expose +the specific uses of module specs by the import machinery: + +* create() - a wrapper around Loader.create_module(). +* exec(module) - a wrapper around Loader.exec_module(). +* load() - an analogue to the deprecated Loader.load_module(). + +As with the factory functions, exposing these methods via +module.__spec__ is less than desireable. They would end up being an +attractive nuisance, even if only exposed as "private" attributes (as +they were in previous versions of this PEP). If someone finds a need +for these methods later, we can expose the via an appropriate API +(separate from ModuleSpec) at that point, perhaps relative to PEP 406 +(import engine). + +Conceivably, the load() method could optionally take a list of +modules with which to interact instead of sys.modules. 
Also, load() +could be leveraged to implement multi-version imports. Both are +interesting ideas, but definitely outside the scope of this proposal. + +Others left out: + +* Add ModuleSpec.submodules (RO-property) - returns possible submodules + relative to the spec. +* Add ModuleSpec.loaded (RO-property) - the module in sys.module, if + any. +* Add ModuleSpec.data - a descriptor that wraps the data API of the + spec's loader. +* Also see [cleaner_reload_support]_. + + References ========== -.. [ref_files_pep] http://mail.python.org/pipermail/import-sig/2013-August/000658.html +.. [ref_files_pep] + http://mail.python.org/pipermail/import-sig/2013-August/000658.html .. [import_system_docs] http://docs.python.org/3/reference/import.html -.. [cleaner_reload_support] https://mail.python.org/pipermail/import-sig/2013-September/000735.html +.. [cleaner_reload_support] + https://mail.python.org/pipermail/import-sig/2013-September/000735.html -.. [lazy_import_concerns] https://mail.python.org/pipermail/python-dev/2013-August/128129.html +.. [lazy_import_concerns] + https://mail.python.org/pipermail/python-dev/2013-August/128129.html +.. [reload-semantics-fix] http://bugs.python.org/issue19413 + +.. [supports_reload_considered_harmful] + https://mail.python.org/pipermail/python-dev/2013-October/129971.html + +.. [exec_module_existing] + https://mail.python.org/pipermail/python-dev/2013-October/129933.html Copyright ========= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 1 23:38:05 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 1 Nov 2013 23:38:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=5BPEP_451=5D_It_pays_to_run_?= =?utf-8?q?make_before_committing=2E?= Message-ID: <3dBJFn3xFRz7LjZ@mail.python.org> http://hg.python.org/peps/rev/1ad0ddb3070b changeset: 5245:1ad0ddb3070b user: Eric Snow date: Fri Nov 01 16:33:49 2013 -0600 summary: [PEP 451] It pays to run make before committing. files: pep-0451.txt | 44 ++++++++++++++++----------------------- 1 files changed, 18 insertions(+), 26 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -35,7 +35,7 @@ with which people may not be so familiar. For the sake of context, here is a brief summary of all three groups of terms and concepts. A more detailed explanation of the import system is found at -[import_system_docs]_. +[#import_system_docs]_. name ---- @@ -180,7 +180,7 @@ there's an API void between finders and loaders that causes undue complexity when encountered. The PEP 420 (namespace packages) implementation had to work around this. The complexity surfaced again -during recent efforts on a separate proposal. [ref_files_pep]_ +during recent efforts on a separate proposal. [#ref_files_pep]_ The `finder`_ and `loader`_ sections above detail current responsibility of both. Notably, loaders are not required to provide any of the @@ -524,8 +524,8 @@ `the "existing" parameter `_. The semantics of reload will remain essentially the same as they exist -already [reload-semantics-fix]_. The impact of this PEP on some kinds -of lazy loading modules was a point of discussion. [lazy_import_concerns]_ +already [#reload-semantics-fix]_. The impact of this PEP on some kinds +of lazy loading modules was a point of discussion. [#lazy_import_concerns]_ ModuleSpec @@ -737,15 +737,15 @@ Two alternatives were presented to the "existing" parameter: Loader.supports_reload() and adding "existing" to Loader.exec_module() instead of find_spec(). 
supports_reload() was the initial approach to -the reload situation. [supports_reload]_ However, there was some +the reload situation. [#supports_reload]_ However, there was some opposition to the loader-specific, reload-centric approach. -[supports_reload_considered_harmful]_ +[#supports_reload_considered_harmful]_ As to "existing" on exec_module(), the loader may need other information from the existing module (or spec) during reload, more than just "does this loader support reloading this module", that is no longer available with the move away from load_module(). A proposal on the table was to -add something like "existing" to exec_module(). [exec_module_existing]_ +add something like "existing" to exec_module(). [#exec_module_existing]_ However, putting "existing" on find_spec() instead is more in line with the goals of this PEP. Furthermore, it obviates the need for supports_reload(). @@ -838,7 +838,7 @@ Open Issues =========== -* In the `Finders`_ section, the PEP specifies returning None (or using +\* In the `Finders`_ section, the PEP specifies returning None (or using a different loader) when the found loader does not support loading into an existing module (e.g during reload). An alternative to returning None would be to raise ImportError with a message like "the loader does @@ -885,16 +885,6 @@ the artificial (and mostly erroneous) distinction between modules and packages. -Others left out: - -* Add ModuleSpec.submodules (RO-property) - returns possible submodules - relative to the spec. -* Add ModuleSpec.loaded (RO-property) - the module in sys.module, if - any. -* Add ModuleSpec.data - a descriptor that wraps the data API of the - spec's loader. -* Also see [cleaner_reload_support]_. - The module spec `Factory Functions`_ could be classmethods on ModuleSpec. However that would expose them on *all* modules via ``__spec__``, which has the potential to unnecessarily confuse @@ -929,29 +919,31 @@ any. * Add ModuleSpec.data - a descriptor that wraps the data API of the spec's loader. -* Also see [cleaner_reload_support]_. +* Also see [#cleaner_reload_support]_. References ========== -.. [ref_files_pep] +.. [#ref_files_pep] http://mail.python.org/pipermail/import-sig/2013-August/000658.html -.. [import_system_docs] http://docs.python.org/3/reference/import.html +.. [#import_system_docs] http://docs.python.org/3/reference/import.html -.. [cleaner_reload_support] +.. [#cleaner_reload_support] https://mail.python.org/pipermail/import-sig/2013-September/000735.html -.. [lazy_import_concerns] +.. [#lazy_import_concerns] https://mail.python.org/pipermail/python-dev/2013-August/128129.html -.. [reload-semantics-fix] http://bugs.python.org/issue19413 +.. [#reload-semantics-fix] http://bugs.python.org/issue19413 -.. [supports_reload_considered_harmful] +.. [#supports_reload] + https://mail.python.org/pipermail/python-dev/2013-October/129913.html +.. [#supports_reload_considered_harmful] https://mail.python.org/pipermail/python-dev/2013-October/129971.html -.. [exec_module_existing] +.. 
[#exec_module_existing] https://mail.python.org/pipermail/python-dev/2013-October/129933.html Copyright -- Repository URL: http://hg.python.org/peps From solipsis at pitrou.net Sat Nov 2 07:32:44 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 02 Nov 2013 07:32:44 +0100 Subject: [Python-checkins] Daily reference leaks (123804a72a8f): sum=8 Message-ID: results for 123804a72a8f on branch "default" -------------------------------------------- test_asyncio leaked [0, 0, 4] memory blocks, sum=4 test_site leaked [2, -2, 2] references, sum=2 test_site leaked [2, -2, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogDVDFGr', '-x'] From python-checkins at python.org Sat Nov 2 09:48:35 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:48:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Added_basic_tests_for_all_tkinter_widget_options=2E?= Message-ID: <3dBYpC5vfXz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/92e268f2719e changeset: 86834:92e268f2719e branch: 3.3 parent: 86814:731bdec35fdd user: Serhiy Storchaka date: Sat Nov 02 10:41:48 2013 +0200 summary: Issue #19085: Added basic tests for all tkinter widget options. files: Lib/tkinter/test/support.py | 39 + Lib/tkinter/test/test_tkinter/test_widgets.py | 919 ++++++++++ Lib/tkinter/test/test_ttk/test_widgets.py | 493 +++++- Lib/tkinter/test/widget_tests.py | 487 +++++ Misc/NEWS | 5 + 5 files changed, 1920 insertions(+), 23 deletions(-) diff --git a/Lib/tkinter/test/support.py b/Lib/tkinter/test/support.py --- a/Lib/tkinter/test/support.py +++ b/Lib/tkinter/test/support.py @@ -77,3 +77,42 @@ widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) + + +import _tkinter +tcl_version = tuple(map(int, _tkinter.TCL_VERSION.split('.'))) + +def requires_tcl(*version): + return unittest.skipUnless(tcl_version >= version, + 'requires Tcl version >= ' + '.'.join(map(str, version))) + +units = { + 'c': 72 / 2.54, # centimeters + 'i': 72, # inches + 'm': 72 / 25.4, # millimeters + 'p': 1, # points +} + +def pixels_conv(value): + return float(value[:-1]) * units[value[-1:]] + +def tcl_obj_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, _tkinter.Tcl_Obj): + if isinstance(expected, str): + return str(actual) == expected + if isinstance(actual, tuple): + if isinstance(expected, tuple): + return (len(actual) == len(expected) and + all(tcl_obj_eq(act, exp) + for act, exp in zip(actual, expected))) + return False + +def widget_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, (str, tkinter.Widget)): + if isinstance(expected, (str, tkinter.Widget)): + return str(actual) == str(expected) + return False diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py new file mode 100644 --- /dev/null +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -0,0 +1,919 @@ +import unittest +import tkinter +import os +from test.support import requires + +from tkinter.test.support import tcl_version, requires_tcl, widget_eq +from tkinter.test.widget_tests import (add_standard_options, noconv, + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + +requires('gui') + + +def float_round(x): + return float(round(x)) + + +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + 
_conv_pad_pixels = noconv + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], + widget.__class__.__name__.title()) + self.checkInvalidParam(widget, 'class', 'Foo', + errmsg="can't modify -class option after widget is created") + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_colormap(self): + widget = self.create() + self.assertEqual(widget['colormap'], '') + self.checkInvalidParam(widget, 'colormap', 'new', + errmsg="can't modify -colormap option after widget is created") + widget2 = self.create(colormap='new') + self.assertEqual(widget2['colormap'], 'new') + + def test_container(self): + widget = self.create() + self.assertEqual(widget['container'], 0 if self.wantobjects else '0') + self.checkInvalidParam(widget, 'container', 1, + errmsg="can't modify -container option after widget is created") + widget2 = self.create(container=True) + self.assertEqual(widget2['container'], 1 if self.wantobjects else '1') + + def test_visual(self): + widget = self.create() + self.assertEqual(widget['visual'], '') + self.checkInvalidParam(widget, 'visual', 'default', + errmsg="can't modify -visual option after widget is created") + widget2 = self.create(visual='default') + self.assertEqual(widget2['visual'], 'default') + + + at add_standard_options(StandardOptionsTests) +class ToplevelTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'menu', 'padx', 'pady', 'relief', 'screen', + 'takefocus', 'use', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.Toplevel(self.root, **kwargs) + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(self.root) + self.checkParam(widget, 'menu', menu, eq=widget_eq) + self.checkParam(widget, 'menu', '') + + def test_screen(self): + widget = self.create() + self.assertEqual(widget['screen'], '') + display = os.environ['DISPLAY'] + self.checkInvalidParam(widget, 'screen', display, + errmsg="can't modify -screen option after widget is created") + widget2 = self.create(screen=display) + self.assertEqual(widget2['screen'], display) + + def test_use(self): + widget = self.create() + self.assertEqual(widget['use'], '') + widget1 = self.create(container=True) + self.assertEqual(widget1['use'], '') + self.checkInvalidParam(widget1, 'use', '0x44022', + errmsg="can't modify -use option after widget is created") + wid = hex(widget1.winfo_id()) + widget2 = self.create(use=wid) + self.assertEqual(widget2['use'], wid) + + + at add_standard_options(StandardOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'relief', 'takefocus', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.Frame(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'labelanchor', 'labelwidget', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.LabelFrame(self.root, **kwargs) + + def 
test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', + 's', 'se', 'sw', 'w', 'wn', 'ws') + self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = tkinter.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + +class AbstractLabelTest(AbstractWidgetTest, IntegerSizeTests): + _conv_pixels = noconv + + def test_highlightthickness(self): + widget = self.create() + self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, -2, '10p') + + + at add_standard_options(StandardOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Label(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', 'default', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'overrelief', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'state', 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength') + + def _create(self, **kwargs): + return tkinter.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() + self.checkEnumParam(widget, 'default', 'active', 'disabled', 'normal') + + + at add_standard_options(StandardOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', + 'offrelief', 'offvalue', 'onvalue', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 'tristatevalue', + 'underline', 'variable', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Checkbutton(self.root, **kwargs) + + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'offrelief', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 
'tristatevalue', + 'underline', 'value', 'variable', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'compound', 'cursor', 'direction', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'menu', + 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = AbstractWidgetTest._conv_pixels + + def _create(self, **kwargs): + return tkinter.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'flush', 'left', 'right') + + def test_height(self): + widget = self.create() + self.checkIntegerParam(widget, 'height', 100, -100, 0, conv=str) + + test_highlightthickness = StandardOptionsTests.test_highlightthickness + + def test_image(self): + widget = self.create() + image = tkinter.PhotoImage('image1') + self.checkParam(widget, 'image', image, conv=str) + errmsg = 'image "spam" doesn\'t exist' + with self.assertRaises(tkinter.TclError) as cm: + widget['image'] = 'spam' + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + with self.assertRaises(tkinter.TclError) as cm: + widget.configure({'image': 'spam'}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, eq=widget_eq) + menu.destroy() + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'padx', -2, expected=0) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'pady', -2, expected=0) + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402, -402, 0, conv=str) + + +class OptionMenuTest(MenubuttonTest, unittest.TestCase): + + def _create(self, default='b', values=('a', 'b', 'c'), **kwargs): + return tkinter.OptionMenu(self.root, None, default, *values, **kwargs) + + + at add_standard_options(IntegerSizeTests, StandardOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'readonlybackground', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'show', 'state', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Entry(self.root, **kwargs) + + def test_disabledbackground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 
'insertborderwidth', 0, 1.3, -2) + self.checkParam(widget, 'insertborderwidth', 2, expected=1) + self.checkParam(widget, 'insertborderwidth', '10p', expected=1) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') + if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.9, expected=2) + else: + self.checkParam(widget, 'insertwidth', 0.9, expected=1) + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + self.checkCommandParam(widget, 'invcmd') + + def test_readonlybackground(self): + widget = self.create() + self.checkColorParam(widget, 'readonlybackground') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + widget = self.create() + self.checkCommandParam(widget, 'validatecommand') + self.checkCommandParam(widget, 'vcmd') + + + at add_standard_options(StandardOptionsTests) +class SpinboxTest(EntryTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'borderwidth', + 'buttonbackground', 'buttoncursor', 'buttondownrelief', 'buttonuprelief', + 'command', 'cursor', 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', 'format', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'increment', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'relief', 'readonlybackground', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', 'textvariable', 'to', + 'validate', 'validatecommand', 'values', + 'width', 'wrap', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Spinbox(self.root, **kwargs) + + test_show = None + + def test_buttonbackground(self): + widget = self.create() + self.checkColorParam(widget, 'buttonbackground') + + def test_buttoncursor(self): + widget = self.create() + self.checkCursorParam(widget, 'buttoncursor') + + def test_buttondownrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'buttondownrelief') + + def test_buttonuprelief(self): + widget = self.create() + self.checkReliefParam(widget, 'buttonuprelief') + + def test_format(self): + widget = self.create() + self.checkParam(widget, 'format', '%2f') + self.checkParam(widget, 'format', '%2.2f') + self.checkParam(widget, 'format', '%.2f') + self.checkParam(widget, 'format', '%2.f') + self.checkInvalidParam(widget, 'format', '%2e-1f') + self.checkInvalidParam(widget, 'format', '2.2') + self.checkInvalidParam(widget, 'format', '%2.-2f') + self.checkParam(widget, 'format', '%-2.02f') + self.checkParam(widget, 'format', '% 2.02f') + self.checkParam(widget, 'format', '% -2.200f') + self.checkParam(widget, 'format', '%09.200f') + self.checkInvalidParam(widget, 'format', '%d') + + def test_from(self): + widget = self.create() + self.checkParam(widget, 'to', 100.0) + self.checkFloatParam(widget, 'from', 
-10, 10.2, 11.7) + self.checkInvalidParam(widget, 'from', 200, + errmsg='-to value must be greater than -from value') + + def test_increment(self): + widget = self.create() + self.checkFloatParam(widget, 'increment', -1, 1, 10.2, 12.8, 0) + + def test_to(self): + widget = self.create() + self.checkParam(widget, 'from', -100.0) + self.checkFloatParam(widget, 'to', -10, 10.2, 11.7) + self.checkInvalidParam(widget, 'to', -200, + errmsg='-to value must be greater than -from value') + + def test_values(self): + # XXX + widget = self.create() + self.assertEqual(widget['values'], '') + self.checkParam(widget, 'values', 'mon tue wed thur') + self.checkParam(widget, 'values', ('mon', 'tue', 'wed', 'thur'), + expected='mon tue wed thur') + self.checkParam(widget, 'values', (42, 3.14, '', 'any string'), + expected='42 3.14 {} {any string}') + self.checkParam(widget, 'values', '') + + def test_wrap(self): + widget = self.create() + self.checkBooleanParam(widget, 'wrap') + + + at add_standard_options(StandardOptionsTests) +class TextTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'autoseparators', 'background', 'blockcursor', 'borderwidth', + 'cursor', 'endline', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'inactiveselectbackground', 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertunfocussed', 'insertwidth', + 'maxundo', 'padx', 'pady', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'spacing1', 'spacing2', 'spacing3', 'startline', 'state', + 'tabs', 'tabstyle', 'takefocus', 'undo', 'width', 'wrap', + 'xscrollcommand', 'yscrollcommand', + ) + if tcl_version < (8, 5): + wantobjects = False + + def _create(self, **kwargs): + return tkinter.Text(self.root, **kwargs) + + def test_autoseparators(self): + widget = self.create() + self.checkBooleanParam(widget, 'autoseparators') + + @requires_tcl(8, 5) + def test_blockcursor(self): + widget = self.create() + self.checkBooleanParam(widget, 'blockcursor') + + @requires_tcl(8, 5) + def test_endline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'endline', 200, expected='') + self.checkParam(widget, 'endline', -10, expected='') + self.checkInvalidParam(widget, 'endline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'endline', 50) + self.checkParam(widget, 'startline', 15) + self.checkInvalidParam(widget, 'endline', 10, + errmsg='-startline must be less than or equal to -endline') + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, '3c') + self.checkParam(widget, 'height', -100, expected=1) + self.checkParam(widget, 'height', 0, expected=1) + + def test_maxundo(self): + widget = self.create() + self.checkIntegerParam(widget, 'maxundo', 0, 5, -1) + + @requires_tcl(8, 5) + def test_inactiveselectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'inactiveselectbackground') + + @requires_tcl(8, 6) + def test_insertunfocussed(self): + widget = self.create() + self.checkEnumParam(widget, 'insertunfocussed', + 'hollow', 'none', 'solid') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', + 1.3, 2.6, -2, '10p', conv=False, + keep_orig=tcl_version >= (8, 5)) + + def test_spacing1(self): + widget = self.create() + self.checkPixelsParam(widget, 
'spacing1', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing1', -5, expected=0) + + def test_spacing2(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing2', 5, 6.4, 7.6, '0.1c') + self.checkParam(widget, 'spacing2', -1, expected=0) + + def test_spacing3(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing3', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing3', -10, expected=0) + + @requires_tcl(8, 5) + def test_startline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'startline', 200, expected='') + self.checkParam(widget, 'startline', -10, expected='') + self.checkInvalidParam(widget, 'startline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'startline', 10) + self.checkParam(widget, 'endline', 50) + self.checkInvalidParam(widget, 'startline', 70, + errmsg='-startline must be less than or equal to -endline') + + def test_state(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'state', 'disabled', 'normal') + else: + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + def test_tabs(self): + widget = self.create() + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) + self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', + expected=('10.2', '20.7', '1i', '2i')) + self.checkParam(widget, 'tabs', '2c left 4c 6c center', + expected=('2c', 'left', '4c', '6c', 'center')) + self.checkInvalidParam(widget, 'tabs', 'spam', + errmsg='bad screen distance "spam"', + keep_orig=tcl_version >= (8, 5)) + + @requires_tcl(8, 5) + def test_tabstyle(self): + widget = self.create() + self.checkEnumParam(widget, 'tabstyle', 'tabular', 'wordprocessor') + + def test_undo(self): + widget = self.create() + self.checkBooleanParam(widget, 'undo') + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402) + self.checkParam(widget, 'width', -402, expected=1) + self.checkParam(widget, 'width', 0, expected=1) + + def test_wrap(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'wrap', 'char', 'none', 'word') + else: + self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class CanvasTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'closeenough', 'confine', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'relief', 'scrollregion', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', + 'xscrollcommand', 'xscrollincrement', + 'yscrollcommand', 'yscrollincrement', 'width', + ) + + _conv_pixels = round + wantobjects = False + + def _create(self, **kwargs): + return tkinter.Canvas(self.root, **kwargs) + + def test_closeenough(self): + widget = self.create() + self.checkFloatParam(widget, 'closeenough', 24, 2.4, 3.6, -3, + conv=float) + + def test_confine(self): + widget = self.create() + self.checkBooleanParam(widget, 'confine') + + def test_scrollregion(self): + widget = self.create() + self.checkParam(widget, 'scrollregion', '0 0 200 150') + self.checkParam(widget, 'scrollregion', (0, 0, 200, 150), + expected='0 0 200 150') + self.checkParam(widget, 'scrollregion', '') + self.checkInvalidParam(widget, 'scrollregion', 'spam', + errmsg='bad 
scrollRegion "spam"') + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 'spam')) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200)) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 150, 0)) + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal', + errmsg='bad state value "{}": must be normal or disabled') + + def test_xscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'xscrollincrement', + 40, 0, 41.2, 43.6, -40, '0.5i') + + def test_yscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'yscrollincrement', + 10, 0, 11.2, 13.6, -10, '0.1i') + + + at add_standard_options(IntegerSizeTests, StandardOptionsTests) +class ListboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activestyle', 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'listvariable', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'selectmode', 'setgrid', 'state', + 'takefocus', 'width', 'xscrollcommand', 'yscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Listbox(self.root, **kwargs) + + def test_activestyle(self): + widget = self.create() + self.checkEnumParam(widget, 'activestyle', + 'dotbox', 'none', 'underline') + + def test_listvariable(self): + widget = self.create() + var = tkinter.DoubleVar() + self.checkVariableParam(widget, 'listvariable', var) + + def test_selectmode(self): + widget = self.create() + self.checkParam(widget, 'selectmode', 'single') + self.checkParam(widget, 'selectmode', 'browse') + self.checkParam(widget, 'selectmode', 'multiple') + self.checkParam(widget, 'selectmode', 'extended') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'bigincrement', 'borderwidth', + 'command', 'cursor', 'digits', 'font', 'foreground', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'label', 'length', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'resolution', 'showvalue', 'sliderlength', 'sliderrelief', 'state', + 'takefocus', 'tickinterval', 'to', 'troughcolor', 'variable', 'width', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return tkinter.Scale(self.root, **kwargs) + + def test_bigincrement(self): + widget = self.create() + self.checkFloatParam(widget, 'bigincrement', 12.4, 23.6, -5) + + def test_digits(self): + widget = self.create() + self.checkIntegerParam(widget, 'digits', 5, 0) + + def test_from(self): + widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=float_round) + + def test_label(self): + widget = self.create() + self.checkParam(widget, 'label', 'any string') + self.checkParam(widget, 'label', '') + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_resolution(self): + widget = self.create() + self.checkFloatParam(widget, 'resolution', 4.2, 0, 6.7, -2) + + def test_showvalue(self): + widget = self.create() + self.checkBooleanParam(widget, 'showvalue') + + def test_sliderlength(self): + widget = self.create() + self.checkPixelsParam(widget, 'sliderlength', + 10, 11.2, 
15.6, -3, '3m') + + def test_sliderrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sliderrelief') + + def test_tickinterval(self): + widget = self.create() + self.checkFloatParam(widget, 'tickinterval', 1, 4.3, 7.6, 0, + conv=float_round) + self.checkParam(widget, 'tickinterval', -2, expected=2, + conv=float_round) + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, + conv=float_round) + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activerelief', + 'background', 'borderwidth', + 'command', 'cursor', 'elementborderwidth', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'jump', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'takefocus', 'troughcolor', 'width', + ) + _conv_pixels = round + wantobjects = False + default_orient = 'vertical' + + def _create(self, **kwargs): + return tkinter.Scrollbar(self.root, **kwargs) + + def test_activerelief(self): + widget = self.create() + self.checkReliefParam(widget, 'activerelief') + + def test_elementborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'elementborderwidth', 4.3, 5.6, -2, '1m') + + def test_orient(self): + widget = self.create() + self.checkEnumParam(widget, 'orient', 'vertical', 'horizontal', + errmsg='bad orientation "{}": must be vertical or horizontal') + + + at add_standard_options(StandardOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'handlepad', 'handlesize', 'height', + 'opaqueresize', 'orient', 'relief', + 'sashcursor', 'sashpad', 'sashrelief', 'sashwidth', + 'showhandle', 'width', + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return tkinter.PanedWindow(self.root, **kwargs) + + def test_handlepad(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlepad', 5, 6.4, 7.6, -3, '1m') + + def test_handlesize(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlesize', 8, 9.4, 10.6, -3, '2m', + conv=noconv) + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i', + conv=noconv) + + def test_opaqueresize(self): + widget = self.create() + self.checkBooleanParam(widget, 'opaqueresize') + + def test_sashcursor(self): + widget = self.create() + self.checkCursorParam(widget, 'sashcursor') + + def test_sashpad(self): + widget = self.create() + self.checkPixelsParam(widget, 'sashpad', 8, 1.3, 2.6, -2, '2m') + + def test_sashrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sashrelief') + + def test_sashwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'sashwidth', 10, 11.1, 15.6, -3, '1m', + conv=noconv) + + def test_showhandle(self): + widget = self.create() + self.checkBooleanParam(widget, 'showhandle') + + def test_width(self): + widget = self.create() + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i', + conv=noconv) + + + at add_standard_options(StandardOptionsTests) +class MenuTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', + 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'font', 'foreground', + 'postcommand', 'relief', 'selectcolor', 'takefocus', + 'tearoff', 'tearoffcommand', 'title', 'type', + ) + _conv_pixels = noconv + + 
def _create(self, **kwargs): + return tkinter.Menu(self.root, **kwargs) + + def test_postcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'postcommand') + + def test_tearoff(self): + widget = self.create() + self.checkBooleanParam(widget, 'tearoff') + + def test_tearoffcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'tearoffcommand') + + def test_title(self): + widget = self.create() + self.checkParam(widget, 'title', 'any string') + + def test_type(self): + widget = self.create() + self.checkEnumParam(widget, 'type', + 'normal', 'tearoff', 'menubar') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class MessageTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'aspect', 'background', 'borderwidth', + 'cursor', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'justify', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'textvariable', 'width', + ) + _conv_pad_pixels = noconv + + def _create(self, **kwargs): + return tkinter.Message(self.root, **kwargs) + + def test_aspect(self): + widget = self.create() + self.checkIntegerParam(widget, 'aspect', 250, 0, -300) + + +tests_gui = ( + ButtonTest, CanvasTest, CheckbuttonTest, EntryTest, + FrameTest, LabelFrameTest,LabelTest, ListboxTest, + MenubuttonTest, MenuTest, MessageTest, OptionMenuTest, + PanedWindowTest, RadiobuttonTest, ScaleTest, ScrollbarTest, + SpinboxTest, TextTest, ToplevelTest, +) + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -1,15 +1,57 @@ import unittest import tkinter -import os from tkinter import ttk -from test.support import requires, run_unittest +from test.support import requires import sys import tkinter.test.support as support -from tkinter.test.test_ttk.test_functions import MockTclObj, MockStateSpec +from tkinter.test.test_ttk.test_functions import MockTclObj +from tkinter.test.support import tcl_version +from tkinter.test.widget_tests import (add_standard_options, noconv, + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) requires('gui') + +class StandardTtkOptionsTests(StandardOptionsTests): + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], '') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + self.checkInvalidParam(widget, 'class', 'Foo', errmsg=errmsg) + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_padding(self): + widget = self.create() + self.checkParam(widget, 'padding', 0, expected=('0',)) + self.checkParam(widget, 'padding', 5, expected=('5',)) + self.checkParam(widget, 'padding', (5, 6), expected=('5', '6')) + self.checkParam(widget, 'padding', (5, 6, 7), + expected=('5', '6', '7')) + self.checkParam(widget, 'padding', (5, 6, 7, 8), + expected=('5', '6', '7', '8')) + self.checkParam(widget, 'padding', ('5p', '6p', '7p', '8p')) + self.checkParam(widget, 'padding', (), expected='') + + def test_style(self): + widget = self.create() + self.assertEqual(widget['style'], '') + errmsg = 'Layout Foo not found' + if hasattr(self, 'default_orient'): + errmsg = ('Layout %s.Foo not found' % + getattr(self, 'default_orient').title()) + self.checkInvalidParam(widget, 'style', 'Foo', + errmsg=errmsg) + widget2 = 
self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + # XXX + pass + + class WidgetTest(unittest.TestCase): """Tests methods available in every ttk widget.""" @@ -73,7 +115,112 @@ self.assertEqual(self.widget.state(), ('active', )) -class ButtonTest(unittest.TestCase): +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + _conv_pixels = noconv + + + at add_standard_options(StandardTtkOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'padding', 'relief', 'style', 'takefocus', + 'width', + ) + + def _create(self, **kwargs): + return ttk.Frame(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'labelanchor', 'labelwidget', + 'padding', 'relief', 'style', 'takefocus', + 'text', 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.LabelFrame(self.root, **kwargs) + + def test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', 's', 'se', 'sw', 'w', 'wn', 'ws', + errmsg='Bad label anchor specification {}') + self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = ttk.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + +class AbstractLabelTest(AbstractWidgetTest): + + def checkImageParam(self, widget, name): + image = tkinter.PhotoImage('image1') + image2 = tkinter.PhotoImage('image2') + self.checkParam(widget, name, image, expected=('image1',)) + self.checkParam(widget, name, 'image1', expected=('image1',)) + self.checkParam(widget, name, (image,), expected=('image1',)) + self.checkParam(widget, name, (image, 'active', image2), + expected=('image1', 'active', 'image2')) + self.checkParam(widget, name, 'image1 active image2', + expected=('image1', 'active', 'image2')) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'none', 'text', 'image', 'center', + 'top', 'bottom', 'left', 'right') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') + + def test_width(self): + widget = self.create() + self.checkParams(widget, 'width', 402, -402, 0) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'background', + 'class', 'compound', 'cursor', 'font', 'foreground', + 'image', 'justify', 'padding', 'relief', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = noconv + + def _create(self, **kwargs): + return ttk.Label(self.root, **kwargs) + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + + + at add_standard_options(StandardTtkOptionsTests) +class ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', 'default', + 'image', 'state', 'style', 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() 
+ self.checkEnumParam(widget, 'default', 'normal', 'active', 'disabled') def test_invoke(self): success = [] @@ -82,7 +229,27 @@ self.assertTrue(success) -class CheckbuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'offvalue', 'onvalue', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Checkbutton(self.root, **kwargs) + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -105,21 +272,40 @@ cbtn['command'] = '' res = cbtn.invoke() - self.assertEqual(str(res), '') + self.assertFalse(str(res)) self.assertFalse(len(success) > 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) -class ComboboxTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class ComboboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'exportselection', 'height', + 'justify', 'postcommand', 'state', 'style', + 'takefocus', 'textvariable', 'values', 'width', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.combo = ttk.Combobox() + self.combo = self.create() def tearDown(self): self.combo.destroy() support.root_withdraw() + super().tearDown() + + def _create(self, **kwargs): + return ttk.Combobox(self.root, **kwargs) + + def test_height(self): + widget = self.create() + self.checkParams(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') def _show_drop_down_listbox(self): width = self.combo.winfo_width() @@ -167,8 +353,16 @@ self.assertEqual(self.combo.get(), getval) self.assertEqual(self.combo.current(), currval) + self.assertEqual(self.combo['values'], + () if tcl_version < (8, 5) else '') check_get_current('', -1) + self.checkParam(self.combo, 'values', 'mon tue wed thur', + expected=('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', ('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', (42, 3.14, '', 'any string')) + self.checkParam(self.combo, 'values', '', expected=()) + self.combo['values'] = ['a', 1, 'c'] self.combo.set('c') @@ -209,15 +403,52 @@ combo2.destroy() -class EntryTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'class', 'cursor', + 'exportselection', 'font', + 'invalidcommand', 'justify', + 'show', 'state', 'style', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.entry = ttk.Entry() + self.entry = self.create() def tearDown(self): self.entry.destroy() support.root_withdraw() + super().tearDown() + + def _create(self, **kwargs): + return ttk.Entry(self.root, **kwargs) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + 
self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + widget = self.create() + self.checkCommandParam(widget, 'validatecommand') def test_bbox(self): @@ -313,16 +544,36 @@ self.assertEqual(self.entry.state(), ()) -class PanedwindowTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', + 'orient', 'style', 'takefocus', 'width', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.paned = ttk.Panedwindow() + self.paned = self.create() def tearDown(self): self.paned.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.PanedWindow(self.root, **kwargs) + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), 'vertical') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + self.checkInvalidParam(widget, 'orient', 'horizontal', + errmsg=errmsg) + widget2 = self.create(orient='horizontal') + self.assertEqual(str(widget2['orient']), 'horizontal') def test_add(self): # attempt to add a child that is not a direct child of the paned window @@ -432,7 +683,22 @@ self.assertTrue(isinstance(self.paned.sashpos(0), int)) -class RadiobuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'value', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -462,19 +728,68 @@ self.assertEqual(str(cbtn['variable']), str(cbtn2['variable'])) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'compound', 'cursor', 'direction', + 'image', 'menu', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) -class ScaleTest(unittest.TestCase): + def _create(self, **kwargs): + return ttk.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'left', 'right', 'flush') + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, conv=str) + menu.destroy() + + + at add_standard_options(StandardTtkOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'from', 'length', + 'orient', 'style', 'takefocus', 'to', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' def setUp(self): + super().setUp() support.root_deiconify() - self.scale = ttk.Scale() + self.scale = self.create() self.scale.pack() self.scale.update() def tearDown(self): self.scale.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Scale(self.root, **kwargs) + + def test_from(self): + 
widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=False) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, conv=False) + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 300, 14.9, 15.1, -10, conv=False) def test_custom_event(self): failure = [1, 1, 1] # will need to be empty @@ -539,11 +854,64 @@ self.assertRaises(tkinter.TclError, self.scale.set, None) -class NotebookTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class ProgressbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'length', + 'mode', 'maximum', 'phase', + 'style', 'takefocus', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Progressbar(self.root, **kwargs) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 100.1, 56.7, '2i') + + def test_maximum(self): + widget = self.create() + self.checkFloatParam(widget, 'maximum', 150.2, 77.7, 0, -10, conv=False) + + def test_mode(self): + widget = self.create() + self.checkEnumParam(widget, 'mode', 'determinate', 'indeterminate') + + def test_phase(self): + # XXX + pass + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 150.2, 77.7, 0, -10, + conv=False) + + + at unittest.skipIf(sys.platform == 'darwin', + 'ttk.Scrollbar is special on MacOSX') + at add_standard_options(StandardTtkOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'orient', 'style', 'takefocus', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return ttk.Scrollbar(self.root, **kwargs) + + + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class NotebookTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', 'padding', 'style', 'takefocus', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.nb = ttk.Notebook(padding=0) + self.nb = self.create(padding=0) self.child1 = ttk.Label() self.child2 = ttk.Label() self.nb.add(self.child1, text='a') @@ -554,7 +922,10 @@ self.child2.destroy() self.nb.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Notebook(self.root, **kwargs) def test_tab_identifiers(self): self.nb.forget(0) @@ -746,16 +1117,68 @@ self.assertEqual(self.nb.select(), str(self.child1)) -class TreeviewTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class TreeviewTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'columns', 'cursor', 'displaycolumns', + 'height', 'padding', 'selectmode', 'show', + 'style', 'takefocus', 'xscrollcommand', 'yscrollcommand', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.tv = ttk.Treeview(padding=0) + self.tv = self.create(padding=0) def tearDown(self): self.tv.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Treeview(self.root, **kwargs) + + def test_columns(self): + widget = self.create() + self.checkParam(widget, 'columns', 'a b c', + expected=('a', 'b', 'c')) + self.checkParam(widget, 'columns', ('a', 'b', 'c')) + self.checkParam(widget, 'columns', ()) + + def 
test_displaycolumns(self): + widget = self.create() + widget['columns'] = ('a', 'b', 'c') + self.checkParam(widget, 'displaycolumns', 'b a c', + expected=('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', ('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', '#all', + expected=('#all',)) + self.checkParam(widget, 'displaycolumns', (2, 1, 0)) + self.checkInvalidParam(widget, 'displaycolumns', ('a', 'b', 'd'), + errmsg='Invalid column index d') + self.checkInvalidParam(widget, 'displaycolumns', (1, 2, 3), + errmsg='Column index 3 out of bounds') + self.checkInvalidParam(widget, 'displaycolumns', (1, -2), + errmsg='Column index -2 out of bounds') + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, -100, 0, '3c', conv=False) + self.checkPixelsParam(widget, 'height', 101.2, 102.6, conv=noconv) + + def test_selectmode(self): + widget = self.create() + self.checkEnumParam(widget, 'selectmode', + 'none', 'browse', 'extended') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', 'tree headings', + expected=('tree', 'headings')) + self.checkParam(widget, 'show', ('tree', 'headings')) + self.checkParam(widget, 'show', ('headings', 'tree')) + self.checkParam(widget, 'show', 'tree', expected=('tree',)) + self.checkParam(widget, 'show', 'headings', expected=('headings',)) def test_bbox(self): self.tv.pack() @@ -1149,11 +1572,35 @@ self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + at add_standard_options(StandardTtkOptionsTests) +class SeparatorTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'style', 'takefocus', + # 'state'? + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Separator(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class SizegripTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'style', 'takefocus', + # 'state'? 
+ ) + + def _create(self, **kwargs): + return ttk.Sizegrip(self.root, **kwargs) + tests_gui = ( - WidgetTest, ButtonTest, CheckbuttonTest, RadiobuttonTest, - ComboboxTest, EntryTest, PanedwindowTest, ScaleTest, NotebookTest, - TreeviewTest + ButtonTest, CheckbuttonTest, ComboboxTest, EntryTest, + FrameTest, LabelFrameTest, LabelTest, MenubuttonTest, + NotebookTest, PanedWindowTest, ProgressbarTest, + RadiobuttonTest, ScaleTest, ScrollbarTest, SeparatorTest, + SizegripTest, TreeviewTest, WidgetTest, ) if __name__ == "__main__": - run_unittest(*tests_gui) + unittest.main() diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py new file mode 100644 --- /dev/null +++ b/Lib/tkinter/test/widget_tests.py @@ -0,0 +1,487 @@ +# Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py + +import tkinter +from tkinter.ttk import setup_master, Scale +from tkinter.test.support import (tcl_version, requires_tcl, pixels_conv, + tcl_obj_eq) + + +noconv = str if tcl_version < (8, 5) else False + +_sentinel = object() + +class AbstractWidgetTest: + _conv_pixels = round if tcl_version[:2] != (8, 5) else int + _conv_pad_pixels = None + wantobjects = True + + def setUp(self): + self.root = setup_master() + self.scaling = float(self.root.call('tk', 'scaling')) + if not self.root.wantobjects(): + self.wantobjects = False + + def create(self, **kwargs): + widget = self._create(**kwargs) + self.addCleanup(widget.destroy) + return widget + + def assertEqual2(self, actual, expected, msg=None, eq=object.__eq__): + if eq(actual, expected): + return + self.assertEqual(actual, expected, msg) + + def checkParam(self, widget, name, value, *, expected=_sentinel, + conv=False, eq=None): + widget[name] = value + if expected is _sentinel: + expected = value + if conv: + expected = conv(expected) + if not self.wantobjects: + if isinstance(expected, tuple): + expected = tkinter._join(expected) + else: + expected = str(expected) + if eq is None: + eq = tcl_obj_eq + self.assertEqual2(widget[name], expected, eq=eq) + self.assertEqual2(widget.cget(name), expected, eq=eq) + # XXX + if not isinstance(widget, Scale): + t = widget.configure(name) + self.assertEqual(len(t), 5) + ## XXX + if not isinstance(t[4], tuple): + self.assertEqual2(t[4], expected, eq=eq) + + def checkInvalidParam(self, widget, name, value, errmsg=None, *, + keep_orig=True): + orig = widget[name] + if errmsg is not None: + errmsg = errmsg.format(value) + with self.assertRaises(tkinter.TclError) as cm: + widget[name] = value + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + if keep_orig: + self.assertEqual(widget[name], orig) + else: + widget[name] = orig + with self.assertRaises(tkinter.TclError) as cm: + widget.configure({name: value}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + if keep_orig: + self.assertEqual(widget[name], orig) + else: + widget[name] = orig + + def checkParams(self, widget, name, *values, **kwargs): + for value in values: + self.checkParam(widget, name, value, **kwargs) + + def checkIntegerParam(self, widget, name, *values, **kwargs): + self.checkParams(widget, name, *values, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected integer but got ""') + self.checkInvalidParam(widget, name, '10p', + errmsg='expected integer but got "10p"') + self.checkInvalidParam(widget, name, 3.2, + errmsg='expected integer but got "3.2"') + + def checkFloatParam(self, widget, name, *values, conv=float, **kwargs): + for value in values: + 
self.checkParam(widget, name, value, conv=conv, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected floating-point number but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected floating-point number but got "spam"') + + def checkBooleanParam(self, widget, name): + for value in (False, 0, 'false', 'no', 'off'): + self.checkParam(widget, name, value, expected=0) + for value in (True, 1, 'true', 'yes', 'on'): + self.checkParam(widget, name, value, expected=1) + self.checkInvalidParam(widget, name, '', + errmsg='expected boolean value but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected boolean value but got "spam"') + + def checkColorParam(self, widget, name, *, allow_empty=None, **kwargs): + self.checkParams(widget, name, + '#ff0000', '#00ff00', '#0000ff', '#123456', + 'red', 'green', 'blue', 'white', 'black', 'grey', + **kwargs) + self.checkInvalidParam(widget, name, 'spam', + errmsg='unknown color name "spam"') + + def checkCursorParam(self, widget, name, **kwargs): + self.checkParams(widget, name, 'arrow', 'watch', 'cross', '',**kwargs) + if tcl_version >= (8, 5): + self.checkParam(widget, name, 'none') + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad cursor spec "spam"') + + def checkCommandParam(self, widget, name): + def command(*args): + pass + widget[name] = command + self.assertTrue(widget[name]) + self.checkParams(widget, name, '') + + def checkEnumParam(self, widget, name, *values, errmsg=None, **kwargs): + self.checkParams(widget, name, *values, **kwargs) + if errmsg is None: + errmsg2 = ' %s "{}": must be %s%s or %s' % ( + name, + ', '.join(values[:-1]), + ',' if len(values) > 2 else '', + values[-1]) + self.checkInvalidParam(widget, name, '', + errmsg='ambiguous' + errmsg2) + errmsg = 'bad' + errmsg2 + self.checkInvalidParam(widget, name, 'spam', errmsg=errmsg) + + def checkPixelsParam(self, widget, name, *values, + conv=None, keep_orig=True, **kwargs): + if conv is None: + conv = self._conv_pixels + for value in values: + expected = _sentinel + conv1 = conv + if isinstance(value, str): + if conv1 and conv1 is not str: + expected = pixels_conv(value) * self.scaling + conv1 = round + self.checkParam(widget, name, value, expected=expected, + conv=conv1, **kwargs) + self.checkInvalidParam(widget, name, '6x', + errmsg='bad screen distance "6x"', keep_orig=keep_orig) + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad screen distance "spam"', keep_orig=keep_orig) + + def checkReliefParam(self, widget, name): + self.checkParams(widget, name, + 'flat', 'groove', 'raised', 'ridge', 'solid', 'sunken') + errmsg='bad relief "spam": must be '\ + 'flat, groove, raised, ridge, solid, or sunken' + if tcl_version < (8, 6): + errmsg = None + self.checkInvalidParam(widget, name, 'spam', + errmsg=errmsg) + + def checkImageParam(self, widget, name): + image = tkinter.PhotoImage('image1') + self.checkParam(widget, name, image, conv=str) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + widget[name] = '' + + def checkVariableParam(self, widget, name, var): + self.checkParam(widget, name, var, conv=str) + + +class StandardOptionsTests: + STANDARD_OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'insertbackground', 'insertborderwidth', + 
'insertofftime', 'insertontime', 'insertwidth', + 'jump', 'justify', 'orient', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'takefocus', 'text', 'textvariable', 'troughcolor', + 'underline', 'wraplength', 'xscrollcommand', 'yscrollcommand', + ) + + def test_activebackground(self): + widget = self.create() + self.checkColorParam(widget, 'activebackground') + + def test_activeborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'activeborderwidth', + 0, 1.3, 2.9, 6, -2, '10p') + + def test_activeforeground(self): + widget = self.create() + self.checkColorParam(widget, 'activeforeground') + + def test_anchor(self): + widget = self.create() + self.checkEnumParam(widget, 'anchor', + 'n', 'ne', 'e', 'se', 's', 'sw', 'w', 'nw', 'center') + + def test_background(self): + widget = self.create() + self.checkColorParam(widget, 'background') + if 'bg' in self.OPTIONS: + self.checkColorParam(widget, 'bg') + + def test_bitmap(self): + widget = self.create() + self.checkParam(widget, 'bitmap', 'questhead') + self.checkParam(widget, 'bitmap', 'gray50') + self.checkInvalidParam(widget, 'bitmap', 'spam', + errmsg='bitmap "spam" not defined') + + def test_borderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'borderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + if 'bd' in self.OPTIONS: + self.checkPixelsParam(widget, 'bd', 0, 1.3, 2.6, 6, -2, '10p') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'bottom', 'center', 'left', 'none', 'right', 'top') + + def test_cursor(self): + widget = self.create() + self.checkCursorParam(widget, 'cursor') + + def test_disabledforeground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledforeground') + + def test_exportselection(self): + widget = self.create() + self.checkBooleanParam(widget, 'exportselection') + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + self.checkInvalidParam(widget, 'font', '', + errmsg='font "" doesn\'t exist') + + def test_foreground(self): + widget = self.create() + self.checkColorParam(widget, 'foreground') + if 'fg' in self.OPTIONS: + self.checkColorParam(widget, 'fg') + + def test_highlightbackground(self): + widget = self.create() + self.checkColorParam(widget, 'highlightbackground') + + def test_highlightcolor(self): + widget = self.create() + self.checkColorParam(widget, 'highlightcolor') + + def test_highlightthickness(self): + widget = self.create() + self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, '10p') + self.checkParam(widget, 'highlightthickness', -2, expected=0, + conv=self._conv_pixels) + + def test_image(self): + widget = self.create() + self.checkImageParam(widget, 'image') + + def test_insertbackground(self): + widget = self.create() + self.checkColorParam(widget, 'insertbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertborderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + + def test_insertofftime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertofftime', 100) + + def test_insertontime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertontime', 100) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 2.6, -2, '10p') + + def test_jump(self): + widget = self.create() + 
self.checkBooleanParam(widget, 'jump') + + def test_justify(self): + widget = self.create() + self.checkEnumParam(widget, 'justify', 'left', 'right', 'center', + errmsg='bad justification "{}": must be ' + 'left, right, or center') + self.checkInvalidParam(widget, 'justify', '', + errmsg='ambiguous justification "": must be ' + 'left, right, or center') + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), self.default_orient) + self.checkEnumParam(widget, 'orient', 'horizontal', 'vertical') + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_relief(self): + widget = self.create() + self.checkReliefParam(widget, 'relief') + + def test_repeatdelay(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatdelay', -500, 500) + + def test_repeatinterval(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatinterval', -500, 500) + + def test_selectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'selectbackground') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', 1.3, 2.6, -2, '10p') + + def test_selectforeground(self): + widget = self.create() + self.checkColorParam(widget, 'selectforeground') + + def test_setgrid(self): + widget = self.create() + self.checkBooleanParam(widget, 'setgrid') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'active', 'disabled', 'normal') + + def test_takefocus(self): + widget = self.create() + self.checkParams(widget, 'takefocus', '0', '1', '') + + def test_text(self): + widget = self.create() + self.checkParams(widget, 'text', '', 'any string') + + def test_textvariable(self): + widget = self.create() + var = tkinter.StringVar() + self.checkVariableParam(widget, 'textvariable', var) + + def test_troughcolor(self): + widget = self.create() + self.checkColorParam(widget, 'troughcolor') + + def test_underline(self): + widget = self.create() + self.checkIntegerParam(widget, 'underline', 0, 1, 10) + + def test_wraplength(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkPixelsParam(widget, 'wraplength', 100) + else: + self.checkParams(widget, 'wraplength', 100) + + def test_xscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'xscrollcommand') + + def test_yscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'yscrollcommand') + + # non-standard but common options + + def test_command(self): + widget = self.create() + self.checkCommandParam(widget, 'command') + + def test_indicatoron(self): + widget = self.create() + self.checkBooleanParam(widget, 'indicatoron') + + def test_offrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'offrelief') + + def test_overrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'overrelief') + + def test_selectcolor(self): + widget = self.create() + self.checkColorParam(widget, 'selectcolor') + + def test_selectimage(self): + widget = self.create() + self.checkImageParam(widget, 'selectimage') + + @requires_tcl(8, 5) + def test_tristateimage(self): + widget = self.create() + self.checkImageParam(widget, 'tristateimage') + + @requires_tcl(8, 5) + def test_tristatevalue(self): + 
        widget = self.create()
+        self.checkParam(widget, 'tristatevalue', 'unknowable')
+
+    def test_variable(self):
+        widget = self.create()
+        var = tkinter.DoubleVar()
+        self.checkVariableParam(widget, 'variable', var)
+
+
+class IntegerSizeTests:
+    def test_height(self):
+        widget = self.create()
+        self.checkIntegerParam(widget, 'height', 100, -100, 0)
+
+    def test_width(self):
+        widget = self.create()
+        self.checkIntegerParam(widget, 'width', 402, -402, 0)
+
+
+class PixelSizeTests:
+    def test_height(self):
+        widget = self.create()
+        self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '3c')
+
+    def test_width(self):
+        widget = self.create()
+        self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i')
+
+
+def add_standard_options(*source_classes):
+    # This decorator adds test_xxx methods from source classes for every xxx
+    # option in the OPTIONS class attribute if they are not defined explicitly.
+    def decorator(cls):
+        for option in cls.OPTIONS:
+            methodname = 'test_' + option
+            if not hasattr(cls, methodname):
+                for source_class in source_classes:
+                    if hasattr(source_class, methodname):
+                        setattr(cls, methodname,
+                                getattr(source_class, methodname))
+                        break
+                else:
+                    def test(self, option=option):
+                        widget = self.create()
+                        widget[option]
+                        raise AssertionError('Option "%s" is not tested in %s' %
+                                             (option, cls.__name__))
+                    test.__name__ = methodname
+                    setattr(cls, methodname, test)
+        return cls
+    return decorator
diff --git a/Misc/NEWS b/Misc/NEWS
--- a/Misc/NEWS
+++ b/Misc/NEWS
@@ -15,6 +15,11 @@
 - Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler.
 
+Tests
+-----
+
+- Issue #19085: Added basic tests for all tkinter widget options.
+
 What's New in Python 3.3.3 release candidate 1?
 ===============================================

-- 
Repository URL: http://hg.python.org/cpython

From python-checkins at python.org Sat Nov 2 09:48:37 2013
From: python-checkins at python.org (serhiy.storchaka)
Date: Sat, 2 Nov 2013 09:48:37 +0100 (CET)
Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319085=3A_Added_basic_tests_for_all_tkinter_widg?= =?utf-8?q?et_options=2E?=
Message-ID: <3dBYpF5BW0z7Ljq@mail.python.org>

http://hg.python.org/cpython/rev/ab7c2c1d349c
changeset: 86835:ab7c2c1d349c
parent: 86824:e071977772bf
parent: 86834:92e268f2719e
user: Serhiy Storchaka
date: Sat Nov 02 10:44:55 2013 +0200
summary: Issue #19085: Added basic tests for all tkinter widget options.
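The helpers added above in widget_tests.py (AbstractWidgetTest, the option mix-ins and the
add_standard_options decorator) are used roughly as sketched below. This is only an
illustration assuming a working GUI display: the class name MessageExampleTest is invented
here, and the real per-widget tests live in the files listed next.

import unittest
import tkinter
from test.support import requires
from tkinter.test.widget_tests import (add_standard_options,
        AbstractWidgetTest, StandardOptionsTests, PixelSizeTests)

requires('gui')

@add_standard_options(PixelSizeTests, StandardOptionsTests)
class MessageExampleTest(AbstractWidgetTest, unittest.TestCase):
    # Every name in OPTIONS gets a generated test_<option> method from the
    # mix-in classes unless the class defines one by hand (as test_anchor is here).
    OPTIONS = ('anchor', 'background', 'cursor', 'text', 'width')

    def _create(self, **kwargs):
        return tkinter.Message(self.root, **kwargs)

    def test_anchor(self):
        # Hand-written test; add_standard_options skips this option.
        widget = self.create()
        self.checkEnumParam(widget, 'anchor',
                'n', 'ne', 'e', 'se', 's', 'sw', 'w', 'nw', 'center')

if __name__ == '__main__':
    unittest.main()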
files: Lib/tkinter/test/support.py | 39 + Lib/tkinter/test/test_tkinter/test_widgets.py | 919 ++++++++++ Lib/tkinter/test/test_ttk/test_widgets.py | 493 +++++- Lib/tkinter/test/widget_tests.py | 487 +++++ Misc/NEWS | 2 + 5 files changed, 1917 insertions(+), 23 deletions(-) diff --git a/Lib/tkinter/test/support.py b/Lib/tkinter/test/support.py --- a/Lib/tkinter/test/support.py +++ b/Lib/tkinter/test/support.py @@ -77,3 +77,42 @@ widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) + + +import _tkinter +tcl_version = tuple(map(int, _tkinter.TCL_VERSION.split('.'))) + +def requires_tcl(*version): + return unittest.skipUnless(tcl_version >= version, + 'requires Tcl version >= ' + '.'.join(map(str, version))) + +units = { + 'c': 72 / 2.54, # centimeters + 'i': 72, # inches + 'm': 72 / 25.4, # millimeters + 'p': 1, # points +} + +def pixels_conv(value): + return float(value[:-1]) * units[value[-1:]] + +def tcl_obj_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, _tkinter.Tcl_Obj): + if isinstance(expected, str): + return str(actual) == expected + if isinstance(actual, tuple): + if isinstance(expected, tuple): + return (len(actual) == len(expected) and + all(tcl_obj_eq(act, exp) + for act, exp in zip(actual, expected))) + return False + +def widget_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, (str, tkinter.Widget)): + if isinstance(expected, (str, tkinter.Widget)): + return str(actual) == str(expected) + return False diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py new file mode 100644 --- /dev/null +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -0,0 +1,919 @@ +import unittest +import tkinter +import os +from test.support import requires + +from tkinter.test.support import tcl_version, requires_tcl, widget_eq +from tkinter.test.widget_tests import (add_standard_options, noconv, + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + +requires('gui') + + +def float_round(x): + return float(round(x)) + + +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + _conv_pad_pixels = noconv + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], + widget.__class__.__name__.title()) + self.checkInvalidParam(widget, 'class', 'Foo', + errmsg="can't modify -class option after widget is created") + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_colormap(self): + widget = self.create() + self.assertEqual(widget['colormap'], '') + self.checkInvalidParam(widget, 'colormap', 'new', + errmsg="can't modify -colormap option after widget is created") + widget2 = self.create(colormap='new') + self.assertEqual(widget2['colormap'], 'new') + + def test_container(self): + widget = self.create() + self.assertEqual(widget['container'], 0 if self.wantobjects else '0') + self.checkInvalidParam(widget, 'container', 1, + errmsg="can't modify -container option after widget is created") + widget2 = self.create(container=True) + self.assertEqual(widget2['container'], 1 if self.wantobjects else '1') + + def test_visual(self): + widget = self.create() + self.assertEqual(widget['visual'], '') + self.checkInvalidParam(widget, 'visual', 'default', + errmsg="can't modify -visual option after widget is created") + widget2 = self.create(visual='default') + self.assertEqual(widget2['visual'], 'default') + + + at 
add_standard_options(StandardOptionsTests) +class ToplevelTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'menu', 'padx', 'pady', 'relief', 'screen', + 'takefocus', 'use', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.Toplevel(self.root, **kwargs) + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(self.root) + self.checkParam(widget, 'menu', menu, eq=widget_eq) + self.checkParam(widget, 'menu', '') + + def test_screen(self): + widget = self.create() + self.assertEqual(widget['screen'], '') + display = os.environ['DISPLAY'] + self.checkInvalidParam(widget, 'screen', display, + errmsg="can't modify -screen option after widget is created") + widget2 = self.create(screen=display) + self.assertEqual(widget2['screen'], display) + + def test_use(self): + widget = self.create() + self.assertEqual(widget['use'], '') + widget1 = self.create(container=True) + self.assertEqual(widget1['use'], '') + self.checkInvalidParam(widget1, 'use', '0x44022', + errmsg="can't modify -use option after widget is created") + wid = hex(widget1.winfo_id()) + widget2 = self.create(use=wid) + self.assertEqual(widget2['use'], wid) + + + at add_standard_options(StandardOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'relief', 'takefocus', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.Frame(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'labelanchor', 'labelwidget', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'visual', 'width', + ) + + def _create(self, **kwargs): + return tkinter.LabelFrame(self.root, **kwargs) + + def test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', + 's', 'se', 'sw', 'w', 'wn', 'ws') + self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = tkinter.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + +class AbstractLabelTest(AbstractWidgetTest, IntegerSizeTests): + _conv_pixels = noconv + + def test_highlightthickness(self): + widget = self.create() + self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, -2, '10p') + + + at add_standard_options(StandardOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Label(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class 
ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', 'default', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'overrelief', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'state', 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength') + + def _create(self, **kwargs): + return tkinter.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() + self.checkEnumParam(widget, 'default', 'active', 'disabled', 'normal') + + + at add_standard_options(StandardOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', + 'offrelief', 'offvalue', 'onvalue', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 'tristatevalue', + 'underline', 'variable', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Checkbutton(self.root, **kwargs) + + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'offrelief', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 'tristatevalue', + 'underline', 'value', 'variable', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return tkinter.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'compound', 'cursor', 'direction', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'menu', + 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = AbstractWidgetTest._conv_pixels + + def _create(self, **kwargs): + return tkinter.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'flush', 'left', 'right') + + def test_height(self): + widget = self.create() + self.checkIntegerParam(widget, 'height', 100, -100, 0, conv=str) + + test_highlightthickness = StandardOptionsTests.test_highlightthickness + + def 
test_image(self): + widget = self.create() + image = tkinter.PhotoImage('image1') + self.checkParam(widget, 'image', image, conv=str) + errmsg = 'image "spam" doesn\'t exist' + with self.assertRaises(tkinter.TclError) as cm: + widget['image'] = 'spam' + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + with self.assertRaises(tkinter.TclError) as cm: + widget.configure({'image': 'spam'}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, eq=widget_eq) + menu.destroy() + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'padx', -2, expected=0) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'pady', -2, expected=0) + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402, -402, 0, conv=str) + + +class OptionMenuTest(MenubuttonTest, unittest.TestCase): + + def _create(self, default='b', values=('a', 'b', 'c'), **kwargs): + return tkinter.OptionMenu(self.root, None, default, *values, **kwargs) + + + at add_standard_options(IntegerSizeTests, StandardOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'readonlybackground', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'show', 'state', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Entry(self.root, **kwargs) + + def test_disabledbackground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertborderwidth', 0, 1.3, -2) + self.checkParam(widget, 'insertborderwidth', 2, expected=1) + self.checkParam(widget, 'insertborderwidth', '10p', expected=1) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') + if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.9, expected=2) + else: + self.checkParam(widget, 'insertwidth', 0.9, expected=1) + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + self.checkCommandParam(widget, 'invcmd') + + def test_readonlybackground(self): + widget = self.create() + self.checkColorParam(widget, 'readonlybackground') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + 
widget = self.create() + self.checkCommandParam(widget, 'validatecommand') + self.checkCommandParam(widget, 'vcmd') + + + at add_standard_options(StandardOptionsTests) +class SpinboxTest(EntryTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'borderwidth', + 'buttonbackground', 'buttoncursor', 'buttondownrelief', 'buttonuprelief', + 'command', 'cursor', 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', 'format', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'increment', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'relief', 'readonlybackground', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', 'textvariable', 'to', + 'validate', 'validatecommand', 'values', + 'width', 'wrap', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Spinbox(self.root, **kwargs) + + test_show = None + + def test_buttonbackground(self): + widget = self.create() + self.checkColorParam(widget, 'buttonbackground') + + def test_buttoncursor(self): + widget = self.create() + self.checkCursorParam(widget, 'buttoncursor') + + def test_buttondownrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'buttondownrelief') + + def test_buttonuprelief(self): + widget = self.create() + self.checkReliefParam(widget, 'buttonuprelief') + + def test_format(self): + widget = self.create() + self.checkParam(widget, 'format', '%2f') + self.checkParam(widget, 'format', '%2.2f') + self.checkParam(widget, 'format', '%.2f') + self.checkParam(widget, 'format', '%2.f') + self.checkInvalidParam(widget, 'format', '%2e-1f') + self.checkInvalidParam(widget, 'format', '2.2') + self.checkInvalidParam(widget, 'format', '%2.-2f') + self.checkParam(widget, 'format', '%-2.02f') + self.checkParam(widget, 'format', '% 2.02f') + self.checkParam(widget, 'format', '% -2.200f') + self.checkParam(widget, 'format', '%09.200f') + self.checkInvalidParam(widget, 'format', '%d') + + def test_from(self): + widget = self.create() + self.checkParam(widget, 'to', 100.0) + self.checkFloatParam(widget, 'from', -10, 10.2, 11.7) + self.checkInvalidParam(widget, 'from', 200, + errmsg='-to value must be greater than -from value') + + def test_increment(self): + widget = self.create() + self.checkFloatParam(widget, 'increment', -1, 1, 10.2, 12.8, 0) + + def test_to(self): + widget = self.create() + self.checkParam(widget, 'from', -100.0) + self.checkFloatParam(widget, 'to', -10, 10.2, 11.7) + self.checkInvalidParam(widget, 'to', -200, + errmsg='-to value must be greater than -from value') + + def test_values(self): + # XXX + widget = self.create() + self.assertEqual(widget['values'], '') + self.checkParam(widget, 'values', 'mon tue wed thur') + self.checkParam(widget, 'values', ('mon', 'tue', 'wed', 'thur'), + expected='mon tue wed thur') + self.checkParam(widget, 'values', (42, 3.14, '', 'any string'), + expected='42 3.14 {} {any string}') + self.checkParam(widget, 'values', '') + + def test_wrap(self): + widget = self.create() + self.checkBooleanParam(widget, 'wrap') + + + at add_standard_options(StandardOptionsTests) +class TextTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'autoseparators', 'background', 'blockcursor', 'borderwidth', + 'cursor', 'endline', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 
'inactiveselectbackground', 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertunfocussed', 'insertwidth', + 'maxundo', 'padx', 'pady', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'spacing1', 'spacing2', 'spacing3', 'startline', 'state', + 'tabs', 'tabstyle', 'takefocus', 'undo', 'width', 'wrap', + 'xscrollcommand', 'yscrollcommand', + ) + if tcl_version < (8, 5): + wantobjects = False + + def _create(self, **kwargs): + return tkinter.Text(self.root, **kwargs) + + def test_autoseparators(self): + widget = self.create() + self.checkBooleanParam(widget, 'autoseparators') + + @requires_tcl(8, 5) + def test_blockcursor(self): + widget = self.create() + self.checkBooleanParam(widget, 'blockcursor') + + @requires_tcl(8, 5) + def test_endline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'endline', 200, expected='') + self.checkParam(widget, 'endline', -10, expected='') + self.checkInvalidParam(widget, 'endline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'endline', 50) + self.checkParam(widget, 'startline', 15) + self.checkInvalidParam(widget, 'endline', 10, + errmsg='-startline must be less than or equal to -endline') + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, '3c') + self.checkParam(widget, 'height', -100, expected=1) + self.checkParam(widget, 'height', 0, expected=1) + + def test_maxundo(self): + widget = self.create() + self.checkIntegerParam(widget, 'maxundo', 0, 5, -1) + + @requires_tcl(8, 5) + def test_inactiveselectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'inactiveselectbackground') + + @requires_tcl(8, 6) + def test_insertunfocussed(self): + widget = self.create() + self.checkEnumParam(widget, 'insertunfocussed', + 'hollow', 'none', 'solid') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', + 1.3, 2.6, -2, '10p', conv=False, + keep_orig=tcl_version >= (8, 5)) + + def test_spacing1(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing1', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing1', -5, expected=0) + + def test_spacing2(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing2', 5, 6.4, 7.6, '0.1c') + self.checkParam(widget, 'spacing2', -1, expected=0) + + def test_spacing3(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing3', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing3', -10, expected=0) + + @requires_tcl(8, 5) + def test_startline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'startline', 200, expected='') + self.checkParam(widget, 'startline', -10, expected='') + self.checkInvalidParam(widget, 'startline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'startline', 10) + self.checkParam(widget, 'endline', 50) + self.checkInvalidParam(widget, 'startline', 70, + errmsg='-startline must be less than or equal to -endline') + + def test_state(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'state', 'disabled', 'normal') + else: + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + def test_tabs(self): + widget = self.create() + self.checkParam(widget, 'tabs', (10.2, 20.7, 
'1i', '2i')) + self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', + expected=('10.2', '20.7', '1i', '2i')) + self.checkParam(widget, 'tabs', '2c left 4c 6c center', + expected=('2c', 'left', '4c', '6c', 'center')) + self.checkInvalidParam(widget, 'tabs', 'spam', + errmsg='bad screen distance "spam"', + keep_orig=tcl_version >= (8, 5)) + + @requires_tcl(8, 5) + def test_tabstyle(self): + widget = self.create() + self.checkEnumParam(widget, 'tabstyle', 'tabular', 'wordprocessor') + + def test_undo(self): + widget = self.create() + self.checkBooleanParam(widget, 'undo') + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402) + self.checkParam(widget, 'width', -402, expected=1) + self.checkParam(widget, 'width', 0, expected=1) + + def test_wrap(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'wrap', 'char', 'none', 'word') + else: + self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class CanvasTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'closeenough', 'confine', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'relief', 'scrollregion', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', + 'xscrollcommand', 'xscrollincrement', + 'yscrollcommand', 'yscrollincrement', 'width', + ) + + _conv_pixels = round + wantobjects = False + + def _create(self, **kwargs): + return tkinter.Canvas(self.root, **kwargs) + + def test_closeenough(self): + widget = self.create() + self.checkFloatParam(widget, 'closeenough', 24, 2.4, 3.6, -3, + conv=float) + + def test_confine(self): + widget = self.create() + self.checkBooleanParam(widget, 'confine') + + def test_scrollregion(self): + widget = self.create() + self.checkParam(widget, 'scrollregion', '0 0 200 150') + self.checkParam(widget, 'scrollregion', (0, 0, 200, 150), + expected='0 0 200 150') + self.checkParam(widget, 'scrollregion', '') + self.checkInvalidParam(widget, 'scrollregion', 'spam', + errmsg='bad scrollRegion "spam"') + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 'spam')) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200)) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 150, 0)) + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal', + errmsg='bad state value "{}": must be normal or disabled') + + def test_xscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'xscrollincrement', + 40, 0, 41.2, 43.6, -40, '0.5i') + + def test_yscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'yscrollincrement', + 10, 0, 11.2, 13.6, -10, '0.1i') + + + at add_standard_options(IntegerSizeTests, StandardOptionsTests) +class ListboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activestyle', 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'listvariable', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'selectmode', 'setgrid', 'state', + 'takefocus', 'width', 'xscrollcommand', 'yscrollcommand', + ) + + def _create(self, **kwargs): + return tkinter.Listbox(self.root, **kwargs) + + def 
test_activestyle(self): + widget = self.create() + self.checkEnumParam(widget, 'activestyle', + 'dotbox', 'none', 'underline') + + def test_listvariable(self): + widget = self.create() + var = tkinter.DoubleVar() + self.checkVariableParam(widget, 'listvariable', var) + + def test_selectmode(self): + widget = self.create() + self.checkParam(widget, 'selectmode', 'single') + self.checkParam(widget, 'selectmode', 'browse') + self.checkParam(widget, 'selectmode', 'multiple') + self.checkParam(widget, 'selectmode', 'extended') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'bigincrement', 'borderwidth', + 'command', 'cursor', 'digits', 'font', 'foreground', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'label', 'length', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'resolution', 'showvalue', 'sliderlength', 'sliderrelief', 'state', + 'takefocus', 'tickinterval', 'to', 'troughcolor', 'variable', 'width', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return tkinter.Scale(self.root, **kwargs) + + def test_bigincrement(self): + widget = self.create() + self.checkFloatParam(widget, 'bigincrement', 12.4, 23.6, -5) + + def test_digits(self): + widget = self.create() + self.checkIntegerParam(widget, 'digits', 5, 0) + + def test_from(self): + widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=float_round) + + def test_label(self): + widget = self.create() + self.checkParam(widget, 'label', 'any string') + self.checkParam(widget, 'label', '') + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_resolution(self): + widget = self.create() + self.checkFloatParam(widget, 'resolution', 4.2, 0, 6.7, -2) + + def test_showvalue(self): + widget = self.create() + self.checkBooleanParam(widget, 'showvalue') + + def test_sliderlength(self): + widget = self.create() + self.checkPixelsParam(widget, 'sliderlength', + 10, 11.2, 15.6, -3, '3m') + + def test_sliderrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sliderrelief') + + def test_tickinterval(self): + widget = self.create() + self.checkFloatParam(widget, 'tickinterval', 1, 4.3, 7.6, 0, + conv=float_round) + self.checkParam(widget, 'tickinterval', -2, expected=2, + conv=float_round) + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, + conv=float_round) + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activerelief', + 'background', 'borderwidth', + 'command', 'cursor', 'elementborderwidth', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'jump', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'takefocus', 'troughcolor', 'width', + ) + _conv_pixels = round + wantobjects = False + default_orient = 'vertical' + + def _create(self, **kwargs): + return tkinter.Scrollbar(self.root, **kwargs) + + def test_activerelief(self): + widget = self.create() + self.checkReliefParam(widget, 'activerelief') + + def test_elementborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'elementborderwidth', 4.3, 5.6, -2, '1m') + + def 
test_orient(self): + widget = self.create() + self.checkEnumParam(widget, 'orient', 'vertical', 'horizontal', + errmsg='bad orientation "{}": must be vertical or horizontal') + + + at add_standard_options(StandardOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'handlepad', 'handlesize', 'height', + 'opaqueresize', 'orient', 'relief', + 'sashcursor', 'sashpad', 'sashrelief', 'sashwidth', + 'showhandle', 'width', + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return tkinter.PanedWindow(self.root, **kwargs) + + def test_handlepad(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlepad', 5, 6.4, 7.6, -3, '1m') + + def test_handlesize(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlesize', 8, 9.4, 10.6, -3, '2m', + conv=noconv) + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i', + conv=noconv) + + def test_opaqueresize(self): + widget = self.create() + self.checkBooleanParam(widget, 'opaqueresize') + + def test_sashcursor(self): + widget = self.create() + self.checkCursorParam(widget, 'sashcursor') + + def test_sashpad(self): + widget = self.create() + self.checkPixelsParam(widget, 'sashpad', 8, 1.3, 2.6, -2, '2m') + + def test_sashrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sashrelief') + + def test_sashwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'sashwidth', 10, 11.1, 15.6, -3, '1m', + conv=noconv) + + def test_showhandle(self): + widget = self.create() + self.checkBooleanParam(widget, 'showhandle') + + def test_width(self): + widget = self.create() + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i', + conv=noconv) + + + at add_standard_options(StandardOptionsTests) +class MenuTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', + 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'font', 'foreground', + 'postcommand', 'relief', 'selectcolor', 'takefocus', + 'tearoff', 'tearoffcommand', 'title', 'type', + ) + _conv_pixels = noconv + + def _create(self, **kwargs): + return tkinter.Menu(self.root, **kwargs) + + def test_postcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'postcommand') + + def test_tearoff(self): + widget = self.create() + self.checkBooleanParam(widget, 'tearoff') + + def test_tearoffcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'tearoffcommand') + + def test_title(self): + widget = self.create() + self.checkParam(widget, 'title', 'any string') + + def test_type(self): + widget = self.create() + self.checkEnumParam(widget, 'type', + 'normal', 'tearoff', 'menubar') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class MessageTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'aspect', 'background', 'borderwidth', + 'cursor', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'justify', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'textvariable', 'width', + ) + _conv_pad_pixels = noconv + + def _create(self, **kwargs): + return tkinter.Message(self.root, **kwargs) + + def test_aspect(self): + widget = self.create() + self.checkIntegerParam(widget, 'aspect', 250, 0, -300) + + +tests_gui = ( + ButtonTest, CanvasTest, CheckbuttonTest, EntryTest, + FrameTest, 
LabelFrameTest,LabelTest, ListboxTest, + MenubuttonTest, MenuTest, MessageTest, OptionMenuTest, + PanedWindowTest, RadiobuttonTest, ScaleTest, ScrollbarTest, + SpinboxTest, TextTest, ToplevelTest, +) + +if __name__ == '__main__': + unittest.main() diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -1,15 +1,57 @@ import unittest import tkinter -import os from tkinter import ttk -from test.support import requires, run_unittest +from test.support import requires import sys import tkinter.test.support as support -from tkinter.test.test_ttk.test_functions import MockTclObj, MockStateSpec +from tkinter.test.test_ttk.test_functions import MockTclObj +from tkinter.test.support import tcl_version +from tkinter.test.widget_tests import (add_standard_options, noconv, + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) requires('gui') + +class StandardTtkOptionsTests(StandardOptionsTests): + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], '') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + self.checkInvalidParam(widget, 'class', 'Foo', errmsg=errmsg) + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_padding(self): + widget = self.create() + self.checkParam(widget, 'padding', 0, expected=('0',)) + self.checkParam(widget, 'padding', 5, expected=('5',)) + self.checkParam(widget, 'padding', (5, 6), expected=('5', '6')) + self.checkParam(widget, 'padding', (5, 6, 7), + expected=('5', '6', '7')) + self.checkParam(widget, 'padding', (5, 6, 7, 8), + expected=('5', '6', '7', '8')) + self.checkParam(widget, 'padding', ('5p', '6p', '7p', '8p')) + self.checkParam(widget, 'padding', (), expected='') + + def test_style(self): + widget = self.create() + self.assertEqual(widget['style'], '') + errmsg = 'Layout Foo not found' + if hasattr(self, 'default_orient'): + errmsg = ('Layout %s.Foo not found' % + getattr(self, 'default_orient').title()) + self.checkInvalidParam(widget, 'style', 'Foo', + errmsg=errmsg) + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + # XXX + pass + + class WidgetTest(unittest.TestCase): """Tests methods available in every ttk widget.""" @@ -73,7 +115,112 @@ self.assertEqual(self.widget.state(), ('active', )) -class ButtonTest(unittest.TestCase): +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + _conv_pixels = noconv + + + at add_standard_options(StandardTtkOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'padding', 'relief', 'style', 'takefocus', + 'width', + ) + + def _create(self, **kwargs): + return ttk.Frame(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'labelanchor', 'labelwidget', + 'padding', 'relief', 'style', 'takefocus', + 'text', 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.LabelFrame(self.root, **kwargs) + + def test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', 's', 'se', 'sw', 'w', 'wn', 'ws', + errmsg='Bad label anchor specification {}') + 
self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = ttk.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + +class AbstractLabelTest(AbstractWidgetTest): + + def checkImageParam(self, widget, name): + image = tkinter.PhotoImage('image1') + image2 = tkinter.PhotoImage('image2') + self.checkParam(widget, name, image, expected=('image1',)) + self.checkParam(widget, name, 'image1', expected=('image1',)) + self.checkParam(widget, name, (image,), expected=('image1',)) + self.checkParam(widget, name, (image, 'active', image2), + expected=('image1', 'active', 'image2')) + self.checkParam(widget, name, 'image1 active image2', + expected=('image1', 'active', 'image2')) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'none', 'text', 'image', 'center', + 'top', 'bottom', 'left', 'right') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') + + def test_width(self): + widget = self.create() + self.checkParams(widget, 'width', 402, -402, 0) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'background', + 'class', 'compound', 'cursor', 'font', 'foreground', + 'image', 'justify', 'padding', 'relief', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = noconv + + def _create(self, **kwargs): + return ttk.Label(self.root, **kwargs) + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + + + at add_standard_options(StandardTtkOptionsTests) +class ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', 'default', + 'image', 'state', 'style', 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() + self.checkEnumParam(widget, 'default', 'normal', 'active', 'disabled') def test_invoke(self): success = [] @@ -82,7 +229,27 @@ self.assertTrue(success) -class CheckbuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'offvalue', 'onvalue', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Checkbutton(self.root, **kwargs) + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -105,21 +272,40 @@ cbtn['command'] = '' res = cbtn.invoke() - self.assertEqual(str(res), '') + self.assertFalse(str(res)) self.assertFalse(len(success) > 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) -class ComboboxTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class ComboboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 
'class', 'cursor', 'exportselection', 'height', + 'justify', 'postcommand', 'state', 'style', + 'takefocus', 'textvariable', 'values', 'width', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.combo = ttk.Combobox() + self.combo = self.create() def tearDown(self): self.combo.destroy() support.root_withdraw() + super().tearDown() + + def _create(self, **kwargs): + return ttk.Combobox(self.root, **kwargs) + + def test_height(self): + widget = self.create() + self.checkParams(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') def _show_drop_down_listbox(self): width = self.combo.winfo_width() @@ -167,8 +353,16 @@ self.assertEqual(self.combo.get(), getval) self.assertEqual(self.combo.current(), currval) + self.assertEqual(self.combo['values'], + () if tcl_version < (8, 5) else '') check_get_current('', -1) + self.checkParam(self.combo, 'values', 'mon tue wed thur', + expected=('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', ('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', (42, 3.14, '', 'any string')) + self.checkParam(self.combo, 'values', '', expected=()) + self.combo['values'] = ['a', 1, 'c'] self.combo.set('c') @@ -209,15 +403,52 @@ combo2.destroy() -class EntryTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'class', 'cursor', + 'exportselection', 'font', + 'invalidcommand', 'justify', + 'show', 'state', 'style', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.entry = ttk.Entry() + self.entry = self.create() def tearDown(self): self.entry.destroy() support.root_withdraw() + super().tearDown() + + def _create(self, **kwargs): + return ttk.Entry(self.root, **kwargs) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + widget = self.create() + self.checkCommandParam(widget, 'validatecommand') def test_bbox(self): @@ -313,16 +544,36 @@ self.assertEqual(self.entry.state(), ()) -class PanedwindowTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', + 'orient', 'style', 'takefocus', 'width', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.paned = ttk.Panedwindow() + self.paned = self.create() def tearDown(self): self.paned.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.PanedWindow(self.root, **kwargs) + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), 'vertical') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + 
self.checkInvalidParam(widget, 'orient', 'horizontal', + errmsg=errmsg) + widget2 = self.create(orient='horizontal') + self.assertEqual(str(widget2['orient']), 'horizontal') def test_add(self): # attempt to add a child that is not a direct child of the paned window @@ -432,7 +683,22 @@ self.assertTrue(isinstance(self.paned.sashpos(0), int)) -class RadiobuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'value', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -462,19 +728,68 @@ self.assertEqual(str(cbtn['variable']), str(cbtn2['variable'])) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'compound', 'cursor', 'direction', + 'image', 'menu', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) -class ScaleTest(unittest.TestCase): + def _create(self, **kwargs): + return ttk.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'left', 'right', 'flush') + + def test_menu(self): + widget = self.create() + menu = tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, conv=str) + menu.destroy() + + + at add_standard_options(StandardTtkOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'from', 'length', + 'orient', 'style', 'takefocus', 'to', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' def setUp(self): + super().setUp() support.root_deiconify() - self.scale = ttk.Scale() + self.scale = self.create() self.scale.pack() self.scale.update() def tearDown(self): self.scale.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Scale(self.root, **kwargs) + + def test_from(self): + widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=False) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, conv=False) + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 300, 14.9, 15.1, -10, conv=False) def test_custom_event(self): failure = [1, 1, 1] # will need to be empty @@ -539,11 +854,64 @@ self.assertRaises(tkinter.TclError, self.scale.set, None) -class NotebookTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class ProgressbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'length', + 'mode', 'maximum', 'phase', + 'style', 'takefocus', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Progressbar(self.root, **kwargs) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 100.1, 56.7, '2i') + + def test_maximum(self): + widget = self.create() + self.checkFloatParam(widget, 'maximum', 150.2, 77.7, 0, -10, conv=False) + + def 
test_mode(self): + widget = self.create() + self.checkEnumParam(widget, 'mode', 'determinate', 'indeterminate') + + def test_phase(self): + # XXX + pass + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 150.2, 77.7, 0, -10, + conv=False) + + + at unittest.skipIf(sys.platform == 'darwin', + 'ttk.Scrollbar is special on MacOSX') + at add_standard_options(StandardTtkOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'orient', 'style', 'takefocus', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return ttk.Scrollbar(self.root, **kwargs) + + + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class NotebookTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', 'padding', 'style', 'takefocus', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.nb = ttk.Notebook(padding=0) + self.nb = self.create(padding=0) self.child1 = ttk.Label() self.child2 = ttk.Label() self.nb.add(self.child1, text='a') @@ -554,7 +922,10 @@ self.child2.destroy() self.nb.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Notebook(self.root, **kwargs) def test_tab_identifiers(self): self.nb.forget(0) @@ -746,16 +1117,68 @@ self.assertEqual(self.nb.select(), str(self.child1)) -class TreeviewTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class TreeviewTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'columns', 'cursor', 'displaycolumns', + 'height', 'padding', 'selectmode', 'show', + 'style', 'takefocus', 'xscrollcommand', 'yscrollcommand', + ) def setUp(self): + super().setUp() support.root_deiconify() - self.tv = ttk.Treeview(padding=0) + self.tv = self.create(padding=0) def tearDown(self): self.tv.destroy() support.root_withdraw() + super().tearDown() + def _create(self, **kwargs): + return ttk.Treeview(self.root, **kwargs) + + def test_columns(self): + widget = self.create() + self.checkParam(widget, 'columns', 'a b c', + expected=('a', 'b', 'c')) + self.checkParam(widget, 'columns', ('a', 'b', 'c')) + self.checkParam(widget, 'columns', ()) + + def test_displaycolumns(self): + widget = self.create() + widget['columns'] = ('a', 'b', 'c') + self.checkParam(widget, 'displaycolumns', 'b a c', + expected=('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', ('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', '#all', + expected=('#all',)) + self.checkParam(widget, 'displaycolumns', (2, 1, 0)) + self.checkInvalidParam(widget, 'displaycolumns', ('a', 'b', 'd'), + errmsg='Invalid column index d') + self.checkInvalidParam(widget, 'displaycolumns', (1, 2, 3), + errmsg='Column index 3 out of bounds') + self.checkInvalidParam(widget, 'displaycolumns', (1, -2), + errmsg='Column index -2 out of bounds') + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, -100, 0, '3c', conv=False) + self.checkPixelsParam(widget, 'height', 101.2, 102.6, conv=noconv) + + def test_selectmode(self): + widget = self.create() + self.checkEnumParam(widget, 'selectmode', + 'none', 'browse', 'extended') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', 'tree headings', + expected=('tree', 'headings')) + self.checkParam(widget, 'show', ('tree', 'headings')) + self.checkParam(widget, 'show', ('headings', 'tree')) + self.checkParam(widget, 'show', 'tree', 
expected=('tree',)) + self.checkParam(widget, 'show', 'headings', expected=('headings',)) def test_bbox(self): self.tv.pack() @@ -1149,11 +1572,35 @@ self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + at add_standard_options(StandardTtkOptionsTests) +class SeparatorTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'style', 'takefocus', + # 'state'? + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Separator(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class SizegripTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'style', 'takefocus', + # 'state'? + ) + + def _create(self, **kwargs): + return ttk.Sizegrip(self.root, **kwargs) + tests_gui = ( - WidgetTest, ButtonTest, CheckbuttonTest, RadiobuttonTest, - ComboboxTest, EntryTest, PanedwindowTest, ScaleTest, NotebookTest, - TreeviewTest + ButtonTest, CheckbuttonTest, ComboboxTest, EntryTest, + FrameTest, LabelFrameTest, LabelTest, MenubuttonTest, + NotebookTest, PanedWindowTest, ProgressbarTest, + RadiobuttonTest, ScaleTest, ScrollbarTest, SeparatorTest, + SizegripTest, TreeviewTest, WidgetTest, ) if __name__ == "__main__": - run_unittest(*tests_gui) + unittest.main() diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py new file mode 100644 --- /dev/null +++ b/Lib/tkinter/test/widget_tests.py @@ -0,0 +1,487 @@ +# Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py + +import tkinter +from tkinter.ttk import setup_master, Scale +from tkinter.test.support import (tcl_version, requires_tcl, pixels_conv, + tcl_obj_eq) + + +noconv = str if tcl_version < (8, 5) else False + +_sentinel = object() + +class AbstractWidgetTest: + _conv_pixels = round if tcl_version[:2] != (8, 5) else int + _conv_pad_pixels = None + wantobjects = True + + def setUp(self): + self.root = setup_master() + self.scaling = float(self.root.call('tk', 'scaling')) + if not self.root.wantobjects(): + self.wantobjects = False + + def create(self, **kwargs): + widget = self._create(**kwargs) + self.addCleanup(widget.destroy) + return widget + + def assertEqual2(self, actual, expected, msg=None, eq=object.__eq__): + if eq(actual, expected): + return + self.assertEqual(actual, expected, msg) + + def checkParam(self, widget, name, value, *, expected=_sentinel, + conv=False, eq=None): + widget[name] = value + if expected is _sentinel: + expected = value + if conv: + expected = conv(expected) + if not self.wantobjects: + if isinstance(expected, tuple): + expected = tkinter._join(expected) + else: + expected = str(expected) + if eq is None: + eq = tcl_obj_eq + self.assertEqual2(widget[name], expected, eq=eq) + self.assertEqual2(widget.cget(name), expected, eq=eq) + # XXX + if not isinstance(widget, Scale): + t = widget.configure(name) + self.assertEqual(len(t), 5) + ## XXX + if not isinstance(t[4], tuple): + self.assertEqual2(t[4], expected, eq=eq) + + def checkInvalidParam(self, widget, name, value, errmsg=None, *, + keep_orig=True): + orig = widget[name] + if errmsg is not None: + errmsg = errmsg.format(value) + with self.assertRaises(tkinter.TclError) as cm: + widget[name] = value + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + if keep_orig: + self.assertEqual(widget[name], orig) + else: + widget[name] = orig + with self.assertRaises(tkinter.TclError) as cm: + widget.configure({name: value}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + 
if keep_orig: + self.assertEqual(widget[name], orig) + else: + widget[name] = orig + + def checkParams(self, widget, name, *values, **kwargs): + for value in values: + self.checkParam(widget, name, value, **kwargs) + + def checkIntegerParam(self, widget, name, *values, **kwargs): + self.checkParams(widget, name, *values, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected integer but got ""') + self.checkInvalidParam(widget, name, '10p', + errmsg='expected integer but got "10p"') + self.checkInvalidParam(widget, name, 3.2, + errmsg='expected integer but got "3.2"') + + def checkFloatParam(self, widget, name, *values, conv=float, **kwargs): + for value in values: + self.checkParam(widget, name, value, conv=conv, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected floating-point number but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected floating-point number but got "spam"') + + def checkBooleanParam(self, widget, name): + for value in (False, 0, 'false', 'no', 'off'): + self.checkParam(widget, name, value, expected=0) + for value in (True, 1, 'true', 'yes', 'on'): + self.checkParam(widget, name, value, expected=1) + self.checkInvalidParam(widget, name, '', + errmsg='expected boolean value but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected boolean value but got "spam"') + + def checkColorParam(self, widget, name, *, allow_empty=None, **kwargs): + self.checkParams(widget, name, + '#ff0000', '#00ff00', '#0000ff', '#123456', + 'red', 'green', 'blue', 'white', 'black', 'grey', + **kwargs) + self.checkInvalidParam(widget, name, 'spam', + errmsg='unknown color name "spam"') + + def checkCursorParam(self, widget, name, **kwargs): + self.checkParams(widget, name, 'arrow', 'watch', 'cross', '',**kwargs) + if tcl_version >= (8, 5): + self.checkParam(widget, name, 'none') + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad cursor spec "spam"') + + def checkCommandParam(self, widget, name): + def command(*args): + pass + widget[name] = command + self.assertTrue(widget[name]) + self.checkParams(widget, name, '') + + def checkEnumParam(self, widget, name, *values, errmsg=None, **kwargs): + self.checkParams(widget, name, *values, **kwargs) + if errmsg is None: + errmsg2 = ' %s "{}": must be %s%s or %s' % ( + name, + ', '.join(values[:-1]), + ',' if len(values) > 2 else '', + values[-1]) + self.checkInvalidParam(widget, name, '', + errmsg='ambiguous' + errmsg2) + errmsg = 'bad' + errmsg2 + self.checkInvalidParam(widget, name, 'spam', errmsg=errmsg) + + def checkPixelsParam(self, widget, name, *values, + conv=None, keep_orig=True, **kwargs): + if conv is None: + conv = self._conv_pixels + for value in values: + expected = _sentinel + conv1 = conv + if isinstance(value, str): + if conv1 and conv1 is not str: + expected = pixels_conv(value) * self.scaling + conv1 = round + self.checkParam(widget, name, value, expected=expected, + conv=conv1, **kwargs) + self.checkInvalidParam(widget, name, '6x', + errmsg='bad screen distance "6x"', keep_orig=keep_orig) + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad screen distance "spam"', keep_orig=keep_orig) + + def checkReliefParam(self, widget, name): + self.checkParams(widget, name, + 'flat', 'groove', 'raised', 'ridge', 'solid', 'sunken') + errmsg='bad relief "spam": must be '\ + 'flat, groove, raised, ridge, solid, or sunken' + if tcl_version < (8, 6): + errmsg = None + self.checkInvalidParam(widget, name, 'spam', + errmsg=errmsg) + + def 
checkImageParam(self, widget, name): + image = tkinter.PhotoImage('image1') + self.checkParam(widget, name, image, conv=str) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + widget[name] = '' + + def checkVariableParam(self, widget, name, var): + self.checkParam(widget, name, var, conv=str) + + +class StandardOptionsTests: + STANDARD_OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'jump', 'justify', 'orient', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'takefocus', 'text', 'textvariable', 'troughcolor', + 'underline', 'wraplength', 'xscrollcommand', 'yscrollcommand', + ) + + def test_activebackground(self): + widget = self.create() + self.checkColorParam(widget, 'activebackground') + + def test_activeborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'activeborderwidth', + 0, 1.3, 2.9, 6, -2, '10p') + + def test_activeforeground(self): + widget = self.create() + self.checkColorParam(widget, 'activeforeground') + + def test_anchor(self): + widget = self.create() + self.checkEnumParam(widget, 'anchor', + 'n', 'ne', 'e', 'se', 's', 'sw', 'w', 'nw', 'center') + + def test_background(self): + widget = self.create() + self.checkColorParam(widget, 'background') + if 'bg' in self.OPTIONS: + self.checkColorParam(widget, 'bg') + + def test_bitmap(self): + widget = self.create() + self.checkParam(widget, 'bitmap', 'questhead') + self.checkParam(widget, 'bitmap', 'gray50') + self.checkInvalidParam(widget, 'bitmap', 'spam', + errmsg='bitmap "spam" not defined') + + def test_borderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'borderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + if 'bd' in self.OPTIONS: + self.checkPixelsParam(widget, 'bd', 0, 1.3, 2.6, 6, -2, '10p') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'bottom', 'center', 'left', 'none', 'right', 'top') + + def test_cursor(self): + widget = self.create() + self.checkCursorParam(widget, 'cursor') + + def test_disabledforeground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledforeground') + + def test_exportselection(self): + widget = self.create() + self.checkBooleanParam(widget, 'exportselection') + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + self.checkInvalidParam(widget, 'font', '', + errmsg='font "" doesn\'t exist') + + def test_foreground(self): + widget = self.create() + self.checkColorParam(widget, 'foreground') + if 'fg' in self.OPTIONS: + self.checkColorParam(widget, 'fg') + + def test_highlightbackground(self): + widget = self.create() + self.checkColorParam(widget, 'highlightbackground') + + def test_highlightcolor(self): + widget = self.create() + self.checkColorParam(widget, 'highlightcolor') + + def test_highlightthickness(self): + widget = self.create() + self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, '10p') + self.checkParam(widget, 'highlightthickness', -2, expected=0, + conv=self._conv_pixels) + + def test_image(self): + widget = 
self.create() + self.checkImageParam(widget, 'image') + + def test_insertbackground(self): + widget = self.create() + self.checkColorParam(widget, 'insertbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertborderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + + def test_insertofftime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertofftime', 100) + + def test_insertontime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertontime', 100) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 2.6, -2, '10p') + + def test_jump(self): + widget = self.create() + self.checkBooleanParam(widget, 'jump') + + def test_justify(self): + widget = self.create() + self.checkEnumParam(widget, 'justify', 'left', 'right', 'center', + errmsg='bad justification "{}": must be ' + 'left, right, or center') + self.checkInvalidParam(widget, 'justify', '', + errmsg='ambiguous justification "": must be ' + 'left, right, or center') + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), self.default_orient) + self.checkEnumParam(widget, 'orient', 'horizontal', 'vertical') + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_relief(self): + widget = self.create() + self.checkReliefParam(widget, 'relief') + + def test_repeatdelay(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatdelay', -500, 500) + + def test_repeatinterval(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatinterval', -500, 500) + + def test_selectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'selectbackground') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', 1.3, 2.6, -2, '10p') + + def test_selectforeground(self): + widget = self.create() + self.checkColorParam(widget, 'selectforeground') + + def test_setgrid(self): + widget = self.create() + self.checkBooleanParam(widget, 'setgrid') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'active', 'disabled', 'normal') + + def test_takefocus(self): + widget = self.create() + self.checkParams(widget, 'takefocus', '0', '1', '') + + def test_text(self): + widget = self.create() + self.checkParams(widget, 'text', '', 'any string') + + def test_textvariable(self): + widget = self.create() + var = tkinter.StringVar() + self.checkVariableParam(widget, 'textvariable', var) + + def test_troughcolor(self): + widget = self.create() + self.checkColorParam(widget, 'troughcolor') + + def test_underline(self): + widget = self.create() + self.checkIntegerParam(widget, 'underline', 0, 1, 10) + + def test_wraplength(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkPixelsParam(widget, 'wraplength', 100) + else: + self.checkParams(widget, 'wraplength', 100) + + def test_xscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'xscrollcommand') + + def test_yscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'yscrollcommand') + + # non-standard but common options + + def test_command(self): + widget = self.create() + self.checkCommandParam(widget, 
'command') + + def test_indicatoron(self): + widget = self.create() + self.checkBooleanParam(widget, 'indicatoron') + + def test_offrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'offrelief') + + def test_overrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'overrelief') + + def test_selectcolor(self): + widget = self.create() + self.checkColorParam(widget, 'selectcolor') + + def test_selectimage(self): + widget = self.create() + self.checkImageParam(widget, 'selectimage') + + @requires_tcl(8, 5) + def test_tristateimage(self): + widget = self.create() + self.checkImageParam(widget, 'tristateimage') + + @requires_tcl(8, 5) + def test_tristatevalue(self): + widget = self.create() + self.checkParam(widget, 'tristatevalue', 'unknowable') + + def test_variable(self): + widget = self.create() + var = tkinter.DoubleVar() + self.checkVariableParam(widget, 'variable', var) + + +class IntegerSizeTests: + def test_height(self): + widget = self.create() + self.checkIntegerParam(widget, 'height', 100, -100, 0) + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402, -402, 0) + + +class PixelSizeTests: + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '3c') + + def test_width(self): + widget = self.create() + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i') + + +def add_standard_options(*source_classes): + # This decorator adds test_xxx methods from source classes for every xxx + # option in the OPTIONS class attribute if they are not defined explicitly. + def decorator(cls): + for option in cls.OPTIONS: + methodname = 'test_' + option + if not hasattr(cls, methodname): + for source_class in source_classes: + if hasattr(source_class, methodname): + setattr(cls, methodname, + getattr(source_class, methodname)) + break + else: + def test(self, option=option): + widget = self.create() + widget[option] + raise AssertionError('Option "%s" is not tested in %s' % + (option, cls.__name__)) + test.__name__ = methodname + setattr(cls, methodname, test) + return cls + return decorator diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -147,6 +147,8 @@ Tests ----- +- Issue #19085: Added basic tests for all tkinter widget options. + - Issue 19384: Fix test_py_compile for root user, patch by Claudiu Popa. Build -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 09:48:39 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:48:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Added_basic_tests_for_all_tkinter_widget_options=2E?= Message-ID: <3dBYpH4dW3z7Ljr@mail.python.org> http://hg.python.org/cpython/rev/ced345326151 changeset: 86836:ced345326151 branch: 2.7 parent: 86818:01087a302721 user: Serhiy Storchaka date: Sat Nov 02 10:46:21 2013 +0200 summary: Issue #19085: Added basic tests for all tkinter widget options. 
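
As a usage sketch (not part of this changeset), a concrete widget test class plugs into the new AbstractWidgetTest / StandardOptionsTests helpers and the add_standard_options() class decorator from Lib/lib-tk/test/widget_tests.py roughly as follows, assuming a Tk display is available and Lib/lib-tk/test is importable; the class name MyLabelTest and its small OPTIONS subset are invented for illustration only:

    import unittest
    import Tkinter

    from widget_tests import (add_standard_options,
                              AbstractWidgetTest, StandardOptionsTests)

    @add_standard_options(StandardOptionsTests)
    class MyLabelTest(AbstractWidgetTest, unittest.TestCase):
        # Options without an explicit test_<option> method below are picked
        # up from StandardOptionsTests by the decorator; an option covered
        # by neither gets a placeholder test that fails with
        # 'Option "..." is not tested'.
        OPTIONS = ('background', 'cursor', 'text')

        def _create(self, **kwargs):
            # create()/checkParam() in AbstractWidgetTest use this factory.
            return Tkinter.Label(self.root, **kwargs)

        def test_text(self):
            widget = self.create()
            self.checkParams(widget, 'text', '', 'any string')

    if __name__ == '__main__':
        unittest.main()
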
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 916 ++++++++++ Lib/lib-tk/test/test_ttk/support.py | 40 + Lib/lib-tk/test/test_ttk/test_widgets.py | 488 +++++- Lib/lib-tk/test/widget_tests.py | 504 +++++ Misc/NEWS | 2 + 5 files changed, 1931 insertions(+), 19 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py new file mode 100644 --- /dev/null +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -0,0 +1,916 @@ +import unittest +import Tkinter +import os +from test.test_support import requires, run_unittest + +from test_ttk.support import tcl_version, requires_tcl, widget_eq +from widget_tests import (add_standard_options, noconv, int_round, + AbstractWidgetTest, StandardOptionsTests, + IntegerSizeTests, PixelSizeTests) + +requires('gui') + + +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + _conv_pad_pixels = noconv + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], + widget.__class__.__name__.title()) + self.checkInvalidParam(widget, 'class', 'Foo', + errmsg="can't modify -class option after widget is created") + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_colormap(self): + widget = self.create() + self.assertEqual(widget['colormap'], '') + self.checkInvalidParam(widget, 'colormap', 'new', + errmsg="can't modify -colormap option after widget is created") + widget2 = self.create(colormap='new') + self.assertEqual(widget2['colormap'], 'new') + + def test_container(self): + widget = self.create() + self.assertEqual(widget['container'], 0 if self.wantobjects else '0') + self.checkInvalidParam(widget, 'container', 1, + errmsg="can't modify -container option after widget is created") + widget2 = self.create(container=True) + self.assertEqual(widget2['container'], 1 if self.wantobjects else '1') + + def test_visual(self): + widget = self.create() + self.assertEqual(widget['visual'], '') + self.checkInvalidParam(widget, 'visual', 'default', + errmsg="can't modify -visual option after widget is created") + widget2 = self.create(visual='default') + self.assertEqual(widget2['visual'], 'default') + + + at add_standard_options(StandardOptionsTests) +class ToplevelTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'menu', 'padx', 'pady', 'relief', 'screen', + 'takefocus', 'use', 'visual', 'width', + ) + + def _create(self, **kwargs): + return Tkinter.Toplevel(self.root, **kwargs) + + def test_menu(self): + widget = self.create() + menu = Tkinter.Menu(self.root) + self.checkParam(widget, 'menu', menu, eq=widget_eq) + self.checkParam(widget, 'menu', '') + + def test_screen(self): + widget = self.create() + self.assertEqual(widget['screen'], '') + display = os.environ['DISPLAY'] + self.checkInvalidParam(widget, 'screen', display, + errmsg="can't modify -screen option after widget is created") + widget2 = self.create(screen=display) + self.assertEqual(widget2['screen'], display) + + def test_use(self): + widget = self.create() + self.assertEqual(widget['use'], '') + widget1 = self.create(container=True) + self.assertEqual(widget1['use'], '') + self.checkInvalidParam(widget1, 'use', '0x44022', + errmsg="can't modify -use option after widget is created") + wid = hex(widget1.winfo_id()) + widget2 = self.create(use=wid) + self.assertEqual(widget2['use'], wid) + + + at 
add_standard_options(StandardOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'relief', 'takefocus', 'visual', 'width', + ) + + def _create(self, **kwargs): + return Tkinter.Frame(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'class', 'colormap', 'container', 'cursor', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'labelanchor', 'labelwidget', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'visual', 'width', + ) + + def _create(self, **kwargs): + return Tkinter.LabelFrame(self.root, **kwargs) + + def test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', + 's', 'se', 'sw', 'w', 'wn', 'ws') + self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = Tkinter.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + +class AbstractLabelTest(AbstractWidgetTest, IntegerSizeTests): + _conv_pixels = noconv + + def test_highlightthickness(self): + widget = self.create() + self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, -2, '10p') + + + at add_standard_options(StandardOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return Tkinter.Label(self.root, **kwargs) + + + at add_standard_options(StandardOptionsTests) +class ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', 'default', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'justify', 'overrelief', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'state', 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength') + + def _create(self, **kwargs): + return Tkinter.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() + self.checkEnumParam(widget, 'default', 'active', 'disabled', 'normal') + + + at add_standard_options(StandardOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', + 'offrelief', 'offvalue', 'onvalue', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 'tristatevalue', + 'underline', 'variable', 'width', 
'wraplength', + ) + + def _create(self, **kwargs): + return Tkinter.Checkbutton(self.root, **kwargs) + + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'command', 'compound', 'cursor', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'offrelief', 'overrelief', + 'padx', 'pady', 'relief', 'selectcolor', 'selectimage', 'state', + 'takefocus', 'text', 'textvariable', + 'tristateimage', 'tristatevalue', + 'underline', 'value', 'variable', 'width', 'wraplength', + ) + + def _create(self, **kwargs): + return Tkinter.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') + + + at add_standard_options(StandardOptionsTests) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', + 'compound', 'cursor', 'direction', + 'disabledforeground', 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'indicatoron', 'justify', 'menu', + 'padx', 'pady', 'relief', 'state', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = AbstractWidgetTest._conv_pixels + + def _create(self, **kwargs): + return Tkinter.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'flush', 'left', 'right') + + def test_height(self): + widget = self.create() + self.checkIntegerParam(widget, 'height', 100, -100, 0, conv=str) + + test_highlightthickness = StandardOptionsTests.test_highlightthickness.im_func + + def test_image(self): + widget = self.create() + image = Tkinter.PhotoImage('image1') + self.checkParam(widget, 'image', image, conv=str) + errmsg = 'image "spam" doesn\'t exist' + with self.assertRaises(Tkinter.TclError) as cm: + widget['image'] = 'spam' + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + with self.assertRaises(Tkinter.TclError) as cm: + widget.configure({'image': 'spam'}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + + def test_menu(self): + widget = self.create() + menu = Tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, eq=widget_eq) + menu.destroy() + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'padx', -2, expected=0) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, '12m') + self.checkParam(widget, 'pady', -2, expected=0) + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402, -402, 0, conv=str) + + +class OptionMenuTest(MenubuttonTest, unittest.TestCase): + + def _create(self, default='b', values=('a', 'b', 'c'), **kwargs): + return Tkinter.OptionMenu(self.root, None, default, *values, **kwargs) + + + at add_standard_options(IntegerSizeTests, 
StandardOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'readonlybackground', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'show', 'state', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return Tkinter.Entry(self.root, **kwargs) + + def test_disabledbackground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertborderwidth', 0, 1.3, -2) + self.checkParam(widget, 'insertborderwidth', 2, expected=1) + self.checkParam(widget, 'insertborderwidth', '10p', expected=1) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') + if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.9, expected=2) + else: + self.checkParam(widget, 'insertwidth', 0.9, expected=1) + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + self.checkCommandParam(widget, 'invcmd') + + def test_readonlybackground(self): + widget = self.create() + self.checkColorParam(widget, 'readonlybackground') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + widget = self.create() + self.checkCommandParam(widget, 'validatecommand') + self.checkCommandParam(widget, 'vcmd') + + + at add_standard_options(StandardOptionsTests) +class SpinboxTest(EntryTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'borderwidth', + 'buttonbackground', 'buttoncursor', 'buttondownrelief', 'buttonuprelief', + 'command', 'cursor', 'disabledbackground', 'disabledforeground', + 'exportselection', 'font', 'foreground', 'format', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'increment', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'invalidcommand', 'justify', 'relief', 'readonlybackground', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', 'textvariable', 'to', + 'validate', 'validatecommand', 'values', + 'width', 'wrap', 'xscrollcommand', + ) + + def _create(self, **kwargs): + return Tkinter.Spinbox(self.root, **kwargs) + + test_show = None + + def test_buttonbackground(self): + widget = self.create() + self.checkColorParam(widget, 'buttonbackground') + + def test_buttoncursor(self): + widget = self.create() + self.checkCursorParam(widget, 'buttoncursor') + + def test_buttondownrelief(self): + widget = self.create() + 
self.checkReliefParam(widget, 'buttondownrelief') + + def test_buttonuprelief(self): + widget = self.create() + self.checkReliefParam(widget, 'buttonuprelief') + + def test_format(self): + widget = self.create() + self.checkParam(widget, 'format', '%2f') + self.checkParam(widget, 'format', '%2.2f') + self.checkParam(widget, 'format', '%.2f') + self.checkParam(widget, 'format', '%2.f') + self.checkInvalidParam(widget, 'format', '%2e-1f') + self.checkInvalidParam(widget, 'format', '2.2') + self.checkInvalidParam(widget, 'format', '%2.-2f') + self.checkParam(widget, 'format', '%-2.02f') + self.checkParam(widget, 'format', '% 2.02f') + self.checkParam(widget, 'format', '% -2.200f') + self.checkParam(widget, 'format', '%09.200f') + self.checkInvalidParam(widget, 'format', '%d') + + def test_from(self): + widget = self.create() + self.checkParam(widget, 'to', 100.0) + self.checkFloatParam(widget, 'from', -10, 10.2, 11.7) + self.checkInvalidParam(widget, 'from', 200, + errmsg='-to value must be greater than -from value') + + def test_increment(self): + widget = self.create() + self.checkFloatParam(widget, 'increment', -1, 1, 10.2, 12.8, 0) + + def test_to(self): + widget = self.create() + self.checkParam(widget, 'from', -100.0) + self.checkFloatParam(widget, 'to', -10, 10.2, 11.7) + self.checkInvalidParam(widget, 'to', -200, + errmsg='-to value must be greater than -from value') + + def test_values(self): + # XXX + widget = self.create() + self.assertEqual(widget['values'], '') + self.checkParam(widget, 'values', 'mon tue wed thur') + self.checkParam(widget, 'values', ('mon', 'tue', 'wed', 'thur'), + expected='mon tue wed thur') + self.checkParam(widget, 'values', (42, 3.14, '', 'any string'), + expected='42 3.14 {} {any string}') + self.checkParam(widget, 'values', '') + + def test_wrap(self): + widget = self.create() + self.checkBooleanParam(widget, 'wrap') + + + at add_standard_options(StandardOptionsTests) +class TextTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'autoseparators', 'background', 'blockcursor', 'borderwidth', + 'cursor', 'endline', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'inactiveselectbackground', 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertunfocussed', 'insertwidth', + 'maxundo', 'padx', 'pady', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'spacing1', 'spacing2', 'spacing3', 'startline', 'state', + 'tabs', 'tabstyle', 'takefocus', 'undo', 'width', 'wrap', + 'xscrollcommand', 'yscrollcommand', + ) + if tcl_version < (8, 5): + wantobjects = False + + def _create(self, **kwargs): + return Tkinter.Text(self.root, **kwargs) + + def test_autoseparators(self): + widget = self.create() + self.checkBooleanParam(widget, 'autoseparators') + + @requires_tcl(8, 5) + def test_blockcursor(self): + widget = self.create() + self.checkBooleanParam(widget, 'blockcursor') + + @requires_tcl(8, 5) + def test_endline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'endline', 200, expected='') + self.checkParam(widget, 'endline', -10, expected='') + self.checkInvalidParam(widget, 'endline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'endline', 50) + self.checkParam(widget, 'startline', 15) + self.checkInvalidParam(widget, 'endline', 10, + errmsg='-startline must be less than or equal to -endline') + + def 
test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, '3c') + self.checkParam(widget, 'height', -100, expected=1) + self.checkParam(widget, 'height', 0, expected=1) + + def test_maxundo(self): + widget = self.create() + self.checkIntegerParam(widget, 'maxundo', 0, 5, -1) + + @requires_tcl(8, 5) + def test_inactiveselectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'inactiveselectbackground') + + @requires_tcl(8, 6) + def test_insertunfocussed(self): + widget = self.create() + self.checkEnumParam(widget, 'insertunfocussed', + 'hollow', 'none', 'solid') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', + 1.3, 2.6, -2, '10p', conv=False, + keep_orig=tcl_version >= (8, 5)) + + def test_spacing1(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing1', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing1', -5, expected=0) + + def test_spacing2(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing2', 5, 6.4, 7.6, '0.1c') + self.checkParam(widget, 'spacing2', -1, expected=0) + + def test_spacing3(self): + widget = self.create() + self.checkPixelsParam(widget, 'spacing3', 20, 21.4, 22.6, '0.5c') + self.checkParam(widget, 'spacing3', -10, expected=0) + + @requires_tcl(8, 5) + def test_startline(self): + widget = self.create() + text = '\n'.join('Line %d' for i in range(100)) + widget.insert('end', text) + self.checkParam(widget, 'startline', 200, expected='') + self.checkParam(widget, 'startline', -10, expected='') + self.checkInvalidParam(widget, 'startline', 'spam', + errmsg='expected integer but got "spam"') + self.checkParam(widget, 'startline', 10) + self.checkParam(widget, 'endline', 50) + self.checkInvalidParam(widget, 'startline', 70, + errmsg='-startline must be less than or equal to -endline') + + def test_state(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'state', 'disabled', 'normal') + else: + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + def test_tabs(self): + widget = self.create() + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) + self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', + expected=('10.2', '20.7', '1i', '2i')) + self.checkParam(widget, 'tabs', '2c left 4c 6c center', + expected=('2c', 'left', '4c', '6c', 'center')) + self.checkInvalidParam(widget, 'tabs', 'spam', + errmsg='bad screen distance "spam"', + keep_orig=tcl_version >= (8, 5)) + + @requires_tcl(8, 5) + def test_tabstyle(self): + widget = self.create() + self.checkEnumParam(widget, 'tabstyle', 'tabular', 'wordprocessor') + + def test_undo(self): + widget = self.create() + self.checkBooleanParam(widget, 'undo') + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402) + self.checkParam(widget, 'width', -402, expected=1) + self.checkParam(widget, 'width', 0, expected=1) + + def test_wrap(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkParams(widget, 'wrap', 'char', 'none', 'word') + else: + self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class CanvasTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', + 'closeenough', 'confine', 'cursor', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'insertbackground', 'insertborderwidth', + 'insertofftime', 
'insertontime', 'insertwidth', + 'relief', 'scrollregion', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'state', 'takefocus', + 'xscrollcommand', 'xscrollincrement', + 'yscrollcommand', 'yscrollincrement', 'width', + ) + + _conv_pixels = staticmethod(int_round) + wantobjects = False + + def _create(self, **kwargs): + return Tkinter.Canvas(self.root, **kwargs) + + def test_closeenough(self): + widget = self.create() + self.checkFloatParam(widget, 'closeenough', 24, 2.4, 3.6, -3, + conv=float) + + def test_confine(self): + widget = self.create() + self.checkBooleanParam(widget, 'confine') + + def test_scrollregion(self): + widget = self.create() + self.checkParam(widget, 'scrollregion', '0 0 200 150') + self.checkParam(widget, 'scrollregion', (0, 0, 200, 150), + expected='0 0 200 150') + self.checkParam(widget, 'scrollregion', '') + self.checkInvalidParam(widget, 'scrollregion', 'spam', + errmsg='bad scrollRegion "spam"') + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 'spam')) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200)) + self.checkInvalidParam(widget, 'scrollregion', (0, 0, 200, 150, 0)) + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal', + errmsg='bad state value "{}": must be normal or disabled') + + def test_xscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'xscrollincrement', + 40, 0, 41.2, 43.6, -40, '0.5i') + + def test_yscrollincrement(self): + widget = self.create() + self.checkPixelsParam(widget, 'yscrollincrement', + 10, 0, 11.2, 13.6, -10, '0.1i') + + + at add_standard_options(IntegerSizeTests, StandardOptionsTests) +class ListboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activestyle', 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'exportselection', + 'font', 'foreground', 'height', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'listvariable', 'relief', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'selectmode', 'setgrid', 'state', + 'takefocus', 'width', 'xscrollcommand', 'yscrollcommand', + ) + + def _create(self, **kwargs): + return Tkinter.Listbox(self.root, **kwargs) + + def test_activestyle(self): + widget = self.create() + self.checkEnumParam(widget, 'activestyle', + 'dotbox', 'none', 'underline') + + def test_listvariable(self): + widget = self.create() + var = Tkinter.DoubleVar() + self.checkVariableParam(widget, 'listvariable', var) + + def test_selectmode(self): + widget = self.create() + self.checkParam(widget, 'selectmode', 'single') + self.checkParam(widget, 'selectmode', 'browse') + self.checkParam(widget, 'selectmode', 'multiple') + self.checkParam(widget, 'selectmode', 'extended') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'background', 'bigincrement', 'borderwidth', + 'command', 'cursor', 'digits', 'font', 'foreground', 'from', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'label', 'length', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'resolution', 'showvalue', 'sliderlength', 'sliderrelief', 'state', + 'takefocus', 'tickinterval', 'to', 'troughcolor', 'variable', 'width', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return Tkinter.Scale(self.root, **kwargs) + + def 
test_bigincrement(self): + widget = self.create() + self.checkFloatParam(widget, 'bigincrement', 12.4, 23.6, -5) + + def test_digits(self): + widget = self.create() + self.checkIntegerParam(widget, 'digits', 5, 0) + + def test_from(self): + widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=round) + + def test_label(self): + widget = self.create() + self.checkParam(widget, 'label', 'any string') + self.checkParam(widget, 'label', '') + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_resolution(self): + widget = self.create() + self.checkFloatParam(widget, 'resolution', 4.2, 0, 6.7, -2) + + def test_showvalue(self): + widget = self.create() + self.checkBooleanParam(widget, 'showvalue') + + def test_sliderlength(self): + widget = self.create() + self.checkPixelsParam(widget, 'sliderlength', + 10, 11.2, 15.6, -3, '3m') + + def test_sliderrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sliderrelief') + + def test_tickinterval(self): + widget = self.create() + self.checkFloatParam(widget, 'tickinterval', 1, 4.3, 7.6, 0, + conv=round) + self.checkParam(widget, 'tickinterval', -2, expected=2, + conv=round) + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, + conv=round) + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activerelief', + 'background', 'borderwidth', + 'command', 'cursor', 'elementborderwidth', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'jump', 'orient', 'relief', + 'repeatdelay', 'repeatinterval', + 'takefocus', 'troughcolor', 'width', + ) + _conv_pixels = staticmethod(int_round) + wantobjects = False + default_orient = 'vertical' + + def _create(self, **kwargs): + return Tkinter.Scrollbar(self.root, **kwargs) + + def test_activerelief(self): + widget = self.create() + self.checkReliefParam(widget, 'activerelief') + + def test_elementborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'elementborderwidth', 4.3, 5.6, -2, '1m') + + def test_orient(self): + widget = self.create() + self.checkEnumParam(widget, 'orient', 'vertical', 'horizontal', + errmsg='bad orientation "{}": must be vertical or horizontal') + + + at add_standard_options(StandardOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'borderwidth', 'cursor', + 'handlepad', 'handlesize', 'height', + 'opaqueresize', 'orient', 'relief', + 'sashcursor', 'sashpad', 'sashrelief', 'sashwidth', + 'showhandle', 'width', + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return Tkinter.PanedWindow(self.root, **kwargs) + + def test_handlepad(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlepad', 5, 6.4, 7.6, -3, '1m') + + def test_handlesize(self): + widget = self.create() + self.checkPixelsParam(widget, 'handlesize', 8, 9.4, 10.6, -3, '2m', + conv=noconv) + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i', + conv=noconv) + + def test_opaqueresize(self): + widget = self.create() + self.checkBooleanParam(widget, 'opaqueresize') + + def test_sashcursor(self): + widget = self.create() + self.checkCursorParam(widget, 'sashcursor') + + def test_sashpad(self): + widget = self.create() + self.checkPixelsParam(widget, 
'sashpad', 8, 1.3, 2.6, -2, '2m') + + def test_sashrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'sashrelief') + + def test_sashwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'sashwidth', 10, 11.1, 15.6, -3, '1m', + conv=noconv) + + def test_showhandle(self): + widget = self.create() + self.checkBooleanParam(widget, 'showhandle') + + def test_width(self): + widget = self.create() + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i', + conv=noconv) + + + at add_standard_options(StandardOptionsTests) +class MenuTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', + 'background', 'borderwidth', 'cursor', + 'disabledforeground', 'font', 'foreground', + 'postcommand', 'relief', 'selectcolor', 'takefocus', + 'tearoff', 'tearoffcommand', 'title', 'type', + ) + _conv_pixels = noconv + + def _create(self, **kwargs): + return Tkinter.Menu(self.root, **kwargs) + + def test_postcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'postcommand') + + def test_tearoff(self): + widget = self.create() + self.checkBooleanParam(widget, 'tearoff') + + def test_tearoffcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'tearoffcommand') + + def test_title(self): + widget = self.create() + self.checkParam(widget, 'title', 'any string') + + def test_type(self): + widget = self.create() + self.checkEnumParam(widget, 'type', + 'normal', 'tearoff', 'menubar') + + + at add_standard_options(PixelSizeTests, StandardOptionsTests) +class MessageTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'aspect', 'background', 'borderwidth', + 'cursor', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'justify', 'padx', 'pady', 'relief', + 'takefocus', 'text', 'textvariable', 'width', + ) + _conv_pad_pixels = noconv + + def _create(self, **kwargs): + return Tkinter.Message(self.root, **kwargs) + + def test_aspect(self): + widget = self.create() + self.checkIntegerParam(widget, 'aspect', 250, 0, -300) + + +tests_gui = [ + ButtonTest, CanvasTest, CheckbuttonTest, EntryTest, + FrameTest, LabelFrameTest,LabelTest, ListboxTest, + MenubuttonTest, MenuTest, MessageTest, OptionMenuTest, + PanedWindowTest, RadiobuttonTest, ScaleTest, ScrollbarTest, + SpinboxTest, TextTest, ToplevelTest, +] + +if __name__ == '__main__': + run_unittest(*tests_gui) diff --git a/Lib/lib-tk/test/test_ttk/support.py b/Lib/lib-tk/test/test_ttk/support.py --- a/Lib/lib-tk/test/test_ttk/support.py +++ b/Lib/lib-tk/test/test_ttk/support.py @@ -1,3 +1,4 @@ +import unittest import Tkinter def get_tk_root(): @@ -31,3 +32,42 @@ widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) widget.event_generate('', x=x, y=y) + + +import _tkinter +tcl_version = tuple(map(int, _tkinter.TCL_VERSION.split('.'))) + +def requires_tcl(*version): + return unittest.skipUnless(tcl_version >= version, + 'requires Tcl version >= ' + '.'.join(map(str, version))) + +units = { + 'c': 72 / 2.54, # centimeters + 'i': 72, # inches + 'm': 72 / 25.4, # millimeters + 'p': 1, # points +} + +def pixels_conv(value): + return float(value[:-1]) * units[value[-1:]] + +def tcl_obj_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, _tkinter.Tcl_Obj): + if isinstance(expected, str): + return str(actual) == expected + if isinstance(actual, tuple): + if isinstance(expected, tuple): + return (len(actual) == len(expected) 
and + all(tcl_obj_eq(act, exp) + for act, exp in zip(actual, expected))) + return False + +def widget_eq(actual, expected): + if actual == expected: + return True + if isinstance(actual, (str, Tkinter.Widget)): + if isinstance(expected, (str, Tkinter.Widget)): + return str(actual) == str(expected) + return False diff --git a/Lib/lib-tk/test/test_ttk/test_widgets.py b/Lib/lib-tk/test/test_ttk/test_widgets.py --- a/Lib/lib-tk/test/test_ttk/test_widgets.py +++ b/Lib/lib-tk/test/test_ttk/test_widgets.py @@ -6,9 +6,53 @@ import support from test_functions import MockTclObj, MockStateSpec +from support import tcl_version +from widget_tests import (add_standard_options, noconv, + AbstractWidgetTest, StandardOptionsTests, + IntegerSizeTests, PixelSizeTests) requires('gui') + +class StandardTtkOptionsTests(StandardOptionsTests): + + def test_class(self): + widget = self.create() + self.assertEqual(widget['class'], '') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + self.checkInvalidParam(widget, 'class', 'Foo', errmsg=errmsg) + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + + def test_padding(self): + widget = self.create() + self.checkParam(widget, 'padding', 0, expected=('0',)) + self.checkParam(widget, 'padding', 5, expected=('5',)) + self.checkParam(widget, 'padding', (5, 6), expected=('5', '6')) + self.checkParam(widget, 'padding', (5, 6, 7), + expected=('5', '6', '7')) + self.checkParam(widget, 'padding', (5, 6, 7, 8), + expected=('5', '6', '7', '8')) + self.checkParam(widget, 'padding', ('5p', '6p', '7p', '8p')) + self.checkParam(widget, 'padding', (), expected='') + + def test_style(self): + widget = self.create() + self.assertEqual(widget['style'], '') + errmsg = 'Layout Foo not found' + if hasattr(self, 'default_orient'): + errmsg = ('Layout %s.Foo not found' % + getattr(self, 'default_orient').title()) + self.checkInvalidParam(widget, 'style', 'Foo', + errmsg=errmsg) + widget2 = self.create(class_='Foo') + self.assertEqual(widget2['class'], 'Foo') + # XXX + pass + + class WidgetTest(unittest.TestCase): """Tests methods available in every ttk widget.""" @@ -72,7 +116,112 @@ self.assertEqual(self.widget.state(), ('active', )) -class ButtonTest(unittest.TestCase): +class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): + _conv_pixels = noconv + + + at add_standard_options(StandardTtkOptionsTests) +class FrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'padding', 'relief', 'style', 'takefocus', + 'width', + ) + + def _create(self, **kwargs): + return ttk.Frame(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelFrameTest(AbstractToplevelTest, unittest.TestCase): + OPTIONS = ( + 'borderwidth', 'class', 'cursor', 'height', + 'labelanchor', 'labelwidget', + 'padding', 'relief', 'style', 'takefocus', + 'text', 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.LabelFrame(self.root, **kwargs) + + def test_labelanchor(self): + widget = self.create() + self.checkEnumParam(widget, 'labelanchor', + 'e', 'en', 'es', 'n', 'ne', 'nw', 's', 'se', 'sw', 'w', 'wn', 'ws', + errmsg='Bad label anchor specification {}') + self.checkInvalidParam(widget, 'labelanchor', 'center') + + def test_labelwidget(self): + widget = self.create() + label = ttk.Label(self.root, text='Mupp', name='foo') + self.checkParam(widget, 'labelwidget', label, expected='.foo') + label.destroy() + + 
+class AbstractLabelTest(AbstractWidgetTest): + + def checkImageParam(self, widget, name): + image = Tkinter.PhotoImage('image1') + image2 = Tkinter.PhotoImage('image2') + self.checkParam(widget, name, image, expected=('image1',)) + self.checkParam(widget, name, 'image1', expected=('image1',)) + self.checkParam(widget, name, (image,), expected=('image1',)) + self.checkParam(widget, name, (image, 'active', image2), + expected=('image1', 'active', 'image2')) + self.checkParam(widget, name, 'image1 active image2', + expected=('image1', 'active', 'image2')) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'none', 'text', 'image', 'center', + 'top', 'bottom', 'left', 'right') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') + + def test_width(self): + widget = self.create() + self.checkParams(widget, 'width', 402, -402, 0) + + + at add_standard_options(StandardTtkOptionsTests) +class LabelTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'anchor', 'background', + 'class', 'compound', 'cursor', 'font', 'foreground', + 'image', 'justify', 'padding', 'relief', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', 'wraplength', + ) + _conv_pixels = noconv + + def _create(self, **kwargs): + return ttk.Label(self.root, **kwargs) + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + + + at add_standard_options(StandardTtkOptionsTests) +class ButtonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', 'default', + 'image', 'state', 'style', 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) + + def _create(self, **kwargs): + return ttk.Button(self.root, **kwargs) + + def test_default(self): + widget = self.create() + self.checkEnumParam(widget, 'default', 'normal', 'active', 'disabled') def test_invoke(self): success = [] @@ -81,7 +230,27 @@ self.assertTrue(success) -class CheckbuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class CheckbuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'offvalue', 'onvalue', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Checkbutton(self.root, **kwargs) + + def test_offvalue(self): + widget = self.create() + self.checkParams(widget, 'offvalue', 1, 2.3, '', 'any string') + + def test_onvalue(self): + widget = self.create() + self.checkParams(widget, 'onvalue', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -104,21 +273,40 @@ cbtn['command'] = '' res = cbtn.invoke() - self.assertEqual(str(res), '') + self.assertFalse(str(res)) self.assertFalse(len(success) > 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) -class ComboboxTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class ComboboxTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'exportselection', 'height', + 'justify', 'postcommand', 'state', 'style', + 'takefocus', 'textvariable', 'values', 'width', + ) def setUp(self): + super(ComboboxTest, self).setUp() support.root_deiconify() - self.combo = ttk.Combobox() 
+ self.combo = self.create() def tearDown(self): self.combo.destroy() support.root_withdraw() + super(ComboboxTest, self).tearDown() + + def _create(self, **kwargs): + return ttk.Combobox(self.root, **kwargs) + + def test_height(self): + widget = self.create() + self.checkParams(widget, 'height', 100, 101.2, 102.6, -100, 0, '1i') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', 'active', 'disabled', 'normal') def _show_drop_down_listbox(self): width = self.combo.winfo_width() @@ -166,8 +354,16 @@ self.assertEqual(self.combo.get(), getval) self.assertEqual(self.combo.current(), currval) + self.assertEqual(self.combo['values'], + () if tcl_version < (8, 5) else '') check_get_current('', -1) + self.checkParam(self.combo, 'values', 'mon tue wed thur', + expected=('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', ('mon', 'tue', 'wed', 'thur')) + self.checkParam(self.combo, 'values', (42, 3.14, '', 'any string')) + self.checkParam(self.combo, 'values', '') + self.combo['values'] = ['a', 1, 'c'] self.combo.set('c') @@ -208,15 +404,52 @@ combo2.destroy() -class EntryTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class EntryTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'background', 'class', 'cursor', + 'exportselection', 'font', + 'invalidcommand', 'justify', + 'show', 'state', 'style', 'takefocus', 'textvariable', + 'validate', 'validatecommand', 'width', 'xscrollcommand', + ) def setUp(self): + super(EntryTest, self).setUp() support.root_deiconify() - self.entry = ttk.Entry() + self.entry = self.create() def tearDown(self): self.entry.destroy() support.root_withdraw() + super(EntryTest, self).tearDown() + + def _create(self, **kwargs): + return ttk.Entry(self.root, **kwargs) + + def test_invalidcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'invalidcommand') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', '*') + self.checkParam(widget, 'show', '') + self.checkParam(widget, 'show', ' ') + + def test_state(self): + widget = self.create() + self.checkParams(widget, 'state', + 'disabled', 'normal', 'readonly') + + def test_validate(self): + widget = self.create() + self.checkEnumParam(widget, 'validate', + 'all', 'key', 'focus', 'focusin', 'focusout', 'none') + + def test_validatecommand(self): + widget = self.create() + self.checkCommandParam(widget, 'validatecommand') def test_bbox(self): @@ -312,16 +545,36 @@ self.assertEqual(self.entry.state(), ()) -class PanedwindowTest(unittest.TestCase): + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', + 'orient', 'style', 'takefocus', 'width', + ) def setUp(self): + super(PanedWindowTest, self).setUp() support.root_deiconify() - self.paned = ttk.Panedwindow() + self.paned = self.create() def tearDown(self): self.paned.destroy() support.root_withdraw() + super(PanedWindowTest, self).tearDown() + def _create(self, **kwargs): + return ttk.PanedWindow(self.root, **kwargs) + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), 'vertical') + errmsg='attempt to change read-only option' + if tcl_version < (8, 6): + errmsg='Attempt to change read-only option' + self.checkInvalidParam(widget, 'orient', 'horizontal', + errmsg=errmsg) + widget2 = self.create(orient='horizontal') + self.assertEqual(str(widget2['orient']), 
'horizontal') def test_add(self): # attempt to add a child that is not a direct child of the paned window @@ -431,7 +684,22 @@ self.assertTrue(isinstance(self.paned.sashpos(0), int)) -class RadiobuttonTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class RadiobuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'compound', 'cursor', + 'image', + 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'value', 'variable', 'width', + ) + + def _create(self, **kwargs): + return ttk.Radiobutton(self.root, **kwargs) + + def test_value(self): + widget = self.create() + self.checkParams(widget, 'value', 1, 2.3, '', 'any string') def test_invoke(self): success = [] @@ -461,19 +729,68 @@ self.assertEqual(str(cbtn['variable']), str(cbtn2['variable'])) +class MenubuttonTest(AbstractLabelTest, unittest.TestCase): + OPTIONS = ( + 'class', 'compound', 'cursor', 'direction', + 'image', 'menu', 'state', 'style', + 'takefocus', 'text', 'textvariable', + 'underline', 'width', + ) -class ScaleTest(unittest.TestCase): + def _create(self, **kwargs): + return ttk.Menubutton(self.root, **kwargs) + + def test_direction(self): + widget = self.create() + self.checkEnumParam(widget, 'direction', + 'above', 'below', 'left', 'right', 'flush') + + def test_menu(self): + widget = self.create() + menu = Tkinter.Menu(widget, name='menu') + self.checkParam(widget, 'menu', menu, conv=str) + menu.destroy() + + + at add_standard_options(StandardTtkOptionsTests) +class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'from', 'length', + 'orient', 'style', 'takefocus', 'to', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' def setUp(self): + super(ScaleTest, self).setUp() support.root_deiconify() - self.scale = ttk.Scale() + self.scale = self.create() self.scale.pack() self.scale.update() def tearDown(self): self.scale.destroy() support.root_withdraw() + super(ScaleTest, self).tearDown() + def _create(self, **kwargs): + return ttk.Scale(self.root, **kwargs) + + def test_from(self): + widget = self.create() + self.checkFloatParam(widget, 'from', 100, 14.9, 15.1, conv=False) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 130, 131.2, 135.6, '5i') + + def test_to(self): + widget = self.create() + self.checkFloatParam(widget, 'to', 300, 14.9, 15.1, -10, conv=False) + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 300, 14.9, 15.1, -10, conv=False) def test_custom_event(self): failure = [1, 1, 1] # will need to be empty @@ -538,11 +855,64 @@ self.assertRaises(Tkinter.TclError, self.scale.set, None) -class NotebookTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class ProgressbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'length', + 'mode', 'maximum', 'phase', + 'style', 'takefocus', 'value', 'variable', + ) + _conv_pixels = noconv + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Progressbar(self.root, **kwargs) + + def test_length(self): + widget = self.create() + self.checkPixelsParam(widget, 'length', 100.1, 56.7, '2i') + + def test_maximum(self): + widget = self.create() + self.checkFloatParam(widget, 'maximum', 150.2, 77.7, 0, -10, conv=False) + + def test_mode(self): + widget = self.create() + self.checkEnumParam(widget, 'mode', 'determinate', 'indeterminate') + + def test_phase(self): 
+ # XXX + pass + + def test_value(self): + widget = self.create() + self.checkFloatParam(widget, 'value', 150.2, 77.7, 0, -10, + conv=False) + + + at unittest.skipIf(sys.platform == 'darwin', + 'ttk.Scrollbar is special on MacOSX') + at add_standard_options(StandardTtkOptionsTests) +class ScrollbarTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'command', 'cursor', 'orient', 'style', 'takefocus', + ) + default_orient = 'vertical' + + def _create(self, **kwargs): + return ttk.Scrollbar(self.root, **kwargs) + + + at add_standard_options(IntegerSizeTests, StandardTtkOptionsTests) +class NotebookTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'height', 'padding', 'style', 'takefocus', + ) def setUp(self): + super(NotebookTest, self).setUp() support.root_deiconify() - self.nb = ttk.Notebook(padding=0) + self.nb = self.create(padding=0) self.child1 = ttk.Label() self.child2 = ttk.Label() self.nb.add(self.child1, text='a') @@ -553,7 +923,10 @@ self.child2.destroy() self.nb.destroy() support.root_withdraw() + super(NotebookTest, self).tearDown() + def _create(self, **kwargs): + return ttk.Notebook(self.root, **kwargs) def test_tab_identifiers(self): self.nb.forget(0) @@ -745,16 +1118,68 @@ self.assertEqual(self.nb.select(), str(self.child1)) -class TreeviewTest(unittest.TestCase): + at add_standard_options(StandardTtkOptionsTests) +class TreeviewTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'columns', 'cursor', 'displaycolumns', + 'height', 'padding', 'selectmode', 'show', + 'style', 'takefocus', 'xscrollcommand', 'yscrollcommand', + ) def setUp(self): + super(TreeviewTest, self).setUp() support.root_deiconify() - self.tv = ttk.Treeview(padding=0) + self.tv = self.create(padding=0) def tearDown(self): self.tv.destroy() support.root_withdraw() + super(TreeviewTest, self).tearDown() + def _create(self, **kwargs): + return ttk.Treeview(self.root, **kwargs) + + def test_columns(self): + widget = self.create() + self.checkParam(widget, 'columns', 'a b c', + expected=('a', 'b', 'c')) + self.checkParam(widget, 'columns', ('a', 'b', 'c')) + self.checkParam(widget, 'columns', '') + + def test_displaycolumns(self): + widget = self.create() + widget['columns'] = ('a', 'b', 'c') + self.checkParam(widget, 'displaycolumns', 'b a c', + expected=('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', ('b', 'a', 'c')) + self.checkParam(widget, 'displaycolumns', '#all', + expected=('#all',)) + self.checkParam(widget, 'displaycolumns', (2, 1, 0)) + self.checkInvalidParam(widget, 'displaycolumns', ('a', 'b', 'd'), + errmsg='Invalid column index d') + self.checkInvalidParam(widget, 'displaycolumns', (1, 2, 3), + errmsg='Column index 3 out of bounds') + self.checkInvalidParam(widget, 'displaycolumns', (1, -2), + errmsg='Column index -2 out of bounds') + + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, -100, 0, '3c', conv=False) + self.checkPixelsParam(widget, 'height', 101.2, 102.6, conv=noconv) + + def test_selectmode(self): + widget = self.create() + self.checkEnumParam(widget, 'selectmode', + 'none', 'browse', 'extended') + + def test_show(self): + widget = self.create() + self.checkParam(widget, 'show', 'tree headings', + expected=('tree', 'headings')) + self.checkParam(widget, 'show', ('tree', 'headings')) + self.checkParam(widget, 'show', ('headings', 'tree')) + self.checkParam(widget, 'show', 'tree', expected=('tree',)) + self.checkParam(widget, 'show', 'headings', 
expected=('headings',)) def test_bbox(self): self.tv.pack() @@ -1148,10 +1573,35 @@ self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + at add_standard_options(StandardTtkOptionsTests) +class SeparatorTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'orient', 'style', 'takefocus', + # 'state'? + ) + default_orient = 'horizontal' + + def _create(self, **kwargs): + return ttk.Separator(self.root, **kwargs) + + + at add_standard_options(StandardTtkOptionsTests) +class SizegripTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( + 'class', 'cursor', 'style', 'takefocus', + # 'state'? + ) + + def _create(self, **kwargs): + return ttk.Sizegrip(self.root, **kwargs) + + tests_gui = ( - WidgetTest, ButtonTest, CheckbuttonTest, RadiobuttonTest, - ComboboxTest, EntryTest, PanedwindowTest, ScaleTest, NotebookTest, - TreeviewTest + ButtonTest, CheckbuttonTest, ComboboxTest, EntryTest, + FrameTest, LabelFrameTest, LabelTest, MenubuttonTest, + NotebookTest, PanedWindowTest, ProgressbarTest, + RadiobuttonTest, ScaleTest, ScrollbarTest, SeparatorTest, + SizegripTest, TreeviewTest, WidgetTest, ) if __name__ == "__main__": diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py new file mode 100644 --- /dev/null +++ b/Lib/lib-tk/test/widget_tests.py @@ -0,0 +1,504 @@ +# Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py + +import Tkinter +from ttk import setup_master, Scale +from test_ttk.support import tcl_version, requires_tcl, pixels_conv, tcl_obj_eq + + +noconv = str if tcl_version < (8, 5) else False + +def int_round(x): + return int(round(x)) + +_sentinel = object() + +class AbstractWidgetTest(object): + _conv_pixels = staticmethod(int_round) if tcl_version[:2] != (8, 5) else int + _conv_pad_pixels = None + wantobjects = True + + def setUp(self): + self.root = setup_master() + self.scaling = float(self.root.call('tk', 'scaling')) + if not self.root.wantobjects(): + self.wantobjects = False + + def create(self, **kwargs): + widget = self._create(**kwargs) + self.addCleanup(widget.destroy) + return widget + + def assertEqual2(self, actual, expected, msg=None, eq=object.__eq__): + if eq(actual, expected): + return + self.assertEqual(actual, expected, msg) + + def checkParam(self, widget, name, value, expected=_sentinel, + conv=False, eq=None): + widget[name] = value + if expected is _sentinel: + expected = value + if conv: + expected = conv(expected) + if not self.wantobjects: + if isinstance(expected, tuple): + expected = Tkinter._join(expected) + else: + expected = str(expected) + if eq is None: + eq = tcl_obj_eq + self.assertEqual2(widget[name], expected, eq=eq) + self.assertEqual2(widget.cget(name), expected, eq=eq) + # XXX + if not isinstance(widget, Scale): + t = widget.configure(name) + self.assertEqual(len(t), 5) + ## XXX + if not isinstance(t[4], tuple): + self.assertEqual2(t[4], expected, eq=eq) + + def checkInvalidParam(self, widget, name, value, errmsg=None, + keep_orig=True): + orig = widget[name] + if errmsg is not None: + errmsg = errmsg.format(value) + with self.assertRaises(Tkinter.TclError) as cm: + widget[name] = value + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + if keep_orig: + self.assertEqual(widget[name], orig) + else: + widget[name] = orig + with self.assertRaises(Tkinter.TclError) as cm: + widget.configure({name: value}) + if errmsg is not None: + self.assertEqual(str(cm.exception), errmsg) + if keep_orig: + self.assertEqual(widget[name], orig) + else: + 
widget[name] = orig + + def checkParams(self, widget, name, *values, **kwargs): + for value in values: + self.checkParam(widget, name, value, **kwargs) + + def checkIntegerParam(self, widget, name, *values, **kwargs): + self.checkParams(widget, name, *values, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected integer but got ""') + self.checkInvalidParam(widget, name, '10p', + errmsg='expected integer but got "10p"') + self.checkInvalidParam(widget, name, 3.2, + errmsg='expected integer but got "3.2"') + + def checkFloatParam(self, widget, name, *values, **kwargs): + if 'conv' in kwargs: + conv = kwargs.pop('conv') + else: + conv = float + for value in values: + self.checkParam(widget, name, value, conv=conv, **kwargs) + self.checkInvalidParam(widget, name, '', + errmsg='expected floating-point number but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected floating-point number but got "spam"') + + def checkBooleanParam(self, widget, name): + for value in (False, 0, 'false', 'no', 'off'): + self.checkParam(widget, name, value, expected=0) + for value in (True, 1, 'true', 'yes', 'on'): + self.checkParam(widget, name, value, expected=1) + self.checkInvalidParam(widget, name, '', + errmsg='expected boolean value but got ""') + self.checkInvalidParam(widget, name, 'spam', + errmsg='expected boolean value but got "spam"') + + def checkColorParam(self, widget, name, allow_empty=None, **kwargs): + self.checkParams(widget, name, + '#ff0000', '#00ff00', '#0000ff', '#123456', + 'red', 'green', 'blue', 'white', 'black', 'grey', + **kwargs) + self.checkInvalidParam(widget, name, 'spam', + errmsg='unknown color name "spam"') + + def checkCursorParam(self, widget, name, **kwargs): + self.checkParams(widget, name, 'arrow', 'watch', 'cross', '',**kwargs) + if tcl_version >= (8, 5): + self.checkParam(widget, name, 'none') + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad cursor spec "spam"') + + def checkCommandParam(self, widget, name): + def command(*args): + pass + widget[name] = command + self.assertTrue(widget[name]) + self.checkParams(widget, name, '') + + def checkEnumParam(self, widget, name, *values, **kwargs): + if 'errmsg' in kwargs: + errmsg = kwargs.pop('errmsg') + else: + errmsg = None + self.checkParams(widget, name, *values, **kwargs) + if errmsg is None: + errmsg2 = ' %s "{}": must be %s%s or %s' % ( + name, + ', '.join(values[:-1]), + ',' if len(values) > 2 else '', + values[-1]) + self.checkInvalidParam(widget, name, '', + errmsg='ambiguous' + errmsg2) + errmsg = 'bad' + errmsg2 + self.checkInvalidParam(widget, name, 'spam', errmsg=errmsg) + + def checkPixelsParam(self, widget, name, *values, **kwargs): + if 'conv' in kwargs: + conv = kwargs.pop('conv') + else: + conv = None + if conv is None: + conv = self._conv_pixels + if 'keep_orig' in kwargs: + keep_orig = kwargs.pop('keep_orig') + else: + keep_orig = True + for value in values: + expected = _sentinel + conv1 = conv + if isinstance(value, str): + if conv1 and conv1 is not str: + expected = pixels_conv(value) * self.scaling + conv1 = int_round + self.checkParam(widget, name, value, expected=expected, + conv=conv1, **kwargs) + self.checkInvalidParam(widget, name, '6x', + errmsg='bad screen distance "6x"', keep_orig=keep_orig) + self.checkInvalidParam(widget, name, 'spam', + errmsg='bad screen distance "spam"', keep_orig=keep_orig) + + def checkReliefParam(self, widget, name): + self.checkParams(widget, name, + 'flat', 'groove', 'raised', 'ridge', 'solid', 'sunken') + errmsg='bad 
relief "spam": must be '\ + 'flat, groove, raised, ridge, solid, or sunken' + if tcl_version < (8, 6): + errmsg = None + self.checkInvalidParam(widget, name, 'spam', + errmsg=errmsg) + + def checkImageParam(self, widget, name): + image = Tkinter.PhotoImage('image1') + self.checkParam(widget, name, image, conv=str) + self.checkInvalidParam(widget, name, 'spam', + errmsg='image "spam" doesn\'t exist') + widget[name] = '' + + def checkVariableParam(self, widget, name, var): + self.checkParam(widget, name, var, conv=str) + + +class StandardOptionsTests(object): + STANDARD_OPTIONS = ( + 'activebackground', 'activeborderwidth', 'activeforeground', 'anchor', + 'background', 'bitmap', 'borderwidth', 'compound', 'cursor', + 'disabledforeground', 'exportselection', 'font', 'foreground', + 'highlightbackground', 'highlightcolor', 'highlightthickness', + 'image', 'insertbackground', 'insertborderwidth', + 'insertofftime', 'insertontime', 'insertwidth', + 'jump', 'justify', 'orient', 'padx', 'pady', 'relief', + 'repeatdelay', 'repeatinterval', + 'selectbackground', 'selectborderwidth', 'selectforeground', + 'setgrid', 'takefocus', 'text', 'textvariable', 'troughcolor', + 'underline', 'wraplength', 'xscrollcommand', 'yscrollcommand', + ) + + def test_activebackground(self): + widget = self.create() + self.checkColorParam(widget, 'activebackground') + + def test_activeborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'activeborderwidth', + 0, 1.3, 2.9, 6, -2, '10p') + + def test_activeforeground(self): + widget = self.create() + self.checkColorParam(widget, 'activeforeground') + + def test_anchor(self): + widget = self.create() + self.checkEnumParam(widget, 'anchor', + 'n', 'ne', 'e', 'se', 's', 'sw', 'w', 'nw', 'center') + + def test_background(self): + widget = self.create() + self.checkColorParam(widget, 'background') + if 'bg' in self.OPTIONS: + self.checkColorParam(widget, 'bg') + + def test_bitmap(self): + widget = self.create() + self.checkParam(widget, 'bitmap', 'questhead') + self.checkParam(widget, 'bitmap', 'gray50') + self.checkInvalidParam(widget, 'bitmap', 'spam', + errmsg='bitmap "spam" not defined') + + def test_borderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'borderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + if 'bd' in self.OPTIONS: + self.checkPixelsParam(widget, 'bd', 0, 1.3, 2.6, 6, -2, '10p') + + def test_compound(self): + widget = self.create() + self.checkEnumParam(widget, 'compound', + 'bottom', 'center', 'left', 'none', 'right', 'top') + + def test_cursor(self): + widget = self.create() + self.checkCursorParam(widget, 'cursor') + + def test_disabledforeground(self): + widget = self.create() + self.checkColorParam(widget, 'disabledforeground') + + def test_exportselection(self): + widget = self.create() + self.checkBooleanParam(widget, 'exportselection') + + def test_font(self): + widget = self.create() + self.checkParam(widget, 'font', + '-Adobe-Helvetica-Medium-R-Normal--*-120-*-*-*-*-*-*') + self.checkInvalidParam(widget, 'font', '', + errmsg='font "" doesn\'t exist') + + def test_foreground(self): + widget = self.create() + self.checkColorParam(widget, 'foreground') + if 'fg' in self.OPTIONS: + self.checkColorParam(widget, 'fg') + + def test_highlightbackground(self): + widget = self.create() + self.checkColorParam(widget, 'highlightbackground') + + def test_highlightcolor(self): + widget = self.create() + self.checkColorParam(widget, 'highlightcolor') + + def test_highlightthickness(self): + widget = self.create() + 
self.checkPixelsParam(widget, 'highlightthickness', + 0, 1.3, 2.6, 6, '10p') + self.checkParam(widget, 'highlightthickness', -2, expected=0, + conv=self._conv_pixels) + + def test_image(self): + widget = self.create() + self.checkImageParam(widget, 'image') + + def test_insertbackground(self): + widget = self.create() + self.checkColorParam(widget, 'insertbackground') + + def test_insertborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertborderwidth', + 0, 1.3, 2.6, 6, -2, '10p') + + def test_insertofftime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertofftime', 100) + + def test_insertontime(self): + widget = self.create() + self.checkIntegerParam(widget, 'insertontime', 100) + + def test_insertwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'insertwidth', 1.3, 2.6, -2, '10p') + + def test_jump(self): + widget = self.create() + self.checkBooleanParam(widget, 'jump') + + def test_justify(self): + widget = self.create() + self.checkEnumParam(widget, 'justify', 'left', 'right', 'center', + errmsg='bad justification "{}": must be ' + 'left, right, or center') + self.checkInvalidParam(widget, 'justify', '', + errmsg='ambiguous justification "": must be ' + 'left, right, or center') + + def test_orient(self): + widget = self.create() + self.assertEqual(str(widget['orient']), self.default_orient) + self.checkEnumParam(widget, 'orient', 'horizontal', 'vertical') + + def test_padx(self): + widget = self.create() + self.checkPixelsParam(widget, 'padx', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_pady(self): + widget = self.create() + self.checkPixelsParam(widget, 'pady', 3, 4.4, 5.6, -2, '12m', + conv=self._conv_pad_pixels) + + def test_relief(self): + widget = self.create() + self.checkReliefParam(widget, 'relief') + + def test_repeatdelay(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatdelay', -500, 500) + + def test_repeatinterval(self): + widget = self.create() + self.checkIntegerParam(widget, 'repeatinterval', -500, 500) + + def test_selectbackground(self): + widget = self.create() + self.checkColorParam(widget, 'selectbackground') + + def test_selectborderwidth(self): + widget = self.create() + self.checkPixelsParam(widget, 'selectborderwidth', 1.3, 2.6, -2, '10p') + + def test_selectforeground(self): + widget = self.create() + self.checkColorParam(widget, 'selectforeground') + + def test_setgrid(self): + widget = self.create() + self.checkBooleanParam(widget, 'setgrid') + + def test_state(self): + widget = self.create() + self.checkEnumParam(widget, 'state', 'active', 'disabled', 'normal') + + def test_takefocus(self): + widget = self.create() + self.checkParams(widget, 'takefocus', '0', '1', '') + + def test_text(self): + widget = self.create() + self.checkParams(widget, 'text', '', 'any string') + + def test_textvariable(self): + widget = self.create() + var = Tkinter.StringVar() + self.checkVariableParam(widget, 'textvariable', var) + + def test_troughcolor(self): + widget = self.create() + self.checkColorParam(widget, 'troughcolor') + + def test_underline(self): + widget = self.create() + self.checkIntegerParam(widget, 'underline', 0, 1, 10) + + def test_wraplength(self): + widget = self.create() + if tcl_version < (8, 5): + self.checkPixelsParam(widget, 'wraplength', 100) + else: + self.checkParams(widget, 'wraplength', 100) + + def test_xscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'xscrollcommand') + + def 
test_yscrollcommand(self): + widget = self.create() + self.checkCommandParam(widget, 'yscrollcommand') + + # non-standard but common options + + def test_command(self): + widget = self.create() + self.checkCommandParam(widget, 'command') + + def test_indicatoron(self): + widget = self.create() + self.checkBooleanParam(widget, 'indicatoron') + + def test_offrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'offrelief') + + def test_overrelief(self): + widget = self.create() + self.checkReliefParam(widget, 'overrelief') + + def test_selectcolor(self): + widget = self.create() + self.checkColorParam(widget, 'selectcolor') + + def test_selectimage(self): + widget = self.create() + self.checkImageParam(widget, 'selectimage') + + @requires_tcl(8, 5) + def test_tristateimage(self): + widget = self.create() + self.checkImageParam(widget, 'tristateimage') + + @requires_tcl(8, 5) + def test_tristatevalue(self): + widget = self.create() + self.checkParam(widget, 'tristatevalue', 'unknowable') + + def test_variable(self): + widget = self.create() + var = Tkinter.DoubleVar() + self.checkVariableParam(widget, 'variable', var) + + +class IntegerSizeTests(object): + def test_height(self): + widget = self.create() + self.checkIntegerParam(widget, 'height', 100, -100, 0) + + def test_width(self): + widget = self.create() + self.checkIntegerParam(widget, 'width', 402, -402, 0) + + +class PixelSizeTests(object): + def test_height(self): + widget = self.create() + self.checkPixelsParam(widget, 'height', 100, 101.2, 102.6, -100, 0, '3c') + + def test_width(self): + widget = self.create() + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i') + + +def add_standard_options(*source_classes): + # This decorator adds test_xxx methods from source classes for every xxx + # option in the OPTIONS class attribute if they are not defined explicitly. + def decorator(cls): + for option in cls.OPTIONS: + methodname = 'test_' + option + if not hasattr(cls, methodname): + for source_class in source_classes: + if hasattr(source_class, methodname): + setattr(cls, methodname, + getattr(source_class, methodname).im_func) + break + else: + def test(self, option=option): + widget = self.create() + widget[option] + raise AssertionError('Option "%s" is not tested in %s' % + (option, cls.__name__)) + test.__name__ = methodname + setattr(cls, methodname, test) + return cls + return decorator diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,8 @@ Tests ----- +- Issue #19085: Added basic tests for all tkinter widget options. + Whats' New in Python 2.7.6? 
=========================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 09:48:41 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:48:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dBYpK3Y04z7Ljq@mail.python.org> http://hg.python.org/cpython/rev/e92bba5b53db changeset: 86837:e92bba5b53db parent: 86835:ab7c2c1d349c parent: 86833:123804a72a8f user: Serhiy Storchaka date: Sat Nov 02 10:47:57 2013 +0200 summary: Merge heads files: Lib/asyncio/base_events.py | 30 +- Lib/asyncio/constants.py | 5 +- Lib/asyncio/events.py | 15 +- Lib/asyncio/selector_events.py | 154 ++++++--- Lib/asyncio/tasks.py | 5 +- Lib/asyncio/windows_events.py | 16 + Lib/asyncio/windows_utils.py | 20 +- Lib/test/test_asyncio/test_base_events.py | 78 ++++- Lib/test/test_asyncio/test_events.py | 3 +- Lib/test/test_asyncio/test_selector_events.py | 154 ++++++--- Lib/test/test_asyncio/test_windows_events.py | 2 +- 11 files changed, 350 insertions(+), 132 deletions(-) diff --git a/Lib/asyncio/base_events.py b/Lib/asyncio/base_events.py --- a/Lib/asyncio/base_events.py +++ b/Lib/asyncio/base_events.py @@ -186,6 +186,11 @@ self.call_soon(_raise_stop_error) def close(self): + """Close the event loop. + + This clears the queues and shuts down the executor, + but does not wait for the executor to finish. + """ self._ready.clear() self._scheduled.clear() executor = self._default_executor @@ -275,8 +280,27 @@ @tasks.coroutine def create_connection(self, protocol_factory, host=None, port=None, *, ssl=None, family=0, proto=0, flags=0, sock=None, - local_addr=None): + local_addr=None, server_hostname=None): """XXX""" + if server_hostname is not None and not ssl: + raise ValueError('server_hostname is only meaningful with ssl') + + if server_hostname is None and ssl: + # Use host as default for server_hostname. It is an error + # if host is empty or not set, e.g. when an + # already-connected socket was passed or when only a port + # is given. To avoid this error, you can pass + # server_hostname='' -- this will bypass the hostname + # check. (This also means that if host is a numeric + # IP/IPv6 address, we will attempt to verify that exact + # address; this will probably fail, but it is possible to + # create a certificate for a specific IP address, so we + # don't judge it here.) + if not host: + raise ValueError('You must set server_hostname ' + 'when using ssl without a host') + server_hostname = host + if host is not None or port is not None: if sock is not None: raise ValueError( @@ -357,7 +381,7 @@ sslcontext = None if isinstance(ssl, bool) else ssl transport = self._make_ssl_transport( sock, protocol, sslcontext, waiter, - server_side=False, server_hostname=host) + server_side=False, server_hostname=server_hostname) else: transport = self._make_socket_transport(sock, protocol, waiter) @@ -442,6 +466,8 @@ ssl=None, reuse_address=None): """XXX""" + if isinstance(ssl, bool): + raise TypeError('ssl argument must be an SSLContext or None') if host is not None or port is not None: if sock is not None: raise ValueError( diff --git a/Lib/asyncio/constants.py b/Lib/asyncio/constants.py --- a/Lib/asyncio/constants.py +++ b/Lib/asyncio/constants.py @@ -1,4 +1,7 @@ """Constants.""" +# After the connection is lost, log warnings after this many write()s. +LOG_THRESHOLD_FOR_CONNLOST_WRITES = 5 -LOG_THRESHOLD_FOR_CONNLOST_WRITES = 5 +# Seconds to wait before retrying accept(). 
+ACCEPT_RETRY_DELAY = 1 diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -137,6 +137,17 @@ """Return whether the event loop is currently running.""" raise NotImplementedError + def close(self): + """Close the loop. + + The loop should not be running. + + This is idempotent and irreversible. + + No other methods should be called after this one. + """ + raise NotImplementedError + # Methods scheduling callbacks. All these return Handles. def call_soon(self, callback, *args): @@ -172,7 +183,7 @@ def create_connection(self, protocol_factory, host=None, port=None, *, ssl=None, family=0, proto=0, flags=0, sock=None, - local_addr=None): + local_addr=None, server_hostname=None): raise NotImplementedError def create_server(self, protocol_factory, host=None, port=None, *, @@ -214,6 +225,8 @@ family=0, proto=0, flags=0): raise NotImplementedError + # Pipes and subprocesses. + def connect_read_pipe(self, protocol_factory, pipe): """Register read pipe in eventloop. diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -5,6 +5,7 @@ """ import collections +import errno import socket try: import ssl @@ -89,28 +90,37 @@ except (BlockingIOError, InterruptedError): pass - def _start_serving(self, protocol_factory, sock, ssl=None, server=None): + def _start_serving(self, protocol_factory, sock, + sslcontext=None, server=None): self.add_reader(sock.fileno(), self._accept_connection, - protocol_factory, sock, ssl, server) + protocol_factory, sock, sslcontext, server) - def _accept_connection(self, protocol_factory, sock, ssl=None, - server=None): + def _accept_connection(self, protocol_factory, sock, + sslcontext=None, server=None): try: conn, addr = sock.accept() conn.setblocking(False) - except (BlockingIOError, InterruptedError): + except (BlockingIOError, InterruptedError, ConnectionAbortedError): pass # False alarm. - except Exception: - # Bad error. Stop serving. - self.remove_reader(sock.fileno()) - sock.close() + except OSError as exc: # There's nowhere to send the error, so just log it. # TODO: Someone will want an error handler for this. - logger.exception('Accept failed') + if exc.errno in (errno.EMFILE, errno.ENFILE, + errno.ENOBUFS, errno.ENOMEM): + # Some platforms (e.g. Linux keep reporting the FD as + # ready, so we remove the read handler temporarily. + # We'll try again in a while. + logger.exception('Accept out of system resource (%s)', exc) + self.remove_reader(sock.fileno()) + self.call_later(constants.ACCEPT_RETRY_DELAY, + self._start_serving, + protocol_factory, sock, sslcontext, server) + else: + raise # The event loop will catch, log and ignore it. else: - if ssl: + if sslcontext: self._make_ssl_transport( - conn, protocol_factory(), ssl, None, + conn, protocol_factory(), sslcontext, None, server_side=True, extra={'peername': addr}, server=server) else: self._make_socket_transport( @@ -277,7 +287,7 @@ err = sock.getsockopt(socket.SOL_SOCKET, socket.SO_ERROR) if err != 0: # Jump to the except clause below. 
- raise OSError(err, 'Connect call failed') + raise OSError(err, 'Connect call failed %s' % (address,)) except (BlockingIOError, InterruptedError): self.add_writer(fd, self._sock_connect, fut, True, sock, address) except Exception as exc: @@ -404,15 +414,16 @@ try: self._protocol.pause_writing() except Exception: - tulip_log.exception('pause_writing() failed') + logger.exception('pause_writing() failed') def _maybe_resume_protocol(self): - if self._protocol_paused and self.get_write_buffer_size() <= self._low_water: + if (self._protocol_paused and + self.get_write_buffer_size() <= self._low_water): self._protocol_paused = False try: self._protocol.resume_writing() except Exception: - tulip_log.exception('resume_writing() failed') + logger.exception('resume_writing() failed') def set_write_buffer_limits(self, high=None, low=None): if high is None: @@ -548,22 +559,28 @@ def __init__(self, loop, rawsock, protocol, sslcontext, waiter=None, server_side=False, server_hostname=None, extra=None, server=None): + if ssl is None: + raise RuntimeError('stdlib ssl module not available') + if server_side: - assert isinstance( - sslcontext, ssl.SSLContext), 'Must pass an SSLContext' + if not sslcontext: + raise ValueError('Server side ssl needs a valid SSLContext') else: - # Client-side may pass ssl=True to use a default context. - # The default is the same as used by urllib. - if sslcontext is None: + if not sslcontext: + # Client side may pass ssl=True to use a default + # context; in that case the sslcontext passed is None. + # The default is the same as used by urllib with + # cadefault=True. sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) sslcontext.options |= ssl.OP_NO_SSLv2 sslcontext.set_default_verify_paths() sslcontext.verify_mode = ssl.CERT_REQUIRED + wrap_kwargs = { 'server_side': server_side, 'do_handshake_on_connect': False, } - if server_hostname is not None and not server_side and ssl.HAS_SNI: + if server_hostname and not server_side and ssl.HAS_SNI: wrap_kwargs['server_hostname'] = server_hostname sslsock = sslcontext.wrap_socket(rawsock, **wrap_kwargs) @@ -609,7 +626,7 @@ # Verify hostname if requested. peercert = self._sock.getpeercert() - if (self._server_hostname is not None and + if (self._server_hostname and self._sslcontext.verify_mode != ssl.CERT_NONE): try: ssl.match_hostname(peercert, self._server_hostname) @@ -625,15 +642,16 @@ compression=self._sock.compression(), ) - self._loop.add_reader(self._sock_fd, self._on_ready) - self._loop.add_writer(self._sock_fd, self._on_ready) + self._read_wants_write = False + self._write_wants_read = False + self._loop.add_reader(self._sock_fd, self._read_ready) self._loop.call_soon(self._protocol.connection_made, self) if self._waiter is not None: self._loop.call_soon(self._waiter.set_result, None) def pause_reading(self): # XXX This is a bit icky, given the comment at the top of - # _on_ready(). Is it possible to evoke a deadlock? I don't + # _read_ready(). Is it possible to evoke a deadlock? I don't # know, although it doesn't look like it; write() will still # accept more data for the buffer and eventually the app will # call resume_reading() again, and things will flow again. @@ -648,41 +666,58 @@ self._paused = False if self._closing: return - self._loop.add_reader(self._sock_fd, self._on_ready) + self._loop.add_reader(self._sock_fd, self._read_ready) - def _on_ready(self): - # Because of renegotiations (?), there's no difference between - # readable and writable. We just try both. 
XXX This may be - # incorrect; we probably need to keep state about what we - # should do next. + def _read_ready(self): + if self._write_wants_read: + self._write_wants_read = False + self._write_ready() - # First try reading. - if not self._closing and not self._paused: - try: - data = self._sock.recv(self.max_size) - except (BlockingIOError, InterruptedError, - ssl.SSLWantReadError, ssl.SSLWantWriteError): - pass - except Exception as exc: - self._fatal_error(exc) + if self._buffer: + self._loop.add_writer(self._sock_fd, self._write_ready) + + try: + data = self._sock.recv(self.max_size) + except (BlockingIOError, InterruptedError, ssl.SSLWantReadError): + pass + except ssl.SSLWantWriteError: + self._read_wants_write = True + self._loop.remove_reader(self._sock_fd) + self._loop.add_writer(self._sock_fd, self._write_ready) + except Exception as exc: + self._fatal_error(exc) + else: + if data: + self._protocol.data_received(data) else: - if data: - self._protocol.data_received(data) - else: - try: - self._protocol.eof_received() - finally: - self.close() + try: + keep_open = self._protocol.eof_received() + if keep_open: + logger.warning('returning true from eof_received() ' + 'has no effect when using ssl') + finally: + self.close() - # Now try writing, if there's anything to write. + def _write_ready(self): + if self._read_wants_write: + self._read_wants_write = False + self._read_ready() + + if not (self._paused or self._closing): + self._loop.add_reader(self._sock_fd, self._read_ready) + if self._buffer: data = b''.join(self._buffer) self._buffer.clear() try: n = self._sock.send(data) except (BlockingIOError, InterruptedError, - ssl.SSLWantReadError, ssl.SSLWantWriteError): + ssl.SSLWantWriteError): n = 0 + except ssl.SSLWantReadError: + n = 0 + self._loop.remove_writer(self._sock_fd) + self._write_wants_read = True except Exception as exc: self._loop.remove_writer(self._sock_fd) self._fatal_error(exc) @@ -691,11 +726,12 @@ if n < len(data): self._buffer.append(data[n:]) - self._maybe_resume_protocol() # May append to buffer. + self._maybe_resume_protocol() # May append to buffer. - if self._closing and not self._buffer: + if not self._buffer: self._loop.remove_writer(self._sock_fd) - self._call_connection_lost(None) + if self._closing: + self._call_connection_lost(None) def write(self, data): assert isinstance(data, bytes), repr(type(data)) @@ -708,20 +744,16 @@ self._conn_lost += 1 return - # We could optimize, but the callback can do this for now. + if not self._buffer: + self._loop.add_writer(self._sock_fd, self._write_ready) + + # Add it to the buffer. 
self._buffer.append(data) self._maybe_pause_protocol() def can_write_eof(self): return False - def close(self): - if self._closing: - return - self._closing = True - self._conn_lost += 1 - self._loop.remove_reader(self._sock_fd) - class _SelectorDatagramTransport(_SelectorTransport): diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -62,8 +62,9 @@ code = func.__code__ filename = code.co_filename lineno = code.co_firstlineno - logger.error('Coroutine %r defined at %s:%s was never yielded from', - func.__name__, filename, lineno) + logger.error( + 'Coroutine %r defined at %s:%s was never yielded from', + func.__name__, filename, lineno) def coroutine(func): diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -138,6 +138,7 @@ @tasks.coroutine def start_serving_pipe(self, protocol_factory, address): server = PipeServer(address) + def loop(f=None): pipe = None try: @@ -160,6 +161,7 @@ pipe.close() else: f.add_done_callback(loop) + self.call_soon(loop) return [server] @@ -209,6 +211,7 @@ ov.WSARecv(conn.fileno(), nbytes, flags) else: ov.ReadFile(conn.fileno(), nbytes) + def finish(trans, key, ov): try: return ov.getresult() @@ -217,6 +220,7 @@ raise ConnectionResetError(*exc.args) else: raise + return self._register(ov, conn, finish) def send(self, conn, buf, flags=0): @@ -226,6 +230,7 @@ ov.WSASend(conn.fileno(), buf, flags) else: ov.WriteFile(conn.fileno(), buf) + def finish(trans, key, ov): try: return ov.getresult() @@ -234,6 +239,7 @@ raise ConnectionResetError(*exc.args) else: raise + return self._register(ov, conn, finish) def accept(self, listener): @@ -241,6 +247,7 @@ conn = self._get_accept_socket(listener.family) ov = _overlapped.Overlapped(NULL) ov.AcceptEx(listener.fileno(), conn.fileno()) + def finish_accept(trans, key, ov): ov.getresult() # Use SO_UPDATE_ACCEPT_CONTEXT so getsockname() etc work. @@ -249,6 +256,7 @@ _overlapped.SO_UPDATE_ACCEPT_CONTEXT, buf) conn.settimeout(listener.gettimeout()) return conn, conn.getpeername() + return self._register(ov, listener, finish_accept) def connect(self, conn, address): @@ -264,26 +272,31 @@ raise ov = _overlapped.Overlapped(NULL) ov.ConnectEx(conn.fileno(), address) + def finish_connect(trans, key, ov): ov.getresult() # Use SO_UPDATE_CONNECT_CONTEXT so getsockname() etc work. conn.setsockopt(socket.SOL_SOCKET, _overlapped.SO_UPDATE_CONNECT_CONTEXT, 0) return conn + return self._register(ov, conn, finish_connect) def accept_pipe(self, pipe): self._register_with_iocp(pipe) ov = _overlapped.Overlapped(NULL) ov.ConnectNamedPipe(pipe.fileno()) + def finish(trans, key, ov): ov.getresult() return pipe + return self._register(ov, pipe, finish) def connect_pipe(self, address): ov = _overlapped.Overlapped(NULL) ov.WaitNamedPipeAndConnect(address, self._iocp, ov.address) + def finish(err, handle, ov): # err, handle were arguments passed to PostQueuedCompletionStatus() # in a function run in a thread pool. 
@@ -296,6 +309,7 @@ raise OSError(0, msg, None, err) else: return windows_utils.PipeHandle(handle) + return self._register(ov, None, finish, wait_for_post=True) def wait_for_handle(self, handle, timeout=None): @@ -432,8 +446,10 @@ self._proc = windows_utils.Popen( args, shell=shell, stdin=stdin, stdout=stdout, stderr=stderr, bufsize=bufsize, **kwargs) + def callback(f): returncode = self._proc.poll() self._process_exited(returncode) + f = self._loop._proactor.wait_for_handle(int(self._proc._handle)) f.add_done_callback(callback) diff --git a/Lib/asyncio/windows_utils.py b/Lib/asyncio/windows_utils.py --- a/Lib/asyncio/windows_utils.py +++ b/Lib/asyncio/windows_utils.py @@ -18,18 +18,18 @@ __all__ = ['socketpair', 'pipe', 'Popen', 'PIPE', 'PipeHandle'] -# + # Constants/globals -# + BUFSIZE = 8192 PIPE = subprocess.PIPE STDOUT = subprocess.STDOUT _mmap_counter = itertools.count() -# + # Replacement for socket.socketpair() -# + def socketpair(family=socket.AF_INET, type=socket.SOCK_STREAM, proto=0): """A socket pair usable as a self-pipe, for Windows. @@ -57,9 +57,9 @@ lsock.close() return (ssock, csock) -# + # Replacement for os.pipe() using handles instead of fds -# + def pipe(*, duplex=False, overlapped=(True, True), bufsize=BUFSIZE): """Like os.pipe() but with overlapped support and using handles not fds.""" @@ -105,9 +105,9 @@ _winapi.CloseHandle(h2) raise -# + # Wrapper for a pipe handle -# + class PipeHandle: """Wrapper for an overlapped pipe handle which is vaguely file-object like. @@ -137,9 +137,9 @@ def __exit__(self, t, v, tb): self.close() -# + # Replacement for subprocess.Popen using overlapped pipe handles -# + class Popen(subprocess.Popen): """Replacement for subprocess.Popen using overlapped pipe handles. diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -1,5 +1,6 @@ """Tests for base_events.py""" +import errno import logging import socket import time @@ -8,6 +9,7 @@ from test.support import find_unused_port, IPV6_ENABLED from asyncio import base_events +from asyncio import constants from asyncio import events from asyncio import futures from asyncio import protocols @@ -442,6 +444,71 @@ self.assertRaises( OSError, self.loop.run_until_complete, coro) + def test_create_connection_ssl_server_hostname_default(self): + self.loop.getaddrinfo = unittest.mock.Mock() + + def mock_getaddrinfo(*args, **kwds): + f = futures.Future(loop=self.loop) + f.set_result([(socket.AF_INET, socket.SOCK_STREAM, + socket.SOL_TCP, '', ('1.2.3.4', 80))]) + return f + + self.loop.getaddrinfo.side_effect = mock_getaddrinfo + self.loop.sock_connect = unittest.mock.Mock() + self.loop.sock_connect.return_value = () + self.loop._make_ssl_transport = unittest.mock.Mock() + + def mock_make_ssl_transport(sock, protocol, sslcontext, waiter, + **kwds): + waiter.set_result(None) + + self.loop._make_ssl_transport.side_effect = mock_make_ssl_transport + ANY = unittest.mock.ANY + # First try the default server_hostname. + self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True) + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with( + ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='python.org') + # Next try an explicit server_hostname. 
+ self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, + server_hostname='perl.com') + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with( + ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='perl.com') + # Finally try an explicit empty server_hostname. + self.loop._make_ssl_transport.reset_mock() + coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, + server_hostname='') + self.loop.run_until_complete(coro) + self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, + server_side=False, + server_hostname='') + + def test_create_connection_no_ssl_server_hostname_errors(self): + # When not using ssl, server_hostname must be None. + coro = self.loop.create_connection(MyProto, 'python.org', 80, + server_hostname='') + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, 'python.org', 80, + server_hostname='python.org') + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + + def test_create_connection_ssl_server_hostname_errors(self): + # When using ssl, server_hostname may be None if host is non-empty. + coro = self.loop.create_connection(MyProto, '', 80, ssl=True) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, None, 80, ssl=True) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + coro = self.loop.create_connection(MyProto, None, None, + ssl=True, sock=socket.socket()) + self.assertRaises(ValueError, self.loop.run_until_complete, coro) + def test_create_server_empty_host(self): # if host is empty string use None instead host = object() @@ -585,11 +652,18 @@ def test_accept_connection_exception(self, m_log): sock = unittest.mock.Mock() sock.fileno.return_value = 10 - sock.accept.side_effect = OSError() + sock.accept.side_effect = OSError(errno.EMFILE, 'Too many open files') + self.loop.remove_reader = unittest.mock.Mock() + self.loop.call_later = unittest.mock.Mock() self.loop._accept_connection(MyProto, sock) - self.assertTrue(sock.close.called) self.assertTrue(m_log.exception.called) + self.assertFalse(sock.close.called) + self.loop.remove_reader.assert_called_with(10) + self.loop.call_later.assert_called_with(constants.ACCEPT_RETRY_DELAY, + # self.loop._start_serving + unittest.mock.ANY, + MyProto, sock, None, None) if __name__ == '__main__': diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1276,7 +1276,6 @@ def create_event_loop(self): return windows_events.SelectorEventLoop() - class ProactorEventLoopTests(EventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): @@ -1472,6 +1471,8 @@ self.assertRaises( NotImplementedError, loop.is_running) self.assertRaises( + NotImplementedError, loop.close) + self.assertRaises( NotImplementedError, loop.call_later, None, None) self.assertRaises( NotImplementedError, loop.call_at, f, f) diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -43,6 +43,7 @@ self.assertIsInstance( self.loop._make_socket_transport(m, m), _SelectorSocketTransport) + @unittest.skipIf(ssl is None, 'No ssl module') def test_make_ssl_transport(self): m = unittest.mock.Mock() self.loop.add_reader 
= unittest.mock.Mock() @@ -52,6 +53,16 @@ self.assertIsInstance( self.loop._make_ssl_transport(m, m, m, m), _SelectorSslTransport) + @unittest.mock.patch('asyncio.selector_events.ssl', None) + def test_make_ssl_transport_without_ssl_error(self): + m = unittest.mock.Mock() + self.loop.add_reader = unittest.mock.Mock() + self.loop.add_writer = unittest.mock.Mock() + self.loop.remove_reader = unittest.mock.Mock() + self.loop.remove_writer = unittest.mock.Mock() + with self.assertRaises(RuntimeError): + self.loop._make_ssl_transport(m, m, m, m) + def test_close(self): ssock = self.loop._ssock ssock.fileno.return_value = 7 @@ -1003,8 +1014,7 @@ self.loop, self.sock, self.protocol, self.sslcontext, waiter=waiter) self.assertTrue(self.sslsock.do_handshake.called) - self.loop.assert_reader(1, tr._on_ready) - self.loop.assert_writer(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) test_utils.run_briefly(self.loop) self.assertIsNone(waiter.result()) @@ -1047,13 +1057,13 @@ def test_pause_resume_reading(self): tr = self._make_one() self.assertFalse(tr._paused) - self.loop.assert_reader(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) tr.pause_reading() self.assertTrue(tr._paused) self.assertFalse(1 in self.loop.readers) tr.resume_reading() self.assertFalse(tr._paused) - self.loop.assert_reader(1, tr._on_ready) + self.loop.assert_reader(1, tr._read_ready) def test_write_no_data(self): transport = self._make_one() @@ -1084,140 +1094,173 @@ transport.write(b'data') m_log.warning.assert_called_with('socket.send() raised exception.') - def test_on_ready_recv(self): + def test_read_ready_recv(self): self.sslsock.recv.return_value = b'data' transport = self._make_one() - transport._on_ready() + transport._read_ready() self.assertTrue(self.sslsock.recv.called) self.assertEqual((b'data',), self.protocol.data_received.call_args[0]) - def test_on_ready_recv_eof(self): + def test_read_ready_write_wants_read(self): + self.loop.add_writer = unittest.mock.Mock() + self.sslsock.recv.side_effect = BlockingIOError + transport = self._make_one() + transport._write_wants_read = True + transport._write_ready = unittest.mock.Mock() + transport._buffer.append(b'data') + transport._read_ready() + + self.assertFalse(transport._write_wants_read) + transport._write_ready.assert_called_with() + self.loop.add_writer.assert_called_with( + transport._sock_fd, transport._write_ready) + + def test_read_ready_recv_eof(self): self.sslsock.recv.return_value = b'' transport = self._make_one() transport.close = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport.close.assert_called_with() self.protocol.eof_received.assert_called_with() - def test_on_ready_recv_conn_reset(self): + def test_read_ready_recv_conn_reset(self): err = self.sslsock.recv.side_effect = ConnectionResetError() transport = self._make_one() transport._force_close = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport._force_close.assert_called_with(err) - def test_on_ready_recv_retry(self): + def test_read_ready_recv_retry(self): self.sslsock.recv.side_effect = ssl.SSLWantReadError transport = self._make_one() - transport._on_ready() + transport._read_ready() self.assertTrue(self.sslsock.recv.called) self.assertFalse(self.protocol.data_received.called) - self.sslsock.recv.side_effect = ssl.SSLWantWriteError - transport._on_ready() - self.assertFalse(self.protocol.data_received.called) - self.sslsock.recv.side_effect = BlockingIOError - transport._on_ready() + transport._read_ready() 
self.assertFalse(self.protocol.data_received.called) self.sslsock.recv.side_effect = InterruptedError - transport._on_ready() + transport._read_ready() self.assertFalse(self.protocol.data_received.called) - def test_on_ready_recv_exc(self): + def test_read_ready_recv_write(self): + self.loop.remove_reader = unittest.mock.Mock() + self.loop.add_writer = unittest.mock.Mock() + self.sslsock.recv.side_effect = ssl.SSLWantWriteError + transport = self._make_one() + transport._read_ready() + self.assertFalse(self.protocol.data_received.called) + self.assertTrue(transport._read_wants_write) + + self.loop.remove_reader.assert_called_with(transport._sock_fd) + self.loop.add_writer.assert_called_with( + transport._sock_fd, transport._write_ready) + + def test_read_ready_recv_exc(self): err = self.sslsock.recv.side_effect = OSError() transport = self._make_one() transport._fatal_error = unittest.mock.Mock() - transport._on_ready() + transport._read_ready() transport._fatal_error.assert_called_with(err) - def test_on_ready_send(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport._buffer = collections.deque([b'data']) - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque(), transport._buffer) self.assertTrue(self.sslsock.send.called) - def test_on_ready_send_none(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_none(self): self.sslsock.send.return_value = 0 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertEqual(collections.deque([b'data1data2']), transport._buffer) - def test_on_ready_send_partial(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertEqual(collections.deque([b'ta1data2']), transport._buffer) - def test_on_ready_send_closing_partial(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_closing_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() transport._buffer = collections.deque([b'data1', b'data2']) - transport._on_ready() + transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertFalse(self.sslsock.close.called) - def test_on_ready_send_closing(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_closing(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() transport._buffer = collections.deque([b'data']) - transport._on_ready() + transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) - def test_on_ready_send_closing_empty_buffer(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_closing_empty_buffer(self): self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() transport._buffer = collections.deque() - transport._on_ready() + transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) - def test_on_ready_send_retry(self): - 
self.sslsock.recv.side_effect = ssl.SSLWantReadError - + def test_write_ready_send_retry(self): transport = self._make_one() transport._buffer = collections.deque([b'data']) - self.sslsock.send.side_effect = ssl.SSLWantReadError - transport._on_ready() - self.assertTrue(self.sslsock.send.called) - self.assertEqual(collections.deque([b'data']), transport._buffer) - self.sslsock.send.side_effect = ssl.SSLWantWriteError - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque([b'data']), transport._buffer) self.sslsock.send.side_effect = BlockingIOError() - transport._on_ready() + transport._write_ready() self.assertEqual(collections.deque([b'data']), transport._buffer) - def test_on_ready_send_exc(self): - self.sslsock.recv.side_effect = ssl.SSLWantReadError + def test_write_ready_send_read(self): + transport = self._make_one() + transport._buffer = collections.deque([b'data']) + + self.loop.remove_writer = unittest.mock.Mock() + self.sslsock.send.side_effect = ssl.SSLWantReadError + transport._write_ready() + self.assertFalse(self.protocol.data_received.called) + self.assertTrue(transport._write_wants_read) + self.loop.remove_writer.assert_called_with(transport._sock_fd) + + def test_write_ready_send_exc(self): err = self.sslsock.send.side_effect = OSError() transport = self._make_one() transport._buffer = collections.deque([b'data']) transport._fatal_error = unittest.mock.Mock() - transport._on_ready() + transport._write_ready() transport._fatal_error.assert_called_with(err) self.assertEqual(collections.deque(), transport._buffer) + def test_write_ready_read_wants_write(self): + self.loop.add_reader = unittest.mock.Mock() + self.sslsock.send.side_effect = BlockingIOError + transport = self._make_one() + transport._read_wants_write = True + transport._read_ready = unittest.mock.Mock() + transport._write_ready() + + self.assertFalse(transport._read_wants_write) + transport._read_ready.assert_called_with() + self.loop.add_reader.assert_called_with( + transport._sock_fd, transport._read_ready) + def test_write_eof(self): tr = self._make_one() self.assertFalse(tr.can_write_eof()) @@ -1245,6 +1288,15 @@ server_hostname='localhost') +class SelectorSslWithoutSslTransportTests(unittest.TestCase): + + @unittest.mock.patch('asyncio.selector_events.ssl', None) + def test_ssl_transport_requires_ssl_module(self): + Mock = unittest.mock.Mock + with self.assertRaises(RuntimeError): + transport = _SelectorSslTransport(Mock(), Mock(), Mock(), Mock()) + + class SelectorDatagramTransportTests(unittest.TestCase): def setUp(self): diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -77,7 +77,7 @@ stream_reader = streams.StreamReader(loop=self.loop) protocol = streams.StreamReaderProtocol(stream_reader) trans, proto = yield from self.loop.create_pipe_connection( - lambda:protocol, ADDRESS) + lambda: protocol, ADDRESS) self.assertIsInstance(trans, transports.Transport) self.assertEqual(protocol, proto) clients.append((stream_reader, trans)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 09:55:26 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:55:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzEwNzM0?= =?utf-8?q?=3A_Fix_and_re-enable_test=5Fttk_test=5Fheading=5Fcallback=2E?= Message-ID: 
<3dBYy65pQzz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/0554e2d37bf8 changeset: 86838:0554e2d37bf8 branch: 2.7 parent: 86836:ced345326151 user: Serhiy Storchaka date: Sat Nov 02 10:54:17 2013 +0200 summary: Issue #10734: Fix and re-enable test_ttk test_heading_callback. files: Lib/lib-tk/test/test_ttk/test_widgets.py | 4 +--- 1 files changed, 1 insertions(+), 3 deletions(-) diff --git a/Lib/lib-tk/test/test_ttk/test_widgets.py b/Lib/lib-tk/test/test_ttk/test_widgets.py --- a/Lib/lib-tk/test/test_ttk/test_widgets.py +++ b/Lib/lib-tk/test/test_ttk/test_widgets.py @@ -1370,12 +1370,10 @@ self.assertRaises(Tkinter.TclError, self.tv.heading, '#0', anchor=1) - # XXX skipping for now; should be fixed to work with newer ttk - @unittest.skip("skipping pending resolution of Issue #10734") def test_heading_callback(self): def simulate_heading_click(x, y): support.simulate_mouse_click(self.tv, x, y) - self.tv.update_idletasks() + self.tv.update() success = [] # no success for now -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 09:55:28 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:55:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzEwNzM0?= =?utf-8?q?=3A_Fix_and_re-enable_test=5Fttk_test=5Fheading=5Fcallback=2E?= Message-ID: <3dBYy80PHMz7Ljn@mail.python.org> http://hg.python.org/cpython/rev/a58fce53e873 changeset: 86839:a58fce53e873 branch: 3.3 parent: 86834:92e268f2719e user: Serhiy Storchaka date: Sat Nov 02 10:54:31 2013 +0200 summary: Issue #10734: Fix and re-enable test_ttk test_heading_callback. files: Lib/tkinter/test/test_ttk/test_widgets.py | 4 +--- 1 files changed, 1 insertions(+), 3 deletions(-) diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -1369,12 +1369,10 @@ self.assertRaises(tkinter.TclError, self.tv.heading, '#0', anchor=1) - # XXX skipping for now; should be fixed to work with newer ttk - @unittest.skip("skipping pending resolution of Issue #10734") def test_heading_callback(self): def simulate_heading_click(x, y): support.simulate_mouse_click(self.tv, x, y) - self.tv.update_idletasks() + self.tv.update() success = [] # no success for now -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 09:55:29 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 09:55:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2310734=3A_Fix_and_re-enable_test=5Fttk_test=5Fhe?= =?utf-8?q?ading=5Fcallback=2E?= Message-ID: <3dBYy9292Lz7Lk0@mail.python.org> http://hg.python.org/cpython/rev/f647a2c5f290 changeset: 86840:f647a2c5f290 parent: 86837:e92bba5b53db parent: 86839:a58fce53e873 user: Serhiy Storchaka date: Sat Nov 02 10:54:58 2013 +0200 summary: Issue #10734: Fix and re-enable test_ttk test_heading_callback. 
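The fix itself is a single line in test_heading_callback(): after support.simulate_mouse_click() the test now calls self.tv.update() instead of self.tv.update_idletasks(). update_idletasks() only flushes pending idle callbacks, while update() services every event that is still queued, which the simulated heading click apparently needs with newer ttk. A minimal, standalone illustration of that difference (not part of the patch; it uses after()/after_idle() as stand-ins for the synthesized mouse events and assumes a display is available):

    import tkinter

    root = tkinter.Tk()
    root.withdraw()                 # no visible window needed for this demo
    fired = []
    root.after_idle(lambda: fired.append('idle'))   # idle callback
    root.after(0, lambda: fired.append('timer'))    # ordinary (timer) event

    root.update_idletasks()         # runs idle callbacks only
    print(fired)                    # ['idle'] -- the timer event is still pending
    root.update()                   # services all pending events
    print(fired)                    # ['idle', 'timer']
    root.destroy()
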
files: Lib/tkinter/test/test_ttk/test_widgets.py | 4 +--- 1 files changed, 1 insertions(+), 3 deletions(-) diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -1369,12 +1369,10 @@ self.assertRaises(tkinter.TclError, self.tv.heading, '#0', anchor=1) - # XXX skipping for now; should be fixed to work with newer ttk - @unittest.skip("skipping pending resolution of Issue #10734") def test_heading_callback(self): def simulate_heading_click(x, y): support.simulate_mouse_click(self.tv, x, y) - self.tv.update_idletasks() + self.tv.update() success = [] # no success for now -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 15:41:46 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 15:41:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319085=2E_Try_to_f?= =?utf-8?q?ix_tkinter_tests_on_Windows=2E?= Message-ID: <3dBjdk24Znz7LjT@mail.python.org> http://hg.python.org/cpython/rev/cee56ef59a6a changeset: 86841:cee56ef59a6a user: Serhiy Storchaka date: Sat Nov 02 16:41:23 2013 +0200 summary: Issue #19085. Try to fix tkinter tests on Windows. files: Lib/tkinter/test/test_tkinter/test_widgets.py | 14 +++++----- 1 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -73,7 +73,10 @@ def test_screen(self): widget = self.create() self.assertEqual(widget['screen'], '') - display = os.environ['DISPLAY'] + try: + display = os.environ['DISPLAY'] + except KeyError: + self.skipTest('No $DISPLAY set.') self.checkInvalidParam(widget, 'screen', display, errmsg="can't modify -screen option after widget is created") widget2 = self.create(screen=display) @@ -82,13 +85,10 @@ def test_use(self): widget = self.create() self.assertEqual(widget['use'], '') - widget1 = self.create(container=True) - self.assertEqual(widget1['use'], '') - self.checkInvalidParam(widget1, 'use', '0x44022', - errmsg="can't modify -use option after widget is created") - wid = hex(widget1.winfo_id()) + parent = self.create(container=True) + wid = parent.winfo_id() widget2 = self.create(use=wid) - self.assertEqual(widget2['use'], wid) + self.assertEqual(int(widget2['use']), wid) @add_standard_options(StandardOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:08:39 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 2 Nov 2013 16:08:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2315114=3A_The_html=2Epar?= =?utf-8?q?ser_module_now_raises_a_DeprecationWarning_when_the_strict?= Message-ID: <3dBkDl6QK4z7LjV@mail.python.org> http://hg.python.org/cpython/rev/0a56709eb798 changeset: 86842:0a56709eb798 user: Ezio Melotti date: Sat Nov 02 17:08:24 2013 +0200 summary: #15114: The html.parser module now raises a DeprecationWarning when the strict argument of HTMLParser or the HTMLParser.error method are used. 
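In practice the deprecation surfaces as an ordinary DeprecationWarning, which is what the new assertWarns checks in Lib/test/test_htmlparser.py verify: passing strict= at all (even strict=False) warns at construction time, and calling the error() method warns before raising HTMLParseError. A short sketch of the post-patch behaviour (illustrative only, not taken from the test suite):

    import warnings
    from html.parser import HTMLParser

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter('always')
        HTMLParser(strict=True)    # deprecated
        HTMLParser(strict=False)   # also deprecated: the argument itself is
        HTMLParser()               # no warning; strict now just defaults to False
    print([str(w.message) for w in caught])
    # ['The strict argument and mode are deprecated.',
    #  'The strict argument and mode are deprecated.']

    try:
        HTMLParser().error('boom') # issues a DeprecationWarning, then raises
    except Exception as exc:
        print(type(exc).__name__, exc)   # HTMLParseError boom, at line 1, column 1
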
files: Doc/library/html.parser.rst | 4 ++-- Lib/html/parser.py | 14 ++++++++++---- Lib/test/test_htmlparser.py | 17 ++++++++++++++--- Misc/NEWS | 3 +++ 4 files changed, 29 insertions(+), 9 deletions(-) diff --git a/Doc/library/html.parser.rst b/Doc/library/html.parser.rst --- a/Doc/library/html.parser.rst +++ b/Doc/library/html.parser.rst @@ -74,7 +74,7 @@ def handle_data(self, data): print("Encountered some data :", data) - parser = MyHTMLParser(strict=False) + parser = MyHTMLParser() parser.feed('Test' '

<body><h1>Parse me!</h1></body></html>
') @@ -272,7 +272,7 @@ def handle_decl(self, data): print("Decl :", data) - parser = MyHTMLParser(strict=False) + parser = MyHTMLParser() Parsing a doctype:: diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -94,6 +94,8 @@ return result +_strict_sentinel = object() + class HTMLParser(_markupbase.ParserBase): """Find tags and other markup and call handler functions. @@ -116,16 +118,18 @@ CDATA_CONTENT_ELEMENTS = ("script", "style") - def __init__(self, strict=False): + def __init__(self, strict=_strict_sentinel): """Initialize and reset this instance. If strict is set to False (the default) the parser will parse invalid markup, otherwise it will raise an error. Note that the strict mode - is deprecated. + and argument are deprecated. """ - if strict: - warnings.warn("The strict mode is deprecated.", + if strict is not _strict_sentinel: + warnings.warn("The strict argument and mode are deprecated.", DeprecationWarning, stacklevel=2) + else: + strict = False # default self.strict = strict self.reset() @@ -151,6 +155,8 @@ self.goahead(1) def error(self, message): + warnings.warn("The 'error' method is deprecated.", + DeprecationWarning, stacklevel=2) raise HTMLParseError(message, self.getpos()) __starttag_text = None diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -96,7 +96,9 @@ parser = self.get_collector() parser.feed(source) parser.close() - self.assertRaises(html.parser.HTMLParseError, parse) + with self.assertRaises(html.parser.HTMLParseError): + with self.assertWarns(DeprecationWarning): + parse() class HTMLParserStrictTestCase(TestCaseBase): @@ -360,7 +362,16 @@ class HTMLParserTolerantTestCase(HTMLParserStrictTestCase): def get_collector(self): - return EventCollector(strict=False) + return EventCollector() + + def test_deprecation_warnings(self): + with self.assertWarns(DeprecationWarning): + EventCollector(strict=True) + with self.assertWarns(DeprecationWarning): + EventCollector(strict=False) + with self.assertRaises(html.parser.HTMLParseError): + with self.assertWarns(DeprecationWarning): + EventCollector().error('test') def test_tolerant_parsing(self): self._run_check('te>>xt&a<\n' @@ -676,7 +687,7 @@ class AttributesTolerantTestCase(AttributesStrictTestCase): def get_collector(self): - return EventCollector(strict=False) + return EventCollector() def test_attr_funky_names2(self): self._run_check( diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #15114: The html.parser module now raises a DeprecationWarning when the + strict argument of HTMLParser or the HTMLParser.error method are used. + - Issue #19410: Undo the special-casing removal of '' for importlib.machinery.FileFinder. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:28:24 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 16:28:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fix_Tkinter_tests_with_Tcl/Tk_8=2E4=2E?= Message-ID: <3dBkgX282jz7LjV@mail.python.org> http://hg.python.org/cpython/rev/278d15021d9a changeset: 86843:278d15021d9a branch: 2.7 parent: 86838:0554e2d37bf8 user: Serhiy Storchaka date: Sat Nov 02 17:27:59 2013 +0200 summary: Issue #19085: Fix Tkinter tests with Tcl/Tk 8.4. 
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 12 ++++---- Lib/lib-tk/test/test_ttk/test_widgets.py | 14 +++++----- Lib/lib-tk/test/widget_tests.py | 3 +- 3 files changed, 15 insertions(+), 14 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -4,7 +4,7 @@ from test.test_support import requires, run_unittest from test_ttk.support import tcl_version, requires_tcl, widget_eq -from widget_tests import (add_standard_options, noconv, int_round, +from widget_tests import (add_standard_options, noconv, noconv_meth, int_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) @@ -12,7 +12,7 @@ class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): - _conv_pad_pixels = noconv + _conv_pad_pixels = noconv_meth def test_class(self): widget = self.create() @@ -130,7 +130,7 @@ class AbstractLabelTest(AbstractWidgetTest, IntegerSizeTests): - _conv_pixels = noconv + _conv_pixels = noconv_meth def test_highlightthickness(self): widget = self.create() @@ -240,7 +240,7 @@ 'takefocus', 'text', 'textvariable', 'underline', 'width', 'wraplength', ) - _conv_pixels = AbstractWidgetTest._conv_pixels + _conv_pixels = staticmethod(AbstractWidgetTest._conv_pixels) def _create(self, **kwargs): return Tkinter.Menubutton(self.root, **kwargs) @@ -858,7 +858,7 @@ 'postcommand', 'relief', 'selectcolor', 'takefocus', 'tearoff', 'tearoffcommand', 'title', 'type', ) - _conv_pixels = noconv + _conv_pixels = noconv_meth def _create(self, **kwargs): return Tkinter.Menu(self.root, **kwargs) @@ -894,7 +894,7 @@ 'justify', 'padx', 'pady', 'relief', 'takefocus', 'text', 'textvariable', 'width', ) - _conv_pad_pixels = noconv + _conv_pad_pixels = noconv_meth def _create(self, **kwargs): return Tkinter.Message(self.root, **kwargs) diff --git a/Lib/lib-tk/test/test_ttk/test_widgets.py b/Lib/lib-tk/test/test_ttk/test_widgets.py --- a/Lib/lib-tk/test/test_ttk/test_widgets.py +++ b/Lib/lib-tk/test/test_ttk/test_widgets.py @@ -7,7 +7,7 @@ import support from test_functions import MockTclObj, MockStateSpec from support import tcl_version -from widget_tests import (add_standard_options, noconv, +from widget_tests import (add_standard_options, noconv, noconv_meth, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) @@ -117,7 +117,7 @@ class AbstractToplevelTest(AbstractWidgetTest, PixelSizeTests): - _conv_pixels = noconv + _conv_pixels = noconv_meth @add_standard_options(StandardTtkOptionsTests) @@ -197,7 +197,7 @@ 'takefocus', 'text', 'textvariable', 'underline', 'width', 'wraplength', ) - _conv_pixels = noconv + _conv_pixels = noconv_meth def _create(self, **kwargs): return ttk.Label(self.root, **kwargs) @@ -362,7 +362,7 @@ expected=('mon', 'tue', 'wed', 'thur')) self.checkParam(self.combo, 'values', ('mon', 'tue', 'wed', 'thur')) self.checkParam(self.combo, 'values', (42, 3.14, '', 'any string')) - self.checkParam(self.combo, 'values', '') + self.checkParam(self.combo, 'values', () if tcl_version < (8, 5) else '') self.combo['values'] = ['a', 1, 'c'] @@ -758,7 +758,7 @@ 'class', 'command', 'cursor', 'from', 'length', 'orient', 'style', 'takefocus', 'to', 'value', 'variable', ) - _conv_pixels = noconv + _conv_pixels = noconv_meth default_orient = 'horizontal' def setUp(self): @@ -862,7 +862,7 @@ 'mode', 'maximum', 'phase', 'style', 'takefocus', 'value', 'variable', ) - _conv_pixels = noconv + _conv_pixels = 
noconv_meth default_orient = 'horizontal' def _create(self, **kwargs): @@ -1144,7 +1144,7 @@ self.checkParam(widget, 'columns', 'a b c', expected=('a', 'b', 'c')) self.checkParam(widget, 'columns', ('a', 'b', 'c')) - self.checkParam(widget, 'columns', '') + self.checkParam(widget, 'columns', () if tcl_version < (8, 5) else '') def test_displaycolumns(self): widget = self.create() diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -6,6 +6,7 @@ noconv = str if tcl_version < (8, 5) else False +noconv_meth = noconv and staticmethod(noconv) def int_round(x): return int(round(x)) @@ -13,7 +14,7 @@ _sentinel = object() class AbstractWidgetTest(object): - _conv_pixels = staticmethod(int_round) if tcl_version[:2] != (8, 5) else int + _conv_pixels = staticmethod(int_round if tcl_version[:2] != (8, 5) else int) _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:50:14 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 2 Nov 2013 16:50:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Use_unittest?= =?utf-8?b?Lm1haW4oKSBpbiB0ZXN0X2h0bWxwYXJzZXIu?= Message-ID: <3dBl8k4LgDz7LjS@mail.python.org> http://hg.python.org/cpython/rev/b77c1a19028e changeset: 86844:b77c1a19028e branch: 3.3 parent: 86839:a58fce53e873 user: Ezio Melotti date: Sat Nov 02 17:49:08 2013 +0200 summary: Use unittest.main() in test_htmlparser. files: Lib/test/test_htmlparser.py | 8 +------- 1 files changed, 1 insertions(+), 7 deletions(-) diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -753,11 +753,5 @@ ("data", "spam"), ("endtag", "a")]) - -def test_main(): - support.run_unittest(HTMLParserStrictTestCase, HTMLParserTolerantTestCase, - AttributesStrictTestCase, AttributesTolerantTestCase) - - if __name__ == "__main__": - test_main() + unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:50:15 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 2 Nov 2013 16:50:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_test=5Fhtmlparser_changes_from_3=2E3=2E?= Message-ID: <3dBl8l68Ldz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/0ce6c0a47676 changeset: 86845:0ce6c0a47676 parent: 86842:0a56709eb798 parent: 86844:b77c1a19028e user: Ezio Melotti date: Sat Nov 02 17:50:02 2013 +0200 summary: Merge test_htmlparser changes from 3.3. 
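The test_htmlparser change being merged here replaces a hand-written test_main()/run_unittest() driver with unittest.main(), which collects every TestCase in the module automatically. A minimal sketch of the resulting pattern (the test name below is invented):

    import unittest

    class ExampleTest(unittest.TestCase):
        def test_truth(self):
            self.assertTrue(True)

    if __name__ == "__main__":
        unittest.main()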
files: Lib/test/test_htmlparser.py | 8 +------- 1 files changed, 1 insertions(+), 7 deletions(-) diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -764,11 +764,5 @@ ("data", "spam"), ("endtag", "a")]) - -def test_main(): - support.run_unittest(HTMLParserStrictTestCase, HTMLParserTolerantTestCase, - AttributesStrictTestCase, AttributesTolerantTestCase) - - if __name__ == "__main__": - test_main() + unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:58:18 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 16:58:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Mjg2?= =?utf-8?q?=3A_Adding_test_demonstrating_the_failure_when_a_directory_is_f?= =?utf-8?q?ound?= Message-ID: <3dBlL24Stcz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/22bac968e226 changeset: 86846:22bac968e226 branch: 2.7 parent: 86838:0554e2d37bf8 user: Jason R. Coombs date: Sat Nov 02 11:29:33 2013 -0400 summary: Issue #19286: Adding test demonstrating the failure when a directory is found in the package_data globs. files: Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ 1 files changed, 31 insertions(+), 0 deletions(-) diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -99,6 +99,37 @@ os.chdir(cwd) sys.stdout = old_stdout + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. + """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used pkg_dir, dist = self.create_dist() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:58:19 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 16:58:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Mjg2?= =?utf-8?q?=3A_=5Bdistutils=5D_Only_match_files_in_build=5Fpy=2Efind=5Fdat?= =?utf-8?q?a=5Ffiles=2E?= Message-ID: <3dBlL369dSz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/0a1cf947eff6 changeset: 86847:0a1cf947eff6 branch: 2.7 user: Jason R. Coombs date: Sat Nov 02 11:07:35 2013 -0400 summary: Issue #19286: [distutils] Only match files in build_py.find_data_files. 
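The fix summarized above amounts to filtering glob() results so that a directory matched by a package_data pattern such as "doc/*" is not treated as a file. A rough stand-alone sketch, with a hypothetical layout:

    import os
    from glob import glob

    pattern = os.path.join("pkg", "doc", "*")   # hypothetical package layout
    matches = glob(pattern)                     # may include subdirectories
    files = [fn for fn in matches if os.path.isfile(fn)]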
files: Lib/distutils/command/build_py.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -128,7 +128,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:58:21 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 16:58:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Update_NEWS?= Message-ID: <3dBlL50ywhz7Ljq@mail.python.org> http://hg.python.org/cpython/rev/4db15e465624 changeset: 86848:4db15e465624 branch: 2.7 user: Jason R. Coombs date: Sat Nov 02 11:43:40 2013 -0400 summary: Update NEWS files: Misc/NEWS | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + Tests ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 16:58:22 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 16:58:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_Merge?= Message-ID: <3dBlL62sdnz7Lk8@mail.python.org> http://hg.python.org/cpython/rev/2b085aba89f9 changeset: 86849:2b085aba89f9 branch: 2.7 parent: 86843:278d15021d9a parent: 86848:4db15e465624 user: Jason R. Coombs date: Sat Nov 02 11:57:54 2013 -0400 summary: Merge files: Lib/distutils/command/build_py.py | 3 +- Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 36 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -128,7 +128,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -99,6 +99,37 @@ os.chdir(cwd) sys.stdout = old_stdout + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. 
+ """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used pkg_dir, dist = self.create_dist() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + Tests ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:40:03 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sat, 2 Nov 2013 17:40:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Relax_test_for_process_ret?= =?utf-8?q?urn_code_on_Windows=2E?= Message-ID: <3dBmGC6NwSz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/f051d3aaaef4 changeset: 86850:f051d3aaaef4 parent: 86845:0ce6c0a47676 user: Richard Oudkerk date: Sat Nov 02 16:38:58 2013 +0000 summary: Relax test for process return code on Windows. files: Lib/test/test_asyncio/test_events.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -961,14 +961,14 @@ def check_terminated(self, returncode): if sys.platform == 'win32': self.assertIsInstance(returncode, int) - self.assertNotEqual(0, returncode) + # expect 1 but sometimes get 0 else: self.assertEqual(-signal.SIGTERM, returncode) def check_killed(self, returncode): if sys.platform == 'win32': self.assertIsInstance(returncode, int) - self.assertNotEqual(0, returncode) + # expect 1 but sometimes get 0 else: self.assertEqual(-signal.SIGKILL, returncode) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:48:20 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sat, 2 Nov 2013 17:48:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Relax_timeout_?= =?utf-8?q?test=2E?= Message-ID: <3dBmRm6nH5zMgW@mail.python.org> http://hg.python.org/cpython/rev/f70142e3799b changeset: 86851:f70142e3799b branch: 3.3 parent: 86844:b77c1a19028e user: Richard Oudkerk date: Sat Nov 02 16:46:32 2013 +0000 summary: Relax timeout test. 
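The relaxation summarized above concerns a test that times a blocking get() with a 0.2 second timeout: coarse clock resolution and scheduler jitter can make the measured delta land slightly below the nominal timeout, hence the lower bound drops from 0.19 to 0.18. A rough sketch of the timing pattern, using a plain queue.Queue stand-in rather than the multiprocessing queue the real test exercises:

    import queue
    import time

    q = queue.Queue()
    start = time.time()
    try:
        q.get(True, 0.2)        # blocks for roughly 0.2 seconds
    except queue.Empty:
        pass
    delta = time.time() - start
    assert delta >= 0.18, delta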
files: Lib/test/test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -716,7 +716,7 @@ start = time.time() self.assertRaises(pyqueue.Empty, q.get, True, 0.2) delta = time.time() - start - self.assertGreaterEqual(delta, 0.19) + self.assertGreaterEqual(delta, 0.18) # # -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:48:22 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sat, 2 Nov 2013 17:48:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dBmRp29dxz7LkP@mail.python.org> http://hg.python.org/cpython/rev/722da29107b7 changeset: 86852:722da29107b7 parent: 86850:f051d3aaaef4 parent: 86851:f70142e3799b user: Richard Oudkerk date: Sat Nov 02 16:47:08 2013 +0000 summary: Merge. files: Lib/test/_test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -723,7 +723,7 @@ start = time.time() self.assertRaises(pyqueue.Empty, q.get, True, 0.2) delta = time.time() - start - self.assertGreaterEqual(delta, 0.19) + self.assertGreaterEqual(delta, 0.18) # # -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:17 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fixed_some_Tkinter_tests_on_Windows=2E?= Message-ID: <3dBmZd4FDcz7LjS@mail.python.org> http://hg.python.org/cpython/rev/f25679db52fb changeset: 86853:f25679db52fb branch: 3.3 parent: 86844:b77c1a19028e user: Serhiy Storchaka date: Sat Nov 02 18:50:42 2013 +0200 summary: Issue #19085: Fixed some Tkinter tests on Windows. 
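Part of the Windows fix summarized above is simply skipping the screen test when no X display is available, instead of letting os.environ['DISPLAY'] raise KeyError. The skip pattern in isolation (the assertion is a placeholder for the real checks):

    import os
    import unittest

    class ScreenTest(unittest.TestCase):
        def test_screen(self):
            try:
                display = os.environ['DISPLAY']
            except KeyError:
                self.skipTest('No $DISPLAY set.')
            self.assertTrue(display)   # placeholder for the real option checks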
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 14 +++++----- 1 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -73,7 +73,10 @@ def test_screen(self): widget = self.create() self.assertEqual(widget['screen'], '') - display = os.environ['DISPLAY'] + try: + display = os.environ['DISPLAY'] + except KeyError: + self.skipTest('No $DISPLAY set.') self.checkInvalidParam(widget, 'screen', display, errmsg="can't modify -screen option after widget is created") widget2 = self.create(screen=display) @@ -82,13 +85,10 @@ def test_use(self): widget = self.create() self.assertEqual(widget['use'], '') - widget1 = self.create(container=True) - self.assertEqual(widget1['use'], '') - self.checkInvalidParam(widget1, 'use', '0x44022', - errmsg="can't modify -use option after widget is created") - wid = hex(widget1.winfo_id()) + parent = self.create(container=True) + wid = parent.winfo_id() widget2 = self.create(use=wid) - self.assertEqual(widget2['use'], wid) + self.assertEqual(int(widget2['use']), wid) @add_standard_options(StandardOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:18 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fixed_some_Tkinter_tests_on_Windows=2E?= Message-ID: <3dBmZf6wJ3z7LjS@mail.python.org> http://hg.python.org/cpython/rev/4a2afda8f187 changeset: 86854:4a2afda8f187 branch: 2.7 parent: 86849:2b085aba89f9 user: Serhiy Storchaka date: Sat Nov 02 18:50:53 2013 +0200 summary: Issue #19085: Fixed some Tkinter tests on Windows. 
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 14 +++++----- 1 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -70,7 +70,10 @@ def test_screen(self): widget = self.create() self.assertEqual(widget['screen'], '') - display = os.environ['DISPLAY'] + try: + display = os.environ['DISPLAY'] + except KeyError: + self.skipTest('No $DISPLAY set.') self.checkInvalidParam(widget, 'screen', display, errmsg="can't modify -screen option after widget is created") widget2 = self.create(screen=display) @@ -79,13 +82,10 @@ def test_use(self): widget = self.create() self.assertEqual(widget['use'], '') - widget1 = self.create(container=True) - self.assertEqual(widget1['use'], '') - self.checkInvalidParam(widget1, 'use', '0x44022', - errmsg="can't modify -use option after widget is created") - wid = hex(widget1.winfo_id()) + parent = self.create(container=True) + wid = parent.winfo_id() widget2 = self.create(use=wid) - self.assertEqual(widget2['use'], wid) + self.assertEqual(int(widget2['use']), wid) @add_standard_options(StandardOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:20 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dBmZh2c2hz7Lk3@mail.python.org> http://hg.python.org/cpython/rev/ab3b58f21fe7 changeset: 86855:ab3b58f21fe7 parent: 86850:f051d3aaaef4 parent: 86853:f25679db52fb user: Serhiy Storchaka date: Sat Nov 02 18:51:30 2013 +0200 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:21 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dBmZj5H1Vz7Ljq@mail.python.org> http://hg.python.org/cpython/rev/576ffbaa6363 changeset: 86856:576ffbaa6363 branch: 3.3 parent: 86853:f25679db52fb parent: 86851:f70142e3799b user: Serhiy Storchaka date: Sat Nov 02 18:53:06 2013 +0200 summary: Merge heads files: Lib/test/test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -716,7 +716,7 @@ start = time.time() self.assertRaises(pyqueue.Empty, q.get, True, 0.2) delta = time.time() - start - self.assertGreaterEqual(delta, 0.19) + self.assertGreaterEqual(delta, 0.18) # # -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:23 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dBmZl3Kppz7Lk9@mail.python.org> http://hg.python.org/cpython/rev/bc274d0aab88 changeset: 86857:bc274d0aab88 parent: 86855:ab3b58f21fe7 parent: 86852:722da29107b7 user: Serhiy Storchaka date: Sat Nov 02 18:53:19 2013 +0200 summary: Merge heads files: Lib/test/_test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git 
a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -723,7 +723,7 @@ start = time.time() self.assertRaises(pyqueue.Empty, q.get, True, 0.2) delta = time.time() - start - self.assertGreaterEqual(delta, 0.19) + self.assertGreaterEqual(delta, 0.18) # # -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 17:54:25 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 2 Nov 2013 17:54:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dBmZn2q62z7LjS@mail.python.org> http://hg.python.org/cpython/rev/fdc10b073e79 changeset: 86858:fdc10b073e79 parent: 86857:bc274d0aab88 parent: 86856:576ffbaa6363 user: Serhiy Storchaka date: Sat Nov 02 18:53:39 2013 +0200 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:05:07 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 18:05:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4yKTogSXNzdWUgIzE5Mjg2?= =?utf-8?q?=3A_Adding_test_demonstrating_the_failure_when_a_directory_is_f?= =?utf-8?q?ound?= Message-ID: <3dBmq719xjz7LkF@mail.python.org> http://hg.python.org/cpython/rev/d80207d15294 changeset: 86859:d80207d15294 branch: 3.2 parent: 86778:dda1a32748e0 user: Jason R. Coombs date: Sat Nov 02 11:29:33 2013 -0400 summary: Issue #19286: Adding test demonstrating the failure when a directory is found in the package_data globs. files: Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ 1 files changed, 31 insertions(+), 0 deletions(-) diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -121,6 +121,37 @@ found = os.listdir(os.path.join(cmd.build_lib, '__pycache__')) self.assertEqual(sorted(found), ['boiledeggs.%s.pyo' % imp.get_tag()]) + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. 
+ """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:05:08 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 18:05:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4yKTogSXNzdWUgIzE5Mjg2?= =?utf-8?q?=3A_=5Bdistutils=5D_Only_match_files_in_build=5Fpy=2Efind=5Fdat?= =?utf-8?q?a=5Ffiles=2E?= Message-ID: <3dBmq82rtGz7LkF@mail.python.org> http://hg.python.org/cpython/rev/265d369ad3b9 changeset: 86860:265d369ad3b9 branch: 3.2 user: Jason R. Coombs date: Sat Nov 02 11:07:35 2013 -0400 summary: Issue #19286: [distutils] Only match files in build_py.find_data_files. files: Lib/distutils/command/build_py.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -127,7 +127,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:05:09 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 18:05:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_Update_NEWS_fo?= =?utf-8?q?r_265d369ad3b9=2E?= Message-ID: <3dBmq94f4bz7LkF@mail.python.org> http://hg.python.org/cpython/rev/7d399099334d changeset: 86861:7d399099334d branch: 3.2 user: Jason R. Coombs date: Sat Nov 02 13:00:01 2013 -0400 summary: Update NEWS for 265d369ad3b9. files: Misc/NEWS | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + - Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. 
- Issue #14984: On POSIX systems, when netrc is called without a filename -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:05:10 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 18:05:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4yIC0+IDMuMyk6?= =?utf-8?q?_Merge_with_3=2E2_for_Issue_=2319286=2E?= Message-ID: <3dBmqB6ZX2z7Ljg@mail.python.org> http://hg.python.org/cpython/rev/69f8288056eb changeset: 86862:69f8288056eb branch: 3.3 parent: 86856:576ffbaa6363 parent: 86861:7d399099334d user: Jason R. Coombs date: Sat Nov 02 13:01:46 2013 -0400 summary: Merge with 3.2 for Issue #19286. files: Lib/distutils/command/build_py.py | 3 +- Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 36 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -127,7 +127,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -121,6 +121,37 @@ found = os.listdir(os.path.join(cmd.build_lib, '__pycache__')) self.assertEqual(sorted(found), ['boiledeggs.%s.pyo' % imp.get_tag()]) + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. + """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + - Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. 
Tests -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:05:12 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 2 Nov 2013 18:05:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E3_for_Issue_=2319286=2E?= Message-ID: <3dBmqD37Scz7Lk1@mail.python.org> http://hg.python.org/cpython/rev/9eafe31251b4 changeset: 86863:9eafe31251b4 parent: 86858:fdc10b073e79 parent: 86862:69f8288056eb user: Jason R. Coombs date: Sat Nov 02 13:04:51 2013 -0400 summary: Merge with 3.3 for Issue #19286. files: Lib/distutils/command/build_py.py | 3 +- Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 36 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -127,7 +127,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -123,6 +123,37 @@ self.assertEqual(sorted(found), ['boiledeggs.%s.pyo' % sys.implementation.cache_tag]) + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. + """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + - Issue #15114: The html.parser module now raises a DeprecationWarning when the strict argument of HTMLParser or the HTMLParser.error method are used. 
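As that NEWS entry says, constructing HTMLParser with any explicit strict argument, or calling its error() method, now emits DeprecationWarning. A quick sketch against the 3.4 sources patched earlier in this batch (the strict parameter was removed entirely in later releases, so this would not run there):

    import warnings
    from html.parser import HTMLParser

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        HTMLParser(strict=False)      # passing strict at all now warns
    print(caught[0].category)         # <class 'DeprecationWarning'>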
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:09:15 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sat, 2 Nov 2013 18:09:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319478=3A_Make_cho?= =?utf-8?q?ice_of_semaphore_prefix_more_flexible=2E?= Message-ID: <3dBmvv6xwtz7LjV@mail.python.org> http://hg.python.org/cpython/rev/1b5506fc6a50 changeset: 86864:1b5506fc6a50 parent: 86858:fdc10b073e79 user: Richard Oudkerk date: Sat Nov 02 17:05:07 2013 +0000 summary: Issue #19478: Make choice of semaphore prefix more flexible. files: Lib/multiprocessing/process.py | 10 ++++++++-- Lib/multiprocessing/synchronize.py | 4 ++-- 2 files changed, 10 insertions(+), 4 deletions(-) diff --git a/Lib/multiprocessing/process.py b/Lib/multiprocessing/process.py --- a/Lib/multiprocessing/process.py +++ b/Lib/multiprocessing/process.py @@ -301,10 +301,16 @@ self._parent_pid = None self._popen = None self._config = {'authkey': AuthenticationString(os.urandom(32)), - 'semprefix': 'mp'} + 'semprefix': '/mp'} # Note that some versions of FreeBSD only allow named - # semaphores to have names of up to 14 characters. Therfore + # semaphores to have names of up to 14 characters. Therefore # we choose a short prefix. + # + # On MacOSX in a sandbox it may be necessary to use a + # different prefix -- see #19478. + # + # Everything in self._config will be inherited by descendant + # processes. _current_process = _MainProcess() diff --git a/Lib/multiprocessing/synchronize.py b/Lib/multiprocessing/synchronize.py --- a/Lib/multiprocessing/synchronize.py +++ b/Lib/multiprocessing/synchronize.py @@ -115,8 +115,8 @@ @staticmethod def _make_name(): - return '/%s-%s' % (process.current_process()._config['semprefix'], - next(SemLock._rand)) + return '%s-%s' % (process.current_process()._config['semprefix'], + next(SemLock._rand)) # # Semaphore -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 18:09:17 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sat, 2 Nov 2013 18:09:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dBmvx1fw8z7Lkb@mail.python.org> http://hg.python.org/cpython/rev/4abd7cbc2683 changeset: 86865:4abd7cbc2683 parent: 86864:1b5506fc6a50 parent: 86863:9eafe31251b4 user: Richard Oudkerk date: Sat Nov 02 17:08:01 2013 +0000 summary: Merge. 
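A note on the semaphore-prefix change above (#19478): the prefix now lives in the per-process _config dict, which child processes inherit, so an embedding application that must avoid the default '/mp' prefix (for example inside an OS X sandbox) can override it before creating any synchronization objects. This touches a private, undocumented attribute and is shown only as a sketch of where the knob now sits:

    import multiprocessing
    from multiprocessing import process

    process.current_process()._config['semprefix'] = '/myapp'   # private API
    lock = multiprocessing.Lock()   # backing semaphore named like '/myapp-<random>'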
files: Lib/distutils/command/build_py.py | 3 +- Lib/distutils/tests/test_build_py.py | 31 ++++++++++++++++ Misc/NEWS | 3 + 3 files changed, 36 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -127,7 +127,8 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files]) + files.extend([fn for fn in filelist if fn not in files + and os.path.isfile(fn)]) return files def build_package_data(self): diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -123,6 +123,37 @@ self.assertEqual(sorted(found), ['boiledeggs.%s.pyo' % sys.implementation.cache_tag]) + def test_dir_in_package_data(self): + """ + A directory in package_data should not be added to the filelist. + """ + # See bug 19286 + sources = self.mkdtemp() + pkg_dir = os.path.join(sources, "pkg") + + os.mkdir(pkg_dir) + open(os.path.join(pkg_dir, "__init__.py"), "w").close() + + docdir = os.path.join(pkg_dir, "doc") + os.mkdir(docdir) + open(os.path.join(docdir, "testfile"), "w").close() + + # create the directory that could be incorrectly detected as a file + os.mkdir(os.path.join(docdir, 'otherdir')) + + os.chdir(sources) + dist = Distribution({"packages": ["pkg"], + "package_data": {"pkg": ["doc/*"]}}) + # script_name need not exist, it just need to be initialized + dist.script_name = os.path.join(sources, "setup.py") + dist.script_args = ["build"] + dist.parse_command_line() + + try: + dist.run_commands() + except DistutilsFileError: + self.fail("failed package_data when data dir includes a dir") + def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #19286: Directories in ``package_data`` are no longer added to + the filelist, preventing failure outlined in the ticket. + - Issue #15114: The html.parser module now raises a DeprecationWarning when the strict argument of HTMLParser or the HTMLParser.error method are used. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 19:21:48 2013 From: python-checkins at python.org (tim.peters) Date: Sat, 2 Nov 2013 19:21:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Mercurial=3A__ignore_preco?= =?utf-8?q?mpiled_header_files_on_Windows=2E?= Message-ID: <3dBpWc1VK7z7LjR@mail.python.org> http://hg.python.org/cpython/rev/9928a927975f changeset: 86866:9928a927975f user: Tim Peters date: Sat Nov 02 13:21:28 2013 -0500 summary: Mercurial: ignore precompiled header files on Windows. 
files: .hgignore | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -86,6 +86,7 @@ PCbuild/Win32-temp-* PCbuild/x64-temp-* PCbuild/amd64 +PCbuild/ipch BuildLog.htm __pycache__ Modules/_freeze_importlib -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 2 22:12:47 2013 From: python-checkins at python.org (eric.snow) Date: Sat, 2 Nov 2013 22:12:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=5BPEP_451=5D_Changes_related?= =?utf-8?q?_to_the_find=5Fspec=28=29_target_module=2E?= Message-ID: <3dBtJv4xDzz7LjV@mail.python.org> http://hg.python.org/peps/rev/63595acfe51d changeset: 5246:63595acfe51d user: Eric Snow date: Sat Nov 02 15:08:19 2013 -0600 summary: [PEP 451] Changes related to the find_spec() target module. files: pep-0451.txt | 51 ++++++++++++++------------------------- 1 files changed, 19 insertions(+), 32 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -278,7 +278,7 @@ Other API Additions ------------------- -* importlib.find_spec(name, path=None, existing=None) will work exactly +* importlib.find_spec(name, path=None, target=None) will work exactly the same as importlib.find_loader() (which it replaces), but return a spec instead of a loader. @@ -493,7 +493,7 @@ name = module.__spec__.name except AttributeError: name = module.__name__ - spec = find_spec(name, existing=module) + spec = find_spec(name, target=module) if sys.modules.get(name) is not module: raise ImportError @@ -521,7 +521,7 @@ exec_module() is called during load (not reload) the import machinery would already have placed the module in sys.modules. This is part of the reason why find_spec() has -`the "existing" parameter `_. +`the "target" parameter `_. The semantics of reload will remain essentially the same as they exist already [#reload-semantics-fix]_. The impact of this PEP on some kinds @@ -697,9 +697,9 @@ loader would be costly to create, that loader can be designed to defer the cost until later. -**MetaPathFinder.find_spec(name, path=None, existing=None)** +**MetaPathFinder.find_spec(name, path=None, target=None)** -**PathEntryFinder.find_spec(name, existing=None)** +**PathEntryFinder.find_spec(name, target=None)** Finders must return ModuleSpec objects when find_spec() is called. This new method replaces find_module() and @@ -714,8 +714,8 @@ added in Python 3.3. However, the extra complexity and a less-than- explicit method name aren't worth it. -The "existing" parameter of find_spec() ---------------------------------------- +The "target" parameter of find_spec() +------------------------------------- A module object with the same name as the "name" argument (or None, the default) should be passed in to "exising". This argument allows the @@ -727,26 +727,26 @@ will return in the spec. In the case of reload, at this point the finder should also decide whether or not the loader supports loading into the module-to-be-reloaded (which was passed in to find_spec() as -"existing"). This decision may entail consulting with the loader. If +"target"). This decision may entail consulting with the loader. If the finder determines that the loader does not support reloading that -module, it should either find another loader or return None (indicating -that it could not "find" the module). 
This reload decision is important -since, as noted in `How Reloading Will Work`_, loaders will no longer be -able to trivially identify a reload situation on their own. +module, it should either find another loader or raise ImportError +(completely stopping import of the module). This reload decision is +important since, as noted in `How Reloading Will Work`_, loaders will +no longer be able to trivially identify a reload situation on their own. -Two alternatives were presented to the "existing" parameter: -Loader.supports_reload() and adding "existing" to Loader.exec_module() +Two alternatives were presented to the "target" parameter: +Loader.supports_reload() and adding "target" to Loader.exec_module() instead of find_spec(). supports_reload() was the initial approach to the reload situation. [#supports_reload]_ However, there was some opposition to the loader-specific, reload-centric approach. [#supports_reload_considered_harmful]_ -As to "existing" on exec_module(), the loader may need other information -from the existing module (or spec) during reload, more than just "does +As to "target" on exec_module(), the loader may need other information +from the target module (or spec) during reload, more than just "does this loader support reloading this module", that is no longer available with the move away from load_module(). A proposal on the table was to -add something like "existing" to exec_module(). [#exec_module_existing]_ -However, putting "existing" on find_spec() instead is more in line with +add something like "target" to exec_module(). [#exec_module_target]_ +However, putting "target" on find_spec() instead is more in line with the goals of this PEP. Furthermore, it obviates the need for supports_reload(). @@ -835,19 +835,6 @@ * importlib.reload() will now make use of the per-module import lock. -Open Issues -=========== - -\* In the `Finders`_ section, the PEP specifies returning None (or using -a different loader) when the found loader does not support loading into -an existing module (e.g during reload). An alternative to returning -None would be to raise ImportError with a message like "the loader does -not support reloading the module". This may actually be a better -approach since "could not find a loader" and "the found loader won't -work" are different situations that a single return value (None) may not -sufficiently represent. - - Reference Implementation ======================== @@ -943,7 +930,7 @@ .. [#supports_reload_considered_harmful] https://mail.python.org/pipermail/python-dev/2013-October/129971.html -.. [#exec_module_existing] +.. 
[#exec_module_target] https://mail.python.org/pipermail/python-dev/2013-October/129933.html Copyright -- Repository URL: http://hg.python.org/peps From solipsis at pitrou.net Sun Nov 3 07:34:13 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 03 Nov 2013 07:34:13 +0100 Subject: [Python-checkins] Daily reference leaks (9928a927975f): sum=4 Message-ID: results for 9928a927975f on branch "default" -------------------------------------------- test_asyncio leaked [4, 0, 0] memory blocks, sum=4 test_site leaked [2, -2, 0] references, sum=0 test_site leaked [2, -2, 0] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogPGFII9', '-x'] From python-checkins at python.org Sun Nov 3 07:42:40 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 3 Nov 2013 07:42:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=234331=3A_Added_fun?= =?utf-8?q?ctools=2Epartialmethod?= Message-ID: <3dC6yS6ks8z7Ljv@mail.python.org> http://hg.python.org/cpython/rev/46d3c5539981 changeset: 86867:46d3c5539981 user: Nick Coghlan date: Sun Nov 03 16:41:46 2013 +1000 summary: Issue #4331: Added functools.partialmethod Initial patch by Alon Horev files: Doc/library/functools.rst | 43 +++++++++- Doc/whatsnew/3.4.rst | 20 ++++- Lib/functools.py | 78 ++++++++++++++++- Lib/test/test_functools.py | 116 +++++++++++++++++++++++++ Misc/NEWS | 2 + 5 files changed, 255 insertions(+), 4 deletions(-) diff --git a/Doc/library/functools.rst b/Doc/library/functools.rst --- a/Doc/library/functools.rst +++ b/Doc/library/functools.rst @@ -194,6 +194,48 @@ 18 +.. class:: partialmethod(func, *args, **keywords) + + Return a new :class:`partialmethod` descriptor which behaves + like :class:`partial` except that it is designed to be used as a method + definition rather than being directly callable. + + *func* must be a :term:`descriptor` or a callable (objects which are both, + like normal functions, are handled as descriptors). + + When *func* is a descriptor (such as a normal Python function, + :func:`classmethod`, :func:`staticmethod`, :func:`abstractmethod` or + another instance of :class:`partialmethod`), calls to ``__get__`` are + delegated to the underlying descriptor, and an appropriate + :class:`partial` object returned as the result. + + When *func* is a non-descriptor callable, an appropriate bound method is + created dynamically. This behaves like a normal Python function when + used as a method: the *self* argument will be inserted as the first + positional argument, even before the *args* and *keywords* supplied to + the :class:`partialmethod` constructor. + + Example:: + + >>> class Cell(object): + ... @property + ... def alive(self): + ... return self._alive + ... def set_state(self, state): + ... self._alive = bool(state) + ... set_alive = partialmethod(set_alive, True) + ... set_dead = partialmethod(set_alive, False) + ... + >>> c = Cell() + >>> c.alive + False + >>> c.set_alive() + >>> c.alive + True + + .. versionadded:: 3.4 + + .. function:: reduce(function, iterable[, initializer]) Apply *function* of two arguments cumulatively to the items of *sequence*, from @@ -431,4 +473,3 @@ are not created automatically. Also, :class:`partial` objects defined in classes behave like static methods and do not transform into bound methods during instance attribute look-up. 
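That paragraph ends on the key contrast: a functools.partial object stored on a class does not bind to the instance, whereas the new partialmethod descriptor inserts self ahead of its frozen arguments. A small sketch against the 3.4 functools added in this commit (the class and helper names are invented):

    import functools

    def record(*args, **kwargs):
        return args, kwargs

    class Demo:
        as_partial = functools.partial(record, 1)        # no binding to the instance
        as_method = functools.partialmethod(record, 1)   # self is inserted first

    d = Demo()
    print(d.as_partial())   # ((1,), {})
    print(d.as_method())    # ((<Demo object at 0x...>, 1), {})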
- diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -342,7 +342,25 @@ functools --------- -New :func:`functools.singledispatch` decorator: see the :pep:`443`. +The new :func:`~functools.partialmethod` descriptor bring partial argument +application to descriptors, just as :func:`~functools.partial` provides +for normal callables. The new descriptor also makes it easier to get +arbitrary callables (including :func:`~functools.partial` instances) +to behave like normal instance methods when included in a class definition. + +(Contributed by Alon Horev and Nick Coghlan in :issue:`4331`) + +The new :func:`~functools.singledispatch` decorator brings support for +single-dispatch generic functions to the Python standard library. Where +object oriented programming focuses on grouping multiple operations on a +common set of data into a class, a generic function focuses on grouping +multiple implementations of an operation that allows it to work with +*different* kinds of data. + +.. seealso:: + + :pep:`443` - Single-dispatch generic functions + PEP written and implemented by ?ukasz Langa. hashlib diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -19,7 +19,7 @@ pass from abc import get_cache_token from collections import namedtuple -from types import MappingProxyType +from types import MappingProxyType, MethodType from weakref import WeakKeyDictionary try: from _thread import RLock @@ -223,8 +223,9 @@ ### partial() argument application ################################################################################ +# Purely functional, no descriptor behaviour def partial(func, *args, **keywords): - """new function with partial application of the given arguments + """New function with partial application of the given arguments and keywords. """ def newfunc(*fargs, **fkeywords): @@ -241,6 +242,79 @@ except ImportError: pass +# Descriptor version +class partialmethod(object): + """Method descriptor with partial application of the given arguments + and keywords. + + Supports wrapping existing descriptors and handles non-descriptor + callables as instance methods. 
+ """ + + def __init__(self, func, *args, **keywords): + if not callable(func) and not hasattr(func, "__get__"): + raise TypeError("{!r} is not callable or a descriptor" + .format(func)) + + # func could be a descriptor like classmethod which isn't callable, + # so we can't inherit from partial (it verifies func is callable) + if isinstance(func, partialmethod): + # flattening is mandatory in order to place cls/self before all + # other arguments + # it's also more efficient since only one function will be called + self.func = func.func + self.args = func.args + args + self.keywords = func.keywords.copy() + self.keywords.update(keywords) + else: + self.func = func + self.args = args + self.keywords = keywords + + def __repr__(self): + args = ", ".join(map(repr, self.args)) + keywords = ", ".join("{}={!r}".format(k, v) + for k, v in self.keywords.items()) + format_string = "{module}.{cls}({func}, {args}, {keywords})" + return format_string.format(module=self.__class__.__module__, + cls=self.__class__.__name__, + func=self.func, + args=args, + keywords=keywords) + + def _make_unbound_method(self): + def _method(*args, **keywords): + call_keywords = self.keywords.copy() + call_keywords.update(keywords) + cls_or_self, *rest = args + call_args = (cls_or_self,) + self.args + tuple(rest) + return self.func(*call_args, **call_keywords) + _method.__isabstractmethod__ = self.__isabstractmethod__ + return _method + + def __get__(self, obj, cls): + get = getattr(self.func, "__get__", None) + result = None + if get is not None: + new_func = get(obj, cls) + if new_func is not self.func: + # Assume __get__ returning something new indicates the + # creation of an appropriate callable + result = partial(new_func, *self.args, **self.keywords) + try: + result.__self__ = new_func.__self__ + except AttributeError: + pass + if result is None: + # If the underlying descriptor didn't do anything, treat this + # like an instance method + result = self._make_unbound_method().__get__(obj, cls) + return result + + @property + def __isabstractmethod__(self): + return getattr(self.func, "__isabstractmethod__", False) + ################################################################################ ### LRU Cache function decorator diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -1,3 +1,4 @@ +import abc import collections from itertools import permutations import pickle @@ -217,6 +218,120 @@ partial = PartialSubclass +class TestPartialMethod(unittest.TestCase): + + class A(object): + nothing = functools.partialmethod(capture) + positional = functools.partialmethod(capture, 1) + keywords = functools.partialmethod(capture, a=2) + both = functools.partialmethod(capture, 3, b=4) + + nested = functools.partialmethod(positional, 5) + + over_partial = functools.partialmethod(functools.partial(capture, c=6), 7) + + static = functools.partialmethod(staticmethod(capture), 8) + cls = functools.partialmethod(classmethod(capture), d=9) + + a = A() + + def test_arg_combinations(self): + self.assertEqual(self.a.nothing(), ((self.a,), {})) + self.assertEqual(self.a.nothing(5), ((self.a, 5), {})) + self.assertEqual(self.a.nothing(c=6), ((self.a,), {'c': 6})) + self.assertEqual(self.a.nothing(5, c=6), ((self.a, 5), {'c': 6})) + + self.assertEqual(self.a.positional(), ((self.a, 1), {})) + self.assertEqual(self.a.positional(5), ((self.a, 1, 5), {})) + self.assertEqual(self.a.positional(c=6), ((self.a, 1), {'c': 6})) + 
self.assertEqual(self.a.positional(5, c=6), ((self.a, 1, 5), {'c': 6})) + + self.assertEqual(self.a.keywords(), ((self.a,), {'a': 2})) + self.assertEqual(self.a.keywords(5), ((self.a, 5), {'a': 2})) + self.assertEqual(self.a.keywords(c=6), ((self.a,), {'a': 2, 'c': 6})) + self.assertEqual(self.a.keywords(5, c=6), ((self.a, 5), {'a': 2, 'c': 6})) + + self.assertEqual(self.a.both(), ((self.a, 3), {'b': 4})) + self.assertEqual(self.a.both(5), ((self.a, 3, 5), {'b': 4})) + self.assertEqual(self.a.both(c=6), ((self.a, 3), {'b': 4, 'c': 6})) + self.assertEqual(self.a.both(5, c=6), ((self.a, 3, 5), {'b': 4, 'c': 6})) + + self.assertEqual(self.A.both(self.a, 5, c=6), ((self.a, 3, 5), {'b': 4, 'c': 6})) + + def test_nested(self): + self.assertEqual(self.a.nested(), ((self.a, 1, 5), {})) + self.assertEqual(self.a.nested(6), ((self.a, 1, 5, 6), {})) + self.assertEqual(self.a.nested(d=7), ((self.a, 1, 5), {'d': 7})) + self.assertEqual(self.a.nested(6, d=7), ((self.a, 1, 5, 6), {'d': 7})) + + self.assertEqual(self.A.nested(self.a, 6, d=7), ((self.a, 1, 5, 6), {'d': 7})) + + def test_over_partial(self): + self.assertEqual(self.a.over_partial(), ((self.a, 7), {'c': 6})) + self.assertEqual(self.a.over_partial(5), ((self.a, 7, 5), {'c': 6})) + self.assertEqual(self.a.over_partial(d=8), ((self.a, 7), {'c': 6, 'd': 8})) + self.assertEqual(self.a.over_partial(5, d=8), ((self.a, 7, 5), {'c': 6, 'd': 8})) + + self.assertEqual(self.A.over_partial(self.a, 5, d=8), ((self.a, 7, 5), {'c': 6, 'd': 8})) + + def test_bound_method_introspection(self): + obj = self.a + self.assertIs(obj.both.__self__, obj) + self.assertIs(obj.nested.__self__, obj) + self.assertIs(obj.over_partial.__self__, obj) + self.assertIs(obj.cls.__self__, self.A) + self.assertIs(self.A.cls.__self__, self.A) + + def test_unbound_method_retrieval(self): + obj = self.A + self.assertFalse(hasattr(obj.both, "__self__")) + self.assertFalse(hasattr(obj.nested, "__self__")) + self.assertFalse(hasattr(obj.over_partial, "__self__")) + self.assertFalse(hasattr(obj.static, "__self__")) + self.assertFalse(hasattr(self.a.static, "__self__")) + + def test_descriptors(self): + for obj in [self.A, self.a]: + with self.subTest(obj=obj): + self.assertEqual(obj.static(), ((8,), {})) + self.assertEqual(obj.static(5), ((8, 5), {})) + self.assertEqual(obj.static(d=8), ((8,), {'d': 8})) + self.assertEqual(obj.static(5, d=8), ((8, 5), {'d': 8})) + + self.assertEqual(obj.cls(), ((self.A,), {'d': 9})) + self.assertEqual(obj.cls(5), ((self.A, 5), {'d': 9})) + self.assertEqual(obj.cls(c=8), ((self.A,), {'c': 8, 'd': 9})) + self.assertEqual(obj.cls(5, c=8), ((self.A, 5), {'c': 8, 'd': 9})) + + def test_overriding_keywords(self): + self.assertEqual(self.a.keywords(a=3), ((self.a,), {'a': 3})) + self.assertEqual(self.A.keywords(self.a, a=3), ((self.a,), {'a': 3})) + + def test_invalid_args(self): + with self.assertRaises(TypeError): + class B(object): + method = functools.partialmethod(None, 1) + + def test_repr(self): + self.assertEqual(repr(vars(self.A)['both']), + 'functools.partialmethod({}, 3, b=4)'.format(capture)) + + def test_abstract(self): + class Abstract(abc.ABCMeta): + + @abc.abstractmethod + def add(self, x, y): + pass + + add5 = functools.partialmethod(add, 5) + + self.assertTrue(Abstract.add.__isabstractmethod__) + self.assertTrue(Abstract.add5.__isabstractmethod__) + + for func in [self.A.static, self.A.cls, self.A.over_partial, self.A.nested, self.A.both]: + self.assertFalse(getattr(func, '__isabstractmethod__', False)) + + class 
TestUpdateWrapper(unittest.TestCase): def check_wrapper(self, wrapper, wrapped, @@ -1433,6 +1548,7 @@ TestPartialC, TestPartialPy, TestPartialCSubclass, + TestPartialMethod, TestUpdateWrapper, TestTotalOrdering, TestCmpToKeyC, diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -1192,6 +1192,8 @@ Library ------- +- Issue #4331: Added functools.partialmethod (Initial patch by Alon Horev) + - Issue #13461: Fix a crash in the TextIOWrapper.tell method on 64-bit platforms. Patch by Yogesh Chaudhari. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 07:55:00 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 3 Nov 2013 07:55:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319439=3A_execute_?= =?utf-8?q?embedding_tests_on_Windows?= Message-ID: <3dC7Dh2gwFz7Llq@mail.python.org> http://hg.python.org/cpython/rev/c8c6c007ade3 changeset: 86868:c8c6c007ade3 user: Nick Coghlan date: Sun Nov 03 16:54:46 2013 +1000 summary: Close #19439: execute embedding tests on Windows Patch by Zachary Ware files: Lib/test/test_capi.py | 78 ++++--- Misc/NEWS | 3 + PCbuild/_testembed.vcxproj | 161 +++++++++++++++++ PCbuild/_testembed.vcxproj.filters | 22 ++ PCbuild/pcbuild.sln | 18 + 5 files changed, 250 insertions(+), 32 deletions(-) diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -9,7 +9,6 @@ import sys import time import unittest -import textwrap from test import support try: import _posixsubprocess @@ -219,15 +218,17 @@ self.assertEqual(_testcapi.argparsing("Hello", "World"), 1) - at unittest.skipIf( - sys.platform.startswith('win'), - "interpreter embedding tests aren't built under Windows") class EmbeddingTests(unittest.TestCase): - # XXX only tested under Unix checkouts - def setUp(self): basepath = os.path.dirname(os.path.dirname(os.path.dirname(__file__))) - self.test_exe = exe = os.path.join(basepath, "Modules", "_testembed") + exename = "_testembed" + if sys.platform.startswith("win"): + ext = ("_d" if "_d" in sys.executable else "") + ".exe" + exename += ext + exepath = os.path.dirname(sys.executable) + else: + exepath = os.path.join(basepath, "Modules") + self.test_exe = exe = os.path.join(exepath, exename) if not os.path.exists(exe): self.skipTest("%r doesn't exist" % exe) # This is needed otherwise we get a fatal error: @@ -260,6 +261,16 @@ print(out) print(err) + @staticmethod + def _get_default_pipe_encoding(): + rp, wp = os.pipe() + try: + with os.fdopen(wp, 'w') as w: + default_pipe_encoding = w.encoding + finally: + os.close(rp) + return default_pipe_encoding + def test_forced_io_encoding(self): # Checks forced configuration of embedded interpreter IO streams out, err = self.run_embedded_interpreter("forced_io_encoding") @@ -267,31 +278,34 @@ print() print(out) print(err) - expected_output = textwrap.dedent("""\ - --- Use defaults --- - Expected encoding: default - Expected errors: default - stdin: {0.__stdin__.encoding}:strict - stdout: {0.__stdout__.encoding}:strict - stderr: {0.__stderr__.encoding}:backslashreplace - --- Set errors only --- - Expected encoding: default - Expected errors: surrogateescape - stdin: {0.__stdin__.encoding}:surrogateescape - stdout: {0.__stdout__.encoding}:surrogateescape - stderr: {0.__stderr__.encoding}:backslashreplace - --- Set encoding only --- - Expected encoding: latin-1 - Expected errors: default - stdin: latin-1:strict - stdout: latin-1:strict - stderr: latin-1:backslashreplace - --- Set 
encoding and errors --- - Expected encoding: latin-1 - Expected errors: surrogateescape - stdin: latin-1:surrogateescape - stdout: latin-1:surrogateescape - stderr: latin-1:backslashreplace""").format(sys) + expected_stdin_encoding = sys.__stdin__.encoding + expected_pipe_encoding = self._get_default_pipe_encoding() + expected_output = os.linesep.join([ + "--- Use defaults ---", + "Expected encoding: default", + "Expected errors: default", + "stdin: {0}:strict", + "stdout: {1}:strict", + "stderr: {1}:backslashreplace", + "--- Set errors only ---", + "Expected encoding: default", + "Expected errors: surrogateescape", + "stdin: {0}:surrogateescape", + "stdout: {1}:surrogateescape", + "stderr: {1}:backslashreplace", + "--- Set encoding only ---", + "Expected encoding: latin-1", + "Expected errors: default", + "stdin: latin-1:strict", + "stdout: latin-1:strict", + "stderr: latin-1:backslashreplace", + "--- Set encoding and errors ---", + "Expected encoding: latin-1", + "Expected errors: surrogateescape", + "stdin: latin-1:surrogateescape", + "stdout: latin-1:surrogateescape", + "stderr: latin-1:backslashreplace"]).format(expected_stdin_encoding, + expected_pipe_encoding) # This is useful if we ever trip over odd platform behaviour self.maxDiff = None self.assertEqual(out.strip(), expected_output) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -153,6 +153,9 @@ Tests ----- +- Issue #19439: interpreter embedding tests are now executed on Windows + (Patch by Zachary Ware) + - Issue #19085: Added basic tests for all tkinter widget options. - Issue 19384: Fix test_py_compile for root user, patch by Claudiu Popa. diff --git a/PCbuild/_testembed.vcxproj b/PCbuild/_testembed.vcxproj new file mode 100644 --- /dev/null +++ b/PCbuild/_testembed.vcxproj @@ -0,0 +1,161 @@ +? + + + + Debug + Win32 + + + Debug + x64 + + + Release + Win32 + + + Release + x64 + + + + {6DAC66D9-E703-4624-BE03-49112AB5AA62} + Win32Proj + _testembed + + + + Application + true + Unicode + + + Application + true + Unicode + + + Application + false + true + Unicode + + + Application + false + true + Unicode + + + + + + + + + + + + + + + + + + + + + + + + + + + false + + + false + + + false + + + false + + + + + + Level3 + Disabled + WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions) + + + Console + true + + + + + + + Level3 + Disabled + WIN32;_DEBUG;_CONSOLE;%(PreprocessorDefinitions) + + + Console + true + + + + + Level3 + + + MaxSpeed + true + true + WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions) + + + Console + true + true + true + + + + + Level3 + + + MaxSpeed + true + true + WIN32;NDEBUG;_CONSOLE;%(PreprocessorDefinitions) + + + Console + true + true + true + + + + + + + + {cf7ac3d1-e2df-41d2-bea6-1e2556cdea26} + true + true + false + true + false + + + + + + \ No newline at end of file diff --git a/PCbuild/_testembed.vcxproj.filters b/PCbuild/_testembed.vcxproj.filters new file mode 100644 --- /dev/null +++ b/PCbuild/_testembed.vcxproj.filters @@ -0,0 +1,22 @@ +? 
+ + + + {4FC737F1-C7A5-4376-A066-2A32D752A2FF} + cpp;c;cc;cxx;def;odl;idl;hpj;bat;asm;asmx + + + {93995380-89BD-4b04-88EB-625FBE52EBFB} + h;hpp;hxx;hm;inl;inc;xsd + + + {67DA6AB6-F800-4c08-8B7A-83BB121AAD01} + rc;ico;cur;bmp;dlg;rc2;rct;bin;rgs;gif;jpg;jpeg;jpe;resx;tiff;tif;png;wav;mfcribbon-ms + + + + + Source Files + + + \ No newline at end of file diff --git a/PCbuild/pcbuild.sln b/PCbuild/pcbuild.sln --- a/PCbuild/pcbuild.sln +++ b/PCbuild/pcbuild.sln @@ -78,6 +78,8 @@ EndProject Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "_overlapped", "_overlapped.vcxproj", "{EB6E69DD-04BF-4543-9B92-49FAABCEAC2E}" EndProject +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "_testembed", "_testembed.vcxproj", "{6DAC66D9-E703-4624-BE03-49112AB5AA62}" +EndProject Global GlobalSection(SolutionConfigurationPlatforms) = preSolution Debug|Win32 = Debug|Win32 @@ -631,6 +633,22 @@ {254A0C05-6696-4B08-8CB2-EF7D533AEE01}.Release|Win32.Build.0 = Release|Win32 {254A0C05-6696-4B08-8CB2-EF7D533AEE01}.Release|x64.ActiveCfg = Release|x64 {254A0C05-6696-4B08-8CB2-EF7D533AEE01}.Release|x64.Build.0 = Release|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Debug|Win32.ActiveCfg = Debug|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Debug|Win32.Build.0 = Debug|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Debug|x64.ActiveCfg = Debug|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Debug|x64.Build.0 = Debug|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGInstrument|Win32.ActiveCfg = PGInstrument|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGInstrument|Win32.Build.0 = PGInstrument|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGInstrument|x64.ActiveCfg = PGInstrument|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGInstrument|x64.Build.0 = PGInstrument|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGUpdate|Win32.ActiveCfg = PGUpdate|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGUpdate|Win32.Build.0 = PGUpdate|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGUpdate|x64.ActiveCfg = PGUpdate|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.PGUpdate|x64.Build.0 = PGUpdate|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Release|Win32.ActiveCfg = Release|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Release|Win32.Build.0 = Release|Win32 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Release|x64.ActiveCfg = Release|x64 + {6DAC66D9-E703-4624-BE03-49112AB5AA62}.Release|x64.Build.0 = Release|x64 {EB6E69DD-04BF-4543-9B92-49FAABCEAC2E}.Debug|Win32.ActiveCfg = Debug|Win32 {EB6E69DD-04BF-4543-9B92-49FAABCEAC2E}.Debug|Win32.Build.0 = Debug|Win32 {EB6E69DD-04BF-4543-9B92-49FAABCEAC2E}.Debug|x64.ActiveCfg = Debug|x64 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 08:02:03 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 3 Nov 2013 08:02:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319403=3A_make_con?= =?utf-8?q?textlib=2Eredirect=5Fstdout_reentrant?= Message-ID: <3dC7Nq607Gz7LmX@mail.python.org> http://hg.python.org/cpython/rev/87d49e2cdd34 changeset: 86869:87d49e2cdd34 user: Nick Coghlan date: Sun Nov 03 17:00:51 2013 +1000 summary: Close #19403: make contextlib.redirect_stdout reentrant files: Doc/library/contextlib.rst | 111 ++++++++++++++++-------- Lib/contextlib.py | 12 +- Lib/test/test_contextlib.py | 24 +++- Misc/NEWS | 2 + 4 files changed, 97 insertions(+), 52 deletions(-) diff --git a/Doc/library/contextlib.rst b/Doc/library/contextlib.rst --- a/Doc/library/contextlib.rst +++ b/Doc/library/contextlib.rst @@ -651,22 +651,33 @@ but may also be 
used *inside* a :keyword:`with` statement that is already using the same context manager. -:class:`threading.RLock` is an example of a reentrant context manager, as is -:func:`suppress`. Here's a toy example of reentrant use (real world -examples of reentrancy are more likely to occur with objects like recursive -locks and are likely to be far more complicated than this example):: +:class:`threading.RLock` is an example of a reentrant context manager, as are +:func:`suppress` and :func:`redirect_stdout`. Here's a very simple example of +reentrant use:: - >>> from contextlib import suppress - >>> ignore_raised_exception = suppress(ZeroDivisionError) - >>> with ignore_raised_exception: - ... with ignore_raised_exception: - ... 1/0 - ... print("This line runs") - ... 1/0 - ... print("This is skipped") + >>> from contextlib import redirect_stdout + >>> from io import StringIO + >>> stream = StringIO() + >>> write_to_stream = redirect_stdout(stream) + >>> with write_to_stream: + ... print("This is written to the stream rather than stdout") + ... with write_to_stream: + ... print("This is also written to the stream") ... - This line runs - >>> # The second exception is also suppressed + >>> print("This is written directly to stdout") + This is written directly to stdout + >>> print(stream.getvalue()) + This is written to the stream rather than stdout + This is also written to the stream + +Real world examples of reentrancy are more likely to involve multiple +functions calling each other and hence be far more complicated than this +example. + +Note also that being reentrant is *not* the same thing as being thread safe. +:func:`redirect_stdout`, for example, is definitely not thread safe, as it +makes a global modification to the system state by binding :data:`sys.stdout` +to a different stream. .. _reusable-cms: @@ -681,32 +692,58 @@ will fail (or otherwise not work correctly) if the specific context manager instance has already been used in a containing with statement. -An example of a reusable context manager is :func:`redirect_stdout`:: +:class:`threading.Lock` is an example of a reusable, but not reentrant, +context manager (for a reentrant lock, it is necessary to use +:class:`threading.RLock` instead). - >>> from contextlib import redirect_stdout - >>> from io import StringIO - >>> f = StringIO() - >>> collect_output = redirect_stdout(f) - >>> with collect_output: - ... print("Collected") +Another example of a reusable, but not reentrant, context manager is +:class:`ExitStack`, as it invokes *all* currently registered callbacks +when leaving any with statement, regardless of where those callbacks +were added:: + + >>> from contextlib import ExitStack + >>> stack = ExitStack() + >>> with stack: + ... stack.callback(print, "Callback: from first context") + ... print("Leaving first context") ... - >>> print("Not collected") - Not collected - >>> with collect_output: - ... print("Also collected") + Leaving first context + Callback: from first context + >>> with stack: + ... stack.callback(print, "Callback: from second context") + ... print("Leaving second context") ... - >>> print(f.getvalue()) - Collected - Also collected + Leaving second context + Callback: from second context + >>> with stack: + ... stack.callback(print, "Callback: from outer context") + ... with stack: + ... stack.callback(print, "Callback: from inner context") + ... print("Leaving inner context") + ... print("Leaving outer context") + ... 
+ Leaving inner context + Callback: from inner context + Callback: from outer context + Leaving outer context -However, this context manager is not reentrant, so attempting to reuse it -within a containing with statement fails: +As the output from the example shows, reusing a single stack object across +multiple with statements works correctly, but attempting to nest them +will cause the stack to be cleared at the end of the innermost with +statement, which is unlikely to be desirable behaviour. - >>> with collect_output: - ... # Nested reuse is not permitted - ... with collect_output: - ... pass +Using separate :class:`ExitStack` instances instead of reusing a single +instance avoids that problem:: + + >>> from contextlib import ExitStack + >>> with ExitStack() as outer_stack: + ... outer_stack.callback(print, "Callback: from outer context") + ... with ExitStack() as inner_stack: + ... inner_stack.callback(print, "Callback: from inner context") + ... print("Leaving inner context") + ... print("Leaving outer context") ... - Traceback (most recent call last): - ... - RuntimeError: Cannot reenter <...> + Leaving inner context + Callback: from inner context + Leaving outer context + Callback: from outer context diff --git a/Lib/contextlib.py b/Lib/contextlib.py --- a/Lib/contextlib.py +++ b/Lib/contextlib.py @@ -166,20 +166,16 @@ def __init__(self, new_target): self._new_target = new_target - self._old_target = self._sentinel = object() + # We use a list of old targets to make this CM re-entrant + self._old_targets = [] def __enter__(self): - if self._old_target is not self._sentinel: - raise RuntimeError("Cannot reenter {!r}".format(self)) - self._old_target = sys.stdout + self._old_targets.append(sys.stdout) sys.stdout = self._new_target return self._new_target def __exit__(self, exctype, excinst, exctb): - restore_stdout = self._old_target - self._old_target = self._sentinel - sys.stdout = restore_stdout - + sys.stdout = self._old_targets.pop() class suppress: diff --git a/Lib/test/test_contextlib.py b/Lib/test/test_contextlib.py --- a/Lib/test/test_contextlib.py +++ b/Lib/test/test_contextlib.py @@ -666,11 +666,18 @@ obj = redirect_stdout(None) self.assertEqual(obj.__doc__, cm_docstring) + def test_no_redirect_in_init(self): + orig_stdout = sys.stdout + redirect_stdout(None) + self.assertIs(sys.stdout, orig_stdout) + def test_redirect_to_string_io(self): f = io.StringIO() msg = "Consider an API like help(), which prints directly to stdout" + orig_stdout = sys.stdout with redirect_stdout(f): print(msg) + self.assertIs(sys.stdout, orig_stdout) s = f.getvalue().strip() self.assertEqual(s, msg) @@ -682,23 +689,26 @@ def test_cm_is_reusable(self): f = io.StringIO() write_to_f = redirect_stdout(f) + orig_stdout = sys.stdout with write_to_f: print("Hello", end=" ") with write_to_f: print("World!") + self.assertIs(sys.stdout, orig_stdout) s = f.getvalue() self.assertEqual(s, "Hello World!\n") - # If this is ever made reentrant, update the reusable-but-not-reentrant - # example at the end of the contextlib docs accordingly. 
- def test_nested_reentry_fails(self): + def test_cm_is_reentrant(self): f = io.StringIO() write_to_f = redirect_stdout(f) - with self.assertRaisesRegex(RuntimeError, "Cannot reenter"): + orig_stdout = sys.stdout + with write_to_f: + print("Hello", end=" ") with write_to_f: - print("Hello", end=" ") - with write_to_f: - print("World!") + print("World!") + self.assertIs(sys.stdout, orig_stdout) + s = f.getvalue() + self.assertEqual(s, "Hello World!\n") class TestSuppress(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,8 @@ Library ------- +- Issue #19403: contextlib.redirect_stdout is now reentrant + - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:01:45 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 13:01:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NDY0?= =?utf-8?q?_Suppress_compiler_warnings_during_clean=2E_Patch_by_Zachary_Wa?= =?utf-8?b?cmUu?= Message-ID: <3dCG2d5p2gz7Ljp@mail.python.org> http://hg.python.org/cpython/rev/dbff708e393f changeset: 86870:dbff708e393f branch: 3.3 parent: 86862:69f8288056eb user: Tim Golden date: Sun Nov 03 11:58:02 2013 +0000 summary: Issue #19464 Suppress compiler warnings during clean. Patch by Zachary Ware. files: PCbuild/ssl.vcxproj | 16 ++++++++-------- 1 files changed, 8 insertions(+), 8 deletions(-) diff --git a/PCbuild/ssl.vcxproj b/PCbuild/ssl.vcxproj --- a/PCbuild/ssl.vcxproj +++ b/PCbuild/ssl.vcxproj @@ -122,7 +122,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -133,7 +133,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -144,7 +144,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -155,7 +155,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -166,7 +166,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -177,7 +177,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -188,7 +188,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -199,7 +199,7 @@ "$(PythonExe)" build_ssl.py Release $(Platform) -a - + echo OpenSSL must be cleaned manually if you want to rebuild it. 
$(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:01:47 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 13:01:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319464_Null_merge_with_3=2E3?= Message-ID: <3dCG2g0SwLz7Ljp@mail.python.org> http://hg.python.org/cpython/rev/6e592d972b86 changeset: 86871:6e592d972b86 parent: 86869:87d49e2cdd34 parent: 86870:dbff708e393f user: Tim Golden date: Sun Nov 03 11:59:28 2013 +0000 summary: Issue #19464 Null merge with 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:15:27 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:15:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzYxNjA6?= =?utf-8?q?_The_bbox=28=29_method_of_Tkinter=2ESpinbox_now_returns_a_tuple?= =?utf-8?q?_of?= Message-ID: <3dCGLR2JyMz7Ljl@mail.python.org> http://hg.python.org/cpython/rev/91453ba40b30 changeset: 86872:91453ba40b30 branch: 2.7 parent: 86854:4a2afda8f187 user: Serhiy Storchaka date: Sun Nov 03 14:13:08 2013 +0200 summary: Issue #6160: The bbox() method of Tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. files: Lib/lib-tk/Tkinter.py | 2 +- Lib/lib-tk/test/test_tkinter/test_widgets.py | 12 ++++++++++ Misc/NEWS | 3 ++ 3 files changed, 16 insertions(+), 1 deletions(-) diff --git a/Lib/lib-tk/Tkinter.py b/Lib/lib-tk/Tkinter.py --- a/Lib/lib-tk/Tkinter.py +++ b/Lib/lib-tk/Tkinter.py @@ -3411,7 +3411,7 @@ bounding box may refer to a region outside the visible area of the window. """ - return self.tk.call(self._w, 'bbox', index) + return self._getints(self.tk.call(self._w, 'bbox', index)) or None def delete(self, first, last=None): """Delete one or more elements of the spinbox. diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -454,6 +454,18 @@ widget = self.create() self.checkBooleanParam(widget, 'wrap') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox(0) + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertRaises(Tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(Tkinter.TclError, widget.bbox, None) + self.assertRaises(TypeError, widget.bbox) + self.assertRaises(TypeError, widget.bbox, 0, 1) + @add_standard_options(StandardOptionsTests) class TextTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,9 @@ Library ------- +- Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + integers instead of a string. Based on patch by Guilherme Polo. + - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. 
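As a side note on the Spinbox change above, the difference is easy to see interactively. The following is only an illustrative sketch, not part of the commit: it assumes a working Tk display, the pixel numbers are made up, and on the 3.x branches the module is spelled tkinter rather than Tkinter.

    import Tkinter

    root = Tkinter.Tk()
    spin = Tkinter.Spinbox(root, values=("a", "b", "c"))
    spin.pack()
    root.update_idletasks()

    # Before Issue #6160 bbox() passed the raw Tcl result through, e.g.
    # the string "5 5 7 13"; with this patch it returns a tuple of ints
    # such as (5, 5, 7, 13), or None when Tk reports an empty bbox.
    print(spin.bbox(0))

    root.destroy()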
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:15:28 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:15:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzYxNjA6?= =?utf-8?q?_The_bbox=28=29_method_of_tkinter=2ESpinbox_now_returns_a_tuple?= =?utf-8?q?_of?= Message-ID: <3dCGLS4B20z7LkH@mail.python.org> http://hg.python.org/cpython/rev/5bdbf2258563 changeset: 86873:5bdbf2258563 branch: 3.3 parent: 86870:dbff708e393f user: Serhiy Storchaka date: Sun Nov 03 14:13:34 2013 +0200 summary: Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. files: Lib/tkinter/__init__.py | 2 +- Lib/tkinter/test/test_tkinter/test_widgets.py | 12 ++++++++++ Misc/NEWS | 3 ++ 3 files changed, 16 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -3511,7 +3511,7 @@ bounding box may refer to a region outside the visible area of the window. """ - return self.tk.call(self._w, 'bbox', index) + return self._getints(self.tk.call(self._w, 'bbox', index)) or None def delete(self, first, last=None): """Delete one or more elements of the spinbox. diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -457,6 +457,18 @@ widget = self.create() self.checkBooleanParam(widget, 'wrap') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox(0) + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) + self.assertRaises(TypeError, widget.bbox) + self.assertRaises(TypeError, widget.bbox, 0, 1) + @add_standard_options(StandardOptionsTests) class TextTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + integers instead of a string. Based on patch by Guilherme Polo. + - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:15:29 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:15:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=236160=3A_The_bbox=28=29_method_of_tkinter=2ESpin?= =?utf-8?q?box_now_returns_a_tuple_of?= Message-ID: <3dCGLT649Wz7LkL@mail.python.org> http://hg.python.org/cpython/rev/75d8b9136fa6 changeset: 86874:75d8b9136fa6 parent: 86871:6e592d972b86 parent: 86873:5bdbf2258563 user: Serhiy Storchaka date: Sun Nov 03 14:15:00 2013 +0200 summary: Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. 
files: Lib/tkinter/__init__.py | 2 +- Lib/tkinter/test/test_tkinter/test_widgets.py | 12 ++++++++++ Misc/NEWS | 3 ++ 3 files changed, 16 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -3472,7 +3472,7 @@ bounding box may refer to a region outside the visible area of the window. """ - return self.tk.call(self._w, 'bbox', index) + return self._getints(self.tk.call(self._w, 'bbox', index)) or None def delete(self, first, last=None): """Delete one or more elements of the spinbox. diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -457,6 +457,18 @@ widget = self.create() self.checkBooleanParam(widget, 'wrap') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox(0) + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) + self.assertRaises(TypeError, widget.bbox) + self.assertRaises(TypeError, widget.bbox, 0, 1) + @add_standard_options(StandardOptionsTests) class TextTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + integers instead of a string. Based on patch by Guilherme Polo. + - Issue #19403: contextlib.redirect_stdout is now reentrant - Issue #19286: Directories in ``package_data`` are no longer added to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:35:15 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:35:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzYxNTc6?= =?utf-8?q?_Fixed_Tkinter=2EText=2Edebug=28=29=2E__Original_patch_by_Guilh?= =?utf-8?q?erme_Polo=2E?= Message-ID: <3dCGnH0Phlz7LjT@mail.python.org> http://hg.python.org/cpython/rev/b3178d03871b changeset: 86875:b3178d03871b branch: 2.7 parent: 86872:91453ba40b30 user: Serhiy Storchaka date: Sun Nov 03 14:28:29 2013 +0200 summary: Issue #6157: Fixed Tkinter.Text.debug(). Original patch by Guilherme Polo. 
files: Lib/lib-tk/Tkinter.py | 5 ++- Lib/lib-tk/test/test_tkinter/test_text.py | 11 ++++++++ Lib/lib-tk/test/test_tkinter/test_widgets.py | 13 ++++++++++ Misc/NEWS | 2 + 4 files changed, 29 insertions(+), 2 deletions(-) diff --git a/Lib/lib-tk/Tkinter.py b/Lib/lib-tk/Tkinter.py --- a/Lib/lib-tk/Tkinter.py +++ b/Lib/lib-tk/Tkinter.py @@ -2908,8 +2908,9 @@ def debug(self, boolean=None): """Turn on the internal consistency checks of the B-Tree inside the text widget according to BOOLEAN.""" - return self.tk.getboolean(self.tk.call( - self._w, 'debug', boolean)) + if boolean is None: + return self.tk.call(self._w, 'debug') + self.tk.call(self._w, 'debug', boolean) def delete(self, index1, index2=None): """Delete the characters between INDEX1 and INDEX2 (not included).""" self.tk.call(self._w, 'delete', index1, index2) diff --git a/Lib/lib-tk/test/test_tkinter/test_text.py b/Lib/lib-tk/test/test_tkinter/test_text.py --- a/Lib/lib-tk/test/test_tkinter/test_text.py +++ b/Lib/lib-tk/test/test_tkinter/test_text.py @@ -14,6 +14,17 @@ def tearDown(self): self.text.destroy() + def test_debug(self): + text = self.text + olddebug = text.debug() + try: + text.debug(0) + self.assertEqual(text.debug(), 0) + text.debug(1) + self.assertEqual(text.debug(), 1) + finally: + text.debug(olddebug) + self.assertEqual(text.debug(), olddebug) def test_search(self): text = self.text diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -607,6 +607,19 @@ else: self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox('1.1') + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertIsNone(widget.bbox('end')) + self.assertRaises(Tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(Tkinter.TclError, widget.bbox, None) + self.assertRaises(Tkinter.TclError, widget.bbox) + self.assertRaises(Tkinter.TclError, widget.bbox, '1.1', 'end') + @add_standard_options(PixelSizeTests, StandardOptionsTests) class CanvasTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,8 @@ Library ------- +- Issue #6157: Fixed Tkinter.Text.debug(). Original patch by Guilherme Polo. + - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:35:16 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:35:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzYxNTc6?= =?utf-8?q?_Fixed_tkinter=2EText=2Edebug=28=29=2E__Original_patch_by_Guilh?= =?utf-8?q?erme_Polo=2E?= Message-ID: <3dCGnJ2K7Wz7LjT@mail.python.org> http://hg.python.org/cpython/rev/3f5e35b766ac changeset: 86876:3f5e35b766ac branch: 3.3 parent: 86873:5bdbf2258563 user: Serhiy Storchaka date: Sun Nov 03 14:29:35 2013 +0200 summary: Issue #6157: Fixed tkinter.Text.debug(). Original patch by Guilherme Polo. 
files: Lib/tkinter/__init__.py | 5 ++- Lib/tkinter/test/test_tkinter/test_text.py | 11 ++++++++ Lib/tkinter/test/test_tkinter/test_widgets.py | 13 ++++++++++ Misc/NEWS | 2 + 4 files changed, 29 insertions(+), 2 deletions(-) diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -2990,8 +2990,9 @@ def debug(self, boolean=None): """Turn on the internal consistency checks of the B-Tree inside the text widget according to BOOLEAN.""" - return self.tk.getboolean(self.tk.call( - self._w, 'debug', boolean)) + if boolean is None: + return self.tk.call(self._w, 'debug') + self.tk.call(self._w, 'debug', boolean) def delete(self, index1, index2=None): """Delete the characters between INDEX1 and INDEX2 (not included).""" self.tk.call(self._w, 'delete', index1, index2) diff --git a/Lib/tkinter/test/test_tkinter/test_text.py b/Lib/tkinter/test/test_tkinter/test_text.py --- a/Lib/tkinter/test/test_tkinter/test_text.py +++ b/Lib/tkinter/test/test_tkinter/test_text.py @@ -14,6 +14,17 @@ def tearDown(self): self.text.destroy() + def test_debug(self): + text = self.text + olddebug = text.debug() + try: + text.debug(0) + self.assertEqual(text.debug(), 0) + text.debug(1) + self.assertEqual(text.debug(), 1) + finally: + text.debug(olddebug) + self.assertEqual(text.debug(), olddebug) def test_search(self): text = self.text diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -610,6 +610,19 @@ else: self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox('1.1') + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertIsNone(widget.bbox('end')) + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) + self.assertRaises(tkinter.TclError, widget.bbox) + self.assertRaises(tkinter.TclError, widget.bbox, '1.1', 'end') + @add_standard_options(PixelSizeTests, StandardOptionsTests) class CanvasTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,8 @@ Library ------- +- Issue #6157: Fixed tkinter.Text.debug(). Original patch by Guilherme Polo. + - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:35:17 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 13:35:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogSXNzdWUgIzYxNTc6IEZpeGVkIHRraW50ZXIuVGV4dC5kZWJ1ZygpLiAg?= =?utf-8?q?tkinter=2EText=2Ebbox=28=29_now_raises?= Message-ID: <3dCGnK5LXjz7Ljl@mail.python.org> http://hg.python.org/cpython/rev/c40b573c9f7a changeset: 86877:c40b573c9f7a parent: 86874:75d8b9136fa6 parent: 86876:3f5e35b766ac user: Serhiy Storchaka date: Sun Nov 03 14:34:25 2013 +0200 summary: Issue #6157: Fixed tkinter.Text.debug(). tkinter.Text.bbox() now raises TypeError instead of TclError on wrong number of arguments. Original patch by Guilherme Polo. 
files: Lib/tkinter/__init__.py | 11 ++++--- Lib/tkinter/test/test_tkinter/test_text.py | 11 ++++++++ Lib/tkinter/test/test_tkinter/test_widgets.py | 13 ++++++++++ Misc/NEWS | 4 +++ 4 files changed, 34 insertions(+), 5 deletions(-) diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -2911,11 +2911,11 @@ """ Widget.__init__(self, master, 'text', cnf, kw) - def bbox(self, *args): + def bbox(self, index): """Return a tuple of (x,y,width,height) which gives the bounding - box of the visible part of the character at the index in ARGS.""" + box of the visible part of the character at the given index.""" return self._getints( - self.tk.call((self._w, 'bbox') + args)) or None + self.tk.call(self._w, 'bbox', index)) or None def tk_textSelectTo(self, index): self.tk.call('tk_textSelectTo', self._w, index) def tk_textBackspace(self): @@ -2951,8 +2951,9 @@ def debug(self, boolean=None): """Turn on the internal consistency checks of the B-Tree inside the text widget according to BOOLEAN.""" - return self.tk.getboolean(self.tk.call( - self._w, 'debug', boolean)) + if boolean is None: + return self.tk.call(self._w, 'debug') + self.tk.call(self._w, 'debug', boolean) def delete(self, index1, index2=None): """Delete the characters between INDEX1 and INDEX2 (not included).""" self.tk.call(self._w, 'delete', index1, index2) diff --git a/Lib/tkinter/test/test_tkinter/test_text.py b/Lib/tkinter/test/test_tkinter/test_text.py --- a/Lib/tkinter/test/test_tkinter/test_text.py +++ b/Lib/tkinter/test/test_tkinter/test_text.py @@ -14,6 +14,17 @@ def tearDown(self): self.text.destroy() + def test_debug(self): + text = self.text + olddebug = text.debug() + try: + text.debug(0) + self.assertEqual(text.debug(), 0) + text.debug(1) + self.assertEqual(text.debug(), 1) + finally: + text.debug(olddebug) + self.assertEqual(text.debug(), olddebug) def test_search(self): text = self.text diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -610,6 +610,19 @@ else: self.checkEnumParam(widget, 'wrap', 'char', 'none', 'word') + def test_bbox(self): + widget = self.create() + bbox = widget.bbox('1.1') + self.assertEqual(len(bbox), 4) + for item in bbox: + self.assertIsInstance(item, int) + + self.assertIsNone(widget.bbox('end')) + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) + self.assertRaises(TypeError, widget.bbox) + self.assertRaises(TypeError, widget.bbox, '1.1', 'end') + @add_standard_options(PixelSizeTests, StandardOptionsTests) class CanvasTest(AbstractWidgetTest, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,10 @@ Library ------- +- Issue #6157: Fixed tkinter.Text.debug(). tkinter.Text.bbox() now raises + TypeError instead of TclError on wrong number of arguments. Original patch + by Guilherme Polo. + - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. 
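A short sketch of the repaired behaviour, for readers who do not want to dig through the diff. This is not part of the commit; it assumes a working Tk display and a build of the default branch with this change applied, and the bbox() pixel values are only examples.

    import tkinter

    root = tkinter.Tk()
    text = tkinter.Text(root)
    text.insert('1.0', 'hello')
    text.pack()
    root.update_idletasks()

    # debug() with no argument queries the B-Tree consistency flag and
    # debug(1)/debug(0) set it; both paths work after Issue #6157.
    old = text.debug()
    text.debug(1)
    print(text.debug())        # 1
    text.debug(old)

    # bbox() now takes exactly one index argument; too few or too many
    # arguments raise TypeError instead of TclError.
    print(text.bbox('1.1'))    # e.g. (9, 5, 7, 13), or None if not visible
    try:
        text.bbox()
    except TypeError as exc:
        print(exc)

    root.destroy()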
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 13:51:00 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 3 Nov 2013 13:51:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_remove_enable/disb?= =?utf-8?q?le/is=5Fenabled=28=29_functions?= Message-ID: <3dCH7S5DMfz7LjT@mail.python.org> http://hg.python.org/peps/rev/5a098b7516d9 changeset: 5247:5a098b7516d9 user: Victor Stinner date: Sun Nov 03 13:50:52 2013 +0100 summary: PEP 454: remove enable/disble/is_enabled() functions files: pep-0454.txt | 100 ++++++++++++-------------------------- 1 files changed, 31 insertions(+), 69 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -64,13 +64,12 @@ number: total size, number and average size of allocated memory blocks * Computed differences between two snapshots to detect memory leaks -The API of the tracemalloc module is similar to the API of the -faulthandler module: ``enable()``, ``disable()`` and ``is_enabled()`` -functions, an environment variable (``PYTHONFAULTHANDLER`` and -``PYTHONTRACEMALLOC``), and a ``-X`` command line option (``-X -faulthandler`` and ``-X tracemalloc``). See the -`documentation of the faulthandler module -`_. +The API of the tracemalloc module is similar to the API of the faulthandler +module: ``enable()`` / ``start()``, ``disable()`` / ``stop()`` and +``is_enabled()`` / ``is_tracing()`` functions, an environment variable +(``PYTHONFAULTHANDLER`` and ``PYTHONTRACEMALLOC``), and a ``-X`` command line +option (``-X faulthandler`` and ``-X tracemalloc``). See the `documentation of +the faulthandler module `_. The idea of tracing memory allocations is not new. It was first implemented in the PySizer project in 2005. PySizer was implemented @@ -79,7 +78,7 @@ on CPython adds a overhead on performances and memory footprint, even if the PySizer was not used. tracemalloc attachs a traceback to the underlying layer, to memory blocks, and has no overhead when the module -is disabled. +is not tracing memory allocations. The tracemalloc module has been written for CPython. Other implementations of Python may not be able to provide it. @@ -89,9 +88,9 @@ === To trace most memory blocks allocated by Python, the module should be -enabled as early as possible by setting the ``PYTHONTRACEMALLOC`` +started as early as possible by setting the ``PYTHONTRACEMALLOC`` environment variable to ``1``, or by using ``-X tracemalloc`` command -line option. The ``tracemalloc.enable()`` function can be called at +line option. The ``tracemalloc.start()`` function can be called at runtime to start tracing Python memory allocations. By default, a trace of an allocated memory block only stores the most @@ -112,28 +111,7 @@ Clear traces of memory blocks allocated by Python. - See also ``disable()``. - - -``disable()`` function: - - Disable temporarily tracing new Python memory allocations, - deallocations are still traced. The change is process-wide, tracing - new Python memory allocations is disabled in all threads. Call - ``enable()`` to reenable tracing new Python memory allocations. - - Filters can be used to not trace memory allocations in some files: - use the ``add_filter()`` function. - - See also ``enable()`` and ``is_enabled()`` functions. - - -``enable()`` function: - - Reenable tracing Python memory allocations if was disabled by te - ``disable()`` method. - - See also ``is_enabled()`` functions. + See also ``stop()``. 
``get_traced_memory()`` function: @@ -148,26 +126,23 @@ store traces of memory blocks. Return an ``int``. -``is_enabled()`` function: - - ``True`` if the ``tracemalloc`` module is enabled Python memory - allocations, ``False`` if the module is disabled. - - The ``tracemalloc`` module only traces new allocations if - ``is_tracing()`` and ``is_enabled()`` are ``True``. - - See also ``enable()`` and ``disable()`` functions. - - ``is_tracing()`` function: ``True`` if the ``tracemalloc`` module is tracing Python memory allocations, ``False`` otherwise. - The ``tracemalloc`` module only traces new allocations if - ``is_tracing()`` and ``is_enabled()`` are ``True``. + See also ``start()`` and ``stop()`` functions. - See also ``start()`` and ``stop()`` functions. + +``start()`` function: + + Start tracing Python memory allocations. + + The function installs hooks on Python memory allocators. These hooks + have important overhead in term of performances and memory usage: + see `Filter functions`_ to limit the overhead. + + See also ``stop()`` and ``is_tracing()`` functions. ``stop()`` function: @@ -179,21 +154,9 @@ overhead of the module becomes null. Call ``get_traces()`` or ``take_snapshot()`` function to get traces - before clearing them. Use ``disable()`` to disable tracing - temporarily. + before clearing them. - See also ``enable()`` and ``is_enabled()`` functions. - - -``start()`` function: - - Start tracing Python memory allocations. - - The function installs hooks on Python memory allocators. These hooks - have important overhead in term of performances and memory usage: - see `Filter functions`_ to limit the overhead. - - See also ``disable()`` and ``is_tracing()`` functions. + See also ``start()`` and ``is_tracing()`` functions. ``take_snapshot()`` function: @@ -201,8 +164,8 @@ Take a snapshot of traces of memory blocks allocated by Python using the ``get_traces()`` function. Return a new ``Snapshot`` instance. - The ``tracemalloc`` module must be enabled to take a snapshot, see - the the ``enable()`` function. + The ``tracemalloc`` module must be tracing memory allocations to take a + snapshot, see the the ``start()`` function. See also ``get_traces()`` and ``get_object_traceback()`` functions. @@ -235,8 +198,8 @@ Get the traceback where the Python object *obj* was allocated. Return a tuple of ``(filename: str, lineno: int)`` tuples. - Return ``None`` if the ``tracemalloc`` module is disabled or did not - trace the allocation of the object. + Return ``None`` if the ``tracemalloc`` module is not tracing memory + allocations or did not trace the allocation of the object. See also ``gc.get_referrers()`` and ``sys.getsizeof()`` functions. @@ -258,8 +221,8 @@ ``(filename: str, lineno: int)`` tuples. The list of traces do not include memory blocks allocated before the - ``tracemalloc`` module was enabled nor memory blocks ignored by - filters (see ``get_filters()``). + ``tracemalloc`` module started to trace memory allocations nor memory + blocks ignored by filters (see ``get_filters()``). The list is not sorted. Take a snapshot using ``take_snapshot()`` and use the ``Snapshot.statistics()`` method to get a sorted list of @@ -268,7 +231,8 @@ Tracebacks of traces are limited to ``traceback_limit`` frames. Use ``set_traceback_limit()`` to store more frames. - Return an empty list if the ``tracemalloc`` module is disabled. + Return an empty list if the ``tracemalloc`` module is not tracing memory + allocations. See also ``take_snapshot()`` and ``get_object_traceback()`` functions. 
@@ -308,8 +272,6 @@ By default, there is one exclusive filter to ignore Python memory blocks allocated by the ``tracemalloc`` module. -Tracing can be also be disabled temporarily using the ``disable()`` function. - Use the ``get_tracemalloc_memory()`` function to measure the memory usage. See also the ``set_traceback_limit()`` function to configure how many frames are stored. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 3 13:56:27 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 3 Nov 2013 13:56:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319424=3A_Fix_a_co?= =?utf-8?q?mpiler_warning_on_comparing_signed/unsigned_size=5Ft?= Message-ID: <3dCHFl72Phz7Ljw@mail.python.org> http://hg.python.org/cpython/rev/2ed8d500e113 changeset: 86878:2ed8d500e113 user: Victor Stinner date: Sun Nov 03 13:53:12 2013 +0100 summary: Issue #19424: Fix a compiler warning on comparing signed/unsigned size_t Patch written by Zachary Ware. files: Objects/unicodeobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10581,7 +10581,7 @@ kind = PyUnicode_KIND(uni); if (kind == PyUnicode_1BYTE_KIND) { const void *data = PyUnicode_1BYTE_DATA(uni); - Py_ssize_t len1 = PyUnicode_GET_LENGTH(uni); + size_t len1 = (size_t)PyUnicode_GET_LENGTH(uni); size_t len, len2 = strlen(str); int cmp; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 15:23:15 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 15:23:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzEwMTk3?= =?utf-8?q?_Rework_subprocess=2Eget=5Bstatus=5Doutput_to_use_subprocess?= Message-ID: <3dCK9v463hz7LjT@mail.python.org> http://hg.python.org/cpython/rev/c34e163c0086 changeset: 86879:c34e163c0086 branch: 3.3 parent: 86870:dbff708e393f user: Tim Golden date: Sun Nov 03 12:53:17 2013 +0000 summary: Issue #10197 Rework subprocess.get[status]output to use subprocess functionality and thus to work on Windows. Patch by Nick Coghlan. files: Lib/subprocess.py | 24 +++++++++--------------- Lib/test/test_subprocess.py | 11 ++--------- Misc/NEWS | 3 +++ 3 files changed, 14 insertions(+), 24 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -681,21 +681,15 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') """ - with os.popen('{ ' + cmd + '; } 2>&1', 'r') as pipe: - try: - text = pipe.read() - sts = pipe.close() - except: - process = pipe._proc - process.kill() - process.wait() - raise - if sts is None: - sts = 0 - if text[-1:] == '\n': - text = text[:-1] - return sts, text - + try: + data = check_output(cmd, shell=True, universal_newlines=True, stderr=STDOUT) + status = 0 + except CalledProcessError as ex: + data = ex.output + status = ex.returncode + if data[-1:] == '\n': + data = data[:-1] + return status, data def getoutput(cmd): """Return output (stdout or stderr) of executing cmd in a shell. diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -2133,13 +2133,6 @@ def test_terminate_dead(self): self._kill_dead_process('terminate') - -# The module says: -# "NB This only works (and is only relevant) for UNIX." 
-# -# Actually, getoutput should work on any platform with an os.popen, but -# I'll take the comment as given, and skip this suite. - at unittest.skipUnless(os.name == 'posix', "only relevant for UNIX") class CommandTests(unittest.TestCase): def test_getoutput(self): self.assertEqual(subprocess.getoutput('echo xyzzy'), 'xyzzy') @@ -2153,8 +2146,8 @@ try: dir = tempfile.mkdtemp() name = os.path.join(dir, "foo") - - status, output = subprocess.getstatusoutput('cat ' + name) + status, output = subprocess.getstatusoutput( + ("type " if mswindows else "cat ") + name) self.assertNotEqual(status, 0) finally: if dir is not None: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #10197: Rework subprocess.get[status]output to use subprocess + functionality and thus to work on Windows. Patch by Nick Coghlan. + - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 15:23:16 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 15:23:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2310197_Rework_subprocess=2Eget=5Bstatus=5Doutput?= =?utf-8?q?_to_use_subprocess?= Message-ID: <3dCK9w6jfsz7Ljr@mail.python.org> http://hg.python.org/cpython/rev/05ce1bd1a4c2 changeset: 86880:05ce1bd1a4c2 parent: 86871:6e592d972b86 parent: 86879:c34e163c0086 user: Tim Golden date: Sun Nov 03 12:55:51 2013 +0000 summary: Issue #10197 Rework subprocess.get[status]output to use subprocess functionality and thus to work on Windows. Patch by Nick Coghlan. files: Lib/subprocess.py | 24 +++++++++--------------- Lib/test/test_subprocess.py | 11 ++--------- Misc/NEWS | 3 +++ 3 files changed, 14 insertions(+), 24 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -700,21 +700,15 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') """ - with os.popen('{ ' + cmd + '; } 2>&1', 'r') as pipe: - try: - text = pipe.read() - sts = pipe.close() - except: - process = pipe._proc - process.kill() - process.wait() - raise - if sts is None: - sts = 0 - if text[-1:] == '\n': - text = text[:-1] - return sts, text - + try: + data = check_output(cmd, shell=True, universal_newlines=True, stderr=STDOUT) + status = 0 + except CalledProcessError as ex: + data = ex.output + status = ex.returncode + if data[-1:] == '\n': + data = data[:-1] + return status, data def getoutput(cmd): """Return output (stdout or stderr) of executing cmd in a shell. diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -2158,13 +2158,6 @@ def test_terminate_dead(self): self._kill_dead_process('terminate') - -# The module says: -# "NB This only works (and is only relevant) for UNIX." -# -# Actually, getoutput should work on any platform with an os.popen, but -# I'll take the comment as given, and skip this suite. 
- at unittest.skipUnless(os.name == 'posix', "only relevant for UNIX") class CommandTests(unittest.TestCase): def test_getoutput(self): self.assertEqual(subprocess.getoutput('echo xyzzy'), 'xyzzy') @@ -2178,8 +2171,8 @@ try: dir = tempfile.mkdtemp() name = os.path.join(dir, "foo") - - status, output = subprocess.getstatusoutput('cat ' + name) + status, output = subprocess.getstatusoutput( + ("type " if mswindows else "cat ") + name) self.assertNotEqual(status, 0) finally: if dir is not None: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,9 @@ Library ------- +- Issue #10197: Rework subprocess.get[status]output to use subprocess + functionality and thus to work on Windows. Patch by Nick Coghlan + - Issue #19403: contextlib.redirect_stdout is now reentrant - Issue #19286: Directories in ``package_data`` are no longer added to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 15:23:18 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 15:23:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_Issue_=2310197=3A_merge_heads?= Message-ID: <3dCK9y2HjLz7Lk4@mail.python.org> http://hg.python.org/cpython/rev/b6efaa97ee0e changeset: 86881:b6efaa97ee0e branch: 3.3 parent: 86876:3f5e35b766ac parent: 86879:c34e163c0086 user: Tim Golden date: Sun Nov 03 14:20:23 2013 +0000 summary: Issue #10197: merge heads files: Lib/subprocess.py | 24 +++++++++--------------- Lib/test/test_subprocess.py | 11 ++--------- Misc/NEWS | 3 +++ 3 files changed, 14 insertions(+), 24 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -681,21 +681,15 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') """ - with os.popen('{ ' + cmd + '; } 2>&1', 'r') as pipe: - try: - text = pipe.read() - sts = pipe.close() - except: - process = pipe._proc - process.kill() - process.wait() - raise - if sts is None: - sts = 0 - if text[-1:] == '\n': - text = text[:-1] - return sts, text - + try: + data = check_output(cmd, shell=True, universal_newlines=True, stderr=STDOUT) + status = 0 + except CalledProcessError as ex: + data = ex.output + status = ex.returncode + if data[-1:] == '\n': + data = data[:-1] + return status, data def getoutput(cmd): """Return output (stdout or stderr) of executing cmd in a shell. diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -2133,13 +2133,6 @@ def test_terminate_dead(self): self._kill_dead_process('terminate') - -# The module says: -# "NB This only works (and is only relevant) for UNIX." -# -# Actually, getoutput should work on any platform with an os.popen, but -# I'll take the comment as given, and skip this suite. - at unittest.skipUnless(os.name == 'posix', "only relevant for UNIX") class CommandTests(unittest.TestCase): def test_getoutput(self): self.assertEqual(subprocess.getoutput('echo xyzzy'), 'xyzzy') @@ -2153,8 +2146,8 @@ try: dir = tempfile.mkdtemp() name = os.path.join(dir, "foo") - - status, output = subprocess.getstatusoutput('cat ' + name) + status, output = subprocess.getstatusoutput( + ("type " if mswindows else "cat ") + name) self.assertNotEqual(status, 0) finally: if dir is not None: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,9 @@ - Issue #6157: Fixed tkinter.Text.debug(). Original patch by Guilherme Polo. 
- Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + +- Issue #10197: Rework subprocess.get[status]output to use subprocess + functionality and thus to work on Windows. Patch by Nick Coghlan. integers instead of a string. Based on patch by Guilherme Polo. - Issue #19286: Directories in ``package_data`` are no longer added to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 15:23:19 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 15:23:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2310197=3A_merge_heads?= Message-ID: <3dCK9z4tS1z7Lk3@mail.python.org> http://hg.python.org/cpython/rev/28a0ae3dcb16 changeset: 86882:28a0ae3dcb16 parent: 86878:2ed8d500e113 parent: 86880:05ce1bd1a4c2 user: Tim Golden date: Sun Nov 03 14:21:29 2013 +0000 summary: Issue #10197: merge heads files: Lib/subprocess.py | 24 +++++++++--------------- Lib/test/test_subprocess.py | 11 ++--------- Misc/NEWS | 3 +++ 3 files changed, 14 insertions(+), 24 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -700,21 +700,15 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') """ - with os.popen('{ ' + cmd + '; } 2>&1', 'r') as pipe: - try: - text = pipe.read() - sts = pipe.close() - except: - process = pipe._proc - process.kill() - process.wait() - raise - if sts is None: - sts = 0 - if text[-1:] == '\n': - text = text[:-1] - return sts, text - + try: + data = check_output(cmd, shell=True, universal_newlines=True, stderr=STDOUT) + status = 0 + except CalledProcessError as ex: + data = ex.output + status = ex.returncode + if data[-1:] == '\n': + data = data[:-1] + return status, data def getoutput(cmd): """Return output (stdout or stderr) of executing cmd in a shell. diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -2158,13 +2158,6 @@ def test_terminate_dead(self): self._kill_dead_process('terminate') - -# The module says: -# "NB This only works (and is only relevant) for UNIX." -# -# Actually, getoutput should work on any platform with an os.popen, but -# I'll take the comment as given, and skip this suite. - at unittest.skipUnless(os.name == 'posix', "only relevant for UNIX") class CommandTests(unittest.TestCase): def test_getoutput(self): self.assertEqual(subprocess.getoutput('echo xyzzy'), 'xyzzy') @@ -2178,8 +2171,8 @@ try: dir = tempfile.mkdtemp() name = os.path.join(dir, "foo") - - status, output = subprocess.getstatusoutput('cat ' + name) + status, output = subprocess.getstatusoutput( + ("type " if mswindows else "cat ") + name) self.assertNotEqual(status, 0) finally: if dir is not None: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -35,6 +35,9 @@ TypeError instead of TclError on wrong number of arguments. Original patch by Guilherme Polo. +- Issue #10197: Rework subprocess.get[status]output to use subprocess + functionality and thus to work on Windows. Patch by Nick Coghlan + - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of integers instead of a string. Based on patch by Guilherme Polo. 
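Since the net effect of this series is that both convenience helpers now go through check_output(), a minimal usage sketch may help. It is not part of the commit; note that the status reported for a failing command is CalledProcessError.returncode, i.e. the command's exit code, while the pre-existing docstring examples still show the old os.popen()-style wait-status values.

    import subprocess

    # Works on both POSIX and Windows now, because the command is run
    # via check_output(cmd, shell=True, universal_newlines=True).
    status, output = subprocess.getstatusoutput('echo hello')
    print(status, output)              # 0 hello

    # A failing command reports the shell's exit code as the status.
    status, output = subprocess.getstatusoutput('exit 3')
    print(status)                      # 3

    # getoutput() returns only the stripped, combined stdout/stderr text.
    print(subprocess.getoutput('echo xyzzy'))   # xyzzy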
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 15:23:21 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 15:23:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2310197=3A_merge_3=2E3?= Message-ID: <3dCKB10Htcz7LkD@mail.python.org> http://hg.python.org/cpython/rev/fe828884a077 changeset: 86883:fe828884a077 parent: 86882:28a0ae3dcb16 parent: 86881:b6efaa97ee0e user: Tim Golden date: Sun Nov 03 14:22:14 2013 +0000 summary: Issue #10197: merge 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 17:26:08 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 17:26:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fixed_pixels_rounding_for_last_Tk_patchlevels=2E?= Message-ID: <3dCMvh2RwTz7Ljj@mail.python.org> http://hg.python.org/cpython/rev/a34889a30d52 changeset: 86884:a34889a30d52 branch: 2.7 parent: 86875:b3178d03871b user: Serhiy Storchaka date: Sun Nov 03 18:24:04 2013 +0200 summary: Issue #19085: Fixed pixels rounding for last Tk patchlevels. files: Lib/lib-tk/test/widget_tests.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -14,7 +14,7 @@ _sentinel = object() class AbstractWidgetTest(object): - _conv_pixels = staticmethod(int_round if tcl_version[:2] != (8, 5) else int) + _conv_pixels = staticmethod(int_round) _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 17:26:09 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 17:26:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fixed_pixels_rounding_for_last_Tk_patchlevels=2E?= Message-ID: <3dCMvj4DsWz7Ljm@mail.python.org> http://hg.python.org/cpython/rev/dfdf47a9aad4 changeset: 86885:dfdf47a9aad4 branch: 3.3 parent: 86881:b6efaa97ee0e user: Serhiy Storchaka date: Sun Nov 03 18:24:31 2013 +0200 summary: Issue #19085: Fixed pixels rounding for last Tk patchlevels. files: Lib/tkinter/test/widget_tests.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -11,7 +11,7 @@ _sentinel = object() class AbstractWidgetTest: - _conv_pixels = round if tcl_version[:2] != (8, 5) else int + _conv_pixels = round _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 17:26:10 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 17:26:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319085=3A_Fixed_pixels_rounding_for_last_Tk_patc?= =?utf-8?q?hlevels=2E?= Message-ID: <3dCMvk5w5nz7Lk3@mail.python.org> http://hg.python.org/cpython/rev/e7be7aceab77 changeset: 86886:e7be7aceab77 parent: 86883:fe828884a077 parent: 86885:dfdf47a9aad4 user: Serhiy Storchaka date: Sun Nov 03 18:25:17 2013 +0200 summary: Issue #19085: Fixed pixels rounding for last Tk patchlevels. 
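What the #19085 change above amounts to: the widget-test helpers used to expect Tk 8.5 to truncate fractional pixel values while other versions rounded them; the latest 8.5 patchlevels apparently round as well, so the expected conversion is now round() unconditionally. A tiny illustration of the difference, not part of the checkin:

    # Truncation (the old expectation for Tk 8.5) versus rounding (the
    # expectation now used for every supported Tcl/Tk patchlevel).
    fractional_pixels = 3.6
    assert int(fractional_pixels) == 3     # old: int() truncates toward zero
    assert round(fractional_pixels) == 4   # new: round() to the nearest integer
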
files: Lib/tkinter/test/widget_tests.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -11,7 +11,7 @@ _sentinel = object() class AbstractWidgetTest: - _conv_pixels = round if tcl_version[:2] != (8, 5) else int + _conv_pixels = round _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 18:33:42 2013 From: python-checkins at python.org (r.david.murray) Date: Sun, 3 Nov 2013 18:33:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5NDg1OiBjbGFy?= =?utf-8?q?ify_get=5Fparam_example=2E?= Message-ID: <3dCPPf5WgFz7LjW@mail.python.org> http://hg.python.org/cpython/rev/c574951deadd changeset: 86887:c574951deadd branch: 3.3 parent: 86885:dfdf47a9aad4 user: R David Murray date: Sun Nov 03 12:23:23 2013 -0500 summary: #19485: clarify get_param example. Patch by Vajrasky Kok. files: Lib/email/message.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/email/message.py b/Lib/email/message.py --- a/Lib/email/message.py +++ b/Lib/email/message.py @@ -636,7 +636,7 @@ If your application doesn't care whether the parameter was RFC 2231 encoded, it can turn the return value into a string as follows: - param = msg.get_param('foo') + rawparam = msg.get_param('foo') param = email.utils.collapse_rfc2231_value(rawparam) """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 18:33:44 2013 From: python-checkins at python.org (r.david.murray) Date: Sun, 3 Nov 2013 18:33:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_=2319485=3A_clarify_get=5Fparam_example=2E?= Message-ID: <3dCPPh0rqVz7Ljr@mail.python.org> http://hg.python.org/cpython/rev/e3180c58a78c changeset: 86888:e3180c58a78c parent: 86886:e7be7aceab77 parent: 86887:c574951deadd user: R David Murray date: Sun Nov 03 12:23:51 2013 -0500 summary: Merge #19485: clarify get_param example. files: Lib/email/message.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/email/message.py b/Lib/email/message.py --- a/Lib/email/message.py +++ b/Lib/email/message.py @@ -662,7 +662,7 @@ If your application doesn't care whether the parameter was RFC 2231 encoded, it can turn the return value into a string as follows: - param = msg.get_param('foo') + rawparam = msg.get_param('foo') param = email.utils.collapse_rfc2231_value(rawparam) """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:22:44 2013 From: python-checkins at python.org (r.david.murray) Date: Sun, 3 Nov 2013 19:22:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5NDExOiBDbGFy?= =?utf-8?q?ify_that_b2a=5Fhex/hexlify_returns_a_bytes_object=2E?= Message-ID: <3dCQVD73R5z7Ljx@mail.python.org> http://hg.python.org/cpython/rev/25d89a4faede changeset: 86889:25d89a4faede branch: 3.3 parent: 86887:c574951deadd user: R David Murray date: Sun Nov 03 13:21:38 2013 -0500 summary: #19411: Clarify that b2a_hex/hexlify returns a bytes object. Initial patch by Vajrasky Kok. 
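The point being clarified in this checkin is that b2a_hex()/hexlify() return a bytes object twice the length of the input, not a str. A short illustration of the documented behaviour, not part of the patch:

    import binascii

    data = b'\x01\xab'
    hexed = binascii.hexlify(data)            # same routine as binascii.b2a_hex()
    assert hexed == b'01ab'                   # a bytes object, not str
    assert isinstance(hexed, bytes)
    assert len(hexed) == 2 * len(data)        # two hex digits per input byte
    assert binascii.unhexlify(hexed) == data  # a2b_hex()/unhexlify() reverses it
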
files: Doc/library/binascii.rst | 2 +- Modules/binascii.c | 3 ++- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Doc/library/binascii.rst b/Doc/library/binascii.rst --- a/Doc/library/binascii.rst +++ b/Doc/library/binascii.rst @@ -145,7 +145,7 @@ Return the hexadecimal representation of the binary *data*. Every byte of *data* is converted into the corresponding 2-digit hex representation. The - resulting string is therefore twice as long as the length of *data*. + returned bytes object is therefore twice as long as the length of *data*. .. function:: a2b_hex(hexstr) diff --git a/Modules/binascii.c b/Modules/binascii.c --- a/Modules/binascii.c +++ b/Modules/binascii.c @@ -1129,7 +1129,8 @@ PyDoc_STRVAR(doc_hexlify, "b2a_hex(data) -> s; Hexadecimal representation of binary data.\n\ \n\ -This function is also available as \"hexlify()\"."); +The return value is a bytes object. This function is also\n\ +available as \"hexlify()\"."); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:22:46 2013 From: python-checkins at python.org (r.david.murray) Date: Sun, 3 Nov 2013 19:22:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_=2319411=3A_Clarify_that_b2a=5Fhex/hexlify_returns?= =?utf-8?q?_a_bytes_object=2E?= Message-ID: <3dCQVG29KfzMRc@mail.python.org> http://hg.python.org/cpython/rev/ac190d03aed5 changeset: 86890:ac190d03aed5 parent: 86888:e3180c58a78c parent: 86889:25d89a4faede user: R David Murray date: Sun Nov 03 13:22:17 2013 -0500 summary: Merge #19411: Clarify that b2a_hex/hexlify returns a bytes object. files: Doc/library/binascii.rst | 2 +- Modules/binascii.c | 3 ++- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Doc/library/binascii.rst b/Doc/library/binascii.rst --- a/Doc/library/binascii.rst +++ b/Doc/library/binascii.rst @@ -145,7 +145,7 @@ Return the hexadecimal representation of the binary *data*. Every byte of *data* is converted into the corresponding 2-digit hex representation. The - resulting string is therefore twice as long as the length of *data*. + returned bytes object is therefore twice as long as the length of *data*. .. function:: a2b_hex(hexstr) diff --git a/Modules/binascii.c b/Modules/binascii.c --- a/Modules/binascii.c +++ b/Modules/binascii.c @@ -1122,7 +1122,8 @@ PyDoc_STRVAR(doc_hexlify, "b2a_hex(data) -> s; Hexadecimal representation of binary data.\n\ \n\ -This function is also available as \"hexlify()\"."); +The return value is a bytes object. 
This function is also\n\ +available as \"hexlify()\"."); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:28:14 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 19:28:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzEwMTk3?= =?utf-8?q?=3A_Indicate_availability_of_subprocess=2Eget=5Bstatus=5Doutput?= =?utf-8?q?_on_Windows?= Message-ID: <3dCQcZ1QwCz7LjR@mail.python.org> http://hg.python.org/cpython/rev/2924a63aab73 changeset: 86891:2924a63aab73 branch: 3.3 parent: 86887:c574951deadd user: Tim Golden date: Sun Nov 03 18:24:50 2013 +0000 summary: Issue #10197: Indicate availability of subprocess.get[status]output on Windows and add a note about the effects of universal newlines files: Doc/library/subprocess.rst | 16 ++++++++++------ 1 files changed, 10 insertions(+), 6 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -1050,10 +1050,12 @@ Return ``(status, output)`` of executing *cmd* in a shell. - Execute the string *cmd* in a shell with :func:`os.popen` and return a 2-tuple - ``(status, output)``. *cmd* is actually run as ``{ cmd ; } 2>&1``, so that the - returned output will contain output or error messages. A trailing newline is - stripped from the output. The exit status for the command can be interpreted + Execute the string *cmd* in a shell with :class:`Popen` and return a 2-tuple + ``(status, output)`` via :func:`Popen.communicate`. Universal newlines mode + is used; see the notes on :ref:`frequently-used-arguments` for more details. + + A trailing newline is stripped from the output. + The exit status for the command can be interpreted according to the rules for the C function :c:func:`wait`. Example:: >>> subprocess.getstatusoutput('ls /bin/ls') @@ -1063,7 +1065,8 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows .. function:: getoutput(cmd) @@ -1076,7 +1079,8 @@ >>> subprocess.getoutput('ls /bin/ls') '/bin/ls' - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows Notes -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:28:15 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 19:28:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2310197=3A_Indicate_availability_of_subprocess=2E?= =?utf-8?q?get=5Bstatus=5Doutput_on_Windows?= Message-ID: <3dCQcb3SLHz7Ljg@mail.python.org> http://hg.python.org/cpython/rev/effad2bda4cb changeset: 86892:effad2bda4cb parent: 86888:e3180c58a78c parent: 86891:2924a63aab73 user: Tim Golden date: Sun Nov 03 18:25:51 2013 +0000 summary: Issue #10197: Indicate availability of subprocess.get[status]output on Windows and add a note about the effects of universal newlines files: Doc/library/subprocess.rst | 16 ++++++++++------ 1 files changed, 10 insertions(+), 6 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -1058,10 +1058,12 @@ Return ``(status, output)`` of executing *cmd* in a shell. - Execute the string *cmd* in a shell with :func:`os.popen` and return a 2-tuple - ``(status, output)``. 
*cmd* is actually run as ``{ cmd ; } 2>&1``, so that the - returned output will contain output or error messages. A trailing newline is - stripped from the output. The exit status for the command can be interpreted + Execute the string *cmd* in a shell with :class:`Popen` and return a 2-tuple + ``(status, output)`` via :func:`Popen.communicate`. Universal newlines mode + is used; see the notes on :ref:`frequently-used-arguments` for more details. + + A trailing newline is stripped from the output. + The exit status for the command can be interpreted according to the rules for the C function :c:func:`wait`. Example:: >>> subprocess.getstatusoutput('ls /bin/ls') @@ -1071,7 +1073,8 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows .. function:: getoutput(cmd) @@ -1084,7 +1087,8 @@ >>> subprocess.getoutput('ls /bin/ls') '/bin/ls' - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows Notes -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:28:16 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 19:28:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge?= Message-ID: <3dCQcc58h4z7Ljs@mail.python.org> http://hg.python.org/cpython/rev/fc1f353b8e7e changeset: 86893:fc1f353b8e7e parent: 86892:effad2bda4cb parent: 86890:ac190d03aed5 user: Tim Golden date: Sun Nov 03 18:27:07 2013 +0000 summary: Merge files: Doc/library/binascii.rst | 2 +- Modules/binascii.c | 3 ++- 2 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Doc/library/binascii.rst b/Doc/library/binascii.rst --- a/Doc/library/binascii.rst +++ b/Doc/library/binascii.rst @@ -145,7 +145,7 @@ Return the hexadecimal representation of the binary *data*. Every byte of *data* is converted into the corresponding 2-digit hex representation. The - resulting string is therefore twice as long as the length of *data*. + returned bytes object is therefore twice as long as the length of *data*. .. function:: a2b_hex(hexstr) diff --git a/Modules/binascii.c b/Modules/binascii.c --- a/Modules/binascii.c +++ b/Modules/binascii.c @@ -1122,7 +1122,8 @@ PyDoc_STRVAR(doc_hexlify, "b2a_hex(data) -> s; Hexadecimal representation of binary data.\n\ \n\ -This function is also available as \"hexlify()\"."); +The return value is a bytes object. This function is also\n\ +available as \"hexlify()\"."); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 19:28:17 2013 From: python-checkins at python.org (tim.golden) Date: Sun, 3 Nov 2013 19:28:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_Merge?= Message-ID: <3dCQcd74lpz7Ljt@mail.python.org> http://hg.python.org/cpython/rev/b91f4f13f2b0 changeset: 86894:b91f4f13f2b0 branch: 3.3 parent: 86889:25d89a4faede parent: 86891:2924a63aab73 user: Tim Golden date: Sun Nov 03 18:27:40 2013 +0000 summary: Merge files: Doc/library/subprocess.rst | 16 ++++++++++------ 1 files changed, 10 insertions(+), 6 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -1050,10 +1050,12 @@ Return ``(status, output)`` of executing *cmd* in a shell. - Execute the string *cmd* in a shell with :func:`os.popen` and return a 2-tuple - ``(status, output)``. 
*cmd* is actually run as ``{ cmd ; } 2>&1``, so that the - returned output will contain output or error messages. A trailing newline is - stripped from the output. The exit status for the command can be interpreted + Execute the string *cmd* in a shell with :class:`Popen` and return a 2-tuple + ``(status, output)`` via :func:`Popen.communicate`. Universal newlines mode + is used; see the notes on :ref:`frequently-used-arguments` for more details. + + A trailing newline is stripped from the output. + The exit status for the command can be interpreted according to the rules for the C function :c:func:`wait`. Example:: >>> subprocess.getstatusoutput('ls /bin/ls') @@ -1063,7 +1065,8 @@ >>> subprocess.getstatusoutput('/bin/junk') (256, 'sh: /bin/junk: not found') - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows .. function:: getoutput(cmd) @@ -1076,7 +1079,8 @@ >>> subprocess.getoutput('ls /bin/ls') '/bin/ls' - Availability: UNIX. + .. versionchanged:: 3.3 + Availability: Unix & Windows Notes -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 20:32:21 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 20:32:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE4NzAy?= =?utf-8?q?=3A_All_skipped_tests_now_reported_as_skipped=2E?= Message-ID: <3dCS2Y0g86z7LkD@mail.python.org> http://hg.python.org/cpython/rev/1feeeb8992f8 changeset: 86895:1feeeb8992f8 branch: 3.3 user: Serhiy Storchaka date: Sun Nov 03 21:31:18 2013 +0200 summary: Issue #18702: All skipped tests now reported as skipped. files: Lib/test/test_array.py | 17 +- Lib/test/test_compileall.py | 3 +- Lib/test/test_csv.py | 135 +++--- Lib/test/test_dbm_dumb.py | 6 +- Lib/test/test_enumerate.py | 3 +- Lib/test/test_ftplib.py | 14 +- Lib/test/test_mailbox.py | 36 +- Lib/test/test_math.py | 59 +- Lib/test/test_mmap.py | 144 +++--- Lib/test/test_nntplib.py | 74 +- Lib/test/test_os.py | 453 ++++++++++----------- Lib/test/test_poplib.py | 57 +- Lib/test/test_posix.py | 304 +++++++------ Lib/test/test_set.py | 8 +- Lib/test/test_shutil.py | 119 ++-- Lib/test/test_socket.py | 103 ++-- Lib/test/test_socketserver.py | 85 ++- Lib/test/test_sys.py | 17 +- Lib/test/test_warnings.py | 3 +- Lib/test/test_zlib.py | 159 ++++--- Misc/NEWS | 2 + 21 files changed, 913 insertions(+), 888 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -11,6 +11,7 @@ import io import math import struct +import sys import warnings import array @@ -993,15 +994,15 @@ s = None self.assertRaises(ReferenceError, len, p) + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def test_bug_782369(self): - import sys - if hasattr(sys, "getrefcount"): - for i in range(10): - b = array.array('B', range(64)) - rc = sys.getrefcount(10) - for i in range(10): - b = array.array('B', range(64)) - self.assertEqual(rc, sys.getrefcount(10)) + for i in range(10): + b = array.array('B', range(64)) + rc = sys.getrefcount(10) + for i in range(10): + b = array.array('B', range(64)) + self.assertEqual(rc, sys.getrefcount(10)) def test_subclass_with_kwargs(self): # SF bug #1486663 -- this used to erroneously raise a TypeError diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -39,11 +39,10 @@ compare = struct.pack('<4sl', imp.get_magic(), mtime) return data, compare + 
@unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def recreation_check(self, metadata): """Check that compileall recreates bytecode when the new metadata is used.""" - if not hasattr(os, 'stat'): - return py_compile.compile(self.source_path) self.assertEqual(*self.data()) with open(self.bc_path, 'rb') as file: diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -896,78 +896,77 @@ dialect = sniffer.sniff(self.sample9) self.assertTrue(dialect.doublequote) -if not hasattr(sys, "gettotalrefcount"): - if support.verbose: print("*** skipping leakage tests ***") -else: - class NUL: - def write(s, *args): - pass - writelines = write +class NUL: + def write(s, *args): + pass + writelines = write - class TestLeaks(unittest.TestCase): - def test_create_read(self): - delta = 0 - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - delta = rc-lastrc - lastrc = rc - # if csv.reader() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + at unittest.skipUnless(hasattr(sys, "gettotalrefcount"), + 'requires sys.gettotalrefcount()') +class TestLeaks(unittest.TestCase): + def test_create_read(self): + delta = 0 + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + delta = rc-lastrc + lastrc = rc + # if csv.reader() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_create_write(self): - delta = 0 - lastrc = sys.gettotalrefcount() - s = NUL() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.writer(s) - csv.writer(s) - csv.writer(s) - delta = rc-lastrc - lastrc = rc - # if csv.writer() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + def test_create_write(self): + delta = 0 + lastrc = sys.gettotalrefcount() + s = NUL() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.writer(s) + csv.writer(s) + csv.writer(s) + delta = rc-lastrc + lastrc = rc + # if csv.writer() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_read(self): - delta = 0 - rows = ["a,b,c\r\n"]*5 - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - rdr = csv.reader(rows) - for row in rdr: - pass - delta = rc-lastrc - lastrc = rc - # if reader leaks during read, delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_read(self): + delta = 0 + rows = ["a,b,c\r\n"]*5 + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + rdr = csv.reader(rows) + for row in rdr: + pass + delta = rc-lastrc + lastrc = rc + # if reader leaks during read, delta should be 5 or more + self.assertEqual(delta < 5, True) - def test_write(self): - delta = 0 - rows = [[1,2,3]]*5 - s = NUL() - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - writer = csv.writer(s) - for row in rows: - writer.writerow(row) - delta = rc-lastrc - lastrc = rc - # if writer leaks during write, last 
delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_write(self): + delta = 0 + rows = [[1,2,3]]*5 + s = NUL() + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + writer = csv.writer(s) + for row in rows: + writer.writerow(row) + delta = rc-lastrc + lastrc = rc + # if writer leaks during write, last delta should be 5 or more + self.assertEqual(delta < 5, True) class TestUnicode(unittest.TestCase): diff --git a/Lib/test/test_dbm_dumb.py b/Lib/test/test_dbm_dumb.py --- a/Lib/test/test_dbm_dumb.py +++ b/Lib/test/test_dbm_dumb.py @@ -37,11 +37,9 @@ self.read_helper(f) f.close() + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'chmod'), 'test needs os.chmod()') def test_dumbdbm_creation_mode(self): - # On platforms without chmod, don't do anything. - if not (hasattr(os, 'chmod') and hasattr(os, 'umask')): - return - try: old_umask = os.umask(0o002) f = dumbdbm.open(_fname, 'c', 0o637) diff --git a/Lib/test/test_enumerate.py b/Lib/test/test_enumerate.py --- a/Lib/test/test_enumerate.py +++ b/Lib/test/test_enumerate.py @@ -204,11 +204,10 @@ self.assertRaises(TypeError, reversed) self.assertRaises(TypeError, reversed, [], 'extra') + @unittest.skipUnless(hasattr(sys, 'getrefcount'), 'test needs sys.getrefcount()') def test_bug1229429(self): # this bug was never in reversed, it was in # PyObject_CallMethod, and reversed_new calls that sometimes. - if not hasattr(sys, "getrefcount"): - return def f(): pass r = f.__reversed__ = object() diff --git a/Lib/test/test_ftplib.py b/Lib/test/test_ftplib.py --- a/Lib/test/test_ftplib.py +++ b/Lib/test/test_ftplib.py @@ -16,7 +16,7 @@ except ImportError: ssl = None -from unittest import TestCase +from unittest import TestCase, skipUnless from test import support from test.support import HOST, HOSTv6 threading = support.import_module('threading') @@ -779,6 +779,7 @@ self.assertRaises(ftplib.Error, self.client.storlines, 'stor', f) + at skipUnless(support.IPV6_ENABLED, "IPv6 not enabled") class TestIPv6Environment(TestCase): def setUp(self): @@ -819,6 +820,7 @@ retr() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClassMixin(TestFTPClass): """Repeat TestFTPClass tests starting the TLS layer for both control and data connections first. 
@@ -834,6 +836,7 @@ self.client.prot_p() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClass(TestCase): """Specific TLS_FTP class tests.""" @@ -1015,12 +1018,9 @@ def test_main(): - tests = [TestFTPClass, TestTimeouts] - if support.IPV6_ENABLED: - tests.append(TestIPv6Environment) - - if ssl is not None: - tests.extend([TestTLS_FTPClassMixin, TestTLS_FTPClass]) + tests = [TestFTPClass, TestTimeouts, + TestIPv6Environment, + TestTLS_FTPClassMixin, TestTLS_FTPClass] thread_info = support.threading_setup() try: diff --git a/Lib/test/test_mailbox.py b/Lib/test/test_mailbox.py --- a/Lib/test/test_mailbox.py +++ b/Lib/test/test_mailbox.py @@ -868,10 +868,10 @@ for msg in self._box: pass + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_permissions(self): # Verify that message files are created without execute permissions - if not hasattr(os, "stat") or not hasattr(os, "umask"): - return msg = mailbox.MaildirMessage(self._template % 0) orig_umask = os.umask(0) try: @@ -882,12 +882,11 @@ mode = os.stat(path).st_mode self.assertFalse(mode & 0o111) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_folder_file_perms(self): # From bug #3228, we want to verify that the file created inside a Maildir # subfolder isn't marked as executable. - if not hasattr(os, "stat") or not hasattr(os, "umask"): - return - orig_umask = os.umask(0) try: subfolder = self._box.add_folder('subfolder') @@ -1097,24 +1096,25 @@ _factory = lambda self, path, factory=None: mailbox.mbox(path, factory) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_perms(self): # From bug #3228, we want to verify that the mailbox file isn't executable, # even if the umask is set to something that would leave executable bits set. # We only run this test on platforms that support umask. - if hasattr(os, 'umask') and hasattr(os, 'stat'): - try: - old_umask = os.umask(0o077) - self._box.close() - os.unlink(self._path) - self._box = mailbox.mbox(self._path, create=True) - self._box.add('') - self._box.close() - finally: - os.umask(old_umask) + try: + old_umask = os.umask(0o077) + self._box.close() + os.unlink(self._path) + self._box = mailbox.mbox(self._path, create=True) + self._box.add('') + self._box.close() + finally: + os.umask(old_umask) - st = os.stat(self._path) - perms = st.st_mode - self.assertFalse((perms & 0o111)) # Execute bits should all be off. + st = os.stat(self._path) + perms = st.st_mode + self.assertFalse((perms & 0o111)) # Execute bits should all be off. def test_terminating_newline(self): message = email.message.Message() diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -980,38 +980,37 @@ # still fails this part of the test on some platforms. For now, we only # *run* test_exceptions() in verbose mode, so that this isn't normally # tested. 
+ @unittest.skipUnless(verbose, 'requires verbose mode') + def test_exceptions(self): + try: + x = math.exp(-1000000000) + except: + # mathmodule.c is failing to weed out underflows from libm, or + # we've got an fp format with huge dynamic range + self.fail("underflowing exp() should not have raised " + "an exception") + if x != 0: + self.fail("underflowing exp() should have returned 0") - if verbose: - def test_exceptions(self): - try: - x = math.exp(-1000000000) - except: - # mathmodule.c is failing to weed out underflows from libm, or - # we've got an fp format with huge dynamic range - self.fail("underflowing exp() should not have raised " - "an exception") - if x != 0: - self.fail("underflowing exp() should have returned 0") + # If this fails, probably using a strict IEEE-754 conforming libm, and x + # is +Inf afterwards. But Python wants overflows detected by default. + try: + x = math.exp(1000000000) + except OverflowError: + pass + else: + self.fail("overflowing exp() didn't trigger OverflowError") - # If this fails, probably using a strict IEEE-754 conforming libm, and x - # is +Inf afterwards. But Python wants overflows detected by default. - try: - x = math.exp(1000000000) - except OverflowError: - pass - else: - self.fail("overflowing exp() didn't trigger OverflowError") - - # If this fails, it could be a puzzle. One odd possibility is that - # mathmodule.c's macros are getting confused while comparing - # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE - # as a result (and so raising OverflowError instead). - try: - x = math.sqrt(-1.0) - except ValueError: - pass - else: - self.fail("sqrt(-1) didn't raise ValueError") + # If this fails, it could be a puzzle. One odd possibility is that + # mathmodule.c's macros are getting confused while comparing + # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE + # as a result (and so raising OverflowError instead). + try: + x = math.sqrt(-1.0) + except ValueError: + pass + else: + self.fail("sqrt(-1) didn't raise ValueError") @requires_IEEE_754 def test_testfile(self): diff --git a/Lib/test/test_mmap.py b/Lib/test/test_mmap.py --- a/Lib/test/test_mmap.py +++ b/Lib/test/test_mmap.py @@ -314,26 +314,25 @@ mf.close() f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_entire_file(self): # test mapping of entire file by passing 0 for map length - if hasattr(os, "stat"): - f = open(TESTFN, "wb+") + f = open(TESTFN, "wb+") - f.write(2**16 * b'm') # Arbitrary character - f.close() + f.write(2**16 * b'm') # Arbitrary character + f.close() - f = open(TESTFN, "rb+") - mf = mmap.mmap(f.fileno(), 0) - self.assertEqual(len(mf), 2**16, "Map size should equal file size.") - self.assertEqual(mf.read(2**16), 2**16 * b"m") - mf.close() - f.close() + f = open(TESTFN, "rb+") + mf = mmap.mmap(f.fileno(), 0) + self.assertEqual(len(mf), 2**16, "Map size should equal file size.") + self.assertEqual(mf.read(2**16), 2**16 * b"m") + mf.close() + f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_offset(self): # Issue #10916: test mapping of remainder of file by passing 0 for # map length with an offset doesn't cause a segfault. - if not hasattr(os, "stat"): - self.skipTest("needs os.stat") # NOTE: allocation granularity is currently 65536 under Win64, # and therefore the minimum offset alignment. 
with open(TESTFN, "wb") as f: @@ -343,12 +342,10 @@ with mmap.mmap(f.fileno(), 0, offset=65536, access=mmap.ACCESS_READ) as mf: self.assertRaises(IndexError, mf.__getitem__, 80000) + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_large_offset(self): # Issue #10959: test mapping of a file by passing 0 for # map length with a large offset doesn't cause a segfault. - if not hasattr(os, "stat"): - self.skipTest("needs os.stat") - with open(TESTFN, "wb") as f: f.write(115699 * b'm') # Arbitrary character @@ -560,9 +557,8 @@ return mmap.mmap.__new__(klass, -1, *args, **kwargs) anon_mmap(PAGESIZE) + @unittest.skipUnless(hasattr(mmap, 'PROT_READ'), "needs mmap.PROT_READ") def test_prot_readonly(self): - if not hasattr(mmap, 'PROT_READ'): - return mapsize = 10 with open(TESTFN, "wb") as fp: fp.write(b"a"*mapsize) @@ -616,67 +612,69 @@ self.assertEqual(m.read_byte(), b) m.close() - if os.name == 'nt': - def test_tagname(self): - data1 = b"0123456789" - data2 = b"abcdefghij" - assert len(data1) == len(data2) + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_tagname(self): + data1 = b"0123456789" + data2 = b"abcdefghij" + assert len(data1) == len(data2) - # Test same tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="foo") - m2[:] = data2 - self.assertEqual(m1[:], data2) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test same tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="foo") + m2[:] = data2 + self.assertEqual(m1[:], data2) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - # Test different tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="boo") - m2[:] = data2 - self.assertEqual(m1[:], data1) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test different tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="boo") + m2[:] = data2 + self.assertEqual(m1[:], data1) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - def test_crasher_on_windows(self): - # Should not crash (Issue 1733986) - m = mmap.mmap(-1, 1000, tagname="foo") - try: - mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size - except: - pass - m.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_crasher_on_windows(self): + # Should not crash (Issue 1733986) + m = mmap.mmap(-1, 1000, tagname="foo") + try: + mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size + except: + pass + m.close() - # Should not crash (Issue 5385) - with open(TESTFN, "wb") as fp: - fp.write(b"x"*10) - f = open(TESTFN, "r+b") - m = mmap.mmap(f.fileno(), 0) - f.close() - try: - m.resize(0) # will raise WindowsError - except: - pass - try: - m[:] - except: - pass - m.close() + # Should not crash (Issue 5385) + with open(TESTFN, "wb") as fp: + fp.write(b"x"*10) + f = open(TESTFN, "r+b") + m = mmap.mmap(f.fileno(), 0) + f.close() + try: + m.resize(0) # will raise WindowsError + except: + pass + try: + m[:] + except: + pass + m.close() - def test_invalid_descriptor(self): - # socket file descriptors are valid, but out of range - # for _get_osfhandle, causing a crash when validating the - # parameters to _get_osfhandle. 
- s = socket.socket() - try: - with self.assertRaises(mmap.error): - m = mmap.mmap(s.fileno(), 10) - finally: - s.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_invalid_descriptor(self): + # socket file descriptors are valid, but out of range + # for _get_osfhandle, causing a crash when validating the + # parameters to _get_osfhandle. + s = socket.socket() + try: + with self.assertRaises(mmap.error): + m = mmap.mmap(s.fileno(), 10) + finally: + s.close() def test_context_manager(self): with mmap.mmap(-1, 10) as m: diff --git a/Lib/test/test_nntplib.py b/Lib/test/test_nntplib.py --- a/Lib/test/test_nntplib.py +++ b/Lib/test/test_nntplib.py @@ -6,10 +6,12 @@ import functools import contextlib from test import support -from nntplib import NNTP, GroupInfo, _have_ssl +from nntplib import NNTP, GroupInfo import nntplib -if _have_ssl: +try: import ssl +except ImportError: + ssl = None TIMEOUT = 30 @@ -199,23 +201,23 @@ resp, caps = self.server.capabilities() _check_caps(caps) - if _have_ssl: - def test_starttls(self): - file = self.server.file - sock = self.server.sock - try: - self.server.starttls() - except nntplib.NNTPPermanentError: - self.skipTest("STARTTLS not supported by server.") - else: - # Check that the socket and internal pseudo-file really were - # changed. - self.assertNotEqual(file, self.server.file) - self.assertNotEqual(sock, self.server.sock) - # Check that the new socket really is an SSL one - self.assertIsInstance(self.server.sock, ssl.SSLSocket) - # Check that trying starttls when it's already active fails. - self.assertRaises(ValueError, self.server.starttls) + @unittest.skipUnless(ssl, 'requires SSL support') + def test_starttls(self): + file = self.server.file + sock = self.server.sock + try: + self.server.starttls() + except nntplib.NNTPPermanentError: + self.skipTest("STARTTLS not supported by server.") + else: + # Check that the socket and internal pseudo-file really were + # changed. + self.assertNotEqual(file, self.server.file) + self.assertNotEqual(sock, self.server.sock) + # Check that the new socket really is an SSL one + self.assertIsInstance(self.server.sock, ssl.SSLSocket) + # Check that trying starttls when it's already active fails. + self.assertRaises(ValueError, self.server.starttls) def test_zlogin(self): # This test must be the penultimate because further commands will be @@ -300,25 +302,24 @@ if cls.server is not None: cls.server.quit() + at unittest.skipUnless(ssl, 'requires SSL support') +class NetworkedNNTP_SSLTests(NetworkedNNTPTests): -if _have_ssl: - class NetworkedNNTP_SSLTests(NetworkedNNTPTests): + # Technical limits for this public NNTP server (see http://www.aioe.org): + # "Only two concurrent connections per IP address are allowed and + # 400 connections per day are accepted from each IP address." - # Technical limits for this public NNTP server (see http://www.aioe.org): - # "Only two concurrent connections per IP address are allowed and - # 400 connections per day are accepted from each IP address." + NNTP_HOST = 'nntp.aioe.org' + GROUP_NAME = 'comp.lang.python' + GROUP_PAT = 'comp.lang.*' - NNTP_HOST = 'nntp.aioe.org' - GROUP_NAME = 'comp.lang.python' - GROUP_PAT = 'comp.lang.*' + NNTP_CLASS = getattr(nntplib, 'NNTP_SSL', None) - NNTP_CLASS = nntplib.NNTP_SSL + # Disabled as it produces too much data + test_list = None - # Disabled as it produces too much data - test_list = None - - # Disabled as the connection will already be encrypted. 
- test_starttls = None + # Disabled as the connection will already be encrypted. + test_starttls = None # @@ -1407,12 +1408,13 @@ gives(2000, 6, 23, "000623", "000000") gives(2010, 6, 5, "100605", "000000") + @unittest.skipUnless(ssl, 'requires SSL support') + def test_ssl_support(self): + self.assertTrue(hasattr(nntplib, 'NNTP_SSL')) def test_main(): tests = [MiscTests, NNTPv1Tests, NNTPv2Tests, CapsAfterLoginNNTPv2Tests, - SendReaderNNTPv2Tests, NetworkedNNTPTests] - if _have_ssl: - tests.append(NetworkedNNTP_SSLTests) + SendReaderNNTPv2Tests, NetworkedNNTPTests, NetworkedNNTP_SSLTests] support.run_unittest(*tests) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -178,10 +178,8 @@ os.unlink(self.fname) os.rmdir(support.TESTFN) + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def check_stat_attributes(self, fname): - if not hasattr(os, "stat"): - return - result = os.stat(fname) # Make sure direct access works @@ -258,10 +256,8 @@ warnings.simplefilter("ignore", DeprecationWarning) self.check_stat_attributes(fname) + @unittest.skipUnless(hasattr(os, 'statvfs'), 'test needs os.statvfs()') def test_statvfs_attributes(self): - if not hasattr(os, "statvfs"): - return - try: result = os.statvfs(self.fname) except OSError as e: @@ -450,10 +446,10 @@ os.close(dirfd) self._test_utime_subsecond(set_time) - # Restrict test to Win32, since there is no guarantee other + # Restrict tests to Win32, since there is no guarantee other # systems support centiseconds - if sys.platform == 'win32': - def get_file_system(path): + def get_file_system(path): + if sys.platform == 'win32': root = os.path.splitdrive(os.path.abspath(path))[0] + '\\' import ctypes kernel32 = ctypes.windll.kernel32 @@ -461,38 +457,45 @@ if kernel32.GetVolumeInformationW(root, None, 0, None, None, None, buf, len(buf)): return buf.value - if get_file_system(support.TESTFN) == "NTFS": - def test_1565150(self): - t1 = 1159195039.25 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_1565150(self): + t1 = 1159195039.25 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_large_time(self): - t1 = 5000000000 # some day in 2128 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_large_time(self): + t1 = 5000000000 # some day in 2128 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_1686475(self): - # Verify that an open file can be stat'ed - try: - os.stat(r"c:\pagefile.sys") - except WindowsError as e: - if e.errno == 2: # file does not exist; cannot run test - return - self.fail("Could not stat pagefile.sys") + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + def test_1686475(self): + # Verify that an open file can be stat'ed + try: + os.stat(r"c:\pagefile.sys") + except WindowsError as e: + if e.errno == 2: # file does not exist; cannot run test + return + self.fail("Could not stat pagefile.sys") - @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") - def test_15261(self): - # Verify that stat'ing a closed fd does not cause crash - r, w = 
os.pipe() - try: - os.stat(r) # should not raise error - finally: - os.close(r) - os.close(w) - with self.assertRaises(OSError) as ctx: - os.stat(r) - self.assertEqual(ctx.exception.errno, errno.EBADF) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") + def test_15261(self): + # Verify that stat'ing a closed fd does not cause crash + r, w = os.pipe() + try: + os.stat(r) # should not raise error + finally: + os.close(r) + os.close(w) + with self.assertRaises(OSError) as ctx: + os.stat(r) + self.assertEqual(ctx.exception.errno, errno.EBADF) from test import mapping_tests @@ -1127,6 +1130,7 @@ self._test_internal_execvpe(bytes) + at unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32ErrorTests(unittest.TestCase): def test_rename(self): self.assertRaises(WindowsError, os.rename, support.TESTFN, support.TESTFN+".bak") @@ -1173,63 +1177,63 @@ self.fail("%r didn't raise a OSError with a bad file descriptor" % f) + @unittest.skipUnless(hasattr(os, 'isatty'), 'test needs os.isatty()') def test_isatty(self): - if hasattr(os, "isatty"): - self.assertEqual(os.isatty(support.make_bad_fd()), False) + self.assertEqual(os.isatty(support.make_bad_fd()), False) + @unittest.skipUnless(hasattr(os, 'closerange'), 'test needs os.closerange()') def test_closerange(self): - if hasattr(os, "closerange"): - fd = support.make_bad_fd() - # Make sure none of the descriptors we are about to close are - # currently valid (issue 6542). - for i in range(10): - try: os.fstat(fd+i) - except OSError: - pass - else: - break - if i < 2: - raise unittest.SkipTest( - "Unable to acquire a range of invalid file descriptors") - self.assertEqual(os.closerange(fd, fd + i-1), None) + fd = support.make_bad_fd() + # Make sure none of the descriptors we are about to close are + # currently valid (issue 6542). 
+ for i in range(10): + try: os.fstat(fd+i) + except OSError: + pass + else: + break + if i < 2: + raise unittest.SkipTest( + "Unable to acquire a range of invalid file descriptors") + self.assertEqual(os.closerange(fd, fd + i-1), None) + @unittest.skipUnless(hasattr(os, 'dup2'), 'test needs os.dup2()') def test_dup2(self): - if hasattr(os, "dup2"): - self.check(os.dup2, 20) + self.check(os.dup2, 20) + @unittest.skipUnless(hasattr(os, 'fchmod'), 'test needs os.fchmod()') def test_fchmod(self): - if hasattr(os, "fchmod"): - self.check(os.fchmod, 0) + self.check(os.fchmod, 0) + @unittest.skipUnless(hasattr(os, 'fchown'), 'test needs os.fchown()') def test_fchown(self): - if hasattr(os, "fchown"): - self.check(os.fchown, -1, -1) + self.check(os.fchown, -1, -1) + @unittest.skipUnless(hasattr(os, 'fpathconf'), 'test needs os.fpathconf()') def test_fpathconf(self): - if hasattr(os, "fpathconf"): - self.check(os.pathconf, "PC_NAME_MAX") - self.check(os.fpathconf, "PC_NAME_MAX") + self.check(os.pathconf, "PC_NAME_MAX") + self.check(os.fpathconf, "PC_NAME_MAX") + @unittest.skipUnless(hasattr(os, 'ftruncate'), 'test needs os.ftruncate()') def test_ftruncate(self): - if hasattr(os, "ftruncate"): - self.check(os.truncate, 0) - self.check(os.ftruncate, 0) + self.check(os.truncate, 0) + self.check(os.ftruncate, 0) + @unittest.skipUnless(hasattr(os, 'lseek'), 'test needs os.lseek()') def test_lseek(self): - if hasattr(os, "lseek"): - self.check(os.lseek, 0, 0) + self.check(os.lseek, 0, 0) + @unittest.skipUnless(hasattr(os, 'read'), 'test needs os.read()') def test_read(self): - if hasattr(os, "read"): - self.check(os.read, 1) + self.check(os.read, 1) + @unittest.skipUnless(hasattr(os, 'tcsetpgrp'), 'test needs os.tcsetpgrp()') def test_tcsetpgrpt(self): - if hasattr(os, "tcsetpgrp"): - self.check(os.tcsetpgrp, 0) + self.check(os.tcsetpgrp, 0) + @unittest.skipUnless(hasattr(os, 'write'), 'test needs os.write()') def test_write(self): - if hasattr(os, "write"): - self.check(os.write, b" ") + self.check(os.write, b" ") class LinkTests(unittest.TestCase): @@ -1269,138 +1273,117 @@ self.file2 = self.file1 + "2" self._test_link(self.file1, self.file2) -if sys.platform != 'win32': - class Win32ErrorTests(unittest.TestCase): - pass + at unittest.skipIf(sys.platform == "win32", "Posix specific tests") +class PosixUidGidTests(unittest.TestCase): + @unittest.skipUnless(hasattr(os, 'setuid'), 'test needs os.setuid()') + def test_setuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setuid, 0) + self.assertRaises(OverflowError, os.setuid, 1<<32) - class PosixUidGidTests(unittest.TestCase): - if hasattr(os, 'setuid'): - def test_setuid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setuid, 0) - self.assertRaises(OverflowError, os.setuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setgid'), 'test needs os.setgid()') + def test_setgid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(os.error, os.setgid, 0) + self.assertRaises(OverflowError, os.setgid, 1<<32) - if hasattr(os, 'setgid'): - def test_setgid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(os.error, os.setgid, 0) - self.assertRaises(OverflowError, os.setgid, 1<<32) + @unittest.skipUnless(hasattr(os, 'seteuid'), 'test needs os.seteuid()') + def test_seteuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.seteuid, 0) + self.assertRaises(OverflowError, os.seteuid, 1<<32) - if hasattr(os, 'seteuid'): - def test_seteuid(self): - if os.getuid() != 0: - 
self.assertRaises(os.error, os.seteuid, 0) - self.assertRaises(OverflowError, os.seteuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setegid'), 'test needs os.setegid()') + def test_setegid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(os.error, os.setegid, 0) + self.assertRaises(OverflowError, os.setegid, 1<<32) - if hasattr(os, 'setegid'): - def test_setegid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(os.error, os.setegid, 0) - self.assertRaises(OverflowError, os.setegid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + def test_setreuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setreuid, 0, 0) + self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) + self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) - if hasattr(os, 'setreuid'): - def test_setreuid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setreuid, 0, 0) - self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) - self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) + @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + def test_setregid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(os.error, os.setregid, 0, 0) + self.assertRaises(OverflowError, os.setregid, 1<<32, 0) + self.assertRaises(OverflowError, os.setregid, 0, 1<<32) - def test_setreuid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). - subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) + at unittest.skipIf(sys.platform == "win32", "Posix specific tests") +class Pep383Tests(unittest.TestCase): + def setUp(self): + if support.TESTFN_UNENCODABLE: + self.dir = support.TESTFN_UNENCODABLE + elif support.TESTFN_NONASCII: + self.dir = support.TESTFN_NONASCII + else: + self.dir = support.TESTFN + self.bdir = os.fsencode(self.dir) - if hasattr(os, 'setregid'): - def test_setregid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(os.error, os.setregid, 0, 0) - self.assertRaises(OverflowError, os.setregid, 1<<32, 0) - self.assertRaises(OverflowError, os.setregid, 0, 1<<32) + bytesfn = [] + def add_filename(fn): + try: + fn = os.fsencode(fn) + except UnicodeEncodeError: + return + bytesfn.append(fn) + add_filename(support.TESTFN_UNICODE) + if support.TESTFN_UNENCODABLE: + add_filename(support.TESTFN_UNENCODABLE) + if support.TESTFN_NONASCII: + add_filename(support.TESTFN_NONASCII) + if not bytesfn: + self.skipTest("couldn't create any non-ascii filename") - def test_setregid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). 
- subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setregid(-1,-1);sys.exit(0)']) + self.unicodefn = set() + os.mkdir(self.dir) + try: + for fn in bytesfn: + support.create_empty_file(os.path.join(self.bdir, fn)) + fn = os.fsdecode(fn) + if fn in self.unicodefn: + raise ValueError("duplicate filename") + self.unicodefn.add(fn) + except: + shutil.rmtree(self.dir) + raise - class Pep383Tests(unittest.TestCase): - def setUp(self): - if support.TESTFN_UNENCODABLE: - self.dir = support.TESTFN_UNENCODABLE - elif support.TESTFN_NONASCII: - self.dir = support.TESTFN_NONASCII - else: - self.dir = support.TESTFN - self.bdir = os.fsencode(self.dir) + def tearDown(self): + shutil.rmtree(self.dir) - bytesfn = [] - def add_filename(fn): - try: - fn = os.fsencode(fn) - except UnicodeEncodeError: - return - bytesfn.append(fn) - add_filename(support.TESTFN_UNICODE) - if support.TESTFN_UNENCODABLE: - add_filename(support.TESTFN_UNENCODABLE) - if support.TESTFN_NONASCII: - add_filename(support.TESTFN_NONASCII) - if not bytesfn: - self.skipTest("couldn't create any non-ascii filename") + def test_listdir(self): + expected = self.unicodefn + found = set(os.listdir(self.dir)) + self.assertEqual(found, expected) + # test listdir without arguments + current_directory = os.getcwd() + try: + os.chdir(os.sep) + self.assertEqual(set(os.listdir()), set(os.listdir(os.sep))) + finally: + os.chdir(current_directory) - self.unicodefn = set() - os.mkdir(self.dir) - try: - for fn in bytesfn: - support.create_empty_file(os.path.join(self.bdir, fn)) - fn = os.fsdecode(fn) - if fn in self.unicodefn: - raise ValueError("duplicate filename") - self.unicodefn.add(fn) - except: - shutil.rmtree(self.dir) - raise + def test_open(self): + for fn in self.unicodefn: + f = open(os.path.join(self.dir, fn), 'rb') + f.close() - def tearDown(self): - shutil.rmtree(self.dir) + @unittest.skipUnless(hasattr(os, 'statvfs'), + "need os.statvfs()") + def test_statvfs(self): + # issue #9645 + for fn in self.unicodefn: + # should not fail with file not found error + fullname = os.path.join(self.dir, fn) + os.statvfs(fullname) - def test_listdir(self): - expected = self.unicodefn - found = set(os.listdir(self.dir)) - self.assertEqual(found, expected) - # test listdir without arguments - current_directory = os.getcwd() - try: - os.chdir(os.sep) - self.assertEqual(set(os.listdir()), set(os.listdir(os.sep))) - finally: - os.chdir(current_directory) - - def test_open(self): - for fn in self.unicodefn: - f = open(os.path.join(self.dir, fn), 'rb') - f.close() - - @unittest.skipUnless(hasattr(os, 'statvfs'), - "need os.statvfs()") - def test_statvfs(self): - # issue #9645 - for fn in self.unicodefn: - # should not fail with file not found error - fullname = os.path.join(self.dir, fn) - os.statvfs(fullname) - - def test_stat(self): - for fn in self.unicodefn: - os.stat(os.path.join(self.dir, fn)) -else: - class PosixUidGidTests(unittest.TestCase): - pass - class Pep383Tests(unittest.TestCase): - pass + def test_stat(self): + for fn in self.unicodefn: + os.stat(os.path.join(self.dir, fn)) @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32KillTests(unittest.TestCase): @@ -1838,6 +1821,8 @@ SUPPORT_HEADERS_TRAILERS = not sys.platform.startswith("linux") and \ not sys.platform.startswith("solaris") and \ not sys.platform.startswith("sunos") + requires_headers_trailers = unittest.skipUnless(SUPPORT_HEADERS_TRAILERS, + 'requires headers and trailers support') @classmethod def setUpClass(cls): @@ -1956,52 +1941,54 
@@ # --- headers / trailers tests - if SUPPORT_HEADERS_TRAILERS: + @requires_headers_trailers + def test_headers(self): + total_sent = 0 + sent = os.sendfile(self.sockno, self.fileno, 0, 4096, + headers=[b"x" * 512]) + total_sent += sent + offset = 4096 + nbytes = 4096 + while 1: + sent = self.sendfile_wrapper(self.sockno, self.fileno, + offset, nbytes) + if sent == 0: + break + total_sent += sent + offset += sent - def test_headers(self): - total_sent = 0 - sent = os.sendfile(self.sockno, self.fileno, 0, 4096, - headers=[b"x" * 512]) - total_sent += sent - offset = 4096 - nbytes = 4096 - while 1: - sent = self.sendfile_wrapper(self.sockno, self.fileno, - offset, nbytes) - if sent == 0: - break - total_sent += sent - offset += sent + expected_data = b"x" * 512 + self.DATA + self.assertEqual(total_sent, len(expected_data)) + self.client.close() + self.server.wait() + data = self.server.handler_instance.get_data() + self.assertEqual(hash(data), hash(expected_data)) - expected_data = b"x" * 512 + self.DATA - self.assertEqual(total_sent, len(expected_data)) + @requires_headers_trailers + def test_trailers(self): + TESTFN2 = support.TESTFN + "2" + file_data = b"abcdef" + with open(TESTFN2, 'wb') as f: + f.write(file_data) + with open(TESTFN2, 'rb')as f: + self.addCleanup(os.remove, TESTFN2) + os.sendfile(self.sockno, f.fileno(), 0, len(file_data), + trailers=[b"1234"]) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - self.assertEqual(hash(data), hash(expected_data)) + self.assertEqual(data, b"abcdef1234") - def test_trailers(self): - TESTFN2 = support.TESTFN + "2" - file_data = b"abcdef" - with open(TESTFN2, 'wb') as f: - f.write(file_data) - with open(TESTFN2, 'rb')as f: - self.addCleanup(os.remove, TESTFN2) - os.sendfile(self.sockno, f.fileno(), 0, len(file_data), - trailers=[b"1234"]) - self.client.close() - self.server.wait() - data = self.server.handler_instance.get_data() - self.assertEqual(data, b"abcdef1234") - - if hasattr(os, "SF_NODISKIO"): - def test_flags(self): - try: - os.sendfile(self.sockno, self.fileno, 0, 4096, - flags=os.SF_NODISKIO) - except OSError as err: - if err.errno not in (errno.EBUSY, errno.EAGAIN): - raise + @requires_headers_trailers + @unittest.skipUnless(hasattr(os, 'SF_NODISKIO'), + 'test needs os.SF_NODISKIO') + def test_flags(self): + try: + os.sendfile(self.sockno, self.fileno, 0, 4096, + flags=os.SF_NODISKIO) + except OSError as err: + if err.errno not in (errno.EBUSY, errno.EAGAIN): + raise def supports_extended_attributes(): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -11,7 +11,7 @@ import time import errno -from unittest import TestCase +from unittest import TestCase, skipUnless from test import support as test_support threading = test_support.import_module('threading') @@ -288,35 +288,37 @@ else: DummyPOP3Handler.handle_read(self) +requires_ssl = skipUnless(SUPPORTS_SSL, 'SSL not supported') - class TestPOP3_SSLClass(TestPOP3Class): - # repeat previous tests by using poplib.POP3_SSL + at requires_ssl +class TestPOP3_SSLClass(TestPOP3Class): + # repeat previous tests by using poplib.POP3_SSL - def setUp(self): - self.server = DummyPOP3Server((HOST, PORT)) - self.server.handler = DummyPOP3_SSLHandler - self.server.start() - self.client = poplib.POP3_SSL(self.server.host, self.server.port) + def setUp(self): + self.server = DummyPOP3Server((HOST, PORT)) + self.server.handler = DummyPOP3_SSLHandler + self.server.start() + self.client = 
poplib.POP3_SSL(self.server.host, self.server.port) - def test__all__(self): - self.assertIn('POP3_SSL', poplib.__all__) + def test__all__(self): + self.assertIn('POP3_SSL', poplib.__all__) - def test_context(self): - ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, keyfile=CERTFILE, context=ctx) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, certfile=CERTFILE, context=ctx) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, keyfile=CERTFILE, - certfile=CERTFILE, context=ctx) + def test_context(self): + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, keyfile=CERTFILE, context=ctx) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, certfile=CERTFILE, context=ctx) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, keyfile=CERTFILE, + certfile=CERTFILE, context=ctx) - self.client.quit() - self.client = poplib.POP3_SSL(self.server.host, self.server.port, - context=ctx) - self.assertIsInstance(self.client.sock, ssl.SSLSocket) - self.assertIs(self.client.sock.context, ctx) - self.assertTrue(self.client.noop().startswith(b'+OK')) + self.client.quit() + self.client = poplib.POP3_SSL(self.server.host, self.server.port, + context=ctx) + self.assertIsInstance(self.client.sock, ssl.SSLSocket) + self.assertIs(self.client.sock.context, ctx) + self.assertTrue(self.client.noop().startswith(b'+OK')) class TestTimeouts(TestCase): @@ -374,9 +376,8 @@ def test_main(): - tests = [TestPOP3Class, TestTimeouts] - if SUPPORTS_SSL: - tests.append(TestPOP3_SSLClass) + tests = [TestPOP3Class, TestTimeouts, + TestPOP3_SSLClass] thread_info = test_support.threading_setup() try: test_support.run_unittest(*tests) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -54,47 +54,55 @@ posix_func() self.assertRaises(TypeError, posix_func, 1) - if hasattr(posix, 'getresuid'): - def test_getresuid(self): - user_ids = posix.getresuid() - self.assertEqual(len(user_ids), 3) - for val in user_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresuid'), + 'test needs posix.getresuid()') + def test_getresuid(self): + user_ids = posix.getresuid() + self.assertEqual(len(user_ids), 3) + for val in user_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 'getresgid'): - def test_getresgid(self): - group_ids = posix.getresgid() - self.assertEqual(len(group_ids), 3) - for val in group_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresgid'), + 'test needs posix.getresgid()') + def test_getresgid(self): + group_ids = posix.getresgid() + self.assertEqual(len(group_ids), 3) + for val in group_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 'setresuid'): - def test_setresuid(self): - current_user_ids = posix.getresuid() - self.assertIsNone(posix.setresuid(*current_user_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresuid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid(self): + current_user_ids = posix.getresuid() + self.assertIsNone(posix.setresuid(*current_user_ids)) + # -1 means don't change that value. 
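The conversions in this test_posix.py hunk, like the others in the
patch, follow one pattern: a test that used to be defined only inside
an "if hasattr(...):" block is now always defined and guarded with
unittest.skipUnless, so a platform that lacks the feature reports the
test as skipped instead of silently never defining it. A minimal,
self-contained sketch of the idiom (the class and test names here are
illustrative, not taken from the patch):

    import os
    import unittest

    class PosixFeatureTests(unittest.TestCase):

        @unittest.skipUnless(hasattr(os, 'getresuid'),
                             'test needs os.getresuid()')
        def test_getresuid(self):
            # Only runs where getresuid() exists; elsewhere the runner
            # reports "skipped: test needs os.getresuid()" rather than
            # leaving the test undefined.
            user_ids = os.getresuid()
            self.assertEqual(len(user_ids), 3)

    if __name__ == '__main__':
        unittest.main()

Running it with "python -m unittest -v" shows the skip reason in the
verbose output, which is what issue #18702 is about.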
+ self.assertIsNone(posix.setresuid(-1, -1, -1)) - def test_setresuid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_user_ids = posix.getresuid() - if 0 not in current_user_ids: - new_user_ids = (current_user_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresuid, *new_user_ids) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid_exception(self): + # Don't do this test if someone is silly enough to run us as root. + current_user_ids = posix.getresuid() + if 0 not in current_user_ids: + new_user_ids = (current_user_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresuid, *new_user_ids) - if hasattr(posix, 'setresgid'): - def test_setresgid(self): - current_group_ids = posix.getresgid() - self.assertIsNone(posix.setresgid(*current_group_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresgid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid(self): + current_group_ids = posix.getresgid() + self.assertIsNone(posix.setresgid(*current_group_ids)) + # -1 means don't change that value. + self.assertIsNone(posix.setresgid(-1, -1, -1)) - def test_setresgid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_group_ids = posix.getresgid() - if 0 not in current_group_ids: - new_group_ids = (current_group_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresgid, *new_group_ids) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid_exception(self): + # Don't do this test if someone is silly enough to run us as root. + current_group_ids = posix.getresgid() + if 0 not in current_group_ids: + new_group_ids = (current_group_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresgid, *new_group_ids) @unittest.skipUnless(hasattr(posix, 'initgroups'), "test needs os.initgroups()") @@ -121,29 +129,32 @@ else: self.fail("Expected OSError to be raised by initgroups") + @unittest.skipUnless(hasattr(posix, 'statvfs'), + 'test needs posix.statvfs()') def test_statvfs(self): - if hasattr(posix, 'statvfs'): - self.assertTrue(posix.statvfs(os.curdir)) + self.assertTrue(posix.statvfs(os.curdir)) + @unittest.skipUnless(hasattr(posix, 'fstatvfs'), + 'test needs posix.fstatvfs()') def test_fstatvfs(self): - if hasattr(posix, 'fstatvfs'): - fp = open(support.TESTFN) - try: - self.assertTrue(posix.fstatvfs(fp.fileno())) - self.assertTrue(posix.statvfs(fp.fileno())) - finally: - fp.close() + fp = open(support.TESTFN) + try: + self.assertTrue(posix.fstatvfs(fp.fileno())) + self.assertTrue(posix.statvfs(fp.fileno())) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'ftruncate'), + 'test needs posix.ftruncate()') def test_ftruncate(self): - if hasattr(posix, 'ftruncate'): - fp = open(support.TESTFN, 'w+') - try: - # we need to have some data to truncate - fp.write('test') - fp.flush() - posix.ftruncate(fp.fileno(), 0) - finally: - fp.close() + fp = open(support.TESTFN, 'w+') + try: + # we need to have some data to truncate + fp.write('test') + fp.flush() + posix.ftruncate(fp.fileno(), 0) + finally: + fp.close() @unittest.skipUnless(hasattr(posix, 'truncate'), "test needs posix.truncate()") def test_truncate(self): @@ -290,30 +301,33 @@ finally: os.close(fd) + @unittest.skipUnless(hasattr(posix, 'dup'), + 'test needs posix.dup()') def test_dup(self): - if hasattr(posix, 'dup'): - fp = 
open(support.TESTFN) - try: - fd = posix.dup(fp.fileno()) - self.assertIsInstance(fd, int) - os.close(fd) - finally: - fp.close() + fp = open(support.TESTFN) + try: + fd = posix.dup(fp.fileno()) + self.assertIsInstance(fd, int) + os.close(fd) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'confstr'), + 'test needs posix.confstr()') def test_confstr(self): - if hasattr(posix, 'confstr'): - self.assertRaises(ValueError, posix.confstr, "CS_garbage") - self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + self.assertRaises(ValueError, posix.confstr, "CS_garbage") + self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + @unittest.skipUnless(hasattr(posix, 'dup2'), + 'test needs posix.dup2()') def test_dup2(self): - if hasattr(posix, 'dup2'): - fp1 = open(support.TESTFN) - fp2 = open(support.TESTFN) - try: - posix.dup2(fp1.fileno(), fp2.fileno()) - finally: - fp1.close() - fp2.close() + fp1 = open(support.TESTFN) + fp2 = open(support.TESTFN) + try: + posix.dup2(fp1.fileno(), fp2.fileno()) + finally: + fp1.close() + fp2.close() @unittest.skipUnless(hasattr(os, 'O_CLOEXEC'), "needs os.O_CLOEXEC") @support.requires_linux_version(2, 6, 23) @@ -322,65 +336,69 @@ self.addCleanup(os.close, fd) self.assertTrue(fcntl.fcntl(fd, fcntl.F_GETFD) & fcntl.FD_CLOEXEC) + @unittest.skipUnless(hasattr(posix, 'O_EXLOCK'), + 'test needs posix.O_EXLOCK') def test_osexlock(self): - if hasattr(posix, "O_EXLOCK"): + fd = os.open(support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + self.assertRaises(OSError, os.open, support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) + + if hasattr(posix, "O_SHLOCK"): fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) self.assertRaises(OSError, os.open, support.TESTFN, os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) os.close(fd) - if hasattr(posix, "O_SHLOCK"): - fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) + @unittest.skipUnless(hasattr(posix, 'O_SHLOCK'), + 'test needs posix.O_SHLOCK') + def test_osshlock(self): + fd1 = os.open(support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + fd2 = os.open(support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + os.close(fd2) + os.close(fd1) - def test_osshlock(self): - if hasattr(posix, "O_SHLOCK"): - fd1 = os.open(support.TESTFN, + if hasattr(posix, "O_EXLOCK"): + fd = os.open(support.TESTFN, os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - fd2 = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - os.close(fd2) - os.close(fd1) + self.assertRaises(OSError, os.open, support.TESTFN, + os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) - if hasattr(posix, "O_EXLOCK"): - fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, support.TESTFN, - os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) - + @unittest.skipUnless(hasattr(posix, 'fstat'), + 'test needs posix.fstat()') def test_fstat(self): - if hasattr(posix, 'fstat'): - fp = open(support.TESTFN) - try: - self.assertTrue(posix.fstat(fp.fileno())) - self.assertTrue(posix.stat(fp.fileno())) - - self.assertRaisesRegex(TypeError, - 'should be string, bytes or integer, not', - posix.stat, float(fp.fileno())) - finally: - fp.close() - - def test_stat(self): - if hasattr(posix, 'stat'): - self.assertTrue(posix.stat(support.TESTFN)) - self.assertTrue(posix.stat(os.fsencode(support.TESTFN))) - 
self.assertTrue(posix.stat(bytearray(os.fsencode(support.TESTFN)))) + fp = open(support.TESTFN) + try: + self.assertTrue(posix.fstat(fp.fileno())) + self.assertTrue(posix.stat(fp.fileno())) self.assertRaisesRegex(TypeError, - 'can\'t specify None for path argument', - posix.stat, None) - self.assertRaisesRegex(TypeError, 'should be string, bytes or integer, not', - posix.stat, list(support.TESTFN)) - self.assertRaisesRegex(TypeError, - 'should be string, bytes or integer, not', - posix.stat, list(os.fsencode(support.TESTFN))) + posix.stat, float(fp.fileno())) + finally: + fp.close() + + @unittest.skipUnless(hasattr(posix, 'stat'), + 'test needs posix.stat()') + def test_stat(self): + self.assertTrue(posix.stat(support.TESTFN)) + self.assertTrue(posix.stat(os.fsencode(support.TESTFN))) + self.assertTrue(posix.stat(bytearray(os.fsencode(support.TESTFN)))) + + self.assertRaisesRegex(TypeError, + 'can\'t specify None for path argument', + posix.stat, None) + self.assertRaisesRegex(TypeError, + 'should be string, bytes or integer, not', + posix.stat, list(support.TESTFN)) + self.assertRaisesRegex(TypeError, + 'should be string, bytes or integer, not', + posix.stat, list(os.fsencode(support.TESTFN))) @unittest.skipUnless(hasattr(posix, 'mkfifo'), "don't have mkfifo()") def test_mkfifo(self): @@ -495,10 +513,10 @@ self._test_all_chown_common(posix.lchown, support.TESTFN, getattr(posix, 'lstat', None)) + @unittest.skipUnless(hasattr(posix, 'chdir'), 'test needs posix.chdir()') def test_chdir(self): - if hasattr(posix, 'chdir'): - posix.chdir(os.curdir) - self.assertRaises(OSError, posix.chdir, support.TESTFN) + posix.chdir(os.curdir) + self.assertRaises(OSError, posix.chdir, support.TESTFN) def test_listdir(self): self.assertTrue(support.TESTFN in posix.listdir(os.curdir)) @@ -528,25 +546,26 @@ sorted(posix.listdir(f)) ) + @unittest.skipUnless(hasattr(posix, 'access'), 'test needs posix.access()') def test_access(self): - if hasattr(posix, 'access'): - self.assertTrue(posix.access(support.TESTFN, os.R_OK)) + self.assertTrue(posix.access(support.TESTFN, os.R_OK)) + @unittest.skipUnless(hasattr(posix, 'umask'), 'test needs posix.umask()') def test_umask(self): - if hasattr(posix, 'umask'): - old_mask = posix.umask(0) - self.assertIsInstance(old_mask, int) - posix.umask(old_mask) + old_mask = posix.umask(0) + self.assertIsInstance(old_mask, int) + posix.umask(old_mask) + @unittest.skipUnless(hasattr(posix, 'strerror'), + 'test needs posix.strerror()') def test_strerror(self): - if hasattr(posix, 'strerror'): - self.assertTrue(posix.strerror(0)) + self.assertTrue(posix.strerror(0)) + @unittest.skipUnless(hasattr(posix, 'pipe'), 'test needs posix.pipe()') def test_pipe(self): - if hasattr(posix, 'pipe'): - reader, writer = posix.pipe() - os.close(reader) - os.close(writer) + reader, writer = posix.pipe() + os.close(reader) + os.close(writer) @unittest.skipUnless(hasattr(os, 'pipe2'), "test needs os.pipe2()") @support.requires_linux_version(2, 6, 27) @@ -578,15 +597,15 @@ self.assertRaises(OverflowError, os.pipe2, _testcapi.INT_MAX + 1) self.assertRaises(OverflowError, os.pipe2, _testcapi.UINT_MAX + 1) + @unittest.skipUnless(hasattr(posix, 'utime'), 'test needs posix.utime()') def test_utime(self): - if hasattr(posix, 'utime'): - now = time.time() - posix.utime(support.TESTFN, None) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, None)) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, None)) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, now)) - 
posix.utime(support.TESTFN, (int(now), int(now))) - posix.utime(support.TESTFN, (now, now)) + now = time.time() + posix.utime(support.TESTFN, None) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, None)) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, None)) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, now)) + posix.utime(support.TESTFN, (int(now), int(now))) + posix.utime(support.TESTFN, (now, now)) def _test_chflags_regular_file(self, chflags_func, target_file, **kwargs): st = os.stat(target_file) @@ -663,6 +682,7 @@ self.assertEqual(type(k), item_type) self.assertEqual(type(v), item_type) + @unittest.skipUnless(hasattr(posix, 'getcwd'), 'test needs posix.getcwd()') def test_getcwd_long_pathnames(self): if hasattr(posix, 'getcwd'): dirname = 'getcwd-test-directory-0123456789abcdef-01234567890abcdef' diff --git a/Lib/test/test_set.py b/Lib/test/test_set.py --- a/Lib/test/test_set.py +++ b/Lib/test/test_set.py @@ -625,10 +625,10 @@ myset >= myobj self.assertTrue(myobj.le_called) - # C API test only available in a debug build - if hasattr(set, "test_c_api"): - def test_c_api(self): - self.assertEqual(set().test_c_api(), True) + @unittest.skipUnless(hasattr(set, "test_c_api"), + 'C API test only available in a debug build') + def test_c_api(self): + self.assertEqual(set().test_c_api(), True) class SetSubclass(set): pass diff --git a/Lib/test/test_shutil.py b/Lib/test/test_shutil.py --- a/Lib/test/test_shutil.py +++ b/Lib/test/test_shutil.py @@ -194,37 +194,37 @@ self.assertIn(errors[1][2][1].filename, possible_args) - # See bug #1071513 for why we don't run this on cygwin - # and bug #1076467 for why we don't run this as root. - if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' - and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): - def test_on_error(self): - self.errorState = 0 - os.mkdir(TESTFN) - self.addCleanup(shutil.rmtree, TESTFN) + @unittest.skipUnless(hasattr(os, 'chmod'), 'requires os.chmod()') + @unittest.skipIf(sys.platform[:6] == 'cygwin', + "This test can't be run on Cygwin (issue #1071513).") + @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0, + "This test can't be run reliably as root (issue #1076467).") + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.addCleanup(shutil.rmtree, TESTFN) - self.child_file_path = os.path.join(TESTFN, 'a') - self.child_dir_path = os.path.join(TESTFN, 'b') - support.create_empty_file(self.child_file_path) - os.mkdir(self.child_dir_path) - old_dir_mode = os.stat(TESTFN).st_mode - old_child_file_mode = os.stat(self.child_file_path).st_mode - old_child_dir_mode = os.stat(self.child_dir_path).st_mode - # Make unwritable. - new_mode = stat.S_IREAD|stat.S_IEXEC - os.chmod(self.child_file_path, new_mode) - os.chmod(self.child_dir_path, new_mode) - os.chmod(TESTFN, new_mode) + self.child_file_path = os.path.join(TESTFN, 'a') + self.child_dir_path = os.path.join(TESTFN, 'b') + support.create_empty_file(self.child_file_path) + os.mkdir(self.child_dir_path) + old_dir_mode = os.stat(TESTFN).st_mode + old_child_file_mode = os.stat(self.child_file_path).st_mode + old_child_dir_mode = os.stat(self.child_dir_path).st_mode + # Make unwritable. 
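Several of the converted tests need more than one precondition; the
surrounding test_shutil.py hunk, for instance, stacks skipUnless and
skipIf decorators for the chmod, Cygwin and root checks instead of one
big "if" around the method. A rough sketch of that combination (the
conditions and names are illustrative, not taken from the patch):

    import os
    import sys
    import unittest

    class RmtreeOnErrorTests(unittest.TestCase):

        @unittest.skipUnless(hasattr(os, 'chmod'), 'requires os.chmod()')
        @unittest.skipIf(sys.platform.startswith('cygwin'),
                         'not reliable on Cygwin')
        @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0,
                         'not reliable when running as root')
        def test_onerror_reporting(self):
            # Each condition is evaluated when the class is defined; if
            # any decorator triggers, the test shows up as skipped
            # instead of erroring out at run time.
            self.assertTrue(hasattr(os, 'chmod'))

    if __name__ == '__main__':
        unittest.main()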
+ new_mode = stat.S_IREAD|stat.S_IEXEC + os.chmod(self.child_file_path, new_mode) + os.chmod(self.child_dir_path, new_mode) + os.chmod(TESTFN, new_mode) - self.addCleanup(os.chmod, TESTFN, old_dir_mode) - self.addCleanup(os.chmod, self.child_file_path, old_child_file_mode) - self.addCleanup(os.chmod, self.child_dir_path, old_child_dir_mode) + self.addCleanup(os.chmod, TESTFN, old_dir_mode) + self.addCleanup(os.chmod, self.child_file_path, old_child_file_mode) + self.addCleanup(os.chmod, self.child_dir_path, old_child_dir_mode) - shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) - # Test whether onerror has actually been called. - self.assertEqual(self.errorState, 3, - "Expected call to onerror function did not " - "happen.") + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. + self.assertEqual(self.errorState, 3, + "Expected call to onerror function did not happen.") def check_args_to_onerror(self, func, arg, exc): # test_rmtree_errors deliberately runs rmtree @@ -806,38 +806,39 @@ finally: shutil.rmtree(TESTFN, ignore_errors=True) - if hasattr(os, "mkfifo"): - # Issue #3002: copyfile and copytree block indefinitely on named pipes - def test_copyfile_named_pipe(self): - os.mkfifo(TESTFN) + # Issue #3002: copyfile and copytree block indefinitely on named pipes + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + @support.skip_unless_symlink + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) try: - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, TESTFN, TESTFN2) - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, __file__, TESTFN) - finally: - os.remove(TESTFN) - - @support.skip_unless_symlink - def test_copytree_named_pipe(self): - os.mkdir(TESTFN) - try: - subdir = os.path.join(TESTFN, "subdir") - os.mkdir(subdir) - pipe = os.path.join(subdir, "mypipe") - os.mkfifo(pipe) - try: - shutil.copytree(TESTFN, TESTFN2) - except shutil.Error as e: - errors = e.args[0] - self.assertEqual(len(errors), 1) - src, dst, error_msg = errors[0] - self.assertEqual("`%s` is a named pipe" % pipe, error_msg) - else: - self.fail("shutil.Error should have been raised") - finally: - shutil.rmtree(TESTFN, ignore_errors=True) - shutil.rmtree(TESTFN2, ignore_errors=True) + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error as e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) def test_copytree_special_func(self): diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -772,16 +772,17 @@ self.assertRaises(TypeError, socket.if_nametoindex, 0) self.assertRaises(TypeError, socket.if_indextoname, '_DEADBEEF') + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def 
testRefCountGetNameInfo(self): # Testing reference count for getnameinfo - if hasattr(sys, "getrefcount"): - try: - # On some versions, this loses a reference - orig = sys.getrefcount(__name__) - socket.getnameinfo(__name__,0) - except TypeError: - if sys.getrefcount(__name__) != orig: - self.fail("socket.getnameinfo loses a reference") + try: + # On some versions, this loses a reference + orig = sys.getrefcount(__name__) + socket.getnameinfo(__name__,0) + except TypeError: + if sys.getrefcount(__name__) != orig: + self.fail("socket.getnameinfo loses a reference") def testInterpreterCrash(self): # Making sure getnameinfo doesn't crash the interpreter @@ -886,17 +887,17 @@ # Check that setting it to an invalid type raises TypeError self.assertRaises(TypeError, socket.setdefaulttimeout, "spam") + @unittest.skipUnless(hasattr(socket, 'inet_aton'), + 'test needs socket.inet_aton()') def testIPv4_inet_aton_fourbytes(self): - if not hasattr(socket, 'inet_aton'): - return # No inet_aton, nothing to check # Test that issue1008086 and issue767150 are fixed. # It must return 4 bytes. self.assertEqual(b'\x00'*4, socket.inet_aton('0.0.0.0')) self.assertEqual(b'\xff'*4, socket.inet_aton('255.255.255.255')) + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv4toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform from socket import inet_aton as f, inet_pton, AF_INET g = lambda a: inet_pton(AF_INET, a) @@ -925,9 +926,9 @@ assertInvalid(g, '1.2.3.4.5') assertInvalid(g, '::1') + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv6toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform try: from socket import inet_pton, AF_INET6, has_ipv6 if not has_ipv6: @@ -979,9 +980,9 @@ assertInvalid('::1.2.3.4:0') assertInvalid('0.100.200.0:3:4:5:6:7:8') + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv4(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform from socket import inet_ntoa as f, inet_ntop, AF_INET g = lambda a: inet_ntop(AF_INET, a) assertInvalid = lambda func,a: self.assertRaises( @@ -1003,9 +1004,9 @@ assertInvalid(g, b'\x00' * 5) assertInvalid(g, b'\x00' * 16) + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv6(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform try: from socket import inet_ntop, AF_INET6, has_ipv6 if not has_ipv6: @@ -3531,6 +3532,8 @@ self.cli.connect((HOST, self.port)) time.sleep(1.0) + at unittest.skipUnless(hasattr(socket, 'socketpair'), + 'test needs socket.socketpair()') @unittest.skipUnless(thread, 'Threading required for this test.') class BasicSocketPairTest(SocketPairTest): @@ -3593,26 +3596,27 @@ def _testSetBlocking(self): pass - if hasattr(socket, "SOCK_NONBLOCK"): - @support.requires_linux_version(2, 6, 28) - def testInitNonBlocking(self): - # reinit server socket - self.serv.close() - self.serv = socket.socket(socket.AF_INET, socket.SOCK_STREAM | - socket.SOCK_NONBLOCK) - self.port = support.bind_port(self.serv) - self.serv.listen(1) - # actual testing - start = time.time() - try: - self.serv.accept() - except socket.error: - pass - end = time.time() - self.assertTrue((end - start) < 1.0, "Error creating with non-blocking mode.") - - def _testInitNonBlocking(self): + @unittest.skipUnless(hasattr(socket, 
'SOCK_NONBLOCK'), + 'test needs socket.SOCK_NONBLOCK') + @support.requires_linux_version(2, 6, 28) + def testInitNonBlocking(self): + # reinit server socket + self.serv.close() + self.serv = socket.socket(socket.AF_INET, socket.SOCK_STREAM | + socket.SOCK_NONBLOCK) + self.port = support.bind_port(self.serv) + self.serv.listen(1) + # actual testing + start = time.time() + try: + self.serv.accept() + except socket.error: pass + end = time.time() + self.assertTrue((end - start) < 1.0, "Error creating with non-blocking mode.") + + def _testInitNonBlocking(self): + pass def testInheritFlags(self): # Issue #7995: when calling accept() on a listening socket with a @@ -4302,12 +4306,12 @@ if not ok: self.fail("accept() returned success when we did not expect it") + @unittest.skipUnless(hasattr(signal, 'alarm'), + 'test needs signal.alarm()') def testInterruptedTimeout(self): # XXX I don't know how to do this test on MSWindows or any other # plaform that doesn't support signal.alarm() or os.kill(), though # the bug should have existed on all platforms. - if not hasattr(signal, "alarm"): - return # can only test on *nix self.serv.settimeout(5.0) # must be longer than alarm class Alarm(Exception): pass @@ -4367,6 +4371,7 @@ self.assertTrue(issubclass(socket.gaierror, socket.error)) self.assertTrue(issubclass(socket.timeout, socket.error)) + at unittest.skipUnless(sys.platform == 'linux', 'Linux specific test') class TestLinuxAbstractNamespace(unittest.TestCase): UNIX_PATH_MAX = 108 @@ -4402,6 +4407,7 @@ finally: s.close() + at unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'test needs socket.AF_UNIX') class TestUnixDomain(unittest.TestCase): def setUp(self): @@ -4551,10 +4557,10 @@ for line in f: if line.startswith("tipc "): return True - if support.verbose: - print("TIPC module is not loaded, please 'sudo modprobe tipc'") return False + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") class TIPCTest(unittest.TestCase): def testRDM(self): srv = socket.socket(socket.AF_TIPC, socket.SOCK_RDM) @@ -4577,6 +4583,8 @@ self.assertEqual(msg, MSG) + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") class TIPCThreadableTest(unittest.TestCase, ThreadableTest): def __init__(self, methodName = 'runTest'): unittest.TestCase.__init__(self, methodName = methodName) @@ -4842,15 +4850,10 @@ CloexecConstantTest, NonblockConstantTest ]) - if hasattr(socket, "socketpair"): - tests.append(BasicSocketPairTest) - if hasattr(socket, "AF_UNIX"): - tests.append(TestUnixDomain) - if sys.platform == 'linux': - tests.append(TestLinuxAbstractNamespace) - if isTipcAvailable(): - tests.append(TIPCTest) - tests.append(TIPCThreadableTest) + tests.append(BasicSocketPairTest) + tests.append(TestUnixDomain) + tests.append(TestLinuxAbstractNamespace) + tests.extend([TIPCTest, TIPCThreadableTest]) tests.extend([BasicCANTest, CANTest]) tests.extend([BasicRDSTest, RDSTest]) tests.extend([ diff --git a/Lib/test/test_socketserver.py b/Lib/test/test_socketserver.py --- a/Lib/test/test_socketserver.py +++ b/Lib/test/test_socketserver.py @@ -27,7 +27,10 @@ HOST = test.support.HOST HAVE_UNIX_SOCKETS = hasattr(socket, "AF_UNIX") +requires_unix_sockets = unittest.skipUnless(HAVE_UNIX_SOCKETS, + 'requires Unix sockets') HAVE_FORKING = hasattr(os, "fork") and os.name != "os2" +requires_forking = unittest.skipUnless(HAVE_FORKING, 'requires forking') def signal_alarm(n): """Call signal.alarm when it exists (i.e. 
not on Windows).""" @@ -189,31 +192,33 @@ socketserver.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingTCPServer(self): - with simple_subprocess(self): - self.run_server(socketserver.ForkingTCPServer, - socketserver.StreamRequestHandler, - self.stream_examine) - - if HAVE_UNIX_SOCKETS: - def test_UnixStreamServer(self): - self.run_server(socketserver.UnixStreamServer, + @requires_forking + def test_ForkingTCPServer(self): + with simple_subprocess(self): + self.run_server(socketserver.ForkingTCPServer, socketserver.StreamRequestHandler, self.stream_examine) - def test_ThreadingUnixStreamServer(self): - self.run_server(socketserver.ThreadingUnixStreamServer, + @requires_unix_sockets + def test_UnixStreamServer(self): + self.run_server(socketserver.UnixStreamServer, + socketserver.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + def test_ThreadingUnixStreamServer(self): + self.run_server(socketserver.ThreadingUnixStreamServer, + socketserver.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + @requires_forking + def test_ForkingUnixStreamServer(self): + with simple_subprocess(self): + self.run_server(ForkingUnixStreamServer, socketserver.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingUnixStreamServer(self): - with simple_subprocess(self): - self.run_server(ForkingUnixStreamServer, - socketserver.StreamRequestHandler, - self.stream_examine) - def test_UDPServer(self): self.run_server(socketserver.UDPServer, socketserver.DatagramRequestHandler, @@ -224,12 +229,12 @@ socketserver.DatagramRequestHandler, self.dgram_examine) - if HAVE_FORKING: - def test_ForkingUDPServer(self): - with simple_subprocess(self): - self.run_server(socketserver.ForkingUDPServer, - socketserver.DatagramRequestHandler, - self.dgram_examine) + @requires_forking + def test_ForkingUDPServer(self): + with simple_subprocess(self): + self.run_server(socketserver.ForkingUDPServer, + socketserver.DatagramRequestHandler, + self.dgram_examine) @contextlib.contextmanager def mocked_select_module(self): @@ -266,22 +271,24 @@ # Alas, on Linux (at least) recvfrom() doesn't return a meaningful # client address so this cannot work: - # if HAVE_UNIX_SOCKETS: - # def test_UnixDatagramServer(self): - # self.run_server(socketserver.UnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_UnixDatagramServer(self): + # self.run_server(socketserver.UnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) # - # def test_ThreadingUnixDatagramServer(self): - # self.run_server(socketserver.ThreadingUnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_ThreadingUnixDatagramServer(self): + # self.run_server(socketserver.ThreadingUnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) # - # if HAVE_FORKING: - # def test_ForkingUnixDatagramServer(self): - # self.run_server(socketserver.ForkingUnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # @requires_forking + # def test_ForkingUnixDatagramServer(self): + # self.run_server(socketserver.ForkingUnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) @reap_threads def test_shutdown(self): diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -291,15 
+291,16 @@ def test_call_tracing(self): self.assertRaises(TypeError, sys.call_tracing, type, 2) + @unittest.skipUnless(hasattr(sys, "setdlopenflags"), + 'test needs sys.setdlopenflags()') def test_dlopenflags(self): - if hasattr(sys, "setdlopenflags"): - self.assertTrue(hasattr(sys, "getdlopenflags")) - self.assertRaises(TypeError, sys.getdlopenflags, 42) - oldflags = sys.getdlopenflags() - self.assertRaises(TypeError, sys.setdlopenflags) - sys.setdlopenflags(oldflags+1) - self.assertEqual(sys.getdlopenflags(), oldflags+1) - sys.setdlopenflags(oldflags) + self.assertTrue(hasattr(sys, "getdlopenflags")) + self.assertRaises(TypeError, sys.getdlopenflags, 42) + oldflags = sys.getdlopenflags() + self.assertRaises(TypeError, sys.setdlopenflags) + sys.setdlopenflags(oldflags+1) + self.assertEqual(sys.getdlopenflags(), oldflags+1) + sys.setdlopenflags(oldflags) @test.support.refcount_test def test_refcount(self): diff --git a/Lib/test/test_warnings.py b/Lib/test/test_warnings.py --- a/Lib/test/test_warnings.py +++ b/Lib/test/test_warnings.py @@ -271,11 +271,10 @@ finally: warning_tests.__file__ = filename + @unittest.skipUnless(hasattr(sys, 'argv'), 'test needs sys.argv') def test_missing_filename_main_with_argv(self): # If __file__ is not specified and the caller is __main__ and sys.argv # exists, then use sys.argv[0] as the file. - if not hasattr(sys, 'argv'): - return filename = warning_tests.__file__ module_name = warning_tests.__name__ try: diff --git a/Lib/test/test_zlib.py b/Lib/test/test_zlib.py --- a/Lib/test/test_zlib.py +++ b/Lib/test/test_zlib.py @@ -7,6 +7,13 @@ zlib = support.import_module('zlib') +requires_Compress_copy = unittest.skipUnless( + hasattr(zlib.compressobj(), "copy"), + 'requires Compress.copy()') +requires_Decompress_copy = unittest.skipUnless( + hasattr(zlib.decompressobj(), "copy"), + 'requires Decompress.copy()') + class VersionTestCase(unittest.TestCase): @@ -381,39 +388,39 @@ "mode=%i, level=%i") % (sync, level)) del obj + @unittest.skipUnless(hasattr(zlib, 'Z_SYNC_FLUSH'), + 'requires zlib.Z_SYNC_FLUSH') def test_odd_flush(self): # Test for odd flushing bugs noted in 2.0, and hopefully fixed in 2.1 import random + # Testing on 17K of "random" data - if hasattr(zlib, 'Z_SYNC_FLUSH'): - # Testing on 17K of "random" data + # Create compressor and decompressor objects + co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + dco = zlib.decompressobj() - # Create compressor and decompressor objects - co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - dco = zlib.decompressobj() + # Try 17K of data + # generate random data stream + try: + # In 2.3 and later, WichmannHill is the RNG of the bug report + gen = random.WichmannHill() + except AttributeError: + try: + # 2.2 called it Random + gen = random.Random() + except AttributeError: + # others might simply have a single RNG + gen = random + gen.seed(1) + data = genblock(1, 17 * 1024, generator=gen) - # Try 17K of data - # generate random data stream - try: - # In 2.3 and later, WichmannHill is the RNG of the bug report - gen = random.WichmannHill() - except AttributeError: - try: - # 2.2 called it Random - gen = random.Random() - except AttributeError: - # others might simply have a single RNG - gen = random - gen.seed(1) - data = genblock(1, 17 * 1024, generator=gen) + # compress, sync-flush, and decompress + first = co.compress(data) + second = co.flush(zlib.Z_SYNC_FLUSH) + expanded = dco.decompress(first + second) - # compress, sync-flush, and decompress - first = co.compress(data) - second = co.flush(zlib.Z_SYNC_FLUSH) - 
expanded = dco.decompress(first + second) - - # if decompressed data is different from the input data, choke. - self.assertEqual(expanded, data, "17K random source doesn't match") + # if decompressed data is different from the input data, choke. + self.assertEqual(expanded, data, "17K random source doesn't match") def test_empty_flush(self): # Test that calling .flush() on unused objects works. @@ -525,67 +532,69 @@ data = zlib.compress(input2) self.assertEqual(dco.flush(), input1[1:]) - if hasattr(zlib.compressobj(), "copy"): - def test_compresscopy(self): - # Test copying a compression object - data0 = HAMLET_SCENE - data1 = bytes(str(HAMLET_SCENE, "ascii").swapcase(), "ascii") - c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - bufs0 = [] - bufs0.append(c0.compress(data0)) + @requires_Compress_copy + def test_compresscopy(self): + # Test copying a compression object + data0 = HAMLET_SCENE + data1 = bytes(str(HAMLET_SCENE, "ascii").swapcase(), "ascii") + c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + bufs0 = [] + bufs0.append(c0.compress(data0)) - c1 = c0.copy() - bufs1 = bufs0[:] + c1 = c0.copy() + bufs1 = bufs0[:] - bufs0.append(c0.compress(data0)) - bufs0.append(c0.flush()) - s0 = b''.join(bufs0) + bufs0.append(c0.compress(data0)) + bufs0.append(c0.flush()) + s0 = b''.join(bufs0) - bufs1.append(c1.compress(data1)) - bufs1.append(c1.flush()) - s1 = b''.join(bufs1) + bufs1.append(c1.compress(data1)) + bufs1.append(c1.flush()) + s1 = b''.join(bufs1) - self.assertEqual(zlib.decompress(s0),data0+data0) - self.assertEqual(zlib.decompress(s1),data0+data1) + self.assertEqual(zlib.decompress(s0),data0+data0) + self.assertEqual(zlib.decompress(s1),data0+data1) - def test_badcompresscopy(self): - # Test copying a compression object in an inconsistent state - c = zlib.compressobj() - c.compress(HAMLET_SCENE) - c.flush() - self.assertRaises(ValueError, c.copy) + @requires_Compress_copy + def test_badcompresscopy(self): + # Test copying a compression object in an inconsistent state + c = zlib.compressobj() + c.compress(HAMLET_SCENE) + c.flush() + self.assertRaises(ValueError, c.copy) - if hasattr(zlib.decompressobj(), "copy"): - def test_decompresscopy(self): - # Test copying a decompression object - data = HAMLET_SCENE - comp = zlib.compress(data) - # Test type of return value - self.assertIsInstance(comp, bytes) + @requires_Decompress_copy + def test_decompresscopy(self): + # Test copying a decompression object + data = HAMLET_SCENE + comp = zlib.compress(data) + # Test type of return value + self.assertIsInstance(comp, bytes) - d0 = zlib.decompressobj() - bufs0 = [] - bufs0.append(d0.decompress(comp[:32])) + d0 = zlib.decompressobj() + bufs0 = [] + bufs0.append(d0.decompress(comp[:32])) - d1 = d0.copy() - bufs1 = bufs0[:] + d1 = d0.copy() + bufs1 = bufs0[:] - bufs0.append(d0.decompress(comp[32:])) - s0 = b''.join(bufs0) + bufs0.append(d0.decompress(comp[32:])) + s0 = b''.join(bufs0) - bufs1.append(d1.decompress(comp[32:])) - s1 = b''.join(bufs1) + bufs1.append(d1.decompress(comp[32:])) + s1 = b''.join(bufs1) - self.assertEqual(s0,s1) - self.assertEqual(s0,data) + self.assertEqual(s0,s1) + self.assertEqual(s0,data) - def test_baddecompresscopy(self): - # Test copying a compression object in an inconsistent state - data = zlib.compress(HAMLET_SCENE) - d = zlib.decompressobj() - d.decompress(data) - d.flush() - self.assertRaises(ValueError, d.copy) + @requires_Decompress_copy + def test_baddecompresscopy(self): + # Test copying a compression object in an inconsistent state + data = 
zlib.compress(HAMLET_SCENE) + d = zlib.decompressobj() + d.decompress(data) + d.flush() + self.assertRaises(ValueError, d.copy) # Memory use of the following functions takes into account overallocation diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -29,6 +29,8 @@ Tests ----- +- Issue #18702: All skipped tests now reported as skipped. + - Issue #19085: Added basic tests for all tkinter widget options. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 20:32:23 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 20:32:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318702=3A_All_skip?= =?utf-8?q?ped_tests_now_reported_as_skipped=2E?= Message-ID: <3dCS2b1sW0z7LkF@mail.python.org> http://hg.python.org/cpython/rev/09105051b9f4 changeset: 86896:09105051b9f4 parent: 86893:fc1f353b8e7e user: Serhiy Storchaka date: Sun Nov 03 21:31:38 2013 +0200 summary: Issue #18702: All skipped tests now reported as skipped. files: Lib/test/test_array.py | 17 +- Lib/test/test_compileall.py | 3 +- Lib/test/test_csv.py | 135 +++--- Lib/test/test_dbm_dumb.py | 6 +- Lib/test/test_enumerate.py | 3 +- Lib/test/test_ftplib.py | 14 +- Lib/test/test_mailbox.py | 36 +- Lib/test/test_math.py | 59 +- Lib/test/test_mmap.py | 144 +++--- Lib/test/test_nntplib.py | 74 +- Lib/test/test_os.py | 465 +++++++++++---------- Lib/test/test_poplib.py | 146 +++--- Lib/test/test_posix.py | 304 +++++++------ Lib/test/test_set.py | 8 +- Lib/test/test_shutil.py | 119 ++-- Lib/test/test_socket.py | 103 ++-- Lib/test/test_socketserver.py | 85 ++- Lib/test/test_sys.py | 17 +- Lib/test/test_warnings.py | 3 +- Lib/test/test_zlib.py | 159 +++--- Misc/NEWS | 2 + 21 files changed, 972 insertions(+), 930 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -11,6 +11,7 @@ import io import math import struct +import sys import warnings import array @@ -993,15 +994,15 @@ s = None self.assertRaises(ReferenceError, len, p) + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def test_bug_782369(self): - import sys - if hasattr(sys, "getrefcount"): - for i in range(10): - b = array.array('B', range(64)) - rc = sys.getrefcount(10) - for i in range(10): - b = array.array('B', range(64)) - self.assertEqual(rc, sys.getrefcount(10)) + for i in range(10): + b = array.array('B', range(64)) + rc = sys.getrefcount(10) + for i in range(10): + b = array.array('B', range(64)) + self.assertEqual(rc, sys.getrefcount(10)) def test_subclass_with_kwargs(self): # SF bug #1486663 -- this used to erroneously raise a TypeError diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -40,11 +40,10 @@ compare = struct.pack('<4sl', importlib.util.MAGIC_NUMBER, mtime) return data, compare + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def recreation_check(self, metadata): """Check that compileall recreates bytecode when the new metadata is used.""" - if not hasattr(os, 'stat'): - return py_compile.compile(self.source_path) self.assertEqual(*self.data()) with open(self.bc_path, 'rb') as file: diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -905,78 +905,77 @@ dialect = sniffer.sniff(self.sample9) self.assertTrue(dialect.doublequote) -if not hasattr(sys, "gettotalrefcount"): - if 
support.verbose: print("*** skipping leakage tests ***") -else: - class NUL: - def write(s, *args): - pass - writelines = write +class NUL: + def write(s, *args): + pass + writelines = write - class TestLeaks(unittest.TestCase): - def test_create_read(self): - delta = 0 - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - delta = rc-lastrc - lastrc = rc - # if csv.reader() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + at unittest.skipUnless(hasattr(sys, "gettotalrefcount"), + 'requires sys.gettotalrefcount()') +class TestLeaks(unittest.TestCase): + def test_create_read(self): + delta = 0 + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + delta = rc-lastrc + lastrc = rc + # if csv.reader() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_create_write(self): - delta = 0 - lastrc = sys.gettotalrefcount() - s = NUL() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.writer(s) - csv.writer(s) - csv.writer(s) - delta = rc-lastrc - lastrc = rc - # if csv.writer() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + def test_create_write(self): + delta = 0 + lastrc = sys.gettotalrefcount() + s = NUL() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.writer(s) + csv.writer(s) + csv.writer(s) + delta = rc-lastrc + lastrc = rc + # if csv.writer() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_read(self): - delta = 0 - rows = ["a,b,c\r\n"]*5 - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - rdr = csv.reader(rows) - for row in rdr: - pass - delta = rc-lastrc - lastrc = rc - # if reader leaks during read, delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_read(self): + delta = 0 + rows = ["a,b,c\r\n"]*5 + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + rdr = csv.reader(rows) + for row in rdr: + pass + delta = rc-lastrc + lastrc = rc + # if reader leaks during read, delta should be 5 or more + self.assertEqual(delta < 5, True) - def test_write(self): - delta = 0 - rows = [[1,2,3]]*5 - s = NUL() - lastrc = sys.gettotalrefcount() - for i in range(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - writer = csv.writer(s) - for row in rows: - writer.writerow(row) - delta = rc-lastrc - lastrc = rc - # if writer leaks during write, last delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_write(self): + delta = 0 + rows = [[1,2,3]]*5 + s = NUL() + lastrc = sys.gettotalrefcount() + for i in range(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + writer = csv.writer(s) + for row in rows: + writer.writerow(row) + delta = rc-lastrc + lastrc = rc + # if writer leaks during write, last delta should be 5 or more + self.assertEqual(delta < 5, True) class TestUnicode(unittest.TestCase): diff --git a/Lib/test/test_dbm_dumb.py b/Lib/test/test_dbm_dumb.py --- 
a/Lib/test/test_dbm_dumb.py +++ b/Lib/test/test_dbm_dumb.py @@ -37,11 +37,9 @@ self.read_helper(f) f.close() + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'chmod'), 'test needs os.chmod()') def test_dumbdbm_creation_mode(self): - # On platforms without chmod, don't do anything. - if not (hasattr(os, 'chmod') and hasattr(os, 'umask')): - return - try: old_umask = os.umask(0o002) f = dumbdbm.open(_fname, 'c', 0o637) diff --git a/Lib/test/test_enumerate.py b/Lib/test/test_enumerate.py --- a/Lib/test/test_enumerate.py +++ b/Lib/test/test_enumerate.py @@ -202,11 +202,10 @@ self.assertRaises(TypeError, reversed) self.assertRaises(TypeError, reversed, [], 'extra') + @unittest.skipUnless(hasattr(sys, 'getrefcount'), 'test needs sys.getrefcount()') def test_bug1229429(self): # this bug was never in reversed, it was in # PyObject_CallMethod, and reversed_new calls that sometimes. - if not hasattr(sys, "getrefcount"): - return def f(): pass r = f.__reversed__ = object() diff --git a/Lib/test/test_ftplib.py b/Lib/test/test_ftplib.py --- a/Lib/test/test_ftplib.py +++ b/Lib/test/test_ftplib.py @@ -16,7 +16,7 @@ except ImportError: ssl = None -from unittest import TestCase +from unittest import TestCase, skipUnless from test import support from test.support import HOST, HOSTv6 threading = support.import_module('threading') @@ -780,6 +780,7 @@ self.assertRaises(ftplib.Error, self.client.storlines, 'stor', f) + at skipUnless(support.IPV6_ENABLED, "IPv6 not enabled") class TestIPv6Environment(TestCase): def setUp(self): @@ -820,6 +821,7 @@ retr() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClassMixin(TestFTPClass): """Repeat TestFTPClass tests starting the TLS layer for both control and data connections first. @@ -835,6 +837,7 @@ self.client.prot_p() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClass(TestCase): """Specific TLS_FTP class tests.""" @@ -1027,12 +1030,9 @@ def test_main(): - tests = [TestFTPClass, TestTimeouts, TestNetrcDeprecation] - if support.IPV6_ENABLED: - tests.append(TestIPv6Environment) - - if ssl is not None: - tests.extend([TestTLS_FTPClassMixin, TestTLS_FTPClass]) + tests = [TestFTPClass, TestTimeouts, TestNetrcDeprecation, + TestIPv6Environment, + TestTLS_FTPClassMixin, TestTLS_FTPClass] thread_info = support.threading_setup() try: diff --git a/Lib/test/test_mailbox.py b/Lib/test/test_mailbox.py --- a/Lib/test/test_mailbox.py +++ b/Lib/test/test_mailbox.py @@ -868,10 +868,10 @@ for msg in self._box: pass + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_permissions(self): # Verify that message files are created without execute permissions - if not hasattr(os, "stat") or not hasattr(os, "umask"): - return msg = mailbox.MaildirMessage(self._template % 0) orig_umask = os.umask(0) try: @@ -882,12 +882,11 @@ mode = os.stat(path).st_mode self.assertFalse(mode & 0o111) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_folder_file_perms(self): # From bug #3228, we want to verify that the file created inside a Maildir # subfolder isn't marked as executable. 
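Where a whole group of tests shares one precondition, these patches
build a named decorator once at module level (requires_ssl,
requires_unix_sockets, requires_forking, requires_Compress_copy) and
apply it to individual methods or to an entire class, as with the
FTP-over-TLS classes in the test_ftplib.py hunk. A small sketch of
that style (test bodies and names are illustrative, not taken from the
patch):

    import socket
    import unittest

    try:
        import ssl
    except ImportError:
        ssl = None

    # Reusable skip decorators, built once at import time.
    requires_ssl = unittest.skipUnless(ssl, 'SSL not supported')
    requires_unix_sockets = unittest.skipUnless(hasattr(socket, 'AF_UNIX'),
                                                'requires Unix sockets')

    @requires_ssl
    class TLSTests(unittest.TestCase):
        # The whole class is reported as skipped when ssl is missing.
        def test_ssl_module_present(self):
            self.assertIsNotNone(ssl)

    class MixedTests(unittest.TestCase):
        @requires_unix_sockets
        def test_af_unix_constant(self):
            self.assertTrue(hasattr(socket, 'AF_UNIX'))

    if __name__ == '__main__':
        unittest.main()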
- if not hasattr(os, "stat") or not hasattr(os, "umask"): - return - orig_umask = os.umask(0) try: subfolder = self._box.add_folder('subfolder') @@ -1097,24 +1096,25 @@ _factory = lambda self, path, factory=None: mailbox.mbox(path, factory) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_perms(self): # From bug #3228, we want to verify that the mailbox file isn't executable, # even if the umask is set to something that would leave executable bits set. # We only run this test on platforms that support umask. - if hasattr(os, 'umask') and hasattr(os, 'stat'): - try: - old_umask = os.umask(0o077) - self._box.close() - os.unlink(self._path) - self._box = mailbox.mbox(self._path, create=True) - self._box.add('') - self._box.close() - finally: - os.umask(old_umask) + try: + old_umask = os.umask(0o077) + self._box.close() + os.unlink(self._path) + self._box = mailbox.mbox(self._path, create=True) + self._box.add('') + self._box.close() + finally: + os.umask(old_umask) - st = os.stat(self._path) - perms = st.st_mode - self.assertFalse((perms & 0o111)) # Execute bits should all be off. + st = os.stat(self._path) + perms = st.st_mode + self.assertFalse((perms & 0o111)) # Execute bits should all be off. def test_terminating_newline(self): message = email.message.Message() diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -980,38 +980,37 @@ # still fails this part of the test on some platforms. For now, we only # *run* test_exceptions() in verbose mode, so that this isn't normally # tested. + @unittest.skipUnless(verbose, 'requires verbose mode') + def test_exceptions(self): + try: + x = math.exp(-1000000000) + except: + # mathmodule.c is failing to weed out underflows from libm, or + # we've got an fp format with huge dynamic range + self.fail("underflowing exp() should not have raised " + "an exception") + if x != 0: + self.fail("underflowing exp() should have returned 0") - if verbose: - def test_exceptions(self): - try: - x = math.exp(-1000000000) - except: - # mathmodule.c is failing to weed out underflows from libm, or - # we've got an fp format with huge dynamic range - self.fail("underflowing exp() should not have raised " - "an exception") - if x != 0: - self.fail("underflowing exp() should have returned 0") + # If this fails, probably using a strict IEEE-754 conforming libm, and x + # is +Inf afterwards. But Python wants overflows detected by default. + try: + x = math.exp(1000000000) + except OverflowError: + pass + else: + self.fail("overflowing exp() didn't trigger OverflowError") - # If this fails, probably using a strict IEEE-754 conforming libm, and x - # is +Inf afterwards. But Python wants overflows detected by default. - try: - x = math.exp(1000000000) - except OverflowError: - pass - else: - self.fail("overflowing exp() didn't trigger OverflowError") - - # If this fails, it could be a puzzle. One odd possibility is that - # mathmodule.c's macros are getting confused while comparing - # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE - # as a result (and so raising OverflowError instead). - try: - x = math.sqrt(-1.0) - except ValueError: - pass - else: - self.fail("sqrt(-1) didn't raise ValueError") + # If this fails, it could be a puzzle. 
One odd possibility is that + # mathmodule.c's macros are getting confused while comparing + # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE + # as a result (and so raising OverflowError instead). + try: + x = math.sqrt(-1.0) + except ValueError: + pass + else: + self.fail("sqrt(-1) didn't raise ValueError") @requires_IEEE_754 def test_testfile(self): diff --git a/Lib/test/test_mmap.py b/Lib/test/test_mmap.py --- a/Lib/test/test_mmap.py +++ b/Lib/test/test_mmap.py @@ -315,26 +315,25 @@ mf.close() f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_entire_file(self): # test mapping of entire file by passing 0 for map length - if hasattr(os, "stat"): - f = open(TESTFN, "wb+") + f = open(TESTFN, "wb+") - f.write(2**16 * b'm') # Arbitrary character - f.close() + f.write(2**16 * b'm') # Arbitrary character + f.close() - f = open(TESTFN, "rb+") - mf = mmap.mmap(f.fileno(), 0) - self.assertEqual(len(mf), 2**16, "Map size should equal file size.") - self.assertEqual(mf.read(2**16), 2**16 * b"m") - mf.close() - f.close() + f = open(TESTFN, "rb+") + mf = mmap.mmap(f.fileno(), 0) + self.assertEqual(len(mf), 2**16, "Map size should equal file size.") + self.assertEqual(mf.read(2**16), 2**16 * b"m") + mf.close() + f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_offset(self): # Issue #10916: test mapping of remainder of file by passing 0 for # map length with an offset doesn't cause a segfault. - if not hasattr(os, "stat"): - self.skipTest("needs os.stat") # NOTE: allocation granularity is currently 65536 under Win64, # and therefore the minimum offset alignment. with open(TESTFN, "wb") as f: @@ -344,12 +343,10 @@ with mmap.mmap(f.fileno(), 0, offset=65536, access=mmap.ACCESS_READ) as mf: self.assertRaises(IndexError, mf.__getitem__, 80000) + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_large_offset(self): # Issue #10959: test mapping of a file by passing 0 for # map length with a large offset doesn't cause a segfault. 
- if not hasattr(os, "stat"): - self.skipTest("needs os.stat") - with open(TESTFN, "wb") as f: f.write(115699 * b'm') # Arbitrary character @@ -561,9 +558,8 @@ return mmap.mmap.__new__(klass, -1, *args, **kwargs) anon_mmap(PAGESIZE) + @unittest.skipUnless(hasattr(mmap, 'PROT_READ'), "needs mmap.PROT_READ") def test_prot_readonly(self): - if not hasattr(mmap, 'PROT_READ'): - return mapsize = 10 with open(TESTFN, "wb") as fp: fp.write(b"a"*mapsize) @@ -617,67 +613,69 @@ self.assertEqual(m.read_byte(), b) m.close() - if os.name == 'nt': - def test_tagname(self): - data1 = b"0123456789" - data2 = b"abcdefghij" - assert len(data1) == len(data2) + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_tagname(self): + data1 = b"0123456789" + data2 = b"abcdefghij" + assert len(data1) == len(data2) - # Test same tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="foo") - m2[:] = data2 - self.assertEqual(m1[:], data2) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test same tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="foo") + m2[:] = data2 + self.assertEqual(m1[:], data2) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - # Test different tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="boo") - m2[:] = data2 - self.assertEqual(m1[:], data1) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test different tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="boo") + m2[:] = data2 + self.assertEqual(m1[:], data1) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - def test_crasher_on_windows(self): - # Should not crash (Issue 1733986) - m = mmap.mmap(-1, 1000, tagname="foo") - try: - mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size - except: - pass - m.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_crasher_on_windows(self): + # Should not crash (Issue 1733986) + m = mmap.mmap(-1, 1000, tagname="foo") + try: + mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size + except: + pass + m.close() - # Should not crash (Issue 5385) - with open(TESTFN, "wb") as fp: - fp.write(b"x"*10) - f = open(TESTFN, "r+b") - m = mmap.mmap(f.fileno(), 0) - f.close() - try: - m.resize(0) # will raise OSError - except: - pass - try: - m[:] - except: - pass - m.close() + # Should not crash (Issue 5385) + with open(TESTFN, "wb") as fp: + fp.write(b"x"*10) + f = open(TESTFN, "r+b") + m = mmap.mmap(f.fileno(), 0) + f.close() + try: + m.resize(0) # will raise OSError + except: + pass + try: + m[:] + except: + pass + m.close() - def test_invalid_descriptor(self): - # socket file descriptors are valid, but out of range - # for _get_osfhandle, causing a crash when validating the - # parameters to _get_osfhandle. - s = socket.socket() - try: - with self.assertRaises(OSError): - m = mmap.mmap(s.fileno(), 10) - finally: - s.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_invalid_descriptor(self): + # socket file descriptors are valid, but out of range + # for _get_osfhandle, causing a crash when validating the + # parameters to _get_osfhandle. 
+ s = socket.socket() + try: + with self.assertRaises(OSError): + m = mmap.mmap(s.fileno(), 10) + finally: + s.close() def test_context_manager(self): with mmap.mmap(-1, 10) as m: diff --git a/Lib/test/test_nntplib.py b/Lib/test/test_nntplib.py --- a/Lib/test/test_nntplib.py +++ b/Lib/test/test_nntplib.py @@ -6,10 +6,12 @@ import functools import contextlib from test import support -from nntplib import NNTP, GroupInfo, _have_ssl +from nntplib import NNTP, GroupInfo import nntplib -if _have_ssl: +try: import ssl +except ImportError: + ssl = None TIMEOUT = 30 @@ -199,23 +201,23 @@ resp, caps = self.server.capabilities() _check_caps(caps) - if _have_ssl: - def test_starttls(self): - file = self.server.file - sock = self.server.sock - try: - self.server.starttls() - except nntplib.NNTPPermanentError: - self.skipTest("STARTTLS not supported by server.") - else: - # Check that the socket and internal pseudo-file really were - # changed. - self.assertNotEqual(file, self.server.file) - self.assertNotEqual(sock, self.server.sock) - # Check that the new socket really is an SSL one - self.assertIsInstance(self.server.sock, ssl.SSLSocket) - # Check that trying starttls when it's already active fails. - self.assertRaises(ValueError, self.server.starttls) + @unittest.skipUnless(ssl, 'requires SSL support') + def test_starttls(self): + file = self.server.file + sock = self.server.sock + try: + self.server.starttls() + except nntplib.NNTPPermanentError: + self.skipTest("STARTTLS not supported by server.") + else: + # Check that the socket and internal pseudo-file really were + # changed. + self.assertNotEqual(file, self.server.file) + self.assertNotEqual(sock, self.server.sock) + # Check that the new socket really is an SSL one + self.assertIsInstance(self.server.sock, ssl.SSLSocket) + # Check that trying starttls when it's already active fails. + self.assertRaises(ValueError, self.server.starttls) def test_zlogin(self): # This test must be the penultimate because further commands will be @@ -300,25 +302,24 @@ if cls.server is not None: cls.server.quit() + at unittest.skipUnless(ssl, 'requires SSL support') +class NetworkedNNTP_SSLTests(NetworkedNNTPTests): -if _have_ssl: - class NetworkedNNTP_SSLTests(NetworkedNNTPTests): + # Technical limits for this public NNTP server (see http://www.aioe.org): + # "Only two concurrent connections per IP address are allowed and + # 400 connections per day are accepted from each IP address." - # Technical limits for this public NNTP server (see http://www.aioe.org): - # "Only two concurrent connections per IP address are allowed and - # 400 connections per day are accepted from each IP address." + NNTP_HOST = 'nntp.aioe.org' + GROUP_NAME = 'comp.lang.python' + GROUP_PAT = 'comp.lang.*' - NNTP_HOST = 'nntp.aioe.org' - GROUP_NAME = 'comp.lang.python' - GROUP_PAT = 'comp.lang.*' + NNTP_CLASS = getattr(nntplib, 'NNTP_SSL', None) - NNTP_CLASS = nntplib.NNTP_SSL + # Disabled as it produces too much data + test_list = None - # Disabled as it produces too much data - test_list = None - - # Disabled as the connection will already be encrypted. - test_starttls = None + # Disabled as the connection will already be encrypted. 
+ test_starttls = None # @@ -1407,12 +1408,13 @@ gives(2000, 6, 23, "000623", "000000") gives(2010, 6, 5, "100605", "000000") + @unittest.skipUnless(ssl, 'requires SSL support') + def test_ssl_support(self): + self.assertTrue(hasattr(nntplib, 'NNTP_SSL')) def test_main(): tests = [MiscTests, NNTPv1Tests, NNTPv2Tests, CapsAfterLoginNNTPv2Tests, - SendReaderNNTPv2Tests, NetworkedNNTPTests] - if _have_ssl: - tests.append(NetworkedNNTP_SSLTests) + SendReaderNNTPv2Tests, NetworkedNNTPTests, NetworkedNNTP_SSLTests] support.run_unittest(*tests) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -185,10 +185,8 @@ os.unlink(self.fname) os.rmdir(support.TESTFN) + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def check_stat_attributes(self, fname): - if not hasattr(os, "stat"): - return - result = os.stat(fname) # Make sure direct access works @@ -272,10 +270,8 @@ unpickled = pickle.loads(p) self.assertEqual(result, unpickled) + @unittest.skipUnless(hasattr(os, 'statvfs'), 'test needs os.statvfs()') def test_statvfs_attributes(self): - if not hasattr(os, "statvfs"): - return - try: result = os.statvfs(self.fname) except OSError as e: @@ -479,10 +475,10 @@ os.close(dirfd) self._test_utime_subsecond(set_time) - # Restrict test to Win32, since there is no guarantee other + # Restrict tests to Win32, since there is no guarantee other # systems support centiseconds - if sys.platform == 'win32': - def get_file_system(path): + def get_file_system(path): + if sys.platform == 'win32': root = os.path.splitdrive(os.path.abspath(path))[0] + '\\' import ctypes kernel32 = ctypes.windll.kernel32 @@ -490,38 +486,45 @@ if kernel32.GetVolumeInformationW(root, None, 0, None, None, None, buf, len(buf)): return buf.value - if get_file_system(support.TESTFN) == "NTFS": - def test_1565150(self): - t1 = 1159195039.25 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_1565150(self): + t1 = 1159195039.25 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_large_time(self): - t1 = 5000000000 # some day in 2128 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_large_time(self): + t1 = 5000000000 # some day in 2128 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_1686475(self): - # Verify that an open file can be stat'ed - try: - os.stat(r"c:\pagefile.sys") - except FileNotFoundError: - pass # file does not exist; cannot run test - except OSError as e: - self.fail("Could not stat pagefile.sys") + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + def test_1686475(self): + # Verify that an open file can be stat'ed + try: + os.stat(r"c:\pagefile.sys") + except FileNotFoundError: + pass # file does not exist; cannot run test + except OSError as e: + self.fail("Could not stat pagefile.sys") - @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") - def test_15261(self): - # Verify that stat'ing a closed fd does not cause crash - r, w = os.pipe() - try: - os.stat(r) # should not raise error - finally: - os.close(r) - os.close(w) - with 
self.assertRaises(OSError) as ctx: - os.stat(r) - self.assertEqual(ctx.exception.errno, errno.EBADF) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") + def test_15261(self): + # Verify that stat'ing a closed fd does not cause crash + r, w = os.pipe() + try: + os.stat(r) # should not raise error + finally: + os.close(r) + os.close(w) + with self.assertRaises(OSError) as ctx: + os.stat(r) + self.assertEqual(ctx.exception.errno, errno.EBADF) from test import mapping_tests @@ -1167,6 +1170,7 @@ self._test_internal_execvpe(bytes) + at unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32ErrorTests(unittest.TestCase): def test_rename(self): self.assertRaises(OSError, os.rename, support.TESTFN, support.TESTFN+".bak") @@ -1213,63 +1217,63 @@ self.fail("%r didn't raise a OSError with a bad file descriptor" % f) + @unittest.skipUnless(hasattr(os, 'isatty'), 'test needs os.isatty()') def test_isatty(self): - if hasattr(os, "isatty"): - self.assertEqual(os.isatty(support.make_bad_fd()), False) + self.assertEqual(os.isatty(support.make_bad_fd()), False) + @unittest.skipUnless(hasattr(os, 'closerange'), 'test needs os.closerange()') def test_closerange(self): - if hasattr(os, "closerange"): - fd = support.make_bad_fd() - # Make sure none of the descriptors we are about to close are - # currently valid (issue 6542). - for i in range(10): - try: os.fstat(fd+i) - except OSError: - pass - else: - break - if i < 2: - raise unittest.SkipTest( - "Unable to acquire a range of invalid file descriptors") - self.assertEqual(os.closerange(fd, fd + i-1), None) + fd = support.make_bad_fd() + # Make sure none of the descriptors we are about to close are + # currently valid (issue 6542). 
+ for i in range(10): + try: os.fstat(fd+i) + except OSError: + pass + else: + break + if i < 2: + raise unittest.SkipTest( + "Unable to acquire a range of invalid file descriptors") + self.assertEqual(os.closerange(fd, fd + i-1), None) + @unittest.skipUnless(hasattr(os, 'dup2'), 'test needs os.dup2()') def test_dup2(self): - if hasattr(os, "dup2"): - self.check(os.dup2, 20) + self.check(os.dup2, 20) + @unittest.skipUnless(hasattr(os, 'fchmod'), 'test needs os.fchmod()') def test_fchmod(self): - if hasattr(os, "fchmod"): - self.check(os.fchmod, 0) + self.check(os.fchmod, 0) + @unittest.skipUnless(hasattr(os, 'fchown'), 'test needs os.fchown()') def test_fchown(self): - if hasattr(os, "fchown"): - self.check(os.fchown, -1, -1) + self.check(os.fchown, -1, -1) + @unittest.skipUnless(hasattr(os, 'fpathconf'), 'test needs os.fpathconf()') def test_fpathconf(self): - if hasattr(os, "fpathconf"): - self.check(os.pathconf, "PC_NAME_MAX") - self.check(os.fpathconf, "PC_NAME_MAX") + self.check(os.pathconf, "PC_NAME_MAX") + self.check(os.fpathconf, "PC_NAME_MAX") + @unittest.skipUnless(hasattr(os, 'ftruncate'), 'test needs os.ftruncate()') def test_ftruncate(self): - if hasattr(os, "ftruncate"): - self.check(os.truncate, 0) - self.check(os.ftruncate, 0) + self.check(os.truncate, 0) + self.check(os.ftruncate, 0) + @unittest.skipUnless(hasattr(os, 'lseek'), 'test needs os.lseek()') def test_lseek(self): - if hasattr(os, "lseek"): - self.check(os.lseek, 0, 0) + self.check(os.lseek, 0, 0) + @unittest.skipUnless(hasattr(os, 'read'), 'test needs os.read()') def test_read(self): - if hasattr(os, "read"): - self.check(os.read, 1) + self.check(os.read, 1) + @unittest.skipUnless(hasattr(os, 'tcsetpgrp'), 'test needs os.tcsetpgrp()') def test_tcsetpgrpt(self): - if hasattr(os, "tcsetpgrp"): - self.check(os.tcsetpgrp, 0) + self.check(os.tcsetpgrp, 0) + @unittest.skipUnless(hasattr(os, 'write'), 'test needs os.write()') def test_write(self): - if hasattr(os, "write"): - self.check(os.write, b" ") + self.check(os.write, b" ") class LinkTests(unittest.TestCase): @@ -1309,138 +1313,133 @@ self.file2 = self.file1 + "2" self._test_link(self.file1, self.file2) -if sys.platform != 'win32': - class Win32ErrorTests(unittest.TestCase): - pass + at unittest.skipIf(sys.platform == "win32", "Posix specific tests") +class PosixUidGidTests(unittest.TestCase): + @unittest.skipUnless(hasattr(os, 'setuid'), 'test needs os.setuid()') + def test_setuid(self): + if os.getuid() != 0: + self.assertRaises(OSError, os.setuid, 0) + self.assertRaises(OverflowError, os.setuid, 1<<32) - class PosixUidGidTests(unittest.TestCase): - if hasattr(os, 'setuid'): - def test_setuid(self): - if os.getuid() != 0: - self.assertRaises(OSError, os.setuid, 0) - self.assertRaises(OverflowError, os.setuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setgid'), 'test needs os.setgid()') + def test_setgid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(OSError, os.setgid, 0) + self.assertRaises(OverflowError, os.setgid, 1<<32) - if hasattr(os, 'setgid'): - def test_setgid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(OSError, os.setgid, 0) - self.assertRaises(OverflowError, os.setgid, 1<<32) + @unittest.skipUnless(hasattr(os, 'seteuid'), 'test needs os.seteuid()') + def test_seteuid(self): + if os.getuid() != 0: + self.assertRaises(OSError, os.seteuid, 0) + self.assertRaises(OverflowError, os.seteuid, 1<<32) - if hasattr(os, 'seteuid'): - def test_seteuid(self): - if os.getuid() != 0: - 
self.assertRaises(OSError, os.seteuid, 0) - self.assertRaises(OverflowError, os.seteuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setegid'), 'test needs os.setegid()') + def test_setegid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(OSError, os.setegid, 0) + self.assertRaises(OverflowError, os.setegid, 1<<32) - if hasattr(os, 'setegid'): - def test_setegid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(OSError, os.setegid, 0) - self.assertRaises(OverflowError, os.setegid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + def test_setreuid(self): + if os.getuid() != 0: + self.assertRaises(OSError, os.setreuid, 0, 0) + self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) + self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) - if hasattr(os, 'setreuid'): - def test_setreuid(self): - if os.getuid() != 0: - self.assertRaises(OSError, os.setreuid, 0, 0) - self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) - self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) + @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + def test_setreuid_neg1(self): + # Needs to accept -1. We run this in a subprocess to avoid + # altering the test runner's process state (issue8045). + subprocess.check_call([ + sys.executable, '-c', + 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) - def test_setreuid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). - subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) + @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + def test_setregid(self): + if os.getuid() != 0 and not HAVE_WHEEL_GROUP: + self.assertRaises(OSError, os.setregid, 0, 0) + self.assertRaises(OverflowError, os.setregid, 1<<32, 0) + self.assertRaises(OverflowError, os.setregid, 0, 1<<32) - if hasattr(os, 'setregid'): - def test_setregid(self): - if os.getuid() != 0 and not HAVE_WHEEL_GROUP: - self.assertRaises(OSError, os.setregid, 0, 0) - self.assertRaises(OverflowError, os.setregid, 1<<32, 0) - self.assertRaises(OverflowError, os.setregid, 0, 1<<32) + @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + def test_setregid_neg1(self): + # Needs to accept -1. We run this in a subprocess to avoid + # altering the test runner's process state (issue8045). + subprocess.check_call([ + sys.executable, '-c', + 'import os,sys;os.setregid(-1,-1);sys.exit(0)']) - def test_setregid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). 
- subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setregid(-1,-1);sys.exit(0)']) + at unittest.skipIf(sys.platform == "win32", "Posix specific tests") +class Pep383Tests(unittest.TestCase): + def setUp(self): + if support.TESTFN_UNENCODABLE: + self.dir = support.TESTFN_UNENCODABLE + elif support.TESTFN_NONASCII: + self.dir = support.TESTFN_NONASCII + else: + self.dir = support.TESTFN + self.bdir = os.fsencode(self.dir) - class Pep383Tests(unittest.TestCase): - def setUp(self): - if support.TESTFN_UNENCODABLE: - self.dir = support.TESTFN_UNENCODABLE - elif support.TESTFN_NONASCII: - self.dir = support.TESTFN_NONASCII - else: - self.dir = support.TESTFN - self.bdir = os.fsencode(self.dir) + bytesfn = [] + def add_filename(fn): + try: + fn = os.fsencode(fn) + except UnicodeEncodeError: + return + bytesfn.append(fn) + add_filename(support.TESTFN_UNICODE) + if support.TESTFN_UNENCODABLE: + add_filename(support.TESTFN_UNENCODABLE) + if support.TESTFN_NONASCII: + add_filename(support.TESTFN_NONASCII) + if not bytesfn: + self.skipTest("couldn't create any non-ascii filename") - bytesfn = [] - def add_filename(fn): - try: - fn = os.fsencode(fn) - except UnicodeEncodeError: - return - bytesfn.append(fn) - add_filename(support.TESTFN_UNICODE) - if support.TESTFN_UNENCODABLE: - add_filename(support.TESTFN_UNENCODABLE) - if support.TESTFN_NONASCII: - add_filename(support.TESTFN_NONASCII) - if not bytesfn: - self.skipTest("couldn't create any non-ascii filename") + self.unicodefn = set() + os.mkdir(self.dir) + try: + for fn in bytesfn: + support.create_empty_file(os.path.join(self.bdir, fn)) + fn = os.fsdecode(fn) + if fn in self.unicodefn: + raise ValueError("duplicate filename") + self.unicodefn.add(fn) + except: + shutil.rmtree(self.dir) + raise - self.unicodefn = set() - os.mkdir(self.dir) - try: - for fn in bytesfn: - support.create_empty_file(os.path.join(self.bdir, fn)) - fn = os.fsdecode(fn) - if fn in self.unicodefn: - raise ValueError("duplicate filename") - self.unicodefn.add(fn) - except: - shutil.rmtree(self.dir) - raise + def tearDown(self): + shutil.rmtree(self.dir) - def tearDown(self): - shutil.rmtree(self.dir) + def test_listdir(self): + expected = self.unicodefn + found = set(os.listdir(self.dir)) + self.assertEqual(found, expected) + # test listdir without arguments + current_directory = os.getcwd() + try: + os.chdir(os.sep) + self.assertEqual(set(os.listdir()), set(os.listdir(os.sep))) + finally: + os.chdir(current_directory) - def test_listdir(self): - expected = self.unicodefn - found = set(os.listdir(self.dir)) - self.assertEqual(found, expected) - # test listdir without arguments - current_directory = os.getcwd() - try: - os.chdir(os.sep) - self.assertEqual(set(os.listdir()), set(os.listdir(os.sep))) - finally: - os.chdir(current_directory) + def test_open(self): + for fn in self.unicodefn: + f = open(os.path.join(self.dir, fn), 'rb') + f.close() - def test_open(self): - for fn in self.unicodefn: - f = open(os.path.join(self.dir, fn), 'rb') - f.close() + @unittest.skipUnless(hasattr(os, 'statvfs'), + "need os.statvfs()") + def test_statvfs(self): + # issue #9645 + for fn in self.unicodefn: + # should not fail with file not found error + fullname = os.path.join(self.dir, fn) + os.statvfs(fullname) - @unittest.skipUnless(hasattr(os, 'statvfs'), - "need os.statvfs()") - def test_statvfs(self): - # issue #9645 - for fn in self.unicodefn: - # should not fail with file not found error - fullname = os.path.join(self.dir, fn) - os.statvfs(fullname) - - def 
test_stat(self): - for fn in self.unicodefn: - os.stat(os.path.join(self.dir, fn)) -else: - class PosixUidGidTests(unittest.TestCase): - pass - class Pep383Tests(unittest.TestCase): - pass + def test_stat(self): + for fn in self.unicodefn: + os.stat(os.path.join(self.dir, fn)) @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32KillTests(unittest.TestCase): @@ -1924,6 +1923,8 @@ SUPPORT_HEADERS_TRAILERS = not sys.platform.startswith("linux") and \ not sys.platform.startswith("solaris") and \ not sys.platform.startswith("sunos") + requires_headers_trailers = unittest.skipUnless(SUPPORT_HEADERS_TRAILERS, + 'requires headers and trailers support') @classmethod def setUpClass(cls): @@ -2042,52 +2043,54 @@ # --- headers / trailers tests - if SUPPORT_HEADERS_TRAILERS: + @requires_headers_trailers + def test_headers(self): + total_sent = 0 + sent = os.sendfile(self.sockno, self.fileno, 0, 4096, + headers=[b"x" * 512]) + total_sent += sent + offset = 4096 + nbytes = 4096 + while 1: + sent = self.sendfile_wrapper(self.sockno, self.fileno, + offset, nbytes) + if sent == 0: + break + total_sent += sent + offset += sent - def test_headers(self): - total_sent = 0 - sent = os.sendfile(self.sockno, self.fileno, 0, 4096, - headers=[b"x" * 512]) - total_sent += sent - offset = 4096 - nbytes = 4096 - while 1: - sent = self.sendfile_wrapper(self.sockno, self.fileno, - offset, nbytes) - if sent == 0: - break - total_sent += sent - offset += sent + expected_data = b"x" * 512 + self.DATA + self.assertEqual(total_sent, len(expected_data)) + self.client.close() + self.server.wait() + data = self.server.handler_instance.get_data() + self.assertEqual(hash(data), hash(expected_data)) - expected_data = b"x" * 512 + self.DATA - self.assertEqual(total_sent, len(expected_data)) + @requires_headers_trailers + def test_trailers(self): + TESTFN2 = support.TESTFN + "2" + file_data = b"abcdef" + with open(TESTFN2, 'wb') as f: + f.write(file_data) + with open(TESTFN2, 'rb')as f: + self.addCleanup(os.remove, TESTFN2) + os.sendfile(self.sockno, f.fileno(), 0, len(file_data), + trailers=[b"1234"]) self.client.close() self.server.wait() data = self.server.handler_instance.get_data() - self.assertEqual(hash(data), hash(expected_data)) + self.assertEqual(data, b"abcdef1234") - def test_trailers(self): - TESTFN2 = support.TESTFN + "2" - file_data = b"abcdef" - with open(TESTFN2, 'wb') as f: - f.write(file_data) - with open(TESTFN2, 'rb')as f: - self.addCleanup(os.remove, TESTFN2) - os.sendfile(self.sockno, f.fileno(), 0, len(file_data), - trailers=[b"1234"]) - self.client.close() - self.server.wait() - data = self.server.handler_instance.get_data() - self.assertEqual(data, b"abcdef1234") - - if hasattr(os, "SF_NODISKIO"): - def test_flags(self): - try: - os.sendfile(self.sockno, self.fileno, 0, 4096, - flags=os.SF_NODISKIO) - except OSError as err: - if err.errno not in (errno.EBUSY, errno.EAGAIN): - raise + @requires_headers_trailers + @unittest.skipUnless(hasattr(os, 'SF_NODISKIO'), + 'test needs os.SF_NODISKIO') + def test_flags(self): + try: + os.sendfile(self.sockno, self.fileno, 0, 4096, + flags=os.SF_NODISKIO) + except OSError as err: + if err.errno not in (errno.EBUSY, errno.EAGAIN): + raise def supports_extended_attributes(): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -11,7 +11,7 @@ import time import errno -from unittest import TestCase +from unittest import TestCase, skipUnless from test import support as 
test_support threading = test_support.import_module('threading') @@ -24,6 +24,7 @@ SUPPORTS_SSL = True CERTFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "keycert.pem") +requires_ssl = skipUnless(SUPPORTS_SSL, 'SSL not supported') # the dummy data returned by server when LIST and RETR commands are issued LIST_RESP = b'1 1\r\n2 2\r\n3 3\r\n4 4\r\n5 5\r\n.\r\n' @@ -316,22 +317,23 @@ self.assertIsNone(self.client.sock) self.assertIsNone(self.client.file) - if SUPPORTS_SSL: + @requires_ssl + def test_stls_capa(self): + capa = self.client.capa() + self.assertTrue('STLS' in capa.keys()) - def test_stls_capa(self): - capa = self.client.capa() - self.assertTrue('STLS' in capa.keys()) + @requires_ssl + def test_stls(self): + expected = b'+OK Begin TLS negotiation' + resp = self.client.stls() + self.assertEqual(resp, expected) - def test_stls(self): - expected = b'+OK Begin TLS negotiation' - resp = self.client.stls() - self.assertEqual(resp, expected) - - def test_stls_context(self): - expected = b'+OK Begin TLS negotiation' - ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) - resp = self.client.stls(context=ctx) - self.assertEqual(resp, expected) + @requires_ssl + def test_stls_context(self): + expected = b'+OK Begin TLS negotiation' + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + resp = self.client.stls(context=ctx) + self.assertEqual(resp, expected) if SUPPORTS_SSL: @@ -354,73 +356,75 @@ self.push('+OK dummy pop3 server ready. ') - class TestPOP3_SSLClass(TestPOP3Class): - # repeat previous tests by using poplib.POP3_SSL + at requires_ssl +class TestPOP3_SSLClass(TestPOP3Class): + # repeat previous tests by using poplib.POP3_SSL - def setUp(self): - self.server = DummyPOP3Server((HOST, PORT)) - self.server.handler = DummyPOP3_SSLHandler - self.server.start() - self.client = poplib.POP3_SSL(self.server.host, self.server.port) + def setUp(self): + self.server = DummyPOP3Server((HOST, PORT)) + self.server.handler = DummyPOP3_SSLHandler + self.server.start() + self.client = poplib.POP3_SSL(self.server.host, self.server.port) - def test__all__(self): - self.assertIn('POP3_SSL', poplib.__all__) + def test__all__(self): + self.assertIn('POP3_SSL', poplib.__all__) - def test_context(self): - ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, keyfile=CERTFILE, context=ctx) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, certfile=CERTFILE, context=ctx) - self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, - self.server.port, keyfile=CERTFILE, - certfile=CERTFILE, context=ctx) + def test_context(self): + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, keyfile=CERTFILE, context=ctx) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, certfile=CERTFILE, context=ctx) + self.assertRaises(ValueError, poplib.POP3_SSL, self.server.host, + self.server.port, keyfile=CERTFILE, + certfile=CERTFILE, context=ctx) - self.client.quit() - self.client = poplib.POP3_SSL(self.server.host, self.server.port, - context=ctx) - self.assertIsInstance(self.client.sock, ssl.SSLSocket) - self.assertIs(self.client.sock.context, ctx) - self.assertTrue(self.client.noop().startswith(b'+OK')) + self.client.quit() + self.client = poplib.POP3_SSL(self.server.host, self.server.port, + context=ctx) + self.assertIsInstance(self.client.sock, ssl.SSLSocket) + self.assertIs(self.client.sock.context, ctx) 
+ self.assertTrue(self.client.noop().startswith(b'+OK')) - def test_stls(self): - self.assertRaises(poplib.error_proto, self.client.stls) + def test_stls(self): + self.assertRaises(poplib.error_proto, self.client.stls) - test_stls_context = test_stls + test_stls_context = test_stls - def test_stls_capa(self): - capa = self.client.capa() - self.assertFalse('STLS' in capa.keys()) + def test_stls_capa(self): + capa = self.client.capa() + self.assertFalse('STLS' in capa.keys()) - class TestPOP3_TLSClass(TestPOP3Class): - # repeat previous tests by using poplib.POP3.stls() + at requires_ssl +class TestPOP3_TLSClass(TestPOP3Class): + # repeat previous tests by using poplib.POP3.stls() - def setUp(self): - self.server = DummyPOP3Server((HOST, PORT)) - self.server.start() - self.client = poplib.POP3(self.server.host, self.server.port, timeout=3) - self.client.stls() + def setUp(self): + self.server = DummyPOP3Server((HOST, PORT)) + self.server.start() + self.client = poplib.POP3(self.server.host, self.server.port, timeout=3) + self.client.stls() - def tearDown(self): - if self.client.file is not None and self.client.sock is not None: - try: - self.client.quit() - except poplib.error_proto: - # happens in the test_too_long_lines case; the overlong - # response will be treated as response to QUIT and raise - # this exception - pass - self.server.stop() + def tearDown(self): + if self.client.file is not None and self.client.sock is not None: + try: + self.client.quit() + except poplib.error_proto: + # happens in the test_too_long_lines case; the overlong + # response will be treated as response to QUIT and raise + # this exception + pass + self.server.stop() - def test_stls(self): - self.assertRaises(poplib.error_proto, self.client.stls) + def test_stls(self): + self.assertRaises(poplib.error_proto, self.client.stls) - test_stls_context = test_stls + test_stls_context = test_stls - def test_stls_capa(self): - capa = self.client.capa() - self.assertFalse(b'STLS' in capa.keys()) + def test_stls_capa(self): + capa = self.client.capa() + self.assertFalse(b'STLS' in capa.keys()) class TestTimeouts(TestCase): @@ -478,10 +482,8 @@ def test_main(): - tests = [TestPOP3Class, TestTimeouts] - if SUPPORTS_SSL: - tests.append(TestPOP3_SSLClass) - tests.append(TestPOP3_TLSClass) + tests = [TestPOP3Class, TestTimeouts, + TestPOP3_SSLClass, TestPOP3_TLSClass] thread_info = test_support.threading_setup() try: test_support.run_unittest(*tests) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -54,47 +54,55 @@ posix_func() self.assertRaises(TypeError, posix_func, 1) - if hasattr(posix, 'getresuid'): - def test_getresuid(self): - user_ids = posix.getresuid() - self.assertEqual(len(user_ids), 3) - for val in user_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresuid'), + 'test needs posix.getresuid()') + def test_getresuid(self): + user_ids = posix.getresuid() + self.assertEqual(len(user_ids), 3) + for val in user_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 'getresgid'): - def test_getresgid(self): - group_ids = posix.getresgid() - self.assertEqual(len(group_ids), 3) - for val in group_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresgid'), + 'test needs posix.getresgid()') + def test_getresgid(self): + group_ids = posix.getresgid() + self.assertEqual(len(group_ids), 3) + for val in group_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 'setresuid'): - def 
test_setresuid(self): - current_user_ids = posix.getresuid() - self.assertIsNone(posix.setresuid(*current_user_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresuid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid(self): + current_user_ids = posix.getresuid() + self.assertIsNone(posix.setresuid(*current_user_ids)) + # -1 means don't change that value. + self.assertIsNone(posix.setresuid(-1, -1, -1)) - def test_setresuid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_user_ids = posix.getresuid() - if 0 not in current_user_ids: - new_user_ids = (current_user_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresuid, *new_user_ids) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid_exception(self): + # Don't do this test if someone is silly enough to run us as root. + current_user_ids = posix.getresuid() + if 0 not in current_user_ids: + new_user_ids = (current_user_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresuid, *new_user_ids) - if hasattr(posix, 'setresgid'): - def test_setresgid(self): - current_group_ids = posix.getresgid() - self.assertIsNone(posix.setresgid(*current_group_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresgid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid(self): + current_group_ids = posix.getresgid() + self.assertIsNone(posix.setresgid(*current_group_ids)) + # -1 means don't change that value. + self.assertIsNone(posix.setresgid(-1, -1, -1)) - def test_setresgid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_group_ids = posix.getresgid() - if 0 not in current_group_ids: - new_group_ids = (current_group_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresgid, *new_group_ids) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid_exception(self): + # Don't do this test if someone is silly enough to run us as root. 
+ current_group_ids = posix.getresgid() + if 0 not in current_group_ids: + new_group_ids = (current_group_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresgid, *new_group_ids) @unittest.skipUnless(hasattr(posix, 'initgroups'), "test needs os.initgroups()") @@ -121,29 +129,32 @@ else: self.fail("Expected OSError to be raised by initgroups") + @unittest.skipUnless(hasattr(posix, 'statvfs'), + 'test needs posix.statvfs()') def test_statvfs(self): - if hasattr(posix, 'statvfs'): - self.assertTrue(posix.statvfs(os.curdir)) + self.assertTrue(posix.statvfs(os.curdir)) + @unittest.skipUnless(hasattr(posix, 'fstatvfs'), + 'test needs posix.fstatvfs()') def test_fstatvfs(self): - if hasattr(posix, 'fstatvfs'): - fp = open(support.TESTFN) - try: - self.assertTrue(posix.fstatvfs(fp.fileno())) - self.assertTrue(posix.statvfs(fp.fileno())) - finally: - fp.close() + fp = open(support.TESTFN) + try: + self.assertTrue(posix.fstatvfs(fp.fileno())) + self.assertTrue(posix.statvfs(fp.fileno())) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'ftruncate'), + 'test needs posix.ftruncate()') def test_ftruncate(self): - if hasattr(posix, 'ftruncate'): - fp = open(support.TESTFN, 'w+') - try: - # we need to have some data to truncate - fp.write('test') - fp.flush() - posix.ftruncate(fp.fileno(), 0) - finally: - fp.close() + fp = open(support.TESTFN, 'w+') + try: + # we need to have some data to truncate + fp.write('test') + fp.flush() + posix.ftruncate(fp.fileno(), 0) + finally: + fp.close() @unittest.skipUnless(hasattr(posix, 'truncate'), "test needs posix.truncate()") def test_truncate(self): @@ -290,30 +301,33 @@ finally: os.close(fd) + @unittest.skipUnless(hasattr(posix, 'dup'), + 'test needs posix.dup()') def test_dup(self): - if hasattr(posix, 'dup'): - fp = open(support.TESTFN) - try: - fd = posix.dup(fp.fileno()) - self.assertIsInstance(fd, int) - os.close(fd) - finally: - fp.close() + fp = open(support.TESTFN) + try: + fd = posix.dup(fp.fileno()) + self.assertIsInstance(fd, int) + os.close(fd) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'confstr'), + 'test needs posix.confstr()') def test_confstr(self): - if hasattr(posix, 'confstr'): - self.assertRaises(ValueError, posix.confstr, "CS_garbage") - self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + self.assertRaises(ValueError, posix.confstr, "CS_garbage") + self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + @unittest.skipUnless(hasattr(posix, 'dup2'), + 'test needs posix.dup2()') def test_dup2(self): - if hasattr(posix, 'dup2'): - fp1 = open(support.TESTFN) - fp2 = open(support.TESTFN) - try: - posix.dup2(fp1.fileno(), fp2.fileno()) - finally: - fp1.close() - fp2.close() + fp1 = open(support.TESTFN) + fp2 = open(support.TESTFN) + try: + posix.dup2(fp1.fileno(), fp2.fileno()) + finally: + fp1.close() + fp2.close() @unittest.skipUnless(hasattr(os, 'O_CLOEXEC'), "needs os.O_CLOEXEC") @support.requires_linux_version(2, 6, 23) @@ -322,65 +336,69 @@ self.addCleanup(os.close, fd) self.assertTrue(fcntl.fcntl(fd, fcntl.F_GETFD) & fcntl.FD_CLOEXEC) + @unittest.skipUnless(hasattr(posix, 'O_EXLOCK'), + 'test needs posix.O_EXLOCK') def test_osexlock(self): - if hasattr(posix, "O_EXLOCK"): + fd = os.open(support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + self.assertRaises(OSError, os.open, support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) + + if hasattr(posix, "O_SHLOCK"): fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) 
self.assertRaises(OSError, os.open, support.TESTFN, os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) os.close(fd) - if hasattr(posix, "O_SHLOCK"): - fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) + @unittest.skipUnless(hasattr(posix, 'O_SHLOCK'), + 'test needs posix.O_SHLOCK') + def test_osshlock(self): + fd1 = os.open(support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + fd2 = os.open(support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + os.close(fd2) + os.close(fd1) - def test_osshlock(self): - if hasattr(posix, "O_SHLOCK"): - fd1 = os.open(support.TESTFN, + if hasattr(posix, "O_EXLOCK"): + fd = os.open(support.TESTFN, os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - fd2 = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - os.close(fd2) - os.close(fd1) + self.assertRaises(OSError, os.open, support.TESTFN, + os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) - if hasattr(posix, "O_EXLOCK"): - fd = os.open(support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, support.TESTFN, - os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) - + @unittest.skipUnless(hasattr(posix, 'fstat'), + 'test needs posix.fstat()') def test_fstat(self): - if hasattr(posix, 'fstat'): - fp = open(support.TESTFN) - try: - self.assertTrue(posix.fstat(fp.fileno())) - self.assertTrue(posix.stat(fp.fileno())) - - self.assertRaisesRegex(TypeError, - 'should be string, bytes or integer, not', - posix.stat, float(fp.fileno())) - finally: - fp.close() - - def test_stat(self): - if hasattr(posix, 'stat'): - self.assertTrue(posix.stat(support.TESTFN)) - self.assertTrue(posix.stat(os.fsencode(support.TESTFN))) - self.assertTrue(posix.stat(bytearray(os.fsencode(support.TESTFN)))) + fp = open(support.TESTFN) + try: + self.assertTrue(posix.fstat(fp.fileno())) + self.assertTrue(posix.stat(fp.fileno())) self.assertRaisesRegex(TypeError, - 'can\'t specify None for path argument', - posix.stat, None) - self.assertRaisesRegex(TypeError, 'should be string, bytes or integer, not', - posix.stat, list(support.TESTFN)) - self.assertRaisesRegex(TypeError, - 'should be string, bytes or integer, not', - posix.stat, list(os.fsencode(support.TESTFN))) + posix.stat, float(fp.fileno())) + finally: + fp.close() + + @unittest.skipUnless(hasattr(posix, 'stat'), + 'test needs posix.stat()') + def test_stat(self): + self.assertTrue(posix.stat(support.TESTFN)) + self.assertTrue(posix.stat(os.fsencode(support.TESTFN))) + self.assertTrue(posix.stat(bytearray(os.fsencode(support.TESTFN)))) + + self.assertRaisesRegex(TypeError, + 'can\'t specify None for path argument', + posix.stat, None) + self.assertRaisesRegex(TypeError, + 'should be string, bytes or integer, not', + posix.stat, list(support.TESTFN)) + self.assertRaisesRegex(TypeError, + 'should be string, bytes or integer, not', + posix.stat, list(os.fsencode(support.TESTFN))) @unittest.skipUnless(hasattr(posix, 'mkfifo'), "don't have mkfifo()") def test_mkfifo(self): @@ -495,10 +513,10 @@ self._test_all_chown_common(posix.lchown, support.TESTFN, getattr(posix, 'lstat', None)) + @unittest.skipUnless(hasattr(posix, 'chdir'), 'test needs posix.chdir()') def test_chdir(self): - if hasattr(posix, 'chdir'): - posix.chdir(os.curdir) - self.assertRaises(OSError, posix.chdir, support.TESTFN) + posix.chdir(os.curdir) + self.assertRaises(OSError, posix.chdir, support.TESTFN) def test_listdir(self): self.assertTrue(support.TESTFN in 
posix.listdir(os.curdir)) @@ -528,25 +546,26 @@ sorted(posix.listdir(f)) ) + @unittest.skipUnless(hasattr(posix, 'access'), 'test needs posix.access()') def test_access(self): - if hasattr(posix, 'access'): - self.assertTrue(posix.access(support.TESTFN, os.R_OK)) + self.assertTrue(posix.access(support.TESTFN, os.R_OK)) + @unittest.skipUnless(hasattr(posix, 'umask'), 'test needs posix.umask()') def test_umask(self): - if hasattr(posix, 'umask'): - old_mask = posix.umask(0) - self.assertIsInstance(old_mask, int) - posix.umask(old_mask) + old_mask = posix.umask(0) + self.assertIsInstance(old_mask, int) + posix.umask(old_mask) + @unittest.skipUnless(hasattr(posix, 'strerror'), + 'test needs posix.strerror()') def test_strerror(self): - if hasattr(posix, 'strerror'): - self.assertTrue(posix.strerror(0)) + self.assertTrue(posix.strerror(0)) + @unittest.skipUnless(hasattr(posix, 'pipe'), 'test needs posix.pipe()') def test_pipe(self): - if hasattr(posix, 'pipe'): - reader, writer = posix.pipe() - os.close(reader) - os.close(writer) + reader, writer = posix.pipe() + os.close(reader) + os.close(writer) @unittest.skipUnless(hasattr(os, 'pipe2'), "test needs os.pipe2()") @support.requires_linux_version(2, 6, 27) @@ -580,15 +599,15 @@ self.assertRaises(OverflowError, os.pipe2, _testcapi.INT_MAX + 1) self.assertRaises(OverflowError, os.pipe2, _testcapi.UINT_MAX + 1) + @unittest.skipUnless(hasattr(posix, 'utime'), 'test needs posix.utime()') def test_utime(self): - if hasattr(posix, 'utime'): - now = time.time() - posix.utime(support.TESTFN, None) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, None)) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, None)) - self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, now)) - posix.utime(support.TESTFN, (int(now), int(now))) - posix.utime(support.TESTFN, (now, now)) + now = time.time() + posix.utime(support.TESTFN, None) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, None)) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (now, None)) + self.assertRaises(TypeError, posix.utime, support.TESTFN, (None, now)) + posix.utime(support.TESTFN, (int(now), int(now))) + posix.utime(support.TESTFN, (now, now)) def _test_chflags_regular_file(self, chflags_func, target_file, **kwargs): st = os.stat(target_file) @@ -665,6 +684,7 @@ self.assertEqual(type(k), item_type) self.assertEqual(type(v), item_type) + @unittest.skipUnless(hasattr(posix, 'getcwd'), 'test needs posix.getcwd()') def test_getcwd_long_pathnames(self): dirname = 'getcwd-test-directory-0123456789abcdef-01234567890abcdef' curdir = os.getcwd() diff --git a/Lib/test/test_set.py b/Lib/test/test_set.py --- a/Lib/test/test_set.py +++ b/Lib/test/test_set.py @@ -625,10 +625,10 @@ myset >= myobj self.assertTrue(myobj.le_called) - # C API test only available in a debug build - if hasattr(set, "test_c_api"): - def test_c_api(self): - self.assertEqual(set().test_c_api(), True) + @unittest.skipUnless(hasattr(set, "test_c_api"), + 'C API test only available in a debug build') + def test_c_api(self): + self.assertEqual(set().test_c_api(), True) class SetSubclass(set): pass diff --git a/Lib/test/test_shutil.py b/Lib/test/test_shutil.py --- a/Lib/test/test_shutil.py +++ b/Lib/test/test_shutil.py @@ -195,37 +195,37 @@ self.assertIn(errors[1][2][1].filename, possible_args) - # See bug #1071513 for why we don't run this on cygwin - # and bug #1076467 for why we don't run this as root. 
- if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' - and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): - def test_on_error(self): - self.errorState = 0 - os.mkdir(TESTFN) - self.addCleanup(shutil.rmtree, TESTFN) + @unittest.skipUnless(hasattr(os, 'chmod'), 'requires os.chmod()') + @unittest.skipIf(sys.platform[:6] == 'cygwin', + "This test can't be run on Cygwin (issue #1071513).") + @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0, + "This test can't be run reliably as root (issue #1076467).") + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.addCleanup(shutil.rmtree, TESTFN) - self.child_file_path = os.path.join(TESTFN, 'a') - self.child_dir_path = os.path.join(TESTFN, 'b') - support.create_empty_file(self.child_file_path) - os.mkdir(self.child_dir_path) - old_dir_mode = os.stat(TESTFN).st_mode - old_child_file_mode = os.stat(self.child_file_path).st_mode - old_child_dir_mode = os.stat(self.child_dir_path).st_mode - # Make unwritable. - new_mode = stat.S_IREAD|stat.S_IEXEC - os.chmod(self.child_file_path, new_mode) - os.chmod(self.child_dir_path, new_mode) - os.chmod(TESTFN, new_mode) + self.child_file_path = os.path.join(TESTFN, 'a') + self.child_dir_path = os.path.join(TESTFN, 'b') + support.create_empty_file(self.child_file_path) + os.mkdir(self.child_dir_path) + old_dir_mode = os.stat(TESTFN).st_mode + old_child_file_mode = os.stat(self.child_file_path).st_mode + old_child_dir_mode = os.stat(self.child_dir_path).st_mode + # Make unwritable. + new_mode = stat.S_IREAD|stat.S_IEXEC + os.chmod(self.child_file_path, new_mode) + os.chmod(self.child_dir_path, new_mode) + os.chmod(TESTFN, new_mode) - self.addCleanup(os.chmod, TESTFN, old_dir_mode) - self.addCleanup(os.chmod, self.child_file_path, old_child_file_mode) - self.addCleanup(os.chmod, self.child_dir_path, old_child_dir_mode) + self.addCleanup(os.chmod, TESTFN, old_dir_mode) + self.addCleanup(os.chmod, self.child_file_path, old_child_file_mode) + self.addCleanup(os.chmod, self.child_dir_path, old_child_dir_mode) - shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) - # Test whether onerror has actually been called. - self.assertEqual(self.errorState, 3, - "Expected call to onerror function did not " - "happen.") + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. 
+ self.assertEqual(self.errorState, 3, + "Expected call to onerror function did not happen.") def check_args_to_onerror(self, func, arg, exc): # test_rmtree_errors deliberately runs rmtree @@ -807,38 +807,39 @@ finally: shutil.rmtree(TESTFN, ignore_errors=True) - if hasattr(os, "mkfifo"): - # Issue #3002: copyfile and copytree block indefinitely on named pipes - def test_copyfile_named_pipe(self): - os.mkfifo(TESTFN) + # Issue #3002: copyfile and copytree block indefinitely on named pipes + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + @support.skip_unless_symlink + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) try: - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, TESTFN, TESTFN2) - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, __file__, TESTFN) - finally: - os.remove(TESTFN) - - @support.skip_unless_symlink - def test_copytree_named_pipe(self): - os.mkdir(TESTFN) - try: - subdir = os.path.join(TESTFN, "subdir") - os.mkdir(subdir) - pipe = os.path.join(subdir, "mypipe") - os.mkfifo(pipe) - try: - shutil.copytree(TESTFN, TESTFN2) - except shutil.Error as e: - errors = e.args[0] - self.assertEqual(len(errors), 1) - src, dst, error_msg = errors[0] - self.assertEqual("`%s` is a named pipe" % pipe, error_msg) - else: - self.fail("shutil.Error should have been raised") - finally: - shutil.rmtree(TESTFN, ignore_errors=True) - shutil.rmtree(TESTFN2, ignore_errors=True) + shutil.copytree(TESTFN, TESTFN2) + except shutil.Error as e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) def test_copytree_special_func(self): diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -817,16 +817,17 @@ self.assertRaises(TypeError, socket.if_nametoindex, 0) self.assertRaises(TypeError, socket.if_indextoname, '_DEADBEEF') + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def testRefCountGetNameInfo(self): # Testing reference count for getnameinfo - if hasattr(sys, "getrefcount"): - try: - # On some versions, this loses a reference - orig = sys.getrefcount(__name__) - socket.getnameinfo(__name__,0) - except TypeError: - if sys.getrefcount(__name__) != orig: - self.fail("socket.getnameinfo loses a reference") + try: + # On some versions, this loses a reference + orig = sys.getrefcount(__name__) + socket.getnameinfo(__name__,0) + except TypeError: + if sys.getrefcount(__name__) != orig: + self.fail("socket.getnameinfo loses a reference") def testInterpreterCrash(self): # Making sure getnameinfo doesn't crash the interpreter @@ -931,17 +932,17 @@ # Check that setting it to an invalid type raises TypeError self.assertRaises(TypeError, socket.setdefaulttimeout, "spam") + @unittest.skipUnless(hasattr(socket, 'inet_aton'), + 'test needs 
socket.inet_aton()') def testIPv4_inet_aton_fourbytes(self): - if not hasattr(socket, 'inet_aton'): - return # No inet_aton, nothing to check # Test that issue1008086 and issue767150 are fixed. # It must return 4 bytes. self.assertEqual(b'\x00'*4, socket.inet_aton('0.0.0.0')) self.assertEqual(b'\xff'*4, socket.inet_aton('255.255.255.255')) + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv4toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform from socket import inet_aton as f, inet_pton, AF_INET g = lambda a: inet_pton(AF_INET, a) @@ -970,9 +971,9 @@ assertInvalid(g, '1.2.3.4.5') assertInvalid(g, '::1') + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv6toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform try: from socket import inet_pton, AF_INET6, has_ipv6 if not has_ipv6: @@ -1024,9 +1025,9 @@ assertInvalid('::1.2.3.4:0') assertInvalid('0.100.200.0:3:4:5:6:7:8') + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv4(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform from socket import inet_ntoa as f, inet_ntop, AF_INET g = lambda a: inet_ntop(AF_INET, a) assertInvalid = lambda func,a: self.assertRaises( @@ -1048,9 +1049,9 @@ assertInvalid(g, b'\x00' * 5) assertInvalid(g, b'\x00' * 16) + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv6(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform try: from socket import inet_ntop, AF_INET6, has_ipv6 if not has_ipv6: @@ -3660,6 +3661,8 @@ self.cli.connect((HOST, self.port)) time.sleep(1.0) + at unittest.skipUnless(hasattr(socket, 'socketpair'), + 'test needs socket.socketpair()') @unittest.skipUnless(thread, 'Threading required for this test.') class BasicSocketPairTest(SocketPairTest): @@ -3722,26 +3725,27 @@ def _testSetBlocking(self): pass - if hasattr(socket, "SOCK_NONBLOCK"): - @support.requires_linux_version(2, 6, 28) - def testInitNonBlocking(self): - # reinit server socket - self.serv.close() - self.serv = socket.socket(socket.AF_INET, socket.SOCK_STREAM | - socket.SOCK_NONBLOCK) - self.port = support.bind_port(self.serv) - self.serv.listen(1) - # actual testing - start = time.time() - try: - self.serv.accept() - except OSError: - pass - end = time.time() - self.assertTrue((end - start) < 1.0, "Error creating with non-blocking mode.") - - def _testInitNonBlocking(self): + @unittest.skipUnless(hasattr(socket, 'SOCK_NONBLOCK'), + 'test needs socket.SOCK_NONBLOCK') + @support.requires_linux_version(2, 6, 28) + def testInitNonBlocking(self): + # reinit server socket + self.serv.close() + self.serv = socket.socket(socket.AF_INET, socket.SOCK_STREAM | + socket.SOCK_NONBLOCK) + self.port = support.bind_port(self.serv) + self.serv.listen(1) + # actual testing + start = time.time() + try: + self.serv.accept() + except OSError: pass + end = time.time() + self.assertTrue((end - start) < 1.0, "Error creating with non-blocking mode.") + + def _testInitNonBlocking(self): + pass def testInheritFlags(self): # Issue #7995: when calling accept() on a listening socket with a @@ -4431,12 +4435,12 @@ if not ok: self.fail("accept() returned success when we did not expect it") + @unittest.skipUnless(hasattr(signal, 'alarm'), + 'test needs signal.alarm()') def testInterruptedTimeout(self): # XXX I 
don't know how to do this test on MSWindows or any other # plaform that doesn't support signal.alarm() or os.kill(), though # the bug should have existed on all platforms. - if not hasattr(signal, "alarm"): - return # can only test on *nix self.serv.settimeout(5.0) # must be longer than alarm class Alarm(Exception): pass @@ -4496,6 +4500,7 @@ self.assertTrue(issubclass(socket.gaierror, OSError)) self.assertTrue(issubclass(socket.timeout, OSError)) + at unittest.skipUnless(sys.platform == 'linux', 'Linux specific test') class TestLinuxAbstractNamespace(unittest.TestCase): UNIX_PATH_MAX = 108 @@ -4531,6 +4536,7 @@ finally: s.close() + at unittest.skipUnless(hasattr(socket, 'AF_UNIX'), 'test needs socket.AF_UNIX') class TestUnixDomain(unittest.TestCase): def setUp(self): @@ -4680,10 +4686,10 @@ for line in f: if line.startswith("tipc "): return True - if support.verbose: - print("TIPC module is not loaded, please 'sudo modprobe tipc'") return False + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") class TIPCTest(unittest.TestCase): def testRDM(self): srv = socket.socket(socket.AF_TIPC, socket.SOCK_RDM) @@ -4706,6 +4712,8 @@ self.assertEqual(msg, MSG) + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") class TIPCThreadableTest(unittest.TestCase, ThreadableTest): def __init__(self, methodName = 'runTest'): unittest.TestCase.__init__(self, methodName = methodName) @@ -5028,15 +5036,10 @@ InheritanceTest, NonblockConstantTest ]) - if hasattr(socket, "socketpair"): - tests.append(BasicSocketPairTest) - if hasattr(socket, "AF_UNIX"): - tests.append(TestUnixDomain) - if sys.platform == 'linux': - tests.append(TestLinuxAbstractNamespace) - if isTipcAvailable(): - tests.append(TIPCTest) - tests.append(TIPCThreadableTest) + tests.append(BasicSocketPairTest) + tests.append(TestUnixDomain) + tests.append(TestLinuxAbstractNamespace) + tests.extend([TIPCTest, TIPCThreadableTest]) tests.extend([BasicCANTest, CANTest]) tests.extend([BasicRDSTest, RDSTest]) tests.extend([ diff --git a/Lib/test/test_socketserver.py b/Lib/test/test_socketserver.py --- a/Lib/test/test_socketserver.py +++ b/Lib/test/test_socketserver.py @@ -27,7 +27,10 @@ HOST = test.support.HOST HAVE_UNIX_SOCKETS = hasattr(socket, "AF_UNIX") +requires_unix_sockets = unittest.skipUnless(HAVE_UNIX_SOCKETS, + 'requires Unix sockets') HAVE_FORKING = hasattr(os, "fork") +requires_forking = unittest.skipUnless(HAVE_FORKING, 'requires forking') def signal_alarm(n): """Call signal.alarm when it exists (i.e. 
not on Windows).""" @@ -175,31 +178,33 @@ socketserver.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingTCPServer(self): - with simple_subprocess(self): - self.run_server(socketserver.ForkingTCPServer, - socketserver.StreamRequestHandler, - self.stream_examine) - - if HAVE_UNIX_SOCKETS: - def test_UnixStreamServer(self): - self.run_server(socketserver.UnixStreamServer, + @requires_forking + def test_ForkingTCPServer(self): + with simple_subprocess(self): + self.run_server(socketserver.ForkingTCPServer, socketserver.StreamRequestHandler, self.stream_examine) - def test_ThreadingUnixStreamServer(self): - self.run_server(socketserver.ThreadingUnixStreamServer, + @requires_unix_sockets + def test_UnixStreamServer(self): + self.run_server(socketserver.UnixStreamServer, + socketserver.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + def test_ThreadingUnixStreamServer(self): + self.run_server(socketserver.ThreadingUnixStreamServer, + socketserver.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + @requires_forking + def test_ForkingUnixStreamServer(self): + with simple_subprocess(self): + self.run_server(ForkingUnixStreamServer, socketserver.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingUnixStreamServer(self): - with simple_subprocess(self): - self.run_server(ForkingUnixStreamServer, - socketserver.StreamRequestHandler, - self.stream_examine) - def test_UDPServer(self): self.run_server(socketserver.UDPServer, socketserver.DatagramRequestHandler, @@ -210,12 +215,12 @@ socketserver.DatagramRequestHandler, self.dgram_examine) - if HAVE_FORKING: - def test_ForkingUDPServer(self): - with simple_subprocess(self): - self.run_server(socketserver.ForkingUDPServer, - socketserver.DatagramRequestHandler, - self.dgram_examine) + @requires_forking + def test_ForkingUDPServer(self): + with simple_subprocess(self): + self.run_server(socketserver.ForkingUDPServer, + socketserver.DatagramRequestHandler, + self.dgram_examine) @contextlib.contextmanager def mocked_select_module(self): @@ -252,22 +257,24 @@ # Alas, on Linux (at least) recvfrom() doesn't return a meaningful # client address so this cannot work: - # if HAVE_UNIX_SOCKETS: - # def test_UnixDatagramServer(self): - # self.run_server(socketserver.UnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_UnixDatagramServer(self): + # self.run_server(socketserver.UnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) # - # def test_ThreadingUnixDatagramServer(self): - # self.run_server(socketserver.ThreadingUnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_ThreadingUnixDatagramServer(self): + # self.run_server(socketserver.ThreadingUnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) # - # if HAVE_FORKING: - # def test_ForkingUnixDatagramServer(self): - # self.run_server(socketserver.ForkingUnixDatagramServer, - # socketserver.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # @requires_forking + # def test_ForkingUnixDatagramServer(self): + # self.run_server(socketserver.ForkingUnixDatagramServer, + # socketserver.DatagramRequestHandler, + # self.dgram_examine) @reap_threads def test_shutdown(self): diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -293,15 
+293,16 @@ def test_call_tracing(self): self.assertRaises(TypeError, sys.call_tracing, type, 2) + @unittest.skipUnless(hasattr(sys, "setdlopenflags"), + 'test needs sys.setdlopenflags()') def test_dlopenflags(self): - if hasattr(sys, "setdlopenflags"): - self.assertTrue(hasattr(sys, "getdlopenflags")) - self.assertRaises(TypeError, sys.getdlopenflags, 42) - oldflags = sys.getdlopenflags() - self.assertRaises(TypeError, sys.setdlopenflags) - sys.setdlopenflags(oldflags+1) - self.assertEqual(sys.getdlopenflags(), oldflags+1) - sys.setdlopenflags(oldflags) + self.assertTrue(hasattr(sys, "getdlopenflags")) + self.assertRaises(TypeError, sys.getdlopenflags, 42) + oldflags = sys.getdlopenflags() + self.assertRaises(TypeError, sys.setdlopenflags) + sys.setdlopenflags(oldflags+1) + self.assertEqual(sys.getdlopenflags(), oldflags+1) + sys.setdlopenflags(oldflags) @test.support.refcount_test def test_refcount(self): diff --git a/Lib/test/test_warnings.py b/Lib/test/test_warnings.py --- a/Lib/test/test_warnings.py +++ b/Lib/test/test_warnings.py @@ -271,11 +271,10 @@ finally: warning_tests.__file__ = filename + @unittest.skipUnless(hasattr(sys, 'argv'), 'test needs sys.argv') def test_missing_filename_main_with_argv(self): # If __file__ is not specified and the caller is __main__ and sys.argv # exists, then use sys.argv[0] as the file. - if not hasattr(sys, 'argv'): - return filename = warning_tests.__file__ module_name = warning_tests.__name__ try: diff --git a/Lib/test/test_zlib.py b/Lib/test/test_zlib.py --- a/Lib/test/test_zlib.py +++ b/Lib/test/test_zlib.py @@ -7,6 +7,13 @@ zlib = support.import_module('zlib') +requires_Compress_copy = unittest.skipUnless( + hasattr(zlib.compressobj(), "copy"), + 'requires Compress.copy()') +requires_Decompress_copy = unittest.skipUnless( + hasattr(zlib.decompressobj(), "copy"), + 'requires Decompress.copy()') + class VersionTestCase(unittest.TestCase): @@ -381,39 +388,39 @@ "mode=%i, level=%i") % (sync, level)) del obj + @unittest.skipUnless(hasattr(zlib, 'Z_SYNC_FLUSH'), + 'requires zlib.Z_SYNC_FLUSH') def test_odd_flush(self): # Test for odd flushing bugs noted in 2.0, and hopefully fixed in 2.1 import random + # Testing on 17K of "random" data - if hasattr(zlib, 'Z_SYNC_FLUSH'): - # Testing on 17K of "random" data + # Create compressor and decompressor objects + co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + dco = zlib.decompressobj() - # Create compressor and decompressor objects - co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - dco = zlib.decompressobj() + # Try 17K of data + # generate random data stream + try: + # In 2.3 and later, WichmannHill is the RNG of the bug report + gen = random.WichmannHill() + except AttributeError: + try: + # 2.2 called it Random + gen = random.Random() + except AttributeError: + # others might simply have a single RNG + gen = random + gen.seed(1) + data = genblock(1, 17 * 1024, generator=gen) - # Try 17K of data - # generate random data stream - try: - # In 2.3 and later, WichmannHill is the RNG of the bug report - gen = random.WichmannHill() - except AttributeError: - try: - # 2.2 called it Random - gen = random.Random() - except AttributeError: - # others might simply have a single RNG - gen = random - gen.seed(1) - data = genblock(1, 17 * 1024, generator=gen) + # compress, sync-flush, and decompress + first = co.compress(data) + second = co.flush(zlib.Z_SYNC_FLUSH) + expanded = dco.decompress(first + second) - # compress, sync-flush, and decompress - first = co.compress(data) - second = co.flush(zlib.Z_SYNC_FLUSH) - 
expanded = dco.decompress(first + second) - - # if decompressed data is different from the input data, choke. - self.assertEqual(expanded, data, "17K random source doesn't match") + # if decompressed data is different from the input data, choke. + self.assertEqual(expanded, data, "17K random source doesn't match") def test_empty_flush(self): # Test that calling .flush() on unused objects works. @@ -525,67 +532,69 @@ data = zlib.compress(input2) self.assertEqual(dco.flush(), input1[1:]) - if hasattr(zlib.compressobj(), "copy"): - def test_compresscopy(self): - # Test copying a compression object - data0 = HAMLET_SCENE - data1 = bytes(str(HAMLET_SCENE, "ascii").swapcase(), "ascii") - c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - bufs0 = [] - bufs0.append(c0.compress(data0)) + @requires_Compress_copy + def test_compresscopy(self): + # Test copying a compression object + data0 = HAMLET_SCENE + data1 = bytes(str(HAMLET_SCENE, "ascii").swapcase(), "ascii") + c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + bufs0 = [] + bufs0.append(c0.compress(data0)) - c1 = c0.copy() - bufs1 = bufs0[:] + c1 = c0.copy() + bufs1 = bufs0[:] - bufs0.append(c0.compress(data0)) - bufs0.append(c0.flush()) - s0 = b''.join(bufs0) + bufs0.append(c0.compress(data0)) + bufs0.append(c0.flush()) + s0 = b''.join(bufs0) - bufs1.append(c1.compress(data1)) - bufs1.append(c1.flush()) - s1 = b''.join(bufs1) + bufs1.append(c1.compress(data1)) + bufs1.append(c1.flush()) + s1 = b''.join(bufs1) - self.assertEqual(zlib.decompress(s0),data0+data0) - self.assertEqual(zlib.decompress(s1),data0+data1) + self.assertEqual(zlib.decompress(s0),data0+data0) + self.assertEqual(zlib.decompress(s1),data0+data1) - def test_badcompresscopy(self): - # Test copying a compression object in an inconsistent state - c = zlib.compressobj() - c.compress(HAMLET_SCENE) - c.flush() - self.assertRaises(ValueError, c.copy) + @requires_Compress_copy + def test_badcompresscopy(self): + # Test copying a compression object in an inconsistent state + c = zlib.compressobj() + c.compress(HAMLET_SCENE) + c.flush() + self.assertRaises(ValueError, c.copy) - if hasattr(zlib.decompressobj(), "copy"): - def test_decompresscopy(self): - # Test copying a decompression object - data = HAMLET_SCENE - comp = zlib.compress(data) - # Test type of return value - self.assertIsInstance(comp, bytes) + @requires_Decompress_copy + def test_decompresscopy(self): + # Test copying a decompression object + data = HAMLET_SCENE + comp = zlib.compress(data) + # Test type of return value + self.assertIsInstance(comp, bytes) - d0 = zlib.decompressobj() - bufs0 = [] - bufs0.append(d0.decompress(comp[:32])) + d0 = zlib.decompressobj() + bufs0 = [] + bufs0.append(d0.decompress(comp[:32])) - d1 = d0.copy() - bufs1 = bufs0[:] + d1 = d0.copy() + bufs1 = bufs0[:] - bufs0.append(d0.decompress(comp[32:])) - s0 = b''.join(bufs0) + bufs0.append(d0.decompress(comp[32:])) + s0 = b''.join(bufs0) - bufs1.append(d1.decompress(comp[32:])) - s1 = b''.join(bufs1) + bufs1.append(d1.decompress(comp[32:])) + s1 = b''.join(bufs1) - self.assertEqual(s0,s1) - self.assertEqual(s0,data) + self.assertEqual(s0,s1) + self.assertEqual(s0,data) - def test_baddecompresscopy(self): - # Test copying a compression object in an inconsistent state - data = zlib.compress(HAMLET_SCENE) - d = zlib.decompressobj() - d.decompress(data) - d.flush() - self.assertRaises(ValueError, d.copy) + @requires_Decompress_copy + def test_baddecompresscopy(self): + # Test copying a compression object in an inconsistent state + data = 
zlib.compress(HAMLET_SCENE) + d = zlib.decompressobj() + d.decompress(data) + d.flush() + self.assertRaises(ValueError, d.copy) # Memory use of the following functions takes into account overallocation diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -165,6 +165,8 @@ Tests ----- +- Issue #18702: All skipped tests now reported as skipped. + - Issue #19439: interpreter embedding tests are now executed on Windows (Patch by Zachary Ware) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 22:16:13 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 22:16:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE4NzAy?= =?utf-8?q?=3A_All_skipped_tests_now_reported_as_skipped=2E?= Message-ID: <3dCVLP5jCzz7LjT@mail.python.org> http://hg.python.org/cpython/rev/2d330f7908e7 changeset: 86897:2d330f7908e7 branch: 2.7 parent: 86884:a34889a30d52 user: Serhiy Storchaka date: Sun Nov 03 23:15:46 2013 +0200 summary: Issue #18702: All skipped tests now reported as skipped. files: Lib/test/test_array.py | 17 +- Lib/test/test_compileall.py | 3 +- Lib/test/test_csv.py | 135 +++--- Lib/test/test_enumerate.py | 3 +- Lib/test/test_ftplib.py | 26 +- Lib/test/test_mailbox.py | 36 +- Lib/test/test_math.py | 59 +- Lib/test/test_mmap.py | 142 +++--- Lib/test/test_os.py | 231 ++++++------ Lib/test/test_poplib.py | 28 +- Lib/test/test_posix.py | 401 +++++++++++---------- Lib/test/test_set.py | 8 +- Lib/test/test_shutil.py | 110 +++-- Lib/test/test_socket.py | 64 +- Lib/test/test_socketserver.py | 85 ++-- Lib/test/test_sys.py | 17 +- Lib/test/test_warnings.py | 3 +- Lib/test/test_zlib.py | 155 ++++---- Misc/NEWS | 2 + 19 files changed, 783 insertions(+), 742 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -9,6 +9,7 @@ from weakref import proxy import array, cStringIO from cPickle import loads, dumps, HIGHEST_PROTOCOL +import sys class ArraySubclass(array.array): pass @@ -772,15 +773,15 @@ s = None self.assertRaises(ReferenceError, len, p) + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def test_bug_782369(self): - import sys - if hasattr(sys, "getrefcount"): - for i in range(10): - b = array.array('B', range(64)) - rc = sys.getrefcount(10) - for i in range(10): - b = array.array('B', range(64)) - self.assertEqual(rc, sys.getrefcount(10)) + for i in range(10): + b = array.array('B', range(64)) + rc = sys.getrefcount(10) + for i in range(10): + b = array.array('B', range(64)) + self.assertEqual(rc, sys.getrefcount(10)) def test_subclass_with_kwargs(self): # SF bug #1486663 -- this used to erroneously raise a TypeError diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -31,11 +31,10 @@ compare = struct.pack('<4sl', imp.get_magic(), mtime) return data, compare + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def recreation_check(self, metadata): """Check that compileall recreates bytecode when the new metadata is used.""" - if not hasattr(os, 'stat'): - return py_compile.compile(self.source_path) self.assertEqual(*self.data()) with open(self.bc_path, 'rb') as file: diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -1014,78 +1014,77 @@ dialect = sniffer.sniff(self.sample9) self.assertTrue(dialect.doublequote) -if not 
hasattr(sys, "gettotalrefcount"): - if test_support.verbose: print "*** skipping leakage tests ***" -else: - class NUL: - def write(s, *args): - pass - writelines = write +class NUL: + def write(s, *args): + pass + writelines = write - class TestLeaks(unittest.TestCase): - def test_create_read(self): - delta = 0 - lastrc = sys.gettotalrefcount() - for i in xrange(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - csv.reader(["a,b,c\r\n"]) - delta = rc-lastrc - lastrc = rc - # if csv.reader() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + at unittest.skipUnless(hasattr(sys, "gettotalrefcount"), + 'requires sys.gettotalrefcount()') +class TestLeaks(unittest.TestCase): + def test_create_read(self): + delta = 0 + lastrc = sys.gettotalrefcount() + for i in xrange(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + csv.reader(["a,b,c\r\n"]) + delta = rc-lastrc + lastrc = rc + # if csv.reader() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_create_write(self): - delta = 0 - lastrc = sys.gettotalrefcount() - s = NUL() - for i in xrange(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - csv.writer(s) - csv.writer(s) - csv.writer(s) - delta = rc-lastrc - lastrc = rc - # if csv.writer() leaks, last delta should be 3 or more - self.assertEqual(delta < 3, True) + def test_create_write(self): + delta = 0 + lastrc = sys.gettotalrefcount() + s = NUL() + for i in xrange(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + csv.writer(s) + csv.writer(s) + csv.writer(s) + delta = rc-lastrc + lastrc = rc + # if csv.writer() leaks, last delta should be 3 or more + self.assertEqual(delta < 3, True) - def test_read(self): - delta = 0 - rows = ["a,b,c\r\n"]*5 - lastrc = sys.gettotalrefcount() - for i in xrange(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - rdr = csv.reader(rows) - for row in rdr: - pass - delta = rc-lastrc - lastrc = rc - # if reader leaks during read, delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_read(self): + delta = 0 + rows = ["a,b,c\r\n"]*5 + lastrc = sys.gettotalrefcount() + for i in xrange(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + rdr = csv.reader(rows) + for row in rdr: + pass + delta = rc-lastrc + lastrc = rc + # if reader leaks during read, delta should be 5 or more + self.assertEqual(delta < 5, True) - def test_write(self): - delta = 0 - rows = [[1,2,3]]*5 - s = NUL() - lastrc = sys.gettotalrefcount() - for i in xrange(20): - gc.collect() - self.assertEqual(gc.garbage, []) - rc = sys.gettotalrefcount() - writer = csv.writer(s) - for row in rows: - writer.writerow(row) - delta = rc-lastrc - lastrc = rc - # if writer leaks during write, last delta should be 5 or more - self.assertEqual(delta < 5, True) + def test_write(self): + delta = 0 + rows = [[1,2,3]]*5 + s = NUL() + lastrc = sys.gettotalrefcount() + for i in xrange(20): + gc.collect() + self.assertEqual(gc.garbage, []) + rc = sys.gettotalrefcount() + writer = csv.writer(s) + for row in rows: + writer.writerow(row) + delta = rc-lastrc + lastrc = rc + # if writer leaks during write, last delta should be 5 or more + self.assertEqual(delta < 5, True) # commented out for now - csv module doesn't yet support 
Unicode ## class TestUnicode(unittest.TestCase): diff --git a/Lib/test/test_enumerate.py b/Lib/test/test_enumerate.py --- a/Lib/test/test_enumerate.py +++ b/Lib/test/test_enumerate.py @@ -188,11 +188,10 @@ self.assertRaises(TypeError, reversed) self.assertRaises(TypeError, reversed, [], 'extra') + @unittest.skipUnless(hasattr(sys, 'getrefcount'), 'test needs sys.getrefcount()') def test_bug1229429(self): # this bug was never in reversed, it was in # PyObject_CallMethod, and reversed_new calls that sometimes. - if not hasattr(sys, "getrefcount"): - return def f(): pass r = f.__reversed__ = object() diff --git a/Lib/test/test_ftplib.py b/Lib/test/test_ftplib.py --- a/Lib/test/test_ftplib.py +++ b/Lib/test/test_ftplib.py @@ -15,7 +15,7 @@ except ImportError: ssl = None -from unittest import TestCase +from unittest import TestCase, SkipTest, skipUnless from test import test_support from test.test_support import HOST, HOSTv6 threading = test_support.import_module('threading') @@ -579,8 +579,16 @@ self.assertRaises(ftplib.Error, self.client.storlines, 'stor', f) + at skipUnless(socket.has_ipv6, "IPv6 not enabled") class TestIPv6Environment(TestCase): + @classmethod + def setUpClass(cls): + try: + DummyFTPServer((HOST, 0), af=socket.AF_INET6) + except socket.error: + raise SkipTest("IPv6 not enabled") + def setUp(self): self.server = DummyFTPServer((HOSTv6, 0), af=socket.AF_INET6) self.server.start() @@ -615,6 +623,7 @@ retr() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClassMixin(TestFTPClass): """Repeat TestFTPClass tests starting the TLS layer for both control and data connections first. @@ -630,6 +639,7 @@ self.client.prot_p() + at skipUnless(ssl, "SSL not available") class TestTLS_FTPClass(TestCase): """Specific TLS_FTP class tests.""" @@ -783,17 +793,9 @@ def test_main(): - tests = [TestFTPClass, TestTimeouts] - if socket.has_ipv6: - try: - DummyFTPServer((HOST, 0), af=socket.AF_INET6) - except socket.error: - pass - else: - tests.append(TestIPv6Environment) - - if ssl is not None: - tests.extend([TestTLS_FTPClassMixin, TestTLS_FTPClass]) + tests = [TestFTPClass, TestTimeouts, + TestIPv6Environment, + TestTLS_FTPClassMixin, TestTLS_FTPClass] thread_info = test_support.threading_setup() try: diff --git a/Lib/test/test_mailbox.py b/Lib/test/test_mailbox.py --- a/Lib/test/test_mailbox.py +++ b/Lib/test/test_mailbox.py @@ -772,10 +772,10 @@ for msg in self._box: pass + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_permissions(self): # Verify that message files are created without execute permissions - if not hasattr(os, "stat") or not hasattr(os, "umask"): - return msg = mailbox.MaildirMessage(self._template % 0) orig_umask = os.umask(0) try: @@ -786,12 +786,11 @@ mode = os.stat(path).st_mode self.assertEqual(mode & 0111, 0) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_folder_file_perms(self): # From bug #3228, we want to verify that the file created inside a Maildir # subfolder isn't marked as executable. 
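The stacked-decorator form used just above can be sketched in isolation roughly as follows; the test case and assertions here are illustrative stand-ins, not code from the patch.

import os
import unittest

class StackedSkipExample(unittest.TestCase):
    # If either attribute is missing, the test is reported as skipped with
    # the corresponding reason instead of silently disappearing.
    @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()')
    @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()')
    def test_stat_with_umask(self):
        old_umask = os.umask(0)
        try:
            self.assertTrue(os.stat(os.curdir))
        finally:
            os.umask(old_umask)

if __name__ == '__main__':
    unittest.main()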
- if not hasattr(os, "stat") or not hasattr(os, "umask"): - return - orig_umask = os.umask(0) try: subfolder = self._box.add_folder('subfolder') @@ -991,24 +990,25 @@ _factory = lambda self, path, factory=None: mailbox.mbox(path, factory) + @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_file_perms(self): # From bug #3228, we want to verify that the mailbox file isn't executable, # even if the umask is set to something that would leave executable bits set. # We only run this test on platforms that support umask. - if hasattr(os, 'umask') and hasattr(os, 'stat'): - try: - old_umask = os.umask(0077) - self._box.close() - os.unlink(self._path) - self._box = mailbox.mbox(self._path, create=True) - self._box.add('') - self._box.close() - finally: - os.umask(old_umask) + try: + old_umask = os.umask(0077) + self._box.close() + os.unlink(self._path) + self._box = mailbox.mbox(self._path, create=True) + self._box.add('') + self._box.close() + finally: + os.umask(old_umask) - st = os.stat(self._path) - perms = st.st_mode - self.assertFalse((perms & 0111)) # Execute bits should all be off. + st = os.stat(self._path) + perms = st.st_mode + self.assertFalse((perms & 0111)) # Execute bits should all be off. def test_terminating_newline(self): message = email.message.Message() diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -906,38 +906,37 @@ # still fails this part of the test on some platforms. For now, we only # *run* test_exceptions() in verbose mode, so that this isn't normally # tested. + @unittest.skipUnless(verbose, 'requires verbose mode') + def test_exceptions(self): + try: + x = math.exp(-1000000000) + except: + # mathmodule.c is failing to weed out underflows from libm, or + # we've got an fp format with huge dynamic range + self.fail("underflowing exp() should not have raised " + "an exception") + if x != 0: + self.fail("underflowing exp() should have returned 0") - if verbose: - def test_exceptions(self): - try: - x = math.exp(-1000000000) - except: - # mathmodule.c is failing to weed out underflows from libm, or - # we've got an fp format with huge dynamic range - self.fail("underflowing exp() should not have raised " - "an exception") - if x != 0: - self.fail("underflowing exp() should have returned 0") + # If this fails, probably using a strict IEEE-754 conforming libm, and x + # is +Inf afterwards. But Python wants overflows detected by default. + try: + x = math.exp(1000000000) + except OverflowError: + pass + else: + self.fail("overflowing exp() didn't trigger OverflowError") - # If this fails, probably using a strict IEEE-754 conforming libm, and x - # is +Inf afterwards. But Python wants overflows detected by default. - try: - x = math.exp(1000000000) - except OverflowError: - pass - else: - self.fail("overflowing exp() didn't trigger OverflowError") - - # If this fails, it could be a puzzle. One odd possibility is that - # mathmodule.c's macros are getting confused while comparing - # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE - # as a result (and so raising OverflowError instead). - try: - x = math.sqrt(-1.0) - except ValueError: - pass - else: - self.fail("sqrt(-1) didn't raise ValueError") + # If this fails, it could be a puzzle. 
One odd possibility is that + # mathmodule.c's macros are getting confused while comparing + # Inf (HUGE_VAL) to a NaN, and artificially setting errno to ERANGE + # as a result (and so raising OverflowError instead). + try: + x = math.sqrt(-1.0) + except ValueError: + pass + else: + self.fail("sqrt(-1) didn't raise ValueError") @requires_IEEE_754 def test_testfile(self): diff --git a/Lib/test/test_mmap.py b/Lib/test/test_mmap.py --- a/Lib/test/test_mmap.py +++ b/Lib/test/test_mmap.py @@ -320,26 +320,25 @@ mf.close() f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_entire_file(self): # test mapping of entire file by passing 0 for map length - if hasattr(os, "stat"): - f = open(TESTFN, "w+") + f = open(TESTFN, "w+") - f.write(2**16 * 'm') # Arbitrary character - f.close() + f.write(2**16 * 'm') # Arbitrary character + f.close() - f = open(TESTFN, "rb+") - mf = mmap.mmap(f.fileno(), 0) - self.assertEqual(len(mf), 2**16, "Map size should equal file size.") - self.assertEqual(mf.read(2**16), 2**16 * "m") - mf.close() - f.close() + f = open(TESTFN, "rb+") + mf = mmap.mmap(f.fileno(), 0) + self.assertEqual(len(mf), 2**16, "Map size should equal file size.") + self.assertEqual(mf.read(2**16), 2**16 * "m") + mf.close() + f.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_offset(self): # Issue #10916: test mapping of remainder of file by passing 0 for # map length with an offset doesn't cause a segfault. - if not hasattr(os, "stat"): - self.skipTest("needs os.stat") # NOTE: allocation granularity is currently 65536 under Win64, # and therefore the minimum offset alignment. with open(TESTFN, "wb") as f: @@ -352,12 +351,10 @@ finally: mf.close() + @unittest.skipUnless(hasattr(os, "stat"), "needs os.stat()") def test_length_0_large_offset(self): # Issue #10959: test mapping of a file by passing 0 for # map length with a large offset doesn't cause a segfault. 
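For reference, the two skip styles this hunk trades between look roughly like this when reduced to a minimal, self-contained sketch (test names invented for illustration):

import os
import unittest

class SkipStyles(unittest.TestCase):
    def test_old_style(self):
        # Pre-patch style: the check runs inside the test body.
        if not hasattr(os, 'stat'):
            self.skipTest('needs os.stat()')
        self.assertTrue(os.stat(os.curdir))

    @unittest.skipUnless(hasattr(os, 'stat'), 'needs os.stat()')
    def test_new_style(self):
        # Post-patch style: the condition is declared up front and the
        # skip is reported without entering the test body.
        self.assertTrue(os.stat(os.curdir))

if __name__ == '__main__':
    unittest.main()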
- if not hasattr(os, "stat"): - self.skipTest("needs os.stat") - with open(TESTFN, "wb") as f: f.write(115699 * b'm') # Arbitrary character @@ -538,9 +535,8 @@ return mmap.mmap.__new__(klass, -1, *args, **kwargs) anon_mmap(PAGESIZE) + @unittest.skipUnless(hasattr(mmap, 'PROT_READ'), "needs mmap.PROT_READ") def test_prot_readonly(self): - if not hasattr(mmap, 'PROT_READ'): - return mapsize = 10 open(TESTFN, "wb").write("a"*mapsize) f = open(TESTFN, "rb") @@ -584,66 +580,68 @@ m.seek(8) self.assertRaises(ValueError, m.write, "bar") - if os.name == 'nt': - def test_tagname(self): - data1 = "0123456789" - data2 = "abcdefghij" - assert len(data1) == len(data2) + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_tagname(self): + data1 = "0123456789" + data2 = "abcdefghij" + assert len(data1) == len(data2) - # Test same tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="foo") - m2[:] = data2 - self.assertEqual(m1[:], data2) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test same tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="foo") + m2[:] = data2 + self.assertEqual(m1[:], data2) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - # Test different tag - m1 = mmap.mmap(-1, len(data1), tagname="foo") - m1[:] = data1 - m2 = mmap.mmap(-1, len(data2), tagname="boo") - m2[:] = data2 - self.assertEqual(m1[:], data1) - self.assertEqual(m2[:], data2) - m2.close() - m1.close() + # Test different tag + m1 = mmap.mmap(-1, len(data1), tagname="foo") + m1[:] = data1 + m2 = mmap.mmap(-1, len(data2), tagname="boo") + m2[:] = data2 + self.assertEqual(m1[:], data1) + self.assertEqual(m2[:], data2) + m2.close() + m1.close() - def test_crasher_on_windows(self): - # Should not crash (Issue 1733986) - m = mmap.mmap(-1, 1000, tagname="foo") - try: - mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size - except: - pass - m.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_crasher_on_windows(self): + # Should not crash (Issue 1733986) + m = mmap.mmap(-1, 1000, tagname="foo") + try: + mmap.mmap(-1, 5000, tagname="foo")[:] # same tagname, but larger size + except: + pass + m.close() - # Should not crash (Issue 5385) - open(TESTFN, "wb").write("x"*10) - f = open(TESTFN, "r+b") - m = mmap.mmap(f.fileno(), 0) - f.close() - try: - m.resize(0) # will raise WindowsError - except: - pass - try: - m[:] - except: - pass - m.close() + # Should not crash (Issue 5385) + open(TESTFN, "wb").write("x"*10) + f = open(TESTFN, "r+b") + m = mmap.mmap(f.fileno(), 0) + f.close() + try: + m.resize(0) # will raise WindowsError + except: + pass + try: + m[:] + except: + pass + m.close() - def test_invalid_descriptor(self): - # socket file descriptors are valid, but out of range - # for _get_osfhandle, causing a crash when validating the - # parameters to _get_osfhandle. - s = socket.socket() - try: - with self.assertRaises(mmap.error): - m = mmap.mmap(s.fileno(), 10) - finally: - s.close() + @unittest.skipUnless(os.name == 'nt', 'requires Windows') + def test_invalid_descriptor(self): + # socket file descriptors are valid, but out of range + # for _get_osfhandle, causing a crash when validating the + # parameters to _get_osfhandle. 
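A minimal sketch of the platform-gated decorators being introduced here, assuming nothing beyond the standard library; the method bodies are placeholders:

import os
import unittest

class PlatformSpecificTests(unittest.TestCase):
    @unittest.skipUnless(os.name == 'nt', 'requires Windows')
    def test_windows_only(self):
        # Runs (and is reported) only on Windows builds.
        self.assertEqual(os.sep, '\\')

    @unittest.skipIf(os.name == 'nt', 'not meaningful on Windows')
    def test_non_windows(self):
        # Skipped on Windows, reported as a skip rather than silently absent.
        self.assertTrue(os.sep)

if __name__ == '__main__':
    unittest.main()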
+ s = socket.socket() + try: + with self.assertRaises(mmap.error): + m = mmap.mmap(s.fileno(), 10) + finally: + s.close() class LargeMmapTests(unittest.TestCase): diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -83,9 +83,8 @@ open(name, "w") self.files.append(name) + @unittest.skipUnless(hasattr(os, 'tempnam'), 'test needs os.tempnam()') def test_tempnam(self): - if not hasattr(os, "tempnam"): - return with warnings.catch_warnings(): warnings.filterwarnings("ignore", "tempnam", RuntimeWarning, r"test_os$") @@ -99,9 +98,8 @@ self.assertTrue(os.path.basename(name)[:3] == "pfx") self.check_tempfile(name) + @unittest.skipUnless(hasattr(os, 'tmpfile'), 'test needs os.tmpfile()') def test_tmpfile(self): - if not hasattr(os, "tmpfile"): - return # As with test_tmpnam() below, the Windows implementation of tmpfile() # attempts to create a file in the root directory of the current drive. # On Vista and Server 2008, this test will always fail for normal users @@ -150,9 +148,8 @@ fp.close() self.assertTrue(s == "foobar") + @unittest.skipUnless(hasattr(os, 'tmpnam'), 'test needs os.tmpnam()') def test_tmpnam(self): - if not hasattr(os, "tmpnam"): - return with warnings.catch_warnings(): warnings.filterwarnings("ignore", "tmpnam", RuntimeWarning, r"test_os$") @@ -193,10 +190,8 @@ os.unlink(self.fname) os.rmdir(test_support.TESTFN) + @unittest.skipUnless(hasattr(os, 'stat'), 'test needs os.stat()') def test_stat_attributes(self): - if not hasattr(os, "stat"): - return - import stat result = os.stat(self.fname) @@ -256,10 +251,8 @@ pass + @unittest.skipUnless(hasattr(os, 'statvfs'), 'test needs os.statvfs()') def test_statvfs_attributes(self): - if not hasattr(os, "statvfs"): - return - try: result = os.statvfs(self.fname) except OSError, e: @@ -311,10 +304,10 @@ st2 = os.stat(test_support.TESTFN) self.assertEqual(st2.st_mtime, int(st.st_mtime-delta)) - # Restrict test to Win32, since there is no guarantee other + # Restrict tests to Win32, since there is no guarantee other # systems support centiseconds - if sys.platform == 'win32': - def get_file_system(path): + def get_file_system(path): + if sys.platform == 'win32': root = os.path.splitdrive(os.path.abspath(path))[0] + '\\' import ctypes kernel32 = ctypes.windll.kernel32 @@ -322,25 +315,31 @@ if kernel32.GetVolumeInformationA(root, None, 0, None, None, None, buf, len(buf)): return buf.value - if get_file_system(test_support.TESTFN) == "NTFS": - def test_1565150(self): - t1 = 1159195039.25 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_1565150(self): + t1 = 1159195039.25 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_large_time(self): - t1 = 5000000000 # some day in 2128 - os.utime(self.fname, (t1, t1)) - self.assertEqual(os.stat(self.fname).st_mtime, t1) + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + "requires NTFS") + def test_large_time(self): + t1 = 5000000000 # some day in 2128 + os.utime(self.fname, (t1, t1)) + self.assertEqual(os.stat(self.fname).st_mtime, t1) - def test_1686475(self): - # Verify that an open file can be stat'ed - try: - os.stat(r"c:\pagefile.sys") - except WindowsError, e: - if e.errno == 2: # file does not exist; cannot 
run test - return - self.fail("Could not stat pagefile.sys") + @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") + def test_1686475(self): + # Verify that an open file can be stat'ed + try: + os.stat(r"c:\pagefile.sys") + except WindowsError, e: + if e.errno == 2: # file does not exist; cannot run test + return + self.fail("Could not stat pagefile.sys") from test import mapping_tests @@ -598,6 +597,7 @@ self.assertRaises(ValueError, os.execvpe, 'notepad', [], None) + at unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32ErrorTests(unittest.TestCase): def test_rename(self): self.assertRaises(WindowsError, os.rename, test_support.TESTFN, test_support.TESTFN+".bak") @@ -644,121 +644,118 @@ self.fail("%r didn't raise a OSError with a bad file descriptor" % f) + @unittest.skipUnless(hasattr(os, 'isatty'), 'test needs os.isatty()') def test_isatty(self): - if hasattr(os, "isatty"): - self.assertEqual(os.isatty(test_support.make_bad_fd()), False) + self.assertEqual(os.isatty(test_support.make_bad_fd()), False) + @unittest.skipUnless(hasattr(os, 'closerange'), 'test needs os.closerange()') def test_closerange(self): - if hasattr(os, "closerange"): - fd = test_support.make_bad_fd() - # Make sure none of the descriptors we are about to close are - # currently valid (issue 6542). - for i in range(10): - try: os.fstat(fd+i) - except OSError: - pass - else: - break - if i < 2: - raise unittest.SkipTest( - "Unable to acquire a range of invalid file descriptors") - self.assertEqual(os.closerange(fd, fd + i-1), None) + fd = test_support.make_bad_fd() + # Make sure none of the descriptors we are about to close are + # currently valid (issue 6542). + for i in range(10): + try: os.fstat(fd+i) + except OSError: + pass + else: + break + if i < 2: + raise unittest.SkipTest( + "Unable to acquire a range of invalid file descriptors") + self.assertEqual(os.closerange(fd, fd + i-1), None) + @unittest.skipUnless(hasattr(os, 'dup2'), 'test needs os.dup2()') def test_dup2(self): - if hasattr(os, "dup2"): - self.check(os.dup2, 20) + self.check(os.dup2, 20) + @unittest.skipUnless(hasattr(os, 'fchmod'), 'test needs os.fchmod()') def test_fchmod(self): - if hasattr(os, "fchmod"): - self.check(os.fchmod, 0) + self.check(os.fchmod, 0) + @unittest.skipUnless(hasattr(os, 'fchown'), 'test needs os.fchown()') def test_fchown(self): - if hasattr(os, "fchown"): - self.check(os.fchown, -1, -1) + self.check(os.fchown, -1, -1) + @unittest.skipUnless(hasattr(os, 'fpathconf'), 'test needs os.fpathconf()') def test_fpathconf(self): - if hasattr(os, "fpathconf"): - self.check(os.fpathconf, "PC_NAME_MAX") + self.check(os.fpathconf, "PC_NAME_MAX") + @unittest.skipUnless(hasattr(os, 'ftruncate'), 'test needs os.ftruncate()') def test_ftruncate(self): - if hasattr(os, "ftruncate"): - self.check(os.ftruncate, 0) + self.check(os.ftruncate, 0) + @unittest.skipUnless(hasattr(os, 'lseek'), 'test needs os.lseek()') def test_lseek(self): - if hasattr(os, "lseek"): - self.check(os.lseek, 0, 0) + self.check(os.lseek, 0, 0) + @unittest.skipUnless(hasattr(os, 'read'), 'test needs os.read()') def test_read(self): - if hasattr(os, "read"): - self.check(os.read, 1) + self.check(os.read, 1) + @unittest.skipUnless(hasattr(os, 'tcsetpgrp'), 'test needs os.tcsetpgrp()') def test_tcsetpgrpt(self): - if hasattr(os, "tcsetpgrp"): - self.check(os.tcsetpgrp, 0) + self.check(os.tcsetpgrp, 0) + @unittest.skipUnless(hasattr(os, 'write'), 'test needs os.write()') def test_write(self): - if hasattr(os, "write"): - 
self.check(os.write, " ") + self.check(os.write, " ") -if sys.platform != 'win32': - class Win32ErrorTests(unittest.TestCase): - pass + at unittest.skipIf(sys.platform == "win32", "Posix specific tests") +class PosixUidGidTests(unittest.TestCase): + @unittest.skipUnless(hasattr(os, 'setuid'), 'test needs os.setuid()') + def test_setuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setuid, 0) + self.assertRaises(OverflowError, os.setuid, 1<<32) - class PosixUidGidTests(unittest.TestCase): - if hasattr(os, 'setuid'): - def test_setuid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setuid, 0) - self.assertRaises(OverflowError, os.setuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setgid'), 'test needs os.setgid()') + def test_setgid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setgid, 0) + self.assertRaises(OverflowError, os.setgid, 1<<32) - if hasattr(os, 'setgid'): - def test_setgid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setgid, 0) - self.assertRaises(OverflowError, os.setgid, 1<<32) + @unittest.skipUnless(hasattr(os, 'seteuid'), 'test needs os.seteuid()') + def test_seteuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.seteuid, 0) + self.assertRaises(OverflowError, os.seteuid, 1<<32) - if hasattr(os, 'seteuid'): - def test_seteuid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.seteuid, 0) - self.assertRaises(OverflowError, os.seteuid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setegid'), 'test needs os.setegid()') + def test_setegid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setegid, 0) + self.assertRaises(OverflowError, os.setegid, 1<<32) - if hasattr(os, 'setegid'): - def test_setegid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setegid, 0) - self.assertRaises(OverflowError, os.setegid, 1<<32) + @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + def test_setreuid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setreuid, 0, 0) + self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) + self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) - if hasattr(os, 'setreuid'): - def test_setreuid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setreuid, 0, 0) - self.assertRaises(OverflowError, os.setreuid, 1<<32, 0) - self.assertRaises(OverflowError, os.setreuid, 0, 1<<32) + @unittest.skipUnless(hasattr(os, 'setreuid'), 'test needs os.setreuid()') + def test_setreuid_neg1(self): + # Needs to accept -1. We run this in a subprocess to avoid + # altering the test runner's process state (issue8045). + subprocess.check_call([ + sys.executable, '-c', + 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) - def test_setreuid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). 
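The subprocess trick referred to in that comment (run the state-changing call in a child interpreter so the test runner's own process is untouched) can be sketched like this; the snippet run in the child is a harmless stand-in, not the setreuid call from the test:

import subprocess
import sys

# Run a snippet in a fresh interpreter so process-wide side effects
# (here just an environment variable tweak) cannot leak into the parent;
# check_call raises CalledProcessError unless the child exits with 0.
subprocess.check_call([
    sys.executable, '-c',
    'import os; os.environ["EXAMPLE_FLAG"] = "1"; raise SystemExit(0)'])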
- subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setreuid(-1,-1);sys.exit(0)']) + @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + def test_setregid(self): + if os.getuid() != 0: + self.assertRaises(os.error, os.setregid, 0, 0) + self.assertRaises(OverflowError, os.setregid, 1<<32, 0) + self.assertRaises(OverflowError, os.setregid, 0, 1<<32) - if hasattr(os, 'setregid'): - def test_setregid(self): - if os.getuid() != 0: - self.assertRaises(os.error, os.setregid, 0, 0) - self.assertRaises(OverflowError, os.setregid, 1<<32, 0) - self.assertRaises(OverflowError, os.setregid, 0, 1<<32) + @unittest.skipUnless(hasattr(os, 'setregid'), 'test needs os.setregid()') + def test_setregid_neg1(self): + # Needs to accept -1. We run this in a subprocess to avoid + # altering the test runner's process state (issue8045). + subprocess.check_call([ + sys.executable, '-c', + 'import os,sys;os.setregid(-1,-1);sys.exit(0)']) - def test_setregid_neg1(self): - # Needs to accept -1. We run this in a subprocess to avoid - # altering the test runner's process state (issue8045). - subprocess.check_call([ - sys.executable, '-c', - 'import os,sys;os.setregid(-1,-1);sys.exit(0)']) -else: - class PosixUidGidTests(unittest.TestCase): - pass @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") class Win32KillTests(unittest.TestCase): diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -11,7 +11,7 @@ import time import errno -from unittest import TestCase +from unittest import TestCase, skipUnless from test import test_support from test.test_support import HOST threading = test_support.import_module('threading') @@ -263,17 +263,20 @@ else: DummyPOP3Handler.handle_read(self) - class TestPOP3_SSLClass(TestPOP3Class): - # repeat previous tests by using poplib.POP3_SSL +requires_ssl = skipUnless(SUPPORTS_SSL, 'SSL not supported') - def setUp(self): - self.server = DummyPOP3Server((HOST, 0)) - self.server.handler = DummyPOP3_SSLHandler - self.server.start() - self.client = poplib.POP3_SSL(self.server.host, self.server.port) + at requires_ssl +class TestPOP3_SSLClass(TestPOP3Class): + # repeat previous tests by using poplib.POP3_SSL - def test__all__(self): - self.assertIn('POP3_SSL', poplib.__all__) + def setUp(self): + self.server = DummyPOP3Server((HOST, 0)) + self.server.handler = DummyPOP3_SSLHandler + self.server.start() + self.client = poplib.POP3_SSL(self.server.host, self.server.port) + + def test__all__(self): + self.assertIn('POP3_SSL', poplib.__all__) class TestTimeouts(TestCase): @@ -331,9 +334,8 @@ def test_main(): - tests = [TestPOP3Class, TestTimeouts] - if SUPPORTS_SSL: - tests.append(TestPOP3_SSLClass) + tests = [TestPOP3Class, TestTimeouts, + TestPOP3_SSLClass] thread_info = test_support.threading_setup() try: test_support.run_unittest(*tests) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -53,47 +53,55 @@ posix_func() self.assertRaises(TypeError, posix_func, 1) - if hasattr(posix, 'getresuid'): - def test_getresuid(self): - user_ids = posix.getresuid() - self.assertEqual(len(user_ids), 3) - for val in user_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresuid'), + 'test needs posix.getresuid()') + def test_getresuid(self): + user_ids = posix.getresuid() + self.assertEqual(len(user_ids), 3) + for val in user_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 
'getresgid'): - def test_getresgid(self): - group_ids = posix.getresgid() - self.assertEqual(len(group_ids), 3) - for val in group_ids: - self.assertGreaterEqual(val, 0) + @unittest.skipUnless(hasattr(posix, 'getresgid'), + 'test needs posix.getresgid()') + def test_getresgid(self): + group_ids = posix.getresgid() + self.assertEqual(len(group_ids), 3) + for val in group_ids: + self.assertGreaterEqual(val, 0) - if hasattr(posix, 'setresuid'): - def test_setresuid(self): - current_user_ids = posix.getresuid() - self.assertIsNone(posix.setresuid(*current_user_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresuid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid(self): + current_user_ids = posix.getresuid() + self.assertIsNone(posix.setresuid(*current_user_ids)) + # -1 means don't change that value. + self.assertIsNone(posix.setresuid(-1, -1, -1)) - def test_setresuid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_user_ids = posix.getresuid() - if 0 not in current_user_ids: - new_user_ids = (current_user_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresuid, *new_user_ids) + @unittest.skipUnless(hasattr(posix, 'setresuid'), + 'test needs posix.setresuid()') + def test_setresuid_exception(self): + # Don't do this test if someone is silly enough to run us as root. + current_user_ids = posix.getresuid() + if 0 not in current_user_ids: + new_user_ids = (current_user_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresuid, *new_user_ids) - if hasattr(posix, 'setresgid'): - def test_setresgid(self): - current_group_ids = posix.getresgid() - self.assertIsNone(posix.setresgid(*current_group_ids)) - # -1 means don't change that value. - self.assertIsNone(posix.setresgid(-1, -1, -1)) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid(self): + current_group_ids = posix.getresgid() + self.assertIsNone(posix.setresgid(*current_group_ids)) + # -1 means don't change that value. + self.assertIsNone(posix.setresgid(-1, -1, -1)) - def test_setresgid_exception(self): - # Don't do this test if someone is silly enough to run us as root. - current_group_ids = posix.getresgid() - if 0 not in current_group_ids: - new_group_ids = (current_group_ids[0]+1, -1, -1) - self.assertRaises(OSError, posix.setresgid, *new_group_ids) + @unittest.skipUnless(hasattr(posix, 'setresgid'), + 'test needs posix.setresgid()') + def test_setresgid_exception(self): + # Don't do this test if someone is silly enough to run us as root. 
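The point of the whole changeset (per its summary, all skipped tests now reported as skipped) is easiest to see side by side; a rough sketch, with os.getuid() standing in for the optional posix attributes used above:

import os
import unittest

HAVE_GETUID = hasattr(os, 'getuid')

class OldStyle(unittest.TestCase):
    # Pre-patch pattern: if the attribute is missing, the method is simply
    # never defined, so the runner shows nothing at all for it.
    if HAVE_GETUID:
        def test_getuid(self):
            self.assertGreaterEqual(os.getuid(), 0)

class NewStyle(unittest.TestCase):
    # Post-patch pattern: the method always exists; on platforms without
    # os.getuid() it shows up in the results as "skipped" with a reason.
    @unittest.skipUnless(HAVE_GETUID, 'test needs os.getuid()')
    def test_getuid(self):
        self.assertGreaterEqual(os.getuid(), 0)

if __name__ == '__main__':
    unittest.main(verbosity=2)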
+ current_group_ids = posix.getresgid() + if 0 not in current_group_ids: + new_group_ids = (current_group_ids[0]+1, -1, -1) + self.assertRaises(OSError, posix.setresgid, *new_group_ids) @unittest.skipUnless(hasattr(posix, 'initgroups'), "test needs os.initgroups()") @@ -120,107 +128,118 @@ else: self.fail("Expected OSError to be raised by initgroups") + @unittest.skipUnless(hasattr(posix, 'statvfs'), + 'test needs posix.statvfs()') def test_statvfs(self): - if hasattr(posix, 'statvfs'): - self.assertTrue(posix.statvfs(os.curdir)) + self.assertTrue(posix.statvfs(os.curdir)) + @unittest.skipUnless(hasattr(posix, 'fstatvfs'), + 'test needs posix.fstatvfs()') def test_fstatvfs(self): - if hasattr(posix, 'fstatvfs'): - fp = open(test_support.TESTFN) - try: - self.assertTrue(posix.fstatvfs(fp.fileno())) - finally: - fp.close() + fp = open(test_support.TESTFN) + try: + self.assertTrue(posix.fstatvfs(fp.fileno())) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'ftruncate'), + 'test needs posix.ftruncate()') def test_ftruncate(self): - if hasattr(posix, 'ftruncate'): - fp = open(test_support.TESTFN, 'w+') - try: - # we need to have some data to truncate - fp.write('test') - fp.flush() - posix.ftruncate(fp.fileno(), 0) - finally: - fp.close() + fp = open(test_support.TESTFN, 'w+') + try: + # we need to have some data to truncate + fp.write('test') + fp.flush() + posix.ftruncate(fp.fileno(), 0) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'dup'), + 'test needs posix.dup()') def test_dup(self): - if hasattr(posix, 'dup'): - fp = open(test_support.TESTFN) - try: - fd = posix.dup(fp.fileno()) - self.assertIsInstance(fd, int) - os.close(fd) - finally: - fp.close() + fp = open(test_support.TESTFN) + try: + fd = posix.dup(fp.fileno()) + self.assertIsInstance(fd, int) + os.close(fd) + finally: + fp.close() + @unittest.skipUnless(hasattr(posix, 'confstr'), + 'test needs posix.confstr()') def test_confstr(self): - if hasattr(posix, 'confstr'): - self.assertRaises(ValueError, posix.confstr, "CS_garbage") - self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + self.assertRaises(ValueError, posix.confstr, "CS_garbage") + self.assertEqual(len(posix.confstr("CS_PATH")) > 0, True) + @unittest.skipUnless(hasattr(posix, 'dup2'), + 'test needs posix.dup2()') def test_dup2(self): - if hasattr(posix, 'dup2'): - fp1 = open(test_support.TESTFN) - fp2 = open(test_support.TESTFN) - try: - posix.dup2(fp1.fileno(), fp2.fileno()) - finally: - fp1.close() - fp2.close() + fp1 = open(test_support.TESTFN) + fp2 = open(test_support.TESTFN) + try: + posix.dup2(fp1.fileno(), fp2.fileno()) + finally: + fp1.close() + fp2.close() def fdopen_helper(self, *args): fd = os.open(test_support.TESTFN, os.O_RDONLY) fp2 = posix.fdopen(fd, *args) fp2.close() + @unittest.skipUnless(hasattr(posix, 'fdopen'), + 'test needs posix.fdopen()') def test_fdopen(self): - if hasattr(posix, 'fdopen'): - self.fdopen_helper() - self.fdopen_helper('r') - self.fdopen_helper('r', 100) + self.fdopen_helper() + self.fdopen_helper('r') + self.fdopen_helper('r', 100) + @unittest.skipUnless(hasattr(posix, 'O_EXLOCK'), + 'test needs posix.O_EXLOCK') def test_osexlock(self): - if hasattr(posix, "O_EXLOCK"): + fd = os.open(test_support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + self.assertRaises(OSError, os.open, test_support.TESTFN, + os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) + + if hasattr(posix, "O_SHLOCK"): fd = os.open(test_support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_CREAT) + 
os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) self.assertRaises(OSError, os.open, test_support.TESTFN, os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) os.close(fd) - if hasattr(posix, "O_SHLOCK"): - fd = os.open(test_support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, test_support.TESTFN, - os.O_WRONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) + @unittest.skipUnless(hasattr(posix, 'O_SHLOCK'), + 'test needs posix.O_SHLOCK') + def test_osshlock(self): + fd1 = os.open(test_support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + fd2 = os.open(test_support.TESTFN, + os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) + os.close(fd2) + os.close(fd1) - def test_osshlock(self): - if hasattr(posix, "O_SHLOCK"): - fd1 = os.open(test_support.TESTFN, + if hasattr(posix, "O_EXLOCK"): + fd = os.open(test_support.TESTFN, os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - fd2 = os.open(test_support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - os.close(fd2) - os.close(fd1) + self.assertRaises(OSError, os.open, test_support.TESTFN, + os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) + os.close(fd) - if hasattr(posix, "O_EXLOCK"): - fd = os.open(test_support.TESTFN, - os.O_WRONLY|os.O_SHLOCK|os.O_CREAT) - self.assertRaises(OSError, os.open, test_support.TESTFN, - os.O_RDONLY|os.O_EXLOCK|os.O_NONBLOCK) - os.close(fd) + @unittest.skipUnless(hasattr(posix, 'fstat'), + 'test needs posix.fstat()') + def test_fstat(self): + fp = open(test_support.TESTFN) + try: + self.assertTrue(posix.fstat(fp.fileno())) + finally: + fp.close() - def test_fstat(self): - if hasattr(posix, 'fstat'): - fp = open(test_support.TESTFN) - try: - self.assertTrue(posix.fstat(fp.fileno())) - finally: - fp.close() - + @unittest.skipUnless(hasattr(posix, 'stat'), + 'test needs posix.stat()') def test_stat(self): - if hasattr(posix, 'stat'): - self.assertTrue(posix.stat(test_support.TESTFN)) + self.assertTrue(posix.stat(test_support.TESTFN)) def _test_all_chown_common(self, chown_func, first_param, stat_func): """Common code for chown, fchown and lchown tests.""" @@ -313,59 +332,62 @@ self._test_all_chown_common(posix.lchown, test_support.TESTFN, getattr(posix, 'lstat', None)) + @unittest.skipUnless(hasattr(posix, 'chdir'), 'test needs posix.chdir()') def test_chdir(self): - if hasattr(posix, 'chdir'): - posix.chdir(os.curdir) - self.assertRaises(OSError, posix.chdir, test_support.TESTFN) + posix.chdir(os.curdir) + self.assertRaises(OSError, posix.chdir, test_support.TESTFN) + @unittest.skipUnless(hasattr(posix, 'lsdir'), 'test needs posix.lsdir()') def test_lsdir(self): - if hasattr(posix, 'lsdir'): - self.assertIn(test_support.TESTFN, posix.lsdir(os.curdir)) + self.assertIn(test_support.TESTFN, posix.lsdir(os.curdir)) + @unittest.skipUnless(hasattr(posix, 'access'), 'test needs posix.access()') def test_access(self): - if hasattr(posix, 'access'): - self.assertTrue(posix.access(test_support.TESTFN, os.R_OK)) + self.assertTrue(posix.access(test_support.TESTFN, os.R_OK)) + @unittest.skipUnless(hasattr(posix, 'umask'), 'test needs posix.umask()') def test_umask(self): - if hasattr(posix, 'umask'): - old_mask = posix.umask(0) - self.assertIsInstance(old_mask, int) - posix.umask(old_mask) + old_mask = posix.umask(0) + self.assertIsInstance(old_mask, int) + posix.umask(old_mask) + @unittest.skipUnless(hasattr(posix, 'strerror'), + 'test needs posix.strerror()') def test_strerror(self): - if hasattr(posix, 'strerror'): - self.assertTrue(posix.strerror(0)) + self.assertTrue(posix.strerror(0)) + @unittest.skipUnless(hasattr(posix, 'pipe'), 'test needs 
posix.pipe()') def test_pipe(self): - if hasattr(posix, 'pipe'): - reader, writer = posix.pipe() - os.close(reader) - os.close(writer) + reader, writer = posix.pipe() + os.close(reader) + os.close(writer) + @unittest.skipUnless(hasattr(posix, 'tempnam'), + 'test needs posix.tempnam()') def test_tempnam(self): - if hasattr(posix, 'tempnam'): - with warnings.catch_warnings(): - warnings.filterwarnings("ignore", "tempnam", DeprecationWarning) - self.assertTrue(posix.tempnam()) - self.assertTrue(posix.tempnam(os.curdir)) - self.assertTrue(posix.tempnam(os.curdir, 'blah')) + with warnings.catch_warnings(): + warnings.filterwarnings("ignore", "tempnam", DeprecationWarning) + self.assertTrue(posix.tempnam()) + self.assertTrue(posix.tempnam(os.curdir)) + self.assertTrue(posix.tempnam(os.curdir, 'blah')) + @unittest.skipUnless(hasattr(posix, 'tmpfile'), + 'test needs posix.tmpfile()') def test_tmpfile(self): - if hasattr(posix, 'tmpfile'): - with warnings.catch_warnings(): - warnings.filterwarnings("ignore", "tmpfile", DeprecationWarning) - fp = posix.tmpfile() - fp.close() + with warnings.catch_warnings(): + warnings.filterwarnings("ignore", "tmpfile", DeprecationWarning) + fp = posix.tmpfile() + fp.close() + @unittest.skipUnless(hasattr(posix, 'utime'), 'test needs posix.utime()') def test_utime(self): - if hasattr(posix, 'utime'): - now = time.time() - posix.utime(test_support.TESTFN, None) - self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (None, None)) - self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (now, None)) - self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (None, now)) - posix.utime(test_support.TESTFN, (int(now), int(now))) - posix.utime(test_support.TESTFN, (now, now)) + now = time.time() + posix.utime(test_support.TESTFN, None) + self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (None, None)) + self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (now, None)) + self.assertRaises(TypeError, posix.utime, test_support.TESTFN, (None, now)) + posix.utime(test_support.TESTFN, (int(now), int(now))) + posix.utime(test_support.TESTFN, (now, now)) def _test_chflags_regular_file(self, chflags_func, target_file): st = os.stat(target_file) @@ -428,56 +450,57 @@ finally: posix.lchflags(_DUMMY_SYMLINK, dummy_symlink_st.st_flags) + @unittest.skipUnless(hasattr(posix, 'getcwd'), + 'test needs posix.getcwd()') def test_getcwd_long_pathnames(self): - if hasattr(posix, 'getcwd'): - dirname = 'getcwd-test-directory-0123456789abcdef-01234567890abcdef' - curdir = os.getcwd() - base_path = os.path.abspath(test_support.TESTFN) + '.getcwd' + dirname = 'getcwd-test-directory-0123456789abcdef-01234567890abcdef' + curdir = os.getcwd() + base_path = os.path.abspath(test_support.TESTFN) + '.getcwd' - try: - os.mkdir(base_path) - os.chdir(base_path) - except: -# Just returning nothing instead of the SkipTest exception, -# because the test results in Error in that case. -# Is that ok? -# raise unittest.SkipTest, "cannot create directory for testing" - return + try: + os.mkdir(base_path) + os.chdir(base_path) + except: +# Just returning nothing instead of the SkipTest exception, +# because the test results in Error in that case. +# Is that ok? 
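The commented-out SkipTest line above points at the runtime alternative to returning early; a small sketch of that pattern, using a temporary-file probe as a made-up precondition:

import os
import tempfile
import unittest

class RuntimeSkipExample(unittest.TestCase):
    def test_needs_writable_tmpdir(self):
        # When a precondition can only be checked at run time, raising
        # unittest.SkipTest (or calling self.skipTest) marks the test as
        # skipped instead of silently returning or erroring out.
        try:
            probe = tempfile.NamedTemporaryFile()
        except (OSError, IOError):
            raise unittest.SkipTest('cannot create a temporary file here')
        with probe:
            self.assertTrue(os.path.exists(probe.name))

if __name__ == '__main__':
    unittest.main()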
+# raise unittest.SkipTest, "cannot create directory for testing" + return - try: - def _create_and_do_getcwd(dirname, current_path_length = 0): - try: - os.mkdir(dirname) - except: - raise unittest.SkipTest, "mkdir cannot create directory sufficiently deep for getcwd test" + try: + def _create_and_do_getcwd(dirname, current_path_length = 0): + try: + os.mkdir(dirname) + except: + raise unittest.SkipTest, "mkdir cannot create directory sufficiently deep for getcwd test" - os.chdir(dirname) - try: - os.getcwd() - if current_path_length < 4099: - _create_and_do_getcwd(dirname, current_path_length + len(dirname) + 1) - except OSError as e: - expected_errno = errno.ENAMETOOLONG - # The following platforms have quirky getcwd() - # behaviour -- see issue 9185 and 15765 for - # more information. - quirky_platform = ( - 'sunos' in sys.platform or - 'netbsd' in sys.platform or - 'openbsd' in sys.platform - ) - if quirky_platform: - expected_errno = errno.ERANGE - self.assertEqual(e.errno, expected_errno) - finally: - os.chdir('..') - os.rmdir(dirname) + os.chdir(dirname) + try: + os.getcwd() + if current_path_length < 4099: + _create_and_do_getcwd(dirname, current_path_length + len(dirname) + 1) + except OSError as e: + expected_errno = errno.ENAMETOOLONG + # The following platforms have quirky getcwd() + # behaviour -- see issue 9185 and 15765 for + # more information. + quirky_platform = ( + 'sunos' in sys.platform or + 'netbsd' in sys.platform or + 'openbsd' in sys.platform + ) + if quirky_platform: + expected_errno = errno.ERANGE + self.assertEqual(e.errno, expected_errno) + finally: + os.chdir('..') + os.rmdir(dirname) - _create_and_do_getcwd(dirname) + _create_and_do_getcwd(dirname) - finally: - os.chdir(curdir) - shutil.rmtree(base_path) + finally: + os.chdir(curdir) + shutil.rmtree(base_path) @unittest.skipUnless(hasattr(os, 'getegid'), "test needs os.getegid()") def test_getgroups(self): @@ -522,7 +545,7 @@ posix.initgroups(name, self.saved_groups[0]) @unittest.skipUnless(hasattr(posix, 'initgroups'), - "test needs posix.initgroups()") + 'test needs posix.initgroups()') def test_initgroups(self): # find missing group @@ -532,7 +555,7 @@ self.assertIn(g, posix.getgroups()) @unittest.skipUnless(hasattr(posix, 'setgroups'), - "test needs posix.setgroups()") + 'test needs posix.setgroups()') def test_setgroups(self): for groups in [[0], range(16)]: posix.setgroups(groups) diff --git a/Lib/test/test_set.py b/Lib/test/test_set.py --- a/Lib/test/test_set.py +++ b/Lib/test/test_set.py @@ -561,10 +561,10 @@ s = None self.assertRaises(ReferenceError, str, p) - # C API test only available in a debug build - if hasattr(set, "test_c_api"): - def test_c_api(self): - self.assertEqual(set().test_c_api(), True) + @unittest.skipUnless(hasattr(set, "test_c_api"), + 'C API test only available in a debug build') + def test_c_api(self): + self.assertEqual(set().test_c_api(), True) class SetSubclass(set): pass diff --git a/Lib/test/test_shutil.py b/Lib/test/test_shutil.py --- a/Lib/test/test_shutil.py +++ b/Lib/test/test_shutil.py @@ -78,33 +78,34 @@ filename = tempfile.mktemp() self.assertRaises(OSError, shutil.rmtree, filename) - # See bug #1071513 for why we don't run this on cygwin - # and bug #1076467 for why we don't run this as root. 
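A compressed sketch of the combined guards used here, with a trivial body; note that the decorator conditions (including the geteuid() check) are evaluated once, when the class body runs, not per test run:

import os
import sys
import unittest

class RmtreeOnErrorLikeTests(unittest.TestCase):
    @unittest.skipUnless(hasattr(os, 'chmod'), 'requires os.chmod()')
    @unittest.skipIf(sys.platform.startswith('cygwin'),
                     'not reliable on Cygwin')
    @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0,
                     'not reliable when running as root')
    def test_permission_error_is_reported(self):
        # Placeholder body; the real test exercises shutil.rmtree onerror.
        self.assertTrue(callable(os.chmod))

if __name__ == '__main__':
    unittest.main()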
- if (hasattr(os, 'chmod') and sys.platform[:6] != 'cygwin' - and not (hasattr(os, 'geteuid') and os.geteuid() == 0)): - def test_on_error(self): - self.errorState = 0 - os.mkdir(TESTFN) - self.childpath = os.path.join(TESTFN, 'a') - f = open(self.childpath, 'w') - f.close() - old_dir_mode = os.stat(TESTFN).st_mode - old_child_mode = os.stat(self.childpath).st_mode - # Make unwritable. - os.chmod(self.childpath, stat.S_IREAD) - os.chmod(TESTFN, stat.S_IREAD) + @unittest.skipUnless(hasattr(os, 'chmod'), 'requires os.chmod()') + @unittest.skipIf(sys.platform[:6] == 'cygwin', + "This test can't be run on Cygwin (issue #1071513).") + @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0, + "This test can't be run reliably as root (issue #1076467).") + def test_on_error(self): + self.errorState = 0 + os.mkdir(TESTFN) + self.childpath = os.path.join(TESTFN, 'a') + f = open(self.childpath, 'w') + f.close() + old_dir_mode = os.stat(TESTFN).st_mode + old_child_mode = os.stat(self.childpath).st_mode + # Make unwritable. + os.chmod(self.childpath, stat.S_IREAD) + os.chmod(TESTFN, stat.S_IREAD) - shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) - # Test whether onerror has actually been called. - self.assertEqual(self.errorState, 2, - "Expected call to onerror function did not happen.") + shutil.rmtree(TESTFN, onerror=self.check_args_to_onerror) + # Test whether onerror has actually been called. + self.assertEqual(self.errorState, 2, + "Expected call to onerror function did not happen.") - # Make writable again. - os.chmod(TESTFN, old_dir_mode) - os.chmod(self.childpath, old_child_mode) + # Make writable again. + os.chmod(TESTFN, old_dir_mode) + os.chmod(self.childpath, old_child_mode) - # Clean up. - shutil.rmtree(TESTFN) + # Clean up. + shutil.rmtree(TESTFN) def check_args_to_onerror(self, func, arg, exc): # test_rmtree_errors deliberately runs rmtree @@ -308,37 +309,38 @@ finally: shutil.rmtree(TESTFN, ignore_errors=True) - if hasattr(os, "mkfifo"): - # Issue #3002: copyfile and copytree block indefinitely on named pipes - def test_copyfile_named_pipe(self): - os.mkfifo(TESTFN) + # Issue #3002: copyfile and copytree block indefinitely on named pipes + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + def test_copyfile_named_pipe(self): + os.mkfifo(TESTFN) + try: + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, TESTFN, TESTFN2) + self.assertRaises(shutil.SpecialFileError, + shutil.copyfile, __file__, TESTFN) + finally: + os.remove(TESTFN) + + @unittest.skipUnless(hasattr(os, "mkfifo"), 'requires os.mkfifo()') + def test_copytree_named_pipe(self): + os.mkdir(TESTFN) + try: + subdir = os.path.join(TESTFN, "subdir") + os.mkdir(subdir) + pipe = os.path.join(subdir, "mypipe") + os.mkfifo(pipe) try: - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, TESTFN, TESTFN2) - self.assertRaises(shutil.SpecialFileError, - shutil.copyfile, __file__, TESTFN) - finally: - os.remove(TESTFN) - - def test_copytree_named_pipe(self): - os.mkdir(TESTFN) - try: - subdir = os.path.join(TESTFN, "subdir") - os.mkdir(subdir) - pipe = os.path.join(subdir, "mypipe") - os.mkfifo(pipe) - try: - shutil.copytree(TESTFN, TESTFN2) - except shutil.Error as e: - errors = e.args[0] - self.assertEqual(len(errors), 1) - src, dst, error_msg = errors[0] - self.assertEqual("`%s` is a named pipe" % pipe, error_msg) - else: - self.fail("shutil.Error should have been raised") - finally: - shutil.rmtree(TESTFN, ignore_errors=True) - shutil.rmtree(TESTFN2, ignore_errors=True) + 
shutil.copytree(TESTFN, TESTFN2) + except shutil.Error as e: + errors = e.args[0] + self.assertEqual(len(errors), 1) + src, dst, error_msg = errors[0] + self.assertEqual("`%s` is a named pipe" % pipe, error_msg) + else: + self.fail("shutil.Error should have been raised") + finally: + shutil.rmtree(TESTFN, ignore_errors=True) + shutil.rmtree(TESTFN2, ignore_errors=True) @unittest.skipUnless(hasattr(os, 'chflags') and hasattr(errno, 'EOPNOTSUPP') and diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -343,16 +343,17 @@ if not fqhn in all_host_names: self.fail("Error testing host resolution mechanisms. (fqdn: %s, all: %s)" % (fqhn, repr(all_host_names))) + @unittest.skipUnless(hasattr(sys, 'getrefcount'), + 'test needs sys.getrefcount()') def testRefCountGetNameInfo(self): # Testing reference count for getnameinfo - if hasattr(sys, "getrefcount"): - try: - # On some versions, this loses a reference - orig = sys.getrefcount(__name__) - socket.getnameinfo(__name__,0) - except TypeError: - self.assertEqual(sys.getrefcount(__name__), orig, - "socket.getnameinfo loses a reference") + try: + # On some versions, this loses a reference + orig = sys.getrefcount(__name__) + socket.getnameinfo(__name__,0) + except TypeError: + self.assertEqual(sys.getrefcount(__name__), orig, + "socket.getnameinfo loses a reference") def testInterpreterCrash(self): # Making sure getnameinfo doesn't crash the interpreter @@ -459,17 +460,17 @@ # Check that setting it to an invalid type raises TypeError self.assertRaises(TypeError, socket.setdefaulttimeout, "spam") + @unittest.skipUnless(hasattr(socket, 'inet_aton'), + 'test needs socket.inet_aton()') def testIPv4_inet_aton_fourbytes(self): - if not hasattr(socket, 'inet_aton'): - return # No inet_aton, nothing to check # Test that issue1008086 and issue767150 are fixed. # It must return 4 bytes. 
self.assertEqual('\x00'*4, socket.inet_aton('0.0.0.0')) self.assertEqual('\xff'*4, socket.inet_aton('255.255.255.255')) + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv4toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform from socket import inet_aton as f, inet_pton, AF_INET g = lambda a: inet_pton(AF_INET, a) @@ -484,9 +485,9 @@ self.assertEqual('\xaa\xaa\xaa\xaa', g('170.170.170.170')) self.assertEqual('\xff\xff\xff\xff', g('255.255.255.255')) + @unittest.skipUnless(hasattr(socket, 'inet_pton'), + 'test needs socket.inet_pton()') def testIPv6toString(self): - if not hasattr(socket, 'inet_pton'): - return # No inet_pton() on this platform try: from socket import inet_pton, AF_INET6, has_ipv6 if not has_ipv6: @@ -503,9 +504,9 @@ f('45ef:76cb:1a:56ef:afeb:bac:1924:aeae') ) + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv4(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform from socket import inet_ntoa as f, inet_ntop, AF_INET g = lambda a: inet_ntop(AF_INET, a) @@ -518,9 +519,9 @@ self.assertEqual('170.85.170.85', g('\xaa\x55\xaa\x55')) self.assertEqual('255.255.255.255', g('\xff\xff\xff\xff')) + @unittest.skipUnless(hasattr(socket, 'inet_ntop'), + 'test needs socket.inet_ntop()') def testStringToIPv6(self): - if not hasattr(socket, 'inet_ntop'): - return # No inet_ntop() on this platform try: from socket import inet_ntop, AF_INET6, has_ipv6 if not has_ipv6: @@ -871,6 +872,8 @@ self.cli.connect((HOST, self.port)) time.sleep(1.0) + at unittest.skipUnless(hasattr(socket, 'socketpair'), + 'test needs socket.socketpair()') @unittest.skipUnless(thread, 'Threading required for this test.') class BasicSocketPairTest(SocketPairTest): @@ -1456,12 +1459,12 @@ if not ok: self.fail("accept() returned success when we did not expect it") + @unittest.skipUnless(hasattr(signal, 'alarm'), + 'test needs signal.alarm()') def testInterruptedTimeout(self): # XXX I don't know how to do this test on MSWindows or any other # plaform that doesn't support signal.alarm() or os.kill(), though # the bug should have existed on all platforms. 
- if not hasattr(signal, "alarm"): - return # can only test on *nix self.serv.settimeout(5.0) # must be longer than alarm class Alarm(Exception): pass @@ -1521,6 +1524,7 @@ self.assertTrue(issubclass(socket.gaierror, socket.error)) self.assertTrue(issubclass(socket.timeout, socket.error)) + at unittest.skipUnless(sys.platform == 'linux', 'Linux specific test') class TestLinuxAbstractNamespace(unittest.TestCase): UNIX_PATH_MAX = 108 @@ -1635,11 +1639,11 @@ for line in f: if line.startswith("tipc "): return True - if test_support.verbose: - print "TIPC module is not loaded, please 'sudo modprobe tipc'" return False -class TIPCTest (unittest.TestCase): + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") +class TIPCTest(unittest.TestCase): def testRDM(self): srv = socket.socket(socket.AF_TIPC, socket.SOCK_RDM) cli = socket.socket(socket.AF_TIPC, socket.SOCK_RDM) @@ -1659,7 +1663,9 @@ self.assertEqual(msg, MSG) -class TIPCThreadableTest (unittest.TestCase, ThreadableTest): + at unittest.skipUnless(isTipcAvailable(), + "TIPC module is not loaded, please 'sudo modprobe tipc'") +class TIPCThreadableTest(unittest.TestCase, ThreadableTest): def __init__(self, methodName = 'runTest'): unittest.TestCase.__init__(self, methodName = methodName) ThreadableTest.__init__(self) @@ -1712,13 +1718,9 @@ NetworkConnectionAttributesTest, NetworkConnectionBehaviourTest, ]) - if hasattr(socket, "socketpair"): - tests.append(BasicSocketPairTest) - if sys.platform == 'linux2': - tests.append(TestLinuxAbstractNamespace) - if isTipcAvailable(): - tests.append(TIPCTest) - tests.append(TIPCThreadableTest) + tests.append(BasicSocketPairTest) + tests.append(TestLinuxAbstractNamespace) + tests.extend([TIPCTest, TIPCThreadableTest]) thread_info = test_support.threading_setup() test_support.run_unittest(*tests) diff --git a/Lib/test/test_socketserver.py b/Lib/test/test_socketserver.py --- a/Lib/test/test_socketserver.py +++ b/Lib/test/test_socketserver.py @@ -27,7 +27,10 @@ HOST = test.test_support.HOST HAVE_UNIX_SOCKETS = hasattr(socket, "AF_UNIX") +requires_unix_sockets = unittest.skipUnless(HAVE_UNIX_SOCKETS, + 'requires Unix sockets') HAVE_FORKING = hasattr(os, "fork") and os.name != "os2" +requires_forking = unittest.skipUnless(HAVE_FORKING, 'requires forking') def signal_alarm(n): """Call signal.alarm when it exists (i.e. 
not on Windows).""" @@ -188,31 +191,33 @@ SocketServer.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingTCPServer(self): - with simple_subprocess(self): - self.run_server(SocketServer.ForkingTCPServer, - SocketServer.StreamRequestHandler, - self.stream_examine) - - if HAVE_UNIX_SOCKETS: - def test_UnixStreamServer(self): - self.run_server(SocketServer.UnixStreamServer, + @requires_forking + def test_ForkingTCPServer(self): + with simple_subprocess(self): + self.run_server(SocketServer.ForkingTCPServer, SocketServer.StreamRequestHandler, self.stream_examine) - def test_ThreadingUnixStreamServer(self): - self.run_server(SocketServer.ThreadingUnixStreamServer, + @requires_unix_sockets + def test_UnixStreamServer(self): + self.run_server(SocketServer.UnixStreamServer, + SocketServer.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + def test_ThreadingUnixStreamServer(self): + self.run_server(SocketServer.ThreadingUnixStreamServer, + SocketServer.StreamRequestHandler, + self.stream_examine) + + @requires_unix_sockets + @requires_forking + def test_ForkingUnixStreamServer(self): + with simple_subprocess(self): + self.run_server(ForkingUnixStreamServer, SocketServer.StreamRequestHandler, self.stream_examine) - if HAVE_FORKING: - def test_ForkingUnixStreamServer(self): - with simple_subprocess(self): - self.run_server(ForkingUnixStreamServer, - SocketServer.StreamRequestHandler, - self.stream_examine) - def test_UDPServer(self): self.run_server(SocketServer.UDPServer, SocketServer.DatagramRequestHandler, @@ -223,12 +228,12 @@ SocketServer.DatagramRequestHandler, self.dgram_examine) - if HAVE_FORKING: - def test_ForkingUDPServer(self): - with simple_subprocess(self): - self.run_server(SocketServer.ForkingUDPServer, - SocketServer.DatagramRequestHandler, - self.dgram_examine) + @requires_forking + def test_ForkingUDPServer(self): + with simple_subprocess(self): + self.run_server(SocketServer.ForkingUDPServer, + SocketServer.DatagramRequestHandler, + self.dgram_examine) @contextlib.contextmanager def mocked_select_module(self): @@ -265,22 +270,24 @@ # Alas, on Linux (at least) recvfrom() doesn't return a meaningful # client address so this cannot work: - # if HAVE_UNIX_SOCKETS: - # def test_UnixDatagramServer(self): - # self.run_server(SocketServer.UnixDatagramServer, - # SocketServer.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_UnixDatagramServer(self): + # self.run_server(SocketServer.UnixDatagramServer, + # SocketServer.DatagramRequestHandler, + # self.dgram_examine) # - # def test_ThreadingUnixDatagramServer(self): - # self.run_server(SocketServer.ThreadingUnixDatagramServer, - # SocketServer.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # def test_ThreadingUnixDatagramServer(self): + # self.run_server(SocketServer.ThreadingUnixDatagramServer, + # SocketServer.DatagramRequestHandler, + # self.dgram_examine) # - # if HAVE_FORKING: - # def test_ForkingUnixDatagramServer(self): - # self.run_server(SocketServer.ForkingUnixDatagramServer, - # SocketServer.DatagramRequestHandler, - # self.dgram_examine) + # @requires_unix_sockets + # @requires_forking + # def test_ForkingUnixDatagramServer(self): + # self.run_server(SocketServer.ForkingUnixDatagramServer, + # SocketServer.DatagramRequestHandler, + # self.dgram_examine) @reap_threads def test_shutdown(self): diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -266,15 
+266,16 @@ # still has 5 elements maj, min, buildno, plat, csd = sys.getwindowsversion() + @unittest.skipUnless(hasattr(sys, "setdlopenflags"), + 'test needs sys.setdlopenflags()') def test_dlopenflags(self): - if hasattr(sys, "setdlopenflags"): - self.assertTrue(hasattr(sys, "getdlopenflags")) - self.assertRaises(TypeError, sys.getdlopenflags, 42) - oldflags = sys.getdlopenflags() - self.assertRaises(TypeError, sys.setdlopenflags) - sys.setdlopenflags(oldflags+1) - self.assertEqual(sys.getdlopenflags(), oldflags+1) - sys.setdlopenflags(oldflags) + self.assertTrue(hasattr(sys, "getdlopenflags")) + self.assertRaises(TypeError, sys.getdlopenflags, 42) + oldflags = sys.getdlopenflags() + self.assertRaises(TypeError, sys.setdlopenflags) + sys.setdlopenflags(oldflags+1) + self.assertEqual(sys.getdlopenflags(), oldflags+1) + sys.setdlopenflags(oldflags) def test_refcount(self): # n here must be a global in order for this test to pass while diff --git a/Lib/test/test_warnings.py b/Lib/test/test_warnings.py --- a/Lib/test/test_warnings.py +++ b/Lib/test/test_warnings.py @@ -259,11 +259,10 @@ finally: warning_tests.__file__ = filename + @unittest.skipUnless(hasattr(sys, 'argv'), 'test needs sys.argv') def test_missing_filename_main_with_argv(self): # If __file__ is not specified and the caller is __main__ and sys.argv # exists, then use sys.argv[0] as the file. - if not hasattr(sys, 'argv'): - return filename = warning_tests.__file__ module_name = warning_tests.__name__ try: diff --git a/Lib/test/test_zlib.py b/Lib/test/test_zlib.py --- a/Lib/test/test_zlib.py +++ b/Lib/test/test_zlib.py @@ -12,6 +12,13 @@ zlib = import_module('zlib') +requires_Compress_copy = unittest.skipUnless( + hasattr(zlib.compressobj(), "copy"), + 'requires Compress.copy()') +requires_Decompress_copy = unittest.skipUnless( + hasattr(zlib.decompressobj(), "copy"), + 'requires Decompress.copy()') + class ChecksumTestCase(unittest.TestCase): # checksum test cases @@ -339,39 +346,39 @@ "mode=%i, level=%i") % (sync, level)) del obj + @unittest.skipUnless(hasattr(zlib, 'Z_SYNC_FLUSH'), + 'requires zlib.Z_SYNC_FLUSH') def test_odd_flush(self): # Test for odd flushing bugs noted in 2.0, and hopefully fixed in 2.1 import random + # Testing on 17K of "random" data - if hasattr(zlib, 'Z_SYNC_FLUSH'): - # Testing on 17K of "random" data + # Create compressor and decompressor objects + co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + dco = zlib.decompressobj() - # Create compressor and decompressor objects - co = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - dco = zlib.decompressobj() + # Try 17K of data + # generate random data stream + try: + # In 2.3 and later, WichmannHill is the RNG of the bug report + gen = random.WichmannHill() + except AttributeError: + try: + # 2.2 called it Random + gen = random.Random() + except AttributeError: + # others might simply have a single RNG + gen = random + gen.seed(1) + data = genblock(1, 17 * 1024, generator=gen) - # Try 17K of data - # generate random data stream - try: - # In 2.3 and later, WichmannHill is the RNG of the bug report - gen = random.WichmannHill() - except AttributeError: - try: - # 2.2 called it Random - gen = random.Random() - except AttributeError: - # others might simply have a single RNG - gen = random - gen.seed(1) - data = genblock(1, 17 * 1024, generator=gen) + # compress, sync-flush, and decompress + first = co.compress(data) + second = co.flush(zlib.Z_SYNC_FLUSH) + expanded = dco.decompress(first + second) - # compress, sync-flush, and decompress - first = 
co.compress(data) - second = co.flush(zlib.Z_SYNC_FLUSH) - expanded = dco.decompress(first + second) - - # if decompressed data is different from the input data, choke. - self.assertEqual(expanded, data, "17K random source doesn't match") + # if decompressed data is different from the input data, choke. + self.assertEqual(expanded, data, "17K random source doesn't match") def test_empty_flush(self): # Test that calling .flush() on unused objects works. @@ -408,35 +415,36 @@ data = zlib.compress(input2) self.assertEqual(dco.flush(), input1[1:]) - if hasattr(zlib.compressobj(), "copy"): - def test_compresscopy(self): - # Test copying a compression object - data0 = HAMLET_SCENE - data1 = HAMLET_SCENE.swapcase() - c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) - bufs0 = [] - bufs0.append(c0.compress(data0)) + @requires_Compress_copy + def test_compresscopy(self): + # Test copying a compression object + data0 = HAMLET_SCENE + data1 = HAMLET_SCENE.swapcase() + c0 = zlib.compressobj(zlib.Z_BEST_COMPRESSION) + bufs0 = [] + bufs0.append(c0.compress(data0)) - c1 = c0.copy() - bufs1 = bufs0[:] + c1 = c0.copy() + bufs1 = bufs0[:] - bufs0.append(c0.compress(data0)) - bufs0.append(c0.flush()) - s0 = ''.join(bufs0) + bufs0.append(c0.compress(data0)) + bufs0.append(c0.flush()) + s0 = ''.join(bufs0) - bufs1.append(c1.compress(data1)) - bufs1.append(c1.flush()) - s1 = ''.join(bufs1) + bufs1.append(c1.compress(data1)) + bufs1.append(c1.flush()) + s1 = ''.join(bufs1) - self.assertEqual(zlib.decompress(s0),data0+data0) - self.assertEqual(zlib.decompress(s1),data0+data1) + self.assertEqual(zlib.decompress(s0),data0+data0) + self.assertEqual(zlib.decompress(s1),data0+data1) - def test_badcompresscopy(self): - # Test copying a compression object in an inconsistent state - c = zlib.compressobj() - c.compress(HAMLET_SCENE) - c.flush() - self.assertRaises(ValueError, c.copy) + @requires_Compress_copy + def test_badcompresscopy(self): + # Test copying a compression object in an inconsistent state + c = zlib.compressobj() + c.compress(HAMLET_SCENE) + c.flush() + self.assertRaises(ValueError, c.copy) def test_decompress_unused_data(self): # Repeated calls to decompress() after EOF should accumulate data in @@ -463,35 +471,36 @@ self.assertEqual(dco.unconsumed_tail, b'') self.assertEqual(dco.unused_data, remainder) - if hasattr(zlib.decompressobj(), "copy"): - def test_decompresscopy(self): - # Test copying a decompression object - data = HAMLET_SCENE - comp = zlib.compress(data) + @requires_Decompress_copy + def test_decompresscopy(self): + # Test copying a decompression object + data = HAMLET_SCENE + comp = zlib.compress(data) - d0 = zlib.decompressobj() - bufs0 = [] - bufs0.append(d0.decompress(comp[:32])) + d0 = zlib.decompressobj() + bufs0 = [] + bufs0.append(d0.decompress(comp[:32])) - d1 = d0.copy() - bufs1 = bufs0[:] + d1 = d0.copy() + bufs1 = bufs0[:] - bufs0.append(d0.decompress(comp[32:])) - s0 = ''.join(bufs0) + bufs0.append(d0.decompress(comp[32:])) + s0 = ''.join(bufs0) - bufs1.append(d1.decompress(comp[32:])) - s1 = ''.join(bufs1) + bufs1.append(d1.decompress(comp[32:])) + s1 = ''.join(bufs1) - self.assertEqual(s0,s1) - self.assertEqual(s0,data) + self.assertEqual(s0,s1) + self.assertEqual(s0,data) - def test_baddecompresscopy(self): - # Test copying a compression object in an inconsistent state - data = zlib.compress(HAMLET_SCENE) - d = zlib.decompressobj() - d.decompress(data) - d.flush() - self.assertRaises(ValueError, d.copy) + @requires_Decompress_copy + def test_baddecompresscopy(self): + # Test 
copying a compression object in an inconsistent state + data = zlib.compress(HAMLET_SCENE) + d = zlib.decompressobj() + d.decompress(data) + d.flush() + self.assertRaises(ValueError, d.copy) # Memory use of the following functions takes into account overallocation diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -23,6 +23,8 @@ Tests ----- +- Issue #18702: All skipped tests now reported as skipped. + - Issue #19085: Added basic tests for all tkinter widget options. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 3 22:26:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 3 Nov 2013 22:26:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogRml4IHRlc3Rfb3Mg?= =?utf-8?b?KGlzc3VlICMxODcwMiku?= Message-ID: <3dCVZK6kZRz7Ljy@mail.python.org> http://hg.python.org/cpython/rev/0d8f0526813f changeset: 86898:0d8f0526813f branch: 2.7 user: Serhiy Storchaka date: Sun Nov 03 23:25:42 2013 +0200 summary: Fix test_os (issue #18702). files: Lib/test/test_os.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -316,7 +316,7 @@ return buf.value @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") - @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + @unittest.skipUnless(get_file_system(test_support.TESTFN) == "NTFS", "requires NTFS") def test_1565150(self): t1 = 1159195039.25 @@ -324,7 +324,7 @@ self.assertEqual(os.stat(self.fname).st_mtime, t1) @unittest.skipUnless(sys.platform == "win32", "Win32 specific tests") - @unittest.skipUnless(get_file_system(support.TESTFN) == "NTFS", + @unittest.skipUnless(get_file_system(test_support.TESTFN) == "NTFS", "requires NTFS") def test_large_time(self): t1 = 5000000000 # some day in 2128 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 00:35:47 2013 From: python-checkins at python.org (ned.deily) Date: Mon, 4 Nov 2013 00:35:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2318702_null_merge?= Message-ID: <3dCYRR5GXGzSqw@mail.python.org> http://hg.python.org/cpython/rev/a699550bc73b changeset: 86899:a699550bc73b parent: 86896:09105051b9f4 parent: 86895:1feeeb8992f8 user: Ned Deily date: Sun Nov 03 15:34:37 2013 -0800 summary: Issue #18702 null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 01:54:27 2013 From: python-checkins at python.org (r.david.murray) Date: Mon, 4 Nov 2013 01:54:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2318678=3A_Correct_names_?= =?utf-8?q?of_spwd_struct_members=2E?= Message-ID: <3dCbBC62qrz7LjR@mail.python.org> http://hg.python.org/cpython/rev/1b0ca1a7a3ca changeset: 86900:1b0ca1a7a3ca user: R David Murray date: Sun Nov 03 19:54:05 2013 -0500 summary: #18678: Correct names of spwd struct members. The old names (sp_nam and sp_pwd) are kept for backward compatibility. Since this is a long standing bug that hasn't caused any real-world problems, I'm not backporting it. However, it is worth fixing because the corrected names match the documentation, and more importantly now match the C struct, just like the other struct members. Patch by Vajrasky Kok. 
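The issue #18702 diffs above all apply the same refactoring: a test that used to be defined only inside an "if hasattr(...)" guard, and so silently disappeared from the run when the precondition failed, is now always defined and gated with a skip decorator, optionally shared as a module-level constant. A minimal sketch of the pattern, using a made-up os.frobnicate() precondition rather than any real attribute:

    import os
    import unittest

    # Reusable module-level decorator, in the spirit of requires_unix_sockets
    # or requires_Compress_copy in the diffs above.
    requires_frobnicate = unittest.skipUnless(hasattr(os, 'frobnicate'),
                                              'requires os.frobnicate()')

    class ExampleTests(unittest.TestCase):

        # Old style (what the diffs remove): the method was only defined when
        # the precondition held, so a missing feature left no trace at all in
        # the test report.
        #
        #   if hasattr(os, 'frobnicate'):
        #       def test_frobnicate(self):
        #           ...

        @requires_frobnicate
        def test_frobnicate(self):
            # Runs only where os.frobnicate() exists; everywhere else it is
            # reported as skipped, with the reason given above.
            self.assertTrue(callable(os.frobnicate))

    if __name__ == '__main__':
        unittest.main()

Run directly, the sketch reports the test as skipped on any current platform, which is exactly the visibility the issue is after.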
files: Doc/library/spwd.rst | 10 +++++----- Misc/NEWS | 4 ++++ Modules/spwdmodule.c | 14 +++++++++----- 3 files changed, 18 insertions(+), 10 deletions(-) diff --git a/Doc/library/spwd.rst b/Doc/library/spwd.rst --- a/Doc/library/spwd.rst +++ b/Doc/library/spwd.rst @@ -19,9 +19,9 @@ +-------+---------------+---------------------------------+ | Index | Attribute | Meaning | +=======+===============+=================================+ -| 0 | ``sp_nam`` | Login name | +| 0 | ``sp_namp`` | Login name | +-------+---------------+---------------------------------+ -| 1 | ``sp_pwd`` | Encrypted password | +| 1 | ``sp_pwdp`` | Encrypted password | +-------+---------------+---------------------------------+ | 2 | ``sp_lstchg`` | Date of last change | +-------+---------------+---------------------------------+ @@ -36,15 +36,15 @@ +-------+---------------+---------------------------------+ | 6 | ``sp_inact`` | Number of days after password | | | | expires until account is | -| | | blocked | +| | | disabled | +-------+---------------+---------------------------------+ | 7 | ``sp_expire`` | Number of days since 1970-01-01 | -| | | until account is disabled | +| | | when account expires | +-------+---------------+---------------------------------+ | 8 | ``sp_flag`` | Reserved | +-------+---------------+---------------------------------+ -The sp_nam and sp_pwd items are strings, all others are integers. +The sp_namp and sp_pwdp items are strings, all others are integers. :exc:`KeyError` is raised if the entry asked for cannot be found. The following functions are defined: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,10 @@ Library ------- +- Issue #18678: Corrected spwd struct member names in spwd module: + sp_nam->sp_namp, and sp_pwd->sp_pwdp. The old names are kept as extra + structseq members, for backward compatibility. + - Issue #6157: Fixed tkinter.Text.debug(). tkinter.Text.bbox() now raises TypeError instead of TclError on wrong number of arguments. Original patch by Guilherme Polo. 
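As a usage sketch of the renamed attributes documented above (not part of the patch itself): the spwd module is POSIX-only, reading the shadow database normally requires root, and 'daemon' below is just a placeholder account name.

    import spwd

    entry = spwd.getspnam('daemon')           # placeholder account; needs privileges
    print(entry.sp_namp, entry.sp_lstchg)     # corrected names, matching struct spwd
    # The old names stay available as extra struct members for backward
    # compatibility, so existing code keeps working:
    assert entry.sp_nam == entry.sp_namp
    assert entry.sp_pwd == entry.sp_pwdp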
diff --git a/Modules/spwdmodule.c b/Modules/spwdmodule.c --- a/Modules/spwdmodule.c +++ b/Modules/spwdmodule.c @@ -26,22 +26,24 @@ #if defined(HAVE_GETSPNAM) || defined(HAVE_GETSPENT) static PyStructSequence_Field struct_spwd_type_fields[] = { - {"sp_nam", "login name"}, - {"sp_pwd", "encrypted password"}, + {"sp_namp", "login name"}, + {"sp_pwdp", "encrypted password"}, {"sp_lstchg", "date of last change"}, {"sp_min", "min #days between changes"}, {"sp_max", "max #days between changes"}, {"sp_warn", "#days before pw expires to warn user about it"}, - {"sp_inact", "#days after pw expires until account is blocked"}, - {"sp_expire", "#days since 1970-01-01 until account is disabled"}, + {"sp_inact", "#days after pw expires until account is disabled"}, + {"sp_expire", "#days since 1970-01-01 when account expires"}, {"sp_flag", "reserved"}, + {"sp_nam", "login name; deprecated"}, /* Backward compatibility */ + {"sp_pwd", "encrypted password; deprecated"}, /* Backward compatibility */ {0} }; PyDoc_STRVAR(struct_spwd__doc__, "spwd.struct_spwd: Results from getsp*() routines.\n\n\ This object may be accessed either as a 9-tuple of\n\ - (sp_nam,sp_pwd,sp_lstchg,sp_min,sp_max,sp_warn,sp_inact,sp_expire,sp_flag)\n\ + (sp_namp,sp_pwdp,sp_lstchg,sp_min,sp_max,sp_warn,sp_inact,sp_expire,sp_flag)\n\ or via the object attributes as named in the above tuple."); static PyStructSequence_Desc struct_spwd_type_desc = { @@ -86,6 +88,8 @@ SETI(setIndex++, p->sp_inact); SETI(setIndex++, p->sp_expire); SETI(setIndex++, p->sp_flag); + SETS(setIndex++, p->sp_namp); /* Backward compatibility for sp_nam */ + SETS(setIndex++, p->sp_pwdp); /* Backward compatibility for sp_pwd */ #undef SETS #undef SETI -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 04:53:10 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 4 Nov 2013 04:53:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Mzkx?= =?utf-8?q?=3A_Clean_up_PCbuild/readme=2Etxt?= Message-ID: <3dCg8Q6VdWz7LjN@mail.python.org> http://hg.python.org/cpython/rev/7b6ac858bb17 changeset: 86901:7b6ac858bb17 branch: 2.7 parent: 86898:0d8f0526813f user: Zachary Ware date: Sun Nov 03 21:43:33 2013 -0600 summary: Issue #19391: Clean up PCbuild/readme.txt files: PCbuild/readme.txt | 62 ++++++++++----------------------- 1 files changed, 20 insertions(+), 42 deletions(-) diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -1,7 +1,7 @@ Building Python using VC++ 9.0 ------------------------------ -This directory is used to build Python for Win32 and x64 platforms, e.g. +This directory is used to build Python for Win32 and x64 platforms, e.g. Windows 2000, XP, Vista and Windows Server 2008. In order to build 32-bit debug and release executables, Microsoft Visual C++ 2008 Express Edition is required at the very least. In order to build 64-bit debug and release @@ -27,7 +27,7 @@ The solution is configured to build the projects in the correct order. "Build Solution" or F7 takes care of dependencies except for x64 builds. To make -cross compiling x64 builds on a 32bit OS possible the x64 builds require a +cross compiling x64 builds on a 32bit OS possible the x64 builds require a 32bit version of Python. NOTE: @@ -37,7 +37,7 @@ running a Python core buildbot test slave; see SUBPROJECTS below) When using the Debug setting, the output files have a _d added to -their name: python30_d.dll, python_d.exe, parser_d.pyd, and so on. 
Both +their name: python27_d.dll, python_d.exe, parser_d.pyd, and so on. Both the build and rt batch files accept a -d option for debug builds. The 32bit builds end up in the solution folder PCbuild while the x64 builds @@ -47,7 +47,7 @@ Legacy support -------------- -You can find build directories for older versions of Visual Studio and +You can find build directories for older versions of Visual Studio and Visual C++ in the PC directory. The legacy build directories are no longer actively maintained and may not work out of the box. @@ -64,7 +64,7 @@ Visual Studio 2008 uses version 9 of the C runtime (MSVCRT9). The executables are linked to a CRT "side by side" assembly which must be present on the target -machine. This is avalible under the VC/Redist folder of your visual studio +machine. This is available under the VC/Redist folder of your visual studio distribution. On XP and later operating systems that support side-by-side assemblies it is not enough to have the msvcrt90.dll present, it has to be there as a whole assembly, that is, a folder with the .dll @@ -105,16 +105,16 @@ Python-controlled subprojects that wrap external projects: _bsddb Wraps Berkeley DB 4.7.25, which is currently built by _bsddb.vcproj. - project (see below). + project. _sqlite3 - Wraps SQLite 3.6.21, which is currently built by sqlite3.vcproj (see below). + Wraps SQLite 3.6.21, which is currently built by sqlite3.vcproj. _tkinter Wraps the Tk windowing system. Unlike _bsddb and _sqlite3, there's no corresponding tcltk.vcproj-type project that builds Tcl/Tk from vcproj's within our pcbuild.sln, which means this module expects to find a pre-built Tcl/Tk in either ..\..\tcltk for 32-bit or ..\..\tcltk64 for 64-bit (relative to this directory). See below for instructions to build - Tcl/Tk. + Tcl/Tk. bz2 Python wrapper for the libbz2 compression library. Homepage http://sources.redhat.com/bzip2/ @@ -127,16 +127,6 @@ obtaining external sources then you don't need to manually get the source above via subversion. ** - A custom pre-link step in the bz2 project settings should manage to - build bzip2-1.0.6\libbz2.lib by magic before bz2.pyd (or bz2_d.pyd) is - linked in PCbuild\. - However, the bz2 project is not smart enough to remove anything under - bzip2-1.0.6\ when you do a clean, so if you want to rebuild bzip2.lib - you need to clean up bzip2-1.0.6\ by hand. - - All of this managed to build libbz2.lib in - bzip2-1.0.6\$platform-$configuration\, which the Python project links in. - _ssl Python wrapper for the secure sockets library. @@ -154,18 +144,16 @@ You must install the NASM assembler from http://nasm.sf.net - for x86 builds. Put nasmw.exe anywhere in your PATH. - Note: recent releases of nasm only have nasm.exe. Just rename it to - nasmw.exe. + for x86 builds. Put nasm.exe anywhere in your PATH. You can also install ActivePerl from http://www.activestate.com/activeperl/ - if you like to use the official sources instead of the files from + if you like to use the official sources instead of the files from python's subversion repository. The svn version contains pre-build makefiles and assembly files. The build process makes sure that no patented algorithms are included. - For now RC5, MDC2 and IDEA are excluded from the build. You may have + For now RC5, MDC2 and IDEA are excluded from the build. You may have to manually remove $(OBJ_D)\i_*.obj from ms\nt.mak if the build process complains about missing files or forbidden IDEA. Again the files provided in the subversion repository are already fixed. 
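Once the _ssl project described above has been built, a quick smoke test with the freshly built python.exe can confirm that the OpenSSL-backed extensions import; this is only a convenience check, not part of the build scripts:

    import hashlib
    import ssl

    print(ssl.OPENSSL_VERSION)    # reports the OpenSSL release the build linked against
    print(hashlib.sha256(b'pcbuild smoke test').hexdigest())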
@@ -186,16 +174,16 @@ this by hand. The subprojects above wrap external projects Python doesn't control, and as -such, a little more work is required in order to download the relevant source +such, a little more work is required in order to download the relevant source files for each project before they can be built. The buildbots do this each -time they're built, so the easiest approach is to run either external.bat or +time they're built, so the easiest approach is to run either external.bat or external-amd64.bat in the ..\Tools\buildbot directory from ..\, i.e.: C:\..\svn.python.org\projects\python\trunk\PCbuild>cd .. C:\..\svn.python.org\projects\python\trunk>Tools\buildbot\external.bat This extracts all the external subprojects from http://svn.python.org/external -via Subversion (so you'll need an svn.exe on your PATH) and places them in +via Subversion (so you'll need an svn.exe on your PATH) and places them in ..\.. (relative to this directory). The external(-amd64).bat scripts will also build a debug build of Tcl/Tk; there aren't any equivalent batch files for building release versions of Tcl/Tk lying around in the Tools\buildbot @@ -238,7 +226,7 @@ junction as follows (using the directory structure above as an example): C:\..\python\trunk\external <- already exists and has built versions - of the external subprojects + of the external subprojects C:\..\python\branches\py3k>linkd.exe external ..\..\trunk\external Link created at: external @@ -251,19 +239,9 @@ Building for Itanium -------------------- -NOTE: Official support for Itanium builds have been dropped from the build. Please contact us and provide patches if you are interested in Itanium builds. -The project files support a ReleaseItanium configuration which creates -Win64/Itanium binaries. For this to work, you need to install the Platform -SDK, in particular the 64-bit support. This includes an Itanium compiler -(future releases of the SDK likely include an AMD64 compiler as well). -In addition, you need the Visual Studio plugin for external C compilers, -from http://sf.net/projects/vsextcomp. The plugin will wrap cl.exe, to -locate the proper target compiler, and convert compiler options -accordingly. The project files require at least version 0.9. - Building for AMD64 ------------------ @@ -283,7 +261,7 @@ The solution has two configurations for PGO. The PGInstrument configuration must be build first. The PGInstrument binaries are -lniked against a profiling library and contain extra debug +linked against a profiling library and contain extra debug information. The PGUpdate configuration takes the profiling data and generates optimized binaries. @@ -291,22 +269,22 @@ creates the PGI files, runs the unit test suite or PyBench with the PGI python and finally creates the optimized files. -http://msdn2.microsoft.com/en-us/library/e7k32f4k(VS.90).aspx +http://msdn.microsoft.com/en-us/library/e7k32f4k(VS.90).aspx Static library -------------- The solution has no configuration for static libraries. However it is easy -it build a static library instead of a DLL. You simply have to set the +it build a static library instead of a DLL. You simply have to set the "Configuration Type" to "Static Library (.lib)" and alter the preprocessor macro "Py_ENABLE_SHARED" to "Py_NO_ENABLE_SHARED". You may also have to -change the "Runtime Library" from "Multi-threaded DLL (/MD)" to +change the "Runtime Library" from "Multi-threaded DLL (/MD)" to "Multi-threaded (/MT)". 
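A convenient way to confirm which flavour of interpreter a given build produced (Win32 vs x64, release vs debug) is simply to ask it; this is an illustration only, nothing in the build relies on it:

    import platform
    import sys

    print(platform.architecture()[0])         # '32bit' or '64bit'
    print(sys.maxsize > 2**32)                # True only on 64-bit builds
    print(hasattr(sys, 'gettotalrefcount'))   # True only in _d (Py_DEBUG) builds
    print(sys.version)                        # also names the MSC compiler version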
Visual Studio properties ------------------------ -The PCbuild solution makes heavy use of Visual Studio property files +The PCbuild solution makes heavy use of Visual Studio property files (*.vsprops). The properties can be viewed and altered in the Property Manager (View -> Other Windows -> Property Manager). -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 04:53:12 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 4 Nov 2013 04:53:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Mzkx?= =?utf-8?q?=3A_Clean_up_PCbuild/readme=2Etxt?= Message-ID: <3dCg8S2KkTz7LjT@mail.python.org> http://hg.python.org/cpython/rev/f28a2d072767 changeset: 86902:f28a2d072767 branch: 3.3 parent: 86895:1feeeb8992f8 user: Zachary Ware date: Sun Nov 03 21:48:54 2013 -0600 summary: Issue #19391: Clean up PCbuild/readme.txt files: PCbuild/readme.txt | 74 +++++++++++---------------------- 1 files changed, 26 insertions(+), 48 deletions(-) diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -1,8 +1,8 @@ Building Python using VC++ 10.0 ------------------------------- -This directory is used to build Python for Win32 and x64 platforms, e.g. -Windows 2000, XP, Vista and Windows Server 2008. In order to build 32-bit +This directory is used to build Python for Win32 and x64 platforms, e.g. +Windows XP, Vista and Windows Server 2008. In order to build 32-bit debug and release executables, Microsoft Visual C++ 2010 Express Edition is required at the very least. In order to build 64-bit debug and release executables, Visual Studio 2010 Standard Edition is required at the very @@ -27,7 +27,7 @@ The solution is configured to build the projects in the correct order. "Build Solution" or F7 takes care of dependencies except for x64 builds. To make -cross compiling x64 builds on a 32bit OS possible the x64 builds require a +cross compiling x64 builds on a 32bit OS possible the x64 builds require a 32bit version of Python. NOTE: @@ -47,7 +47,7 @@ Legacy support -------------- -You can find build directories for older versions of Visual Studio and +You can find build directories for older versions of Visual Studio and Visual C++ in the PC directory. The legacy build directories are no longer actively maintained and may not work out of the box. @@ -64,10 +64,10 @@ C RUNTIME --------- -Visual Studio 2010 uses version 10 of the C runtime (MSVCRT9). The executables +Visual Studio 2010 uses version 10 of the C runtime (MSVCRT10). The executables no longer use the "Side by Side" assemblies used in previous versions of the compiler. This simplifies distribution of applications. -The run time libraries are avalible under the VC/Redist folder of your visual studio +The run time libraries are available under the VC/Redist folder of your visual studio distribution. For more info, see the Readme in the VC/Redist folder. SUBPROJECTS @@ -103,14 +103,14 @@ Python-controlled subprojects that wrap external projects: _sqlite3 - Wraps SQLite 3.7.4, which is currently built by sqlite3.vcproj (see below). + Wraps SQLite 3.7.12, which is currently built by sqlite3.vcxproj. _tkinter Wraps the Tk windowing system. 
Unlike _sqlite3, there's no - corresponding tcltk.vcproj-type project that builds Tcl/Tk from vcproj's + corresponding tcltk.vcxproj-type project that builds Tcl/Tk from vcxproj's within our pcbuild.sln, which means this module expects to find a pre-built Tcl/Tk in either ..\..\tcltk for 32-bit or ..\..\tcltk64 for 64-bit (relative to this directory). See below for instructions to build - Tcl/Tk. + Tcl/Tk. _bz2 Python wrapper for the libbzip2 compression library. Homepage http://www.bzip.org/ @@ -122,16 +122,6 @@ ** NOTE: if you use the Tools\buildbot\external(-amd64).bat approach for obtaining external sources then you don't need to manually get the source above via subversion. ** - - A custom pre-link step in the bz2 project settings should manage to - build bzip2-1.0.6\libbz2.lib by magic before bz2.pyd (or bz2_d.pyd) is - linked in PCbuild\. - However, the bz2 project is not smart enough to remove anything under - bzip2-1.0.6\ when you do a clean, so if you want to rebuild bzip2.lib - you need to clean up bzip2-1.0.6\ by hand. - - All of this managed to build libbz2.lib in - bzip2-1.0.6\$platform-$configuration\, which the Python project links in. _lzma Python wrapper for the liblzma compression library. @@ -156,21 +146,19 @@ You must install the NASM assembler 2.10 or newer from http://nasm.sf.net - for x86 builds. Put nasmw.exe anywhere in your PATH. More recent + for x86 builds. Put nasm.exe anywhere in your PATH. More recent versions of OpenSSL may need a later version of NASM. If OpenSSL's self tests don't pass, you should first try to update NASM and do a full rebuild of OpenSSL. - Note: recent releases of nasm only have nasm.exe. Just rename it to - nasmw.exe. You can also install ActivePerl from http://www.activestate.com/activeperl/ - if you like to use the official sources instead of the files from + if you like to use the official sources instead of the files from python's subversion repository. The svn version contains pre-build makefiles and assembly files. The build process makes sure that no patented algorithms are included. - For now RC5, MDC2 and IDEA are excluded from the build. You may have + For now RC5, MDC2 and IDEA are excluded from the build. You may have to manually remove $(OBJ_D)\i_*.obj from ms\nt.mak if the build process complains about missing files or forbidden IDEA. Again the files provided in the subversion repository are already fixed. @@ -191,16 +179,16 @@ this by hand. The subprojects above wrap external projects Python doesn't control, and as -such, a little more work is required in order to download the relevant source +such, a little more work is required in order to download the relevant source files for each project before they can be built. The buildbots do this each -time they're built, so the easiest approach is to run either external.bat or +time they're built, so the easiest approach is to run either external.bat or external-amd64.bat in the ..\Tools\buildbot directory from ..\, i.e.: C:\..\svn.python.org\projects\python\trunk\PCbuild>cd .. C:\..\svn.python.org\projects\python\trunk>Tools\buildbot\external.bat This extracts all the external subprojects from http://svn.python.org/external -via Subversion (so you'll need an svn.exe on your PATH) and places them in +via Subversion (so you'll need an svn.exe on your PATH) and places them in ..\.. (relative to this directory). 
The external(-amd64).bat scripts will also build a debug build of Tcl/Tk; there aren't any equivalent batch files for building release versions of Tcl/Tk lying around in the Tools\buildbot @@ -209,18 +197,18 @@ two nmake lines, then call each one without the 'DEBUG=1' parameter, i.e.: The external-amd64.bat file contains this for tcl: - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all install + nmake -f makefile.vc DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all install So for a release build, you'd call it as: - nmake -f makefile.vc COMPILERFLAGS=-DWINVER=0x0500 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all install + nmake -f makefile.vc MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all install XXX Should we compile with OPTS=threads? XXX Our installer copies a lot of stuff out of the Tcl/Tk install XXX directory. Is all of that really needed for Python use of Tcl/Tk? This will be cleaned up in the future; ideally Tcl/Tk will be brought into our -pcbuild.sln as custom .vcproj files, just as we've recently done with the -sqlite3.vcproj file, which will remove the need for Tcl/Tk to be built +pcbuild.sln as custom .vcxproj files, just as we've recently done with the +sqlite3.vcxproj file, which will remove the need for Tcl/Tk to be built separately via a batch file. XXX trent.nelson 02-Apr-08: @@ -243,7 +231,7 @@ junction as follows (using the directory structure above as an example): C:\..\python\trunk\external <- already exists and has built versions - of the external subprojects + of the external subprojects C:\..\python\branches\py3k>linkd.exe external ..\..\trunk\external Link created at: external @@ -256,19 +244,9 @@ Building for Itanium -------------------- -NOTE: Official support for Itanium builds have been dropped from the build. Please contact us and provide patches if you are interested in Itanium builds. -The project files support a ReleaseItanium configuration which creates -Win64/Itanium binaries. For this to work, you need to install the Platform -SDK, in particular the 64-bit support. This includes an Itanium compiler -(future releases of the SDK likely include an AMD64 compiler as well). -In addition, you need the Visual Studio plugin for external C compilers, -from http://sf.net/projects/vsextcomp. The plugin will wrap cl.exe, to -locate the proper target compiler, and convert compiler options -accordingly. The project files require at least version 0.9. - Building for AMD64 ------------------ @@ -288,7 +266,7 @@ The solution has two configurations for PGO. The PGInstrument configuration must be build first. The PGInstrument binaries are -lniked against a profiling library and contain extra debug +linked against a profiling library and contain extra debug information. The PGUpdate configuration takes the profiling data and generates optimized binaries. @@ -296,23 +274,23 @@ creates the PGI files, runs the unit test suite or PyBench with the PGI python and finally creates the optimized files. -http://msdn2.microsoft.com/en-us/library/e7k32f4k(VS.90).aspx +http://msdn.microsoft.com/en-us/library/e7k32f4k(VS.100).aspx Static library -------------- The solution has no configuration for static libraries. However it is easy -it build a static library instead of a DLL. You simply have to set the +it build a static library instead of a DLL. You simply have to set the "Configuration Type" to "Static Library (.lib)" and alter the preprocessor macro "Py_ENABLE_SHARED" to "Py_NO_ENABLE_SHARED". 
You may also have to -change the "Runtime Library" from "Multi-threaded DLL (/MD)" to +change the "Runtime Library" from "Multi-threaded DLL (/MD)" to "Multi-threaded (/MT)". Visual Studio properties ------------------------ -The PCbuild solution makes heavy use of Visual Studio property files -(*.vsprops). The properties can be viewed and altered in the Property +The PCbuild solution makes heavy use of Visual Studio property files +(*.props). The properties can be viewed and altered in the Property Manager (View -> Other Windows -> Property Manager). * debug (debug macro: _DEBUG) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 04:53:13 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 4 Nov 2013 04:53:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge_3=2E3?= Message-ID: <3dCg8T3y84z7Lk3@mail.python.org> http://hg.python.org/cpython/rev/74a118d260a7 changeset: 86903:74a118d260a7 parent: 86900:1b0ca1a7a3ca parent: 86902:f28a2d072767 user: Zachary Ware date: Sun Nov 03 21:51:42 2013 -0600 summary: Null merge 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:10:21 2013 From: python-checkins at python.org (ned.deily) Date: Mon, 4 Nov 2013 05:10:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE1Mzky?= =?utf-8?q?=3A_Install_idlelib/idle=5Ftest=2E?= Message-ID: <3dCgXF5KWjz7Ljc@mail.python.org> http://hg.python.org/cpython/rev/dac6aea39814 changeset: 86904:dac6aea39814 branch: 2.7 parent: 86901:7b6ac858bb17 user: Ned Deily date: Sun Nov 03 20:08:17 2013 -0800 summary: Issue #15392: Install idlelib/idle_test. files: Makefile.pre.in | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -939,7 +939,8 @@ logging bsddb bsddb/test csv importlib wsgiref \ lib2to3 lib2to3/fixes lib2to3/pgen2 lib2to3/tests \ lib2to3/tests/data lib2to3/tests/data/fixers lib2to3/tests/data/fixers/myfixes \ - ctypes ctypes/test ctypes/macholib idlelib idlelib/Icons \ + ctypes ctypes/test ctypes/macholib \ + idlelib idlelib/Icons idlelib/idle_test \ distutils distutils/command distutils/tests $(XMLLIBSUBDIRS) \ multiprocessing multiprocessing/dummy \ unittest unittest/test \ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:10:23 2013 From: python-checkins at python.org (ned.deily) Date: Mon, 4 Nov 2013 05:10:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE1Mzky?= =?utf-8?q?=3A_Install_idlelib/idle=5Ftest=2E?= Message-ID: <3dCgXH02VFz7Ljx@mail.python.org> http://hg.python.org/cpython/rev/e52dad892521 changeset: 86905:e52dad892521 branch: 3.3 parent: 86902:f28a2d072767 user: Ned Deily date: Sun Nov 03 20:08:53 2013 -0800 summary: Issue #15392: Install idlelib/idle_test. 
files: Makefile.pre.in | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1048,7 +1048,8 @@ lib2to3 lib2to3/fixes lib2to3/pgen2 lib2to3/tests \ lib2to3/tests/data lib2to3/tests/data/fixers \ lib2to3/tests/data/fixers/myfixes \ - ctypes ctypes/test ctypes/macholib idlelib idlelib/Icons \ + ctypes ctypes/test ctypes/macholib \ + idlelib idlelib/Icons idlelib/idle_test \ distutils distutils/command distutils/tests $(XMLLIBSUBDIRS) \ importlib test/test_importlib test/test_importlib/builtin \ test/test_importlib/extension test/test_importlib/frozen \ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:10:24 2013 From: python-checkins at python.org (ned.deily) Date: Mon, 4 Nov 2013 05:10:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2315392=3A_merge_from_3=2E3?= Message-ID: <3dCgXJ1pT5z7Lkg@mail.python.org> http://hg.python.org/cpython/rev/06239fe781fe changeset: 86906:06239fe781fe parent: 86903:74a118d260a7 parent: 86905:e52dad892521 user: Ned Deily date: Sun Nov 03 20:09:51 2013 -0800 summary: Issue #15392: merge from 3.3 files: Makefile.pre.in | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1097,7 +1097,8 @@ lib2to3 lib2to3/fixes lib2to3/pgen2 lib2to3/tests \ lib2to3/tests/data lib2to3/tests/data/fixers \ lib2to3/tests/data/fixers/myfixes \ - ctypes ctypes/test ctypes/macholib idlelib idlelib/Icons \ + ctypes ctypes/test ctypes/macholib \ + idlelib idlelib/Icons idlelib/idle_test \ distutils distutils/command distutils/tests $(XMLLIBSUBDIRS) \ importlib test/test_importlib test/test_importlib/builtin \ test/test_importlib/extension test/test_importlib/frozen \ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:27:51 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 4 Nov 2013 05:27:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE3ODgz?= =?utf-8?q?=3A_Backport_test=2Etest=5Fsupport=2E=5Fis=5Fgui=5Favailable=28?= =?utf-8?q?=29?= Message-ID: <3dCgwR2k66z7LjN@mail.python.org> http://hg.python.org/cpython/rev/358496e67a89 changeset: 86907:358496e67a89 branch: 2.7 parent: 86904:dac6aea39814 user: Zachary Ware date: Sun Nov 03 22:27:04 2013 -0600 summary: Issue #17883: Backport test.test_support._is_gui_available() This should stop the Windows buildbots from hanging on test_ttk_guionly. files: Lib/test/test_support.py | 34 ++++++++++++++++++++++++++++ Misc/NEWS | 3 ++ 2 files changed, 37 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -271,6 +271,36 @@ # is exited) but there is a .pyo file. unlink(os.path.join(dirname, modname + os.extsep + 'pyo')) +# On some platforms, should not run gui test even if it is allowed +# in `use_resources'. 
+if sys.platform.startswith('win'): + import ctypes + import ctypes.wintypes + def _is_gui_available(): + UOI_FLAGS = 1 + WSF_VISIBLE = 0x0001 + class USEROBJECTFLAGS(ctypes.Structure): + _fields_ = [("fInherit", ctypes.wintypes.BOOL), + ("fReserved", ctypes.wintypes.BOOL), + ("dwFlags", ctypes.wintypes.DWORD)] + dll = ctypes.windll.user32 + h = dll.GetProcessWindowStation() + if not h: + raise ctypes.WinError() + uof = USEROBJECTFLAGS() + needed = ctypes.wintypes.DWORD() + res = dll.GetUserObjectInformationW(h, + UOI_FLAGS, + ctypes.byref(uof), + ctypes.sizeof(uof), + ctypes.byref(needed)) + if not res: + raise ctypes.WinError() + return bool(uof.dwFlags & WSF_VISIBLE) +else: + def _is_gui_available(): + return True + def is_resource_enabled(resource): """Test whether a resource is enabled. Known resources are set by regrtest.py.""" @@ -281,6 +311,8 @@ If the caller's module is __main__ then automatically return True. The possibility of False being returned occurs when regrtest.py is executing.""" + if resource == 'gui' and not _is_gui_available(): + raise unittest.SkipTest("Cannot use the 'gui' resource") # see if the caller's module is __main__ - if so, treat as if # the resource was set if sys._getframe(1).f_globals.get("__name__") == "__main__": @@ -1128,6 +1160,8 @@ return obj def requires_resource(resource): + if resource == 'gui' and not _is_gui_available(): + return unittest.skip("resource 'gui' is not available") if is_resource_enabled(resource): return _id else: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -23,6 +23,9 @@ Tests ----- +- Issue #17883: Backported _is_gui_available() in test.test_support to + avoid hanging Windows buildbots on test_ttk_guionly. + - Issue #18702: All skipped tests now reported as skipped. - Issue #19085: Added basic tests for all tkinter widget options. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:48:04 2013 From: python-checkins at python.org (terry.reedy) Date: Mon, 4 Nov 2013 05:48:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgI1hYWFhY?= =?utf-8?q?=3A_Fix_test=5Fidle_so_that_idlelib_test_cases_are_actually_run?= Message-ID: <3dChMm1bw5z7LjR@mail.python.org> http://hg.python.org/cpython/rev/cced7981ec4d changeset: 86908:cced7981ec4d branch: 2.7 user: Terry Jan Reedy date: Sun Nov 03 23:37:54 2013 -0500 summary: Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run under test.regrtest on 2.7. files: Lib/test/test_idle.py | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_idle.py b/Lib/test/test_idle.py --- a/Lib/test/test_idle.py +++ b/Lib/test/test_idle.py @@ -23,6 +23,10 @@ # load_tests() if it finds it. (Unittest.main does the same.) load_tests = idletest.load_tests +# pre-3.3 regrtest does not support the load_tests protocol. use test_main +def test_main(): + support.run_unittest(unittest.TestLoader().loadTestsFromModule(idletest)) + if __name__ == '__main__': # Until unittest supports resources, we emulate regrtest's -ugui # so loaded tests run the same as if textually present here. 
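With the two 2.7 changes above in place, a GUI-dependent test module can let test_support turn a missing or invisible window station into an ordinary skip instead of a hang. A sketch of the intended usage; the WidgetTest class is made up for illustration:

    import unittest
    from test import test_support

    class WidgetTest(unittest.TestCase):      # hypothetical test case
        # Skipped when there is no visible window station (per the backported
        # _is_gui_available()) or when the 'gui' resource is not enabled.
        @test_support.requires_resource('gui')
        def test_needs_display(self):
            import Tkinter
            root = Tkinter.Tk()
            root.destroy()

    if __name__ == '__main__':
        test_support.run_unittest(WidgetTest)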
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 05:52:47 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 4 Nov 2013 05:52:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE3ODgz?= =?utf-8?q?=3A_Tweak_test=5Ftcl_testLoadWithUNC_to_skip_the_test_in_the?= Message-ID: <3dChTC6Ycyz7LjS@mail.python.org> http://hg.python.org/cpython/rev/72c3ca3ed22a changeset: 86909:72c3ca3ed22a branch: 2.7 user: Zachary Ware date: Sun Nov 03 22:51:25 2013 -0600 summary: Issue #17883: Tweak test_tcl testLoadWithUNC to skip the test in the event of a permission error on Windows and to properly report other skip conditions. files: Lib/test/test_tcl.py | 16 ++++++++++------ Misc/NEWS | 4 ++++ 2 files changed, 14 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_tcl.py b/Lib/test/test_tcl.py --- a/Lib/test/test_tcl.py +++ b/Lib/test/test_tcl.py @@ -138,18 +138,15 @@ tcl = self.interp self.assertRaises(TclError,tcl.eval,'package require DNE') + @unittest.skipUnless(sys.platform == 'win32', "only applies to Windows") def testLoadWithUNC(self): - import sys - if sys.platform != 'win32': - return - # Build a UNC path from the regular path. # Something like # \\%COMPUTERNAME%\c$\python27\python.exe fullname = os.path.abspath(sys.executable) if fullname[1] != ':': - return + self.skipTest('unusable path: %r' % fullname) unc_name = r'\\%s\%s$\%s' % (os.environ['COMPUTERNAME'], fullname[0], fullname[3:]) @@ -158,7 +155,14 @@ env.unset("TCL_LIBRARY") cmd = '%s -c "import Tkinter; print Tkinter"' % (unc_name,) - p = Popen(cmd, stdout=PIPE, stderr=PIPE) + try: + p = Popen(cmd, stdout=PIPE, stderr=PIPE) + except WindowsError as e: + if e.winerror == 5: + self.skipTest('Not permitted to start the child process') + else: + raise + out_data, err_data = p.communicate() msg = '\n\n'.join(['"Tkinter.py" not in output', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -23,6 +23,10 @@ Tests ----- +- Issue #17883: Tweak test_tcl testLoadWithUNC to skip the test in the + event of a permission error on Windows and to properly report other + skip conditions. + - Issue #17883: Backported _is_gui_available() in test.test_support to avoid hanging Windows buildbots on test_ttk_guionly. -- Repository URL: http://hg.python.org/cpython From tjreedy at udel.edu Mon Nov 4 05:54:23 2013 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 03 Nov 2013 23:54:23 -0500 Subject: [Python-checkins] cpython (2.7): Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run In-Reply-To: <3dChMm1bw5z7LjR@mail.python.org> References: <3dChMm1bw5z7LjR@mail.python.org> Message-ID: <5277287F.8060607@udel.edu> On 11/3/2013 11:48 PM, terry.reedy wrote: > http://hg.python.org/cpython/rev/cced7981ec4d > changeset: 86908:cced7981ec4d > branch: 2.7 > user: Terry Jan Reedy > date: Sun Nov 03 23:37:54 2013 -0500 > summary: > Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run > under test.regrtest on 2.7. This message is the one included with the patch by Ned Daily. Because a message *was* included (not normal), hg import committed the patch immediately, without giving me a chance to edit the patch or message. As far as I know, there is no way I could have edited the message after the commit. If there was, let me know. 
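For reference, the administrative-share (UNC) spelling that testLoadWithUNC constructs in the test_tcl change above can be reproduced on its own; this sketch is Windows-only, since it relies on COMPUTERNAME and the drive$ shares:

    import os
    import sys

    fullname = os.path.abspath(sys.executable)    # e.g. C:\Python27\python.exe
    if fullname[1] != ':':
        raise SystemExit('unusable path: %r' % fullname)
    unc_name = r'\\%s\%s$\%s' % (os.environ['COMPUTERNAME'],
                                 fullname[0], fullname[3:])
    print(unc_name)                               # e.g. \\MYHOST\C$\Python27\python.exe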
From python-checkins at python.org Mon Nov 4 07:26:56 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:26:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_update_for_3=2E3=2E3?= Message-ID: <3dCkYr1kxHz7LjS@mail.python.org> http://hg.python.org/peps/rev/f6470cb739db changeset: 5248:f6470cb739db user: Georg Brandl date: Mon Nov 04 07:28:07 2013 +0100 summary: update for 3.3.3 files: pep-0398.txt | 7 +++++++ 1 files changed, 7 insertions(+), 0 deletions(-) diff --git a/pep-0398.txt b/pep-0398.txt --- a/pep-0398.txt +++ b/pep-0398.txt @@ -78,6 +78,13 @@ - 3.3.2 final: May 13, 2013 +3.3.3 schedule +-------------- + +- 3.3.3 candidate 1: October 27, 2013 +- 3.3.3 candidate 2: November 9, 2013 +- 3.3.3 final: November 16, 2013 + Features for 3.3 ================ -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 4 07:29:42 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:29:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Note_that_examples_are_sor?= =?utf-8?q?ted_only_for_convenience=2E?= Message-ID: <3dCkd23hxTz7LjS@mail.python.org> http://hg.python.org/cpython/rev/ed8ad769fb1f changeset: 86910:ed8ad769fb1f parent: 86906:06239fe781fe user: Georg Brandl date: Mon Nov 04 07:30:50 2013 +0100 summary: Note that examples are sorted only for convenience. files: Doc/library/statistics.rst | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Doc/library/statistics.rst b/Doc/library/statistics.rst --- a/Doc/library/statistics.rst +++ b/Doc/library/statistics.rst @@ -52,6 +52,9 @@ Function details ---------------- +Note: The functions do not require the data given to them to be sorted. +However, for reading convenience, most of the examples show sorted sequences. + .. function:: mean(data) Return the sample arithmetic mean of *data*, a sequence or iterator of -- Repository URL: http://hg.python.org/cpython From brian at python.org Mon Nov 4 07:30:56 2013 From: brian at python.org (Brian Curtin) Date: Sun, 3 Nov 2013 22:30:56 -0800 Subject: [Python-checkins] cpython (2.7): Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run In-Reply-To: <5277287F.8060607@udel.edu> References: <3dChMm1bw5z7LjR@mail.python.org> <5277287F.8060607@udel.edu> Message-ID: On Sun, Nov 3, 2013 at 8:54 PM, Terry Reedy wrote: > On 11/3/2013 11:48 PM, terry.reedy wrote: >> >> http://hg.python.org/cpython/rev/cced7981ec4d >> changeset: 86908:cced7981ec4d >> branch: 2.7 >> user: Terry Jan Reedy >> date: Sun Nov 03 23:37:54 2013 -0500 >> summary: >> Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run >> under test.regrtest on 2.7. > > > This message is the one included with the patch by Ned Daily. Because a > message *was* included (not normal), hg import committed the patch > immediately, without giving me a chance to edit the patch or message. As far > as I know, there is no way I could have edited the message after the commit. > If there was, let me know. Besides what Zach mentions, most of the time you probably want to "hg import --no-commit ", run it, test it, then commit it with whatever message you want. 
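The statistics.rst note above ("the functions do not require the data given to them to be sorted") is easy to demonstrate; the statistics module is new in the default (3.4) branch, and the numbers here are arbitrary sample data:

    import statistics

    data = [2.75, 1.75, 1.25, 0.25, 0.5, 1.25, 3.5]   # deliberately unsorted
    assert statistics.mean(data) == statistics.mean(sorted(data))
    print(statistics.mean(data))      # about 1.6071
    print(statistics.median(data))    # 1.25, computed from a sorted copy internally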
From solipsis at pitrou.net Mon Nov 4 07:33:44 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 04 Nov 2013 07:33:44 +0100 Subject: [Python-checkins] Daily reference leaks (1b0ca1a7a3ca): sum=0 Message-ID: results for 1b0ca1a7a3ca on branch "default" -------------------------------------------- test_site leaked [2, 0, -2] references, sum=0 test_site leaked [2, 0, -2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogbBtchS', '-x'] From python-checkins at python.org Mon Nov 4 07:43:56 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:43:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_Backout_d80207?= =?utf-8?q?d15294=2E?= Message-ID: <3dCkxS4VV3zRJ8@mail.python.org> http://hg.python.org/cpython/rev/f27394782ed7 changeset: 86911:f27394782ed7 branch: 3.2 parent: 86861:7d399099334d user: Georg Brandl date: Mon Nov 04 07:43:32 2013 +0100 summary: Backout d80207d15294. files: Lib/distutils/tests/test_build_py.py | 31 ---------------- 1 files changed, 0 insertions(+), 31 deletions(-) diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -121,37 +121,6 @@ found = os.listdir(os.path.join(cmd.build_lib, '__pycache__')) self.assertEqual(sorted(found), ['boiledeggs.%s.pyo' % imp.get_tag()]) - def test_dir_in_package_data(self): - """ - A directory in package_data should not be added to the filelist. - """ - # See bug 19286 - sources = self.mkdtemp() - pkg_dir = os.path.join(sources, "pkg") - - os.mkdir(pkg_dir) - open(os.path.join(pkg_dir, "__init__.py"), "w").close() - - docdir = os.path.join(pkg_dir, "doc") - os.mkdir(docdir) - open(os.path.join(docdir, "testfile"), "w").close() - - # create the directory that could be incorrectly detected as a file - os.mkdir(os.path.join(docdir, 'otherdir')) - - os.chdir(sources) - dist = Distribution({"packages": ["pkg"], - "package_data": {"pkg": ["doc/*"]}}) - # script_name need not exist, it just need to be initialized - dist.script_name = os.path.join(sources, "setup.py") - dist.script_args = ["build"] - dist.parse_command_line() - - try: - dist.run_commands() - except DistutilsFileError: - self.fail("failed package_data when data dir includes a dir") - def test_dont_write_bytecode(self): # makes sure byte_compile is not used dist = self.create_dist()[1] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 07:43:57 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:43:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_Backout_265d36?= =?utf-8?q?9ad3b9=2E?= Message-ID: <3dCkxT6FsKz7Ljc@mail.python.org> http://hg.python.org/cpython/rev/6432d6ce3d6b changeset: 86912:6432d6ce3d6b branch: 3.2 user: Georg Brandl date: Mon Nov 04 07:43:41 2013 +0100 summary: Backout 265d369ad3b9. 
files: Lib/distutils/command/build_py.py | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Lib/distutils/command/build_py.py b/Lib/distutils/command/build_py.py --- a/Lib/distutils/command/build_py.py +++ b/Lib/distutils/command/build_py.py @@ -127,8 +127,7 @@ # Each pattern has to be converted to a platform-specific path filelist = glob(os.path.join(src_dir, convert_path(pattern))) # Files that match more than one pattern are only added once - files.extend([fn for fn in filelist if fn not in files - and os.path.isfile(fn)]) + files.extend([fn for fn in filelist if fn not in files]) return files def build_package_data(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 07:43:59 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:43:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E2=29=3A_Backout_7d3990?= =?utf-8?q?99334d=2E?= Message-ID: <3dCkxW0wGHz7Ljw@mail.python.org> http://hg.python.org/cpython/rev/8c9769b17171 changeset: 86913:8c9769b17171 branch: 3.2 user: Georg Brandl date: Mon Nov 04 07:44:29 2013 +0100 summary: Backout 7d399099334d. files: Misc/NEWS | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,9 +10,6 @@ Library ------- -- Issue #19286: Directories in ``package_data`` are no longer added to - the filelist, preventing failure outlined in the ticket. - - Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. - Issue #14984: On POSIX systems, when netrc is called without a filename -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 07:45:19 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:45:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4yIC0+IDMuMyk6?= =?utf-8?q?_null-merge_3=2E2_backouts?= Message-ID: <3dCkz34J0Yz7LjS@mail.python.org> http://hg.python.org/cpython/rev/00ad499704c3 changeset: 86914:00ad499704c3 branch: 3.3 parent: 86905:e52dad892521 parent: 86913:8c9769b17171 user: Georg Brandl date: Mon Nov 04 07:46:02 2013 +0100 summary: null-merge 3.2 backouts files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 07:45:20 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 4 Nov 2013 07:45:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dCkz466kBz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/627e024fc17c changeset: 86915:627e024fc17c parent: 86910:ed8ad769fb1f parent: 86914:00ad499704c3 user: Georg Brandl date: Mon Nov 04 07:46:23 2013 +0100 summary: merge with 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 08:51:55 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 4 Nov 2013 08:51:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_cleanup?= Message-ID: <3dCmRv3tlnz7LjR@mail.python.org> http://hg.python.org/peps/rev/a5d89a8b66fe changeset: 5249:a5d89a8b66fe user: Victor Stinner date: Mon Nov 04 08:50:35 2013 +0100 summary: PEP 454: cleanup files: pep-0454.txt | 50 ++++++++++++++++++++-------------------- 1 files changed, 25 insertions(+), 25 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -134,17 +134,6 @@ See also ``start()`` and ``stop()`` functions. 
-``start()`` function: - - Start tracing Python memory allocations. - - The function installs hooks on Python memory allocators. These hooks - have important overhead in term of performances and memory usage: - see `Filter functions`_ to limit the overhead. - - See also ``stop()`` and ``is_tracing()`` functions. - - ``stop()`` function: Stop tracing Python memory allocations and clear traces of memory @@ -159,13 +148,24 @@ See also ``start()`` and ``is_tracing()`` functions. +``start()`` function: + + Start tracing Python memory allocations. + + The function installs hooks on Python memory allocators. These hooks + have important overhead in term of performances and memory usage: + see `Filter functions`_ to limit the overhead. + + See also ``stop()`` and ``is_tracing()`` functions. + + ``take_snapshot()`` function: Take a snapshot of traces of memory blocks allocated by Python using the ``get_traces()`` function. Return a new ``Snapshot`` instance. - The ``tracemalloc`` module must be tracing memory allocations to take a - snapshot, see the the ``start()`` function. + The ``tracemalloc`` module must be tracing memory allocations to + take a snapshot, see the the ``start()`` function. See also ``get_traces()`` and ``get_object_traceback()`` functions. @@ -175,7 +175,7 @@ When Python allocates a memory block, ``tracemalloc`` attachs a "trace" to the memory block to store its size in bytes and the traceback where the -allocation occured. +allocation occurred. The following functions give access to these traces. A trace is a ``(size: int, traceback)`` tuple. *size* is the size of the memory block in bytes. @@ -208,8 +208,8 @@ Get the maximum number of frames stored in the traceback of a trace. - By default, a trace of an allocated memory block only stores the - most recent frame: the limit is ``1``. + By default, a trace of a memory block only stores the most recent + frame: the limit is ``1``. Use the ``set_traceback_limit()`` function to change the limit. @@ -220,19 +220,19 @@ ``(size: int, traceback: tuple)`` tuples. *traceback* is a tuple of ``(filename: str, lineno: int)`` tuples. - The list of traces do not include memory blocks allocated before the - ``tracemalloc`` module started to trace memory allocations nor memory - blocks ignored by filters (see ``get_filters()``). + The list of traces does not include memory blocks allocated before + the ``tracemalloc`` module started to trace memory allocations nor + memory blocks ignored by filters (see ``get_filters()``). - The list is not sorted. Take a snapshot using ``take_snapshot()`` - and use the ``Snapshot.statistics()`` method to get a sorted list of - statistics. + The list has an undefined order. Take a snapshot using + ``take_snapshot()`` and use the ``Snapshot.statistics()`` method to + get a sorted list of statistics. Tracebacks of traces are limited to ``traceback_limit`` frames. Use ``set_traceback_limit()`` to store more frames. - Return an empty list if the ``tracemalloc`` module is not tracing memory - allocations. + Return an empty list if the ``tracemalloc`` module is not tracing + memory allocations. See also ``take_snapshot()`` and ``get_object_traceback()`` functions. @@ -266,7 +266,7 @@ To limit the overhead, some files can be excluded or tracing can be restricted to a set of files using filters. 
Examples: ``add_filter(Filter(True, subprocess.__file__))`` only traces memory allocations in the ``subprocess`` -module, and ``add_filter(Filter(False, tracemalloc.__file__))`` do not trace +module, and ``add_filter(Filter(False, tracemalloc.__file__))`` ignores memory allocations in the ``tracemalloc`` module By default, there is one exclusive filter to ignore Python memory blocks -- Repository URL: http://hg.python.org/peps From tjreedy at udel.edu Mon Nov 4 09:18:05 2013 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 04 Nov 2013 03:18:05 -0500 Subject: [Python-checkins] cpython (2.7): Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run In-Reply-To: References: <3dChMm1bw5z7LjR@mail.python.org> <5277287F.8060607@udel.edu> Message-ID: <5277583D.7090203@udel.edu> On 11/4/2013 1:30 AM, Brian Curtin wrote: > On Sun, Nov 3, 2013 at 8:54 PM, Terry Reedy wrote: >> On 11/3/2013 11:48 PM, terry.reedy wrote: >>> >>> http://hg.python.org/cpython/rev/cced7981ec4d >>> changeset: 86908:cced7981ec4d >>> branch: 2.7 >>> user: Terry Jan Reedy >>> date: Sun Nov 03 23:37:54 2013 -0500 >>> summary: >>> Issue #XXXXX: Fix test_idle so that idlelib test cases are actually run >>> under test.regrtest on 2.7. >> >> >> This message is the one included with the patch by Ned Daily. Because a >> message *was* included (not normal), hg import committed the patch >> immediately, without giving me a chance to edit the patch or message. As far >> as I know, there is no way I could have edited the message after the commit. >> If there was, let me know. > > Besides what Zach mentions, I have only seen commit messages here. > most of the time you probably want to "hg > import --no-commit ", run it, test it, then commit it with > whatever message you want. That is what I normally do, in effect, along with pulling just before committing. To be more precise. I use TortoiseHg Workbench -Repository/Import which normally brings up an editor window for a commit message after doing the patch. Leaving the window blank, as I always do, has the effect of --no-commit. A 'changeset patch' containing a commit message bypasses the option of either editing the message or making it blank. There does not seem to be a 'setting' to change this. The issue is http://bugs.python.org/issue19488 From python-checkins at python.org Mon Nov 4 11:08:47 2013 From: python-checkins at python.org (nick.coghlan) Date: Mon, 4 Nov 2013 11:08:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogQ2xvc2UgIzE3ODI3?= =?utf-8?q?=3A_Document_codecs=2Eencode_=26_codecs=2Edecode?= Message-ID: <3dCqTq1QnYz7Ljy@mail.python.org> http://hg.python.org/cpython/rev/bdb30bdf60a5 changeset: 86916:bdb30bdf60a5 branch: 2.7 parent: 86909:72c3ca3ed22a user: Nick Coghlan date: Mon Nov 04 20:05:16 2013 +1000 summary: Close #17827: Document codecs.encode & codecs.decode files: Doc/library/codecs.rst | 23 +++++++++++++++++++++++ Misc/NEWS | 4 ++++ 2 files changed, 27 insertions(+), 0 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -23,6 +23,29 @@ It defines the following functions: +.. function:: encode(obj, encoding='ascii', errors='strict') + + Encodes *obj* using the codec registered for *encoding*. + + *Errors* may be given to set the desired error handling scheme. The + default error handler is ``strict`` meaning that encoding errors raise + :exc:`ValueError` (or a more codec specific subclass, such as + :exc:`UnicodeEncodeError`). 
Refer to :ref:`codec-base-classes` for more + information on codec error handling. + + .. versionadded:: 2.4 + +.. function:: decode(obj, encoding='ascii', errors='strict') + + Decodes *obj* using the codec registered for *encoding*. + + *Errors* may be given to set the desired error handling scheme. The + default error handler is ``strict`` meaning that decoding errors raise + :exc:`ValueError` (or a more codec specific subclass, such as + :exc:`UnicodeDecodeError`). Refer to :ref:`codec-base-classes` for more + information on codec error handling. + + .. versionadded:: 2.4 .. function:: register(search_function) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,9 +12,13 @@ Library ------- +- Issue #17827: Add the missing documentation for ``codecs.encode`` and + ``codecs.decode``. + - Issue #6157: Fixed Tkinter.Text.debug(). Original patch by Guilherme Polo. - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + integers instead of a string. Based on patch by Guilherme Polo. - Issue #19286: Directories in ``package_data`` are no longer added to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 11:11:12 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 4 Nov 2013 11:11:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316286=3A_write_a_?= =?utf-8?q?new_subfunction_bytes=5Fcompare=5Feq=28=29?= Message-ID: <3dCqXc0KZsz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/536a7c09c7fd changeset: 86917:536a7c09c7fd parent: 86915:627e024fc17c user: Victor Stinner date: Mon Nov 04 11:08:10 2013 +0100 summary: Issue #16286: write a new subfunction bytes_compare_eq() * cleanup bytes_richcompare() * PyUnicode_RichCompare(): replace a test with a XOR files: Objects/bytesobject.c | 84 ++++++++++++++++------------ Objects/unicodeobject.c | 8 +- 2 files changed, 50 insertions(+), 42 deletions(-) diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -802,6 +802,23 @@ return PyLong_FromLong((unsigned char)a->ob_sval[i]); } +Py_LOCAL(int) +bytes_compare_eq(PyBytesObject *a, PyBytesObject *b) +{ + int cmp; + Py_ssize_t len; + + len = Py_SIZE(a); + if (Py_SIZE(b) != len) + return 0; + + if (a->ob_sval[0] != b->ob_sval[0]) + return 0; + + cmp = memcmp(a->ob_sval, b->ob_sval, len); + return (cmp == 0); +} + static PyObject* bytes_richcompare(PyBytesObject *a, PyBytesObject *b, int op) { @@ -822,53 +839,46 @@ return NULL; } result = Py_NotImplemented; - goto out; } - if (a == b) { + else if (a == b) { switch (op) { case Py_EQ:case Py_LE:case Py_GE: result = Py_True; - goto out; + break; case Py_NE:case Py_LT:case Py_GT: result = Py_False; - goto out; + break; } } - if (op == Py_EQ) { - /* Supporting Py_NE here as well does not save - much time, since Py_NE is rarely used. */ - if (Py_SIZE(a) == Py_SIZE(b) - && (a->ob_sval[0] == b->ob_sval[0] - && memcmp(a->ob_sval, b->ob_sval, Py_SIZE(a)) == 0)) { - result = Py_True; - } else { - result = Py_False; + else if (op == Py_EQ || op == Py_NE) { + int eq = bytes_compare_eq(a, b); + eq ^= (op == Py_NE); + result = eq ? Py_True : Py_False; + } + else { + len_a = Py_SIZE(a); len_b = Py_SIZE(b); + min_len = (len_a < len_b) ? len_a : len_b; + if (min_len > 0) { + c = Py_CHARMASK(*a->ob_sval) - Py_CHARMASK(*b->ob_sval); + if (c==0) + c = memcmp(a->ob_sval, b->ob_sval, min_len); } - goto out; + else + c = 0; + if (c == 0) + c = (len_a < len_b) ? -1 : (len_a > len_b) ? 
1 : 0; + switch (op) { + case Py_LT: c = c < 0; break; + case Py_LE: c = c <= 0; break; + case Py_GT: c = c > 0; break; + case Py_GE: c = c >= 0; break; + default: + assert(op != Py_EQ && op != Py_NE); + Py_RETURN_NOTIMPLEMENTED; + } + result = c ? Py_True : Py_False; } - len_a = Py_SIZE(a); len_b = Py_SIZE(b); - min_len = (len_a < len_b) ? len_a : len_b; - if (min_len > 0) { - c = Py_CHARMASK(*a->ob_sval) - Py_CHARMASK(*b->ob_sval); - if (c==0) - c = memcmp(a->ob_sval, b->ob_sval, min_len); - } else - c = 0; - if (c == 0) - c = (len_a < len_b) ? -1 : (len_a > len_b) ? 1 : 0; - switch (op) { - case Py_LT: c = c < 0; break; - case Py_LE: c = c <= 0; break; - case Py_EQ: assert(0); break; /* unreachable */ - case Py_NE: c = c != 0; break; - case Py_GT: c = c > 0; break; - case Py_GE: c = c >= 0; break; - default: - result = Py_NotImplemented; - goto out; - } - result = c ? Py_True : Py_False; - out: + Py_INCREF(result); return result; } diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10526,7 +10526,7 @@ #undef COMPARE } -static int +Py_LOCAL(int) unicode_compare_eq(PyObject *str1, PyObject *str2) { int kind; @@ -10630,10 +10630,8 @@ if (op == Py_EQ || op == Py_NE) { result = unicode_compare_eq(left, right); - if (op == Py_EQ) - v = TEST_COND(result); - else - v = TEST_COND(!result); + result ^= (op == Py_NE); + v = TEST_COND(result); } else { result = unicode_compare(left, right); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 11:24:53 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 4 Nov 2013 11:24:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316286=3A_optimize?= =?utf-8?q?_PyUnicode=5FRichCompare=28=29_for_identical_strings_=28same?= Message-ID: <3dCqrP3SCSz7LkF@mail.python.org> http://hg.python.org/cpython/rev/5fa291435740 changeset: 86918:5fa291435740 user: Victor Stinner date: Mon Nov 04 11:23:05 2013 +0100 summary: Issue #16286: optimize PyUnicode_RichCompare() for identical strings (same pointer) for any operator, not only Py_EQ and Py_NE. Code of bytes_richcompare() and PyUnicode_RichCompare() is now closer. files: Objects/bytesobject.c | 23 ++++++++++++++++------- Objects/unicodeobject.c | 24 +++++++++++++++++++----- 2 files changed, 35 insertions(+), 12 deletions(-) diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -842,12 +842,20 @@ } else if (a == b) { switch (op) { - case Py_EQ:case Py_LE:case Py_GE: + case Py_EQ: + case Py_LE: + case Py_GE: + /* a string is equal to itself */ result = Py_True; break; - case Py_NE:case Py_LT:case Py_GT: + case Py_NE: + case Py_LT: + case Py_GT: result = Py_False; break; + default: + PyErr_BadArgument(); + return NULL; } } else if (op == Py_EQ || op == Py_NE) { @@ -856,11 +864,12 @@ result = eq ? Py_True : Py_False; } else { - len_a = Py_SIZE(a); len_b = Py_SIZE(b); - min_len = (len_a < len_b) ? len_a : len_b; + len_a = Py_SIZE(a); + len_b = Py_SIZE(b); + min_len = Py_MIN(len_a, len_b); if (min_len > 0) { c = Py_CHARMASK(*a->ob_sval) - Py_CHARMASK(*b->ob_sval); - if (c==0) + if (c == 0) c = memcmp(a->ob_sval, b->ob_sval, min_len); } else @@ -873,8 +882,8 @@ case Py_GT: c = c > 0; break; case Py_GE: c = c >= 0; break; default: - assert(op != Py_EQ && op != Py_NE); - Py_RETURN_NOTIMPLEMENTED; + PyErr_BadArgument(); + return NULL; } result = c ? 
Py_True : Py_False; } diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10534,10 +10534,6 @@ Py_ssize_t len; int cmp; - /* a string is equal to itself */ - if (str1 == str2) - return 1; - len = PyUnicode_GET_LENGTH(str1); if (PyUnicode_GET_LENGTH(str2) != len) return 0; @@ -10628,7 +10624,25 @@ PyUnicode_READY(right) == -1) return NULL; - if (op == Py_EQ || op == Py_NE) { + if (left == right) { + switch (op) { + case Py_EQ: + case Py_LE: + case Py_GE: + /* a string is equal to itself */ + v = Py_True; + break; + case Py_NE: + case Py_LT: + case Py_GT: + v = Py_False; + break; + default: + PyErr_BadArgument(); + return NULL; + } + } + else if (op == Py_EQ || op == Py_NE) { result = unicode_compare_eq(left, right); result ^= (op == Py_NE); v = TEST_COND(result); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 11:29:11 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 4 Nov 2013 11:29:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316286=3A_remove_d?= =?utf-8?q?uplicated_identity_check_from_unicode=5Fcompare=28=29?= Message-ID: <3dCqxM2XJfz7LjW@mail.python.org> http://hg.python.org/cpython/rev/da9c6e4ef301 changeset: 86919:da9c6e4ef301 user: Victor Stinner date: Mon Nov 04 11:27:14 2013 +0100 summary: Issue #16286: remove duplicated identity check from unicode_compare() Move the test to PyUnicode_Compare() files: Objects/unicodeobject.c | 9 +++++---- 1 files changed, 5 insertions(+), 4 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10428,10 +10428,6 @@ void *data1, *data2; Py_ssize_t len1, len2, len; - /* a string is equal to itself */ - if (str1 == str2) - return 0; - kind1 = PyUnicode_KIND(str1); kind2 = PyUnicode_KIND(str2); data1 = PyUnicode_DATA(str1); @@ -10555,6 +10551,11 @@ if (PyUnicode_READY(left) == -1 || PyUnicode_READY(right) == -1) return -1; + + /* a string is equal to itself */ + if (left == right) + return 0; + return unicode_compare(left, right); } PyErr_Format(PyExc_TypeError, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 11:29:12 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 4 Nov 2013 11:29:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319424=3A_PyUnicod?= =?utf-8?q?e=5FCompareWithASCIIString=28=29_normalizes_memcmp=28=29_result?= Message-ID: <3dCqxN4P6Qz7LjP@mail.python.org> http://hg.python.org/cpython/rev/494f736f5945 changeset: 86920:494f736f5945 user: Victor Stinner date: Mon Nov 04 11:28:26 2013 +0100 summary: Issue #19424: PyUnicode_CompareWithASCIIString() normalizes memcmp() result to -1, 0, 1 files: Objects/unicodeobject.c | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10584,8 +10584,12 @@ len = Py_MIN(len1, len2); cmp = memcmp(data, str, len); - if (cmp != 0) - return cmp; + if (cmp != 0) { + if (cmp < 0) + return -1; + else + return 1; + } if (len1 > len2) return 1; /* uni is longer */ if (len2 > len1) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 14:32:35 2013 From: python-checkins at python.org (nick.coghlan) Date: Mon, 4 Nov 2013 14:32:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_functools=2Epartialmet?= 
=?utf-8?q?hod_docs_and_=5F=5Fall=5F=5F?= Message-ID: <3dCw0z6Ksdz7Lkr@mail.python.org> http://hg.python.org/cpython/rev/ac1685661b07 changeset: 86921:ac1685661b07 user: Nick Coghlan date: Mon Nov 04 23:32:16 2013 +1000 summary: Fix functools.partialmethod docs and __all__ files: Doc/library/functools.rst | 4 ++-- Lib/functools.py | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/functools.rst b/Doc/library/functools.rst --- a/Doc/library/functools.rst +++ b/Doc/library/functools.rst @@ -223,8 +223,8 @@ ... return self._alive ... def set_state(self, state): ... self._alive = bool(state) - ... set_alive = partialmethod(set_alive, True) - ... set_dead = partialmethod(set_alive, False) + ... set_alive = partialmethod(set_state, True) + ... set_dead = partialmethod(set_state, False) ... >>> c = Cell() >>> c.alive diff --git a/Lib/functools.py b/Lib/functools.py --- a/Lib/functools.py +++ b/Lib/functools.py @@ -11,7 +11,7 @@ __all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES', 'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial', - 'singledispatch'] + 'partialmethod', 'singledispatch'] try: from _functools import reduce -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 15:24:19 2013 From: python-checkins at python.org (nick.coghlan) Date: Mon, 4 Nov 2013 15:24:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Remove_merge_a?= =?utf-8?q?rtifact_from_Misc/NEWS?= Message-ID: <3dCx8g3cblz7LjY@mail.python.org> http://hg.python.org/cpython/rev/7268838063e1 changeset: 86922:7268838063e1 branch: 2.7 parent: 86916:bdb30bdf60a5 user: Nick Coghlan date: Tue Nov 05 00:24:05 2013 +1000 summary: Remove merge artifact from Misc/NEWS files: Misc/NEWS | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,7 +18,6 @@ - Issue #6157: Fixed Tkinter.Text.debug(). Original patch by Guilherme Polo. - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of - integers instead of a string. Based on patch by Guilherme Polo. 
- Issue #19286: Directories in ``package_data`` are no longer added to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 16:19:27 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 4 Nov 2013 16:19:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_relative=28=29_becomes_relati?= =?utf-8?b?dmVfdG8oKQ==?= Message-ID: <3dCyNH5Jh1z7LjQ@mail.python.org> http://hg.python.org/peps/rev/857f13a5c8c1 changeset: 5250:857f13a5c8c1 user: Antoine Pitrou date: Mon Nov 04 16:19:23 2013 +0100 summary: relative() becomes relative_to() files: pep-0428.txt | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -445,18 +445,18 @@ Making the path relative ^^^^^^^^^^^^^^^^^^^^^^^^ -The ``relative()`` method computes the relative difference of a path to +The ``relative_to()`` method computes the relative difference of a path to another:: - >>> PurePosixPath('/usr/bin/python').relative('/usr') + >>> PurePosixPath('/usr/bin/python').relative_to('/usr') PurePosixPath('bin/python') ValueError is raised if the method cannot return a meaningful value:: - >>> PurePosixPath('/usr/bin/python').relative('/etc') + >>> PurePosixPath('/usr/bin/python').relative_to('/etc') Traceback (most recent call last): File "", line 1, in - File "pathlib.py", line 926, in relative + File "pathlib.py", line 926, in relative_to .format(str(self), str(formatted))) ValueError: '/usr/bin/python' does not start with '/etc' -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 4 16:20:16 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 4 Nov 2013 16:20:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Remove_the_st=5F*_shortcuts?= Message-ID: <3dCyPD23Smz7Llw@mail.python.org> http://hg.python.org/peps/rev/7ab9323febf9 changeset: 5251:7ab9323febf9 user: Antoine Pitrou date: Mon Nov 04 16:20:12 2013 +0100 summary: Remove the st_* shortcuts files: pep-0428.txt | 10 ---------- 1 files changed, 0 insertions(+), 10 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -221,8 +221,6 @@ PosixPath('pathlib/setup.py') >>> p.exists() True - >>> p.st_size - 928 Pure paths API @@ -560,14 +558,6 @@ >>> p.stat() posix.stat_result(st_mode=33277, st_ino=7483155, st_dev=2053, st_nlink=1, st_uid=500, st_gid=500, st_size=928, st_atime=1343597970, st_mtime=1328287308, st_ctime=1343597964) -For ease of use, direct attribute access to the fields of the stat structure -is provided over the path object itself:: - - >>> p.st_size - 928 - >>> p.st_mtime - 1328287308.889562 - Higher-level methods help examine the kind of the file:: >>> p.exists() -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 4 16:22:07 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 4 Nov 2013 16:22:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Remove_stat_caching?= Message-ID: <3dCyRM6CDfz7LlN@mail.python.org> http://hg.python.org/peps/rev/101c459e5570 changeset: 5252:101c459e5570 user: Antoine Pitrou date: Mon Nov 04 16:22:03 2013 +0100 summary: Remove stat caching files: pep-0428.txt | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -551,9 +551,9 @@ File metadata ------------- -The ``stat()`` method caches and returns the file's stat() result; -``restat()`` forces refreshing of 
the cache. ``lstat()`` is also provided, -but doesn't have any caching behaviour:: +The ``stat()`` returns the file's stat() result; similarly, ``lstat()`` +returns the file's lstat() result (which is different iff the file is a +symbolic link):: >>> p.stat() posix.stat_result(st_mode=33277, st_ino=7483155, st_dev=2053, st_nlink=1, st_uid=500, st_gid=500, st_size=928, st_atime=1343597970, st_mtime=1328287308, st_ctime=1343597964) -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 4 21:12:52 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 21:12:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fix_Tkinter_tests_on_Tk_8=2E5_with_patchlevel_=3C_8=2E5=2E1?= =?utf-8?q?2=2E?= Message-ID: <3dD4tr5fzlz7LjP@mail.python.org> http://hg.python.org/cpython/rev/c3fa22d04fb2 changeset: 86923:c3fa22d04fb2 branch: 2.7 user: Serhiy Storchaka date: Mon Nov 04 22:10:35 2013 +0200 summary: Issue #19085: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.12. files: Lib/lib-tk/test/widget_tests.py | 14 +++++++++++++- 1 files changed, 13 insertions(+), 1 deletions(-) diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -11,10 +11,22 @@ def int_round(x): return int(round(x)) +pixels_round = int_round +if tcl_version[:2] == (8, 5): + # Issue #19085: Workaround a bug in Tk + # http://core.tcl.tk/tk/info/3497848 + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + pixels_round = int + del root + + _sentinel = object() class AbstractWidgetTest(object): - _conv_pixels = staticmethod(int_round) + _conv_pixels = staticmethod(pixels_round) _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 21:12:54 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 21:12:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fix_Tkinter_tests_on_Tk_8=2E5_with_patchlevel_=3C_8=2E5=2E1?= =?utf-8?q?2=2E?= Message-ID: <3dD4tt15Cjz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/583347b79aa0 changeset: 86924:583347b79aa0 branch: 3.3 parent: 86914:00ad499704c3 user: Serhiy Storchaka date: Mon Nov 04 22:11:12 2013 +0200 summary: Issue #19085: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.12. 
files: Lib/tkinter/test/widget_tests.py | 14 +++++++++++++- 1 files changed, 13 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -8,10 +8,22 @@ noconv = str if tcl_version < (8, 5) else False +pixels_round = round +if tcl_version[:2] == (8, 5): + # Issue #19085: Workaround a bug in Tk + # http://core.tcl.tk/tk/info/3497848 + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + pixels_round = int + del root + + _sentinel = object() class AbstractWidgetTest: - _conv_pixels = round + _conv_pixels = pixels_round _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 21:12:55 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 21:12:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319085=3A_Fix_Tkinter_tests_on_Tk_8=2E5_with_pat?= =?utf-8?b?Y2hsZXZlbCA8IDguNS4xMi4=?= Message-ID: <3dD4tv3Tk4z7Lld@mail.python.org> http://hg.python.org/cpython/rev/fe5a829bd645 changeset: 86925:fe5a829bd645 parent: 86921:ac1685661b07 parent: 86924:583347b79aa0 user: Serhiy Storchaka date: Mon Nov 04 22:11:43 2013 +0200 summary: Issue #19085: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.12. files: Lib/tkinter/test/widget_tests.py | 14 +++++++++++++- 1 files changed, 13 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -8,10 +8,22 @@ noconv = str if tcl_version < (8, 5) else False +pixels_round = round +if tcl_version[:2] == (8, 5): + # Issue #19085: Workaround a bug in Tk + # http://core.tcl.tk/tk/info/3497848 + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + pixels_round = int + del root + + _sentinel = object() class AbstractWidgetTest: - _conv_pixels = round + _conv_pixels = pixels_round _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:07:26 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 22:07:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fix_running_test=5Fttk=5Ftextonly_on_displayless_host=2E?= Message-ID: <3dD65p0SVsz7LjT@mail.python.org> http://hg.python.org/cpython/rev/fe7aaf14b129 changeset: 86926:fe7aaf14b129 branch: 2.7 parent: 86923:c3fa22d04fb2 user: Serhiy Storchaka date: Mon Nov 04 23:05:23 2013 +0200 summary: Issue #19085: Fix running test_ttk_textonly on displayless host. 
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 5 +- Lib/lib-tk/test/widget_tests.py | 18 ++++++--- 2 files changed, 15 insertions(+), 8 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -4,7 +4,8 @@ from test.test_support import requires, run_unittest from test_ttk.support import tcl_version, requires_tcl, widget_eq -from widget_tests import (add_standard_options, noconv, noconv_meth, int_round, +from widget_tests import ( + add_standard_options, noconv, noconv_meth, int_round, pixels_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) @@ -240,7 +241,7 @@ 'takefocus', 'text', 'textvariable', 'underline', 'width', 'wraplength', ) - _conv_pixels = staticmethod(AbstractWidgetTest._conv_pixels) + _conv_pixels = staticmethod(pixels_round) def _create(self, **kwargs): return Tkinter.Menubutton(self.root, **kwargs) diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -15,12 +15,18 @@ if tcl_version[:2] == (8, 5): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - pixels_round = int - del root + _pixels_round = None + def pixels_round(x): + global _pixels_round + if _pixels_round is None: + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + _pixels_round = int + else: + _pixels_round = int_round + return _pixels_round(x) _sentinel = object() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:07:27 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 22:07:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg1?= =?utf-8?q?=3A_Fix_running_test=5Fttk=5Ftextonly_on_displayless_host=2E?= Message-ID: <3dD65q4B4Jz7Ll9@mail.python.org> http://hg.python.org/cpython/rev/47d3714dcb33 changeset: 86927:47d3714dcb33 branch: 3.3 parent: 86924:583347b79aa0 user: Serhiy Storchaka date: Mon Nov 04 23:05:37 2013 +0200 summary: Issue #19085: Fix running test_ttk_textonly on displayless host. 
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 5 +- Lib/tkinter/test/widget_tests.py | 20 ++++++--- 2 files changed, 16 insertions(+), 9 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -4,7 +4,8 @@ from test.support import requires from tkinter.test.support import tcl_version, requires_tcl, widget_eq -from tkinter.test.widget_tests import (add_standard_options, noconv, +from tkinter.test.widget_tests import ( + add_standard_options, noconv, pixels_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) requires('gui') @@ -243,7 +244,7 @@ 'takefocus', 'text', 'textvariable', 'underline', 'width', 'wraplength', ) - _conv_pixels = AbstractWidgetTest._conv_pixels + _conv_pixels = staticmethod(pixels_round) def _create(self, **kwargs): return tkinter.Menubutton(self.root, **kwargs) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -12,18 +12,24 @@ if tcl_version[:2] == (8, 5): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - pixels_round = int - del root + _pixels_round = None + def pixels_round(x): + global _pixels_round + if _pixels_round is None: + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + _pixels_round = int + else: + _pixels_round = int_round + return _pixels_round(x) _sentinel = object() class AbstractWidgetTest: - _conv_pixels = pixels_round + _conv_pixels = staticmethod(pixels_round) _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:07:29 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 4 Nov 2013 22:07:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319085=3A_Fix_running_test=5Fttk=5Ftextonly_on_d?= =?utf-8?q?isplayless_host=2E?= Message-ID: <3dD65s0ZtDz7Llh@mail.python.org> http://hg.python.org/cpython/rev/713cc4908a96 changeset: 86928:713cc4908a96 parent: 86925:fe5a829bd645 parent: 86927:47d3714dcb33 user: Serhiy Storchaka date: Mon Nov 04 23:06:51 2013 +0200 summary: Issue #19085: Fix running test_ttk_textonly on displayless host. 
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 5 +- Lib/tkinter/test/widget_tests.py | 20 ++++++--- 2 files changed, 16 insertions(+), 9 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -4,7 +4,8 @@ from test.support import requires from tkinter.test.support import tcl_version, requires_tcl, widget_eq -from tkinter.test.widget_tests import (add_standard_options, noconv, +from tkinter.test.widget_tests import ( + add_standard_options, noconv, pixels_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) requires('gui') @@ -243,7 +244,7 @@ 'takefocus', 'text', 'textvariable', 'underline', 'width', 'wraplength', ) - _conv_pixels = AbstractWidgetTest._conv_pixels + _conv_pixels = staticmethod(pixels_round) def _create(self, **kwargs): return tkinter.Menubutton(self.root, **kwargs) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -12,18 +12,24 @@ if tcl_version[:2] == (8, 5): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - pixels_round = int - del root + _pixels_round = None + def pixels_round(x): + global _pixels_round + if _pixels_round is None: + root = setup_master() + patchlevel = root.call('info', 'patchlevel') + patchlevel = tuple(map(int, patchlevel.split('.'))) + if patchlevel < (8, 5, 12): + _pixels_round = int + else: + _pixels_round = int_round + return _pixels_round(x) _sentinel = object() class AbstractWidgetTest: - _conv_pixels = pixels_round + _conv_pixels = staticmethod(pixels_round) _conv_pad_pixels = None wantobjects = True -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:18:27 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 4 Nov 2013 22:18:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Locks_improveme?= =?utf-8?q?nts_by_Arnaud_Faure=3A_better_repr=28=29=2C_change_Conditio=5C?= Message-ID: <3dD6LW2Gyjz7LjP@mail.python.org> http://hg.python.org/cpython/rev/268259370a01 changeset: 86929:268259370a01 user: Guido van Rossum date: Mon Nov 04 13:18:19 2013 -0800 summary: asyncio: Locks improvements by Arnaud Faure: better repr(), change Conditio\ n structure. files: Lib/asyncio/locks.py | 78 +++++++++++----- Lib/test/test_asyncio/test_locks.py | 71 +++++++++++++++- 2 files changed, 124 insertions(+), 25 deletions(-) diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py --- a/Lib/asyncio/locks.py +++ b/Lib/asyncio/locks.py @@ -155,9 +155,11 @@ self._loop = events.get_event_loop() def __repr__(self): - # TODO: add waiters:N if > 0. res = super().__repr__() - return '<{} [{}]>'.format(res[1:-1], 'set' if self._value else 'unset') + extra = 'set' if self._value else 'unset' + if self._waiters: + extra = '{},waiters:{}'.format(extra, len(self._waiters)) + return '<{} [{}]>'.format(res[1:-1], extra) def is_set(self): """Return true if and only if the internal flag is true.""" @@ -201,20 +203,38 @@ self._waiters.remove(fut) -# TODO: Why is this a Lock subclass? threading.Condition *has* a lock. -class Condition(Lock): - """A Condition implementation. 
+class Condition: + """A Condition implementation, our equivalent to threading.Condition. This class implements condition variable objects. A condition variable allows one or more coroutines to wait until they are notified by another coroutine. + + A new Lock object is created and used as the underlying lock. """ def __init__(self, *, loop=None): - super().__init__(loop=loop) - self._condition_waiters = collections.deque() + if loop is not None: + self._loop = loop + else: + self._loop = events.get_event_loop() - # TODO: Add __repr__() with len(_condition_waiters). + # Lock as an attribute as in threading.Condition. + lock = Lock(loop=self._loop) + self._lock = lock + # Export the lock's locked(), acquire() and release() methods. + self.locked = lock.locked + self.acquire = lock.acquire + self.release = lock.release + + self._waiters = collections.deque() + + def __repr__(self): + res = super().__repr__() + extra = 'locked' if self.locked() else 'unlocked' + if self._waiters: + extra = '{},waiters:{}'.format(extra, len(self._waiters)) + return '<{} [{}]>'.format(res[1:-1], extra) @tasks.coroutine def wait(self): @@ -228,19 +248,19 @@ the same condition variable in another coroutine. Once awakened, it re-acquires the lock and returns True. """ - if not self._locked: + if not self.locked(): raise RuntimeError('cannot wait on un-acquired lock') keep_lock = True self.release() try: fut = futures.Future(loop=self._loop) - self._condition_waiters.append(fut) + self._waiters.append(fut) try: yield from fut return True finally: - self._condition_waiters.remove(fut) + self._waiters.remove(fut) except GeneratorExit: keep_lock = False # Prevent yield in finally clause. @@ -275,11 +295,11 @@ wait() call until it can reacquire the lock. Since notify() does not release the lock, its caller should. """ - if not self._locked: + if not self.locked(): raise RuntimeError('cannot notify on un-acquired lock') idx = 0 - for fut in self._condition_waiters: + for fut in self._waiters: if idx >= n: break @@ -293,7 +313,17 @@ calling thread has not acquired the lock when this method is called, a RuntimeError is raised. """ - self.notify(len(self._condition_waiters)) + self.notify(len(self._waiters)) + + def __enter__(self): + return self._lock.__enter__() + + def __exit__(self, *args): + return self._lock.__exit__(*args) + + def __iter__(self): + yield from self.acquire() + return self class Semaphore: @@ -310,10 +340,10 @@ counter; it defaults to 1. If the value given is less than 0, ValueError is raised. - The second optional argument determins can semophore be released more than - initial internal counter value; it defaults to False. If the value given - is True and number of release() is more than number of successfull - acquire() calls ValueError is raised. + The second optional argument determines if the semaphore can be released + more than initial internal counter value; it defaults to False. If the + value given is True and number of release() is more than number of + successful acquire() calls ValueError is raised. """ def __init__(self, value=1, bound=False, *, loop=None): @@ -330,12 +360,12 @@ self._loop = events.get_event_loop() def __repr__(self): - # TODO: add waiters:N if > 0. 
res = super().__repr__() - return '<{} [{}]>'.format( - res[1:-1], - 'locked' if self._locked else 'unlocked,value:{}'.format( - self._value)) + extra = 'locked' if self._locked else 'unlocked,value:{}'.format( + self._value) + if self._waiters: + extra = '{},waiters:{}'.format(extra, len(self._waiters)) + return '<{} [{}]>'.format(res[1:-1], extra) def locked(self): """Returns True if semaphore can not be acquired immediately.""" @@ -373,7 +403,7 @@ When it was zero on entry and another coroutine is waiting for it to become larger than zero again, wake up that coroutine. - If Semaphore is create with "bound" paramter equals true, then + If Semaphore is created with "bound" parameter equals true, then release() method checks to make sure its current value doesn't exceed its initial value. If it does, ValueError is raised. """ diff --git a/Lib/test/test_asyncio/test_locks.py b/Lib/test/test_asyncio/test_locks.py --- a/Lib/test/test_asyncio/test_locks.py +++ b/Lib/test/test_asyncio/test_locks.py @@ -2,6 +2,7 @@ import unittest import unittest.mock +import re from asyncio import events from asyncio import futures @@ -10,6 +11,15 @@ from asyncio import test_utils +STR_RGX_REPR = ( + r'^<(?P.*?) object at (?P
.*?)' + r'\[(?P' + r'(set|unset|locked|unlocked)(,value:\d)?(,waiters:\d+)?' + r')\]>\Z' +) +RGX_REPR = re.compile(STR_RGX_REPR) + + class LockTests(unittest.TestCase): def setUp(self): @@ -38,6 +48,7 @@ def test_repr(self): lock = locks.Lock(loop=self.loop) self.assertTrue(repr(lock).endswith('[unlocked]>')) + self.assertTrue(RGX_REPR.match(repr(lock))) @tasks.coroutine def acquire_lock(): @@ -45,6 +56,7 @@ self.loop.run_until_complete(acquire_lock()) self.assertTrue(repr(lock).endswith('[locked]>')) + self.assertTrue(RGX_REPR.match(repr(lock))) def test_lock(self): lock = locks.Lock(loop=self.loop) @@ -239,9 +251,16 @@ def test_repr(self): ev = locks.Event(loop=self.loop) self.assertTrue(repr(ev).endswith('[unset]>')) + match = RGX_REPR.match(repr(ev)) + self.assertEqual(match.group('extras'), 'unset') ev.set() self.assertTrue(repr(ev).endswith('[set]>')) + self.assertTrue(RGX_REPR.match(repr(ev))) + + ev._waiters.append(unittest.mock.Mock()) + self.assertTrue('waiters:1' in repr(ev)) + self.assertTrue(RGX_REPR.match(repr(ev))) def test_wait(self): ev = locks.Event(loop=self.loop) @@ -440,7 +459,7 @@ self.assertRaises( futures.CancelledError, self.loop.run_until_complete, wait) - self.assertFalse(cond._condition_waiters) + self.assertFalse(cond._waiters) self.assertTrue(cond.locked()) def test_wait_unacquired(self): @@ -600,6 +619,45 @@ cond = locks.Condition(loop=self.loop) self.assertRaises(RuntimeError, cond.notify_all) + def test_repr(self): + cond = locks.Condition(loop=self.loop) + self.assertTrue('unlocked' in repr(cond)) + self.assertTrue(RGX_REPR.match(repr(cond))) + + self.loop.run_until_complete(cond.acquire()) + self.assertTrue('locked' in repr(cond)) + + cond._waiters.append(unittest.mock.Mock()) + self.assertTrue('waiters:1' in repr(cond)) + self.assertTrue(RGX_REPR.match(repr(cond))) + + cond._waiters.append(unittest.mock.Mock()) + self.assertTrue('waiters:2' in repr(cond)) + self.assertTrue(RGX_REPR.match(repr(cond))) + + def test_context_manager(self): + cond = locks.Condition(loop=self.loop) + + @tasks.coroutine + def acquire_cond(): + return (yield from cond) + + with self.loop.run_until_complete(acquire_cond()): + self.assertTrue(cond.locked()) + + self.assertFalse(cond.locked()) + + def test_context_manager_no_yield(self): + cond = locks.Condition(loop=self.loop) + + try: + with cond: + self.fail('RuntimeError is not raised in with expression') + except RuntimeError as err: + self.assertEqual( + str(err), + '"yield from" should be used as context manager expression') + class SemaphoreTests(unittest.TestCase): @@ -629,9 +687,20 @@ def test_repr(self): sem = locks.Semaphore(loop=self.loop) self.assertTrue(repr(sem).endswith('[unlocked,value:1]>')) + self.assertTrue(RGX_REPR.match(repr(sem))) self.loop.run_until_complete(sem.acquire()) self.assertTrue(repr(sem).endswith('[locked]>')) + self.assertTrue('waiters' not in repr(sem)) + self.assertTrue(RGX_REPR.match(repr(sem))) + + sem._waiters.append(unittest.mock.Mock()) + self.assertTrue('waiters:1' in repr(sem)) + self.assertTrue(RGX_REPR.match(repr(sem))) + + sem._waiters.append(unittest.mock.Mock()) + self.assertTrue('waiters:2' in repr(sem)) + self.assertTrue(RGX_REPR.match(repr(sem))) def test_semaphore(self): sem = locks.Semaphore(loop=self.loop) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:35:40 2013 From: python-checkins at python.org (benjamin.peterson) Date: Mon, 4 Nov 2013 22:35:40 +0100 (CET) Subject: [Python-checkins] 
=?utf-8?q?cpython_=282=2E7=29=3A_Fix_unintended?= =?utf-8?q?_switch_from_a_constant_to_a_global_in_56a3c0bc4634?= Message-ID: <3dD6kN1FgBz7Lq4@mail.python.org> http://hg.python.org/cpython/rev/d9790d8526fa changeset: 86930:d9790d8526fa branch: 2.7 parent: 86817:2d02b7a97e0b user: Raymond Hettinger date: Mon Oct 28 02:39:04 2013 -0600 summary: Fix unintended switch from a constant to a global in 56a3c0bc4634 files: Lib/heapq.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/heapq.py b/Lib/heapq.py --- a/Lib/heapq.py +++ b/Lib/heapq.py @@ -380,7 +380,7 @@ while _len(h) > 1: try: - while True: + while 1: v, itnum, next = s = h[0] yield v s[0] = next() # raises StopIteration when exhausted -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 4 22:35:41 2013 From: python-checkins at python.org (benjamin.peterson) Date: Mon, 4 Nov 2013 22:35:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_merge_2=2E7=2E6_release_branch?= Message-ID: <3dD6kP3zd8z7Lnl@mail.python.org> http://hg.python.org/cpython/rev/c7c90330eea1 changeset: 86931:c7c90330eea1 branch: 2.7 parent: 86926:fe7aaf14b129 parent: 86930:d9790d8526fa user: Benjamin Peterson date: Mon Nov 04 16:35:33 2013 -0500 summary: merge 2.7.6 release branch files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 00:45:08 2013 From: python-checkins at python.org (eric.snow) Date: Tue, 5 Nov 2013 00:45:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?cGVwczogW1BFUCA0NTFdICJ0YXJnZXQiIGFy?= =?utf-8?q?g_of_find=5Fspec=28=29_is_not_exclusive_to_reloading=2E?= Message-ID: <3dD9bm1LMnz7LpC@mail.python.org> http://hg.python.org/peps/rev/b8c8299914c5 changeset: 5253:b8c8299914c5 user: Eric Snow date: Mon Nov 04 16:40:45 2013 -0700 summary: [PEP 451] "target" arg of find_spec() is not exclusive to reloading. files: pep-0451.txt | 33 +++++++++++++++++++-------------- 1 files changed, 19 insertions(+), 14 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -717,22 +717,27 @@ The "target" parameter of find_spec() ------------------------------------- -A module object with the same name as the "name" argument (or None, the -default) should be passed in to "exising". This argument allows the -finder to build the module spec with more information than is otherwise -available. This is particularly relevant in identifying the loader to -use. +A call to find_spec() may optionally include a "target" argument. This +is the module object that will be used subsequently as the target of +loading. During normal import (and by default) "target" is None, +meaning the target module has yet to be created. During reloading the +module passed in to reload() is passed through to find_spec() as the +target. This argument allows the finder to build the module spec with +more information than is otherwise available. Doing so is particularly +relevant in identifying the loader to use. Through find_spec() the finder will always identify the loader it -will return in the spec. In the case of reload, at this point the -finder should also decide whether or not the loader supports loading -into the module-to-be-reloaded (which was passed in to find_spec() as -"target"). This decision may entail consulting with the loader. 
If -the finder determines that the loader does not support reloading that -module, it should either find another loader or raise ImportError -(completely stopping import of the module). This reload decision is -important since, as noted in `How Reloading Will Work`_, loaders will -no longer be able to trivially identify a reload situation on their own. +will return in the spec (or return None). At the point the loader is +identified, the finder should also decide whether or not the loader +supports loading into the target module, in the case that "target" is +passed in. This decision may entail consulting with the loader. + +If the finder determines that the loader does not support loading into +the target module, it should either find another loader or raise +ImportError (completely stopping import of the module). This +determination is especially important during reload since, as noted in +`How Reloading Will Work`_, loaders will no longer be able to trivially +identify a reload situation on their own. Two alternatives were presented to the "target" parameter: Loader.supports_reload() and adding "target" to Loader.exec_module() -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 5 00:50:55 2013 From: python-checkins at python.org (guido.van.rossum) Date: Tue, 5 Nov 2013 00:50:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Refactor_SIGCHL?= =?utf-8?q?D_handling=2E_By_Anthony_Baire=2E?= Message-ID: <3dD9kR3cS1z7Lnt@mail.python.org> http://hg.python.org/cpython/rev/8d93ad260714 changeset: 86932:8d93ad260714 parent: 86929:268259370a01 user: Guido van Rossum date: Mon Nov 04 15:50:46 2013 -0800 summary: asyncio: Refactor SIGCHLD handling. By Anthony Baire. files: Lib/asyncio/events.py | 70 +- Lib/asyncio/unix_events.py | 396 +++- Lib/asyncio/windows_events.py | 17 +- Lib/test/test_asyncio/test_events.py | 44 +- Lib/test/test_asyncio/test_unix_events.py | 987 ++++++++- 5 files changed, 1315 insertions(+), 199 deletions(-) diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -1,10 +1,11 @@ """Event loop and event loop policy.""" -__all__ = ['AbstractEventLoopPolicy', 'DefaultEventLoopPolicy', +__all__ = ['AbstractEventLoopPolicy', 'AbstractEventLoop', 'AbstractServer', 'Handle', 'TimerHandle', 'get_event_loop_policy', 'set_event_loop_policy', 'get_event_loop', 'set_event_loop', 'new_event_loop', + 'get_child_watcher', 'set_child_watcher', ] import subprocess @@ -318,8 +319,18 @@ """XXX""" raise NotImplementedError + # Child processes handling (Unix only). -class DefaultEventLoopPolicy(threading.local, AbstractEventLoopPolicy): + def get_child_watcher(self): + """XXX""" + raise NotImplementedError + + def set_child_watcher(self, watcher): + """XXX""" + raise NotImplementedError + + +class BaseDefaultEventLoopPolicy(AbstractEventLoopPolicy): """Default policy implementation for accessing the event loop. In this policy, each thread has its own event loop. However, we @@ -332,28 +343,34 @@ associated). """ - _loop = None - _set_called = False + _loop_factory = None + + class _Local(threading.local): + _loop = None + _set_called = False + + def __init__(self): + self._local = self._Local() def get_event_loop(self): """Get the event loop. This may be None or an instance of EventLoop. 
""" - if (self._loop is None and - not self._set_called and + if (self._local._loop is None and + not self._local._set_called and isinstance(threading.current_thread(), threading._MainThread)): - self._loop = self.new_event_loop() - assert self._loop is not None, \ + self._local._loop = self.new_event_loop() + assert self._local._loop is not None, \ ('There is no current event loop in thread %r.' % threading.current_thread().name) - return self._loop + return self._local._loop def set_event_loop(self, loop): """Set the event loop.""" - self._set_called = True + self._local._set_called = True assert loop is None or isinstance(loop, AbstractEventLoop) - self._loop = loop + self._local._loop = loop def new_event_loop(self): """Create a new event loop. @@ -361,12 +378,7 @@ You must call set_event_loop() to make this the current event loop. """ - if sys.platform == 'win32': # pragma: no cover - from . import windows_events - return windows_events.SelectorEventLoop() - else: # pragma: no cover - from . import unix_events - return unix_events.SelectorEventLoop() + return self._loop_factory() # Event loop policy. The policy itself is always global, even if the @@ -375,12 +387,22 @@ # call to get_event_loop_policy(). _event_loop_policy = None +# Lock for protecting the on-the-fly creation of the event loop policy. +_lock = threading.Lock() + + +def _init_event_loop_policy(): + global _event_loop_policy + with _lock: + if _event_loop_policy is None: # pragma: no branch + from . import DefaultEventLoopPolicy + _event_loop_policy = DefaultEventLoopPolicy() + def get_event_loop_policy(): """XXX""" - global _event_loop_policy if _event_loop_policy is None: - _event_loop_policy = DefaultEventLoopPolicy() + _init_event_loop_policy() return _event_loop_policy @@ -404,3 +426,13 @@ def new_event_loop(): """XXX""" return get_event_loop_policy().new_event_loop() + + +def get_child_watcher(): + """XXX""" + return get_event_loop_policy().get_child_watcher() + + +def set_child_watcher(watcher): + """XXX""" + return get_event_loop_policy().set_child_watcher(watcher) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -8,6 +8,7 @@ import stat import subprocess import sys +import threading from . import base_subprocess @@ -20,7 +21,10 @@ from .log import logger -__all__ = ['SelectorEventLoop', 'STDIN', 'STDOUT', 'STDERR'] +__all__ = ['SelectorEventLoop', 'STDIN', 'STDOUT', 'STDERR', + 'AbstractChildWatcher', 'SafeChildWatcher', + 'FastChildWatcher', 'DefaultEventLoopPolicy', + ] STDIN = 0 STDOUT = 1 @@ -31,7 +35,7 @@ raise ImportError('Signals are not really supported on Windows') -class SelectorEventLoop(selector_events.BaseSelectorEventLoop): +class _UnixSelectorEventLoop(selector_events.BaseSelectorEventLoop): """Unix event loop Adds signal handling to SelectorEventLoop @@ -40,17 +44,10 @@ def __init__(self, selector=None): super().__init__(selector) self._signal_handlers = {} - self._subprocesses = {} def _socketpair(self): return socket.socketpair() - def close(self): - handler = self._signal_handlers.get(signal.SIGCHLD) - if handler is not None: - self.remove_signal_handler(signal.SIGCHLD) - super().close() - def add_signal_handler(self, sig, callback, *args): """Add a handler for a signal. UNIX only. 
@@ -152,49 +149,20 @@ def _make_subprocess_transport(self, protocol, args, shell, stdin, stdout, stderr, bufsize, extra=None, **kwargs): - self._reg_sigchld() - transp = _UnixSubprocessTransport(self, protocol, args, shell, - stdin, stdout, stderr, bufsize, - extra=None, **kwargs) - self._subprocesses[transp.get_pid()] = transp + with events.get_child_watcher() as watcher: + transp = _UnixSubprocessTransport(self, protocol, args, shell, + stdin, stdout, stderr, bufsize, + extra=None, **kwargs) + watcher.add_child_handler(transp.get_pid(), + self._child_watcher_callback, transp) yield from transp._post_init() return transp - def _reg_sigchld(self): - if signal.SIGCHLD not in self._signal_handlers: - self.add_signal_handler(signal.SIGCHLD, self._sig_chld) + def _child_watcher_callback(self, pid, returncode, transp): + self.call_soon_threadsafe(transp._process_exited, returncode) - def _sig_chld(self): - try: - # Because of signal coalescing, we must keep calling waitpid() as - # long as we're able to reap a child. - while True: - try: - pid, status = os.waitpid(-1, os.WNOHANG) - except ChildProcessError: - break # No more child processes exist. - if pid == 0: - break # All remaining child processes are still alive. - elif os.WIFSIGNALED(status): - # A child process died because of a signal. - returncode = -os.WTERMSIG(status) - elif os.WIFEXITED(status): - # A child process exited (e.g. sys.exit()). - returncode = os.WEXITSTATUS(status) - else: - # A child exited, but we don't understand its status. - # This shouldn't happen, but if it does, let's just - # return that status; perhaps that helps debug it. - returncode = status - transp = self._subprocesses.get(pid) - if transp is not None: - transp._process_exited(returncode) - except Exception: - logger.exception('Unknown exception in SIGCHLD handler') - - def _subprocess_closed(self, transport): - pid = transport.get_pid() - self._subprocesses.pop(pid, None) + def _subprocess_closed(self, transp): + pass def _set_nonblocking(fd): @@ -423,3 +391,335 @@ if stdin_w is not None: stdin.close() self._proc.stdin = open(stdin_w.detach(), 'rb', buffering=bufsize) + + +class AbstractChildWatcher: + """Abstract base class for monitoring child processes. + + Objects derived from this class monitor a collection of subprocesses and + report their termination or interruption by a signal. + + New callbacks are registered with .add_child_handler(). Starting a new + process must be done within a 'with' block to allow the watcher to suspend + its activity until the new process if fully registered (this is needed to + prevent a race condition in some implementations). + + Example: + with watcher: + proc = subprocess.Popen("sleep 1") + watcher.add_child_handler(proc.pid, callback) + + Notes: + Implementations of this class must be thread-safe. + + Since child watcher objects may catch the SIGCHLD signal and call + waitpid(-1), there should be only one active object per process. + """ + + def add_child_handler(self, pid, callback, *args): + """Register a new child handler. + + Arrange for callback(pid, returncode, *args) to be called when + process 'pid' terminates. Specifying another callback for the same + process replaces the previous handler. + + Note: callback() must be thread-safe + """ + raise NotImplementedError() + + def remove_child_handler(self, pid): + """Removes the handler for process 'pid'. 
+ + The function returns True if the handler was successfully removed, + False if there was nothing to remove.""" + + raise NotImplementedError() + + def set_loop(self, loop): + """Reattach the watcher to another event loop. + + Note: loop may be None + """ + raise NotImplementedError() + + def close(self): + """Close the watcher. + + This must be called to make sure that any underlying resource is freed. + """ + raise NotImplementedError() + + def __enter__(self): + """Enter the watcher's context and allow starting new processes + + This function must return self""" + raise NotImplementedError() + + def __exit__(self, a, b, c): + """Exit the watcher's context""" + raise NotImplementedError() + + +class BaseChildWatcher(AbstractChildWatcher): + + def __init__(self, loop): + self._loop = None + self._callbacks = {} + + self.set_loop(loop) + + def close(self): + self.set_loop(None) + self._callbacks.clear() + + def _do_waitpid(self, expected_pid): + raise NotImplementedError() + + def _do_waitpid_all(self): + raise NotImplementedError() + + def set_loop(self, loop): + assert loop is None or isinstance(loop, events.AbstractEventLoop) + + if self._loop is not None: + self._loop.remove_signal_handler(signal.SIGCHLD) + + self._loop = loop + if loop is not None: + loop.add_signal_handler(signal.SIGCHLD, self._sig_chld) + + # Prevent a race condition in case a child terminated + # during the switch. + self._do_waitpid_all() + + def remove_child_handler(self, pid): + try: + del self._callbacks[pid] + return True + except KeyError: + return False + + def _sig_chld(self): + try: + self._do_waitpid_all() + except Exception: + logger.exception('Unknown exception in SIGCHLD handler') + + def _compute_returncode(self, status): + if os.WIFSIGNALED(status): + # The child process died because of a signal. + return -os.WTERMSIG(status) + elif os.WIFEXITED(status): + # The child process exited (e.g sys.exit()). + return os.WEXITSTATUS(status) + else: + # The child exited, but we don't understand its status. + # This shouldn't happen, but if it does, let's just + # return that status; perhaps that helps debug it. + return status + + +class SafeChildWatcher(BaseChildWatcher): + """'Safe' child watcher implementation. + + This implementation avoids disrupting other code spawning processes by + polling explicitly each process in the SIGCHLD handler instead of calling + os.waitpid(-1). + + This is a safe solution but it has a significant overhead when handling a + big number of children (O(n) each time SIGCHLD is raised) + """ + + def __enter__(self): + return self + + def __exit__(self, a, b, c): + pass + + def add_child_handler(self, pid, callback, *args): + self._callbacks[pid] = callback, args + + # Prevent a race condition in case the child is already terminated. + self._do_waitpid(pid) + + def _do_waitpid_all(self): + + for pid in list(self._callbacks): + self._do_waitpid(pid) + + def _do_waitpid(self, expected_pid): + assert expected_pid > 0 + + try: + pid, status = os.waitpid(expected_pid, os.WNOHANG) + except ChildProcessError: + # The child process is already reaped + # (may happen if waitpid() is called elsewhere). + pid = expected_pid + returncode = 255 + logger.warning( + "Unknown child process pid %d, will report returncode 255", + pid) + else: + if pid == 0: + # The child process is still alive. 
+ return + + returncode = self._compute_returncode(status) + + try: + callback, args = self._callbacks.pop(pid) + except KeyError: # pragma: no cover + # May happen if .remove_child_handler() is called + # after os.waitpid() returns. + pass + else: + callback(pid, returncode, *args) + + +class FastChildWatcher(BaseChildWatcher): + """'Fast' child watcher implementation. + + This implementation reaps every terminated processes by calling + os.waitpid(-1) directly, possibly breaking other code spawning processes + and waiting for their termination. + + There is no noticeable overhead when handling a big number of children + (O(1) each time a child terminates). + """ + def __init__(self, loop): + super().__init__(loop) + + self._lock = threading.Lock() + self._zombies = {} + self._forks = 0 + + def close(self): + super().close() + self._zombies.clear() + + def __enter__(self): + with self._lock: + self._forks += 1 + + return self + + def __exit__(self, a, b, c): + with self._lock: + self._forks -= 1 + + if self._forks or not self._zombies: + return + + collateral_victims = str(self._zombies) + self._zombies.clear() + + logger.warning( + "Caught subprocesses termination from unknown pids: %s", + collateral_victims) + + def add_child_handler(self, pid, callback, *args): + assert self._forks, "Must use the context manager" + + self._callbacks[pid] = callback, args + + try: + # Ensure that the child is not already terminated. + # (raise KeyError if still alive) + returncode = self._zombies.pop(pid) + + # Child is dead, therefore we can fire the callback immediately. + # First we remove it from the dict. + # (raise KeyError if .remove_child_handler() was called in-between) + del self._callbacks[pid] + except KeyError: + pass + else: + callback(pid, returncode, *args) + + def _do_waitpid_all(self): + # Because of signal coalescing, we must keep calling waitpid() as + # long as we're able to reap a child. + while True: + try: + pid, status = os.waitpid(-1, os.WNOHANG) + except ChildProcessError: + # No more child processes exist. + return + else: + if pid == 0: + # A child process is still alive. + return + + returncode = self._compute_returncode(status) + + try: + callback, args = self._callbacks.pop(pid) + except KeyError: + # unknown child + with self._lock: + if self._forks: + # It may not be registered yet. + self._zombies[pid] = returncode + continue + + logger.warning( + "Caught subprocess termination from unknown pid: " + "%d -> %d", pid, returncode) + else: + callback(pid, returncode, *args) + + +class _UnixDefaultEventLoopPolicy(events.BaseDefaultEventLoopPolicy): + """XXX""" + _loop_factory = _UnixSelectorEventLoop + + def __init__(self): + super().__init__() + self._watcher = None + + def _init_watcher(self): + with events._lock: + if self._watcher is None: # pragma: no branch + if isinstance(threading.current_thread(), + threading._MainThread): + self._watcher = SafeChildWatcher(self._local._loop) + else: + self._watcher = SafeChildWatcher(None) + + def set_event_loop(self, loop): + """Set the event loop. + + As a side effect, if a child watcher was set before, then calling + .set_event_loop() from the main thread will call .set_loop(loop) on the + child watcher. + """ + + super().set_event_loop(loop) + + if self._watcher is not None and \ + isinstance(threading.current_thread(), threading._MainThread): + self._watcher.set_loop(loop) + + def get_child_watcher(self): + """Get the child watcher + + If not yet set, a SafeChildWatcher object is automatically created. 
+ """ + if self._watcher is None: + self._init_watcher() + + return self._watcher + + def set_child_watcher(self, watcher): + """Set the child watcher""" + + assert watcher is None or isinstance(watcher, AbstractChildWatcher) + + if self._watcher is not None: + self._watcher.close() + + self._watcher = watcher + +SelectorEventLoop = _UnixSelectorEventLoop +DefaultEventLoopPolicy = _UnixDefaultEventLoopPolicy diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -7,6 +7,7 @@ import struct import _winapi +from . import events from . import base_subprocess from . import futures from . import proactor_events @@ -17,7 +18,9 @@ from . import _overlapped -__all__ = ['SelectorEventLoop', 'ProactorEventLoop', 'IocpProactor'] +__all__ = ['SelectorEventLoop', 'ProactorEventLoop', 'IocpProactor', + 'DefaultEventLoopPolicy', + ] NULL = 0 @@ -108,7 +111,7 @@ __del__ = close -class SelectorEventLoop(selector_events.BaseSelectorEventLoop): +class _WindowsSelectorEventLoop(selector_events.BaseSelectorEventLoop): """Windows version of selector event loop.""" def _socketpair(self): @@ -453,3 +456,13 @@ f = self._loop._proactor.wait_for_handle(int(self._proc._handle)) f.add_done_callback(callback) + + +SelectorEventLoop = _WindowsSelectorEventLoop + + +class _WindowsDefaultEventLoopPolicy(events.BaseDefaultEventLoopPolicy): + _loop_factory = SelectorEventLoop + + +DefaultEventLoopPolicy = _WindowsDefaultEventLoopPolicy diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1308,8 +1308,17 @@ from asyncio import selectors from asyncio import unix_events + class UnixEventLoopTestsMixin(EventLoopTestsMixin): + def setUp(self): + super().setUp() + events.set_child_watcher(unix_events.SafeChildWatcher(self.loop)) + + def tearDown(self): + events.set_child_watcher(None) + super().tearDown() + if hasattr(selectors, 'KqueueSelector'): - class KqueueEventLoopTests(EventLoopTestsMixin, + class KqueueEventLoopTests(UnixEventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): @@ -1318,7 +1327,7 @@ selectors.KqueueSelector()) if hasattr(selectors, 'EpollSelector'): - class EPollEventLoopTests(EventLoopTestsMixin, + class EPollEventLoopTests(UnixEventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): @@ -1326,7 +1335,7 @@ return unix_events.SelectorEventLoop(selectors.EpollSelector()) if hasattr(selectors, 'PollSelector'): - class PollEventLoopTests(EventLoopTestsMixin, + class PollEventLoopTests(UnixEventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): @@ -1334,7 +1343,7 @@ return unix_events.SelectorEventLoop(selectors.PollSelector()) # Should always exist. 
- class SelectEventLoopTests(EventLoopTestsMixin, + class SelectEventLoopTests(UnixEventLoopTestsMixin, SubprocessTestsMixin, unittest.TestCase): @@ -1557,25 +1566,36 @@ class PolicyTests(unittest.TestCase): + def create_policy(self): + if sys.platform == "win32": + from asyncio import windows_events + return windows_events.DefaultEventLoopPolicy() + else: + from asyncio import unix_events + return unix_events.DefaultEventLoopPolicy() + def test_event_loop_policy(self): policy = events.AbstractEventLoopPolicy() self.assertRaises(NotImplementedError, policy.get_event_loop) self.assertRaises(NotImplementedError, policy.set_event_loop, object()) self.assertRaises(NotImplementedError, policy.new_event_loop) + self.assertRaises(NotImplementedError, policy.get_child_watcher) + self.assertRaises(NotImplementedError, policy.set_child_watcher, + object()) def test_get_event_loop(self): - policy = events.DefaultEventLoopPolicy() - self.assertIsNone(policy._loop) + policy = self.create_policy() + self.assertIsNone(policy._local._loop) loop = policy.get_event_loop() self.assertIsInstance(loop, events.AbstractEventLoop) - self.assertIs(policy._loop, loop) + self.assertIs(policy._local._loop, loop) self.assertIs(loop, policy.get_event_loop()) loop.close() def test_get_event_loop_after_set_none(self): - policy = events.DefaultEventLoopPolicy() + policy = self.create_policy() policy.set_event_loop(None) self.assertRaises(AssertionError, policy.get_event_loop) @@ -1583,7 +1603,7 @@ def test_get_event_loop_thread(self, m_current_thread): def f(): - policy = events.DefaultEventLoopPolicy() + policy = self.create_policy() self.assertRaises(AssertionError, policy.get_event_loop) th = threading.Thread(target=f) @@ -1591,14 +1611,14 @@ th.join() def test_new_event_loop(self): - policy = events.DefaultEventLoopPolicy() + policy = self.create_policy() loop = policy.new_event_loop() self.assertIsInstance(loop, events.AbstractEventLoop) loop.close() def test_set_event_loop(self): - policy = events.DefaultEventLoopPolicy() + policy = self.create_policy() old_loop = policy.get_event_loop() self.assertRaises(AssertionError, policy.set_event_loop, object()) @@ -1621,7 +1641,7 @@ old_policy = events.get_event_loop_policy() - policy = events.DefaultEventLoopPolicy() + policy = self.create_policy() events.set_event_loop_policy(policy) self.assertIs(policy, events.get_event_loop_policy()) self.assertIsNot(policy, old_policy) diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -3,10 +3,12 @@ import gc import errno import io +import os import pprint import signal import stat import sys +import threading import unittest import unittest.mock @@ -181,124 +183,6 @@ self.assertRaises( RuntimeError, self.loop.remove_signal_handler, signal.SIGHUP) - @unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): - m_waitpid.side_effect = [(7, object()), ChildProcessError] - m_WIFEXITED.return_value = True - m_WIFSIGNALED.return_value = False - m_WEXITSTATUS.return_value = 3 - transp = unittest.mock.Mock() - self.loop._subprocesses[7] = transp - - self.loop._sig_chld() - transp._process_exited.assert_called_with(3) - self.assertFalse(m_WTERMSIG.called) - - 
@unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld_signal(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): - m_waitpid.side_effect = [(7, object()), ChildProcessError] - m_WIFEXITED.return_value = False - m_WIFSIGNALED.return_value = True - m_WTERMSIG.return_value = 1 - transp = unittest.mock.Mock() - self.loop._subprocesses[7] = transp - - self.loop._sig_chld() - transp._process_exited.assert_called_with(-1) - self.assertFalse(m_WEXITSTATUS.called) - - @unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld_zero_pid(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): - m_waitpid.side_effect = [(0, object()), ChildProcessError] - transp = unittest.mock.Mock() - self.loop._subprocesses[7] = transp - - self.loop._sig_chld() - self.assertFalse(transp._process_exited.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WTERMSIG.called) - self.assertFalse(m_WEXITSTATUS.called) - - @unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld_not_registered_subprocess(self, m_waitpid, - m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): - m_waitpid.side_effect = [(7, object()), ChildProcessError] - m_WIFEXITED.return_value = True - m_WIFSIGNALED.return_value = False - m_WEXITSTATUS.return_value = 3 - - self.loop._sig_chld() - self.assertFalse(m_WTERMSIG.called) - - @unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld_unknown_status(self, m_waitpid, - m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): - m_waitpid.side_effect = [(7, object()), ChildProcessError] - m_WIFEXITED.return_value = False - m_WIFSIGNALED.return_value = False - transp = unittest.mock.Mock() - self.loop._subprocesses[7] = transp - - self.loop._sig_chld() - self.assertTrue(transp._process_exited.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) - - @unittest.mock.patch('asyncio.unix_events.logger') - @unittest.mock.patch('os.WTERMSIG') - @unittest.mock.patch('os.WEXITSTATUS') - @unittest.mock.patch('os.WIFSIGNALED') - @unittest.mock.patch('os.WIFEXITED') - @unittest.mock.patch('os.waitpid') - def test__sig_chld_unknown_status_in_handler(self, m_waitpid, - m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG, - m_log): - m_waitpid.side_effect = Exception - transp = unittest.mock.Mock() - self.loop._subprocesses[7] = transp - - self.loop._sig_chld() - self.assertFalse(transp._process_exited.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WTERMSIG.called) - self.assertFalse(m_WEXITSTATUS.called) - m_log.exception.assert_called_with( - 'Unknown exception in SIGCHLD handler') - - @unittest.mock.patch('os.waitpid') - def test__sig_chld_process_error(self, m_waitpid): - m_waitpid.side_effect = ChildProcessError - self.loop._sig_chld() - self.assertTrue(m_waitpid.called) - class 
UnixReadPipeTransportTests(unittest.TestCase): @@ -777,5 +661,872 @@ self.assertFalse(self.protocol.connection_lost.called) +class AbstractChildWatcherTests(unittest.TestCase): + + def test_not_implemented(self): + f = unittest.mock.Mock() + watcher = unix_events.AbstractChildWatcher() + self.assertRaises( + NotImplementedError, watcher.add_child_handler, f, f) + self.assertRaises( + NotImplementedError, watcher.remove_child_handler, f) + self.assertRaises( + NotImplementedError, watcher.set_loop, f) + self.assertRaises( + NotImplementedError, watcher.close) + self.assertRaises( + NotImplementedError, watcher.__enter__) + self.assertRaises( + NotImplementedError, watcher.__exit__, f, f, f) + + +class BaseChildWatcherTests(unittest.TestCase): + + def test_not_implemented(self): + f = unittest.mock.Mock() + watcher = unix_events.BaseChildWatcher(None) + self.assertRaises( + NotImplementedError, watcher._do_waitpid, f) + + +class ChildWatcherTestsMixin: + instance = None + + ignore_warnings = unittest.mock.patch.object(unix_events.logger, "warning") + + def setUp(self): + self.loop = test_utils.TestLoop() + self.running = False + self.zombies = {} + + assert ChildWatcherTestsMixin.instance is None + ChildWatcherTestsMixin.instance = self + + with unittest.mock.patch.object( + self.loop, "add_signal_handler") as self.m_add_signal_handler: + self.watcher = self.create_watcher(self.loop) + + def tearDown(self): + ChildWatcherTestsMixin.instance = None + + def waitpid(pid, flags): + self = ChildWatcherTestsMixin.instance + if isinstance(self.watcher, unix_events.SafeChildWatcher) or pid != -1: + self.assertGreater(pid, 0) + try: + if pid < 0: + return self.zombies.popitem() + else: + return pid, self.zombies.pop(pid) + except KeyError: + pass + if self.running: + return 0, 0 + else: + raise ChildProcessError() + + def add_zombie(self, pid, returncode): + self.zombies[pid] = returncode + 32768 + + def WIFEXITED(status): + return status >= 32768 + + def WIFSIGNALED(status): + return 32700 < status < 32768 + + def WEXITSTATUS(status): + self = ChildWatcherTestsMixin.instance + self.assertTrue(type(self).WIFEXITED(status)) + return status - 32768 + + def WTERMSIG(status): + self = ChildWatcherTestsMixin.instance + self.assertTrue(type(self).WIFSIGNALED(status)) + return 32768 - status + + def test_create_watcher(self): + self.m_add_signal_handler.assert_called_once_with( + signal.SIGCHLD, self.watcher._sig_chld) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, + m_WEXITSTATUS, m_WTERMSIG): + # register a child + callback = unittest.mock.Mock() + + with self.watcher: + self.running = True + self.watcher.add_child_handler(42, callback, 9, 10, 14) + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child is running + self.watcher._sig_chld() + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child terminates (returncode 12) + self.running = False + self.add_zombie(42, 12) + self.watcher._sig_chld() + + 
self.assertTrue(m_WIFEXITED.called) + self.assertTrue(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + callback.assert_called_once_with(42, 12, 9, 10, 14) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WEXITSTATUS.reset_mock() + callback.reset_mock() + + # ensure that the child is effectively reaped + self.add_zombie(42, 13) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback.called) + self.assertFalse(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WEXITSTATUS.reset_mock() + + # sigchld called again + self.zombies.clear() + self.watcher._sig_chld() + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_two_children(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, + m_WEXITSTATUS, m_WTERMSIG): + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + + # register child 1 + with self.watcher: + self.running = True + self.watcher.add_child_handler(43, callback1, 7, 8) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # register child 2 + with self.watcher: + self.watcher.add_child_handler(44, callback2, 147, 18) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # childen are running + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child 1 terminates (signal 3) + self.add_zombie(43, -3) + self.watcher._sig_chld() + + callback1.assert_called_once_with(43, -3, 7, 8) + self.assertFalse(callback2.called) + self.assertTrue(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertTrue(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WTERMSIG.reset_mock() + callback1.reset_mock() + + # child 2 still running + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child 2 terminates (code 108) + self.add_zombie(44, 108) + self.running = False + self.watcher._sig_chld() + + callback2.assert_called_once_with(44, 108, 147, 18) + self.assertFalse(callback1.called) + self.assertTrue(m_WIFEXITED.called) + self.assertTrue(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WEXITSTATUS.reset_mock() + callback2.reset_mock() + + # ensure that the children are effectively reaped + self.add_zombie(43, 14) + self.add_zombie(44, 15) + with self.ignore_warnings: + 
self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WEXITSTATUS.reset_mock() + + # sigchld called again + self.zombies.clear() + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_two_children_terminating_together( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + + # register child 1 + with self.watcher: + self.running = True + self.watcher.add_child_handler(45, callback1, 17, 8) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # register child 2 + with self.watcher: + self.watcher.add_child_handler(46, callback2, 1147, 18) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # childen are running + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child 1 terminates (code 78) + # child 2 terminates (signal 5) + self.add_zombie(45, 78) + self.add_zombie(46, -5) + self.running = False + self.watcher._sig_chld() + + callback1.assert_called_once_with(45, 78, 17, 8) + callback2.assert_called_once_with(46, -5, 1147, 18) + self.assertTrue(m_WIFSIGNALED.called) + self.assertTrue(m_WIFEXITED.called) + self.assertTrue(m_WEXITSTATUS.called) + self.assertTrue(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WTERMSIG.reset_mock() + m_WEXITSTATUS.reset_mock() + callback1.reset_mock() + callback2.reset_mock() + + # ensure that the children are effectively reaped + self.add_zombie(45, 14) + self.add_zombie(46, 15) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WTERMSIG.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_race_condition( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + # register a child + callback = unittest.mock.Mock() + + with self.watcher: + # child terminates before being registered + self.add_zombie(50, 4) + self.watcher._sig_chld() + + self.watcher.add_child_handler(50, callback, 1, 12) + + callback.assert_called_once_with(50, 4, 1, 12) 
+ callback.reset_mock() + + # ensure that the child is effectively reaped + self.add_zombie(50, -1) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_replace_handler( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + + # register a child + with self.watcher: + self.running = True + self.watcher.add_child_handler(51, callback1, 19) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # register the same child again + with self.watcher: + self.watcher.add_child_handler(51, callback2, 21) + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child terminates (signal 8) + self.running = False + self.add_zombie(51, -8) + self.watcher._sig_chld() + + callback2.assert_called_once_with(51, -8, 21) + self.assertFalse(callback1.called) + self.assertTrue(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertTrue(m_WTERMSIG.called) + + m_WIFSIGNALED.reset_mock() + m_WIFEXITED.reset_mock() + m_WTERMSIG.reset_mock() + callback2.reset_mock() + + # ensure that the child is effectively reaped + self.add_zombie(51, 13) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(m_WTERMSIG.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_remove_handler(self, m_waitpid, m_WIFEXITED, + m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + callback = unittest.mock.Mock() + + # register a child + with self.watcher: + self.running = True + self.watcher.add_child_handler(52, callback, 1984) + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # unregister the child + self.watcher.remove_child_handler(52) + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # child terminates (code 99) + self.running = False + self.add_zombie(52, 99) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_unknown_status(self, m_waitpid, 
m_WIFEXITED, + m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + callback = unittest.mock.Mock() + + # register a child + with self.watcher: + self.running = True + self.watcher.add_child_handler(53, callback, -19) + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + # terminate with unknown status + self.zombies[53] = 1178 + self.running = False + self.watcher._sig_chld() + + callback.assert_called_once_with(53, 1178, -19) + self.assertTrue(m_WIFEXITED.called) + self.assertTrue(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + + callback.reset_mock() + m_WIFEXITED.reset_mock() + m_WIFSIGNALED.reset_mock() + + # ensure that the child is effectively reaped + self.add_zombie(53, 101) + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_remove_child_handler(self, m_waitpid, m_WIFEXITED, + m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + callback3 = unittest.mock.Mock() + + # register children + with self.watcher: + self.running = True + self.watcher.add_child_handler(54, callback1, 1) + self.watcher.add_child_handler(55, callback2, 2) + self.watcher.add_child_handler(56, callback3, 3) + + # remove child handler 1 + self.assertTrue(self.watcher.remove_child_handler(54)) + + # remove child handler 2 multiple times + self.assertTrue(self.watcher.remove_child_handler(55)) + self.assertFalse(self.watcher.remove_child_handler(55)) + self.assertFalse(self.watcher.remove_child_handler(55)) + + # all children terminate + self.add_zombie(54, 0) + self.add_zombie(55, 1) + self.add_zombie(56, 2) + self.running = False + with self.ignore_warnings: + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + callback3.assert_called_once_with(56, 2, 3) + + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_unhandled_exception(self, m_waitpid): + callback = unittest.mock.Mock() + + # register a child + with self.watcher: + self.running = True + self.watcher.add_child_handler(57, callback) + + # raise an exception + m_waitpid.side_effect = ValueError + + with unittest.mock.patch.object(unix_events.logger, + "exception") as m_exception: + + self.assertEqual(self.watcher._sig_chld(), None) + self.assertTrue(m_exception.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_child_reaped_elsewhere( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + + # register a child + callback = unittest.mock.Mock() + + with self.watcher: + self.running = True + self.watcher.add_child_handler(58, callback) + + self.assertFalse(callback.called) + self.assertFalse(m_WIFEXITED.called) + self.assertFalse(m_WIFSIGNALED.called) + self.assertFalse(m_WEXITSTATUS.called) + self.assertFalse(m_WTERMSIG.called) + 
+ # child terminates + self.running = False + self.add_zombie(58, 4) + + # waitpid is called elsewhere + os.waitpid(58, os.WNOHANG) + + m_waitpid.reset_mock() + + # sigchld + with self.ignore_warnings: + self.watcher._sig_chld() + + callback.assert_called(m_waitpid) + if isinstance(self.watcher, unix_events.FastChildWatcher): + # here the FastChildWatche enters a deadlock + # (there is no way to prevent it) + self.assertFalse(callback.called) + else: + callback.assert_called_once_with(58, 255) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_sigchld_unknown_pid_during_registration( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + + # register two children + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + + with self.ignore_warnings, self.watcher: + self.running = True + # child 1 terminates + self.add_zombie(591, 7) + # an unknown child terminates + self.add_zombie(593, 17) + + self.watcher._sig_chld() + + self.watcher.add_child_handler(591, callback1) + self.watcher.add_child_handler(592, callback2) + + callback1.assert_called_once_with(591, 7) + self.assertFalse(callback2.called) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_set_loop( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + + # register a child + callback = unittest.mock.Mock() + + with self.watcher: + self.running = True + self.watcher.add_child_handler(60, callback) + + # attach a new loop + old_loop = self.loop + self.loop = test_utils.TestLoop() + + with unittest.mock.patch.object( + old_loop, + "remove_signal_handler") as m_old_remove_signal_handler, \ + unittest.mock.patch.object( + self.loop, + "add_signal_handler") as m_new_add_signal_handler: + + self.watcher.set_loop(self.loop) + + m_old_remove_signal_handler.assert_called_once_with( + signal.SIGCHLD) + m_new_add_signal_handler.assert_called_once_with( + signal.SIGCHLD, self.watcher._sig_chld) + + # child terminates + self.running = False + self.add_zombie(60, 9) + self.watcher._sig_chld() + + callback.assert_called_once_with(60, 9) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_set_loop_race_condition( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + + # register 3 children + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + callback3 = unittest.mock.Mock() + + with self.watcher: + self.running = True + self.watcher.add_child_handler(61, callback1) + self.watcher.add_child_handler(62, callback2) + self.watcher.add_child_handler(622, callback3) + + # detach the loop + old_loop = self.loop + self.loop = None + + with unittest.mock.patch.object( + old_loop, "remove_signal_handler") as m_remove_signal_handler: + + self.watcher.set_loop(None) + + m_remove_signal_handler.assert_called_once_with( + signal.SIGCHLD) + 
+ # child 1 & 2 terminate + self.add_zombie(61, 11) + self.add_zombie(62, -5) + + # SIGCHLD was not catched + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + self.assertFalse(callback3.called) + + # attach a new loop + self.loop = test_utils.TestLoop() + + with unittest.mock.patch.object( + self.loop, "add_signal_handler") as m_add_signal_handler: + + self.watcher.set_loop(self.loop) + + m_add_signal_handler.assert_called_once_with( + signal.SIGCHLD, self.watcher._sig_chld) + callback1.assert_called_once_with(61, 11) # race condition! + callback2.assert_called_once_with(62, -5) # race condition! + self.assertFalse(callback3.called) + + callback1.reset_mock() + callback2.reset_mock() + + # child 3 terminates + self.running = False + self.add_zombie(622, 19) + self.watcher._sig_chld() + + self.assertFalse(callback1.called) + self.assertFalse(callback2.called) + callback3.assert_called_once_with(622, 19) + + @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) + @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) + @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) + @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) + @unittest.mock.patch('os.waitpid', wraps=waitpid) + def test_close( + self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, + m_WTERMSIG): + + # register two children + callback1 = unittest.mock.Mock() + callback2 = unittest.mock.Mock() + + with self.watcher: + self.running = True + # child 1 terminates + self.add_zombie(63, 9) + # other child terminates + self.add_zombie(65, 18) + self.watcher._sig_chld() + + self.watcher.add_child_handler(63, callback1) + self.watcher.add_child_handler(64, callback1) + + self.assertEqual(len(self.watcher._callbacks), 1) + if isinstance(self.watcher, unix_events.FastChildWatcher): + self.assertEqual(len(self.watcher._zombies), 1) + + with unittest.mock.patch.object( + self.loop, + "remove_signal_handler") as m_remove_signal_handler: + + self.watcher.close() + + m_remove_signal_handler.assert_called_once_with( + signal.SIGCHLD) + self.assertFalse(self.watcher._callbacks) + if isinstance(self.watcher, unix_events.FastChildWatcher): + self.assertFalse(self.watcher._zombies) + + +class SafeChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): + def create_watcher(self, loop): + return unix_events.SafeChildWatcher(loop) + + +class FastChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): + def create_watcher(self, loop): + return unix_events.FastChildWatcher(loop) + + +class PolicyTests(unittest.TestCase): + + def create_policy(self): + return unix_events.DefaultEventLoopPolicy() + + def test_get_child_watcher(self): + policy = self.create_policy() + self.assertIsNone(policy._watcher) + + watcher = policy.get_child_watcher() + self.assertIsInstance(watcher, unix_events.SafeChildWatcher) + + self.assertIs(policy._watcher, watcher) + + self.assertIs(watcher, policy.get_child_watcher()) + self.assertIsNone(watcher._loop) + + def test_get_child_watcher_after_set(self): + policy = self.create_policy() + watcher = unix_events.FastChildWatcher(None) + + policy.set_child_watcher(watcher) + self.assertIs(policy._watcher, watcher) + self.assertIs(watcher, policy.get_child_watcher()) + + def test_get_child_watcher_with_mainloop_existing(self): + policy = self.create_policy() + loop = policy.get_event_loop() + + self.assertIsNone(policy._watcher) + watcher = policy.get_child_watcher() + + self.assertIsInstance(watcher, unix_events.SafeChildWatcher) + self.assertIs(watcher._loop, loop) + + 
loop.close() + + def test_get_child_watcher_thread(self): + + def f(): + policy.set_event_loop(policy.new_event_loop()) + + self.assertIsInstance(policy.get_event_loop(), + events.AbstractEventLoop) + watcher = policy.get_child_watcher() + + self.assertIsInstance(watcher, unix_events.SafeChildWatcher) + self.assertIsNone(watcher._loop) + + policy.get_event_loop().close() + + policy = self.create_policy() + + th = threading.Thread(target=f) + th.start() + th.join() + + def test_child_watcher_replace_mainloop_existing(self): + policy = self.create_policy() + loop = policy.get_event_loop() + + watcher = policy.get_child_watcher() + + self.assertIs(watcher._loop, loop) + + new_loop = policy.new_event_loop() + policy.set_event_loop(new_loop) + + self.assertIs(watcher._loop, new_loop) + + policy.set_event_loop(None) + + self.assertIs(watcher._loop, None) + + loop.close() + new_loop.close() + + if __name__ == '__main__': unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 03:43:40 2013 From: python-checkins at python.org (terry.reedy) Date: Tue, 5 Nov 2013 03:43:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Mzk3?= =?utf-8?q?=3A_test=5Fpydoc_now_works_with_-S_=28help_not_added_to_builtin?= =?utf-8?b?cyku?= Message-ID: <3dDFYm6HZ0z7LlK@mail.python.org> http://hg.python.org/cpython/rev/2c191b0b5e7a changeset: 86933:2c191b0b5e7a branch: 3.3 parent: 86927:47d3714dcb33 user: Terry Jan Reedy date: Mon Nov 04 21:43:26 2013 -0500 summary: Issue #19397: test_pydoc now works with -S (help not added to builtins). Patch by Serhiy Storchaka and Vajrasky Kok. files: Lib/test/test_pydoc.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -404,7 +404,7 @@ def test_namedtuple_public_underscore(self): NT = namedtuple('NT', ['abc', 'def'], rename=True) with captured_stdout() as help_io: - help(NT) + pydoc.help(NT) helptext = help_io.getvalue() self.assertIn('_1', helptext) self.assertIn('_replace', helptext) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 03:44:30 2013 From: python-checkins at python.org (terry.reedy) Date: Tue, 5 Nov 2013 03:44:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E3?= Message-ID: <3dDFZk3YhDz7LlK@mail.python.org> http://hg.python.org/cpython/rev/26511010988b changeset: 86934:26511010988b parent: 86932:8d93ad260714 parent: 86933:2c191b0b5e7a user: Terry Jan Reedy date: Mon Nov 04 21:44:17 2013 -0500 summary: Merge with 3.3 files: Lib/test/test_pydoc.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -470,7 +470,7 @@ def test_namedtuple_public_underscore(self): NT = namedtuple('NT', ['abc', 'def'], rename=True) with captured_stdout() as help_io: - help(NT) + pydoc.help(NT) helptext = help_io.getvalue() self.assertIn('_1', helptext) self.assertIn('_replace', helptext) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 03:45:47 2013 From: python-checkins at python.org (terry.reedy) Date: Tue, 5 Nov 2013 03:45:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Mzk3?= =?utf-8?q?=3A_test=5Fpydoc_now_works_with_-S_=28help_not_added_to_builtin?= 
=?utf-8?b?cyku?= Message-ID: <3dDFcC3gcsz7LpC@mail.python.org> http://hg.python.org/cpython/rev/92022b45e60b changeset: 86935:92022b45e60b branch: 2.7 parent: 86931:c7c90330eea1 user: Terry Jan Reedy date: Mon Nov 04 21:45:33 2013 -0500 summary: Issue #19397: test_pydoc now works with -S (help not added to builtins). Patch by Serhiy Storchaka and Vajrasky Kok. files: Lib/test/test_pydoc.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -421,7 +421,7 @@ def test_namedtuple_public_underscore(self): NT = namedtuple('NT', ['abc', 'def'], rename=True) with captured_stdout() as help_io: - help(NT) + pydoc.help(NT) helptext = help_io.getvalue() self.assertIn('_1', helptext) self.assertIn('_replace', helptext) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 11:03:35 2013 From: python-checkins at python.org (vinay.sajip) Date: Tue, 5 Nov 2013 11:03:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE4MzQ1?= =?utf-8?q?=3A_Added_cookbook_example_illustrating_handler_customisation?= =?utf-8?q?=2E?= Message-ID: <3dDRKM2Zv2z7LqL@mail.python.org> http://hg.python.org/cpython/rev/5636366db039 changeset: 86936:5636366db039 branch: 3.3 parent: 86933:2c191b0b5e7a user: Vinay Sajip date: Tue Nov 05 10:02:21 2013 +0000 summary: Issue #18345: Added cookbook example illustrating handler customisation. files: Doc/howto/logging-cookbook.rst | 135 +++++++++++++++++++++ 1 files changed, 135 insertions(+), 0 deletions(-) diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -1694,3 +1694,138 @@ Note that the order of items might be different according to the version of Python used. +.. currentmodule:: logging.config + +Customising handlers with :func:`dictConfig` +-------------------------------------------- + +There are times when you want to customise logging handlers in particular ways, +and if you use :func:`dictConfig` you may be able to do this without +subclassing. As an example, consider that you may want to set the ownership of a +log file. On POSIX, this is easily done using :func:`shutil.chown`, but the file +handlers in the stdlib don't offer built-in support. You can customise handler +creation using a plain function such as:: + + def owned_file_handler(filename, mode='a', encoding=None, owner=None): + if owner: + if not os.path.exists(filename): + open(filename, 'a').close() + shutil.chown(filename, *owner) + return logging.FileHandler(filename, mode, encoding) + +You can then specify, in a logging configuration passed to :func:`dictConfig`, +that a logging handler be created by calling this function:: + + LOGGING = { + 'version': 1, + 'disable_existing_loggers': False, + 'formatters': { + 'default': { + 'format': '%(asctime)s %(levelname)s %(name)s %(message)s' + }, + }, + 'handlers': { + 'file':{ + # The values below are popped from this dictionary and + # used to create the handler, set the handler's level and + # its formatter. + '()': owned_file_handler, + 'level':'DEBUG', + 'formatter': 'default', + # The values below are passed to the handler creator callable + # as keyword arguments. 
+ 'owner': ['pulse', 'pulse'], + 'filename': 'chowntest.log', + 'mode': 'w', + 'encoding': 'utf-8', + }, + }, + 'root': { + 'handlers': ['file'], + 'level': 'DEBUG', + }, + } + +In this example I am setting the ownership using the ``pulse`` user and group, +just for the purposes of illustration. Putting it together into a working +script, ``chowntest.py``:: + + import logging, logging.config, os, shutil + + def owned_file_handler(filename, mode='a', encoding=None, owner=None): + if owner: + if not os.path.exists(filename): + open(filename, 'a').close() + shutil.chown(filename, *owner) + return logging.FileHandler(filename, mode, encoding) + + LOGGING = { + 'version': 1, + 'disable_existing_loggers': False, + 'formatters': { + 'default': { + 'format': '%(asctime)s %(levelname)s %(name)s %(message)s' + }, + }, + 'handlers': { + 'file':{ + # The values below are popped from this dictionary and + # used to create the handler, set the handler's level and + # its formatter. + '()': owned_file_handler, + 'level':'DEBUG', + 'formatter': 'default', + # The values below are passed to the handler creator callable + # as keyword arguments. + 'owner': ['pulse', 'pulse'], + 'filename': 'chowntest.log', + 'mode': 'w', + 'encoding': 'utf-8', + }, + }, + 'root': { + 'handlers': ['file'], + 'level': 'DEBUG', + }, + } + + logging.config.dictConfig(LOGGING) + logger = logging.getLogger('mylogger') + logger.debug('A debug message') + +To run this, you will probably need to run as ``root``:: + + $ sudo python3.3 chowntest.py + $ cat chowntest.log + 2013-11-05 09:34:51,128 DEBUG mylogger A debug message + $ ls -l chowntest.log + -rw-r--r-- 1 pulse pulse 55 2013-11-05 09:34 chowntest.log + +Note that this example uses Python 3.3 because that's where :func:`shutil.chown` +makes an appearance. This approach should work with any Python version that +supports :func:`dictConfig` - namely, Python 2.7, 3.2 or later. With pre-3.3 +versions, you would need to implement the actual ownership change using e.g. +:func:`os.chown`. + +In practice, the handler-creating function may be in a utility module somewhere +in your project. Instead of the line in the configuration:: + + '()': owned_file_handler, + +you could use e.g.:: + + '()': 'ext://project.util.owned_file_handler', + +where ``project.util`` can be replaced with the actual name of the package +where the function resides. In the above working script, using +``'ext://__main__.owned_file_handler'`` should work. Here, the actual callable +is resolved by :func:`dictConfig` from the ``ext://`` specification. + +This example hopefully also points the way to how you could implement other +types of file change - e.g. setting specific POSIX permission bits - in the +same way, using :func:`os.chmod`. + +Of course, the approach could also be extended to types of handler other than a +:class:`~logging.FileHandler` - for example, one of the rotating file handlers, +or a different type of handler altogether. 
+ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 11:03:36 2013 From: python-checkins at python.org (vinay.sajip) Date: Tue, 5 Nov 2013 11:03:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Closes_=2318345=3A_Merged_documentation_update_from_3=2E?= =?utf-8?q?3=2E?= Message-ID: <3dDRKN5SGBz7LpH@mail.python.org> http://hg.python.org/cpython/rev/388cc713ad33 changeset: 86937:388cc713ad33 parent: 86934:26511010988b parent: 86936:5636366db039 user: Vinay Sajip date: Tue Nov 05 10:03:20 2013 +0000 summary: Closes #18345: Merged documentation update from 3.3. files: Doc/howto/logging-cookbook.rst | 135 +++++++++++++++++++++ 1 files changed, 135 insertions(+), 0 deletions(-) diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -1692,3 +1692,138 @@ Note that the order of items might be different according to the version of Python used. +.. currentmodule:: logging.config + +Customising handlers with :func:`dictConfig` +-------------------------------------------- + +There are times when you want to customise logging handlers in particular ways, +and if you use :func:`dictConfig` you may be able to do this without +subclassing. As an example, consider that you may want to set the ownership of a +log file. On POSIX, this is easily done using :func:`shutil.chown`, but the file +handlers in the stdlib don't offer built-in support. You can customise handler +creation using a plain function such as:: + + def owned_file_handler(filename, mode='a', encoding=None, owner=None): + if owner: + if not os.path.exists(filename): + open(filename, 'a').close() + shutil.chown(filename, *owner) + return logging.FileHandler(filename, mode, encoding) + +You can then specify, in a logging configuration passed to :func:`dictConfig`, +that a logging handler be created by calling this function:: + + LOGGING = { + 'version': 1, + 'disable_existing_loggers': False, + 'formatters': { + 'default': { + 'format': '%(asctime)s %(levelname)s %(name)s %(message)s' + }, + }, + 'handlers': { + 'file':{ + # The values below are popped from this dictionary and + # used to create the handler, set the handler's level and + # its formatter. + '()': owned_file_handler, + 'level':'DEBUG', + 'formatter': 'default', + # The values below are passed to the handler creator callable + # as keyword arguments. + 'owner': ['pulse', 'pulse'], + 'filename': 'chowntest.log', + 'mode': 'w', + 'encoding': 'utf-8', + }, + }, + 'root': { + 'handlers': ['file'], + 'level': 'DEBUG', + }, + } + +In this example I am setting the ownership using the ``pulse`` user and group, +just for the purposes of illustration. Putting it together into a working +script, ``chowntest.py``:: + + import logging, logging.config, os, shutil + + def owned_file_handler(filename, mode='a', encoding=None, owner=None): + if owner: + if not os.path.exists(filename): + open(filename, 'a').close() + shutil.chown(filename, *owner) + return logging.FileHandler(filename, mode, encoding) + + LOGGING = { + 'version': 1, + 'disable_existing_loggers': False, + 'formatters': { + 'default': { + 'format': '%(asctime)s %(levelname)s %(name)s %(message)s' + }, + }, + 'handlers': { + 'file':{ + # The values below are popped from this dictionary and + # used to create the handler, set the handler's level and + # its formatter. 
+ '()': owned_file_handler, + 'level':'DEBUG', + 'formatter': 'default', + # The values below are passed to the handler creator callable + # as keyword arguments. + 'owner': ['pulse', 'pulse'], + 'filename': 'chowntest.log', + 'mode': 'w', + 'encoding': 'utf-8', + }, + }, + 'root': { + 'handlers': ['file'], + 'level': 'DEBUG', + }, + } + + logging.config.dictConfig(LOGGING) + logger = logging.getLogger('mylogger') + logger.debug('A debug message') + +To run this, you will probably need to run as ``root``:: + + $ sudo python3.3 chowntest.py + $ cat chowntest.log + 2013-11-05 09:34:51,128 DEBUG mylogger A debug message + $ ls -l chowntest.log + -rw-r--r-- 1 pulse pulse 55 2013-11-05 09:34 chowntest.log + +Note that this example uses Python 3.3 because that's where :func:`shutil.chown` +makes an appearance. This approach should work with any Python version that +supports :func:`dictConfig` - namely, Python 2.7, 3.2 or later. With pre-3.3 +versions, you would need to implement the actual ownership change using e.g. +:func:`os.chown`. + +In practice, the handler-creating function may be in a utility module somewhere +in your project. Instead of the line in the configuration:: + + '()': owned_file_handler, + +you could use e.g.:: + + '()': 'ext://project.util.owned_file_handler', + +where ``project.util`` can be replaced with the actual name of the package +where the function resides. In the above working script, using +``'ext://__main__.owned_file_handler'`` should work. Here, the actual callable +is resolved by :func:`dictConfig` from the ``ext://`` specification. + +This example hopefully also points the way to how you could implement other +types of file change - e.g. setting specific POSIX permission bits - in the +same way, using :func:`os.chmod`. + +Of course, the approach could also be extended to types of handler other than a +:class:`~logging.FileHandler` - for example, one of the rotating file handlers, +or a different type of handler altogether. + -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 11:51:53 2013 From: python-checkins at python.org (ned.deily) Date: Tue, 5 Nov 2013 11:51:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE1NjYz?= =?utf-8?q?=3A_Revert_OS_X_installer_built-in_Tcl/Tk_support_for_2=2E7=2E6?= =?utf-8?q?=2E?= Message-ID: <3dDSP54HSxz7Lqw@mail.python.org> http://hg.python.org/cpython/rev/fc8f19b4b662 changeset: 86938:fc8f19b4b662 branch: 2.7 parent: 86935:92022b45e60b user: Ned Deily date: Tue Nov 05 02:40:03 2013 -0800 summary: Issue #15663: Revert OS X installer built-in Tcl/Tk support for 2.7.6. Some third-party projects, such as matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in /Library/Frameworks. They were unable to build with the built-in Tcl/Tk and/or execute correctly. 
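[Editor's aside] When deciding whether a given 2.7 build is picking up the Tcl/Tk frameworks from /Library/Frameworks, a quick check is to ask the Tcl interpreter for its patchlevel; a minimal sketch (the printed version string will of course vary by installation)::

    import Tkinter
    # Creates a bare Tcl interpreter (no Tk window needed) and reports its
    # version, e.g. '8.5.9' for the ActiveState build mentioned in the README.
    print(Tkinter.Tcl().eval('info patchlevel'))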
files: Mac/BuildScript/README.txt | 28 +-------------- Mac/BuildScript/build-installer.py | 10 ++-- Mac/BuildScript/resources/ReadMe.txt | 23 +++-------- Mac/BuildScript/resources/Welcome.rtf | 18 +++++--- Misc/NEWS | 8 ++++ 5 files changed, 32 insertions(+), 55 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,39 +57,13 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.7.13 - * Tcl 8.5.15 - uses system-supplied versions of third-party libraries * readline module links with Apple BSD editline (libedit) * builds Oracle Sleepycat DB 4.8 (Python 2.x only) - - requires ActiveState Tcl/Tk 8.5.14 (or later) to be installed for building - - * Beginning with Python 2.7.6, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. If it is - necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. - To enable (for all users of this Python 2.7):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building - recommended build environment: diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. 
- if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,27 +17,18 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. - **** IMPORTANT changes if you use IDLE and Tkinter **** + **** IMPORTANT **** -Installing a third-party version of Tcl/Tk is no longer required -================================================================ +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== -As of Python 2.7.6, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.3+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. - -Visit http://www.python.org/download/mac/tcltk/ +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ for current information about supported and recommended versions of Tcl/Tk for this version of Python and of Mac OS X. 
+ Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -1,8 +1,8 @@ -{\rtf1\ansi\ansicpg1252\cocoartf1187\cocoasubrtf400 -\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} +{\rtf1\ansi\ansicpg1252\cocoartf1038\cocoasubrtf350 +{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} -\paperw11905\paperh16837\margl1440\margr1440\vieww9640\viewh10620\viewkind0 -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640 +\paperw11904\paperh16836\margl1440\margr1440\vieww9640\viewh10620\viewkind0 +\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\ql\qnatural \f0\fs24 \cf0 This package will install \b Python $FULL_VERSION @@ -19,7 +19,11 @@ See the ReadMe file and the Python documentation for more information.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 As of Python 2.7.6, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,14 @@ - Issue #19457: Fixed xmlcharrefreplace tests on wide build when tests are loaded from .py[co] files. +Build +----- + +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 2.7.6. + Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + What's New in Python 2.7.6 release candidate 1? =============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 11:51:55 2013 From: python-checkins at python.org (ned.deily) Date: Tue, 5 Nov 2013 11:51:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE1NjYz?= =?utf-8?q?=3A_Revert_OS_X_installer_built-in_Tcl/Tk_support_for_3=2E3=2E3?= =?utf-8?q?=2E?= Message-ID: <3dDSP707R4z7Lqm@mail.python.org> http://hg.python.org/cpython/rev/268dc81c2527 changeset: 86939:268dc81c2527 branch: 3.3 parent: 86936:5636366db039 user: Ned Deily date: Tue Nov 05 02:44:17 2013 -0800 summary: Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. Some third-party projects, such as matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in /Library/Frameworks. They were unable to build with the built-in Tcl/Tk and/or execute correctly. 
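[Editor's aside] For the 3.3 installer the practical question is the same: which Tcl/Tk frameworks is _tkinter actually linked against. One rough, OS X-only way to inspect that, assuming the otool developer tool is installed (this is not part of the patch)::

    import subprocess, _tkinter
    # Lists the frameworks _tkinter was linked with; look for
    # /Library/Frameworks/Tcl.framework versus the /System copy.
    print(subprocess.check_output(['otool', '-L', _tkinter.__file__],
                                  universal_newlines=True))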
files: Mac/BuildScript/README.txt | 29 +-------------- Mac/BuildScript/build-installer.py | 10 ++-- Mac/BuildScript/resources/ReadMe.txt | 23 +++-------- Mac/BuildScript/resources/Welcome.rtf | 10 +++- Misc/NEWS | 8 ++++ 5 files changed, 28 insertions(+), 52 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,40 +57,13 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.7.13 - * Tcl 8.5.15 - * Tk 8.5.15 * XZ 5.0.3 - uses system-supplied versions of third-party libraries * readline module links with Apple BSD editline (libedit) - - requires ActiveState Tcl/Tk 8.5.14 (or later) to be installed for building - - * Beginning with Python 3.3.3, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. If it is - necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. - To enable (for all users of this Python 3.3):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.3 - cd ./lib/python3.3 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.3 - cd ./lib/python3.3 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building - recommended build environment: diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. 
- if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,27 +17,18 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. - **** IMPORTANT changes if you use IDLE and Tkinter **** + **** IMPORTANT **** -Installing a third-party version of Tcl/Tk is no longer required -================================================================ +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== -As of Python 3.3.3, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.5+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. - -Visit http://www.python.org/download/mac/tcltk/ +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ for current information about supported and recommended versions of Tcl/Tk for this version of Python and of Mac OS X. + Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -25,7 +25,11 @@ \b0 at any time to make $FULL_VERSION the default Python 3 version. This version can co-exist with other installed versions of Python 3 and Python 2.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 As of Python 3.3.3, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -33,6 +33,14 @@ - Issue #19085: Added basic tests for all tkinter widget options. +Build +----- + +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. 
+ Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + What's New in Python 3.3.3 release candidate 1? =============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 11:51:56 2013 From: python-checkins at python.org (ned.deily) Date: Tue, 5 Nov 2013 11:51:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2315663=3A_merge_build-installer=2Epy_changes?= Message-ID: <3dDSP823zYz7Lqs@mail.python.org> http://hg.python.org/cpython/rev/f89beccd470c changeset: 86940:f89beccd470c parent: 86937:388cc713ad33 parent: 86939:268dc81c2527 user: Ned Deily date: Tue Nov 05 02:50:49 2013 -0800 summary: Issue #15663: merge build-installer.py changes files: Mac/BuildScript/build-installer.py | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 13:58:41 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 5 Nov 2013 13:58:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2310197_Tweak_docs_?= =?utf-8?q?for_subprocess=2Egetstatusoutput_and_align_the?= Message-ID: <3dDWCP2lXXz7Lp3@mail.python.org> http://hg.python.org/cpython/rev/0aa2aedc6a21 changeset: 86941:0aa2aedc6a21 user: Tim Golden date: Tue Nov 05 12:57:25 2013 +0000 summary: Issue #10197 Tweak docs for subprocess.getstatusoutput and align the documentation, the module docstring, and the function docstring. 
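[Editor's aside] For reference, the behaviour both docstrings now describe, as a minimal runnable sketch (the exact output depends on the platform)::

    import subprocess

    # A (status, output) pair; the trailing newline is stripped from output.
    status, output = subprocess.getstatusoutput('ls /bin/ls')
    print(status, output)    # e.g.: 0 /bin/ls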
files: Doc/library/subprocess.rst | 6 +++--- Lib/subprocess.py | 26 +++++++++++++++----------- 2 files changed, 18 insertions(+), 14 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -1058,9 +1058,9 @@ Return ``(status, output)`` of executing *cmd* in a shell. - Execute the string *cmd* in a shell with :class:`Popen` and return a 2-tuple - ``(status, output)`` via :func:`Popen.communicate`. Universal newlines mode - is used; see the notes on :ref:`frequently-used-arguments` for more details. + Execute the string *cmd* in a shell with :meth:`Popen.check_output` and + return a 2-tuple ``(status, output)``. Universal newlines mode is used; + see the notes on :ref:`frequently-used-arguments` for more details. A trailing newline is stripped from the output. The exit status for the command can be interpreted diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -145,11 +145,13 @@ getstatusoutput(cmd): Return (status, output) of executing cmd in a shell. - Execute the string 'cmd' in a shell with os.popen() and return a 2-tuple - (status, output). cmd is actually run as '{ cmd ; } 2>&1', so that the - returned output will contain output or error messages. A trailing newline - is stripped from the output. The exit status for the command can be - interpreted according to the rules for the C function wait(). Example: + Execute the string 'cmd' in a shell with 'check_output' and + return a 2-tuple (status, output). Universal newlines mode is used, + meaning that the result with be decoded to a string. + + A trailing newline is stripped from the output. + The exit status for the command can be interpreted + according to the rules for the function 'wait'. Example: >>> subprocess.getstatusoutput('ls /bin/ls') (0, '/bin/ls') @@ -684,13 +686,15 @@ # NB This only works (and is only relevant) for POSIX. def getstatusoutput(cmd): - """Return (status, output) of executing cmd in a shell. + """ Return (status, output) of executing cmd in a shell. - Execute the string 'cmd' in a shell with os.popen() and return a 2-tuple - (status, output). cmd is actually run as '{ cmd ; } 2>&1', so that the - returned output will contain output or error messages. A trailing newline - is stripped from the output. The exit status for the command can be - interpreted according to the rules for the C function wait(). Example: + Execute the string 'cmd' in a shell with 'check_output' and + return a 2-tuple (status, output). Universal newlines mode is used, + meaning that the result with be decoded to a string. + + A trailing newline is stripped from the output. + The exit status for the command can be interpreted + according to the rules for the function 'wait'. 
Example: >>> import subprocess >>> subprocess.getstatusoutput('ls /bin/ls') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 16:10:00 2013 From: python-checkins at python.org (benjamin.peterson) Date: Tue, 5 Nov 2013 16:10:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE1NjYz?= =?utf-8?q?=3A_Revert_OS_X_installer_built-in_Tcl/Tk_support_for_2=2E7=2E6?= =?utf-8?q?=2E?= Message-ID: <3dDZ6w0mc2z7LpY@mail.python.org> http://hg.python.org/cpython/rev/ba31940588b6 changeset: 86942:ba31940588b6 branch: 2.7 parent: 86930:d9790d8526fa user: Ned Deily date: Tue Nov 05 02:40:03 2013 -0800 summary: Issue #15663: Revert OS X installer built-in Tcl/Tk support for 2.7.6. Some third-party projects, such as matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in /Library/Frameworks. They were unable to build with the built-in Tcl/Tk and/or execute correctly. files: Mac/BuildScript/README.txt | 28 +-------------- Mac/BuildScript/build-installer.py | 10 ++-- Mac/BuildScript/resources/ReadMe.txt | 23 +++-------- Mac/BuildScript/resources/Welcome.rtf | 18 +++++--- Misc/NEWS | 8 ++++ 5 files changed, 32 insertions(+), 55 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,39 +57,13 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.7.13 - * Tcl 8.5.15 - uses system-supplied versions of third-party libraries * readline module links with Apple BSD editline (libedit) * builds Oracle Sleepycat DB 4.8 (Python 2.x only) - - requires ActiveState Tcl/Tk 8.5.14 (or later) to be installed for building - - * Beginning with Python 2.7.6, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. If it is - necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. 
- To enable (for all users of this Python 2.7):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building - recommended build environment: diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,27 +17,18 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. - **** IMPORTANT changes if you use IDLE and Tkinter **** + **** IMPORTANT **** -Installing a third-party version of Tcl/Tk is no longer required -================================================================ +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== -As of Python 2.7.6, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) 
By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.3+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. - -Visit http://www.python.org/download/mac/tcltk/ +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ for current information about supported and recommended versions of Tcl/Tk for this version of Python and of Mac OS X. + Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -1,8 +1,8 @@ -{\rtf1\ansi\ansicpg1252\cocoartf1187\cocoasubrtf400 -\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} +{\rtf1\ansi\ansicpg1252\cocoartf1038\cocoasubrtf350 +{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} -\paperw11905\paperh16837\margl1440\margr1440\vieww9640\viewh10620\viewkind0 -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640 +\paperw11904\paperh16836\margl1440\margr1440\vieww9640\viewh10620\viewkind0 +\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\ql\qnatural \f0\fs24 \cf0 This package will install \b Python $FULL_VERSION @@ -19,7 +19,11 @@ See the ReadMe file and the Python documentation for more information.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 As of Python 2.7.6, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -22,6 +22,14 @@ - Issue #19457: Fixed xmlcharrefreplace tests on wide build when tests are loaded from .py[co] files. +Build +----- + +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 2.7.6. + Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + What's New in Python 2.7.6 release candidate 1? 
=============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 16:10:01 2013 From: python-checkins at python.org (benjamin.peterson) Date: Tue, 5 Nov 2013 16:10:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_merge_2=2E7=2E6_release_branch?= Message-ID: <3dDZ6x2jt3z7LpY@mail.python.org> http://hg.python.org/cpython/rev/6ff5bd97635f changeset: 86943:6ff5bd97635f branch: 2.7 parent: 86938:fc8f19b4b662 parent: 86942:ba31940588b6 user: Benjamin Peterson date: Tue Nov 05 10:09:52 2013 -0500 summary: merge 2.7.6 release branch files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 18:07:59 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 18:07:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_pysq?= =?utf-8?q?lite=5Fcursor=5Fiternext=28=29_of_sqlite3=2C_handle?= Message-ID: <3dDcl34vSsz7LpQ@mail.python.org> http://hg.python.org/cpython/rev/b93614f7ed83 changeset: 86944:b93614f7ed83 parent: 86941:0aa2aedc6a21 user: Victor Stinner date: Tue Nov 05 14:30:11 2013 +0100 summary: Issue #19437: Fix pysqlite_cursor_iternext() of sqlite3, handle _pysqlite_fetch_one_row() failure files: Modules/_sqlite/cursor.c | 6 ++++++ 1 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Modules/_sqlite/cursor.c b/Modules/_sqlite/cursor.c --- a/Modules/_sqlite/cursor.c +++ b/Modules/_sqlite/cursor.c @@ -891,6 +891,12 @@ if (rc == SQLITE_ROW) { self->next_row = _pysqlite_fetch_one_row(self); + if (self->next_row == NULL) { + (void)pysqlite_statement_reset(self->statement); + Py_DECREF(next_row); + _pysqlite_seterror(self->connection->db, NULL); + return NULL; + } } } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 18:08:01 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 18:08:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_pysq?= =?utf-8?q?lite=5Fconnection=5Fcall=28=29_of_sqlite3=2C_return_NULL_when?= Message-ID: <3dDcl51NMYz7Lpk@mail.python.org> http://hg.python.org/cpython/rev/00ee08fac522 changeset: 86945:00ee08fac522 user: Victor Stinner date: Tue Nov 05 14:46:13 2013 +0100 summary: Issue #19437: Fix pysqlite_connection_call() of sqlite3, return NULL when PyList_Append() fails files: Modules/_sqlite/connection.c | 34 ++++++++++------------- 1 files changed, 15 insertions(+), 19 deletions(-) diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c --- a/Modules/_sqlite/connection.c +++ b/Modules/_sqlite/connection.c @@ -1229,9 +1229,8 @@ return NULL; } - if (!PyArg_ParseTuple(args, "O", &sql)) { + if (!PyArg_ParseTuple(args, "O", &sql)) return NULL; - } _pysqlite_drop_unused_statement_references(self); @@ -1247,7 +1246,6 @@ statement->in_weakreflist = NULL; rc = pysqlite_statement_create(statement, self, sql); - if (rc != SQLITE_OK) { if (rc == PYSQLITE_TOO_MUCH_SQL) { PyErr_SetString(pysqlite_Warning, "You can only execute one statement at a time."); @@ -1257,25 +1255,23 @@ (void)pysqlite_statement_reset(statement); _pysqlite_seterror(self->db, NULL); } - - Py_CLEAR(statement); - } else { - weakref = PyWeakref_NewRef((PyObject*)statement, NULL); - if (!weakref) { - Py_CLEAR(statement); - goto error; - } - - if (PyList_Append(self->statements, weakref) != 0) { - Py_CLEAR(weakref); - goto error; - } - - Py_DECREF(weakref); + goto error; } + 
weakref = PyWeakref_NewRef((PyObject*)statement, NULL); + if (weakref == NULL) + goto error; + if (PyList_Append(self->statements, weakref) != 0) { + Py_DECREF(weakref); + goto error; + } + Py_DECREF(weakref); + + return (PyObject*)statement; + error: - return (PyObject*)statement; + Py_DECREF(statement); + return NULL; } PyObject* pysqlite_connection_execute(pysqlite_Connection* self, PyObject* args, PyObject* kwargs) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 18:08:02 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 18:08:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_pysq?= =?utf-8?q?lite=5Fcursor=5Fiternext=28=29_of_sqlite3=2C_when_the_row_facto?= =?utf-8?q?ry?= Message-ID: <3dDcl64zPGz7Lsk@mail.python.org> http://hg.python.org/cpython/rev/374635037b0a changeset: 86946:374635037b0a user: Victor Stinner date: Tue Nov 05 14:50:30 2013 +0100 summary: Issue #19437: Fix pysqlite_cursor_iternext() of sqlite3, when the row factory fails, don't consume the row (restore it) and fail immediatly (don't call pysqlite_step()) files: Modules/_sqlite/cursor.c | 5 +++++ 1 files changed, 5 insertions(+), 0 deletions(-) diff --git a/Modules/_sqlite/cursor.c b/Modules/_sqlite/cursor.c --- a/Modules/_sqlite/cursor.c +++ b/Modules/_sqlite/cursor.c @@ -871,10 +871,15 @@ } next_row_tuple = self->next_row; + assert(next_row_tuple != NULL); self->next_row = NULL; if (self->row_factory != Py_None) { next_row = PyObject_CallFunction(self->row_factory, "OO", self, next_row_tuple); + if (next_row == NULL) { + self->next_row = next_row_tuple; + return NULL; + } Py_DECREF(next_row_tuple); } else { next_row = next_row_tuple; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 18:08:04 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 18:08:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_=5Ft?= =?utf-8?q?hreading=2ERLock_constructor_=28rlock=5Fnew=29=2C_call?= Message-ID: <3dDcl81hPgz7Lt6@mail.python.org> http://hg.python.org/cpython/rev/35fdb15b4939 changeset: 86947:35fdb15b4939 user: Victor Stinner date: Tue Nov 05 15:10:19 2013 +0100 summary: Issue #19437: Fix _threading.RLock constructor (rlock_new), call Py_DECREF(self) if PyThread_allocate_lock() failed instead of calling directly type->tp_free(self), to keep the chained list of objects consistent when Python is compiled in debug mode fails, don't consume the row (restore it) and fail immediatly (don't call pysqlite_step()) files: Modules/_threadmodule.c | 24 ++++++++++++++---------- 1 files changed, 14 insertions(+), 10 deletions(-) diff --git a/Modules/_threadmodule.c b/Modules/_threadmodule.c --- a/Modules/_threadmodule.c +++ b/Modules/_threadmodule.c @@ -68,7 +68,7 @@ Py_BEGIN_ALLOW_THREADS r = PyThread_acquire_lock_timed(lock, microseconds, 1); Py_END_ALLOW_THREADS - } + } if (r == PY_LOCK_INTR) { /* Run signal handlers if we were interrupted. 
Propagate @@ -255,14 +255,17 @@ static void rlock_dealloc(rlockobject *self) { - assert(self->rlock_lock); if (self->in_weakreflist != NULL) PyObject_ClearWeakRefs((PyObject *) self); - /* Unlock the lock so it's safe to free it */ - if (self->rlock_count > 0) - PyThread_release_lock(self->rlock_lock); + /* self->rlock_lock can be NULL if PyThread_allocate_lock() failed + in rlock_new() */ + if (self->rlock_lock != NULL) { + /* Unlock the lock so it's safe to free it */ + if (self->rlock_count > 0) + PyThread_release_lock(self->rlock_lock); - PyThread_free_lock(self->rlock_lock); + PyThread_free_lock(self->rlock_lock); + } Py_TYPE(self)->tp_free(self); } @@ -452,15 +455,16 @@ self = (rlockobject *) type->tp_alloc(type, 0); if (self != NULL) { + self->in_weakreflist = NULL; + self->rlock_owner = 0; + self->rlock_count = 0; + self->rlock_lock = PyThread_allocate_lock(); if (self->rlock_lock == NULL) { - type->tp_free(self); + Py_DECREF(self); PyErr_SetString(ThreadError, "can't allocate lock"); return NULL; } - self->in_weakreflist = NULL; - self->rlock_owner = 0; - self->rlock_count = 0; } return (PyObject *) self; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 18:08:05 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 18:08:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_comp?= =?utf-8?q?iler=5Fclass=28=29=2C_handle_compiler=5Flookup=5Farg=28=29_fail?= =?utf-8?q?ure?= Message-ID: <3dDcl95ln0z7Lrn@mail.python.org> http://hg.python.org/cpython/rev/ea373a14f9e9 changeset: 86948:ea373a14f9e9 user: Victor Stinner date: Tue Nov 05 18:07:34 2013 +0100 summary: Issue #19437: Fix compiler_class(), handle compiler_lookup_arg() failure files: Python/compile.c | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -1772,6 +1772,10 @@ } i = compiler_lookup_arg(c->u->u_cellvars, str); Py_DECREF(str); + if (i < 0) { + compiler_exit_scope(c); + return 0; + } assert(i == 0); /* Return the cell where to store __class__ */ ADDOP_I(c, LOAD_CLOSURE, i); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 19:18:57 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 19:18:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_typo_in_tk?= =?utf-8?q?inter_tests_=28issue_=2319085=29=2E?= Message-ID: <3dDfJx3s5cz7Ls4@mail.python.org> http://hg.python.org/cpython/rev/d5d0356ba5ac changeset: 86949:d5d0356ba5ac branch: 3.3 parent: 86939:268dc81c2527 user: Serhiy Storchaka date: Tue Nov 05 20:17:50 2013 +0200 summary: Fix typo in tkinter tests (issue #19085). 
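[Editor's aside] The broken name sits in a small helper that picks a rounding function based on the Tk patchlevel, so on Tk 8.5.12 or later every pixel conversion raised NameError for the undefined int_round. A standalone sketch of the logic the fixed helper encodes (the patchlevel tuple is passed in here only to keep the sketch self-contained)::

    def pixels_round(x, patchlevel):
        # As the helper encodes it: Tk before 8.5.12 truncates fractional
        # pixel values, later releases round them.
        _round = int if patchlevel < (8, 5, 12) else round
        return _round(x)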
files: Lib/tkinter/test/widget_tests.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -22,7 +22,7 @@ if patchlevel < (8, 5, 12): _pixels_round = int else: - _pixels_round = int_round + _pixels_round = round return _pixels_round(x) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 19:18:58 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 19:18:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_typo_in_tkinter_tests_=28issue_=2319085=29=2E?= Message-ID: <3dDfJy6YVPz7Ljk@mail.python.org> http://hg.python.org/cpython/rev/fc4ef17c7db8 changeset: 86950:fc4ef17c7db8 parent: 86948:ea373a14f9e9 parent: 86949:d5d0356ba5ac user: Serhiy Storchaka date: Tue Nov 05 20:18:17 2013 +0200 summary: Fix typo in tkinter tests (issue #19085). files: Lib/tkinter/test/widget_tests.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -22,7 +22,7 @@ if patchlevel < (8, 5, 12): _pixels_round = int else: - _pixels_round = int_round + _pixels_round = round return _pixels_round(x) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 20:12:41 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 20:12:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogRml4IHRlc3RfaW5z?= =?utf-8?q?ertwidth_Tkinter_tests_on_Tk_8=2E5_with_patchlevel_=3E=3D_8=2E5?= =?utf-8?q?=2E12_=28issue?= Message-ID: <3dDgVx0z5Tz7Lp9@mail.python.org> http://hg.python.org/cpython/rev/eb126f976fa2 changeset: 86951:eb126f976fa2 branch: 2.7 parent: 86926:fe7aaf14b129 user: Serhiy Storchaka date: Tue Nov 05 21:04:54 2013 +0200 summary: Fix test_insertwidth Tkinter tests on Tk 8.5 with patchlevel >= 8.5.12 (issue #19085). 
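[Editor's aside] Put differently, the expectation these insertwidth tests encode, as a small sketch; the fallback value of 2 is taken from the tests' own expected results rather than from Tk documentation::

    def expected_insertwidth(requested, pixels_round):
        # Anything that rounds/truncates to zero or less is expected to come
        # back from Tk as 2; otherwise the converted pixel value is expected.
        converted = pixels_round(requested)
        return 2 if converted <= 0 else converted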
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -329,12 +329,12 @@ def test_insertwidth(self): widget = self.create() self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') - if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9, expected=1) - self.checkParam(widget, 'insertwidth', 0.1, expected=2) - self.checkParam(widget, 'insertwidth', -2, expected=2) + self.checkParam(widget, 'insertwidth', 0.9) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 20:12:42 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 20:12:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogRml4IHRlc3RfaW5z?= =?utf-8?q?ertwidth_Tkinter_tests_on_Tk_8=2E5_with_patchlevel_=3E=3D_8=2E5?= =?utf-8?q?=2E12_=28issue?= Message-ID: <3dDgVy3DYnz7Lp9@mail.python.org> http://hg.python.org/cpython/rev/21fbe3ec90dc changeset: 86952:21fbe3ec90dc branch: 3.3 parent: 86949:d5d0356ba5ac user: Serhiy Storchaka date: Tue Nov 05 21:05:10 2013 +0200 summary: Fix test_insertwidth Tkinter tests on Tk 8.5 with patchlevel >= 8.5.12 (issue #19085). files: Lib/tkinter/test/test_tkinter/test_widgets.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -332,12 +332,12 @@ def test_insertwidth(self): widget = self.create() self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') - if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9, expected=1) - self.checkParam(widget, 'insertwidth', 0.1, expected=2) - self.checkParam(widget, 'insertwidth', -2, expected=2) + self.checkParam(widget, 'insertwidth', 0.9) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 20:12:43 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 20:12:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_test=5Finsertwidth_Tkinter_tests_on_Tk_8=2E5_with_pa?= =?utf-8?q?tchlevel_=3E=3D_8=2E5=2E12_=28issue?= Message-ID: <3dDgVz602lz7LtF@mail.python.org> http://hg.python.org/cpython/rev/ce08158e3f6c changeset: 86953:ce08158e3f6c parent: 86950:fc4ef17c7db8 parent: 86952:21fbe3ec90dc user: Serhiy Storchaka date: Tue Nov 05 21:06:05 2013 +0200 summary: Fix test_insertwidth Tkinter tests on Tk 8.5 with patchlevel >= 8.5.12 (issue #19085). 
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -332,12 +332,12 @@ def test_insertwidth(self): widget = self.create() self.checkPixelsParam(widget, 'insertwidth', 1.3, 3.6, '10p') - if tcl_version[:2] == (8, 5): + self.checkParam(widget, 'insertwidth', 0.1, expected=2) + self.checkParam(widget, 'insertwidth', -2, expected=2) + if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9, expected=1) - self.checkParam(widget, 'insertwidth', 0.1, expected=2) - self.checkParam(widget, 'insertwidth', -2, expected=2) + self.checkParam(widget, 'insertwidth', 0.9) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 20:12:45 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 20:12:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dDgW12Qy0z7Ltb@mail.python.org> http://hg.python.org/cpython/rev/ea73e04bce54 changeset: 86954:ea73e04bce54 branch: 2.7 parent: 86951:eb126f976fa2 parent: 86943:6ff5bd97635f user: Serhiy Storchaka date: Tue Nov 05 21:08:04 2013 +0200 summary: Merge heads files: Lib/test/test_pydoc.py | 2 +- Mac/BuildScript/README.txt | 28 +-------------- Mac/BuildScript/build-installer.py | 10 ++-- Mac/BuildScript/resources/ReadMe.txt | 23 +++-------- Mac/BuildScript/resources/Welcome.rtf | 18 +++++--- Misc/NEWS | 8 ++++ 6 files changed, 33 insertions(+), 56 deletions(-) diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -421,7 +421,7 @@ def test_namedtuple_public_underscore(self): NT = namedtuple('NT', ['abc', 'def'], rename=True) with captured_stdout() as help_io: - help(NT) + pydoc.help(NT) helptext = help_io.getvalue() self.assertIn('_1', helptext) self.assertIn('_replace', helptext) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,39 +57,13 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.7.13 - * Tcl 8.5.15 - uses system-supplied versions of third-party libraries * readline module links with Apple BSD editline (libedit) * builds Oracle Sleepycat DB 4.8 (Python 2.x only) - - requires ActiveState Tcl/Tk 8.5.14 (or later) to be installed for building - - * Beginning with Python 2.7.6, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. If it is - necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. 
- To enable (for all users of this Python 2.7):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/2.7 - cd ./lib/python2.7 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building - recommended build environment: diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,27 +17,18 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. - **** IMPORTANT changes if you use IDLE and Tkinter **** + **** IMPORTANT **** -Installing a third-party version of Tcl/Tk is no longer required -================================================================ +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== -As of Python 2.7.6, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) 
By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.3+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. - -Visit http://www.python.org/download/mac/tcltk/ +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ for current information about supported and recommended versions of Tcl/Tk for this version of Python and of Mac OS X. + Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -1,8 +1,8 @@ -{\rtf1\ansi\ansicpg1252\cocoartf1187\cocoasubrtf400 -\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} +{\rtf1\ansi\ansicpg1252\cocoartf1038\cocoasubrtf350 +{\fonttbl\f0\fswiss\fcharset0 Helvetica;} {\colortbl;\red255\green255\blue255;} -\paperw11905\paperh16837\margl1440\margr1440\vieww9640\viewh10620\viewkind0 -\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640 +\paperw11904\paperh16836\margl1440\margr1440\vieww9640\viewh10620\viewkind0 +\pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640\ql\qnatural \f0\fs24 \cf0 This package will install \b Python $FULL_VERSION @@ -19,7 +19,11 @@ See the ReadMe file and the Python documentation for more information.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 As of Python 2.7.6, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,14 @@ - Issue #19457: Fixed xmlcharrefreplace tests on wide build when tests are loaded from .py[co] files. +Build +----- + +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 2.7.6. + Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + What's New in Python 2.7.6 release candidate 1? 
=============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 21:04:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 21:04:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Revert_wrong_c?= =?utf-8?q?hange_in_previous_commit_=28issue_=2319085=29=2E?= Message-ID: <3dDhfn4snQz7Lp9@mail.python.org> http://hg.python.org/cpython/rev/c97600bdd726 changeset: 86955:c97600bdd726 branch: 2.7 user: Serhiy Storchaka date: Tue Nov 05 22:01:31 2013 +0200 summary: Revert wrong change in previous commit (issue #19085). files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -334,7 +334,7 @@ if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9) + self.checkParam(widget, 'insertwidth', 0.9, expected=1) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 21:04:35 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 21:04:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Revert_wrong_c?= =?utf-8?q?hange_in_previous_commit_=28issue_=2319085=29=2E?= Message-ID: <3dDhfq13tVz7Lp9@mail.python.org> http://hg.python.org/cpython/rev/bec6df56c053 changeset: 86956:bec6df56c053 branch: 3.3 parent: 86952:21fbe3ec90dc user: Serhiy Storchaka date: Tue Nov 05 22:01:46 2013 +0200 summary: Revert wrong change in previous commit (issue #19085). files: Lib/tkinter/test/test_tkinter/test_widgets.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -337,7 +337,7 @@ if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9) + self.checkParam(widget, 'insertwidth', 0.9, expected=1) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 21:04:37 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 5 Nov 2013 21:04:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Revert_wrong_change_in_previous_commit_=28issue_=2319085?= =?utf-8?b?KS4=?= Message-ID: <3dDhfs111kz7Ltw@mail.python.org> http://hg.python.org/cpython/rev/545feebd58fb changeset: 86957:545feebd58fb parent: 86953:ce08158e3f6c parent: 86956:bec6df56c053 user: Serhiy Storchaka date: Tue Nov 05 22:02:17 2013 +0200 summary: Revert wrong change in previous commit (issue #19085). 
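The insertwidth checks touched by these reverts hinge on how Tk rounds fractional screen distances. A minimal sketch of that rounding, assuming a display is available and that winfo_pixels() mirrors what the test's pixels_round() helper computes:

    import tkinter

    # Print how Tk rounds the distances the test feeds to 'insertwidth'.
    # Whether 0.9 rounds up to 1 or down to 0 determines which expected
    # value the test must use, hence the expected=1 line being restored.
    root = tkinter.Tk()
    for distance in (0.1, 0.9, 1.3, 3.6, '10p'):
        print(distance, '->', root.winfo_pixels(distance))
    root.destroy()
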
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -337,7 +337,7 @@ if pixels_round(0.9) <= 0: self.checkParam(widget, 'insertwidth', 0.9, expected=2) else: - self.checkParam(widget, 'insertwidth', 0.9) + self.checkParam(widget, 'insertwidth', 0.9, expected=1) def test_invalidcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 5 22:04:36 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 5 Nov 2013 22:04:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_replace_Statistic?= =?utf-8?q?=2Ekey_with_Statistic=2Etraceback?= Message-ID: <3dDk042JBLz7Lqv@mail.python.org> http://hg.python.org/peps/rev/d0a629d375c1 changeset: 5254:d0a629d375c1 user: Victor Stinner date: Tue Nov 05 22:04:26 2013 +0100 summary: PEP 454: replace Statistic.key with Statistic.traceback files: pep-0454.txt | 45 +++++++++++++++++++-------------------- 1 files changed, 22 insertions(+), 23 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -134,6 +134,17 @@ See also ``start()`` and ``stop()`` functions. +``start()`` function: + + Start tracing Python memory allocations. + + The function installs hooks on Python memory allocators. These hooks + have important overhead in term of performances and memory usage: + see `Filter functions`_ to limit the overhead. + + See also ``stop()`` and ``is_tracing()`` functions. + + ``stop()`` function: Stop tracing Python memory allocations and clear traces of memory @@ -148,17 +159,6 @@ See also ``start()`` and ``is_tracing()`` functions. -``start()`` function: - - Start tracing Python memory allocations. - - The function installs hooks on Python memory allocators. These hooks - have important overhead in term of performances and memory usage: - see `Filter functions`_ to limit the overhead. - - See also ``stop()`` and ``is_tracing()`` functions. - - ``take_snapshot()`` function: Take a snapshot of traces of memory blocks allocated by Python using @@ -384,18 +384,18 @@ See also ``dump()``. 
-``statistics(key_type: str, cumulative: bool=False, compare_to=None)`` method: +``statistics(group_by: str, cumulative: bool=False, compare_to=None)`` method: Get statistics as a sorted list of ``Statistic`` instances, grouped - by *key_type*: + by *group_by*: - ===================== ======================== ================================================ - key_type description type - ===================== ======================== ================================================ - ``'filename'`` filename ``str`` - ``'lineno'`` filename and line number ``(filename: str, lineno: int)`` - ``'traceback'`` traceback tuple of ``(filename: str, lineno: int)`` tuples - ===================== ======================== ================================================ + ===================== ======================== + group_by description + ===================== ======================== + ``'filename'`` filename + ``'lineno'`` filename and line number + ``'traceback'`` traceback + ===================== ======================== If *cumulative* is ``True``, cumulate size and count of memory blocks of all frames of the traceback of a trace, not only the most @@ -442,10 +442,9 @@ ``Snapshot.statistics()`` returns a list of ``Statistic`` instances. -``key`` attribute: +``traceback`` attribute: - Key identifying the statistic. The key type depends on the - *key_type* parameter of the ``Snapshot.statistics()`` method. + Tuple of ``(filename: str, lineno: int)`` tuples. ``count`` attribute: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 6 01:49:09 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 01:49:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_doc=3A_fix_typo?= Message-ID: <3dDpz902mSz7Ljk@mail.python.org> http://hg.python.org/cpython/rev/0cfc80d49aa4 changeset: 86958:0cfc80d49aa4 user: Victor Stinner date: Wed Nov 06 01:48:45 2013 +0100 summary: doc: fix typo files: Doc/library/os.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/os.rst b/Doc/library/os.rst --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -962,7 +962,7 @@ .. function:: pipe() Create a pipe. Return a pair of file descriptors ``(r, w)`` usable for - reading and writing, respectively. The new file descriptor are + reading and writing, respectively. The new file descriptor is :ref:`non-inheritable `. Availability: Unix, Windows. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 05:03:52 2013 From: python-checkins at python.org (zach.ware) Date: Wed, 6 Nov 2013 05:03:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2319439=3A_Update_PCbuild?= =?utf-8?q?/readme=2Etxt_with_new_sub-project?= Message-ID: <3dDvHr2Qkgz7Lvx@mail.python.org> http://hg.python.org/cpython/rev/99640494ca7f changeset: 86959:99640494ca7f user: Zachary Ware date: Tue Nov 05 21:55:46 2013 -0600 summary: #19439: Update PCbuild/readme.txt with new sub-project files: PCbuild/readme.txt | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -110,6 +110,9 @@ pywlauncher pyw.exe, a variant of py.exe that doesn't open a Command Prompt window +_testembed + _testembed.exe, a small program that embeds Python for testing + purposes, used by test_capi.py These are miscellaneous sub-projects that don't really fit the other categories. 
By default, these projects do not build in Debug -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 12:58:28 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 6 Nov 2013 12:58:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?devguide=3A_Issue_triage_updates_for_?= =?utf-8?q?dis=2C_contextlib=2C_venv?= Message-ID: <3dF5qS5v13z7LlF@mail.python.org> http://hg.python.org/devguide/rev/dfcb8e226a73 changeset: 651:dfcb8e226a73 user: Nick Coghlan date: Wed Nov 06 21:58:19 2013 +1000 summary: Issue triage updates for dis, contextlib, venv - dis & contextlib issues can be assigned directly to me - venv was missing, added vinay as the original author files: experts.rst | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/experts.rst b/experts.rst --- a/experts.rst +++ b/experts.rst @@ -82,7 +82,7 @@ compileall concurrent.futures bquinlan configparser lukasz.langa* -contextlib ncoghlan +contextlib ncoghlan* copy alexandre.vassalotti copyreg alexandre.vassalotti cProfile @@ -95,7 +95,7 @@ dbm decimal facundobatista, rhettinger, mark.dickinson difflib tim.peters (inactive) -dis ncoghlan +dis ncoghlan* distutils tarek*, eric.araujo* doctest tim.peters (inactive) dummy_threading brett.cannon @@ -241,6 +241,7 @@ urllib orsenthil uu uuid +venv vinay.sajip warnings wave weakref fdrake, pitrou -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Wed Nov 6 13:08:51 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 6 Nov 2013 13:08:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319378=3A_address_?= =?utf-8?q?flaws_in_the_new_dis_module_APIs?= Message-ID: <3dF63R4JRRz7LlL@mail.python.org> http://hg.python.org/cpython/rev/ce8dd299cdc4 changeset: 86960:ce8dd299cdc4 user: Nick Coghlan date: Wed Nov 06 22:08:36 2013 +1000 summary: Close #19378: address flaws in the new dis module APIs - confusing line_offset parameter -> first_line parameter - systematically test and fix new file parameter - remove redundant Bytecode.show_info() API - rename Bytecode.display_code() to Bytecode.dis() and have it return the multi-line string rather than printing it directly - eliminated some not-so-helpful helpers from the bytecode_helper test support module Also fixed a longstanding defect (worked around in the test suite) where lines emitted by the dis module could include trailing white space. That no longer happens, allowing the formatting tests to be simplified to use plain string comparisons. files: Doc/library/dis.rst | 41 ++++--- Lib/dis.py | 72 ++++++++----- Lib/test/bytecode_helper.py | 31 ----- Lib/test/test_dis.py | 125 +++++++++++++++-------- Misc/NEWS | 20 +++ 5 files changed, 167 insertions(+), 122 deletions(-) diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -44,36 +44,39 @@ :class:`Bytecode` object that provides easy access to details of the compiled code. -.. class:: Bytecode +.. class:: Bytecode(x, *, first_line=None) - The bytecode operations of a piece of code + Analyse the bytecode corresponding to a function, method, string of + source code, or a code object (as returned by :func:`compile`). - This is a convenient wrapper around many of the functions listed below. - Instantiate it with a function, method, string of code, or a code object - (as returned by :func:`compile`). 
+ This is a convenience wrapper around many of the functions listed below, + most notably :func:`get_instructions`, as iterating over a + :class:`ByteCode` instance yields the bytecode operations as + :class:`Instruction` instances. - Iterating over this yields the bytecode operations as :class:`Instruction` - instances. + If *first_line* is not None, it indicates the line number that should + be reported for the first source line in the disassembled code. + Otherwise, the source line information (if any) is taken directly from + the disassembled code object. .. data:: codeobj The compiled code object. - .. method:: display_code(*, file=None) + .. data:: first_line - Print a formatted view of the bytecode operations, like :func:`dis`. + The first source line of the code object (if available) + + .. method:: dis() + + Return a formatted view of the bytecode operations (the same as + printed by :func:`dis`, but returned as a multi-line string). .. method:: info() Return a formatted multi-line string with detailed information about the code object, like :func:`code_info`. - .. method:: show_info(*, file=None) - - Print the information about the code object as returned by :meth:`info`. - - .. versionadded:: 3.4 - Example:: >>> bytecode = dis.Bytecode(myfunc) @@ -176,7 +179,7 @@ Added ``file`` parameter -.. function:: get_instructions(x, *, line_offset=0) +.. function:: get_instructions(x, *, first_line=None) Return an iterator over the instructions in the supplied function, method, source code string or code object. @@ -184,8 +187,10 @@ The iterator generates a series of :class:`Instruction` named tuples giving the details of each operation in the supplied code. - The given *line_offset* is added to the ``starts_line`` attribute of any - instructions that start a new line. + If *first_line* is not None, it indicates the line number that should + be reported for the first source line in the disassembled code. + Otherwise, the source line information (if any) is taken directly from + the disassembled code object. .. versionadded:: 3.4 diff --git a/Lib/dis.py b/Lib/dis.py --- a/Lib/dis.py +++ b/Lib/dis.py @@ -3,6 +3,7 @@ import sys import types import collections +import io from opcode import * from opcode import __all__ as _opcodes_all @@ -34,7 +35,7 @@ """ if x is None: - distb() + distb(file=file) return if hasattr(x, '__func__'): # Method x = x.__func__ @@ -46,7 +47,7 @@ if isinstance(x1, _have_code): print("Disassembly of %s:" % name, file=file) try: - dis(x1) + dis(x1, file=file) except TypeError as msg: print("Sorry:", msg, file=file) print(file=file) @@ -203,21 +204,27 @@ # Column: Opcode argument details if self.argrepr: fields.append('(' + self.argrepr + ')') - return ' '.join(fields) + return ' '.join(fields).rstrip() -def get_instructions(x, *, line_offset=0): +def get_instructions(x, *, first_line=None): """Iterator for the opcodes in methods, functions or code Generates a series of Instruction named tuples giving the details of each operations in the supplied code. - The given line offset is added to the 'starts_line' attribute of any - instructions that start a new line. + If *first_line* is not None, it indicates the line number that should + be reported for the first source line in the disassembled code. + Otherwise, the source line information (if any) is taken directly from + the disassembled code object. 
""" co = _get_code_object(x) cell_names = co.co_cellvars + co.co_freevars linestarts = dict(findlinestarts(co)) + if first_line is not None: + line_offset = first_line - co.co_firstlineno + else: + line_offset = 0 return _get_instructions_bytes(co.co_code, co.co_varnames, co.co_names, co.co_consts, cell_names, linestarts, line_offset) @@ -320,13 +327,14 @@ def _disassemble_bytes(code, lasti=-1, varnames=None, names=None, constants=None, cells=None, linestarts=None, - *, file=None): + *, file=None, line_offset=0): # Omit the line number column entirely if we have no line number info show_lineno = linestarts is not None # TODO?: Adjust width upwards if max(linestarts.values()) >= 1000? lineno_width = 3 if show_lineno else 0 for instr in _get_instructions_bytes(code, varnames, names, - constants, cells, linestarts): + constants, cells, linestarts, + line_offset=line_offset): new_source_line = (show_lineno and instr.starts_line is not None and instr.offset > 0) @@ -398,40 +406,44 @@ Iterating over this yields the bytecode operations as Instruction instances. """ - def __init__(self, x): - self.codeobj = _get_code_object(x) - self.cell_names = self.codeobj.co_cellvars + self.codeobj.co_freevars - self.linestarts = dict(findlinestarts(self.codeobj)) - self.line_offset = 0 - self.original_object = x + def __init__(self, x, *, first_line=None): + self.codeobj = co = _get_code_object(x) + if first_line is None: + self.first_line = co.co_firstlineno + self._line_offset = 0 + else: + self.first_line = first_line + self._line_offset = first_line - co.co_firstlineno + self._cell_names = co.co_cellvars + co.co_freevars + self._linestarts = dict(findlinestarts(co)) + self._original_object = x def __iter__(self): co = self.codeobj return _get_instructions_bytes(co.co_code, co.co_varnames, co.co_names, - co.co_consts, self.cell_names, - self.linestarts, self.line_offset) + co.co_consts, self._cell_names, + self._linestarts, + line_offset=self._line_offset) def __repr__(self): - return "{}({!r})".format(self.__class__.__name__, self.original_object) + return "{}({!r})".format(self.__class__.__name__, + self._original_object) def info(self): """Return formatted information about the code object.""" return _format_code_info(self.codeobj) - def show_info(self, *, file=None): - """Print the information about the code object as returned by info().""" - print(self.info(), file=file) - - def display_code(self, *, file=None): - """Print a formatted view of the bytecode operations. 
- """ + def dis(self): + """Return a formatted view of the bytecode operations.""" co = self.codeobj - return _disassemble_bytes(co.co_code, varnames=co.co_varnames, - names=co.co_names, constants=co.co_consts, - cells=self.cell_names, - linestarts=self.linestarts, - file=file - ) + with io.StringIO() as output: + _disassemble_bytes(co.co_code, varnames=co.co_varnames, + names=co.co_names, constants=co.co_consts, + cells=self._cell_names, + linestarts=self._linestarts, + line_offset=self._line_offset, + file=output) + return output.getvalue() def _test(): diff --git a/Lib/test/bytecode_helper.py b/Lib/test/bytecode_helper.py --- a/Lib/test/bytecode_helper.py +++ b/Lib/test/bytecode_helper.py @@ -14,37 +14,6 @@ dis.dis(co, file=s) return s.getvalue() - def assertInstructionMatches(self, instr, expected, *, line_offset=0): - # Deliberately test opname first, since that gives a more - # meaningful error message than testing opcode - self.assertEqual(instr.opname, expected.opname) - self.assertEqual(instr.opcode, expected.opcode) - self.assertEqual(instr.arg, expected.arg) - self.assertEqual(instr.argval, expected.argval) - self.assertEqual(instr.argrepr, expected.argrepr) - self.assertEqual(instr.offset, expected.offset) - if expected.starts_line is None: - self.assertIsNone(instr.starts_line) - else: - self.assertEqual(instr.starts_line, - expected.starts_line + line_offset) - self.assertEqual(instr.is_jump_target, expected.is_jump_target) - - - def assertBytecodeExactlyMatches(self, x, expected, *, line_offset=0): - """Throws AssertionError if any discrepancy is found in bytecode - - *x* is the object to be introspected - *expected* is a list of dis.Instruction objects - - Set *line_offset* as appropriate to adjust for the location of the - object to be disassembled within the test file. If the expected list - assumes the first line is line 1, then an appropriate offset would be - ``1 - f.__code__.co_firstlineno``. - """ - actual = dis.get_instructions(x, line_offset=line_offset) - self.assertEqual(list(actual), expected) - def assertInBytecode(self, x, opname, argval=_UNSPECIFIED): """Returns instr if op is found, otherwise throws AssertionError""" for instr in dis.get_instructions(x): diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -8,6 +8,7 @@ import dis import io import types +import contextlib class _C: def __init__(self, x): @@ -176,30 +177,20 @@ class DisTests(unittest.TestCase): def get_disassembly(self, func, lasti=-1, wrapper=True): - s = io.StringIO() - save_stdout = sys.stdout - sys.stdout = s - try: + # We want to test the default printing behaviour, not the file arg + output = io.StringIO() + with contextlib.redirect_stdout(output): if wrapper: dis.dis(func) else: dis.disassemble(func, lasti) - finally: - sys.stdout = save_stdout - # Trim trailing blanks (if any). 
- return [line.rstrip() for line in s.getvalue().splitlines()] + return output.getvalue() def get_disassemble_as_string(self, func, lasti=-1): - return '\n'.join(self.get_disassembly(func, lasti, False)) + return self.get_disassembly(func, lasti, False) def do_disassembly_test(self, func, expected): - lines = self.get_disassembly(func) - expected = expected.splitlines() - if expected != lines: - self.fail( - "events did not match expectation:\n" + - "\n".join(difflib.ndiff(expected, - lines))) + self.assertEqual(self.get_disassembly(func), expected) def test_opmap(self): self.assertEqual(dis.opmap["NOP"], 9) @@ -290,6 +281,20 @@ def test_dis_object(self): self.assertRaises(TypeError, dis.dis, object()) +class DisWithFileTests(DisTests): + + # Run the tests again, using the file arg instead of print + def get_disassembly(self, func, lasti=-1, wrapper=True): + # We want to test the default printing behaviour, not the file arg + output = io.StringIO() + if wrapper: + dis.dis(func, file=output) + else: + dis.disassemble(func, lasti, file=output) + return output.getvalue() + + + code_info_code_info = """\ Name: code_info Filename: (.*) @@ -482,26 +487,29 @@ print("OK, now we're done") # End fodder for opinfo generation tests -expected_outer_offset = 1 - outer.__code__.co_firstlineno -expected_jumpy_offset = 1 - jumpy.__code__.co_firstlineno +expected_outer_line = 1 +_line_offset = outer.__code__.co_firstlineno - 1 code_object_f = outer.__code__.co_consts[3] +expected_f_line = code_object_f.co_firstlineno - _line_offset code_object_inner = code_object_f.co_consts[3] +expected_inner_line = code_object_inner.co_firstlineno - _line_offset +expected_jumpy_line = 1 # The following lines are useful to regenerate the expected results after # either the fodder is modified or the bytecode generation changes # After regeneration, update the references to code_object_f and # code_object_inner before rerunning the tests -#_instructions = dis.get_instructions(outer, line_offset=expected_outer_offset) +#_instructions = dis.get_instructions(outer, first_line=expected_outer_line) #print('expected_opinfo_outer = [\n ', #',\n '.join(map(str, _instructions)), ',\n]', sep='') -#_instructions = dis.get_instructions(outer(), line_offset=expected_outer_offset) +#_instructions = dis.get_instructions(outer(), first_line=expected_outer_line) #print('expected_opinfo_f = [\n ', #',\n '.join(map(str, _instructions)), ',\n]', sep='') -#_instructions = dis.get_instructions(outer()(), line_offset=expected_outer_offset) +#_instructions = dis.get_instructions(outer()(), first_line=expected_outer_line) #print('expected_opinfo_inner = [\n ', #',\n '.join(map(str, _instructions)), ',\n]', sep='') -#_instructions = dis.get_instructions(jumpy, line_offset=expected_jumpy_offset) +#_instructions = dis.get_instructions(jumpy, first_line=expected_jumpy_line) #print('expected_opinfo_jumpy = [\n ', #',\n '.join(map(str, _instructions)), ',\n]', sep='') @@ -671,42 +679,75 @@ Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=243, starts_line=None, is_jump_target=False), ] +# One last piece of inspect fodder to check the default line number handling +def simple(): pass +expected_opinfo_simple = [ + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=0, starts_line=simple.__code__.co_firstlineno, is_jump_target=False), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=3, starts_line=None, is_jump_target=False) +] + + class 
InstructionTests(BytecodeTestCase): + + def test_default_first_line(self): + actual = dis.get_instructions(simple) + self.assertEqual(list(actual), expected_opinfo_simple) + + def test_first_line_set_to_None(self): + actual = dis.get_instructions(simple, first_line=None) + self.assertEqual(list(actual), expected_opinfo_simple) + def test_outer(self): - self.assertBytecodeExactlyMatches(outer, expected_opinfo_outer, - line_offset=expected_outer_offset) + actual = dis.get_instructions(outer, first_line=expected_outer_line) + self.assertEqual(list(actual), expected_opinfo_outer) def test_nested(self): with captured_stdout(): f = outer() - self.assertBytecodeExactlyMatches(f, expected_opinfo_f, - line_offset=expected_outer_offset) + actual = dis.get_instructions(f, first_line=expected_f_line) + self.assertEqual(list(actual), expected_opinfo_f) def test_doubly_nested(self): with captured_stdout(): inner = outer()() - self.assertBytecodeExactlyMatches(inner, expected_opinfo_inner, - line_offset=expected_outer_offset) + actual = dis.get_instructions(inner, first_line=expected_inner_line) + self.assertEqual(list(actual), expected_opinfo_inner) def test_jumpy(self): - self.assertBytecodeExactlyMatches(jumpy, expected_opinfo_jumpy, - line_offset=expected_jumpy_offset) + actual = dis.get_instructions(jumpy, first_line=expected_jumpy_line) + self.assertEqual(list(actual), expected_opinfo_jumpy) +# get_instructions has its own tests above, so can rely on it to validate +# the object oriented API class BytecodeTests(unittest.TestCase): def test_instantiation(self): # Test with function, method, code string and code object for obj in [_f, _C(1).__init__, "a=1", _f.__code__]: - b = dis.Bytecode(obj) - self.assertIsInstance(b.codeobj, types.CodeType) + with self.subTest(obj=obj): + b = dis.Bytecode(obj) + self.assertIsInstance(b.codeobj, types.CodeType) self.assertRaises(TypeError, dis.Bytecode, object()) def test_iteration(self): - b = dis.Bytecode(_f) - for instr in b: - self.assertIsInstance(instr, dis.Instruction) + for obj in [_f, _C(1).__init__, "a=1", _f.__code__]: + with self.subTest(obj=obj): + via_object = list(dis.Bytecode(obj)) + via_generator = list(dis.get_instructions(obj)) + self.assertEqual(via_object, via_generator) - assert len(list(b)) > 0 # Iterating should yield at least 1 instruction + def test_explicit_first_line(self): + actual = dis.Bytecode(outer, first_line=expected_outer_line) + self.assertEqual(list(actual), expected_opinfo_outer) + + def test_source_line_in_disassembly(self): + # Use the line in the source code + actual = dis.Bytecode(simple).dis()[:3] + expected = "{:>3}".format(simple.__code__.co_firstlineno) + self.assertEqual(actual, expected) + # Use an explicit first line number + actual = dis.Bytecode(simple, first_line=350).dis()[:3] + self.assertEqual(actual, "350") def test_info(self): self.maxDiff = 1000 @@ -714,16 +755,14 @@ b = dis.Bytecode(x) self.assertRegex(b.info(), expected) - def test_display_code(self): - b = dis.Bytecode(_f) - output = io.StringIO() - b.display_code(file=output) - result = [line.rstrip() for line in output.getvalue().splitlines()] - self.assertEqual(result, dis_f.splitlines()) + def test_disassembled(self): + actual = dis.Bytecode(_f).dis() + self.assertEqual(actual, dis_f) def test_main(): - run_unittest(DisTests, CodeInfoTests, InstructionTests, BytecodeTests) + run_unittest(DisTests, DisWithFileTests, CodeInfoTests, + InstructionTests, BytecodeTests) if __name__ == "__main__": test_main() diff --git a/Misc/NEWS b/Misc/NEWS --- 
a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,20 @@ Library ------- +- Issue #19378: Fixed a number of cases in the dis module where the new + "file" parameter was not being honoured correctly + +- Issue #19378: Removed the "dis.Bytecode.show_info" method + +- Issue #19378: Renamed the "dis.Bytecode.display_code" method to + "dis.Bytecode.dis" and converted it to returning a string rather than + printing output. + +- Issue #19378: the "line_offset" parameter in the new "dis.get_instructions" + API has been renamed to "first_line" (and the default value and usage + changed accordingly). This should reduce confusion with the more common use + of "offset" in the dis docs to refer to bytecode offsets. + - Issue #18678: Corrected spwd struct member names in spwd module: sp_nam->sp_namp, and sp_pwd->sp_pwdp. The old names are kept as extra structseq members, for backward compatibility. @@ -169,6 +183,12 @@ Tests ----- +- Issue #19378: the main dis module tests are now run with both stdout + redirection *and* passing an explicit file parameter + +- Issue #19378: removed the not-actually-helpful assertInstructionMatches + and assertBytecodeExactlyMatches helpers from bytecode_helper + - Issue #18702: All skipped tests now reported as skipped. - Issue #19439: interpreter embedding tests are now executed on Windows -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 13:12:19 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 6 Nov 2013 13:12:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_typo_in_updated_dis_do?= =?utf-8?q?cs?= Message-ID: <3dF67R4cDYz7Lmq@mail.python.org> http://hg.python.org/cpython/rev/eacbedf9cc3e changeset: 86961:eacbedf9cc3e user: Nick Coghlan date: Wed Nov 06 22:12:07 2013 +1000 summary: Fix typo in updated dis docs files: Doc/library/dis.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -51,7 +51,7 @@ This is a convenience wrapper around many of the functions listed below, most notably :func:`get_instructions`, as iterating over a - :class:`ByteCode` instance yields the bytecode operations as + :class:`Bytecode` instance yields the bytecode operations as :class:`Instruction` instances. 
If *first_line* is not None, it indicates the line number that should -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 13:17:55 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 6 Nov 2013 13:17:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_incorrect_comment_f?= =?utf-8?q?rom_dis_tests?= Message-ID: <3dF6Fv2z0Vz7LjM@mail.python.org> http://hg.python.org/cpython/rev/2de806c8b070 changeset: 86962:2de806c8b070 user: Nick Coghlan date: Wed Nov 06 22:17:39 2013 +1000 summary: Remove incorrect comment from dis tests files: Lib/test/test_dis.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -285,7 +285,6 @@ # Run the tests again, using the file arg instead of print def get_disassembly(self, func, lasti=-1, wrapper=True): - # We want to test the default printing behaviour, not the file arg output = io.StringIO() if wrapper: dis.dis(func, file=output) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 17:25:33 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 6 Nov 2013 17:25:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318582=3A_fix_memo?= =?utf-8?q?ry_leak_in_pbkdf2_code?= Message-ID: <3dFCld5Lnpz7Ljw@mail.python.org> http://hg.python.org/cpython/rev/07fa1ed0d551 changeset: 86963:07fa1ed0d551 user: Christian Heimes date: Wed Nov 06 17:25:17 2013 +0100 summary: Issue #18582: fix memory leak in pbkdf2 code files: Modules/_hashopenssl.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -535,6 +535,7 @@ HMAC_CTX_cleanup(&hctx); return 0; } + HMAC_CTX_cleanup(&hctx); memcpy(p, digtmp, cplen); for (j = 1; j < iter; j++) { if (!HMAC_CTX_copy(&hctx, &hctx_tpl)) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 18:45:52 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 18:45:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_sys=5Fdi?= =?utf-8?q?splayhook=28=29_now_uses_an_identifier_for_=22builtins=22?= Message-ID: <3dFFXJ1hMbz7LpQ@mail.python.org> http://hg.python.org/cpython/rev/a2f42d57b91d changeset: 86964:a2f42d57b91d user: Victor Stinner date: Wed Nov 06 18:27:13 2013 +0100 summary: Issue #19512: sys_displayhook() now uses an identifier for "builtins" dictionary key and only decodes "\n" string once to write a newline. So "builtins" and "\n" are only decoded once from UTF-8, at the first call. 
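sys_displayhook() is the C implementation of the default sys.displayhook; the patch below only changes how the "builtins" key and the newline string are created, not the behaviour. A rough Python-level sketch of that behaviour, for orientation:

    import builtins
    import sys

    def displayhook_sketch(value):
        # Like the default hook: None prints nothing; anything else is
        # written as repr() plus a newline to sys.stdout and remembered
        # in builtins._ for the next interactive prompt.
        if value is None:
            return
        builtins._ = None          # cleared first, as the real hook does
        text = repr(value)
        sys.stdout.write(text)
        sys.stdout.write("\n")
        builtins._ = value

    sys.displayhook = displayhook_sketch
    sys.displayhook(6 * 7)         # writes "42\n" and binds _ to 42
    print(_)                       # 42
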
files: Python/sysmodule.c | 12 ++++++++++-- 1 files changed, 10 insertions(+), 2 deletions(-) diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -137,10 +137,13 @@ PyObject *outf; PyInterpreterState *interp = PyThreadState_GET()->interp; PyObject *modules = interp->modules; - PyObject *builtins = PyDict_GetItemString(modules, "builtins"); + PyObject *builtins; + static PyObject *newline = NULL; int err; _Py_IDENTIFIER(_); + _Py_IDENTIFIER(builtins); + builtins = _PyDict_GetItemId(modules, &PyId_builtins); if (builtins == NULL) { PyErr_SetString(PyExc_RuntimeError, "lost builtins module"); return NULL; @@ -173,7 +176,12 @@ return NULL; } } - if (PyFile_WriteString("\n", outf) != 0) + if (newline == NULL) { + newline = PyUnicode_FromString("\n"); + if (newline == NULL) + return NULL; + } + if (PyFile_WriteObject(newline, outf, Py_PRINT_RAW) != 0) return NULL; if (_PyObject_SetAttrId(builtins, &PyId__, o) != 0) return NULL; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 18:45:53 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 18:45:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_=5Fprint?= =?utf-8?q?=5Ftotal=5Frefs=28=29_now_uses_an_identifier_to_get_=22showrefc?= =?utf-8?q?ount=22?= Message-ID: <3dFFXK3Q1tz7LrD@mail.python.org> http://hg.python.org/cpython/rev/55517661a053 changeset: 86965:55517661a053 user: Victor Stinner date: Wed Nov 06 18:28:21 2013 +0100 summary: Issue #19512: _print_total_refs() now uses an identifier to get "showrefcount" key from sys._xoptions files: Python/pythonrun.c | 10 ++++------ 1 files changed, 4 insertions(+), 6 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -38,15 +38,13 @@ #ifdef Py_REF_DEBUG static void _print_total_refs(void) { - PyObject *xoptions, *key, *value; + PyObject *xoptions, *value; + _Py_IDENTIFIER(showrefcount); + xoptions = PySys_GetXOptions(); if (xoptions == NULL) return; - key = PyUnicode_FromString("showrefcount"); - if (key == NULL) - return; - value = PyDict_GetItem(xoptions, key); - Py_DECREF(key); + value = _PyDict_GetItemId(xoptions, &PyId_showrefcount); if (value == Py_True) fprintf(stderr, "[%" PY_FORMAT_SIZE_T "d refs, " -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 18:45:54 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 18:45:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_Add_PyRu?= =?utf-8?q?n=5FInteractiveOneObject=28=29_function?= Message-ID: <3dFFXL6Cxgz7LsB@mail.python.org> http://hg.python.org/cpython/rev/af822a6c9faf changeset: 86966:af822a6c9faf user: Victor Stinner date: Wed Nov 06 18:41:07 2013 +0100 summary: Issue #19512: Add PyRun_InteractiveOneObject() function Only decode the filename once. PyRun_InteractiveOneObject() uses an identifier for "" string, so the byte string is only decoded once. 
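What the new function does for each statement can be pictured from Python (an analogy only, not the C implementation): one statement is read from the prompt, compiled in 'single' mode so that bare expressions go through sys.displayhook, and executed in the __main__ namespace.

    import __main__

    # One "interactive" statement, as if typed at the >>> prompt.
    source = "6 * 7"
    code = compile(source, "<stdin>", "single")   # interactive compile mode
    exec(code, __main__.__dict__)                 # shows 42 via sys.displayhook
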
files: Include/pythonrun.h | 4 + Python/pythonrun.c | 111 +++++++++++++++++++++++-------- 2 files changed, 86 insertions(+), 29 deletions(-) diff --git a/Include/pythonrun.h b/Include/pythonrun.h --- a/Include/pythonrun.h +++ b/Include/pythonrun.h @@ -63,6 +63,10 @@ FILE *fp, const char *filename, /* decoded from the filesystem encoding */ PyCompilerFlags *flags); +PyAPI_FUNC(int) PyRun_InteractiveOneObject( + FILE *fp, + PyObject *filename, + PyCompilerFlags *flags); PyAPI_FUNC(int) PyRun_InteractiveLoopFlags( FILE *fp, const char *filename, /* decoded from the filesystem encoding */ diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -73,7 +73,7 @@ static void initsite(void); static int initstdio(void); static void flush_io(void); -static PyObject *run_mod(mod_ty, const char *, PyObject *, PyObject *, +static PyObject *run_mod(mod_ty, PyObject *, PyObject *, PyObject *, PyCompilerFlags *, PyArena *); static PyObject *run_pyc_file(FILE *, const char *, PyObject *, PyObject *, PyCompilerFlags *); @@ -1265,12 +1265,18 @@ } int -PyRun_InteractiveLoopFlags(FILE *fp, const char *filename, PyCompilerFlags *flags) +PyRun_InteractiveLoopFlags(FILE *fp, const char *filename_str, PyCompilerFlags *flags) { - PyObject *v; - int ret; + PyObject *filename, *v; + int ret, err; PyCompilerFlags local_flags; + filename = PyUnicode_DecodeFSDefault(filename_str); + if (filename == NULL) { + PyErr_Print(); + return -1; + } + if (flags == NULL) { flags = &local_flags; local_flags.cf_flags = 0; @@ -1285,16 +1291,21 @@ PySys_SetObject("ps2", v = PyUnicode_FromString("... ")); Py_XDECREF(v); } + err = -1; for (;;) { - ret = PyRun_InteractiveOneFlags(fp, filename, flags); + ret = PyRun_InteractiveOneObject(fp, filename, flags); PRINT_TOTAL_REFS(); - if (ret == E_EOF) - return 0; + if (ret == E_EOF) { + err = 0; + break; + } /* if (ret == E_NOMEM) - return -1; + break; */ } + Py_DECREF(filename); + return err; } /* compute parser flags based on compiler flags */ @@ -1322,14 +1333,21 @@ #endif int -PyRun_InteractiveOneFlags(FILE *fp, const char *filename, PyCompilerFlags *flags) +PyRun_InteractiveOneObject(FILE *fp, PyObject *filename, PyCompilerFlags *flags) { - PyObject *m, *d, *v, *w, *oenc = NULL; + PyObject *m, *d, *v, *w, *oenc = NULL, *mod_name; mod_ty mod; PyArena *arena; char *ps1 = "", *ps2 = "", *enc = NULL; int errcode = 0; _Py_IDENTIFIER(encoding); + _Py_IDENTIFIER(__main__); + + mod_name = _PyUnicode_FromId(&PyId___main__); /* borrowed */ + if (mod_name == NULL) { + PyErr_Print(); + return -1; + } if (fp == stdin) { /* Fetch encoding from sys.stdin if possible. 
*/ @@ -1375,9 +1393,9 @@ Py_XDECREF(oenc); return -1; } - mod = PyParser_ASTFromFile(fp, filename, enc, - Py_single_input, ps1, ps2, - flags, &errcode, arena); + mod = PyParser_ASTFromFileObject(fp, filename, enc, + Py_single_input, ps1, ps2, + flags, &errcode, arena); Py_XDECREF(v); Py_XDECREF(w); Py_XDECREF(oenc); @@ -1390,7 +1408,7 @@ PyErr_Print(); return -1; } - m = PyImport_AddModule("__main__"); + m = PyImport_AddModuleObject(mod_name); if (m == NULL) { PyArena_Free(arena); return -1; @@ -1407,6 +1425,23 @@ return 0; } +int +PyRun_InteractiveOneFlags(FILE *fp, const char *filename_str, PyCompilerFlags *flags) +{ + PyObject *filename; + int res; + + filename = PyUnicode_DecodeFSDefault(filename_str); + if (filename == NULL) { + PyErr_Print(); + return -1; + } + res = PyRun_InteractiveOneObject(fp, filename, flags); + Py_DECREF(filename); + return res; +} + + /* Check whether a file maybe a pyc file: Look at the extension, the file type, and, if we may close it, at the first few bytes. */ @@ -2010,37 +2045,55 @@ { PyObject *ret = NULL; mod_ty mod; - PyArena *arena = PyArena_New(); + PyArena *arena; + _Py_static_string(PyId_string, ""); + PyObject *filename; + + filename = _PyUnicode_FromId(&PyId_string); /* borrowed */ + if (filename == NULL) + return NULL; + + arena = PyArena_New(); if (arena == NULL) return NULL; - mod = PyParser_ASTFromString(str, "", start, flags, arena); + mod = PyParser_ASTFromStringObject(str, filename, start, flags, arena); if (mod != NULL) - ret = run_mod(mod, "", globals, locals, flags, arena); + ret = run_mod(mod, filename, globals, locals, flags, arena); PyArena_Free(arena); return ret; } PyObject * -PyRun_FileExFlags(FILE *fp, const char *filename, int start, PyObject *globals, +PyRun_FileExFlags(FILE *fp, const char *filename_str, int start, PyObject *globals, PyObject *locals, int closeit, PyCompilerFlags *flags) { - PyObject *ret; + PyObject *ret = NULL; mod_ty mod; - PyArena *arena = PyArena_New(); + PyArena *arena = NULL; + PyObject *filename; + + filename = PyUnicode_DecodeFSDefault(filename_str); + if (filename == NULL) + goto exit; + + arena = PyArena_New(); if (arena == NULL) - return NULL; + goto exit; - mod = PyParser_ASTFromFile(fp, filename, NULL, start, 0, 0, - flags, NULL, arena); + mod = PyParser_ASTFromFileObject(fp, filename, NULL, start, 0, 0, + flags, NULL, arena); if (closeit) fclose(fp); if (mod == NULL) { - PyArena_Free(arena); - return NULL; + goto exit; } ret = run_mod(mod, filename, globals, locals, flags, arena); - PyArena_Free(arena); + +exit: + Py_XDECREF(filename); + if (arena != NULL) + PyArena_Free(arena); return ret; } @@ -2075,12 +2128,12 @@ } static PyObject * -run_mod(mod_ty mod, const char *filename, PyObject *globals, PyObject *locals, - PyCompilerFlags *flags, PyArena *arena) +run_mod(mod_ty mod, PyObject *filename, PyObject *globals, PyObject *locals, + PyCompilerFlags *flags, PyArena *arena) { PyCodeObject *co; PyObject *v; - co = PyAST_Compile(mod, filename, flags, arena); + co = PyAST_CompileObject(mod, filename, flags, -1, arena); if (co == NULL) return NULL; v = PyEval_EvalCode((PyObject*)co, globals, locals); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 19:06:23 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 19:06:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_Py=5FRep?= =?utf-8?q?rEnter=28=29_and_Py=5FReprLeave=28=29_now_use_an_identifier_for?= =?utf-8?q?_the?= Message-ID: 
<3dFFzz1h3xz7LpQ@mail.python.org> http://hg.python.org/cpython/rev/8a6a920d8eae changeset: 86967:8a6a920d8eae user: Victor Stinner date: Wed Nov 06 18:57:29 2013 +0100 summary: Issue #19512: Py_ReprEnter() and Py_ReprLeave() now use an identifier for the "Py_Repr" dictionary key files: Objects/object.c | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Objects/object.c b/Objects/object.c --- a/Objects/object.c +++ b/Objects/object.c @@ -1969,7 +1969,7 @@ See dictobject.c and listobject.c for examples of use. */ -#define KEY "Py_Repr" +_Py_IDENTIFIER(Py_Repr); int Py_ReprEnter(PyObject *obj) @@ -1981,12 +1981,12 @@ dict = PyThreadState_GetDict(); if (dict == NULL) return 0; - list = PyDict_GetItemString(dict, KEY); + list = _PyDict_GetItemId(dict, &PyId_Py_Repr); if (list == NULL) { list = PyList_New(0); if (list == NULL) return -1; - if (PyDict_SetItemString(dict, KEY, list) < 0) + if (_PyDict_SetItemId(dict, &PyId_Py_Repr, list) < 0) return -1; Py_DECREF(list); } @@ -2014,7 +2014,7 @@ if (dict == NULL) goto finally; - list = PyDict_GetItemString(dict, KEY); + list = _PyDict_GetItemId(dict, &PyId_Py_Repr); if (list == NULL || !PyList_Check(list)) goto finally; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 19:06:24 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 19:06:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_Add_a_ne?= =?utf-8?q?w_=5FPyDict=5FDelItemId=28=29_function=2C_similar_to?= Message-ID: <3dFG004BDdz7Lpy@mail.python.org> http://hg.python.org/cpython/rev/69071054b42f changeset: 86968:69071054b42f user: Victor Stinner date: Wed Nov 06 18:58:22 2013 +0100 summary: Issue #19512: Add a new _PyDict_DelItemId() function, similar to PyDict_DelItemString() but using an identifier for the key files: Include/dictobject.h | 1 + Objects/dictobject.c | 9 +++++++++ 2 files changed, 10 insertions(+), 0 deletions(-) diff --git a/Include/dictobject.h b/Include/dictobject.h --- a/Include/dictobject.h +++ b/Include/dictobject.h @@ -109,6 +109,7 @@ PyAPI_FUNC(int) PyDict_SetItemString(PyObject *dp, const char *key, PyObject *item); PyAPI_FUNC(int) _PyDict_SetItemId(PyObject *dp, struct _Py_Identifier *key, PyObject *item); PyAPI_FUNC(int) PyDict_DelItemString(PyObject *dp, const char *key); +PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); #ifndef Py_LIMITED_API int _PyObjectDict_SetItem(PyTypeObject *tp, PyObject **dictptr, PyObject *name, PyObject *value); diff --git a/Objects/dictobject.c b/Objects/dictobject.c --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -2736,6 +2736,15 @@ } int +_PyDict_DelItemId(PyObject *v, _Py_Identifier *key) +{ + PyObject *kv = _PyUnicode_FromId(key); /* borrowed */ + if (kv == NULL) + return -1; + return PyDict_DelItem(v, kv); +} + +int PyDict_DelItemString(PyObject *v, const char *key) { PyObject *kv; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 19:06:25 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 19:06:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_type=5Fa?= =?utf-8?q?bstractmethods=28=29_and_type=5Fset=5Fabstractmethods=28=29_now?= =?utf-8?q?_use_an?= Message-ID: <3dFG016JJ7z7LsW@mail.python.org> http://hg.python.org/cpython/rev/862a62e61553 changeset: 86969:862a62e61553 user: Victor Stinner date: Wed Nov 06 18:59:18 2013 +0100 summary: Issue #19512: type_abstractmethods() and 
type_set_abstractmethods() now use an identifier for the "__abstractmethods__" string files: Objects/typeobject.c | 15 ++++++++++----- 1 files changed, 10 insertions(+), 5 deletions(-) diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -48,6 +48,7 @@ _Py_IDENTIFIER(__module__); _Py_IDENTIFIER(__name__); _Py_IDENTIFIER(__new__); +_Py_IDENTIFIER(__abstractmethods__); static PyObject * slot_tp_new(PyTypeObject *type, PyObject *args, PyObject *kwds); @@ -383,9 +384,11 @@ /* type itself has an __abstractmethods__ descriptor (this). Don't return that. */ if (type != &PyType_Type) - mod = PyDict_GetItemString(type->tp_dict, "__abstractmethods__"); + mod = _PyDict_GetItemId(type->tp_dict, &PyId___abstractmethods__); if (!mod) { - PyErr_SetString(PyExc_AttributeError, "__abstractmethods__"); + PyObject *message = _PyUnicode_FromId(&PyId___abstractmethods__); + if (message) + PyErr_SetObject(PyExc_AttributeError, message); return NULL; } Py_XINCREF(mod); @@ -404,13 +407,15 @@ abstract = PyObject_IsTrue(value); if (abstract < 0) return -1; - res = PyDict_SetItemString(type->tp_dict, "__abstractmethods__", value); + res = _PyDict_SetItemId(type->tp_dict, &PyId___abstractmethods__, value); } else { abstract = 0; - res = PyDict_DelItemString(type->tp_dict, "__abstractmethods__"); + res = _PyDict_DelItemId(type->tp_dict, &PyId___abstractmethods__); if (res && PyErr_ExceptionMatches(PyExc_KeyError)) { - PyErr_SetString(PyExc_AttributeError, "__abstractmethods__"); + PyObject *message = _PyUnicode_FromId(&PyId___abstractmethods__); + if (message) + PyErr_SetObject(PyExc_AttributeError, message); return -1; } } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 19:06:27 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 19:06:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_eval=28?= =?utf-8?q?=29_and_exec=28=29_now_use_an_identifier_for_=22=5F=5Fbuiltins?= =?utf-8?q?=5F=5F=22_string?= Message-ID: <3dFG031cqZz7LsG@mail.python.org> http://hg.python.org/cpython/rev/e5476ecb8b57 changeset: 86970:e5476ecb8b57 user: Victor Stinner date: Wed Nov 06 19:03:11 2013 +0100 summary: Issue #19512: eval() and exec() now use an identifier for "__builtins__" string files: Python/bltinmodule.c | 13 +++++++------ 1 files changed, 7 insertions(+), 6 deletions(-) diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -34,6 +34,7 @@ _Py_IDENTIFIER(fileno); _Py_IDENTIFIER(flush); +_Py_IDENTIFIER(__builtins__); static PyObject * builtin___build_class__(PyObject *self, PyObject *args, PyObject *kwds) @@ -771,9 +772,9 @@ return NULL; } - if (PyDict_GetItemString(globals, "__builtins__") == NULL) { - if (PyDict_SetItemString(globals, "__builtins__", - PyEval_GetBuiltins()) != 0) + if (_PyDict_GetItemId(globals, &PyId___builtins__) == NULL) { + if (_PyDict_SetItemId(globals, &PyId___builtins__, + PyEval_GetBuiltins()) != 0) return NULL; } @@ -846,9 +847,9 @@ locals->ob_type->tp_name); return NULL; } - if (PyDict_GetItemString(globals, "__builtins__") == NULL) { - if (PyDict_SetItemString(globals, "__builtins__", - PyEval_GetBuiltins()) != 0) + if (_PyDict_GetItemId(globals, &PyId___builtins__) == NULL) { + if (_PyDict_SetItemId(globals, &PyId___builtins__, + PyEval_GetBuiltins()) != 0) return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 22:46:26 2013 From: 
python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 22:46:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_Add_=5FP?= =?utf-8?q?ySys=5FGetObjectId=28=29_and_=5FPySys=5FSetObjectId=28=29_funct?= =?utf-8?q?ions?= Message-ID: <3dFLst6X9Lz7LyY@mail.python.org> http://hg.python.org/cpython/rev/5e402c16a74c changeset: 86971:5e402c16a74c user: Victor Stinner date: Wed Nov 06 22:36:40 2013 +0100 summary: Issue #19512: Add _PySys_GetObjectId() and _PySys_SetObjectId() functions files: Include/sysmodule.h | 3 +++ Python/sysmodule.c | 25 +++++++++++++++++++++++++ 2 files changed, 28 insertions(+), 0 deletions(-) diff --git a/Include/sysmodule.h b/Include/sysmodule.h --- a/Include/sysmodule.h +++ b/Include/sysmodule.h @@ -8,7 +8,10 @@ #endif PyAPI_FUNC(PyObject *) PySys_GetObject(const char *); +PyAPI_FUNC(PyObject *) _PySys_GetObjectId(_Py_Identifier *key); PyAPI_FUNC(int) PySys_SetObject(const char *, PyObject *); +PyAPI_FUNC(int) _PySys_SetObjectId(_Py_Identifier *key, PyObject *); + PyAPI_FUNC(void) PySys_SetArgv(int, wchar_t **); PyAPI_FUNC(void) PySys_SetArgvEx(int, wchar_t **, int); PyAPI_FUNC(void) PySys_SetPath(const wchar_t *); diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -42,6 +42,16 @@ #endif PyObject * +_PySys_GetObjectId(_Py_Identifier *key) +{ + PyThreadState *tstate = PyThreadState_GET(); + PyObject *sd = tstate->interp->sysdict; + if (sd == NULL) + return NULL; + return _PyDict_GetItemId(sd, key); +} + +PyObject * PySys_GetObject(const char *name) { PyThreadState *tstate = PyThreadState_GET(); @@ -52,6 +62,21 @@ } int +_PySys_SetObjectId(_Py_Identifier *key, PyObject *v) +{ + PyThreadState *tstate = PyThreadState_GET(); + PyObject *sd = tstate->interp->sysdict; + if (v == NULL) { + if (_PyDict_GetItemId(sd, key) == NULL) + return 0; + else + return _PyDict_DelItemId(sd, key); + } + else + return _PyDict_SetItemId(sd, key, v); +} + +int PySys_SetObject(const char *name, PyObject *v) { PyThreadState *tstate = PyThreadState_GET(); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 22:46:28 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 22:46:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_PRINT=5F?= =?utf-8?q?EXPR_bytecode_now_uses_an_identifier_to_get_sys=2Edisplayhook?= Message-ID: <3dFLsw1G6tz7Lyh@mail.python.org> http://hg.python.org/cpython/rev/cca13dd603a9 changeset: 86972:cca13dd603a9 user: Victor Stinner date: Wed Nov 06 22:38:37 2013 +0100 summary: Issue #19512: PRINT_EXPR bytecode now uses an identifier to get sys.displayhook to only create the "displayhook" string once files: Python/ceval.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Python/ceval.c b/Python/ceval.c --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1840,8 +1840,9 @@ } TARGET(PRINT_EXPR) { + _Py_IDENTIFIER(displayhook); PyObject *value = POP(); - PyObject *hook = PySys_GetObject("displayhook"); + PyObject *hook = _PySys_GetObjectId(&PyId_displayhook); PyObject *res; if (hook == NULL) { PyErr_SetString(PyExc_RuntimeError, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 22:46:29 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 22:46:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_pickle_n?= =?utf-8?q?ow_uses_an_identifier_to_only_create_the_Unicode_string?= Message-ID: 
<3dFLsx3LDnz7M0K@mail.python.org> http://hg.python.org/cpython/rev/6348764bacdd changeset: 86973:6348764bacdd user: Victor Stinner date: Wed Nov 06 22:40:41 2013 +0100 summary: Issue #19512: pickle now uses an identifier to only create the Unicode string "modules" once files: Modules/_pickle.c | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -136,6 +136,8 @@ /* For looking up name pairs in copyreg._extension_registry. */ static PyObject *two_tuple = NULL; +_Py_IDENTIFIER(modules); + static int stack_underflow(void) { @@ -1363,7 +1365,7 @@ return NULL; search: - modules_dict = PySys_GetObject("modules"); + modules_dict = _PySys_GetObjectId(&PyId_modules); if (modules_dict == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.modules"); return NULL; @@ -5548,7 +5550,7 @@ } } - modules_dict = PySys_GetObject("modules"); + modules_dict = _PySys_GetObjectId(&PyId_modules); if (modules_dict == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.modules"); return NULL; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 6 22:46:31 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 6 Nov 2013 22:46:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_add_some?= =?utf-8?q?_common_identifiers_to_only_create_common_strings_once=2C?= Message-ID: <3dFLsz0Zjhz7M08@mail.python.org> http://hg.python.org/cpython/rev/954167ce92a3 changeset: 86974:954167ce92a3 user: Victor Stinner date: Wed Nov 06 22:41:44 2013 +0100 summary: Issue #19512: add some common identifiers to only create common strings once, instead of creating temporary Unicode string objects Add also more identifiers in pythonrun.c to avoid temporary Unicode string objets for the interactive interpreter. 
files: Include/object.h | 12 ++++- Modules/_ctypes/callbacks.c | 2 +- Modules/_cursesmodule.c | 2 +- Modules/_threadmodule.c | 2 +- Modules/faulthandler.c | 2 +- Modules/main.c | 2 +- Modules/syslogmodule.c | 2 +- Python/_warnings.c | 4 +- Python/bltinmodule.c | 8 +- Python/errors.c | 2 +- Python/pythonrun.c | 59 +++++++++++++++--------- Python/sysmodule.c | 22 ++++---- Python/traceback.c | 2 +- 13 files changed, 72 insertions(+), 49 deletions(-) diff --git a/Include/object.h b/Include/object.h --- a/Include/object.h +++ b/Include/object.h @@ -143,9 +143,17 @@ PyObject *object; } _Py_Identifier; -#define _Py_static_string(varname, value) static _Py_Identifier varname = { 0, value, 0 } +#define _Py_static_string_init(value) { 0, value, 0 } +#define _Py_static_string(varname, value) static _Py_Identifier varname = _Py_static_string_init(value) #define _Py_IDENTIFIER(varname) _Py_static_string(PyId_##varname, #varname) +/* Common identifiers */ +PyAPI_DATA(_Py_Identifier) _PyId_path; +PyAPI_DATA(_Py_Identifier) _PyId_argv; +PyAPI_DATA(_Py_Identifier) _PyId_stdin; +PyAPI_DATA(_Py_Identifier) _PyId_stdout; +PyAPI_DATA(_Py_Identifier) _PyId_stderr; + /* Type objects contain a string containing the type name (to help somewhat in debugging), the allocation parameters (see PyObject_New() and @@ -829,7 +837,7 @@ PyObject *_py_xincref_tmp = (PyObject *)(op); \ if (_py_xincref_tmp != NULL) \ Py_INCREF(_py_xincref_tmp); \ - } while (0) + } while (0) #define Py_XDECREF(op) \ do { \ diff --git a/Modules/_ctypes/callbacks.c b/Modules/_ctypes/callbacks.c --- a/Modules/_ctypes/callbacks.c +++ b/Modules/_ctypes/callbacks.c @@ -80,7 +80,7 @@ PrintError(char *msg, ...) { char buf[512]; - PyObject *f = PySys_GetObject("stderr"); + PyObject *f = _PySys_GetObjectId(&_PyId_stderr); va_list marker; va_start(marker, msg); diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -2578,7 +2578,7 @@ if (fd == -1) { PyObject* sys_stdout; - sys_stdout = PySys_GetObject("stdout"); + sys_stdout = _PySys_GetObjectId(&_PyId_stdout); if (sys_stdout == NULL || sys_stdout == Py_None) { PyErr_SetString( diff --git a/Modules/_threadmodule.c b/Modules/_threadmodule.c --- a/Modules/_threadmodule.c +++ b/Modules/_threadmodule.c @@ -1005,7 +1005,7 @@ PySys_WriteStderr( "Unhandled exception in thread started by "); PyErr_Fetch(&exc, &value, &tb); - file = PySys_GetObject("stderr"); + file = _PySys_GetObjectId(&_PyId_stderr); if (file != NULL && file != Py_None) PyFile_WriteObject(boot->func, file, 0); else diff --git a/Modules/faulthandler.c b/Modules/faulthandler.c --- a/Modules/faulthandler.c +++ b/Modules/faulthandler.c @@ -136,7 +136,7 @@ int fd; if (file == NULL || file == Py_None) { - file = PySys_GetObject("stderr"); + file = _PySys_GetObjectId(&_PyId_stderr); if (file == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.stderr"); return NULL; diff --git a/Modules/main.c b/Modules/main.c --- a/Modules/main.c +++ b/Modules/main.c @@ -261,7 +261,7 @@ /* argv0 is usable as an import source, so put it in sys.path[0] and import __main__ */ - sys_path = PySys_GetObject("path"); + sys_path = _PySys_GetObjectId(&_PyId_path); if (sys_path == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.path"); goto error; diff --git a/Modules/syslogmodule.c b/Modules/syslogmodule.c --- a/Modules/syslogmodule.c +++ b/Modules/syslogmodule.c @@ -71,7 +71,7 @@ Py_ssize_t argv_len, scriptlen; PyObject *scriptobj; Py_ssize_t slash; - PyObject *argv = 
PySys_GetObject("argv"); + PyObject *argv = _PySys_GetObjectId(&_PyId_argv); if (argv == NULL) { return(NULL); diff --git a/Python/_warnings.c b/Python/_warnings.c --- a/Python/_warnings.c +++ b/Python/_warnings.c @@ -265,7 +265,7 @@ if (name == NULL) /* XXX Can an object lack a '__name__' attribute? */ goto error; - f_stderr = PySys_GetObject("stderr"); + f_stderr = _PySys_GetObjectId(&_PyId_stderr); if (f_stderr == NULL) { fprintf(stderr, "lost sys.stderr\n"); goto error; @@ -562,7 +562,7 @@ else { *filename = NULL; if (*module != Py_None && PyUnicode_CompareWithASCIIString(*module, "__main__") == 0) { - PyObject *argv = PySys_GetObject("argv"); + PyObject *argv = _PySys_GetObjectId(&_PyId_argv); /* PyList_Check() is needed because sys.argv is set to None during Python finalization */ if (argv != NULL && PyList_Check(argv) && PyList_Size(argv) > 0) { diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -1553,7 +1553,7 @@ kwlist, &sep, &end, &file, &flush)) return NULL; if (file == NULL || file == Py_None) { - file = PySys_GetObject("stdout"); + file = _PySys_GetObjectId(&_PyId_stdout); if (file == NULL) { PyErr_SetString(PyExc_RuntimeError, "lost sys.stdout"); return NULL; @@ -1638,9 +1638,9 @@ builtin_input(PyObject *self, PyObject *args) { PyObject *promptarg = NULL; - PyObject *fin = PySys_GetObject("stdin"); - PyObject *fout = PySys_GetObject("stdout"); - PyObject *ferr = PySys_GetObject("stderr"); + PyObject *fin = _PySys_GetObjectId(&_PyId_stdin); + PyObject *fout = _PySys_GetObjectId(&_PyId_stdout); + PyObject *ferr = _PySys_GetObjectId(&_PyId_stderr); PyObject *tmp; long fd; int tty; diff --git a/Python/errors.c b/Python/errors.c --- a/Python/errors.c +++ b/Python/errors.c @@ -844,7 +844,7 @@ PyErr_Fetch(&t, &v, &tb); - f = PySys_GetObject("stderr"); + f = _PySys_GetObjectId(&_PyId_stderr); if (f == NULL || f == Py_None) goto done; diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -35,6 +35,21 @@ #define PATH_MAX MAXPATHLEN #endif +/* Common identifiers */ +_Py_Identifier _PyId_argv = _Py_static_string_init("argv"); +_Py_Identifier _PyId_path = _Py_static_string_init("path"); +_Py_Identifier _PyId_stdin = _Py_static_string_init("stdin"); +_Py_Identifier _PyId_stdout = _Py_static_string_init("stdout"); +_Py_Identifier _PyId_stderr = _Py_static_string_init("stderr"); + +/* local identifiers */ +_Py_IDENTIFIER(excepthook); +_Py_IDENTIFIER(ps1); +_Py_IDENTIFIER(ps2); +_Py_IDENTIFIER(last_type); +_Py_IDENTIFIER(last_value); +_Py_IDENTIFIER(last_traceback); + #ifdef Py_REF_DEBUG static void _print_total_refs(void) { @@ -412,7 +427,7 @@ pstderr = PyFile_NewStdPrinter(fileno(stderr)); if (pstderr == NULL) Py_FatalError("Py_Initialize: can't set preliminary stderr"); - PySys_SetObject("stderr", pstderr); + _PySys_SetObjectId(&_PyId_stderr, pstderr); PySys_SetObject("__stderr__", pstderr); Py_DECREF(pstderr); @@ -497,8 +512,8 @@ static void flush_std_files(void) { - PyObject *fout = PySys_GetObject("stdout"); - PyObject *ferr = PySys_GetObject("stderr"); + PyObject *fout = _PySys_GetObjectId(&_PyId_stdout); + PyObject *ferr = _PySys_GetObjectId(&_PyId_stderr); PyObject *tmp; _Py_IDENTIFIER(flush); @@ -776,7 +791,7 @@ pstderr = PyFile_NewStdPrinter(fileno(stderr)); if (pstderr == NULL) Py_FatalError("Py_Initialize: can't set preliminary stderr"); - PySys_SetObject("stderr", pstderr); + _PySys_SetObjectId(&_PyId_stderr, pstderr); PySys_SetObject("__stderr__", pstderr); 
Py_DECREF(pstderr); @@ -1170,7 +1185,7 @@ goto error; } /* if (fd < 0) */ PySys_SetObject("__stdin__", std); - PySys_SetObject("stdin", std); + _PySys_SetObjectId(&_PyId_stdin, std); Py_DECREF(std); /* Set sys.stdout */ @@ -1185,7 +1200,7 @@ goto error; } /* if (fd < 0) */ PySys_SetObject("__stdout__", std); - PySys_SetObject("stdout", std); + _PySys_SetObjectId(&_PyId_stdout, std); Py_DECREF(std); #if 1 /* Disable this if you have trouble debugging bootstrap stuff */ @@ -1219,7 +1234,7 @@ Py_DECREF(std); goto error; } - if (PySys_SetObject("stderr", std) < 0) { + if (_PySys_SetObjectId(&_PyId_stderr, std) < 0) { Py_DECREF(std); goto error; } @@ -1281,14 +1296,14 @@ flags = &local_flags; local_flags.cf_flags = 0; } - v = PySys_GetObject("ps1"); + v = _PySys_GetObjectId(&PyId_ps1); if (v == NULL) { - PySys_SetObject("ps1", v = PyUnicode_FromString(">>> ")); + _PySys_SetObjectId(&PyId_ps1, v = PyUnicode_FromString(">>> ")); Py_XDECREF(v); } - v = PySys_GetObject("ps2"); + v = _PySys_GetObjectId(&PyId_ps2); if (v == NULL) { - PySys_SetObject("ps2", v = PyUnicode_FromString("... ")); + _PySys_SetObjectId(&PyId_ps2, v = PyUnicode_FromString("... ")); Py_XDECREF(v); } err = -1; @@ -1351,7 +1366,7 @@ if (fp == stdin) { /* Fetch encoding from sys.stdin if possible. */ - v = PySys_GetObject("stdin"); + v = _PySys_GetObjectId(&_PyId_stdin); if (v && v != Py_None) { oenc = _PyObject_GetAttrId(v, &PyId_encoding); if (oenc) @@ -1360,7 +1375,7 @@ PyErr_Clear(); } } - v = PySys_GetObject("ps1"); + v = _PySys_GetObjectId(&PyId_ps1); if (v != NULL) { v = PyObject_Str(v); if (v == NULL) @@ -1373,7 +1388,7 @@ } } } - w = PySys_GetObject("ps2"); + w = _PySys_GetObjectId(&PyId_ps2); if (w != NULL) { w = PyObject_Str(w); if (w == NULL) @@ -1752,7 +1767,7 @@ if (PyLong_Check(value)) exitcode = (int)PyLong_AsLong(value); else { - PyObject *sys_stderr = PySys_GetObject("stderr"); + PyObject *sys_stderr = _PySys_GetObjectId(&_PyId_stderr); if (sys_stderr != NULL && sys_stderr != Py_None) { PyFile_WriteObject(value, sys_stderr, Py_PRINT_RAW); } else { @@ -1795,11 +1810,11 @@ return; /* Now we know v != NULL too */ if (set_sys_last_vars) { - PySys_SetObject("last_type", exception); - PySys_SetObject("last_value", v); - PySys_SetObject("last_traceback", tb); + _PySys_SetObjectId(&PyId_last_type, exception); + _PySys_SetObjectId(&PyId_last_value, v); + _PySys_SetObjectId(&PyId_last_traceback, tb); } - hook = PySys_GetObject("excepthook"); + hook = _PySys_GetObjectId(&PyId_excepthook); if (hook) { PyObject *args = PyTuple_Pack(3, exception, v, tb); PyObject *result = PyEval_CallObject(hook, args); @@ -2009,7 +2024,7 @@ PyErr_Display(PyObject *exception, PyObject *value, PyObject *tb) { PyObject *seen; - PyObject *f = PySys_GetObject("stderr"); + PyObject *f = _PySys_GetObjectId(&_PyId_stderr); if (PyExceptionInstance_Check(value) && tb != NULL && PyTraceBack_Check(tb)) { /* Put the traceback on the exception, otherwise it won't get @@ -2107,7 +2122,7 @@ /* Save the current exception */ PyErr_Fetch(&type, &value, &traceback); - f = PySys_GetObject("stderr"); + f = _PySys_GetObjectId(&_PyId_stderr); if (f != NULL) { r = _PyObject_CallMethodId(f, &PyId_flush, ""); if (r) @@ -2115,7 +2130,7 @@ else PyErr_Clear(); } - f = PySys_GetObject("stdout"); + f = _PySys_GetObjectId(&_PyId_stdout); if (f != NULL) { r = _PyObject_CallMethodId(f, &PyId_flush, ""); if (r) diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -183,7 +183,7 @@ } if (_PyObject_SetAttrId(builtins, &PyId__, 
Py_None) != 0) return NULL; - outf = PySys_GetObject("stdout"); + outf = _PySys_GetObjectId(&_PyId_stdout); if (outf == NULL || outf == Py_None) { PyErr_SetString(PyExc_RuntimeError, "lost sys.stdout"); return NULL; @@ -1825,7 +1825,7 @@ PyObject *v; if ((v = makepathobject(path, DELIM)) == NULL) Py_FatalError("can't create sys.path"); - if (PySys_SetObject("path", v) != 0) + if (_PySys_SetObjectId(&_PyId_path, v) != 0) Py_FatalError("can't assign sys.path"); Py_DECREF(v); } @@ -1894,7 +1894,7 @@ wchar_t fullpath[MAX_PATH]; #endif - path = PySys_GetObject("path"); + path = _PySys_GetObjectId(&_PyId_path); if (path == NULL) return; @@ -2081,7 +2081,7 @@ */ static void -sys_write(char *name, FILE *fp, const char *format, va_list va) +sys_write(_Py_Identifier *key, FILE *fp, const char *format, va_list va) { PyObject *file; PyObject *error_type, *error_value, *error_traceback; @@ -2089,7 +2089,7 @@ int written; PyErr_Fetch(&error_type, &error_value, &error_traceback); - file = PySys_GetObject(name); + file = _PySys_GetObjectId(key); written = PyOS_vsnprintf(buffer, sizeof(buffer), format, va); if (sys_pyfile_write(buffer, file) != 0) { PyErr_Clear(); @@ -2109,7 +2109,7 @@ va_list va; va_start(va, format); - sys_write("stdout", stdout, format, va); + sys_write(&_PyId_stdout, stdout, format, va); va_end(va); } @@ -2119,19 +2119,19 @@ va_list va; va_start(va, format); - sys_write("stderr", stderr, format, va); + sys_write(&_PyId_stderr, stderr, format, va); va_end(va); } static void -sys_format(char *name, FILE *fp, const char *format, va_list va) +sys_format(_Py_Identifier *key, FILE *fp, const char *format, va_list va) { PyObject *file, *message; PyObject *error_type, *error_value, *error_traceback; char *utf8; PyErr_Fetch(&error_type, &error_value, &error_traceback); - file = PySys_GetObject(name); + file = _PySys_GetObjectId(key); message = PyUnicode_FromFormatV(format, va); if (message != NULL) { if (sys_pyfile_write_unicode(message, file) != 0) { @@ -2151,7 +2151,7 @@ va_list va; va_start(va, format); - sys_format("stdout", stdout, format, va); + sys_format(&_PyId_stdout, stdout, format, va); va_end(va); } @@ -2161,6 +2161,6 @@ va_list va; va_start(va, format); - sys_format("stderr", stderr, format, va); + sys_format(&_PyId_stderr, stderr, format, va); va_end(va); } diff --git a/Python/traceback.c b/Python/traceback.c --- a/Python/traceback.c +++ b/Python/traceback.c @@ -169,7 +169,7 @@ tail++; taillen = strlen(tail); - syspath = PySys_GetObject("path"); + syspath = _PySys_GetObjectId(&_PyId_path); if (syspath == NULL || !PyList_Check(syspath)) goto error; npath = PyList_Size(syspath); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 00:03:00 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 00:03:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogSXNzdWUgIzE5NTEyOiBfX2J1?= =?utf-8?q?ild=5Fclass=28=29_builtin_now_uses_an_identifier_for_the?= Message-ID: <3dFNZD02jCz7Ljr@mail.python.org> http://hg.python.org/cpython/rev/40c73ccaee95 changeset: 86975:40c73ccaee95 user: Victor Stinner date: Wed Nov 06 22:46:51 2013 +0100 summary: Issue #19512: __build_class() builtin now uses an identifier for the "metaclass" string files: Python/bltinmodule.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -44,6 +44,7 @@ Py_ssize_t nargs; int isclass; _Py_IDENTIFIER(__prepare__); + 
_Py_IDENTIFIER(metaclass); assert(args != NULL); if (!PyTuple_Check(args)) { @@ -83,10 +84,10 @@ Py_DECREF(bases); return NULL; } - meta = PyDict_GetItemString(mkw, "metaclass"); + meta = _PyDict_GetItemId(mkw, &PyId_metaclass); if (meta != NULL) { Py_INCREF(meta); - if (PyDict_DelItemString(mkw, "metaclass") < 0) { + if (_PyDict_DelItemId(mkw, &PyId_metaclass) < 0) { Py_DECREF(meta); Py_DECREF(mkw); Py_DECREF(bases); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 00:03:01 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 00:03:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_fileio?= =?utf-8?q?=5Finit=28=29_reuses_PyId=5Fname_identifier_instead_of_=22name?= =?utf-8?q?=22?= Message-ID: <3dFNZF1tkzz7LlZ@mail.python.org> http://hg.python.org/cpython/rev/7177363d8c5c changeset: 86976:7177363d8c5c user: Victor Stinner date: Wed Nov 06 23:50:10 2013 +0100 summary: Issue #19512: fileio_init() reuses PyId_name identifier instead of "name" literal string files: Modules/_io/fileio.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Modules/_io/fileio.c b/Modules/_io/fileio.c --- a/Modules/_io/fileio.c +++ b/Modules/_io/fileio.c @@ -59,6 +59,8 @@ PyTypeObject PyFileIO_Type; +_Py_IDENTIFIER(name); + #define PyFileIO_Check(op) (PyObject_TypeCheck((op), &PyFileIO_Type)) int @@ -427,7 +429,7 @@ _setmode(self->fd, O_BINARY); #endif - if (PyObject_SetAttrString((PyObject *)self, "name", nameobj) < 0) + if (_PyObject_SetAttrId((PyObject *)self, &PyId_name, nameobj) < 0) goto error; if (self->appending) { @@ -1036,7 +1038,6 @@ static PyObject * fileio_repr(fileio *self) { - _Py_IDENTIFIER(name); PyObject *nameobj, *res; if (self->fd < 0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 00:03:02 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 00:03:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_=5Fcount?= =?utf-8?q?=5Felements=28=29_of_=5Fcollections_reuses_PyId=5Fget_identifie?= =?utf-8?q?r?= Message-ID: <3dFNZG3fgsz7Lph@mail.python.org> http://hg.python.org/cpython/rev/dbee50619259 changeset: 86977:dbee50619259 user: Victor Stinner date: Wed Nov 06 23:52:55 2013 +0100 summary: Issue #19512: _count_elements() of _collections reuses PyId_get identifier instead of literal "get" string files: Modules/_collectionsmodule.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_collectionsmodule.c b/Modules/_collectionsmodule.c --- a/Modules/_collectionsmodule.c +++ b/Modules/_collectionsmodule.c @@ -1816,7 +1816,7 @@ Py_DECREF(key); } } else { - bound_get = PyObject_GetAttrString(mapping, "get"); + bound_get = _PyObject_GetAttrId(mapping, &PyId_get); if (bound_get == NULL) goto done; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 00:03:03 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 00:03:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_builtin_?= =?utf-8?q?print=28=29_function_uses_an_identifier_instead_of_literal?= Message-ID: <3dFNZH5VrNz7Llw@mail.python.org> http://hg.python.org/cpython/rev/6a1ce1fd1fc0 changeset: 86978:6a1ce1fd1fc0 user: Victor Stinner date: Thu Nov 07 00:01:51 2013 +0100 summary: Issue #19512: builtin print() function uses an identifier instead of literal string "flush" to call the flush method files: 
Python/bltinmodule.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -1547,6 +1547,7 @@ static PyObject *dummy_args; PyObject *sep = NULL, *end = NULL, *file = NULL, *flush = NULL; int i, err; + _Py_IDENTIFIER(flush); if (dummy_args == NULL && !(dummy_args = PyTuple_New(0))) return NULL; @@ -1613,7 +1614,7 @@ if (do_flush == -1) return NULL; else if (do_flush) { - tmp = PyObject_CallMethod(file, "flush", ""); + tmp = _PyObject_CallMethodId(file, &PyId_flush, ""); if (tmp == NULL) return NULL; else -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 01:12:38 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 01:12:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogcHJpbnRfZXhjZXB0aW9uKCk6?= =?utf-8?q?_don=27t_encode_the_module_name_to_UTF-8?= Message-ID: <3dFQ6Z1HYmz7LlC@mail.python.org> http://hg.python.org/cpython/rev/e9a865620788 changeset: 86979:e9a865620788 user: Victor Stinner date: Thu Nov 07 00:12:30 2013 +0100 summary: print_exception(): don't encode the module name to UTF-8 Replace _PyUnicode_AsString()+strcmp() with PyUnicode_CompareWithASCIIString(). files: Python/pythonrun.c | 5 ++--- 1 files changed, 2 insertions(+), 3 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -1928,10 +1928,9 @@ err = PyFile_WriteString("", f); } else { - char* modstr = _PyUnicode_AsString(moduleName); - if (modstr && strcmp(modstr, "builtins")) + if (PyUnicode_CompareWithASCIIString(moduleName, "builtins") != 0) { - err = PyFile_WriteString(modstr, f); + err = PyFile_WriteObject(moduleName, f, Py_PRINT_RAW); err += PyFile_WriteString(".", f); } Py_DECREF(moduleName); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 01:12:39 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 01:12:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_add_=5FP?= =?utf-8?q?yUnicode=5FCompareWithId=28=29_function?= Message-ID: <3dFQ6b4tC0z7LmT@mail.python.org> http://hg.python.org/cpython/rev/77bebcf5c4cf changeset: 86980:77bebcf5c4cf user: Victor Stinner date: Thu Nov 07 00:46:04 2013 +0100 summary: Issue #19512: add _PyUnicode_CompareWithId() function _PyUnicode_CompareWithId() is faster than PyUnicode_CompareWithASCIIString() when both strings are equal and interned. Add also _PyId_builtins identifier for "builtins" common string. 
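A note on the new helper before the diff: _PyUnicode_CompareWithId() resolves the identifier to its cached, interned string and then compares, returning 0 on equality just like PyUnicode_CompareWithASCIIString(); per the commit message, the saving comes when both operands are equal and interned. The snippet below is only a sketch of a caller, not code from the patch: the function name module_is_builtins is made up, while _PyUnicode_CompareWithId() and _PyId_builtins are the private additions introduced by this changeset.

    #include "Python.h"

    /* Illustrative only: true iff modname is the string "builtins".
       A nonzero return means "different" (or a failure while resolving the
       identifier), which is also how the call sites in the diff treat it. */
    static int
    module_is_builtins(PyObject *modname)
    {
        return _PyUnicode_CompareWithId(modname, &_PyId_builtins) == 0;
    }

This mirrors the typeobject.c and errors.c hunks below, which replace PyUnicode_CompareWithASCIIString(mod, "builtins") with the identifier-based call.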
files: Include/object.h | 5 +++-- Include/unicodeobject.h | 5 +++++ Objects/typeobject.c | 23 ++++++++++++----------- Objects/unicodeobject.c | 9 +++++++++ Python/errors.c | 2 +- Python/pythonrun.c | 3 ++- 6 files changed, 32 insertions(+), 15 deletions(-) diff --git a/Include/object.h b/Include/object.h --- a/Include/object.h +++ b/Include/object.h @@ -147,9 +147,10 @@ #define _Py_static_string(varname, value) static _Py_Identifier varname = _Py_static_string_init(value) #define _Py_IDENTIFIER(varname) _Py_static_string(PyId_##varname, #varname) -/* Common identifiers */ +/* Common identifiers (ex: _PyId_path is the string "path") */ +PyAPI_DATA(_Py_Identifier) _PyId_argv; +PyAPI_DATA(_Py_Identifier) _PyId_builtins; PyAPI_DATA(_Py_Identifier) _PyId_path; -PyAPI_DATA(_Py_Identifier) _PyId_argv; PyAPI_DATA(_Py_Identifier) _PyId_stdin; PyAPI_DATA(_Py_Identifier) _PyId_stdout; PyAPI_DATA(_Py_Identifier) _PyId_stderr; diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -1996,6 +1996,11 @@ PyObject *right /* Right string */ ); +PyAPI_FUNC(int) _PyUnicode_CompareWithId( + PyObject *left, /* Left string */ + _Py_Identifier *right /* Right identifier */ + ); + PyAPI_FUNC(int) PyUnicode_CompareWithASCIIString( PyObject *left, const char *right /* ASCII-encoded string */ diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -345,11 +345,10 @@ static PyObject * type_module(PyTypeObject *type, void *context) { - PyObject *mod; char *s; if (type->tp_flags & Py_TPFLAGS_HEAPTYPE) { - mod = _PyDict_GetItemId(type->tp_dict, &PyId___module__); + PyObject *mod = _PyDict_GetItemId(type->tp_dict, &PyId___module__); if (!mod) { PyErr_Format(PyExc_AttributeError, "__module__"); return 0; @@ -358,11 +357,14 @@ return mod; } else { + PyObject *name; s = strrchr(type->tp_name, '.'); if (s != NULL) return PyUnicode_FromStringAndSize( type->tp_name, (Py_ssize_t)(s - type->tp_name)); - return PyUnicode_FromString("builtins"); + name = _PyUnicode_FromId(&_PyId_builtins); + Py_XINCREF(name); + return name; } } @@ -712,7 +714,7 @@ return NULL; } - if (mod != NULL && PyUnicode_CompareWithASCIIString(mod, "builtins")) + if (mod != NULL && _PyUnicode_CompareWithId(mod, &_PyId_builtins)) rtn = PyUnicode_FromFormat("", mod, name); else rtn = PyUnicode_FromFormat("", type->tp_name); @@ -2143,7 +2145,7 @@ if (!valid_identifier(tmp)) goto error; assert(PyUnicode_Check(tmp)); - if (PyUnicode_CompareWithASCIIString(tmp, "__dict__") == 0) { + if (_PyUnicode_CompareWithId(tmp, &PyId___dict__) == 0) { if (!may_add_dict || add_dict) { PyErr_SetString(PyExc_TypeError, "__dict__ slot disallowed: " @@ -2174,7 +2176,7 @@ for (i = j = 0; i < nslots; i++) { tmp = PyTuple_GET_ITEM(slots, i); if ((add_dict && - PyUnicode_CompareWithASCIIString(tmp, "__dict__") == 0) || + _PyUnicode_CompareWithId(tmp, &PyId___dict__) == 0) || (add_weak && PyUnicode_CompareWithASCIIString(tmp, "__weakref__") == 0)) continue; @@ -3183,7 +3185,7 @@ Py_XDECREF(mod); return NULL; } - if (mod != NULL && PyUnicode_CompareWithASCIIString(mod, "builtins")) + if (mod != NULL && _PyUnicode_CompareWithId(mod, &_PyId_builtins)) rtn = PyUnicode_FromFormat("<%U.%U object at %p>", mod, name, self); else rtn = PyUnicode_FromFormat("<%s object at %p>", @@ -6336,8 +6338,8 @@ /* We want __class__ to return the class of the super object (i.e. super, or a subclass), not the class of su->obj. 
*/ skip = (PyUnicode_Check(name) && - PyUnicode_GET_LENGTH(name) == 9 && - PyUnicode_CompareWithASCIIString(name, "__class__") == 0); + PyUnicode_GET_LENGTH(name) == 9 && + _PyUnicode_CompareWithId(name, &PyId___class__) == 0); } if (!skip) { @@ -6543,8 +6545,7 @@ for (i = 0; i < n; i++) { PyObject *name = PyTuple_GET_ITEM(co->co_freevars, i); assert(PyUnicode_Check(name)); - if (!PyUnicode_CompareWithASCIIString(name, - "__class__")) { + if (!_PyUnicode_CompareWithId(name, &PyId___class__)) { Py_ssize_t index = co->co_nlocals + PyTuple_GET_SIZE(co->co_cellvars) + i; PyObject *cell = f->f_localsplus[index]; diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -10566,6 +10566,15 @@ } int +_PyUnicode_CompareWithId(PyObject *left, _Py_Identifier *right) +{ + PyObject *right_str = _PyUnicode_FromId(right); /* borrowed */ + if (right_str == NULL) + return -1; + return PyUnicode_Compare(left, right_str); +} + +int PyUnicode_CompareWithASCIIString(PyObject* uni, const char* str) { Py_ssize_t i; diff --git a/Python/errors.c b/Python/errors.c --- a/Python/errors.c +++ b/Python/errors.c @@ -878,7 +878,7 @@ goto done; } else { - if (PyUnicode_CompareWithASCIIString(moduleName, "builtins") != 0) { + if (_PyUnicode_CompareWithId(moduleName, &_PyId_builtins) != 0) { if (PyFile_WriteObject(moduleName, f, Py_PRINT_RAW) < 0) goto done; if (PyFile_WriteString(".", f) < 0) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -37,6 +37,7 @@ /* Common identifiers */ _Py_Identifier _PyId_argv = _Py_static_string_init("argv"); +_Py_Identifier _PyId_builtins = _Py_static_string_init("builtins"); _Py_Identifier _PyId_path = _Py_static_string_init("path"); _Py_Identifier _PyId_stdin = _Py_static_string_init("stdin"); _Py_Identifier _PyId_stdout = _Py_static_string_init("stdout"); @@ -1928,7 +1929,7 @@ err = PyFile_WriteString("", f); } else { - if (PyUnicode_CompareWithASCIIString(moduleName, "builtins") != 0) + if (_PyUnicode_CompareWithId(moduleName, &_PyId_builtins) != 0) { err = PyFile_WriteObject(moduleName, f, Py_PRINT_RAW); err += PyFile_WriteString(".", f); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 01:12:40 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 01:12:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=3A_Use_the_?= =?utf-8?q?new_=5FPyId=5Fbuiltins_identifier?= Message-ID: <3dFQ6c6y7DzRbJ@mail.python.org> http://hg.python.org/cpython/rev/3f9f2cfae53b changeset: 86981:3f9f2cfae53b user: Victor Stinner date: Thu Nov 07 00:43:05 2013 +0100 summary: Issue #19512: Use the new _PyId_builtins identifier files: Modules/_lsprof.c | 2 +- Objects/object.c | 8 ++++++-- Python/import.c | 2 +- 3 files changed, 8 insertions(+), 4 deletions(-) diff --git a/Modules/_lsprof.c b/Modules/_lsprof.c --- a/Modules/_lsprof.c +++ b/Modules/_lsprof.c @@ -185,7 +185,7 @@ } } if (modname != NULL) { - if (PyUnicode_CompareWithASCIIString(modname, "builtins") != 0) { + if (_PyUnicode_CompareWithId(modname, &_PyId_builtins) != 0) { PyObject *result; result = PyUnicode_FromFormat("<%U.%s>", modname, fn->m_ml->ml_name); diff --git a/Objects/object.c b/Objects/object.c --- a/Objects/object.c +++ b/Objects/object.c @@ -1122,8 +1122,12 @@ PyObject * _PyObject_GetBuiltin(const char *name) { - PyObject *mod, *attr; - mod = PyImport_ImportModule("builtins"); + PyObject *mod_name, *mod, *attr; + + mod_name 
= _PyUnicode_FromId(&_PyId_builtins); /* borrowed */ + if (mod_name == NULL) + return NULL; + mod = PyImport_Import(mod_name); if (mod == NULL) return NULL; attr = PyObject_GetAttrString(mod, name); diff --git a/Python/import.c b/Python/import.c --- a/Python/import.c +++ b/Python/import.c @@ -310,7 +310,7 @@ /* XXX Perhaps these precautions are obsolete. Who knows? */ - value = PyDict_GetItemString(modules, "builtins"); + value = _PyDict_GetItemId(modules, &_PyId_builtins); if (value != NULL && PyModule_Check(value)) { dict = PyModule_GetDict(value); if (Py_VerboseFlag) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 01:12:42 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 01:12:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_remove_an_outdated_comment?= Message-ID: <3dFQ6f1c0Nz7Lkq@mail.python.org> http://hg.python.org/cpython/rev/fafe20297927 changeset: 86982:fafe20297927 user: Victor Stinner date: Thu Nov 07 00:53:56 2013 +0100 summary: remove an outdated comment The comment is meaningless since changeset 4e985a96a612. files: Python/getargs.c | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Python/getargs.c b/Python/getargs.c --- a/Python/getargs.c +++ b/Python/getargs.c @@ -1590,7 +1590,6 @@ "keywords must be strings"); return cleanreturn(0, &freelist); } - /* check that _PyUnicode_AsString() result is not NULL */ for (i = 0; i < len; i++) { if (!PyUnicode_CompareWithASCIIString(key, kwlist[i])) { match = 1; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 05:25:56 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 7 Nov 2013 05:25:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_close=28=29?= =?utf-8?q?_back_to_Unix_selector_event_loop=2C_to_remove_all_signal?= Message-ID: <3dFWkr4pkkz7Lk8@mail.python.org> http://hg.python.org/cpython/rev/89873f3b18fd changeset: 86983:89873f3b18fd user: Guido van Rossum date: Wed Nov 06 20:25:50 2013 -0800 summary: asyncio: Add close() back to Unix selector event loop, to remove all signal handlers. Should fix buildbot issues. files: Lib/asyncio/unix_events.py | 5 +++ Lib/test/test_asyncio/test_unix_events.py | 16 +++++++++++ 2 files changed, 21 insertions(+), 0 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -48,6 +48,11 @@ def _socketpair(self): return socket.socketpair() + def close(self): + for sig in list(self._signal_handlers): + self.remove_signal_handler(sig) + super().close() + def add_signal_handler(self, sig, callback, *args): """Add a handler for a signal. UNIX only. 
diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -183,6 +183,22 @@ self.assertRaises( RuntimeError, self.loop.remove_signal_handler, signal.SIGHUP) + @unittest.mock.patch('asyncio.unix_events.signal') + def test_close(self, m_signal): + m_signal.NSIG = signal.NSIG + + self.loop.add_signal_handler(signal.SIGHUP, lambda: True) + self.loop.add_signal_handler(signal.SIGCHLD, lambda: True) + + self.assertEqual(len(self.loop._signal_handlers), 2) + + m_signal.set_wakeup_fd.reset_mock() + + self.loop.close() + + self.assertEqual(len(self.loop._signal_handlers), 0) + m_signal.set_wakeup_fd.assert_called_once_with(-1) + class UnixReadPipeTransportTests(unittest.TestCase): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 12:38:57 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 12:38:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogcHJpbnRfZXJyb3JfdGV4dCgp?= =?utf-8?q?_doesn=27t_encode_the_filename_anymore?= Message-ID: <3dFjLT5XTlz7Ljj@mail.python.org> http://hg.python.org/cpython/rev/7c71d2f97196 changeset: 86984:7c71d2f97196 user: Victor Stinner date: Thu Nov 07 12:37:56 2013 +0100 summary: print_error_text() doesn't encode the filename anymore Use aslo PyUnicode_FromFormat() to format the line so only one call to PyFile_WriteObject() is needed. tb_displayline() of Python/traceback.c has similar implementation. files: Python/pythonrun.c | 62 +++++++++++++++++++-------------- 1 files changed, 35 insertions(+), 27 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -50,6 +50,7 @@ _Py_IDENTIFIER(last_type); _Py_IDENTIFIER(last_value); _Py_IDENTIFIER(last_traceback); +_Py_static_string(PyId_string, ""); #ifdef Py_REF_DEBUG static @@ -1625,8 +1626,8 @@ } static int -parse_syntax_error(PyObject *err, PyObject **message, const char **filename, - int *lineno, int *offset, const char **text) +parse_syntax_error(PyObject *err, PyObject **message, PyObject **filename, + int *lineno, int *offset, PyObject **text) { long hold; PyObject *v; @@ -1637,6 +1638,7 @@ _Py_IDENTIFIER(text); *message = NULL; + *filename = NULL; /* new style errors. 
`err' is an instance */ *message = _PyObject_GetAttrId(err, &PyId_msg); @@ -1648,13 +1650,13 @@ goto finally; if (v == Py_None) { Py_DECREF(v); - *filename = NULL; + *filename = _PyUnicode_FromId(&PyId_string); + if (*filename == NULL) + goto finally; + Py_INCREF(*filename); } else { - *filename = _PyUnicode_AsString(v); - Py_DECREF(v); - if (!*filename) - goto finally; + *filename = v; } v = _PyObject_GetAttrId(err, &PyId_lineno); @@ -1688,15 +1690,13 @@ *text = NULL; } else { - *text = _PyUnicode_AsString(v); - Py_DECREF(v); - if (!*text) - goto finally; + *text = v; } return 1; finally: Py_XDECREF(*message); + Py_XDECREF(*filename); return 0; } @@ -1707,9 +1707,15 @@ } static void -print_error_text(PyObject *f, int offset, const char *text) +print_error_text(PyObject *f, int offset, PyObject *text_obj) { + char *text; char *nl; + + text = _PyUnicode_AsString(text_obj); + if (text == NULL) + return; + if (offset >= 0) { if (offset > 0 && offset == strlen(text) && text[offset - 1] == '\n') offset--; @@ -1880,27 +1886,30 @@ if (err == 0 && _PyObject_HasAttrId(value, &PyId_print_file_and_line)) { - PyObject *message; - const char *filename, *text; + PyObject *message, *filename, *text; int lineno, offset; if (!parse_syntax_error(value, &message, &filename, &lineno, &offset, &text)) PyErr_Clear(); else { - char buf[10]; - PyFile_WriteString(" File \"", f); - if (filename == NULL) - PyFile_WriteString("", f); - else - PyFile_WriteString(filename, f); - PyFile_WriteString("\", line ", f); - PyOS_snprintf(buf, sizeof(buf), "%d", lineno); - PyFile_WriteString(buf, f); - PyFile_WriteString("\n", f); - if (text != NULL) - print_error_text(f, offset, text); + PyObject *line; + Py_DECREF(value); value = message; + + line = PyUnicode_FromFormat(" File \"%U\", line %d\n", + filename, lineno); + Py_DECREF(filename); + if (line != NULL) { + PyFile_WriteObject(line, f, Py_PRINT_RAW); + Py_DECREF(line); + } + + if (text != NULL) { + print_error_text(f, offset, text); + Py_DECREF(text); + } + /* Can't be bothered to check all those PyFile_WriteString() calls */ if (PyErr_Occurred()) @@ -2061,7 +2070,6 @@ PyObject *ret = NULL; mod_ty mod; PyArena *arena; - _Py_static_string(PyId_string, ""); PyObject *filename; filename = _PyUnicode_FromId(&PyId_string); /* borrowed */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 16:53:53 2013 From: python-checkins at python.org (r.david.murray) Date: Thu, 7 Nov 2013 16:53:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE4OTg1OiBJbXBy?= =?utf-8?q?ove_fcntl_documentation=2E?= Message-ID: <3dFq0d4fMzzR3q@mail.python.org> http://hg.python.org/cpython/rev/0e0dded5d616 changeset: 86985:0e0dded5d616 branch: 3.3 parent: 86956:bec6df56c053 user: R David Murray date: Thu Nov 07 10:51:07 2013 -0500 summary: #18985: Improve fcntl documentation. Original patch by Vajrasky Kok, further improved (I hope) by me. files: Doc/library/fcntl.rst | 13 ++++++++----- Modules/fcntlmodule.c | 24 +++++++++++++----------- 2 files changed, 21 insertions(+), 16 deletions(-) diff --git a/Doc/library/fcntl.rst b/Doc/library/fcntl.rst --- a/Doc/library/fcntl.rst +++ b/Doc/library/fcntl.rst @@ -30,11 +30,11 @@ .. function:: fcntl(fd, op[, arg]) - Perform the requested operation on file descriptor *fd* (file objects providing - a :meth:`~io.IOBase.fileno` method are accepted as well). The operation is - defined by *op* - and is operating system dependent. These codes are also found in the - :mod:`fcntl` module. 
The argument *arg* is optional, and defaults to the integer + Perform the operation *op* on file descriptor *fd* (file objects providing + a :meth:`~io.IOBase.fileno` method are accepted as well). The values used + for *op* are operating system dependent, and are available as constants + in the :mod:`fcntl` module, using the same names as used in the relevant C + header files. The argument *arg* is optional, and defaults to the integer value ``0``. When present, it can either be an integer value, or a string. With the argument missing or an integer value, the return value of this function is the integer return value of the C :c:func:`fcntl` call. When the argument is @@ -56,6 +56,9 @@ that the argument handling is even more complicated. The op parameter is limited to values that can fit in 32-bits. + Additional constants of interest for use as the *op* argument can be + found in the :mod:`termios` module, under the same names as used in + the relevant C header files. The parameter *arg* can be one of an integer, absent (treated identically to the integer ``0``), an object supporting the read-only buffer interface (most likely diff --git a/Modules/fcntlmodule.c b/Modules/fcntlmodule.c --- a/Modules/fcntlmodule.c +++ b/Modules/fcntlmodule.c @@ -27,7 +27,7 @@ } -/* fcntl(fd, opt, [arg]) */ +/* fcntl(fd, op, [arg]) */ static PyObject * fcntl_fcntl(PyObject *self, PyObject *args) @@ -77,11 +77,12 @@ } PyDoc_STRVAR(fcntl_doc, -"fcntl(fd, opt, [arg])\n\ +"fcntl(fd, op, [arg])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation\n\ -is defined by op and is operating system dependent. These constants are\n\ -available from the fcntl module. The argument arg is optional, and\n\ +Perform the operation op on file descriptor fd. The values used\n\ +for op are operating system dependent, and are available\n\ +as constants in the fcntl module, using the same names as used in\n\ +the relevant C header files. The argument arg is optional, and\n\ defaults to 0; it may be an int or a string. If arg is given as a string,\n\ the return value of fcntl is a string of that length, containing the\n\ resulting value put in the arg buffer by the operating system. The length\n\ @@ -90,7 +91,7 @@ corresponding to the return value of the fcntl call in the C code."); -/* ioctl(fd, opt, [arg]) */ +/* ioctl(fd, op, [arg]) */ static PyObject * fcntl_ioctl(PyObject *self, PyObject *args) @@ -104,7 +105,7 @@ whereas the system expects it to be a 32bit bit field value regardless of it being passed as an int or unsigned long on various platforms. See the termios.TIOCSWINSZ constant across - platforms for an example of thise. + platforms for an example of this. If any of the 64bit platforms ever decide to use more than 32bits in their unsigned long ioctl codes this will break and need @@ -222,11 +223,12 @@ } PyDoc_STRVAR(ioctl_doc, -"ioctl(fd, opt[, arg[, mutate_flag]])\n\ +"ioctl(fd, op[, arg[, mutate_flag]])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation is\n\ -defined by opt and is operating system dependent. Typically these codes are\n\ -retrieved from the fcntl or termios library modules.\n\ +Perform the operation op on file descriptor fd. 
The values used for op\n\ +are operating system dependent, and are available as constants in the\n\ +fcntl or termios library modules, using the same names as used in the\n\ +relevant C header files.\n\ \n\ The argument arg is optional, and defaults to 0; it may be an int or a\n\ buffer containing character data (most likely a string or an array). \n\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 16:53:55 2013 From: python-checkins at python.org (r.david.murray) Date: Thu, 7 Nov 2013 16:53:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_=2318985=3A_Improve_fcntl_documentation=2E?= Message-ID: <3dFq0g0kVwz7Llj@mail.python.org> http://hg.python.org/cpython/rev/ddf6da99b3cd changeset: 86986:ddf6da99b3cd parent: 86984:7c71d2f97196 parent: 86985:0e0dded5d616 user: R David Murray date: Thu Nov 07 10:51:41 2013 -0500 summary: Merge #18985: Improve fcntl documentation. files: Doc/library/fcntl.rst | 13 ++++++++----- Modules/fcntlmodule.c | 24 +++++++++++++----------- 2 files changed, 21 insertions(+), 16 deletions(-) diff --git a/Doc/library/fcntl.rst b/Doc/library/fcntl.rst --- a/Doc/library/fcntl.rst +++ b/Doc/library/fcntl.rst @@ -30,11 +30,11 @@ .. function:: fcntl(fd, op[, arg]) - Perform the requested operation on file descriptor *fd* (file objects providing - a :meth:`~io.IOBase.fileno` method are accepted as well). The operation is - defined by *op* - and is operating system dependent. These codes are also found in the - :mod:`fcntl` module. The argument *arg* is optional, and defaults to the integer + Perform the operation *op* on file descriptor *fd* (file objects providing + a :meth:`~io.IOBase.fileno` method are accepted as well). The values used + for *op* are operating system dependent, and are available as constants + in the :mod:`fcntl` module, using the same names as used in the relevant C + header files. The argument *arg* is optional, and defaults to the integer value ``0``. When present, it can either be an integer value, or a string. With the argument missing or an integer value, the return value of this function is the integer return value of the C :c:func:`fcntl` call. When the argument is @@ -56,6 +56,9 @@ that the argument handling is even more complicated. The op parameter is limited to values that can fit in 32-bits. + Additional constants of interest for use as the *op* argument can be + found in the :mod:`termios` module, under the same names as used in + the relevant C header files. The parameter *arg* can be one of an integer, absent (treated identically to the integer ``0``), an object supporting the read-only buffer interface (most likely diff --git a/Modules/fcntlmodule.c b/Modules/fcntlmodule.c --- a/Modules/fcntlmodule.c +++ b/Modules/fcntlmodule.c @@ -27,7 +27,7 @@ } -/* fcntl(fd, opt, [arg]) */ +/* fcntl(fd, op, [arg]) */ static PyObject * fcntl_fcntl(PyObject *self, PyObject *args) @@ -77,11 +77,12 @@ } PyDoc_STRVAR(fcntl_doc, -"fcntl(fd, opt, [arg])\n\ +"fcntl(fd, op, [arg])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation\n\ -is defined by op and is operating system dependent. These constants are\n\ -available from the fcntl module. The argument arg is optional, and\n\ +Perform the operation op on file descriptor fd. The values used\n\ +for op are operating system dependent, and are available\n\ +as constants in the fcntl module, using the same names as used in\n\ +the relevant C header files. 
The argument arg is optional, and\n\ defaults to 0; it may be an int or a string. If arg is given as a string,\n\ the return value of fcntl is a string of that length, containing the\n\ resulting value put in the arg buffer by the operating system. The length\n\ @@ -90,7 +91,7 @@ corresponding to the return value of the fcntl call in the C code."); -/* ioctl(fd, opt, [arg]) */ +/* ioctl(fd, op, [arg]) */ static PyObject * fcntl_ioctl(PyObject *self, PyObject *args) @@ -104,7 +105,7 @@ whereas the system expects it to be a 32bit bit field value regardless of it being passed as an int or unsigned long on various platforms. See the termios.TIOCSWINSZ constant across - platforms for an example of thise. + platforms for an example of this. If any of the 64bit platforms ever decide to use more than 32bits in their unsigned long ioctl codes this will break and need @@ -222,11 +223,12 @@ } PyDoc_STRVAR(ioctl_doc, -"ioctl(fd, opt[, arg[, mutate_flag]])\n\ +"ioctl(fd, op[, arg[, mutate_flag]])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation is\n\ -defined by opt and is operating system dependent. Typically these codes are\n\ -retrieved from the fcntl or termios library modules.\n\ +Perform the operation op on file descriptor fd. The values used for op\n\ +are operating system dependent, and are available as constants in the\n\ +fcntl or termios library modules, using the same names as used in the\n\ +relevant C header files.\n\ \n\ The argument arg is optional, and defaults to 0; it may be an int or a\n\ buffer containing character data (most likely a string or an array). \n\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 16:53:56 2013 From: python-checkins at python.org (r.david.murray) Date: Thu, 7 Nov 2013 16:53:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogYmFja3BvcnQgIzE4?= =?utf-8?q?985=3A_Improve_fcntl_documentation=2E?= Message-ID: <3dFq0h3xncz7Lsv@mail.python.org> http://hg.python.org/cpython/rev/645aa4f44aa4 changeset: 86987:645aa4f44aa4 branch: 2.7 parent: 86955:c97600bdd726 user: R David Murray date: Thu Nov 07 10:52:53 2013 -0500 summary: backport #18985: Improve fcntl documentation. files: Doc/library/fcntl.rst | 13 ++++++++----- Modules/fcntlmodule.c | 24 +++++++++++++----------- 2 files changed, 21 insertions(+), 16 deletions(-) diff --git a/Doc/library/fcntl.rst b/Doc/library/fcntl.rst --- a/Doc/library/fcntl.rst +++ b/Doc/library/fcntl.rst @@ -24,11 +24,11 @@ .. function:: fcntl(fd, op[, arg]) - Perform the requested operation on file descriptor *fd* (file objects providing - a :meth:`~io.IOBase.fileno` method are accepted as well). The operation is - defined by *op* - and is operating system dependent. These codes are also found in the - :mod:`fcntl` module. The argument *arg* is optional, and defaults to the integer + Perform the operation *op* on file descriptor *fd* (file objects providing + a :meth:`~io.IOBase.fileno` method are accepted as well). The values used + for for *op* are operating system dependent, and are available as constants + in the :mod:`fcntl` module, using the same names as used in the relevant C + header files. The argument *arg* is optional, and defaults to the integer value ``0``. When present, it can either be an integer value, or a string. With the argument missing or an integer value, the return value of this function is the integer return value of the C :c:func:`fcntl` call. 
When the argument is @@ -51,6 +51,9 @@ argument handling is even more complicated. The op parameter is limited to values that can fit in 32-bits. + Additional constants of interest for use as the *op* argument can be + found in the :mod:`termios` module, under the same names as used in + the relevant C header files. The parameter *arg* can be one of an integer, absent (treated identically to the integer ``0``), an object supporting the read-only buffer interface (most likely diff --git a/Modules/fcntlmodule.c b/Modules/fcntlmodule.c --- a/Modules/fcntlmodule.c +++ b/Modules/fcntlmodule.c @@ -27,7 +27,7 @@ } -/* fcntl(fd, opt, [arg]) */ +/* fcntl(fd, op, [arg]) */ static PyObject * fcntl_fcntl(PyObject *self, PyObject *args) @@ -77,11 +77,12 @@ } PyDoc_STRVAR(fcntl_doc, -"fcntl(fd, opt, [arg])\n\ +"fcntl(fd, op, [arg])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation\n\ -is defined by op and is operating system dependent. These constants are\n\ -available from the fcntl module. The argument arg is optional, and\n\ +Perform the operation op on file descriptor fd. The values used\n\ +for op are operating system dependent, and are available\n\ +as constants in the fcntl module, using the same names as used in\n\ +the relevant C header files. The argument arg is optional, and\n\ defaults to 0; it may be an int or a string. If arg is given as a string,\n\ the return value of fcntl is a string of that length, containing the\n\ resulting value put in the arg buffer by the operating system. The length\n\ @@ -90,7 +91,7 @@ corresponding to the return value of the fcntl call in the C code."); -/* ioctl(fd, opt, [arg]) */ +/* ioctl(fd, op, [arg]) */ static PyObject * fcntl_ioctl(PyObject *self, PyObject *args) @@ -104,7 +105,7 @@ whereas the system expects it to be a 32bit bit field value regardless of it being passed as an int or unsigned long on various platforms. See the termios.TIOCSWINSZ constant across - platforms for an example of thise. + platforms for an example of this. If any of the 64bit platforms ever decide to use more than 32bits in their unsigned long ioctl codes this will break and need @@ -212,11 +213,12 @@ } PyDoc_STRVAR(ioctl_doc, -"ioctl(fd, opt[, arg[, mutate_flag]])\n\ +"ioctl(fd, op[, arg[, mutate_flag]])\n\ \n\ -Perform the requested operation on file descriptor fd. The operation is\n\ -defined by opt and is operating system dependent. Typically these codes are\n\ -retrieved from the fcntl or termios library modules.\n\ +Perform the operation op on file descriptor fd. The values used for op\n\ +are operating system dependent, and are available as constants in the\n\ +fcntl or termios library modules, using the same names as used in the\n\ +relevant C header files.\n\ \n\ The argument arg is optional, and defaults to 0; it may be an int or a\n\ buffer containing character data (most likely a string or an array). \n\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 17:35:58 2013 From: python-checkins at python.org (ezio.melotti) Date: Thu, 7 Nov 2013 17:35:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogIzE5NDgwOiBIVE1M?= =?utf-8?q?Parser_now_accepts_all_valid_start-tag_names_as_defined_by_the?= Message-ID: <3dFqxB31m9z7Ljj@mail.python.org> http://hg.python.org/cpython/rev/695f988824bb changeset: 86988:695f988824bb branch: 2.7 user: Ezio Melotti date: Thu Nov 07 18:31:36 2013 +0200 summary: #19480: HTMLParser now accepts all valid start-tag names as defined by the HTML5 standard. 
files: Lib/HTMLParser.py | 11 +++++++---- Lib/test/test_htmlparser.py | 8 ++++++-- Misc/NEWS | 5 ++++- 3 files changed, 17 insertions(+), 7 deletions(-) diff --git a/Lib/HTMLParser.py b/Lib/HTMLParser.py --- a/Lib/HTMLParser.py +++ b/Lib/HTMLParser.py @@ -22,9 +22,12 @@ starttagopen = re.compile('<[a-zA-Z]') piclose = re.compile('>') commentclose = re.compile(r'--\s*>') -tagfind = re.compile('([a-zA-Z][-.a-zA-Z0-9:_]*)(?:\s|/(?!>))*') + # see http://www.w3.org/TR/html5/tokenization.html#tag-open-state # and http://www.w3.org/TR/html5/tokenization.html#tag-name-state +# note: if you change tagfind/attrfind remember to update locatestarttagend too +tagfind = re.compile('([a-zA-Z][^\t\n\r\f />\x00]*)(?:\s|/(?!>))*') +# this regex is currently unused, but left for backward compatibility tagfind_tolerant = re.compile('[a-zA-Z][^\t\n\r\f />\x00]*') attrfind = re.compile( @@ -32,7 +35,7 @@ r'(\'[^\']*\'|"[^"]*"|(?![\'"])[^>\s]*))?(?:\s|/(?!>))*') locatestarttagend = re.compile(r""" - <[a-zA-Z][-.a-zA-Z0-9:_]* # tag name + <[a-zA-Z][^\t\n\r\f />\x00]* # tag name (?:[\s/]* # optional whitespace before attribute name (?:(?<=['"\s/])[^\s/>][^\s/=>]* # attribute name (?:\s*=+\s* # value indicator @@ -373,14 +376,14 @@ self.handle_data(rawdata[i:gtpos]) return gtpos # find the name: w3.org/TR/html5/tokenization.html#tag-name-state - namematch = tagfind_tolerant.match(rawdata, i+2) + namematch = tagfind.match(rawdata, i+2) if not namematch: # w3.org/TR/html5/tokenization.html#end-tag-open-state if rawdata[i:i+3] == '': return i+3 else: return self.parse_bogus_comment(i) - tagname = namematch.group().lower() + tagname = namematch.group(1).lower() # consume and ignore other stuff between the name and the > # Note: this is not 100% correct, since we might have things like # , but looking for > after tha name should cover diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -206,8 +206,7 @@ self._run_check("", [('comment', '$')]) self._run_check("", [('data', '", [('starttag', 'a", [('endtag', 'a'", [('data', "", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) + self._run_check("", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) def test_valid_doctypes(self): # from http://www.w3.org/QA/2002/04/valid-dtd-list.html diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,8 +12,11 @@ Library ------- +- Issue #19480: HTMLParser now accepts all valid start-tag names as defined + by the HTML5 standard. + - Issue #17827: Add the missing documentation for ``codecs.encode`` and - ``codecs.decode``. + ``codecs.decode``. - Issue #6157: Fixed Tkinter.Text.debug(). Original patch by Guilherme Polo. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 17:35:59 2013 From: python-checkins at python.org (ezio.melotti) Date: Thu, 7 Nov 2013 17:35:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5NDgwOiBIVE1M?= =?utf-8?q?Parser_now_accepts_all_valid_start-tag_names_as_defined_by_the?= Message-ID: <3dFqxC6dn5z7Lkk@mail.python.org> http://hg.python.org/cpython/rev/9b9d188ed549 changeset: 86989:9b9d188ed549 branch: 3.3 parent: 86985:0e0dded5d616 user: Ezio Melotti date: Thu Nov 07 18:33:24 2013 +0200 summary: #19480: HTMLParser now accepts all valid start-tag names as defined by the HTML5 standard. 
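HTML5's tag-name rule (an ASCII letter followed by any characters other than whitespace, "/", ">" or NUL) is what the relaxed tagfind_tolerant pattern in the diff below encodes, so a name such as a$b now tokenizes as a single start tag. The following is an ad-hoc way to observe the behavior on a build that includes this fix; it is not the project's test code, and the input strings are my own, chosen to match the expected events listed in the new tests.

    from html.parser import HTMLParser

    class Collector(HTMLParser):
        """Record (event, tag) pairs emitted by the tolerant parser."""
        def __init__(self):
            super().__init__()          # default (non-strict) mode
            self.events = []
        def handle_starttag(self, tag, attrs):
            self.events.append(("starttag", tag))
        def handle_startendtag(self, tag, attrs):
            self.events.append(("startendtag", tag))

    p = Collector()
    p.feed("<a$b>")     # "$" is a valid tag-name character in HTML5
    p.feed("<a$b/>")
    p.close()
    print(p.events)     # expected: [('starttag', 'a$b'), ('startendtag', 'a$b')]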
files: Lib/html/parser.py | 23 +++++++++++++---------- Lib/test/test_htmlparser.py | 17 +++++++++++++---- Misc/NEWS | 3 +++ 3 files changed, 29 insertions(+), 14 deletions(-) diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -23,16 +23,16 @@ starttagopen = re.compile('<[a-zA-Z]') piclose = re.compile('>') commentclose = re.compile(r'--\s*>') +# Note: +# 1) the strict attrfind isn't really strict, but we can't make it +# correctly strict without breaking backward compatibility; +# 2) if you change tagfind/attrfind remember to update locatestarttagend too; +# 3) if you change tagfind/attrfind and/or locatestarttagend the parser will +# explode, so don't do it. tagfind = re.compile('([a-zA-Z][-.a-zA-Z0-9:_]*)(?:\s|/(?!>))*') # see http://www.w3.org/TR/html5/tokenization.html#tag-open-state # and http://www.w3.org/TR/html5/tokenization.html#tag-name-state -tagfind_tolerant = re.compile('[a-zA-Z][^\t\n\r\f />\x00]*') -# Note: -# 1) the strict attrfind isn't really strict, but we can't make it -# correctly strict without breaking backward compatibility; -# 2) if you change attrfind remember to update locatestarttagend too; -# 3) if you change attrfind and/or locatestarttagend the parser will -# explode, so don't do it. +tagfind_tolerant = re.compile('([a-zA-Z][^\t\n\r\f />\x00]*)(?:\s|/(?!>))*') attrfind = re.compile( r'\s*([a-zA-Z_][-.:a-zA-Z_0-9]*)(\s*=\s*' r'(\'[^\']*\'|"[^"]*"|[^\s"\'=<>`]*))?') @@ -54,7 +54,7 @@ \s* # trailing whitespace """, re.VERBOSE) locatestarttagend_tolerant = re.compile(r""" - <[a-zA-Z][-.a-zA-Z0-9:_]* # tag name + <[a-zA-Z][^\t\n\r\f />\x00]* # tag name (?:[\s/]* # optional whitespace before attribute name (?:(?<=['"\s/])[^\s/>][^\s/=>]* # attribute name (?:\s*=+\s* # value indicator @@ -328,7 +328,10 @@ # Now parse the data between i+1 and j into a tag and attrs attrs = [] - match = tagfind.match(rawdata, i+1) + if self.strict: + match = tagfind.match(rawdata, i+1) + else: + match = tagfind_tolerant.match(rawdata, i+1) assert match, 'unexpected call to parse_starttag()' k = match.end() self.lasttag = tag = match.group(1).lower() @@ -440,7 +443,7 @@ return i+3 else: return self.parse_bogus_comment(i) - tagname = namematch.group().lower() + tagname = namematch.group(1).lower() # consume and ignore other stuff between the name and the > # Note: this is not 100% correct, since we might have things like # , but looking for > after tha name should cover diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -229,6 +229,11 @@ self._parse_error("'") self._parse_error(">xt'), ('entityref', 'a'), - ('data', '<", [('comment', '$')]) self._run_check("", [('data', '", [('starttag', 'a", [('endtag', 'a'", [('data', "", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) + self._run_check("", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) def test_slashes_in_starttag(self): self._run_check('', [('startendtag', 'a', [('foo', 'var')])]) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #19480: HTMLParser now accepts all valid start-tag names as defined + by the HTML5 standard. + - Issue #6157: Fixed tkinter.Text.debug(). Original patch by Guilherme Polo. 
- Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 17:36:01 2013 From: python-checkins at python.org (ezio.melotti) Date: Thu, 7 Nov 2013 17:36:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogIzE5NDgwOiBtZXJnZSB3aXRoIDMuMy4=?= Message-ID: <3dFqxF3HS4z7LkF@mail.python.org> http://hg.python.org/cpython/rev/7d8a37020db9 changeset: 86990:7d8a37020db9 parent: 86986:ddf6da99b3cd parent: 86989:9b9d188ed549 user: Ezio Melotti date: Thu Nov 07 18:35:27 2013 +0200 summary: #19480: merge with 3.3. files: Lib/html/parser.py | 23 +++++++++++++---------- Lib/test/test_htmlparser.py | 17 +++++++++++++---- Misc/NEWS | 3 +++ 3 files changed, 29 insertions(+), 14 deletions(-) diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -25,16 +25,16 @@ starttagopen = re.compile('<[a-zA-Z]') piclose = re.compile('>') commentclose = re.compile(r'--\s*>') +# Note: +# 1) the strict attrfind isn't really strict, but we can't make it +# correctly strict without breaking backward compatibility; +# 2) if you change tagfind/attrfind remember to update locatestarttagend too; +# 3) if you change tagfind/attrfind and/or locatestarttagend the parser will +# explode, so don't do it. tagfind = re.compile('([a-zA-Z][-.a-zA-Z0-9:_]*)(?:\s|/(?!>))*') # see http://www.w3.org/TR/html5/tokenization.html#tag-open-state # and http://www.w3.org/TR/html5/tokenization.html#tag-name-state -tagfind_tolerant = re.compile('[a-zA-Z][^\t\n\r\f />\x00]*') -# Note: -# 1) the strict attrfind isn't really strict, but we can't make it -# correctly strict without breaking backward compatibility; -# 2) if you change attrfind remember to update locatestarttagend too; -# 3) if you change attrfind and/or locatestarttagend the parser will -# explode, so don't do it. 
+tagfind_tolerant = re.compile('([a-zA-Z][^\t\n\r\f />\x00]*)(?:\s|/(?!>))*') attrfind = re.compile( r'\s*([a-zA-Z_][-.:a-zA-Z_0-9]*)(\s*=\s*' r'(\'[^\']*\'|"[^"]*"|[^\s"\'=<>`]*))?') @@ -56,7 +56,7 @@ \s* # trailing whitespace """, re.VERBOSE) locatestarttagend_tolerant = re.compile(r""" - <[a-zA-Z][-.a-zA-Z0-9:_]* # tag name + <[a-zA-Z][^\t\n\r\f />\x00]* # tag name (?:[\s/]* # optional whitespace before attribute name (?:(?<=['"\s/])[^\s/>][^\s/=>]* # attribute name (?:\s*=+\s* # value indicator @@ -336,7 +336,10 @@ # Now parse the data between i+1 and j into a tag and attrs attrs = [] - match = tagfind.match(rawdata, i+1) + if self.strict: + match = tagfind.match(rawdata, i+1) + else: + match = tagfind_tolerant.match(rawdata, i+1) assert match, 'unexpected call to parse_starttag()' k = match.end() self.lasttag = tag = match.group(1).lower() @@ -448,7 +451,7 @@ return i+3 else: return self.parse_bogus_comment(i) - tagname = namematch.group().lower() + tagname = namematch.group(1).lower() # consume and ignore other stuff between the name and the > # Note: this is not 100% correct, since we might have things like # , but looking for > after tha name should cover diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -231,6 +231,11 @@ self._parse_error("'") self._parse_error(">xt'), ('entityref', 'a'), - ('data', '<", [('comment', '$')]) self._run_check("", [('data', '", [('starttag', 'a", [('endtag', 'a'", [('data', "", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) + self._run_check("", [('starttag', 'a$b', [])]) + self._run_check("", [('startendtag', 'a$b', [])]) def test_slashes_in_starttag(self): self._run_check('', [('startendtag', 'a', [('foo', 'var')])]) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -64,6 +64,9 @@ - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. +- Issue #19480: HTMLParser now accepts all valid start-tag names as defined + by the HTML5 standard. + - Issue #15114: The html.parser module now raises a DeprecationWarning when the strict argument of HTMLParser or the HTMLParser.error method are used. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 18:11:30 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 7 Nov 2013 18:11:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Optimize_BaseSelector=2Emo?= =?utf-8?q?dify=28=29=2E_Patch_by_Arnaud_Faure=2E?= Message-ID: <3dFrkB5J7Mz7Lsg@mail.python.org> http://hg.python.org/cpython/rev/9c976f1b17e9 changeset: 86991:9c976f1b17e9 user: Guido van Rossum date: Thu Nov 07 08:39:28 2013 -0800 summary: Optimize BaseSelector.modify(). Patch by Arnaud Faure. files: Lib/selectors.py | 9 ++++++--- Lib/test/test_selectors.py | 10 ++++++++++ 2 files changed, 16 insertions(+), 3 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -138,11 +138,14 @@ key = self._fd_to_key[_fileobj_to_fd(fileobj)] except KeyError: raise KeyError("{!r} is not registered".format(fileobj)) from None - if events != key.events or data != key.data: - # TODO: If only the data changed, use a shortcut that only - # updates the data. + if events != key.events: self.unregister(fileobj) return self.register(fileobj, events, data) + elif data != key.data: + # Use a shortcut to update the data. 
+ key = key._replace(data=data) + self._fd_to_key[key.fd] = key + return key else: return key diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -6,6 +6,7 @@ from test import support from time import sleep import unittest +import unittest.mock try: from time import monotonic as time except ImportError: @@ -124,6 +125,15 @@ # modify unknown file obj self.assertRaises(KeyError, s.modify, 999999, selectors.EVENT_READ) + # modify use a shortcut + d3 = object() + s.register = unittest.mock.Mock() + s.unregister = unittest.mock.Mock() + + s.modify(rd, selectors.EVENT_READ, d3) + self.assertFalse(s.register.called) + self.assertFalse(s.unregister.called) + def test_close(self): s = self.SELECTOR() self.addCleanup(s.close) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 18:18:53 2013 From: python-checkins at python.org (ezio.melotti) Date: Thu, 7 Nov 2013 18:18:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2317080=3A_improve_error_?= =?utf-8?q?message_of_float/complex_when_the_wrong_type_is_passed=2E?= Message-ID: <3dFrtj26ZKz7Lll@mail.python.org> http://hg.python.org/cpython/rev/a73c47c1d374 changeset: 86992:a73c47c1d374 user: Ezio Melotti date: Thu Nov 07 19:18:34 2013 +0200 summary: #17080: improve error message of float/complex when the wrong type is passed. files: Lib/test/test_complex.py | 1 + Lib/test/test_float.py | 1 + Objects/complexobject.c | 10 ++++++---- Objects/floatobject.c | 5 +++-- 4 files changed, 11 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_complex.py b/Lib/test/test_complex.py --- a/Lib/test/test_complex.py +++ b/Lib/test/test_complex.py @@ -303,6 +303,7 @@ self.assertRaises(TypeError, float, 5+3j) self.assertRaises(ValueError, complex, "") self.assertRaises(TypeError, complex, None) + self.assertRaisesRegex(TypeError, "not 'NoneType'", complex, None) self.assertRaises(ValueError, complex, "\0") self.assertRaises(ValueError, complex, "3\09") self.assertRaises(TypeError, complex, "1", "2") diff --git a/Lib/test/test_float.py b/Lib/test/test_float.py --- a/Lib/test/test_float.py +++ b/Lib/test/test_float.py @@ -41,6 +41,7 @@ self.assertRaises(ValueError, float, "-.") self.assertRaises(ValueError, float, b"-") self.assertRaises(TypeError, float, {}) + self.assertRaisesRegex(TypeError, "not 'dict'", float, {}) # Lone surrogate self.assertRaises(UnicodeEncodeError, float, '\uD8F0') # check that we don't accept alternate exponent markers diff --git a/Objects/complexobject.c b/Objects/complexobject.c --- a/Objects/complexobject.c +++ b/Objects/complexobject.c @@ -773,8 +773,9 @@ goto error; } else if (PyObject_AsCharBuffer(v, &s, &len)) { - PyErr_SetString(PyExc_TypeError, - "complex() argument must be a string or a number"); + PyErr_Format(PyExc_TypeError, + "complex() argument must be a string or a number, not '%.200s'", + Py_TYPE(v)->tp_name); return NULL; } @@ -953,8 +954,9 @@ nbi = i->ob_type->tp_as_number; if (nbr == NULL || nbr->nb_float == NULL || ((i != NULL) && (nbi == NULL || nbi->nb_float == NULL))) { - PyErr_SetString(PyExc_TypeError, - "complex() argument must be a string or a number"); + PyErr_Format(PyExc_TypeError, + "complex() argument must be a string or a number, not '%.200s'", + Py_TYPE(r)->tp_name); if (own_r) { Py_DECREF(r); } diff --git a/Objects/floatobject.c b/Objects/floatobject.c --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -144,8 +144,9 @@ } } else if (PyObject_AsCharBuffer(v, &s, 
&len)) { - PyErr_SetString(PyExc_TypeError, - "float() argument must be a string or a number"); + PyErr_Format(PyExc_TypeError, + "float() argument must be a string or a number, not '%.200s'", + Py_TYPE(v)->tp_name); return NULL; } last = s + len; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 18:25:40 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 7 Nov 2013 18:25:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_redundant_test=5Fse?= =?utf-8?q?lectors=2Epy_from_test=5Fasyncio=2E?= Message-ID: <3dFs2X6qk0z7Ljn@mail.python.org> http://hg.python.org/cpython/rev/42bcf0d3fc64 changeset: 86993:42bcf0d3fc64 user: Guido van Rossum date: Thu Nov 07 09:25:36 2013 -0800 summary: Remove redundant test_selectors.py from test_asyncio. files: Lib/test/test_asyncio/test_selectors.py | 149 ------------ Lib/test/test_asyncio/tests.txt | 1 - 2 files changed, 0 insertions(+), 150 deletions(-) diff --git a/Lib/test/test_asyncio/test_selectors.py b/Lib/test/test_asyncio/test_selectors.py deleted file mode 100644 --- a/Lib/test/test_asyncio/test_selectors.py +++ /dev/null @@ -1,149 +0,0 @@ -"""Tests for selectors.py.""" - -import unittest -import unittest.mock - -from asyncio import selectors - - -class FakeSelector(selectors.BaseSelector): - """Trivial non-abstract subclass of BaseSelector.""" - - def select(self, timeout=None): - raise NotImplementedError - - -class BaseSelectorTests(unittest.TestCase): - - def test_fileobj_to_fd(self): - self.assertEqual(10, selectors._fileobj_to_fd(10)) - - f = unittest.mock.Mock() - f.fileno.return_value = 10 - self.assertEqual(10, selectors._fileobj_to_fd(f)) - - f.fileno.side_effect = AttributeError - self.assertRaises(ValueError, selectors._fileobj_to_fd, f) - - def test_selector_key_repr(self): - key = selectors.SelectorKey(10, 10, selectors.EVENT_READ, None) - self.assertEqual( - "SelectorKey(fileobj=10, fd=10, events=1, data=None)", repr(key)) - - def test_register(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - key = s.register(fobj, selectors.EVENT_READ) - self.assertIsInstance(key, selectors.SelectorKey) - self.assertEqual(key.fd, 10) - self.assertIs(key, s._fd_to_key[10]) - - def test_register_unknown_event(self): - s = FakeSelector() - self.assertRaises(ValueError, s.register, unittest.mock.Mock(), 999999) - - def test_register_already_registered(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - s.register(fobj, selectors.EVENT_READ) - self.assertRaises(KeyError, s.register, fobj, selectors.EVENT_READ) - - def test_unregister(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - s.register(fobj, selectors.EVENT_READ) - s.unregister(fobj) - self.assertFalse(s._fd_to_key) - - def test_unregister_unknown(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - self.assertRaises(KeyError, s.unregister, fobj) - - def test_modify_unknown(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - self.assertRaises(KeyError, s.modify, fobj, 1) - - def test_modify(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - s = FakeSelector() - key = s.register(fobj, selectors.EVENT_READ) - key2 = s.modify(fobj, selectors.EVENT_WRITE) - self.assertNotEqual(key.events, key2.events) - self.assertEqual( - selectors.SelectorKey(fobj, 10, selectors.EVENT_WRITE, None), - 
s.get_key(fobj)) - - def test_modify_data(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - d1 = object() - d2 = object() - - s = FakeSelector() - key = s.register(fobj, selectors.EVENT_READ, d1) - key2 = s.modify(fobj, selectors.EVENT_READ, d2) - self.assertEqual(key.events, key2.events) - self.assertNotEqual(key.data, key2.data) - self.assertEqual( - selectors.SelectorKey(fobj, 10, selectors.EVENT_READ, d2), - s.get_key(fobj)) - - def test_modify_same(self): - fobj = unittest.mock.Mock() - fobj.fileno.return_value = 10 - - data = object() - - s = FakeSelector() - key = s.register(fobj, selectors.EVENT_READ, data) - key2 = s.modify(fobj, selectors.EVENT_READ, data) - self.assertIs(key, key2) - - def test_select(self): - s = FakeSelector() - self.assertRaises(NotImplementedError, s.select) - - def test_close(self): - s = FakeSelector() - s.register(1, selectors.EVENT_READ) - - s.close() - self.assertFalse(s._fd_to_key) - - def test_context_manager(self): - s = FakeSelector() - - with s as sel: - sel.register(1, selectors.EVENT_READ) - - self.assertFalse(s._fd_to_key) - - def test_key_from_fd(self): - s = FakeSelector() - key = s.register(1, selectors.EVENT_READ) - - self.assertIs(key, s._key_from_fd(1)) - self.assertIsNone(s._key_from_fd(10)) - - if hasattr(selectors.DefaultSelector, 'fileno'): - def test_fileno(self): - self.assertIsInstance(selectors.DefaultSelector().fileno(), int) - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/test/test_asyncio/tests.txt b/Lib/test/test_asyncio/tests.txt --- a/Lib/test/test_asyncio/tests.txt +++ b/Lib/test/test_asyncio/tests.txt @@ -5,7 +5,6 @@ test_asyncio.test_proactor_events test_asyncio.test_queues test_asyncio.test_selector_events -test_asyncio.test_selectors test_asyncio.test_streams test_asyncio.test_tasks test_asyncio.test_transports -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 18:47:11 2013 From: python-checkins at python.org (martin.v.loewis) Date: Thu, 7 Nov 2013 18:47:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319514=3A_Deduplic?= =?utf-8?q?ate_some_=5FPy=5FIDENTIFIER_declarations=2E?= Message-ID: <3dFsWM0SMHz7Ltj@mail.python.org> http://hg.python.org/cpython/rev/4a09cc62419b changeset: 86994:4a09cc62419b user: Martin v. L?wis date: Thu Nov 07 18:46:53 2013 +0100 summary: Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. Patch by Andrei Dorian Duma. files: Misc/NEWS | 3 +++ Modules/_bisectmodule.c | 5 ++--- Modules/_datetimemodule.c | 19 +++++-------------- Modules/_sqlite/connection.c | 5 ++--- Objects/typeobject.c | 14 ++++++-------- 5 files changed, 18 insertions(+), 28 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. + Patch by Andrei Dorian Duma. + - Issue #17936: Fix O(n**2) behaviour when adding or removing many subclasses of a given type. 
diff --git a/Modules/_bisectmodule.c b/Modules/_bisectmodule.c --- a/Modules/_bisectmodule.c +++ b/Modules/_bisectmodule.c @@ -6,6 +6,8 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" +_Py_IDENTIFIER(insert); + static Py_ssize_t internal_bisect_right(PyObject *list, PyObject *item, Py_ssize_t lo, Py_ssize_t hi) { @@ -90,8 +92,6 @@ if (PyList_Insert(list, index, item) < 0) return NULL; } else { - _Py_IDENTIFIER(insert); - result = _PyObject_CallMethodId(list, &PyId_insert, "nO", index, item); if (result == NULL) return NULL; @@ -195,7 +195,6 @@ if (PyList_Insert(list, index, item) < 0) return NULL; } else { - _Py_IDENTIFIER(insert); result = _PyObject_CallMethodId(list, &PyId_insert, "nO", index, item); if (result == NULL) return NULL; diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -104,6 +104,11 @@ static PyTypeObject PyDateTime_TZInfoType; static PyTypeObject PyDateTime_TimeZoneType; +_Py_IDENTIFIER(as_integer_ratio); +_Py_IDENTIFIER(fromutc); +_Py_IDENTIFIER(isoformat); +_Py_IDENTIFIER(strftime); + /* --------------------------------------------------------------------------- * Math utilities. */ @@ -1277,8 +1282,6 @@ goto Done; format = PyUnicode_FromString(PyBytes_AS_STRING(newfmt)); if (format != NULL) { - _Py_IDENTIFIER(strftime); - result = _PyObject_CallMethodId(time, &PyId_strftime, "OO", format, timetuple, NULL); Py_DECREF(format); @@ -1566,7 +1569,6 @@ PyObject *result = NULL; PyObject *pyus_in = NULL, *temp, *pyus_out; PyObject *ratio = NULL; - _Py_IDENTIFIER(as_integer_ratio); pyus_in = delta_to_microseconds(delta); if (pyus_in == NULL) @@ -1665,7 +1667,6 @@ PyObject *result = NULL; PyObject *pyus_in = NULL, *temp, *pyus_out; PyObject *ratio = NULL; - _Py_IDENTIFIER(as_integer_ratio); pyus_in = delta_to_microseconds(delta); if (pyus_in == NULL) @@ -2635,8 +2636,6 @@ static PyObject * date_str(PyDateTime_Date *self) { - _Py_IDENTIFIER(isoformat); - return _PyObject_CallMethodId((PyObject *)self, &PyId_isoformat, "()"); } @@ -2676,7 +2675,6 @@ date_format(PyDateTime_Date *self, PyObject *args) { PyObject *format; - _Py_IDENTIFIER(strftime); if (!PyArg_ParseTuple(args, "U:__format__", &format)) return NULL; @@ -3593,8 +3591,6 @@ static PyObject * time_str(PyDateTime_Time *self) { - _Py_IDENTIFIER(isoformat); - return _PyObject_CallMethodId((PyObject *)self, &PyId_isoformat, "()"); } @@ -4207,7 +4203,6 @@ if (self != NULL && tz != Py_None) { /* Convert UTC to tzinfo's zone. */ PyObject *temp = self; - _Py_IDENTIFIER(fromutc); self = _PyObject_CallMethodId(tz, &PyId_fromutc, "O", self); Py_DECREF(temp); @@ -4246,7 +4241,6 @@ if (self != NULL && tzinfo != Py_None) { /* Convert UTC to tzinfo's zone. */ PyObject *temp = self; - _Py_IDENTIFIER(fromutc); self = _PyObject_CallMethodId(tzinfo, &PyId_fromutc, "O", self); Py_DECREF(temp); @@ -4529,8 +4523,6 @@ static PyObject * datetime_str(PyDateTime_DateTime *self) { - _Py_IDENTIFIER(isoformat); - return _PyObject_CallMethodId((PyObject *)self, &PyId_isoformat, "(s)", " "); } @@ -4809,7 +4801,6 @@ PyObject *offset; PyObject *temp; PyObject *tzinfo = Py_None; - _Py_IDENTIFIER(fromutc); static char *keywords[] = {"tz", NULL}; if (! 
PyArg_ParseTupleAndKeywords(args, kw, "|O:astimezone", keywords, diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c --- a/Modules/_sqlite/connection.c +++ b/Modules/_sqlite/connection.c @@ -41,6 +41,8 @@ #endif #endif +_Py_IDENTIFIER(cursor); + static int pysqlite_connection_set_isolation_level(pysqlite_Connection* self, PyObject* isolation_level); static void _pysqlite_drop_unused_cursor_references(pysqlite_Connection* self); @@ -1279,7 +1281,6 @@ PyObject* cursor = 0; PyObject* result = 0; PyObject* method = 0; - _Py_IDENTIFIER(cursor); cursor = _PyObject_CallMethodId((PyObject*)self, &PyId_cursor, ""); if (!cursor) { @@ -1309,7 +1310,6 @@ PyObject* cursor = 0; PyObject* result = 0; PyObject* method = 0; - _Py_IDENTIFIER(cursor); cursor = _PyObject_CallMethodId((PyObject*)self, &PyId_cursor, ""); if (!cursor) { @@ -1339,7 +1339,6 @@ PyObject* cursor = 0; PyObject* result = 0; PyObject* method = 0; - _Py_IDENTIFIER(cursor); cursor = _PyObject_CallMethodId((PyObject*)self, &PyId_cursor, ""); if (!cursor) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -39,16 +39,20 @@ static struct method_cache_entry method_cache[1 << MCACHE_SIZE_EXP]; static unsigned int next_version_tag = 0; +/* alphabetical order */ +_Py_IDENTIFIER(__abstractmethods__); _Py_IDENTIFIER(__class__); +_Py_IDENTIFIER(__delitem__); _Py_IDENTIFIER(__dict__); _Py_IDENTIFIER(__doc__); +_Py_IDENTIFIER(__getattribute__); _Py_IDENTIFIER(__getitem__); -_Py_IDENTIFIER(__getattribute__); _Py_IDENTIFIER(__hash__); +_Py_IDENTIFIER(__len__); _Py_IDENTIFIER(__module__); _Py_IDENTIFIER(__name__); _Py_IDENTIFIER(__new__); -_Py_IDENTIFIER(__abstractmethods__); +_Py_IDENTIFIER(__setitem__); static PyObject * slot_tp_new(PyTypeObject *type, PyObject *args, PyObject *kwds); @@ -5068,7 +5072,6 @@ static Py_ssize_t slot_sq_length(PyObject *self) { - _Py_IDENTIFIER(__len__); PyObject *res = call_method(self, &PyId___len__, "()"); Py_ssize_t len; @@ -5129,8 +5132,6 @@ slot_sq_ass_item(PyObject *self, Py_ssize_t index, PyObject *value) { PyObject *res; - _Py_IDENTIFIER(__delitem__); - _Py_IDENTIFIER(__setitem__); if (value == NULL) res = call_method(self, &PyId___delitem__, "(n)", index); @@ -5180,8 +5181,6 @@ slot_mp_ass_subscript(PyObject *self, PyObject *key, PyObject *value) { PyObject *res; - _Py_IDENTIFIER(__delitem__); - _Py_IDENTIFIER(__setitem__); if (value == NULL) res = call_method(self, &PyId___delitem__, "(O)", key); @@ -5232,7 +5231,6 @@ PyObject *func, *args; int result = -1; int using_len = 0; - _Py_IDENTIFIER(__len__); _Py_IDENTIFIER(__bool__); func = lookup_maybe(self, &PyId___bool__); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 21:51:14 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 21:51:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_=5FPy=5Fnormalize=5Fen?= =?utf-8?q?coding=28=29=3A_ensure_that_buffer_is_big_enough_to_store_=22ut?= =?utf-8?q?f-8=22?= Message-ID: <3dFxbk1dYKz7Lmg@mail.python.org> http://hg.python.org/cpython/rev/99afa4c74436 changeset: 86995:99afa4c74436 user: Victor Stinner date: Thu Nov 07 13:33:36 2013 +0100 summary: Fix _Py_normalize_encoding(): ensure that buffer is big enough to store "utf-8" if the input string is NULL files: Objects/unicodeobject.c | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ 
b/Objects/unicodeobject.c @@ -2983,6 +2983,8 @@ char *l_end; if (encoding == NULL) { + if (lower_len < 6) + return 0; strcpy(lower, "utf-8"); return 1; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 21:51:15 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 21:51:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319514=3A_Add_Andr?= =?utf-8?q?ei_Dorian_Duma_to_Misc/ACKS_for_changeset_4a09cc62419b?= Message-ID: <3dFxbl3VWvz7Lmg@mail.python.org> http://hg.python.org/cpython/rev/cb4c964800af changeset: 86996:cb4c964800af user: Victor Stinner date: Thu Nov 07 21:50:55 2013 +0100 summary: Issue #19514: Add Andrei Dorian Duma to Misc/ACKS for changeset 4a09cc62419b files: Misc/ACKS | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -328,6 +328,7 @@ John DuBois Paul Dubois Jacques Ducasse +Andrei Dorian Duma Graham Dumpleton Quinn Dunkan Robin Dunn -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 22:39:33 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 22:39:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_C_API_doc=3A_try_to_group_?= =?utf-8?q?concrete_objects?= Message-ID: <3dFygT0VBdz7Lyh@mail.python.org> http://hg.python.org/cpython/rev/0c85c233b4c4 changeset: 86997:0c85c233b4c4 user: Victor Stinner date: Thu Nov 07 22:05:48 2013 +0100 summary: C API doc: try to group concrete objects files: Doc/c-api/concrete.rst | 22 +++++++++++++++------- 1 files changed, 15 insertions(+), 7 deletions(-) diff --git a/Doc/c-api/concrete.rst b/Doc/c-api/concrete.rst --- a/Doc/c-api/concrete.rst +++ b/Doc/c-api/concrete.rst @@ -74,26 +74,35 @@ .. _mapobjects: -Mapping Objects -=============== +Container Objects +================= .. index:: object: mapping .. toctree:: dict.rst + set.rst .. _otherobjects: +Function Objects +================ + +.. toctree:: + + function.rst + method.rst + cell.rst + code.rst + + Other Objects ============= .. toctree:: - set.rst - function.rst - method.rst file.rst module.rst iterator.rst @@ -102,7 +111,6 @@ memoryview.rst weakref.rst capsule.rst - cell.rst gen.rst datetime.rst - code.rst + -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 22:39:34 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 22:39:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_frameobject=2Ec=3A_Use_an_?= =?utf-8?q?identifer_instead_of_creating_explicitly_an_interned?= Message-ID: <3dFygV2g2Dz7LyX@mail.python.org> http://hg.python.org/cpython/rev/23f0529a8b2f changeset: 86998:23f0529a8b2f user: Victor Stinner date: Thu Nov 07 22:22:39 2013 +0100 summary: frameobject.c: Use an identifer instead of creating explicitly an interned string for "__builtins__" literal string files: Objects/frameobject.c | 12 +++++------- 1 files changed, 5 insertions(+), 7 deletions(-) diff --git a/Objects/frameobject.c b/Objects/frameobject.c --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -601,13 +601,13 @@ 0, /* tp_dict */ }; -static PyObject *builtin_object; +_Py_IDENTIFIER(__builtins__); int _PyFrame_Init() { - builtin_object = PyUnicode_InternFromString("__builtins__"); - if (builtin_object == NULL) - return 0; + /* Before, PyId___builtins__ was a string created explicitly in + this function. 
Now there is nothing to initialize anymore, but + the function is kept for backward compatibility. */ return 1; } @@ -628,7 +628,7 @@ } #endif if (back == NULL || back->f_globals != globals) { - builtins = PyDict_GetItem(globals, builtin_object); + builtins = _PyDict_GetItemId(globals, &PyId___builtins__); if (builtins) { if (PyModule_Check(builtins)) { builtins = PyModule_GetDict(builtins); @@ -994,8 +994,6 @@ PyFrame_Fini(void) { (void)PyFrame_ClearFreeList(); - Py_XDECREF(builtin_object); - builtin_object = NULL; } /* Print summary info about the state of the optimized allocator */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 23:08:09 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 23:08:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=2C_=2319515?= =?utf-8?q?=3A_remove_shared_identifiers=2C_move_identifiers_where_they?= Message-ID: <3dFzJT2VNQz7Lwn@mail.python.org> http://hg.python.org/cpython/rev/01c4a0af73cf changeset: 86999:01c4a0af73cf user: Victor Stinner date: Thu Nov 07 23:07:29 2013 +0100 summary: Issue #19512, #19515: remove shared identifiers, move identifiers where they are used. Move also _Py_IDENTIFIER() defintions to the top in modified files to remove identifiers duplicated in the same file. files: Include/object.h | 8 ---- Modules/_ctypes/callbacks.c | 2 +- Modules/_cursesmodule.c | 2 +- Modules/_lsprof.c | 2 +- Modules/_threadmodule.c | 4 +- Modules/faulthandler.c | 10 +++-- Modules/main.c | 2 +- Modules/syslogmodule.c | 2 +- Objects/object.c | 13 +++--- Objects/typeobject.c | 7 ++- Python/_warnings.c | 6 ++- Python/bltinmodule.c | 28 ++++++++------- Python/errors.c | 7 ++- Python/import.c | 2 +- Python/pythonrun.c | 45 +++++++++++------------- Python/sysmodule.c | 31 +++++++++------- Python/traceback.c | 11 +++-- 17 files changed, 93 insertions(+), 89 deletions(-) diff --git a/Include/object.h b/Include/object.h --- a/Include/object.h +++ b/Include/object.h @@ -147,14 +147,6 @@ #define _Py_static_string(varname, value) static _Py_Identifier varname = _Py_static_string_init(value) #define _Py_IDENTIFIER(varname) _Py_static_string(PyId_##varname, #varname) -/* Common identifiers (ex: _PyId_path is the string "path") */ -PyAPI_DATA(_Py_Identifier) _PyId_argv; -PyAPI_DATA(_Py_Identifier) _PyId_builtins; -PyAPI_DATA(_Py_Identifier) _PyId_path; -PyAPI_DATA(_Py_Identifier) _PyId_stdin; -PyAPI_DATA(_Py_Identifier) _PyId_stdout; -PyAPI_DATA(_Py_Identifier) _PyId_stderr; - /* Type objects contain a string containing the type name (to help somewhat in debugging), the allocation parameters (see PyObject_New() and diff --git a/Modules/_ctypes/callbacks.c b/Modules/_ctypes/callbacks.c --- a/Modules/_ctypes/callbacks.c +++ b/Modules/_ctypes/callbacks.c @@ -80,7 +80,7 @@ PrintError(char *msg, ...) 
{ char buf[512]; - PyObject *f = _PySys_GetObjectId(&_PyId_stderr); + PyObject *f = PySys_GetObject("stderr"); va_list marker; va_start(marker, msg); diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -2578,7 +2578,7 @@ if (fd == -1) { PyObject* sys_stdout; - sys_stdout = _PySys_GetObjectId(&_PyId_stdout); + sys_stdout = PySys_GetObject("stdout"); if (sys_stdout == NULL || sys_stdout == Py_None) { PyErr_SetString( diff --git a/Modules/_lsprof.c b/Modules/_lsprof.c --- a/Modules/_lsprof.c +++ b/Modules/_lsprof.c @@ -185,7 +185,7 @@ } } if (modname != NULL) { - if (_PyUnicode_CompareWithId(modname, &_PyId_builtins) != 0) { + if (PyUnicode_CompareWithASCIIString(modname, "builtins") != 0) { PyObject *result; result = PyUnicode_FromFormat("<%U.%s>", modname, fn->m_ml->ml_name); diff --git a/Modules/_threadmodule.c b/Modules/_threadmodule.c --- a/Modules/_threadmodule.c +++ b/Modules/_threadmodule.c @@ -17,6 +17,8 @@ static long nb_threads = 0; static PyObject *str_dict; +_Py_IDENTIFIER(stderr); + /* Lock objects */ typedef struct { @@ -1005,7 +1007,7 @@ PySys_WriteStderr( "Unhandled exception in thread started by "); PyErr_Fetch(&exc, &value, &tb); - file = _PySys_GetObjectId(&_PyId_stderr); + file = _PySys_GetObjectId(&PyId_stderr); if (file != NULL && file != Py_None) PyFile_WriteObject(boot->func, file, 0); else diff --git a/Modules/faulthandler.c b/Modules/faulthandler.c --- a/Modules/faulthandler.c +++ b/Modules/faulthandler.c @@ -26,6 +26,11 @@ (anyway, the length is smaller than 30 characters) */ #define PUTS(fd, str) write(fd, str, (int)strlen(str)) +_Py_IDENTIFIER(enable); +_Py_IDENTIFIER(fileno); +_Py_IDENTIFIER(flush); +_Py_IDENTIFIER(stderr); + #ifdef HAVE_SIGACTION typedef struct sigaction _Py_sighandler_t; #else @@ -130,13 +135,11 @@ faulthandler_get_fileno(PyObject *file, int *p_fd) { PyObject *result; - _Py_IDENTIFIER(fileno); - _Py_IDENTIFIER(flush); long fd_long; int fd; if (file == NULL || file == Py_None) { - file = _PySys_GetObjectId(&_PyId_stderr); + file = _PySys_GetObjectId(&PyId_stderr); if (file == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.stderr"); return NULL; @@ -1047,7 +1050,6 @@ faulthandler_env_options(void) { PyObject *xoptions, *key, *module, *res; - _Py_IDENTIFIER(enable); char *p; if (!((p = Py_GETENV("PYTHONFAULTHANDLER")) && *p != '\0')) { diff --git a/Modules/main.c b/Modules/main.c --- a/Modules/main.c +++ b/Modules/main.c @@ -261,7 +261,7 @@ /* argv0 is usable as an import source, so put it in sys.path[0] and import __main__ */ - sys_path = _PySys_GetObjectId(&_PyId_path); + sys_path = PySys_GetObject("path"); if (sys_path == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.path"); goto error; diff --git a/Modules/syslogmodule.c b/Modules/syslogmodule.c --- a/Modules/syslogmodule.c +++ b/Modules/syslogmodule.c @@ -71,7 +71,7 @@ Py_ssize_t argv_len, scriptlen; PyObject *scriptobj; Py_ssize_t slash; - PyObject *argv = _PySys_GetObjectId(&_PyId_argv); + PyObject *argv = PySys_GetObject("argv"); if (argv == NULL) { return(NULL); diff --git a/Objects/object.c b/Objects/object.c --- a/Objects/object.c +++ b/Objects/object.c @@ -8,6 +8,12 @@ extern "C" { #endif +_Py_IDENTIFIER(Py_Repr); +_Py_IDENTIFIER(__bytes__); +_Py_IDENTIFIER(__dir__); +_Py_IDENTIFIER(__isabstractmethod__); +_Py_IDENTIFIER(builtins); + #ifdef Py_REF_DEBUG Py_ssize_t _Py_RefTotal; @@ -560,7 +566,6 @@ PyObject_Bytes(PyObject *v) { PyObject *result, *func; - _Py_IDENTIFIER(__bytes__); 
if (v == NULL) return PyBytes_FromString(""); @@ -949,7 +954,6 @@ { int res; PyObject* isabstract; - _Py_IDENTIFIER(__isabstractmethod__); if (obj == NULL) return 0; @@ -1124,7 +1128,7 @@ { PyObject *mod_name, *mod, *attr; - mod_name = _PyUnicode_FromId(&_PyId_builtins); /* borrowed */ + mod_name = _PyUnicode_FromId(&PyId_builtins); /* borrowed */ if (mod_name == NULL) return NULL; mod = PyImport_Import(mod_name); @@ -1440,7 +1444,6 @@ _dir_object(PyObject *obj) { PyObject *result, *sorted; - _Py_IDENTIFIER(__dir__); PyObject *dirfunc = _PyObject_LookupSpecial(obj, &PyId___dir__); assert(obj); @@ -1973,8 +1976,6 @@ See dictobject.c and listobject.c for examples of use. */ -_Py_IDENTIFIER(Py_Repr); - int Py_ReprEnter(PyObject *obj) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -53,6 +53,7 @@ _Py_IDENTIFIER(__name__); _Py_IDENTIFIER(__new__); _Py_IDENTIFIER(__setitem__); +_Py_IDENTIFIER(builtins); static PyObject * slot_tp_new(PyTypeObject *type, PyObject *args, PyObject *kwds); @@ -366,7 +367,7 @@ if (s != NULL) return PyUnicode_FromStringAndSize( type->tp_name, (Py_ssize_t)(s - type->tp_name)); - name = _PyUnicode_FromId(&_PyId_builtins); + name = _PyUnicode_FromId(&PyId_builtins); Py_XINCREF(name); return name; } @@ -718,7 +719,7 @@ return NULL; } - if (mod != NULL && _PyUnicode_CompareWithId(mod, &_PyId_builtins)) + if (mod != NULL && _PyUnicode_CompareWithId(mod, &PyId_builtins)) rtn = PyUnicode_FromFormat("", mod, name); else rtn = PyUnicode_FromFormat("", type->tp_name); @@ -3189,7 +3190,7 @@ Py_XDECREF(mod); return NULL; } - if (mod != NULL && _PyUnicode_CompareWithId(mod, &_PyId_builtins)) + if (mod != NULL && _PyUnicode_CompareWithId(mod, &PyId_builtins)) rtn = PyUnicode_FromFormat("<%U.%U object at %p>", mod, name, self); else rtn = PyUnicode_FromFormat("<%s object at %p>", diff --git a/Python/_warnings.c b/Python/_warnings.c --- a/Python/_warnings.c +++ b/Python/_warnings.c @@ -13,6 +13,8 @@ static PyObject *_once_registry; /* Dict */ static PyObject *_default_action; /* String */ +_Py_IDENTIFIER(argv); +_Py_IDENTIFIER(stderr); static int check_matched(PyObject *obj, PyObject *arg) @@ -265,7 +267,7 @@ if (name == NULL) /* XXX Can an object lack a '__name__' attribute? 
*/ goto error; - f_stderr = _PySys_GetObjectId(&_PyId_stderr); + f_stderr = _PySys_GetObjectId(&PyId_stderr); if (f_stderr == NULL) { fprintf(stderr, "lost sys.stderr\n"); goto error; @@ -562,7 +564,7 @@ else { *filename = NULL; if (*module != Py_None && PyUnicode_CompareWithASCIIString(*module, "__main__") == 0) { - PyObject *argv = _PySys_GetObjectId(&_PyId_argv); + PyObject *argv = _PySys_GetObjectId(&PyId_argv); /* PyList_Check() is needed because sys.argv is set to None during Python finalization */ if (argv != NULL && PyList_Check(argv) && PyList_Size(argv) > 0) { diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -32,9 +32,19 @@ int Py_HasFileSystemDefaultEncoding = 0; #endif +_Py_IDENTIFIER(__builtins__); +_Py_IDENTIFIER(__dict__); +_Py_IDENTIFIER(__prepare__); +_Py_IDENTIFIER(__round__); +_Py_IDENTIFIER(encoding); +_Py_IDENTIFIER(errors); _Py_IDENTIFIER(fileno); _Py_IDENTIFIER(flush); -_Py_IDENTIFIER(__builtins__); +_Py_IDENTIFIER(metaclass); +_Py_IDENTIFIER(sort); +_Py_IDENTIFIER(stdin); +_Py_IDENTIFIER(stdout); +_Py_IDENTIFIER(stderr); static PyObject * builtin___build_class__(PyObject *self, PyObject *args, PyObject *kwds) @@ -43,8 +53,6 @@ PyObject *cls = NULL; Py_ssize_t nargs; int isclass; - _Py_IDENTIFIER(__prepare__); - _Py_IDENTIFIER(metaclass); assert(args != NULL); if (!PyTuple_Check(args)) { @@ -1547,7 +1555,6 @@ static PyObject *dummy_args; PyObject *sep = NULL, *end = NULL, *file = NULL, *flush = NULL; int i, err; - _Py_IDENTIFIER(flush); if (dummy_args == NULL && !(dummy_args = PyTuple_New(0))) return NULL; @@ -1555,7 +1562,7 @@ kwlist, &sep, &end, &file, &flush)) return NULL; if (file == NULL || file == Py_None) { - file = _PySys_GetObjectId(&_PyId_stdout); + file = _PySys_GetObjectId(&PyId_stdout); if (file == NULL) { PyErr_SetString(PyExc_RuntimeError, "lost sys.stdout"); return NULL; @@ -1640,9 +1647,9 @@ builtin_input(PyObject *self, PyObject *args) { PyObject *promptarg = NULL; - PyObject *fin = _PySys_GetObjectId(&_PyId_stdin); - PyObject *fout = _PySys_GetObjectId(&_PyId_stdout); - PyObject *ferr = _PySys_GetObjectId(&_PyId_stderr); + PyObject *fin = _PySys_GetObjectId(&PyId_stdin); + PyObject *fout = _PySys_GetObjectId(&PyId_stdout); + PyObject *ferr = _PySys_GetObjectId(&PyId_stderr); PyObject *tmp; long fd; int tty; @@ -1713,8 +1720,6 @@ char *stdin_encoding_str, *stdin_errors_str; PyObject *result; size_t len; - _Py_IDENTIFIER(encoding); - _Py_IDENTIFIER(errors); stdin_encoding = _PyObject_GetAttrId(fin, &PyId_encoding); stdin_errors = _PyObject_GetAttrId(fin, &PyId_errors); @@ -1843,7 +1848,6 @@ PyObject *ndigits = NULL; static char *kwlist[] = {"number", "ndigits", 0}; PyObject *number, *round, *result; - _Py_IDENTIFIER(__round__); if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:round", kwlist, &number, &ndigits)) @@ -1886,7 +1890,6 @@ PyObject *callable; static char *kwlist[] = {"iterable", "key", "reverse", 0}; int reverse; - _Py_IDENTIFIER(sort); /* args 1-3 should match listsort in Objects/listobject.c */ if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oi:sorted", @@ -1939,7 +1942,6 @@ Py_INCREF(d); } else { - _Py_IDENTIFIER(__dict__); d = _PyObject_GetAttrId(v, &PyId___dict__); if (d == NULL) { PyErr_SetString(PyExc_TypeError, diff --git a/Python/errors.c b/Python/errors.c --- a/Python/errors.c +++ b/Python/errors.c @@ -20,6 +20,9 @@ extern "C" { #endif +_Py_IDENTIFIER(builtins); +_Py_IDENTIFIER(stderr); + void PyErr_Restore(PyObject *type, PyObject *value, PyObject *traceback) @@ 
-844,7 +847,7 @@ PyErr_Fetch(&t, &v, &tb); - f = _PySys_GetObjectId(&_PyId_stderr); + f = _PySys_GetObjectId(&PyId_stderr); if (f == NULL || f == Py_None) goto done; @@ -878,7 +881,7 @@ goto done; } else { - if (_PyUnicode_CompareWithId(moduleName, &_PyId_builtins) != 0) { + if (_PyUnicode_CompareWithId(moduleName, &PyId_builtins) != 0) { if (PyFile_WriteObject(moduleName, f, Py_PRINT_RAW) < 0) goto done; if (PyFile_WriteString(".", f) < 0) diff --git a/Python/import.c b/Python/import.c --- a/Python/import.c +++ b/Python/import.c @@ -310,7 +310,7 @@ /* XXX Perhaps these precautions are obsolete. Who knows? */ - value = _PyDict_GetItemId(modules, &_PyId_builtins); + value = PyDict_GetItemString(modules, "builtins"); if (value != NULL && PyModule_Check(value)) { dict = PyModule_GetDict(value); if (Py_VerboseFlag) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -35,21 +35,16 @@ #define PATH_MAX MAXPATHLEN #endif -/* Common identifiers */ -_Py_Identifier _PyId_argv = _Py_static_string_init("argv"); -_Py_Identifier _PyId_builtins = _Py_static_string_init("builtins"); -_Py_Identifier _PyId_path = _Py_static_string_init("path"); -_Py_Identifier _PyId_stdin = _Py_static_string_init("stdin"); -_Py_Identifier _PyId_stdout = _Py_static_string_init("stdout"); -_Py_Identifier _PyId_stderr = _Py_static_string_init("stderr"); - -/* local identifiers */ +_Py_IDENTIFIER(builtins); _Py_IDENTIFIER(excepthook); +_Py_IDENTIFIER(last_traceback); +_Py_IDENTIFIER(last_type); +_Py_IDENTIFIER(last_value); _Py_IDENTIFIER(ps1); _Py_IDENTIFIER(ps2); -_Py_IDENTIFIER(last_type); -_Py_IDENTIFIER(last_value); -_Py_IDENTIFIER(last_traceback); +_Py_IDENTIFIER(stdin); +_Py_IDENTIFIER(stdout); +_Py_IDENTIFIER(stderr); _Py_static_string(PyId_string, ""); #ifdef Py_REF_DEBUG @@ -429,7 +424,7 @@ pstderr = PyFile_NewStdPrinter(fileno(stderr)); if (pstderr == NULL) Py_FatalError("Py_Initialize: can't set preliminary stderr"); - _PySys_SetObjectId(&_PyId_stderr, pstderr); + _PySys_SetObjectId(&PyId_stderr, pstderr); PySys_SetObject("__stderr__", pstderr); Py_DECREF(pstderr); @@ -514,8 +509,8 @@ static void flush_std_files(void) { - PyObject *fout = _PySys_GetObjectId(&_PyId_stdout); - PyObject *ferr = _PySys_GetObjectId(&_PyId_stderr); + PyObject *fout = _PySys_GetObjectId(&PyId_stdout); + PyObject *ferr = _PySys_GetObjectId(&PyId_stderr); PyObject *tmp; _Py_IDENTIFIER(flush); @@ -793,7 +788,7 @@ pstderr = PyFile_NewStdPrinter(fileno(stderr)); if (pstderr == NULL) Py_FatalError("Py_Initialize: can't set preliminary stderr"); - _PySys_SetObjectId(&_PyId_stderr, pstderr); + _PySys_SetObjectId(&PyId_stderr, pstderr); PySys_SetObject("__stderr__", pstderr); Py_DECREF(pstderr); @@ -1187,7 +1182,7 @@ goto error; } /* if (fd < 0) */ PySys_SetObject("__stdin__", std); - _PySys_SetObjectId(&_PyId_stdin, std); + _PySys_SetObjectId(&PyId_stdin, std); Py_DECREF(std); /* Set sys.stdout */ @@ -1202,7 +1197,7 @@ goto error; } /* if (fd < 0) */ PySys_SetObject("__stdout__", std); - _PySys_SetObjectId(&_PyId_stdout, std); + _PySys_SetObjectId(&PyId_stdout, std); Py_DECREF(std); #if 1 /* Disable this if you have trouble debugging bootstrap stuff */ @@ -1236,7 +1231,7 @@ Py_DECREF(std); goto error; } - if (_PySys_SetObjectId(&_PyId_stderr, std) < 0) { + if (_PySys_SetObjectId(&PyId_stderr, std) < 0) { Py_DECREF(std); goto error; } @@ -1368,7 +1363,7 @@ if (fp == stdin) { /* Fetch encoding from sys.stdin if possible. 
*/ - v = _PySys_GetObjectId(&_PyId_stdin); + v = _PySys_GetObjectId(&PyId_stdin); if (v && v != Py_None) { oenc = _PyObject_GetAttrId(v, &PyId_encoding); if (oenc) @@ -1774,7 +1769,7 @@ if (PyLong_Check(value)) exitcode = (int)PyLong_AsLong(value); else { - PyObject *sys_stderr = _PySys_GetObjectId(&_PyId_stderr); + PyObject *sys_stderr = _PySys_GetObjectId(&PyId_stderr); if (sys_stderr != NULL && sys_stderr != Py_None) { PyFile_WriteObject(value, sys_stderr, Py_PRINT_RAW); } else { @@ -1938,7 +1933,7 @@ err = PyFile_WriteString("", f); } else { - if (_PyUnicode_CompareWithId(moduleName, &_PyId_builtins) != 0) + if (_PyUnicode_CompareWithId(moduleName, &PyId_builtins) != 0) { err = PyFile_WriteObject(moduleName, f, Py_PRINT_RAW); err += PyFile_WriteString(".", f); @@ -2033,7 +2028,7 @@ PyErr_Display(PyObject *exception, PyObject *value, PyObject *tb) { PyObject *seen; - PyObject *f = _PySys_GetObjectId(&_PyId_stderr); + PyObject *f = _PySys_GetObjectId(&PyId_stderr); if (PyExceptionInstance_Check(value) && tb != NULL && PyTraceBack_Check(tb)) { /* Put the traceback on the exception, otherwise it won't get @@ -2130,7 +2125,7 @@ /* Save the current exception */ PyErr_Fetch(&type, &value, &traceback); - f = _PySys_GetObjectId(&_PyId_stderr); + f = _PySys_GetObjectId(&PyId_stderr); if (f != NULL) { r = _PyObject_CallMethodId(f, &PyId_flush, ""); if (r) @@ -2138,7 +2133,7 @@ else PyErr_Clear(); } - f = _PySys_GetObjectId(&_PyId_stdout); + f = _PySys_GetObjectId(&PyId_stdout); if (f != NULL) { r = _PyObject_CallMethodId(f, &PyId_flush, ""); if (r) diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -41,6 +41,16 @@ #include #endif +_Py_IDENTIFIER(_); +_Py_IDENTIFIER(__sizeof__); +_Py_IDENTIFIER(buffer); +_Py_IDENTIFIER(builtins); +_Py_IDENTIFIER(encoding); +_Py_IDENTIFIER(path); +_Py_IDENTIFIER(stdout); +_Py_IDENTIFIER(stderr); +_Py_IDENTIFIER(write); + PyObject * _PySys_GetObjectId(_Py_Identifier *key) { @@ -104,8 +114,6 @@ PyObject *encoded, *escaped_str, *repr_str, *buffer, *result; char *stdout_encoding_str; int ret; - _Py_IDENTIFIER(encoding); - _Py_IDENTIFIER(buffer); stdout_encoding = _PyObject_GetAttrId(outf, &PyId_encoding); if (stdout_encoding == NULL) @@ -126,7 +134,6 @@ buffer = _PyObject_GetAttrId(outf, &PyId_buffer); if (buffer) { - _Py_IDENTIFIER(write); result = _PyObject_CallMethodId(buffer, &PyId_write, "(O)", encoded); Py_DECREF(buffer); Py_DECREF(encoded); @@ -165,8 +172,6 @@ PyObject *builtins; static PyObject *newline = NULL; int err; - _Py_IDENTIFIER(_); - _Py_IDENTIFIER(builtins); builtins = _PyDict_GetItemId(modules, &PyId_builtins); if (builtins == NULL) { @@ -183,7 +188,7 @@ } if (_PyObject_SetAttrId(builtins, &PyId__, Py_None) != 0) return NULL; - outf = _PySys_GetObjectId(&_PyId_stdout); + outf = _PySys_GetObjectId(&PyId_stdout); if (outf == NULL || outf == Py_None) { PyErr_SetString(PyExc_RuntimeError, "lost sys.stdout"); return NULL; @@ -854,7 +859,6 @@ static char *kwlist[] = {"object", "default", 0}; PyObject *o, *dflt = NULL; PyObject *method; - _Py_IDENTIFIER(__sizeof__); if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:getsizeof", kwlist, &o, &dflt)) @@ -1825,7 +1829,7 @@ PyObject *v; if ((v = makepathobject(path, DELIM)) == NULL) Py_FatalError("can't create sys.path"); - if (_PySys_SetObjectId(&_PyId_path, v) != 0) + if (_PySys_SetObjectId(&PyId_path, v) != 0) Py_FatalError("can't assign sys.path"); Py_DECREF(v); } @@ -1894,7 +1898,7 @@ wchar_t fullpath[MAX_PATH]; #endif - path = 
_PySys_GetObjectId(&_PyId_path); + path = _PySys_GetObjectId(&PyId_path); if (path == NULL) return; @@ -2004,7 +2008,6 @@ { PyObject *writer = NULL, *args = NULL, *result = NULL; int err; - _Py_IDENTIFIER(write); if (file == NULL) return -1; @@ -2109,7 +2112,7 @@ va_list va; va_start(va, format); - sys_write(&_PyId_stdout, stdout, format, va); + sys_write(&PyId_stdout, stdout, format, va); va_end(va); } @@ -2119,7 +2122,7 @@ va_list va; va_start(va, format); - sys_write(&_PyId_stderr, stderr, format, va); + sys_write(&PyId_stderr, stderr, format, va); va_end(va); } @@ -2151,7 +2154,7 @@ va_list va; va_start(va, format); - sys_format(&_PyId_stdout, stdout, format, va); + sys_format(&PyId_stdout, stdout, format, va); va_end(va); } @@ -2161,6 +2164,6 @@ va_list va; va_start(va, format); - sys_format(&_PyId_stderr, stderr, format, va); + sys_format(&PyId_stderr, stderr, format, va); va_end(va); } diff --git a/Python/traceback.c b/Python/traceback.c --- a/Python/traceback.c +++ b/Python/traceback.c @@ -21,6 +21,11 @@ /* Function from Parser/tokenizer.c */ extern char * PyTokenizer_FindEncodingFilename(int, PyObject *); +_Py_IDENTIFIER(TextIOWrapper); +_Py_IDENTIFIER(close); +_Py_IDENTIFIER(open); +_Py_IDENTIFIER(path); + static PyObject * tb_dir(PyTracebackObject *self) { @@ -152,7 +157,6 @@ const char* filepath; Py_ssize_t len; PyObject* result; - _Py_IDENTIFIER(open); filebytes = PyUnicode_EncodeFSDefault(filename); if (filebytes == NULL) { @@ -169,7 +173,7 @@ tail++; taillen = strlen(tail); - syspath = _PySys_GetObjectId(&_PyId_path); + syspath = _PySys_GetObjectId(&PyId_path); if (syspath == NULL || !PyList_Check(syspath)) goto error; npath = PyList_Size(syspath); @@ -232,9 +236,6 @@ char buf[MAXPATHLEN+1]; int kind; void *data; - _Py_IDENTIFIER(close); - _Py_IDENTIFIER(open); - _Py_IDENTIFIER(TextIOWrapper); /* open the file */ if (filename == NULL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 7 23:12:44 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 7 Nov 2013 23:12:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=5FPy=5Fnormalize=5Fencodi?= =?utf-8?q?ng=28=29=3A_explain_how_the_value_6_was_computed?= Message-ID: <3dFzPm24klz7LjT@mail.python.org> http://hg.python.org/cpython/rev/9c929b9e0a2a changeset: 87000:9c929b9e0a2a user: Victor Stinner date: Thu Nov 07 23:12:23 2013 +0100 summary: _Py_normalize_encoding(): explain how the value 6 was computed files: Objects/unicodeobject.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -2983,6 +2983,7 @@ char *l_end; if (encoding == NULL) { + /* 6 == strlen("utf-8") + 1 */ if (lower_len < 6) return 0; strcpy(lower, "utf-8"); -- Repository URL: http://hg.python.org/cpython From ncoghlan at gmail.com Fri Nov 8 00:16:13 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 09:16:13 +1000 Subject: [Python-checkins] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: <3dFFXL6Cxgz7LsB@mail.python.org> References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: New C APIs should either be documented or have an underscore prefix. Also, if they're part of the stable ABI, they need a version guard. Wishlist item: an automated ABI checker that can diff the exported symbols against a reference list (Could ctypes or cffi be used for that?) 
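A rough sketch of the kind of checker wished for here, using ctypes as suggested (an illustration, not an existing tool; 'stable_abi_symbols.txt' is a hypothetical reference file with one expected symbol name per line):

import ctypes

def missing_symbols(reference_file):
    # ctypes.pythonapi exposes the C symbols of the running interpreter, so
    # every name in the reference list can be probed directly.  This only
    # catches removals; spotting unexpected additions would still need a
    # full symbol dump (nm, objdump or dumpbin) to diff against the list.
    missing = []
    with open(reference_file) as f:
        for line in f:
            name = line.strip()
            if not name or name.startswith('#'):
                continue
            if not hasattr(ctypes.pythonapi, name):  # lookup fails if absent
                missing.append(name)
    return missing

if __name__ == '__main__':
    for name in missing_symbols('stable_abi_symbols.txt'):
        print('no longer exported:', name)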
As long as this kind of thing involves manual review, we're going to keep making similar mistakes :P That said, a potentially simpler first step would be a list of common mistakes to check for/questions to ask during reviews as part of the devguide. Cheers, Nick. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Nov 8 00:18:45 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 09:18:45 +1000 Subject: [Python-checkins] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: <3dFG004BDdz7Lpy@mail.python.org> References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: On 7 Nov 2013 04:06, "victor.stinner" wrote: > > http://hg.python.org/cpython/rev/69071054b42f > changeset: 86968:69071054b42f > user: Victor Stinner > date: Wed Nov 06 18:58:22 2013 +0100 > summary: > Issue #19512: Add a new _PyDict_DelItemId() function, similar to > PyDict_DelItemString() but using an identifier for the key > > files: > Include/dictobject.h | 1 + > Objects/dictobject.c | 9 +++++++++ > 2 files changed, 10 insertions(+), 0 deletions(-) > > > diff --git a/Include/dictobject.h b/Include/dictobject.h > --- a/Include/dictobject.h > +++ b/Include/dictobject.h > @@ -109,6 +109,7 @@ > PyAPI_FUNC(int) PyDict_SetItemString(PyObject *dp, const char *key, PyObject *item); > PyAPI_FUNC(int) _PyDict_SetItemId(PyObject *dp, struct _Py_Identifier *key, PyObject *item); > PyAPI_FUNC(int) PyDict_DelItemString(PyObject *dp, const char *key); > +PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); As a private API, this shouldn't be part of the stable ABI. Cheers, Nick. > > #ifndef Py_LIMITED_API > int _PyObjectDict_SetItem(PyTypeObject *tp, PyObject **dictptr, PyObject *name, PyObject *value); > diff --git a/Objects/dictobject.c b/Objects/dictobject.c > --- a/Objects/dictobject.c > +++ b/Objects/dictobject.c > @@ -2736,6 +2736,15 @@ > } > > int > +_PyDict_DelItemId(PyObject *v, _Py_Identifier *key) > +{ > + PyObject *kv = _PyUnicode_FromId(key); /* borrowed */ > + if (kv == NULL) > + return -1; > + return PyDict_DelItem(v, kv); > +} > + > +int > PyDict_DelItemString(PyObject *v, const char *key) > { > PyObject *kv; > > -- > Repository URL: http://hg.python.org/cpython > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Nov 8 00:50:35 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 09:50:35 +1000 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: On 8 Nov 2013 09:45, "Victor Stinner" wrote: > > Hi, > > [PyRun_InteractiveOneObject()] > > 2013/11/8 Nick Coghlan : > > New C APIs should either be documented or have an underscore prefix. > > I created the issue #19518 to add documentation (but also to add even > more functions :-)). > > > Also, if they're part of the stable ABI, they need a version guard. > > As most PyRun_xxx() functions, the function is declared in a "#ifndef > Py_LIMITED_API" block. It means that it is not part of the stable ABI, > is that correct? Yeah, that's fine - I'm just replying on my phone at the moment, so checking the broader file context is a pain :) Cheers, Nick. 
> > > Wishlist item: an automated ABI checker that can diff the exported symbols > > against a reference list (Could ctypes or cffi be used for that?) > > Yeah, it would be nice to have a tool to list changes on the stable ABI :-) > > * * * > > For a previous issue, I also added various other functions like > PyParser_ASTFromStringObject() or PyParser_ASTFromFileObject(). As > PyRun_InteractiveOneObject(), they are declared in a "#ifndef > Py_LIMITED_API" block in pythonrun.h. These new functions are not > documented, because PyParser_AST*() functions are not documented. > > http://bugs.python.org/issue11619 > http://hg.python.org/cpython/rev/df2fdd42b375 > > The patch parser_unicode.patch was attached to the issue during 2 > years. First I didn't want to commit it becaues it added too many new > functions. But I pushed by change because some Windows users like > using the full Unicode range in the name of their scripts and modules! > > If it's not too late, I would appreciate a review of the stable ABI of > this changeset. > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From python-checkins at python.org Fri Nov 8 01:02:31 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 01:02:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_fsco?= =?utf-8?q?nvert=5Fstrdup=28=29=2C_raise_a_MemoryError_on_PyMem=5FMalloc?= =?utf-8?b?KCk=?= Message-ID: <3dG1rR0VWyz7Lkt@mail.python.org> http://hg.python.org/cpython/rev/f2cd38795931 changeset: 87001:f2cd38795931 user: Victor Stinner date: Thu Nov 07 23:56:10 2013 +0100 summary: Issue #19437: Fix fsconvert_strdup(), raise a MemoryError on PyMem_Malloc() failure files: Modules/posixmodule.c | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -5053,8 +5053,10 @@ return 0; size = PyBytes_GET_SIZE(bytes); *out = PyMem_Malloc(size+1); - if (!*out) + if (!*out) { + PyErr_NoMemory(); return 0; + } memcpy(*out, PyBytes_AsString(bytes), size+1); Py_DECREF(bytes); return 1; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 01:02:32 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 01:02:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_=5Fi?= =?utf-8?b?by5fSU9CYXNlLmNsb3NlKCksIGhhbmRsZSBfUHlPYmplY3RfU2V0QXR0cklk?= =?utf-8?q?=28=29_failure?= Message-ID: <3dG1rS2Qdxz7LmP@mail.python.org> http://hg.python.org/cpython/rev/f88c6417b9f6 changeset: 87002:f88c6417b9f6 user: Victor Stinner date: Fri Nov 08 00:29:41 2013 +0100 summary: Issue #19437: Fix _io._IOBase.close(), handle _PyObject_SetAttrId() failure files: Modules/_io/iobase.c | 11 ++++++++--- 1 files changed, 8 insertions(+), 3 deletions(-) diff --git a/Modules/_io/iobase.c b/Modules/_io/iobase.c --- a/Modules/_io/iobase.c +++ b/Modules/_io/iobase.c @@ -186,11 +186,16 @@ Py_RETURN_NONE; res = PyObject_CallMethodObjArgs(self, _PyIO_str_flush, NULL); - _PyObject_SetAttrId(self, &PyId___IOBase_closed, Py_True); - if (res == NULL) { + + if (_PyObject_SetAttrId(self, &PyId___IOBase_closed, Py_True) < 0) { + Py_XDECREF(res); return NULL; } - 
Py_XDECREF(res); + + if (res == NULL) + return NULL; + + Py_DECREF(res); Py_RETURN_NONE; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 01:02:33 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 01:02:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_date?= =?utf-8?q?time=5Fsubtract=28=29=2C_handle_new=5Fdelta=28=29_failure?= Message-ID: <3dG1rT4KZMz7Lmy@mail.python.org> http://hg.python.org/cpython/rev/0f48843652b1 changeset: 87003:0f48843652b1 user: Victor Stinner date: Fri Nov 08 00:50:58 2013 +0100 summary: Issue #19437: Fix datetime_subtract(), handle new_delta() failure files: Modules/_datetimemodule.c | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -4462,6 +4462,9 @@ delta_us = DATE_GET_MICROSECOND(left) - DATE_GET_MICROSECOND(right); result = new_delta(delta_d, delta_s, delta_us, 1); + if (result == NULL) + return NULL; + if (offdiff != NULL) { PyObject *temp = result; result = delta_subtract(result, offdiff); -- Repository URL: http://hg.python.org/cpython From ericsnowcurrently at gmail.com Fri Nov 8 01:15:35 2013 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Thu, 7 Nov 2013 17:15:35 -0700 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: On Thu, Nov 7, 2013 at 4:55 PM, Victor Stinner wrote: > About the 72523 functions PyRun_xxx(), I don't understand something. > The PyRun_FileEx() is documented in the Python "very high" C API to > use Python. But this function is not part of the stable ABI. So there > is no "very high" function in the stable ABI to use Python? My understanding is that if you are using the PyRun_* API (embedding?), you aren't concerned with maintaining binary compatibility across Python versions. If you are writing an extension module, you probably aren't using PyRun_*. > > Another question: it's not documented if a function is part or not > part of the stable ABI. So as an user of the API, it is hard to check > if a function is part of the stable ABI or not. The best we have is probably PEP 384. I've been meaning to work on the C-API docs for a while and add a concise reference page that would include a summary of the stable API. Alas, other things have taken precedence. 
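A heuristic sketch for the "hard to check" problem described above: scan a CPython header and report whether a declaration sits inside an "#ifndef Py_LIMITED_API" block. Illustration only, it tracks plain #if/#endif nesting and ignores the more elaborate version guards such as "#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03040000":

import re

def outside_limited_api(header_path, name):
    """Return True if name first occurs inside an #ifndef Py_LIMITED_API
    block, i.e. is probably excluded from the stable ABI."""
    depth = 0       # current preprocessor #if nesting depth
    guarded = []    # depths at which an "#ifndef Py_LIMITED_API" was opened
    pattern = re.compile(r'\b%s\b' % re.escape(name))
    with open(header_path) as f:
        for line in f:
            stripped = line.strip()
            if stripped.startswith('#ifndef') and 'Py_LIMITED_API' in stripped:
                depth += 1
                guarded.append(depth)
            elif stripped.startswith('#if'):     # #if, #ifdef, other #ifndef
                depth += 1
            elif stripped.startswith('#endif'):
                if guarded and guarded[-1] == depth:
                    guarded.pop()
                depth -= 1
            elif pattern.search(line):
                return bool(guarded)
    return None     # name not found in this header

# e.g. outside_limited_api('Include/dictobject.h', '_PyDict_DelItemId')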
-eric From solipsis at pitrou.net Fri Nov 8 04:50:00 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 08 Nov 2013 04:50:00 +0100 Subject: [Python-checkins] Daily reference leaks (0f48843652b1): sum=0 Message-ID: results for 0f48843652b1 on branch "default" -------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogT8DSYn', '-x'] From ncoghlan at gmail.com Fri Nov 8 10:25:35 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 19:25:35 +1000 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: On 8 Nov 2013 09:48, "Victor Stinner" wrote: > > 2013/11/8 Nick Coghlan : > >> http://hg.python.org/cpython/rev/69071054b42f > >> changeset: 86968:69071054b42f > >> user: Victor Stinner > >> date: Wed Nov 06 18:58:22 2013 +0100 > >> summary: > >> Issue #19512: Add a new _PyDict_DelItemId() function, similar to > >> PyDict_DelItemString() but using an identifier for the key > >> ... > > > > As a private API, this shouldn't be part of the stable ABI. > > When I don't know if a function should be made public or not, I > compare with other functions. > > In Python 3.3, _PyDict_GetItemIdWithError(), _PyDict_GetItemId() and > _PyDict_SetItemId() are part of the stable ABI if I read correctly > dictobject.h. _PyObject_GetAttrId() is also part of the stable ABI. > Was it a mistake, or did I misunderstand how stable functions are > declared? Likely a mistake - the stable ABI is hard to review properly (since it can depend on non local preprocessor checks, so a mistake may not be obvious in a diff), we don't currently have a systematic approach to handling changes and there's no automated test to catch inadvertent additions or (worse) removals :( This may be a good thing for us to look at more generally when things settle down a bit after the beta 1 feature freeze. Cheers, Nick. > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Nov 8 10:35:05 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 19:35:05 +1000 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: On 8 Nov 2013 10:15, "Eric Snow" wrote: > > On Thu, Nov 7, 2013 at 4:55 PM, Victor Stinner wrote: > > About the 72523 functions PyRun_xxx(), I don't understand something. > > The PyRun_FileEx() is documented in the Python "very high" C API to > > use Python. But this function is not part of the stable ABI. So there > > is no "very high" function in the stable ABI to use Python? > > My understanding is that if you are using the PyRun_* API > (embedding?), you aren't concerned with maintaining binary > compatibility across Python versions. If you are writing an extension > module, you probably aren't using PyRun_*. > > > > > Another question: it's not documented if a function is part or not > > part of the stable ABI. So as an user of the API, it is hard to check > > if a function is part of the stable ABI or not. 
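One possible shape for the automated check discussed in this thread (a sketch, not an existing test): diff the symbols actually exported by libpython against a checked-in reference list, so that accidental additions or removals fail a test run instead of depending on code review. It assumes a POSIX nm on the path; the library name and 'stable_abi_reference.txt' are placeholders to adjust for the build under test.

import subprocess
import unittest

LIBRARY = 'libpython3.4m.so.1.0'          # placeholder: the library under test
REFERENCE = 'stable_abi_reference.txt'    # placeholder: checked-in symbol list

def exported_symbols(library):
    out = subprocess.check_output(['nm', '-D', '--defined-only', library],
                                  universal_newlines=True)
    # nm lines look like "000000000012abc0 T PyDict_GetItem"
    return {line.split()[-1] for line in out.splitlines() if line.strip()}

class TestStableABI(unittest.TestCase):
    def test_exports_match_reference(self):
        with open(REFERENCE) as f:
            expected = {line.strip() for line in f if line.strip()}
        actual = {name for name in exported_symbols(LIBRARY)
                  if name.startswith(('Py', '_Py'))}
        self.assertEqual(set(), actual - expected, 'unexpected additions')
        self.assertEqual(set(), expected - actual, 'unexpected removals')

if __name__ == '__main__':
    unittest.main()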
> > The best we have is probably PEP 384. I've been meaning to work on > the C-API docs for a while and add a concise reference page that would > include a summary of the stable API. Alas, other things have taken > precedence. Give it six months or so and I'll likely once again be looking for people to help out with the "make embedding CPython less painful and arcane" project that is PEP 432. There's a *lot* we can do in that space, along with the PEP 451 based extension loading enhancements. Just throwin' it out there ;) Cheers, Nick. > > -eric -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Nov 8 13:55:17 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Nov 2013 22:55:17 +1000 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: On 8 Nov 2013 22:03, "Thomas Heller" wrote: > > Am 08.11.2013 12:19, schrieb Victor Stinner: > >> 2013/11/8 Nick Coghlan : >>>> >>>> In Python 3.3, _PyDict_GetItemIdWithError(), _PyDict_GetItemId() and >>>> _PyDict_SetItemId() are part of the stable ABI if I read correctly >>>> dictobject.h. _PyObject_GetAttrId() is also part of the stable ABI. >>>> Was it a mistake, or did I misunderstand how stable functions are >>>> declared? >>> >>> >>> Likely a mistake - the stable ABI is hard to review properly (since it can >>> depend on non local preprocessor checks, so a mistake may not be obvious in >>> a diff), we don't currently have a systematic approach to handling changes >>> and there's no automated test to catch inadvertent additions or (worse) >>> removals :( >> >> >> Would it be possible to remove them from the stable ABI in Python 3.4? >> They are marked as private using the "_Py" prefix... > > > I may be confusing API and ABI (see my other message), but adding to > or removing functions from the stable ABI seems to be a very serious > mistake, IMO - private or not. Unless my understanding of the word > 'stable' is wrong... Yeah, we can add things to the stable ABI with an appropriate version guard, but we won't remove even private APIs if they were previously published in 3.3. The main thing I get out of this is that we need to figure out a way to test it automatically - the nature of the problem means that code review is an inherently unreliable mechanism for spotting mistakes. Cheers, Nick. > > >>> This may be a good thing for us to look at more generally when things settle >>> down a bit after the beta 1 feature freeze. >> >> >> I created the following issue to not forget it: >> http://bugs.python.org/issue19526 > > > Thomas > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From python-checkins at python.org Fri Nov 8 14:07:49 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 14:07:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319512=2C_=2319526?= =?utf-8?q?=3A_Exclude_the_new_=5FPyDict=5FDelItemId=28=29_function_from_t?= =?utf-8?q?he?= Message-ID: <3dGMGY5Xztz7Llm@mail.python.org> http://hg.python.org/cpython/rev/bf9c77bac36d changeset: 87004:bf9c77bac36d user: Victor Stinner date: Fri Nov 08 14:07:11 2013 +0100 summary: Issue #19512, #19526: Exclude the new _PyDict_DelItemId() function from the stable ABI files: Include/dictobject.h | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Include/dictobject.h b/Include/dictobject.h --- a/Include/dictobject.h +++ b/Include/dictobject.h @@ -109,12 +109,13 @@ PyAPI_FUNC(int) PyDict_SetItemString(PyObject *dp, const char *key, PyObject *item); PyAPI_FUNC(int) _PyDict_SetItemId(PyObject *dp, struct _Py_Identifier *key, PyObject *item); PyAPI_FUNC(int) PyDict_DelItemString(PyObject *dp, const char *key); -PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); #ifndef Py_LIMITED_API +PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); +PyAPI_FUNC(void) _PyDict_DebugMallocStats(FILE *out); + int _PyObjectDict_SetItem(PyTypeObject *tp, PyObject **dictptr, PyObject *name, PyObject *value); PyObject *_PyDict_LoadGlobal(PyDictObject *, PyDictObject *, PyObject *); -PyAPI_FUNC(void) _PyDict_DebugMallocStats(FILE *out); #endif #ifdef __cplusplus -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 16:19:30 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 16:19:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Touch-up_for_PEP_451?= Message-ID: <3dGQBV6VPHz7LmB@mail.python.org> http://hg.python.org/peps/rev/20ba4aeb71b6 changeset: 5255:20ba4aeb71b6 user: Brett Cannon date: Fri Nov 08 10:19:24 2013 -0500 summary: Touch-up for PEP 451 files: pep-0451.txt | 41 ++++++++++++++++++++++----------------- 1 files changed, 23 insertions(+), 18 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -3,6 +3,7 @@ Version: $Revision$ Last-Modified: $Date$ Author: Eric Snow +BDFL-Delegate: Brett Cannon , Nick Coghlan Discussions-To: import-sig at python.org Status: Draft Type: Standards Track @@ -67,12 +68,12 @@ Right now loaders (via load_module()) are responsible for certain boilerplate, import-related operations. These are: -1. perform some (module-related) validation; -2. create the module object; -3. set import-related attributes on the module; -4. "register" the module to sys.modules; -5. exec the module; -6. clean up in the event of failure while loading the module. +1. Perform some (module-related) validation +2. Create the module object +3. Set import-related attributes on the module +4. "Register" the module to sys.modules +5. Exec the module +6. Clean up in the event of failure while loading the module This all takes place during the import system's call to Loader.load_module(). @@ -170,7 +171,7 @@ As more developers come to understand and customize the import system, any weaknesses in the finder and loader APIs will be more impactful. So the sooner we can address any such weaknesses the import system, the -better...and there are a couple we can take care of with this proposal. +better...and there are a couple we hope to take care of with this proposal. 
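For readers skimming the diff, the six load_module() steps listed above correspond to boilerplate along the lines of the following sketch. The loader class, its _can_handle() and _get_code() helpers, and the module names are hypothetical; the point is only to show what PEP 451 lifts out of individual loaders.

    import sys
    import types

    class OldStyleLoader:
        """Hypothetical pre-PEP 451 loader repeating the usual boilerplate."""

        def load_module(self, fullname):
            # 1. module-related validation (hypothetical helper)
            if not self._can_handle(fullname):
                raise ImportError('cannot load {!r}'.format(fullname))
            # 2. and 4. create the module object and register it in sys.modules
            module = sys.modules.setdefault(fullname, types.ModuleType(fullname))
            # 3. set import-related attributes
            module.__loader__ = self
            module.__package__ = fullname.rpartition('.')[0]
            try:
                # 5. exec the module's code in its namespace (hypothetical helper)
                exec(self._get_code(fullname), module.__dict__)
            except BaseException:
                # 6. clean up in the event of failure while loading
                del sys.modules[fullname]
                raise
            return module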
Firstly, any time the import system needs to save information about a module we end up with more attributes on module objects that are @@ -184,7 +185,7 @@ The `finder`_ and `loader`_ sections above detail current responsibility of both. Notably, loaders are not required to provide any of the -functionality of their load_module() through other methods. Thus, +functionality of their load_module() method through other methods. Thus, though the import-related information about a module is likely available without loading the module, it is not otherwise exposed. @@ -284,9 +285,9 @@ For finders: -* importlib.abc.MetaPathFinder.find_spec(name, path) and - importlib.abc.PathEntryFinder.find_spec(name) will return a module - spec to use during import. +* importlib.abc.MetaPathFinder.find_spec(name, path, target) and + importlib.abc.PathEntryFinder.find_spec(name, target) will return a + module spec to use during import. For loaders: @@ -453,10 +454,12 @@ _init_module_attrs(spec, module) if spec.loader is None and spec.submodule_search_locations is not None: - # namespace package + # Namespace package sys.modules[spec.name] = module elif not hasattr(spec.loader, 'exec_module'): - module = spec.loader.load_module(spec.name) + spec.loader.load_module(spec.name) + # __loader__ and __package__ would be explicitly set here for + # backwards-compatibility. else: sys.modules[spec.name] = module try: @@ -502,13 +505,15 @@ _RELOADING[name] = module try: if spec.loader is None: - # namespace loader + # Namespace loader _init_module_attrs(spec, module) return module if spec.parent and spec.parent not in sys.modules: raise ImportError _init_module_attrs(spec, module) + # Ignoring backwards-compatibility call to load_module() + # for simplicity. spec.loader.exec_module(module) return sys.modules[name] finally: @@ -690,7 +695,7 @@ Finders ------- -Finders are still responsible for identifying, an typically creating, +Finders are still responsible for identifying, and typically creating, the loader that should be used to load a module. That loader will now be stored in the module spec returned by find_spec() rather than returned directly. As is currently the case without the PEP, if a @@ -809,7 +814,7 @@ However, if it exists on a loader it will be used exclusively. Loader.init_module_attr() method, added prior to Python 3.4's -release , will be removed in favor of the same method on ModuleSpec. +release, will be removed in favor of the same method on ModuleSpec. However, InspectLoader.is_package() will not be deprecated even though the same information is found on ModuleSpec. ModuleSpec @@ -832,11 +837,11 @@ series. * The spec for the ``__main__`` module will reflect how the interpreter was started. For instance, with ``-m`` the spec's name will be that - of the run module, while ``__main__.__name__`` will still be + of the module used, while ``__main__.__name__`` will still be "__main__". * We will add importlib.find_spec() to mirror importlib.find_loader() (which becomes deprecated). -* importlib.reload() is changed to use ModuleSpec.load(). +* importlib.reload() is changed to use ModuleSpec. * importlib.reload() will now make use of the per-module import lock. 
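As a rough sketch of the spec-based workflow described in this diff: the find_spec() helper mentioned above shipped as importlib.util.find_spec(), and Python 3.5 later added module_from_spec(). The stdlib module email.message is used purely as a convenient example.

    import importlib.util

    spec = importlib.util.find_spec('email.message')
    if spec is None:
        raise ImportError('email.message not found')

    print(spec.name)                        # 'email.message'
    print(spec.parent)                      # 'email'
    print(spec.origin)                      # path of the source file
    print(spec.submodule_search_locations)  # None, since it is not a package

    # Loading via the spec instead of calling loader.load_module() directly:
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    print(module.Message)                   # the class defined by email.message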
-- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 8 16:25:21 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 16:25:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454?= Message-ID: <3dGQKF3MD4z7Lkf@mail.python.org> http://hg.python.org/peps/rev/9ea9c6ac801d changeset: 5256:9ea9c6ac801d user: Victor Stinner date: Fri Nov 08 16:25:13 2013 +0100 summary: PEP 454 * Add Trace, Traceback and Frame classes: generated on-demand read-only views of traces * Add a new StatisticDiff class and add new Snapshot.compare_to() method. The method return a list of StatisticDiff instances and replaces the compare_to parameter of Snapshot.statistics() * Remove Statistic.size_diff and Statistic.count_diff attributes * Remove get_traces() files: pep-0454.txt | 216 ++++++++++++++++++++++++++------------- 1 files changed, 144 insertions(+), 72 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -161,13 +161,20 @@ ``take_snapshot()`` function: - Take a snapshot of traces of memory blocks allocated by Python using - the ``get_traces()`` function. Return a new ``Snapshot`` instance. + Take a snapshot of traces of memory blocks allocated by Python. + Return a new ``Snapshot`` instance. + + The snapshot does not include memory blocks allocated before the + ``tracemalloc`` module started to trace memory allocations nor + memory blocks ignored by filters (see ``get_filters()``). + + Tracebacks of traces are limited to ``traceback_limit`` frames. Use + ``set_traceback_limit()`` to store more frames. The ``tracemalloc`` module must be tracing memory allocations to take a snapshot, see the the ``start()`` function. - See also ``get_traces()`` and ``get_object_traceback()`` functions. + See also the ``get_object_traceback()`` function. Trace functions @@ -175,31 +182,15 @@ When Python allocates a memory block, ``tracemalloc`` attachs a "trace" to the memory block to store its size in bytes and the traceback where the -allocation occurred. - -The following functions give access to these traces. A trace is a ``(size: int, -traceback)`` tuple. *size* is the size of the memory block in bytes. -*traceback* is a tuple of frames sorted from the most recent to the oldest -frame, limited to ``get_traceback_limit()`` frames. A frame is -a ``(filename: str, lineno: int)`` tuple. - -A traceback contains at least ``1`` frame. If the ``tracemalloc`` module -failed to get a frame, the ``""`` filename and the line number ``0`` -are used. If it failed to get the traceback or if the traceback limit is ``0``, -the traceback is ``(('', 0),)``. - -Example of a trace: ``(32, (('x.py', 7), ('x.py', 11)))``. The memory block -has a size of 32 bytes and was allocated at ``x.py:7``, line called from line -``x.py:11``. +allocation occurred. See the ``Trace`` class. ``get_object_traceback(obj)`` function: Get the traceback where the Python object *obj* was allocated. - Return a tuple of ``(filename: str, lineno: int)`` tuples. - - Return ``None`` if the ``tracemalloc`` module is not tracing memory - allocations or did not trace the allocation of the object. + Return a ``Traceback`` instance, or ``None`` if the ``tracemalloc`` + module is not tracing memory allocations or did not trace the + allocation of the object. See also ``gc.get_referrers()`` and ``sys.getsizeof()`` functions. @@ -214,30 +205,6 @@ Use the ``set_traceback_limit()`` function to change the limit. 
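A short sketch of the get_object_traceback() usage described above; start(), stop() and get_object_traceback() appear both in this draft and in the tracemalloc module that shipped with Python 3.4.

    import tracemalloc

    tracemalloc.start()              # hook the Python memory allocators

    data = [dict(x=i) for i in range(1000)]

    tb = tracemalloc.get_object_traceback(data[0])
    if tb is not None:               # None if the allocation was not traced
        for frame in tb:             # Frame objects carry filename and lineno
            print(frame.filename, frame.lineno)

    tracemalloc.stop()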
-``get_traces()`` function: - - Get traces of memory blocks allocated by Python. Return a list of - ``(size: int, traceback: tuple)`` tuples. *traceback* is a tuple of - ``(filename: str, lineno: int)`` tuples. - - The list of traces does not include memory blocks allocated before - the ``tracemalloc`` module started to trace memory allocations nor - memory blocks ignored by filters (see ``get_filters()``). - - The list has an undefined order. Take a snapshot using - ``take_snapshot()`` and use the ``Snapshot.statistics()`` method to - get a sorted list of statistics. - - Tracebacks of traces are limited to ``traceback_limit`` frames. Use - ``set_traceback_limit()`` to store more frames. - - Return an empty list if the ``tracemalloc`` module is not tracing - memory allocations. - - See also ``take_snapshot()`` and ``get_object_traceback()`` - functions. - - ``set_traceback_limit(nframe: int)`` function: Set the maximum number of frames stored in the traceback of a trace. @@ -351,10 +318,28 @@ See the ``get_traceback_limit()`` function. +Frame +----- + +``Frame`` class: + + Frame of a traceback. + + The ``Traceback`` class is a sequence of ``Frame`` instances. + +``filename`` attribute: + + Filename (``str``). + +``lineno`` attribute: + + Line number (``int``). + + Snapshot -------- -``Snapshot(timestamp: datetime.datetime, traceback_limit: int, traces: dict=None)`` class: +``Snapshot`` class: Snapshot of traces of memory blocks allocated by Python. @@ -362,14 +347,28 @@ ``apply_filters(filters)`` method: - Apply filters on the ``traces`` dictionary, *filters* is a list of - ``Filter`` instances. Return a new ``Snapshot`` instance with the - filtered traces. + Create a new ``Snapshot`` instance with the filtered ``traces``, + *filters* is a list of ``Filter`` instances. If *filters* is an empty list, return a new ``Snapshot`` instance with a copy of the traces. +``compare_to(old_snapshot: Snapshot, group_by: str, cumulative: bool=False)`` method: + + Compute the differences with an old snapshot *old_snapshot*. Get + statistics as a sorted list of ``StatisticDiff`` instances, grouped + by *group_by*. + + See the ``statistics()`` method for *group_by* and *cumulative* + parameters. + + The result is sorted from the biggest to the smallest by: absolute + value of ``StatisticDiff.size_diff``, ``StatisticDiff.size``, + absolute value of ``StatisticDiff.count_diff``, ``Statistic.count`` + and then by ``StatisticDiff.traceback``. + + ``dump(filename)`` method: Write the snapshot into a file. @@ -384,7 +383,7 @@ See also ``dump()``. -``statistics(group_by: str, cumulative: bool=False, compare_to=None)`` method: +``statistics(group_by: str, cumulative: bool=False)`` method: Get statistics as a sorted list of ``Statistic`` instances, grouped by *group_by*: @@ -403,15 +402,9 @@ ``'filename'`` and ``'lineno'`` with ``traceback_limit`` greater than ``1``. - If *compare_to* is set to a previous ``Snapshot`` instance, compute - the differences betwen the two snapshots. Otherwise, - ``Statistic.size_diff`` and ``Statistic.count_diff`` attributes are - set to zero. - - The result is sorted from the biggest to the smallest by: absolute - value of ``Statistic.size_diff``, ``Statistic.size``, absolute value - of ``Statistic.count_diff``, ``Statistic.count`` and then by - ``Statistic.key``. + The result is sorted from the biggest to the smallest by: + ``Statistic.size``, ``Statistic.count`` and then by + ``Statistic.traceback``. 
``traceback_limit`` attribute: @@ -421,8 +414,11 @@ ``traces`` attribute: - Traces of all memory blocks allocated by Python: see the - ``get_traces()`` function. + Traces of all memory blocks allocated by Python: sequence of + ``Trace`` instances. + + The sequence has an undefined order. Use the + ``Snapshot.statistics()`` method to get a sorted list of statistics. ``timestamp`` attribute: @@ -433,7 +429,7 @@ Statistic --------- -``Statistic(key, size, size_diff, count, count_diff)`` class: +``Statistic`` class: Statistic on memory allocations. @@ -442,25 +438,101 @@ ``Snapshot.statistics()`` returns a list of ``Statistic`` instances. -``traceback`` attribute: - - Tuple of ``(filename: str, lineno: int)`` tuples. + See also the ``StatisticDiff`` class. ``count`` attribute: Number of memory blocks (``int``). -``count_diff`` attribute: - - Difference of number of memory blocks (``int``). - ``size`` attribute: Total size of memory blocks in bytes (``int``). +``traceback`` attribute: + + Traceback where the memory block was allocated, ``Traceback`` + instance. + + +StatisticDiff +------------- + +``StatisticDiff`` class: + + Statistic difference on memory allocations between an old and a new + ``Snapshot`` instance. + + ``Snapshot.compare_to()`` returns a list of ``StatisticDiff`` + instances. See also the ``Statistic`` class. + +``count`` attribute: + + Number of memory blocks in the new snapshot (``int``): ``0`` if the + memory blocks have been released in the new snapshot. + +``count_diff`` attribute: + + Difference of number of memory blocks between the old and the new + snapshots (``int``): ``0`` if the memory blocks have been allocated + in the new snapshot. + +``size`` attribute: + + Total size of memory blocks in bytes in the new snapshot (``int``): + ``0`` if the memory blocks have been released in the new snapshot. + ``size_diff`` attribute: - Difference of total size of memory blocks in bytes (``int``). + Difference of total size of memory blocks in bytes between the old + and the new snapshots (``int``): ``0`` if the memory blocks have + been allocated in the new snapshot. + +``traceback`` attribute: + + Traceback where the memory blocks were allocated, ``Traceback`` + instance. + + +Trace +----- + +``Trace`` class: + + Trace of a memory block. + + The ``Snapshot.traces`` attribute is a sequence of ``Trace`` + instances. + +``size`` attribute: + + Size of the memory block in bytes (``int``). + +``traceback`` attribute: + + Traceback where the memory block was allocated, ``Traceback`` + instance. + + +Traceback +--------- + +``Traceback`` class: + + Sequence of ``Frame`` instances sorted from the most recent frame to + the oldest frame. + + A traceback contains at least ``1`` frame. If the ``tracemalloc`` + module failed to get a frame, the filename ``""`` and the + line number ``0`` are used. If it failed to get the traceback or if + the traceback limit is ``0``, the traceback only contains a frame: + filename ``''`` and line number ``0``. + + When a snapshot is taken, tracebacks of traces are limited to + ``get_traceback_limit()`` frames. See the ``take_snapshot()`` + function. + + The ``Trace.traceback`` attribute is an instance of ``Traceback`` + instance. 
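Putting the classes described above together, a sketch of the snapshot workflow (take_snapshot(), Snapshot.statistics() and Snapshot.compare_to()) could look like this; the bytearray allocations exist only so the second snapshot has something to report.

    import tracemalloc

    tracemalloc.start()

    snapshot1 = tracemalloc.take_snapshot()
    buffers = [bytearray(1024) for _ in range(100)]    # allocate something
    snapshot2 = tracemalloc.take_snapshot()

    # Biggest allocations in the new snapshot, grouped by source line.
    for stat in snapshot2.statistics('lineno')[:5]:
        print(stat.traceback, stat.size, stat.count)

    # What changed between the two snapshots.
    for diff in snapshot2.compare_to(snapshot1, 'lineno')[:5]:
        print(diff.traceback, diff.size_diff, diff.count_diff)

    tracemalloc.stop()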
Rejected Alternatives -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 8 17:10:49 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 17:10:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316803=3A_test=2Et?= =?utf-8?q?est=5Fimportlib=2Efrozen_now_runs_both_frozen_and_source_code?= Message-ID: <3dGRKj2mp7z7M1r@mail.python.org> http://hg.python.org/cpython/rev/b26e6e3e8037 changeset: 87005:b26e6e3e8037 user: Brett Cannon date: Fri Nov 08 11:10:41 2013 -0500 summary: Issue #16803: test.test_importlib.frozen now runs both frozen and source code files: Lib/test/test_importlib/frozen/test_finder.py | 16 +- Lib/test/test_importlib/frozen/test_loader.py | 48 +++++---- 2 files changed, 33 insertions(+), 31 deletions(-) diff --git a/Lib/test/test_importlib/frozen/test_finder.py b/Lib/test/test_importlib/frozen/test_finder.py --- a/Lib/test/test_importlib/frozen/test_finder.py +++ b/Lib/test/test_importlib/frozen/test_finder.py @@ -1,15 +1,17 @@ -from importlib import machinery from .. import abc +from .. import util + +machinery = util.import_importlib('importlib.machinery') import unittest -class FinderTests(unittest.TestCase, abc.FinderTests): +class FinderTests(abc.FinderTests): """Test finding frozen modules.""" def find(self, name, path=None): - finder = machinery.FrozenImporter + finder = self.machinery.FrozenImporter return finder.find_module(name, path) def test_module(self): @@ -37,11 +39,9 @@ loader = self.find('') self.assertIsNone(loader) - -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) +Frozen_FinderTests, Source_FinderTests = util.test_both(FinderTests, + machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/frozen/test_loader.py b/Lib/test/test_importlib/frozen/test_loader.py --- a/Lib/test/test_importlib/frozen/test_loader.py +++ b/Lib/test/test_importlib/frozen/test_loader.py @@ -1,20 +1,21 @@ from .. import abc from .. 
import util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import unittest from test.support import captured_stdout import types -class LoaderTests(unittest.TestCase, abc.LoaderTests): +class LoaderTests(abc.LoaderTests): def test_module(self): with util.uncache('__hello__'), captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__hello__') + module = self.machinery.FrozenImporter.load_module('__hello__') check = {'__name__': '__hello__', '__package__': '', - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): self.assertEqual(getattr(module, attr), value) @@ -23,11 +24,11 @@ def test_package(self): with util.uncache('__phello__'), captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__phello__') + module = self.machinery.FrozenImporter.load_module('__phello__') check = {'__name__': '__phello__', '__package__': '__phello__', '__path__': [], - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): attr_value = getattr(module, attr) @@ -40,10 +41,10 @@ def test_lacking_parent(self): with util.uncache('__phello__', '__phello__.spam'), \ captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__phello__.spam') + module = self.machinery.FrozenImporter.load_module('__phello__.spam') check = {'__name__': '__phello__.spam', '__package__': '__phello__', - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): attr_value = getattr(module, attr) @@ -55,15 +56,15 @@ def test_module_reuse(self): with util.uncache('__hello__'), captured_stdout() as stdout: - module1 = machinery.FrozenImporter.load_module('__hello__') - module2 = machinery.FrozenImporter.load_module('__hello__') + module1 = self.machinery.FrozenImporter.load_module('__hello__') + module2 = self.machinery.FrozenImporter.load_module('__hello__') self.assertIs(module1, module2) self.assertEqual(stdout.getvalue(), 'Hello world!\nHello world!\n') def test_module_repr(self): with util.uncache('__hello__'), captured_stdout(): - module = machinery.FrozenImporter.load_module('__hello__') + module = self.machinery.FrozenImporter.load_module('__hello__') self.assertEqual(repr(module), "") @@ -72,13 +73,16 @@ pass def test_unloadable(self): - assert machinery.FrozenImporter.find_module('_not_real') is None + assert self.machinery.FrozenImporter.find_module('_not_real') is None with self.assertRaises(ImportError) as cm: - machinery.FrozenImporter.load_module('_not_real') + self.machinery.FrozenImporter.load_module('_not_real') self.assertEqual(cm.exception.name, '_not_real') +Frozen_LoaderTests, Source_LoaderTests = util.test_both(LoaderTests, + machinery=machinery) -class InspectLoaderTests(unittest.TestCase): + +class InspectLoaderTests: """Tests for the InspectLoader methods for FrozenImporter.""" @@ -86,7 +90,7 @@ # Make sure that the code object is good. name = '__hello__' with captured_stdout() as stdout: - code = machinery.FrozenImporter.get_code(name) + code = self.machinery.FrozenImporter.get_code(name) mod = types.ModuleType(name) exec(code, mod.__dict__) self.assertTrue(hasattr(mod, 'initialized')) @@ -94,7 +98,7 @@ def test_get_source(self): # Should always return None. 
- result = machinery.FrozenImporter.get_source('__hello__') + result = self.machinery.FrozenImporter.get_source('__hello__') self.assertIsNone(result) def test_is_package(self): @@ -102,22 +106,20 @@ test_for = (('__hello__', False), ('__phello__', True), ('__phello__.spam', False)) for name, is_package in test_for: - result = machinery.FrozenImporter.is_package(name) + result = self.machinery.FrozenImporter.is_package(name) self.assertEqual(bool(result), is_package) def test_failure(self): # Raise ImportError for modules that are not frozen. for meth_name in ('get_code', 'get_source', 'is_package'): - method = getattr(machinery.FrozenImporter, meth_name) + method = getattr(self.machinery.FrozenImporter, meth_name) with self.assertRaises(ImportError) as cm: method('importlib') self.assertEqual(cm.exception.name, 'importlib') - -def test_main(): - from test.support import run_unittest - run_unittest(LoaderTests, InspectLoaderTests) +Frozen_ILTests, Source_ILTests = util.test_both(InspectLoaderTests, + machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 18:06:52 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 8 Nov 2013 18:06:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_remove_Snapshot=2E?= =?utf-8?q?timestamp_attribute?= Message-ID: <3dGSZN5Dh6z7M28@mail.python.org> http://hg.python.org/peps/rev/16ef9242c00e changeset: 5257:16ef9242c00e user: Victor Stinner date: Fri Nov 08 18:06:37 2013 +0100 summary: PEP 454: remove Snapshot.timestamp attribute files: pep-0454.txt | 51 ++++++++++++++++++--------------------- 1 files changed, 23 insertions(+), 28 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -153,7 +153,7 @@ The function uninstalls hooks on Python memory allocators, so the overhead of the module becomes null. - Call ``get_traces()`` or ``take_snapshot()`` function to get traces + Call ``take_snapshot()`` function to take a snapshot of traces before clearing them. See also ``start()`` and ``is_tracing()`` functions. @@ -168,8 +168,8 @@ ``tracemalloc`` module started to trace memory allocations nor memory blocks ignored by filters (see ``get_filters()``). - Tracebacks of traces are limited to ``traceback_limit`` frames. Use - ``set_traceback_limit()`` to store more frames. + Tracebacks of traces are limited to ``get_traceback_limit()`` + frames. Use ``set_traceback_limit()`` to store more frames. The ``tracemalloc`` module must be tracing memory allocations to take a snapshot, see the the ``start()`` function. @@ -214,14 +214,14 @@ function to measure the overhead and the ``add_filter()`` function to select which memory allocations are traced. - If the limit is set to ``0`` frame, the traceback ``(('', - 0),)`` will be used for all traces. + If the limit is set to ``0`` frame, a traceback with a frame will be + used for all traces: filename ``''`` at line number ``0``. Use the ``get_traceback_limit()`` function to get the current limit. - The ``PYTHONTRACEMALLOC`` environment variable and the ``-X`` - ``tracemalloc=NFRAME`` command line option can be used to set a - limit at startup. + The ``PYTHONTRACEMALLOC`` environment variable + (``PYTHONTRACEMALLOC=NFRAME``) and the ``-X`` ``tracemalloc=NFRAME`` + command line option can be used to set the limit at startup. Filter functions @@ -306,7 +306,7 @@ ``filename_pattern`` attribute: - Filename pattern (``str``) of the filter. 
+ Filename pattern of the filter (``str``). ``all_frames`` attribute: @@ -315,7 +315,8 @@ checked. This attribute is ignored if the traceback limit is less than ``2``. - See the ``get_traceback_limit()`` function. + See the ``get_traceback_limit()`` function and + ``Snapshot.traceback_limit`` attribute. Frame @@ -343,12 +344,12 @@ Snapshot of traces of memory blocks allocated by Python. - The ``take_snapshot()`` function create a snapshot instance. + The ``take_snapshot()`` function creates a snapshot instance. ``apply_filters(filters)`` method: - Create a new ``Snapshot`` instance with the filtered ``traces``, - *filters* is a list of ``Filter`` instances. + Create a new ``Snapshot`` instance with the filtered ``traces`` + sequence, *filters* is a list of ``Filter`` instances. If *filters* is an empty list, return a new ``Snapshot`` instance with a copy of the traces. @@ -356,9 +357,8 @@ ``compare_to(old_snapshot: Snapshot, group_by: str, cumulative: bool=False)`` method: - Compute the differences with an old snapshot *old_snapshot*. Get - statistics as a sorted list of ``StatisticDiff`` instances, grouped - by *group_by*. + Compute the differences with an old snapshot. Get statistics as a + sorted list of ``StatisticDiff`` instances grouped by *group_by*. See the ``statistics()`` method for *group_by* and *cumulative* parameters. @@ -385,7 +385,7 @@ ``statistics(group_by: str, cumulative: bool=False)`` method: - Get statistics as a sorted list of ``Statistic`` instances, grouped + Get statistics as a sorted list of ``Statistic`` instances grouped by *group_by*: ===================== ======================== @@ -398,9 +398,9 @@ If *cumulative* is ``True``, cumulate size and count of memory blocks of all frames of the traceback of a trace, not only the most - recent frame. The cumulative mode can only be used with key types - ``'filename'`` and ``'lineno'`` with ``traceback_limit`` greater - than ``1``. + recent frame. The cumulative mode can only be used with *group_by* + equals to ``'filename'`` and ``'lineno'`` and ``traceback_limit`` + greater than ``1``. The result is sorted from the biggest to the smallest by: ``Statistic.size``, ``Statistic.count`` and then by @@ -409,8 +409,8 @@ ``traceback_limit`` attribute: - Maximum number of frames stored in the traceback of ``traces``: see - the ``get_traceback_limit()`` function. + Maximum number of frames stored in the traceback of ``traces``: + result of the ``get_traceback_limit()`` when the snapshot was taken. ``traces`` attribute: @@ -420,11 +420,6 @@ The sequence has an undefined order. Use the ``Snapshot.statistics()`` method to get a sorted list of statistics. -``timestamp`` attribute: - - Creation date and time of the snapshot, ``datetime.datetime`` - instance. - Statistic --------- @@ -525,7 +520,7 @@ module failed to get a frame, the filename ``""`` and the line number ``0`` are used. If it failed to get the traceback or if the traceback limit is ``0``, the traceback only contains a frame: - filename ``''`` and line number ``0``. + filename ``''`` at line number ``0``. When a snapshot is taken, tracebacks of traces are limited to ``get_traceback_limit()`` frames. 
See the ``take_snapshot()`` -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 8 19:35:08 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 19:35:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316803=3A_test=2Et?= =?utf-8?q?est=5Fimportlib=2Eimport=5F_now_tests_frozen_and_source_code?= Message-ID: <3dGVXD2pMtz7LsX@mail.python.org> http://hg.python.org/cpython/rev/6c998d72553a changeset: 87006:6c998d72553a user: Brett Cannon date: Fri Nov 08 13:34:59 2013 -0500 summary: Issue #16803: test.test_importlib.import_ now tests frozen and source code files: Lib/test/test_importlib/__main__.py | 2 - Lib/test/test_importlib/import_/test___loader__.py | 10 +- Lib/test/test_importlib/import_/test___package__.py | 33 +++--- Lib/test/test_importlib/import_/test_api.py | 31 ++--- Lib/test/test_importlib/import_/test_caching.py | 30 +++-- Lib/test/test_importlib/import_/test_fromlist.py | 33 +++-- Lib/test/test_importlib/import_/test_meta_path.py | 24 ++-- Lib/test/test_importlib/import_/test_packages.py | 30 ++--- Lib/test/test_importlib/import_/test_path.py | 36 +++--- Lib/test/test_importlib/import_/test_relative_imports.py | 51 ++++----- Lib/test/test_importlib/import_/util.py | 20 +-- 11 files changed, 146 insertions(+), 154 deletions(-) diff --git a/Lib/test/test_importlib/__main__.py b/Lib/test/test_importlib/__main__.py --- a/Lib/test/test_importlib/__main__.py +++ b/Lib/test/test_importlib/__main__.py @@ -15,6 +15,4 @@ parser.add_argument('-b', '--builtin', action='store_true', default=False, help='use builtins.__import__() instead of importlib') args = parser.parse_args() - if args.builtin: - util.using___import__ = True test_main() diff --git a/Lib/test/test_importlib/import_/test___loader__.py b/Lib/test/test_importlib/import_/test___loader__.py --- a/Lib/test/test_importlib/import_/test___loader__.py +++ b/Lib/test/test_importlib/import_/test___loader__.py @@ -16,7 +16,7 @@ return self.module -class LoaderAttributeTests(unittest.TestCase): +class LoaderAttributeTests: def test___loader___missing(self): module = types.ModuleType('blah') @@ -27,7 +27,7 @@ loader = LoaderMock() loader.module = module with util.uncache('blah'), util.import_state(meta_path=[loader]): - module = import_util.import_('blah') + module = self.__import__('blah') self.assertEqual(loader, module.__loader__) def test___loader___is_None(self): @@ -36,9 +36,13 @@ loader = LoaderMock() loader.module = module with util.uncache('blah'), util.import_state(meta_path=[loader]): - returned_module = import_util.import_('blah') + returned_module = self.__import__('blah') self.assertEqual(loader, module.__loader__) +Frozen_Tests, Source_Tests = util.test_both(LoaderAttributeTests, + __import__=import_util.__import__) + + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_importlib/import_/test___package__.py b/Lib/test/test_importlib/import_/test___package__.py --- a/Lib/test/test_importlib/import_/test___package__.py +++ b/Lib/test/test_importlib/import_/test___package__.py @@ -9,7 +9,7 @@ from . import util as import_util -class Using__package__(unittest.TestCase): +class Using__package__: """Use of __package__ supercedes the use of __name__/__path__ to calculate what package a module belongs to. 
The basic algorithm is [__package__]:: @@ -38,8 +38,8 @@ # [__package__] with util.mock_modules('pkg.__init__', 'pkg.fake') as importer: with util.import_state(meta_path=[importer]): - import_util.import_('pkg.fake') - module = import_util.import_('', + self.__import__('pkg.fake') + module = self.__import__('', globals={'__package__': 'pkg.fake'}, fromlist=['attr'], level=2) self.assertEqual(module.__name__, 'pkg') @@ -51,8 +51,8 @@ globals_['__package__'] = None with util.mock_modules('pkg.__init__', 'pkg.fake') as importer: with util.import_state(meta_path=[importer]): - import_util.import_('pkg.fake') - module = import_util.import_('', globals= globals_, + self.__import__('pkg.fake') + module = self.__import__('', globals= globals_, fromlist=['attr'], level=2) self.assertEqual(module.__name__, 'pkg') @@ -63,15 +63,17 @@ def test_bad__package__(self): globals = {'__package__': ''} with self.assertRaises(SystemError): - import_util.import_('', globals, {}, ['relimport'], 1) + self.__import__('', globals, {}, ['relimport'], 1) def test_bunk__package__(self): globals = {'__package__': 42} with self.assertRaises(TypeError): - import_util.import_('', globals, {}, ['relimport'], 1) + self.__import__('', globals, {}, ['relimport'], 1) +Frozen_UsingPackage, Source_UsingPackage = util.test_both( + Using__package__, __import__=import_util.__import__) - at import_util.importlib_only + class Setting__package__(unittest.TestCase): """Because __package__ is a new feature, it is not always set by a loader. @@ -84,12 +86,14 @@ """ + __import__ = import_util.__import__[1] + # [top-level] def test_top_level(self): with util.mock_modules('top_level') as mock: with util.import_state(meta_path=[mock]): del mock['top_level'].__package__ - module = import_util.import_('top_level') + module = self.__import__('top_level') self.assertEqual(module.__package__, '') # [package] @@ -97,7 +101,7 @@ with util.mock_modules('pkg.__init__') as mock: with util.import_state(meta_path=[mock]): del mock['pkg'].__package__ - module = import_util.import_('pkg') + module = self.__import__('pkg') self.assertEqual(module.__package__, 'pkg') # [submodule] @@ -105,15 +109,10 @@ with util.mock_modules('pkg.__init__', 'pkg.mod') as mock: with util.import_state(meta_path=[mock]): del mock['pkg.mod'].__package__ - pkg = import_util.import_('pkg.mod') + pkg = self.__import__('pkg.mod') module = getattr(pkg, 'mod') self.assertEqual(module.__package__, 'pkg') -def test_main(): - from test.support import run_unittest - run_unittest(Using__package__, Setting__package__) - - if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_api.py b/Lib/test/test_importlib/import_/test_api.py --- a/Lib/test/test_importlib/import_/test_api.py +++ b/Lib/test/test_importlib/import_/test_api.py @@ -1,5 +1,5 @@ -from .. import util as importlib_test_util -from . import util +from .. import util +from . import util as import_util import sys import types import unittest @@ -17,7 +17,7 @@ raise ImportError('I cannot be loaded!') -class APITest(unittest.TestCase): +class APITest: """Test API-specific details for __import__ (e.g. raising the right exception when passing in an int for the module name).""" @@ -25,24 +25,24 @@ def test_name_requires_rparition(self): # Raise TypeError if a non-string is passed in for the module name. with self.assertRaises(TypeError): - util.import_(42) + self.__import__(42) def test_negative_level(self): # Raise ValueError when a negative level is specified. 
# PEP 328 did away with sys.module None entries and the ambiguity of # absolute/relative imports. with self.assertRaises(ValueError): - util.import_('os', globals(), level=-1) + self.__import__('os', globals(), level=-1) def test_nonexistent_fromlist_entry(self): # If something in fromlist doesn't exist, that's okay. # issue15715 mod = types.ModuleType('fine') mod.__path__ = ['XXX'] - with importlib_test_util.import_state(meta_path=[BadLoaderFinder]): - with importlib_test_util.uncache('fine'): + with util.import_state(meta_path=[BadLoaderFinder]): + with util.uncache('fine'): sys.modules['fine'] = mod - util.import_('fine', fromlist=['not here']) + self.__import__('fine', fromlist=['not here']) def test_fromlist_load_error_propagates(self): # If something in fromlist triggers an exception not related to not @@ -50,18 +50,15 @@ # issue15316 mod = types.ModuleType('fine') mod.__path__ = ['XXX'] - with importlib_test_util.import_state(meta_path=[BadLoaderFinder]): - with importlib_test_util.uncache('fine'): + with util.import_state(meta_path=[BadLoaderFinder]): + with util.uncache('fine'): sys.modules['fine'] = mod with self.assertRaises(ImportError): - util.import_('fine', fromlist=['bogus']) + self.__import__('fine', fromlist=['bogus']) - - -def test_main(): - from test.support import run_unittest - run_unittest(APITest) +Frozen_APITests, Source_APITests = util.test_both( + APITest, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_caching.py b/Lib/test/test_importlib/import_/test_caching.py --- a/Lib/test/test_importlib/import_/test_caching.py +++ b/Lib/test/test_importlib/import_/test_caching.py @@ -6,7 +6,7 @@ import unittest -class UseCache(unittest.TestCase): +class UseCache: """When it comes to sys.modules, import prefers it over anything else. @@ -21,12 +21,13 @@ ImportError is raised [None in cache]. """ + def test_using_cache(self): # [use cache] module_to_use = "some module found!" with util.uncache('some_module'): sys.modules['some_module'] = module_to_use - module = import_util.import_('some_module') + module = self.__import__('some_module') self.assertEqual(id(module_to_use), id(module)) def test_None_in_cache(self): @@ -35,7 +36,7 @@ with util.uncache(name): sys.modules[name] = None with self.assertRaises(ImportError) as cm: - import_util.import_(name) + self.__import__(name) self.assertEqual(cm.exception.name, name) def create_mock(self, *names, return_=None): @@ -47,42 +48,43 @@ mock.load_module = MethodType(load_module, mock) return mock +Frozen_UseCache, Source_UseCache = util.test_both( + UseCache, __import__=import_util.__import__) + + +class ImportlibUseCache(UseCache, unittest.TestCase): + + __import__ = import_util.__import__[1] + # __import__ inconsistent between loaders and built-in import when it comes # to when to use the module in sys.modules and when not to. - @import_util.importlib_only def test_using_cache_after_loader(self): # [from cache on return] with self.create_mock('module') as mock: with util.import_state(meta_path=[mock]): - module = import_util.import_('module') + module = self.__import__('module') self.assertEqual(id(module), id(sys.modules['module'])) # See test_using_cache_after_loader() for reasoning. 
- @import_util.importlib_only def test_using_cache_for_assigning_to_attribute(self): # [from cache to attribute] with self.create_mock('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertTrue(hasattr(module, 'module')) self.assertEqual(id(module.module), id(sys.modules['pkg.module'])) # See test_using_cache_after_loader() for reasoning. - @import_util.importlib_only def test_using_cache_for_fromlist(self): # [from cache for fromlist] with self.create_mock('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg', fromlist=['module']) + module = self.__import__('pkg', fromlist=['module']) self.assertTrue(hasattr(module, 'module')) self.assertEqual(id(module.module), id(sys.modules['pkg.module'])) -def test_main(): - from test.support import run_unittest - run_unittest(UseCache) - if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_fromlist.py b/Lib/test/test_importlib/import_/test_fromlist.py --- a/Lib/test/test_importlib/import_/test_fromlist.py +++ b/Lib/test/test_importlib/import_/test_fromlist.py @@ -3,7 +3,8 @@ from . import util as import_util import unittest -class ReturnValue(unittest.TestCase): + +class ReturnValue: """The use of fromlist influences what import returns. @@ -18,18 +19,21 @@ # [import return] with util.mock_modules('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertEqual(module.__name__, 'pkg') def test_return_from_from_import(self): # [from return] with util.mock_modules('pkg.__init__', 'pkg.module')as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module', fromlist=['attr']) + module = self.__import__('pkg.module', fromlist=['attr']) self.assertEqual(module.__name__, 'pkg.module') +Frozen_ReturnValue, Source_ReturnValue = util.test_both( + ReturnValue, __import__=import_util.__import__) -class HandlingFromlist(unittest.TestCase): + +class HandlingFromlist: """Using fromlist triggers different actions based on what is being asked of it. 
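The return-value behaviour these fromlist tests pin down can be reproduced directly at the interpreter; the stdlib package email.message stands in here for the mocked packages used in the tests.

    # Without a fromlist, __import__() hands back the top-level package.
    root = __import__('email.message')
    print(root.__name__)     # 'email'

    # With a non-empty fromlist, the named module itself is returned.
    mod = __import__('email.message', fromlist=['Message'])
    print(mod.__name__)      # 'email.message'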
@@ -48,14 +52,14 @@ # [object case] with util.mock_modules('module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('module', fromlist=['attr']) + module = self.__import__('module', fromlist=['attr']) self.assertEqual(module.__name__, 'module') def test_nonexistent_object(self): # [bad object] with util.mock_modules('module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('module', fromlist=['non_existent']) + module = self.__import__('module', fromlist=['non_existent']) self.assertEqual(module.__name__, 'module') self.assertTrue(not hasattr(module, 'non_existent')) @@ -63,7 +67,7 @@ # [module] with util.mock_modules('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg', fromlist=['module']) + module = self.__import__('pkg', fromlist=['module']) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) self.assertEqual(module.module.__name__, 'pkg.module') @@ -78,13 +82,13 @@ module_code={'pkg.mod': module_code}) as importer: with util.import_state(meta_path=[importer]): with self.assertRaises(ImportError) as exc: - import_util.import_('pkg', fromlist=['mod']) + self.__import__('pkg', fromlist=['mod']) self.assertEqual('i_do_not_exist', exc.exception.name) def test_empty_string(self): with util.mock_modules('pkg.__init__', 'pkg.mod') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.mod', fromlist=['']) + module = self.__import__('pkg.mod', fromlist=['']) self.assertEqual(module.__name__, 'pkg.mod') def basic_star_test(self, fromlist=['*']): @@ -92,7 +96,7 @@ with util.mock_modules('pkg.__init__', 'pkg.module') as mock: with util.import_state(meta_path=[mock]): mock['pkg'].__all__ = ['module'] - module = import_util.import_('pkg', fromlist=fromlist) + module = self.__import__('pkg', fromlist=fromlist) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) self.assertEqual(module.module.__name__, 'pkg.module') @@ -110,17 +114,16 @@ with context as mock: with util.import_state(meta_path=[mock]): mock['pkg'].__all__ = ['module1'] - module = import_util.import_('pkg', fromlist=['module2', '*']) + module = self.__import__('pkg', fromlist=['module2', '*']) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module1')) self.assertTrue(hasattr(module, 'module2')) self.assertEqual(module.module1.__name__, 'pkg.module1') self.assertEqual(module.module2.__name__, 'pkg.module2') +Frozen_FromList, Source_FromList = util.test_both( + HandlingFromlist, __import__=import_util.__import__) -def test_main(): - from test.support import run_unittest - run_unittest(ReturnValue, HandlingFromlist) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_meta_path.py b/Lib/test/test_importlib/import_/test_meta_path.py --- a/Lib/test/test_importlib/import_/test_meta_path.py +++ b/Lib/test/test_importlib/import_/test_meta_path.py @@ -7,7 +7,7 @@ import warnings -class CallingOrder(unittest.TestCase): +class CallingOrder: """Calls to the importers on sys.meta_path happen in order that they are specified in the sequence, starting with the first importer @@ -24,7 +24,7 @@ first.modules[mod] = 42 second.modules[mod] = -13 with util.import_state(meta_path=[first, second]): - self.assertEqual(import_util.import_(mod), 42) + self.assertEqual(self.__import__(mod), 42) def test_continuing(self): # 
[continuing] @@ -34,7 +34,7 @@ first.find_module = lambda self, fullname, path=None: None second.modules[mod_name] = 42 with util.import_state(meta_path=[first, second]): - self.assertEqual(import_util.import_(mod_name), 42) + self.assertEqual(self.__import__(mod_name), 42) def test_empty(self): # Raise an ImportWarning if sys.meta_path is empty. @@ -51,8 +51,11 @@ self.assertEqual(len(w), 1) self.assertTrue(issubclass(w[-1].category, ImportWarning)) +Frozen_CallingOrder, Source_CallingOrder = util.test_both( + CallingOrder, __import__=import_util.__import__) -class CallSignature(unittest.TestCase): + +class CallSignature: """If there is no __path__ entry on the parent module, then 'path' is None [no path]. Otherwise, the value for __path__ is passed in for the 'path' @@ -74,7 +77,7 @@ log, wrapped_call = self.log(importer.find_module) importer.find_module = MethodType(wrapped_call, importer) with util.import_state(meta_path=[importer]): - import_util.import_(mod_name) + self.__import__(mod_name) assert len(log) == 1 args = log[0][0] kwargs = log[0][1] @@ -95,7 +98,7 @@ log, wrapped_call = self.log(importer.find_module) importer.find_module = MethodType(wrapped_call, importer) with util.import_state(meta_path=[importer]): - import_util.import_(mod_name) + self.__import__(mod_name) assert len(log) == 2 args = log[1][0] kwargs = log[1][1] @@ -104,12 +107,9 @@ self.assertEqual(args[0], mod_name) self.assertIs(args[1], path) - - -def test_main(): - from test.support import run_unittest - run_unittest(CallingOrder, CallSignature) +Frozen_CallSignature, Source_CallSignature = util.test_both( + CallSignature, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_packages.py b/Lib/test/test_importlib/import_/test_packages.py --- a/Lib/test/test_importlib/import_/test_packages.py +++ b/Lib/test/test_importlib/import_/test_packages.py @@ -6,21 +6,21 @@ from test import support -class ParentModuleTests(unittest.TestCase): +class ParentModuleTests: """Importing a submodule should import the parent modules.""" def test_import_parent(self): with util.mock_modules('pkg.__init__', 'pkg.module') as mock: with util.import_state(meta_path=[mock]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertIn('pkg', sys.modules) def test_bad_parent(self): with util.mock_modules('pkg.module') as mock: with util.import_state(meta_path=[mock]): with self.assertRaises(ImportError) as cm: - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertEqual(cm.exception.name, 'pkg') def test_raising_parent_after_importing_child(self): @@ -32,11 +32,11 @@ with mock: with util.import_state(meta_path=[mock]): with self.assertRaises(ZeroDivisionError): - import_util.import_('pkg') + self.__import__('pkg') self.assertNotIn('pkg', sys.modules) self.assertIn('pkg.module', sys.modules) with self.assertRaises(ZeroDivisionError): - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertNotIn('pkg', sys.modules) self.assertIn('pkg.module', sys.modules) @@ -51,10 +51,10 @@ with self.assertRaises((ZeroDivisionError, ImportError)): # This raises ImportError on the "from . import module" # line, not sure why. 
- import_util.import_('pkg') + self.__import__('pkg') self.assertNotIn('pkg', sys.modules) with self.assertRaises((ZeroDivisionError, ImportError)): - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertNotIn('pkg', sys.modules) # XXX False #self.assertIn('pkg.module', sys.modules) @@ -71,10 +71,10 @@ with self.assertRaises((ZeroDivisionError, ImportError)): # This raises ImportError on the "from ..subpkg import module" # line, not sure why. - import_util.import_('pkg.subpkg') + self.__import__('pkg.subpkg') self.assertNotIn('pkg.subpkg', sys.modules) with self.assertRaises((ZeroDivisionError, ImportError)): - import_util.import_('pkg.subpkg.module') + self.__import__('pkg.subpkg.module') self.assertNotIn('pkg.subpkg', sys.modules) # XXX False #self.assertIn('pkg.subpkg.module', sys.modules) @@ -83,7 +83,7 @@ # Try to import a submodule from a non-package should raise ImportError. assert not hasattr(sys, '__path__') with self.assertRaises(ImportError) as cm: - import_util.import_('sys.no_submodules_here') + self.__import__('sys.no_submodules_here') self.assertEqual(cm.exception.name, 'sys.no_submodules_here') def test_module_not_package_but_side_effects(self): @@ -98,15 +98,13 @@ with mock_modules as mock: with util.import_state(meta_path=[mock]): try: - submodule = import_util.import_(subname) + submodule = self.__import__(subname) finally: support.unload(subname) - -def test_main(): - from test.support import run_unittest - run_unittest(ParentModuleTests) +Frozen_ParentTests, Source_ParentTests = util.test_both( + ParentModuleTests, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_path.py b/Lib/test/test_importlib/import_/test_path.py --- a/Lib/test/test_importlib/import_/test_path.py +++ b/Lib/test/test_importlib/import_/test_path.py @@ -1,8 +1,9 @@ -from importlib import _bootstrap -from importlib import machinery -from importlib import import_module from .. import util from . import util as import_util + +importlib = util.import_importlib('importlib') +machinery = util.import_importlib('importlib.machinery') + import os import sys from types import ModuleType @@ -11,7 +12,7 @@ import zipimport -class FinderTests(unittest.TestCase): +class FinderTests: """Tests for PathFinder.""" @@ -19,7 +20,7 @@ # Test None returned upon not finding a suitable finder. module = '' with util.import_state(): - self.assertIsNone(machinery.PathFinder.find_module(module)) + self.assertIsNone(self.machinery.PathFinder.find_module(module)) def test_sys_path(self): # Test that sys.path is used when 'path' is None. 
@@ -29,7 +30,7 @@ importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}, path=[path]): - loader = machinery.PathFinder.find_module(module) + loader = self.machinery.PathFinder.find_module(module) self.assertIs(loader, importer) def test_path(self): @@ -39,7 +40,7 @@ path = '' importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}): - loader = machinery.PathFinder.find_module(module, [path]) + loader = self.machinery.PathFinder.find_module(module, [path]) self.assertIs(loader, importer) def test_empty_list(self): @@ -49,7 +50,7 @@ importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}, path=[path]): - self.assertIsNone(machinery.PathFinder.find_module('module', [])) + self.assertIsNone(self.machinery.PathFinder.find_module('module', [])) def test_path_hooks(self): # Test that sys.path_hooks is used. @@ -59,7 +60,7 @@ importer = util.mock_modules(module) hook = import_util.mock_path_hook(path, importer=importer) with util.import_state(path_hooks=[hook]): - loader = machinery.PathFinder.find_module(module, [path]) + loader = self.machinery.PathFinder.find_module(module, [path]) self.assertIs(loader, importer) self.assertIn(path, sys.path_importer_cache) self.assertIs(sys.path_importer_cache[path], importer) @@ -72,7 +73,7 @@ path=[path_entry]): with warnings.catch_warnings(record=True) as w: warnings.simplefilter('always') - self.assertIsNone(machinery.PathFinder.find_module('os')) + self.assertIsNone(self.machinery.PathFinder.find_module('os')) self.assertIsNone(sys.path_importer_cache[path_entry]) self.assertEqual(len(w), 1) self.assertTrue(issubclass(w[-1].category, ImportWarning)) @@ -84,7 +85,7 @@ importer = util.mock_modules(module) hook = import_util.mock_path_hook(os.getcwd(), importer=importer) with util.import_state(path=[path], path_hooks=[hook]): - loader = machinery.PathFinder.find_module(module) + loader = self.machinery.PathFinder.find_module(module) self.assertIs(loader, importer) self.assertIn(os.getcwd(), sys.path_importer_cache) @@ -96,8 +97,8 @@ new_path_importer_cache = sys.path_importer_cache.copy() new_path_importer_cache.pop(None, None) new_path_hooks = [zipimport.zipimporter, - _bootstrap.FileFinder.path_hook( - *_bootstrap._get_supported_file_loaders())] + self.machinery.FileFinder.path_hook( + *self.importlib._bootstrap._get_supported_file_loaders())] missing = object() email = sys.modules.pop('email', missing) try: @@ -105,16 +106,15 @@ path=new_path, path_importer_cache=new_path_importer_cache, path_hooks=new_path_hooks): - module = import_module('email') + module = self.importlib.import_module('email') self.assertIsInstance(module, ModuleType) finally: if email is not missing: sys.modules['email'] = email +Frozen_FinderTests, Source_FinderTests = util.test_both( + FinderTests, importlib=importlib, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_relative_imports.py b/Lib/test/test_importlib/import_/test_relative_imports.py --- a/Lib/test/test_importlib/import_/test_relative_imports.py +++ b/Lib/test/test_importlib/import_/test_relative_imports.py @@ -4,7 +4,7 @@ import sys import unittest -class RelativeImports(unittest.TestCase): +class RelativeImports: """PEP 328 introduced relative imports. 
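The level argument exercised throughout these relative-import tests maps onto the from-import syntax roughly as in the sketch below; email.mime merely stands in for the mocked 'pkg' packages.

    # As if "from . import base" were executed inside email.mime.text:
    globals_ = {'__package__': 'email.mime', '__name__': 'email.mime.text'}
    pkg = __import__('', globals_, {}, ['base'], 1)
    print(pkg.__name__)        # 'email.mime'
    print(pkg.base.__name__)   # 'email.mime.base'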
This allows for imports to occur from within a package without having to specify the actual package name. @@ -76,8 +76,8 @@ create = 'pkg.__init__', 'pkg.mod2' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.mod1'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['mod2'], level=1) + self.__import__('pkg') # For __import__(). + module = self.__import__('', global_, fromlist=['mod2'], level=1) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'mod2')) self.assertEqual(module.mod2.attr, 'pkg.mod2') @@ -88,8 +88,8 @@ create = 'pkg.__init__', 'pkg.mod2' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.mod1'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('mod2', global_, fromlist=['attr'], + self.__import__('pkg') # For __import__(). + module = self.__import__('mod2', global_, fromlist=['attr'], level=1) self.assertEqual(module.__name__, 'pkg.mod2') self.assertEqual(module.attr, 'pkg.mod2') @@ -101,8 +101,8 @@ globals_ = ({'__package__': 'pkg'}, {'__name__': 'pkg', '__path__': ['blah']}) def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['module'], + self.__import__('pkg') # For __import__(). + module = self.__import__('', global_, fromlist=['module'], level=1) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) @@ -114,8 +114,8 @@ create = 'pkg.__init__', 'pkg.module' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.module'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['attr'], level=1) + self.__import__('pkg') # For __import__(). 
+ module = self.__import__('', global_, fromlist=['attr'], level=1) self.assertEqual(module.__name__, 'pkg') self.relative_import_test(create, globals_, callback) @@ -126,7 +126,7 @@ globals_ = ({'__package__': 'pkg.subpkg1'}, {'__name__': 'pkg.subpkg1', '__path__': ['blah']}) def callback(global_): - module = import_util.import_('', global_, fromlist=['subpkg2'], + module = self.__import__('', global_, fromlist=['subpkg2'], level=2) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'subpkg2')) @@ -142,8 +142,8 @@ {'__name__': 'pkg.pkg1.pkg2.pkg3.pkg4.pkg5', '__path__': ['blah']}) def callback(global_): - import_util.import_(globals_[0]['__package__']) - module = import_util.import_('', global_, fromlist=['attr'], level=6) + self.__import__(globals_[0]['__package__']) + module = self.__import__('', global_, fromlist=['attr'], level=6) self.assertEqual(module.__name__, 'pkg') self.relative_import_test(create, globals_, callback) @@ -153,9 +153,9 @@ globals_ = ({'__package__': 'pkg'}, {'__name__': 'pkg', '__path__': ['blah']}) def callback(global_): - import_util.import_('pkg') + self.__import__('pkg') with self.assertRaises(ValueError): - import_util.import_('', global_, fromlist=['top_level'], + self.__import__('', global_, fromlist=['top_level'], level=2) self.relative_import_test(create, globals_, callback) @@ -164,16 +164,16 @@ create = ['top_level', 'pkg.__init__', 'pkg.module'] globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.module'} def callback(global_): - import_util.import_('pkg') + self.__import__('pkg') with self.assertRaises(ValueError): - import_util.import_('', global_, fromlist=['top_level'], + self.__import__('', global_, fromlist=['top_level'], level=2) self.relative_import_test(create, globals_, callback) def test_empty_name_w_level_0(self): # [empty name] with self.assertRaises(ValueError): - import_util.import_('') + self.__import__('') def test_import_from_different_package(self): # Test importing from a different package than the caller. @@ -186,8 +186,8 @@ '__runpy_pkg__.uncle.cousin.nephew'] globals_ = {'__package__': '__runpy_pkg__.__runpy_pkg__'} def callback(global_): - import_util.import_('__runpy_pkg__.__runpy_pkg__') - module = import_util.import_('uncle.cousin', globals_, {}, + self.__import__('__runpy_pkg__.__runpy_pkg__') + module = self.__import__('uncle.cousin', globals_, {}, fromlist=['nephew'], level=2) self.assertEqual(module.__name__, '__runpy_pkg__.uncle.cousin') @@ -198,20 +198,19 @@ create = ['crash.__init__', 'crash.mod'] globals_ = [{'__package__': 'crash', '__name__': 'crash'}] def callback(global_): - import_util.import_('crash') - mod = import_util.import_('mod', global_, {}, [], 1) + self.__import__('crash') + mod = self.__import__('mod', global_, {}, [], 1) self.assertEqual(mod.__name__, 'crash.mod') self.relative_import_test(create, globals_, callback) def test_relative_import_no_globals(self): # No globals for a relative import is an error. 
with self.assertRaises(KeyError): - import_util.import_('sys', level=1) + self.__import__('sys', level=1) +Frozen_RelativeImports, Source_RelativeImports = util.test_both( + RelativeImports, __import__=import_util.__import__) -def test_main(): - from test.support import run_unittest - run_unittest(RelativeImports) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/util.py b/Lib/test/test_importlib/import_/util.py --- a/Lib/test/test_importlib/import_/util.py +++ b/Lib/test/test_importlib/import_/util.py @@ -1,22 +1,14 @@ +from .. import util + +frozen_importlib, source_importlib = util.import_importlib('importlib') + +import builtins import functools import importlib import unittest -using___import__ = False - - -def import_(*args, **kwargs): - """Delegate to allow for injecting different implementations of import.""" - if using___import__: - return __import__(*args, **kwargs) - else: - return importlib.__import__(*args, **kwargs) - - -def importlib_only(fxn): - """Decorator to skip a test if using __builtins__.__import__.""" - return unittest.skipIf(using___import__, "importlib-specific test")(fxn) +__import__ = staticmethod(builtins.__import__), staticmethod(source_importlib.__import__) def mock_path_hook(*entries, importer): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 19:35:40 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 19:35:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Simplify_test=2Etest=5Fimp?= =?utf-8?b?b3J0bGliLl9fbWFpbl9f?= Message-ID: <3dGVXr2FXVz7Lkf@mail.python.org> http://hg.python.org/cpython/rev/93fbe25ea81a changeset: 87007:93fbe25ea81a user: Brett Cannon date: Fri Nov 08 13:35:34 2013 -0500 summary: Simplify test.test_importlib.__main__ files: Lib/test/test_importlib/__main__.py | 11 +---------- 1 files changed, 1 insertions(+), 10 deletions(-) diff --git a/Lib/test/test_importlib/__main__.py b/Lib/test/test_importlib/__main__.py --- a/Lib/test/test_importlib/__main__.py +++ b/Lib/test/test_importlib/__main__.py @@ -4,15 +4,6 @@ builtins.__import__ instead of importlib.__import__. """ -from . import test_main - - if __name__ == '__main__': - import argparse - - parser = argparse.ArgumentParser(description='Execute the importlib test ' - 'suite') - parser.add_argument('-b', '--builtin', action='store_true', default=False, - help='use builtins.__import__() instead of importlib') - args = parser.parse_args() + from . import test_main test_main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 19:55:46 2013 From: python-checkins at python.org (charles-francois.natali) Date: Fri, 8 Nov 2013 19:55:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318923=3A_Update_s?= =?utf-8?q?ubprocess_to_use_the_new_selectors_module=2E?= Message-ID: <3dGW02230sz7Ljj@mail.python.org> http://hg.python.org/cpython/rev/71b618f0c8e9 changeset: 87008:71b618f0c8e9 user: Charles-Fran?ois Natali date: Fri Nov 08 19:56:59 2013 +0100 summary: Issue #18923: Update subprocess to use the new selectors module. 
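For context on this change: the selectors module, new in Python 3.4, wraps select(), poll(), epoll and kqueue behind one register()/select() API, which is what lets subprocess drop its separate _communicate_with_poll() and _communicate_with_select() paths below. Here is a minimal, POSIX-only sketch of the pattern the new code relies on; it is illustrative only and not taken from the commit (the pipe and payload are made up):

    import os
    import selectors

    # Prefer poll over select when available; as the commit notes, neither
    # needs an extra file descriptor, unlike epoll/kqueue.
    Selector = getattr(selectors, 'PollSelector', selectors.SelectSelector)

    r, w = os.pipe()
    with Selector() as selector:
        selector.register(r, selectors.EVENT_READ)   # watch the read end
        os.write(w, b'hello')
        for key, events in selector.select(timeout=1.0):
            if events & selectors.EVENT_READ:
                print(os.read(key.fd, 4096))         # prints b'hello'
    os.close(r)
    os.close(w)

In the diff that follows, subprocess registers stdin for EVENT_WRITE and stdout/stderr for EVENT_READ, then unregisters and closes each stream once it has been fully written or drained.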
files: Lib/subprocess.py | 238 ++++++----------------- Lib/test/test_subprocess.py | 10 +- 2 files changed, 75 insertions(+), 173 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -404,15 +404,23 @@ hStdError = None wShowWindow = 0 else: + import _posixsubprocess import select - _has_poll = hasattr(select, 'poll') - import _posixsubprocess + import selectors # When select or poll has indicated that the file is writable, # we can write up to _PIPE_BUF bytes without risk of blocking. # POSIX defines PIPE_BUF as >= 512. _PIPE_BUF = getattr(select, 'PIPE_BUF', 512) + # poll/select have the advantage of not requiring any extra file + # descriptor, contrarily to epoll/kqueue (also, they require a single + # syscall). + if hasattr(selectors, 'PollSelector'): + _PopenSelector = selectors.PollSelector + else: + _PopenSelector = selectors.SelectSelector + __all__ = ["Popen", "PIPE", "STDOUT", "call", "check_call", "getstatusoutput", "getoutput", "check_output", "CalledProcessError", "DEVNULL"] @@ -1530,12 +1538,65 @@ if not input: self.stdin.close() - if _has_poll: - stdout, stderr = self._communicate_with_poll(input, endtime, - orig_timeout) - else: - stdout, stderr = self._communicate_with_select(input, endtime, - orig_timeout) + stdout = None + stderr = None + + # Only create this mapping if we haven't already. + if not self._communication_started: + self._fileobj2output = {} + if self.stdout: + self._fileobj2output[self.stdout] = [] + if self.stderr: + self._fileobj2output[self.stderr] = [] + + if self.stdout: + stdout = self._fileobj2output[self.stdout] + if self.stderr: + stderr = self._fileobj2output[self.stderr] + + self._save_input(input) + + with _PopenSelector() as selector: + if self.stdin and input: + selector.register(self.stdin, selectors.EVENT_WRITE) + if self.stdout: + selector.register(self.stdout, selectors.EVENT_READ) + if self.stderr: + selector.register(self.stderr, selectors.EVENT_READ) + + while selector.get_map(): + timeout = self._remaining_time(endtime) + if timeout is not None and timeout < 0: + raise TimeoutExpired(self.args, orig_timeout) + + ready = selector.select(timeout) + self._check_timeout(endtime, orig_timeout) + + # XXX Rewrite these to use non-blocking I/O on the file + # objects; they are no longer using C stdio! 
+ + for key, events in ready: + if key.fileobj is self.stdin: + chunk = self._input[self._input_offset : + self._input_offset + _PIPE_BUF] + try: + self._input_offset += os.write(key.fd, chunk) + except OSError as e: + if e.errno == errno.EPIPE: + selector.unregister(key.fileobj) + key.fileobj.close() + else: + raise + else: + if self._input_offset >= len(self._input): + selector.unregister(key.fileobj) + key.fileobj.close() + elif key.fileobj in (self.stdout, self.stderr): + data = os.read(key.fd, 4096) + if not data: + selector.unregister(key.fileobj) + key.fileobj.close() + self._fileobj2output[key.fileobj].append(data) self.wait(timeout=self._remaining_time(endtime)) @@ -1569,167 +1630,6 @@ self._input = self._input.encode(self.stdin.encoding) - def _communicate_with_poll(self, input, endtime, orig_timeout): - stdout = None # Return - stderr = None # Return - - if not self._communication_started: - self._fd2file = {} - - poller = select.poll() - def register_and_append(file_obj, eventmask): - poller.register(file_obj.fileno(), eventmask) - self._fd2file[file_obj.fileno()] = file_obj - - def close_unregister_and_remove(fd): - poller.unregister(fd) - self._fd2file[fd].close() - self._fd2file.pop(fd) - - if self.stdin and input: - register_and_append(self.stdin, select.POLLOUT) - - # Only create this mapping if we haven't already. - if not self._communication_started: - self._fd2output = {} - if self.stdout: - self._fd2output[self.stdout.fileno()] = [] - if self.stderr: - self._fd2output[self.stderr.fileno()] = [] - - select_POLLIN_POLLPRI = select.POLLIN | select.POLLPRI - if self.stdout: - register_and_append(self.stdout, select_POLLIN_POLLPRI) - stdout = self._fd2output[self.stdout.fileno()] - if self.stderr: - register_and_append(self.stderr, select_POLLIN_POLLPRI) - stderr = self._fd2output[self.stderr.fileno()] - - self._save_input(input) - - while self._fd2file: - timeout = self._remaining_time(endtime) - if timeout is not None and timeout < 0: - raise TimeoutExpired(self.args, orig_timeout) - try: - ready = poller.poll(timeout) - except OSError as e: - if e.args[0] == errno.EINTR: - continue - raise - self._check_timeout(endtime, orig_timeout) - - # XXX Rewrite these to use non-blocking I/O on the - # file objects; they are no longer using C stdio! - - for fd, mode in ready: - if mode & select.POLLOUT: - chunk = self._input[self._input_offset : - self._input_offset + _PIPE_BUF] - try: - self._input_offset += os.write(fd, chunk) - except OSError as e: - if e.errno == errno.EPIPE: - close_unregister_and_remove(fd) - else: - raise - else: - if self._input_offset >= len(self._input): - close_unregister_and_remove(fd) - elif mode & select_POLLIN_POLLPRI: - data = os.read(fd, 4096) - if not data: - close_unregister_and_remove(fd) - self._fd2output[fd].append(data) - else: - # Ignore hang up or errors. 
- close_unregister_and_remove(fd) - - return (stdout, stderr) - - - def _communicate_with_select(self, input, endtime, orig_timeout): - if not self._communication_started: - self._read_set = [] - self._write_set = [] - if self.stdin and input: - self._write_set.append(self.stdin) - if self.stdout: - self._read_set.append(self.stdout) - if self.stderr: - self._read_set.append(self.stderr) - - self._save_input(input) - - stdout = None # Return - stderr = None # Return - - if self.stdout: - if not self._communication_started: - self._stdout_buff = [] - stdout = self._stdout_buff - if self.stderr: - if not self._communication_started: - self._stderr_buff = [] - stderr = self._stderr_buff - - while self._read_set or self._write_set: - timeout = self._remaining_time(endtime) - if timeout is not None and timeout < 0: - raise TimeoutExpired(self.args, orig_timeout) - try: - (rlist, wlist, xlist) = \ - select.select(self._read_set, self._write_set, [], - timeout) - except OSError as e: - if e.args[0] == errno.EINTR: - continue - raise - - # According to the docs, returning three empty lists indicates - # that the timeout expired. - if not (rlist or wlist or xlist): - raise TimeoutExpired(self.args, orig_timeout) - # We also check what time it is ourselves for good measure. - self._check_timeout(endtime, orig_timeout) - - # XXX Rewrite these to use non-blocking I/O on the - # file objects; they are no longer using C stdio! - - if self.stdin in wlist: - chunk = self._input[self._input_offset : - self._input_offset + _PIPE_BUF] - try: - bytes_written = os.write(self.stdin.fileno(), chunk) - except OSError as e: - if e.errno == errno.EPIPE: - self.stdin.close() - self._write_set.remove(self.stdin) - else: - raise - else: - self._input_offset += bytes_written - if self._input_offset >= len(self._input): - self.stdin.close() - self._write_set.remove(self.stdin) - - if self.stdout in rlist: - data = os.read(self.stdout.fileno(), 1024) - if not data: - self.stdout.close() - self._read_set.remove(self.stdout) - stdout.append(data) - - if self.stderr in rlist: - data = os.read(self.stderr.fileno(), 1024) - if not data: - self.stderr.close() - self._read_set.remove(self.stderr) - stderr.append(data) - - return (stdout, stderr) - - def send_signal(self, sig): """Send a signal to the process """ diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -11,6 +11,7 @@ import tempfile import time import re +import selectors import sysconfig import warnings import select @@ -2179,15 +2180,16 @@ os.rmdir(dir) - at unittest.skipUnless(getattr(subprocess, '_has_poll', False), - "poll system call not supported") + at unittest.skipUnless(hasattr(selectors, 'PollSelector'), + "Test needs selectors.PollSelector") class ProcessTestCaseNoPoll(ProcessTestCase): def setUp(self): - subprocess._has_poll = False + self.orig_selector = subprocess._PopenSelector + subprocess._PopenSelector = selectors.SelectSelector ProcessTestCase.setUp(self) def tearDown(self): - subprocess._has_poll = True + subprocess._PopenSelector = self.orig_selector ProcessTestCase.tearDown(self) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:25:46 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 20:25:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316803=3A_test=2Et?= =?utf-8?q?est=5Fimportlib=2Esource_now_tests_frozen_and_source_code?= Message-ID: 
<3dGWff3Qxdz7Lq7@mail.python.org> http://hg.python.org/cpython/rev/b0f570aef6fd changeset: 87009:b0f570aef6fd user: Brett Cannon date: Fri Nov 08 14:25:37 2013 -0500 summary: Issue #16803: test.test_importlib.source now tests frozen and source code files: Lib/test/test_importlib/source/test_case_sensitivity.py | 28 ++-- Lib/test/test_importlib/source/test_file_loader.py | 70 ++++++--- Lib/test/test_importlib/source/test_finder.py | 30 ++-- Lib/test/test_importlib/source/test_path_hook.py | 17 +- Lib/test/test_importlib/source/test_source_encoding.py | 20 +- 5 files changed, 91 insertions(+), 74 deletions(-) diff --git a/Lib/test/test_importlib/source/test_case_sensitivity.py b/Lib/test/test_importlib/source/test_case_sensitivity.py --- a/Lib/test/test_importlib/source/test_case_sensitivity.py +++ b/Lib/test/test_importlib/source/test_case_sensitivity.py @@ -2,8 +2,9 @@ from .. import util from . import util as source_util -from importlib import _bootstrap -from importlib import machinery +importlib = util.import_importlib('importlib') +machinery = util.import_importlib('importlib.machinery') + import os import sys from test import support as test_support @@ -11,7 +12,7 @@ @util.case_insensitive_tests -class CaseSensitivityTest(unittest.TestCase): +class CaseSensitivityTest: """PEP 235 dictates that on case-preserving, case-insensitive file systems that imports are case-sensitive unless the PYTHONCASEOK environment @@ -21,11 +22,11 @@ assert name != name.lower() def find(self, path): - finder = machinery.FileFinder(path, - (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES), - (machinery.SourcelessFileLoader, - machinery.BYTECODE_SUFFIXES)) + finder = self.machinery.FileFinder(path, + (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES), + (self.machinery.SourcelessFileLoader, + self.machinery.BYTECODE_SUFFIXES)) return finder.find_module(self.name) def sensitivity_test(self): @@ -41,7 +42,7 @@ def test_sensitive(self): with test_support.EnvironmentVarGuard() as env: env.unset('PYTHONCASEOK') - if b'PYTHONCASEOK' in _bootstrap._os.environ: + if b'PYTHONCASEOK' in self.importlib._bootstrap._os.environ: self.skipTest('os.environ changes not reflected in ' '_os.environ') sensitive, insensitive = self.sensitivity_test() @@ -52,7 +53,7 @@ def test_insensitive(self): with test_support.EnvironmentVarGuard() as env: env.set('PYTHONCASEOK', '1') - if b'PYTHONCASEOK' not in _bootstrap._os.environ: + if b'PYTHONCASEOK' not in self.importlib._bootstrap._os.environ: self.skipTest('os.environ changes not reflected in ' '_os.environ') sensitive, insensitive = self.sensitivity_test() @@ -61,10 +62,9 @@ self.assertTrue(hasattr(insensitive, 'load_module')) self.assertIn(self.name, insensitive.get_filename(self.name)) - -def test_main(): - test_support.run_unittest(CaseSensitivityTest) +Frozen_CaseSensitivityTest, Source_CaseSensitivityTest = util.test_both( + CaseSensitivityTest, importlib=importlib, machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_file_loader.py b/Lib/test/test_importlib/source/test_file_loader.py --- a/Lib/test/test_importlib/source/test_file_loader.py +++ b/Lib/test/test_importlib/source/test_file_loader.py @@ -1,11 +1,12 @@ -from importlib import machinery -import importlib -import importlib.abc -import importlib.util from .. import abc from .. import util from . 
import util as source_util +importlib = util.import_importlib('importlib') +importlib_abc = util.import_importlib('importlib.abc') +machinery = util.import_importlib('importlib.machinery') +importlib_util = util.import_importlib('importlib.util') + import errno import marshal import os @@ -19,7 +20,7 @@ from test.support import make_legacy_pyc, unload -class SimpleTest(unittest.TestCase, abc.LoaderTests): +class SimpleTest(abc.LoaderTests): """Should have no issue importing a source module [basic]. And if there is a syntax error, it should raise a SyntaxError [syntax error]. @@ -27,7 +28,7 @@ """ def test_load_module_API(self): - class Tester(importlib.abc.FileLoader): + class Tester(self.abc.FileLoader): def get_source(self, _): return 'attr = 42' def is_package(self, _): return False @@ -37,7 +38,7 @@ def test_get_filename_API(self): # If fullname is not set then assume self.path is desired. - class Tester(importlib.abc.FileLoader): + class Tester(self.abc.FileLoader): def get_code(self, _): pass def get_source(self, _): pass def is_package(self, _): pass @@ -55,7 +56,7 @@ # [basic] def test_module(self): with source_util.create_modules('_temp') as mapping: - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) module = loader.load_module('_temp') self.assertIn('_temp', sys.modules) check = {'__name__': '_temp', '__file__': mapping['_temp'], @@ -65,7 +66,7 @@ def test_package(self): with source_util.create_modules('_pkg.__init__') as mapping: - loader = machinery.SourceFileLoader('_pkg', + loader = self.machinery.SourceFileLoader('_pkg', mapping['_pkg.__init__']) module = loader.load_module('_pkg') self.assertIn('_pkg', sys.modules) @@ -78,7 +79,7 @@ def test_lacking_parent(self): with source_util.create_modules('_pkg.__init__', '_pkg.mod')as mapping: - loader = machinery.SourceFileLoader('_pkg.mod', + loader = self.machinery.SourceFileLoader('_pkg.mod', mapping['_pkg.mod']) module = loader.load_module('_pkg.mod') self.assertIn('_pkg.mod', sys.modules) @@ -93,7 +94,7 @@ def test_module_reuse(self): with source_util.create_modules('_temp') as mapping: - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) module = loader.load_module('_temp') module_id = id(module) module_dict_id = id(module.__dict__) @@ -118,7 +119,7 @@ setattr(orig_module, attr, value) with open(mapping[name], 'w') as file: file.write('+++ bad syntax +++') - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) with self.assertRaises(SyntaxError): loader.load_module(name) for attr in attributes: @@ -129,7 +130,7 @@ with source_util.create_modules('_temp') as mapping: with open(mapping['_temp'], 'w') as file: file.write('=') - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) with self.assertRaises(SyntaxError): loader.load_module('_temp') self.assertNotIn('_temp', sys.modules) @@ -142,14 +143,14 @@ file.write("# test file for importlib") try: with util.uncache('_temp'): - loader = machinery.SourceFileLoader('_temp', file_path) + loader = self.machinery.SourceFileLoader('_temp', file_path) mod = loader.load_module('_temp') self.assertEqual(file_path, mod.__file__) - self.assertEqual(importlib.util.cache_from_source(file_path), + self.assertEqual(self.util.cache_from_source(file_path), mod.__cached__) 
finally: os.unlink(file_path) - pycache = os.path.dirname(importlib.util.cache_from_source(file_path)) + pycache = os.path.dirname(self.util.cache_from_source(file_path)) if os.path.exists(pycache): shutil.rmtree(pycache) @@ -158,7 +159,7 @@ # truncated rather than raise an OverflowError. with source_util.create_modules('_temp') as mapping: source = mapping['_temp'] - compiled = importlib.util.cache_from_source(source) + compiled = self.util.cache_from_source(source) with open(source, 'w') as f: f.write("x = 5") try: @@ -169,7 +170,7 @@ if e.errno != getattr(errno, 'EOVERFLOW', None): raise self.skipTest("cannot set modification time to large integer ({})".format(e)) - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) mod = loader.load_module('_temp') # Sanity checks. self.assertEqual(mod.__cached__, compiled) @@ -178,12 +179,16 @@ os.stat(compiled) def test_unloadable(self): - loader = machinery.SourceFileLoader('good name', {}) + loader = self.machinery.SourceFileLoader('good name', {}) with self.assertRaises(ImportError): loader.load_module('bad name') +Frozen_SimpleTest, Source_SimpleTest = util.test_both( + SimpleTest, importlib=importlib, machinery=machinery, abc=importlib_abc, + util=importlib_util) -class BadBytecodeTest(unittest.TestCase): + +class BadBytecodeTest: def import_(self, file, module_name): loader = self.loader(module_name, file) @@ -200,7 +205,7 @@ pass py_compile.compile(mapping[name]) if not del_source: - bytecode_path = importlib.util.cache_from_source(mapping[name]) + bytecode_path = self.util.cache_from_source(mapping[name]) else: os.unlink(mapping[name]) bytecode_path = make_legacy_pyc(mapping[name]) @@ -289,7 +294,9 @@ class SourceLoaderBadBytecodeTest(BadBytecodeTest): - loader = machinery.SourceFileLoader + @classmethod + def setUpClass(cls): + cls.loader = cls.machinery.SourceFileLoader @source_util.writes_bytecode_files def test_empty_file(self): @@ -329,7 +336,7 @@ self.import_(mapping[name], name) with open(bytecode_path, 'rb') as bytecode_file: self.assertEqual(bytecode_file.read(4), - importlib.util.MAGIC_NUMBER) + self.util.MAGIC_NUMBER) self._test_bad_magic(test) @@ -379,13 +386,13 @@ zeros = b'\x00\x00\x00\x00' with source_util.create_modules('_temp') as mapping: py_compile.compile(mapping['_temp']) - bytecode_path = importlib.util.cache_from_source(mapping['_temp']) + bytecode_path = self.util.cache_from_source(mapping['_temp']) with open(bytecode_path, 'r+b') as bytecode_file: bytecode_file.seek(4) bytecode_file.write(zeros) self.import_(mapping['_temp'], '_temp') source_mtime = os.path.getmtime(mapping['_temp']) - source_timestamp = importlib._w_long(source_mtime) + source_timestamp = self.importlib._w_long(source_mtime) with open(bytecode_path, 'rb') as bytecode_file: bytecode_file.seek(4) self.assertEqual(bytecode_file.read(4), source_timestamp) @@ -397,7 +404,7 @@ with source_util.create_modules('_temp') as mapping: # Create bytecode that will need to be re-created. py_compile.compile(mapping['_temp']) - bytecode_path = importlib.util.cache_from_source(mapping['_temp']) + bytecode_path = self.util.cache_from_source(mapping['_temp']) with open(bytecode_path, 'r+b') as bytecode_file: bytecode_file.seek(0) bytecode_file.write(b'\x00\x00\x00\x00') @@ -411,10 +418,16 @@ # Make writable for eventual clean-up. 
os.chmod(bytecode_path, stat.S_IWUSR) +Frozen_SourceBadBytecode, Source_SourceBadBytecode = util.test_both( + SourceLoaderBadBytecodeTest, importlib=importlib, machinery=machinery, + abc=importlib_abc, util=importlib_util) + class SourcelessLoaderBadBytecodeTest(BadBytecodeTest): - loader = machinery.SourcelessFileLoader + @classmethod + def setUpClass(cls): + cls.loader = cls.machinery.SourcelessFileLoader def test_empty_file(self): def test(name, mapping, bytecode_path): @@ -469,6 +482,9 @@ def test_non_code_marshal(self): self._test_non_code_marshal(del_source=True) +Frozen_SourcelessBadBytecode, Source_SourcelessBadBytecode = util.test_both( + SourcelessLoaderBadBytecodeTest, importlib=importlib, + machinery=machinery, abc=importlib_abc, util=importlib_util) if __name__ == '__main__': diff --git a/Lib/test/test_importlib/source/test_finder.py b/Lib/test/test_importlib/source/test_finder.py --- a/Lib/test/test_importlib/source/test_finder.py +++ b/Lib/test/test_importlib/source/test_finder.py @@ -1,7 +1,9 @@ from .. import abc +from .. import util from . import util as source_util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import errno import os import py_compile @@ -13,7 +15,7 @@ import warnings -class FinderTests(unittest.TestCase, abc.FinderTests): +class FinderTests(abc.FinderTests): """For a top-level module, it should just be found directly in the directory being searched. This is true for a directory with source @@ -38,11 +40,11 @@ """ def get_finder(self, root): - loader_details = [(machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES), - (machinery.SourcelessFileLoader, - machinery.BYTECODE_SUFFIXES)] - return machinery.FileFinder(root, *loader_details) + loader_details = [(self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES), + (self.machinery.SourcelessFileLoader, + self.machinery.BYTECODE_SUFFIXES)] + return self.machinery.FileFinder(root, *loader_details) def import_(self, root, module): return self.get_finder(root).find_module(module) @@ -123,8 +125,8 @@ def test_empty_string_for_dir(self): # The empty string from sys.path means to search in the cwd. - finder = machinery.FileFinder('', (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + finder = self.machinery.FileFinder('', (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) with open('mod.py', 'w') as file: file.write("# test file for importlib") try: @@ -135,8 +137,8 @@ def test_invalidate_caches(self): # invalidate_caches() should reset the mtime. - finder = machinery.FileFinder('', (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + finder = self.machinery.FileFinder('', (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) finder._path_mtime = 42 finder.invalidate_caches() self.assertEqual(finder._path_mtime, -1) @@ -180,11 +182,9 @@ finder = self.get_finder(file_obj.name) self.assertEqual((None, []), finder.find_loader('doesnotexist')) +Frozen_FinderTests, Source_FinderTests = util.test_both(FinderTests, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_path_hook.py b/Lib/test/test_importlib/source/test_path_hook.py --- a/Lib/test/test_importlib/source/test_path_hook.py +++ b/Lib/test/test_importlib/source/test_path_hook.py @@ -1,16 +1,18 @@ +from .. import util from . 
import util as source_util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import unittest -class PathHookTest(unittest.TestCase): +class PathHookTest: """Test the path hook for source.""" def path_hook(self): - return machinery.FileFinder.path_hook((machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + return self.machinery.FileFinder.path_hook((self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) def test_success(self): with source_util.create_modules('dummy') as mapping: @@ -21,11 +23,8 @@ # The empty string represents the cwd. self.assertTrue(hasattr(self.path_hook()(''), 'find_module')) - -def test_main(): - from test.support import run_unittest - run_unittest(PathHookTest) +Frozen_PathHookTest, Source_PathHooktest = util.test_both(PathHookTest, machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_source_encoding.py b/Lib/test/test_importlib/source/test_source_encoding.py --- a/Lib/test/test_importlib/source/test_source_encoding.py +++ b/Lib/test/test_importlib/source/test_source_encoding.py @@ -1,6 +1,8 @@ +from .. import util from . import util as source_util -from importlib import _bootstrap +machinery = util.import_importlib('importlib.machinery') + import codecs import re import sys @@ -13,7 +15,7 @@ CODING_RE = re.compile(r'^[ \t\f]*#.*coding[:=][ \t]*([-\w.]+)', re.ASCII) -class EncodingTest(unittest.TestCase): +class EncodingTest: """PEP 3120 makes UTF-8 the default encoding for source code [default encoding]. @@ -35,7 +37,7 @@ with source_util.create_modules(self.module_name) as mapping: with open(mapping[self.module_name], 'wb') as file: file.write(source) - loader = _bootstrap.SourceFileLoader(self.module_name, + loader = self.machinery.SourceFileLoader(self.module_name, mapping[self.module_name]) return loader.load_module(self.module_name) @@ -84,8 +86,10 @@ with self.assertRaises(SyntaxError): self.run_test(source) +Frozen_EncodingTest, Source_EncodingTest = util.test_both(EncodingTest, machinery=machinery) -class LineEndingTest(unittest.TestCase): + +class LineEndingTest: r"""Source written with the three types of line endings (\n, \r\n, \r) need to be readable [cr][crlf][lf].""" @@ -97,7 +101,7 @@ with source_util.create_modules(module_name) as mapping: with open(mapping[module_name], 'wb') as file: file.write(source) - loader = _bootstrap.SourceFileLoader(module_name, + loader = self.machinery.SourceFileLoader(module_name, mapping[module_name]) return loader.load_module(module_name) @@ -113,11 +117,9 @@ def test_lf(self): self.run_test(b'\n') +Frozen_LineEndings, Source_LineEndings = util.test_both(LineEndingTest, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(EncodingTest, LineEndingTest) if __name__ == '__main__': - test_main() + unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:27:50 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 8 Nov 2013 20:27:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_remove_dead_import?= Message-ID: <3dGWj22byJz7Lm6@mail.python.org> http://hg.python.org/cpython/rev/6f4a6aa30d7f changeset: 87010:6f4a6aa30d7f user: Brett Cannon date: Fri Nov 08 14:27:42 2013 -0500 summary: remove dead import files: Lib/test/test_import.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_import.py b/Lib/test/test_import.py --- 
a/Lib/test/test_import.py +++ b/Lib/test/test_import.py @@ -3,7 +3,6 @@ import importlib.util from importlib._bootstrap import _get_sourcefile import builtins -from test.test_importlib.import_ import util as importlib_util import marshal import os import platform -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:38:18 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 20:38:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Change_style_to_match_the_?= =?utf-8?q?surrounding_code_=28no_early_returns=29=2E?= Message-ID: <3dGWx62C0mz7LjY@mail.python.org> http://hg.python.org/cpython/rev/0591bf2edcf1 changeset: 87011:0591bf2edcf1 parent: 87003:0f48843652b1 user: Stefan Krah date: Fri Nov 08 17:48:58 2013 +0100 summary: Change style to match the surrounding code (no early returns). files: Modules/_decimal/_decimal.c | 7 +++---- 1 files changed, 3 insertions(+), 4 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3010,12 +3010,11 @@ } } else { - int is_instance = PyObject_IsInstance(w, Rational); - if (is_instance < 0) { + int is_rational = PyObject_IsInstance(w, Rational); + if (is_rational < 0) { *wcmp = NULL; - return 0; } - if (is_instance) { + else if (is_rational > 0) { *wcmp = numerator_as_decimal(w, context); if (*wcmp && !mpd_isspecial(MPD(v))) { *vcmp = multiply_by_denominator(v, w, context); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:38:19 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 20:38:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Move_PyErr=5FNoMemory=28?= =?utf-8?q?=29_closer_to_the_failure=2E?= Message-ID: <3dGWx74GX5z7Lk5@mail.python.org> http://hg.python.org/cpython/rev/9fc542fdb030 changeset: 87012:9fc542fdb030 user: Stefan Krah date: Fri Nov 08 18:05:02 2013 +0100 summary: Move PyErr_NoMemory() closer to the failure. 
files: Modules/_decimal/_decimal.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3108,6 +3108,7 @@ { char *dest = PyMem_Malloc(size+1); if (dest == NULL) { + PyErr_NoMemory(); return NULL; } @@ -3186,7 +3187,6 @@ replace_fillchar = 1; fmt = dec_strdup(fmt, size); if (fmt == NULL) { - PyErr_NoMemory(); return NULL; } fmt[0] = '_'; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:38:20 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 20:38:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Valgrind=3A_suppress_false?= =?utf-8?q?_positive_in_=5FPyOS=5FGetOpt_=28getopt=2Ec=3A84=29_=28Invalid_?= =?utf-8?q?read?= Message-ID: <3dGWx8613dz7Ljt@mail.python.org> http://hg.python.org/cpython/rev/338324a84716 changeset: 87013:338324a84716 user: Stefan Krah date: Fri Nov 08 20:18:09 2013 +0100 summary: Valgrind: suppress false positive in _PyOS_GetOpt (getopt.c:84) (Invalid read of size 8: wcscmp (wcscmp.S:464)) files: Misc/valgrind-python.supp | 9 +++++++++ 1 files changed, 9 insertions(+), 0 deletions(-) diff --git a/Misc/valgrind-python.supp b/Misc/valgrind-python.supp --- a/Misc/valgrind-python.supp +++ b/Misc/valgrind-python.supp @@ -456,6 +456,15 @@ fun:PyUnicode_FSConverter } +{ + wcscmp_false_positive + Memcheck:Addr8 + fun:wcscmp + fun:_PyOS_GetOpt + fun:Py_Main + fun:main +} + # Additional suppressions for the unified decimal tests: { test_decimal -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:38:22 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 20:38:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dGWxB6cnpz7Lkg@mail.python.org> http://hg.python.org/cpython/rev/36912c3e8de4 changeset: 87014:36912c3e8de4 parent: 87013:338324a84716 parent: 87010:6f4a6aa30d7f user: Stefan Krah date: Fri Nov 08 20:37:01 2013 +0100 summary: Merge. 
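Most of the test_importlib churn merged in below follows a single pattern: each test class drops its own unittest.TestCase base, and a test_both() helper then generates two concrete TestCase subclasses from it, one bound to the frozen importlib built into the interpreter and one to its pure Python source copy, with the chosen module injected as a class attribute. A rough sketch of how such a helper can work; this is purely illustrative, the real helper lives in test.test_importlib.util and differs in detail:

    import types
    import unittest

    def test_both(test_class, **kwargs):
        # Each keyword maps an attribute name to a (frozen, source) pair,
        # e.g. machinery=(frozen_machinery, source_machinery); the matching
        # element is attached to the generated TestCase subclass.
        def make(label, index):
            attrs = {name: pair[index] for name, pair in kwargs.items()}
            return types.new_class(label + '_' + test_class.__name__,
                                   (test_class, unittest.TestCase),
                                   exec_body=lambda ns: ns.update(attrs))
        return make('Frozen', 0), make('Source', 1)

    class DemoTests:                     # note: no TestCase base of its own
        def test_flavour(self):
            self.assertIn(self.flavour, ('frozen', 'source'))

    Frozen_DemoTests, Source_DemoTests = test_both(DemoTests,
                                                   flavour=('frozen', 'source'))

    if __name__ == '__main__':
        unittest.main()

This is why the hunks below can replace module-level references such as machinery.FrozenImporter with self.machinery.FrozenImporter: the concrete module is supplied per generated subclass by the helper rather than imported at module scope.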
files: Include/dictobject.h | 5 +- Lib/subprocess.py | 238 ++------- Lib/test/test_import.py | 1 - Lib/test/test_importlib/__main__.py | 13 +- Lib/test/test_importlib/frozen/test_finder.py | 16 +- Lib/test/test_importlib/frozen/test_loader.py | 48 +- Lib/test/test_importlib/import_/test___loader__.py | 10 +- Lib/test/test_importlib/import_/test___package__.py | 33 +- Lib/test/test_importlib/import_/test_api.py | 31 +- Lib/test/test_importlib/import_/test_caching.py | 30 +- Lib/test/test_importlib/import_/test_fromlist.py | 33 +- Lib/test/test_importlib/import_/test_meta_path.py | 24 +- Lib/test/test_importlib/import_/test_packages.py | 30 +- Lib/test/test_importlib/import_/test_path.py | 36 +- Lib/test/test_importlib/import_/test_relative_imports.py | 51 +- Lib/test/test_importlib/import_/util.py | 20 +- Lib/test/test_importlib/source/test_case_sensitivity.py | 28 +- Lib/test/test_importlib/source/test_file_loader.py | 70 +- Lib/test/test_importlib/source/test_finder.py | 30 +- Lib/test/test_importlib/source/test_path_hook.py | 17 +- Lib/test/test_importlib/source/test_source_encoding.py | 20 +- Lib/test/test_subprocess.py | 10 +- 22 files changed, 349 insertions(+), 445 deletions(-) diff --git a/Include/dictobject.h b/Include/dictobject.h --- a/Include/dictobject.h +++ b/Include/dictobject.h @@ -109,12 +109,13 @@ PyAPI_FUNC(int) PyDict_SetItemString(PyObject *dp, const char *key, PyObject *item); PyAPI_FUNC(int) _PyDict_SetItemId(PyObject *dp, struct _Py_Identifier *key, PyObject *item); PyAPI_FUNC(int) PyDict_DelItemString(PyObject *dp, const char *key); -PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); #ifndef Py_LIMITED_API +PyAPI_FUNC(int) _PyDict_DelItemId(PyObject *mp, struct _Py_Identifier *key); +PyAPI_FUNC(void) _PyDict_DebugMallocStats(FILE *out); + int _PyObjectDict_SetItem(PyTypeObject *tp, PyObject **dictptr, PyObject *name, PyObject *value); PyObject *_PyDict_LoadGlobal(PyDictObject *, PyDictObject *, PyObject *); -PyAPI_FUNC(void) _PyDict_DebugMallocStats(FILE *out); #endif #ifdef __cplusplus diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -404,15 +404,23 @@ hStdError = None wShowWindow = 0 else: + import _posixsubprocess import select - _has_poll = hasattr(select, 'poll') - import _posixsubprocess + import selectors # When select or poll has indicated that the file is writable, # we can write up to _PIPE_BUF bytes without risk of blocking. # POSIX defines PIPE_BUF as >= 512. _PIPE_BUF = getattr(select, 'PIPE_BUF', 512) + # poll/select have the advantage of not requiring any extra file + # descriptor, contrarily to epoll/kqueue (also, they require a single + # syscall). + if hasattr(selectors, 'PollSelector'): + _PopenSelector = selectors.PollSelector + else: + _PopenSelector = selectors.SelectSelector + __all__ = ["Popen", "PIPE", "STDOUT", "call", "check_call", "getstatusoutput", "getoutput", "check_output", "CalledProcessError", "DEVNULL"] @@ -1530,12 +1538,65 @@ if not input: self.stdin.close() - if _has_poll: - stdout, stderr = self._communicate_with_poll(input, endtime, - orig_timeout) - else: - stdout, stderr = self._communicate_with_select(input, endtime, - orig_timeout) + stdout = None + stderr = None + + # Only create this mapping if we haven't already. 
+ if not self._communication_started: + self._fileobj2output = {} + if self.stdout: + self._fileobj2output[self.stdout] = [] + if self.stderr: + self._fileobj2output[self.stderr] = [] + + if self.stdout: + stdout = self._fileobj2output[self.stdout] + if self.stderr: + stderr = self._fileobj2output[self.stderr] + + self._save_input(input) + + with _PopenSelector() as selector: + if self.stdin and input: + selector.register(self.stdin, selectors.EVENT_WRITE) + if self.stdout: + selector.register(self.stdout, selectors.EVENT_READ) + if self.stderr: + selector.register(self.stderr, selectors.EVENT_READ) + + while selector.get_map(): + timeout = self._remaining_time(endtime) + if timeout is not None and timeout < 0: + raise TimeoutExpired(self.args, orig_timeout) + + ready = selector.select(timeout) + self._check_timeout(endtime, orig_timeout) + + # XXX Rewrite these to use non-blocking I/O on the file + # objects; they are no longer using C stdio! + + for key, events in ready: + if key.fileobj is self.stdin: + chunk = self._input[self._input_offset : + self._input_offset + _PIPE_BUF] + try: + self._input_offset += os.write(key.fd, chunk) + except OSError as e: + if e.errno == errno.EPIPE: + selector.unregister(key.fileobj) + key.fileobj.close() + else: + raise + else: + if self._input_offset >= len(self._input): + selector.unregister(key.fileobj) + key.fileobj.close() + elif key.fileobj in (self.stdout, self.stderr): + data = os.read(key.fd, 4096) + if not data: + selector.unregister(key.fileobj) + key.fileobj.close() + self._fileobj2output[key.fileobj].append(data) self.wait(timeout=self._remaining_time(endtime)) @@ -1569,167 +1630,6 @@ self._input = self._input.encode(self.stdin.encoding) - def _communicate_with_poll(self, input, endtime, orig_timeout): - stdout = None # Return - stderr = None # Return - - if not self._communication_started: - self._fd2file = {} - - poller = select.poll() - def register_and_append(file_obj, eventmask): - poller.register(file_obj.fileno(), eventmask) - self._fd2file[file_obj.fileno()] = file_obj - - def close_unregister_and_remove(fd): - poller.unregister(fd) - self._fd2file[fd].close() - self._fd2file.pop(fd) - - if self.stdin and input: - register_and_append(self.stdin, select.POLLOUT) - - # Only create this mapping if we haven't already. - if not self._communication_started: - self._fd2output = {} - if self.stdout: - self._fd2output[self.stdout.fileno()] = [] - if self.stderr: - self._fd2output[self.stderr.fileno()] = [] - - select_POLLIN_POLLPRI = select.POLLIN | select.POLLPRI - if self.stdout: - register_and_append(self.stdout, select_POLLIN_POLLPRI) - stdout = self._fd2output[self.stdout.fileno()] - if self.stderr: - register_and_append(self.stderr, select_POLLIN_POLLPRI) - stderr = self._fd2output[self.stderr.fileno()] - - self._save_input(input) - - while self._fd2file: - timeout = self._remaining_time(endtime) - if timeout is not None and timeout < 0: - raise TimeoutExpired(self.args, orig_timeout) - try: - ready = poller.poll(timeout) - except OSError as e: - if e.args[0] == errno.EINTR: - continue - raise - self._check_timeout(endtime, orig_timeout) - - # XXX Rewrite these to use non-blocking I/O on the - # file objects; they are no longer using C stdio! 
- - for fd, mode in ready: - if mode & select.POLLOUT: - chunk = self._input[self._input_offset : - self._input_offset + _PIPE_BUF] - try: - self._input_offset += os.write(fd, chunk) - except OSError as e: - if e.errno == errno.EPIPE: - close_unregister_and_remove(fd) - else: - raise - else: - if self._input_offset >= len(self._input): - close_unregister_and_remove(fd) - elif mode & select_POLLIN_POLLPRI: - data = os.read(fd, 4096) - if not data: - close_unregister_and_remove(fd) - self._fd2output[fd].append(data) - else: - # Ignore hang up or errors. - close_unregister_and_remove(fd) - - return (stdout, stderr) - - - def _communicate_with_select(self, input, endtime, orig_timeout): - if not self._communication_started: - self._read_set = [] - self._write_set = [] - if self.stdin and input: - self._write_set.append(self.stdin) - if self.stdout: - self._read_set.append(self.stdout) - if self.stderr: - self._read_set.append(self.stderr) - - self._save_input(input) - - stdout = None # Return - stderr = None # Return - - if self.stdout: - if not self._communication_started: - self._stdout_buff = [] - stdout = self._stdout_buff - if self.stderr: - if not self._communication_started: - self._stderr_buff = [] - stderr = self._stderr_buff - - while self._read_set or self._write_set: - timeout = self._remaining_time(endtime) - if timeout is not None and timeout < 0: - raise TimeoutExpired(self.args, orig_timeout) - try: - (rlist, wlist, xlist) = \ - select.select(self._read_set, self._write_set, [], - timeout) - except OSError as e: - if e.args[0] == errno.EINTR: - continue - raise - - # According to the docs, returning three empty lists indicates - # that the timeout expired. - if not (rlist or wlist or xlist): - raise TimeoutExpired(self.args, orig_timeout) - # We also check what time it is ourselves for good measure. - self._check_timeout(endtime, orig_timeout) - - # XXX Rewrite these to use non-blocking I/O on the - # file objects; they are no longer using C stdio! - - if self.stdin in wlist: - chunk = self._input[self._input_offset : - self._input_offset + _PIPE_BUF] - try: - bytes_written = os.write(self.stdin.fileno(), chunk) - except OSError as e: - if e.errno == errno.EPIPE: - self.stdin.close() - self._write_set.remove(self.stdin) - else: - raise - else: - self._input_offset += bytes_written - if self._input_offset >= len(self._input): - self.stdin.close() - self._write_set.remove(self.stdin) - - if self.stdout in rlist: - data = os.read(self.stdout.fileno(), 1024) - if not data: - self.stdout.close() - self._read_set.remove(self.stdout) - stdout.append(data) - - if self.stderr in rlist: - data = os.read(self.stderr.fileno(), 1024) - if not data: - self.stderr.close() - self._read_set.remove(self.stderr) - stderr.append(data) - - return (stdout, stderr) - - def send_signal(self, sig): """Send a signal to the process """ diff --git a/Lib/test/test_import.py b/Lib/test/test_import.py --- a/Lib/test/test_import.py +++ b/Lib/test/test_import.py @@ -3,7 +3,6 @@ import importlib.util from importlib._bootstrap import _get_sourcefile import builtins -from test.test_importlib.import_ import util as importlib_util import marshal import os import platform diff --git a/Lib/test/test_importlib/__main__.py b/Lib/test/test_importlib/__main__.py --- a/Lib/test/test_importlib/__main__.py +++ b/Lib/test/test_importlib/__main__.py @@ -4,17 +4,6 @@ builtins.__import__ instead of importlib.__import__. """ -from . 
import test_main - - if __name__ == '__main__': - import argparse - - parser = argparse.ArgumentParser(description='Execute the importlib test ' - 'suite') - parser.add_argument('-b', '--builtin', action='store_true', default=False, - help='use builtins.__import__() instead of importlib') - args = parser.parse_args() - if args.builtin: - util.using___import__ = True + from . import test_main test_main() diff --git a/Lib/test/test_importlib/frozen/test_finder.py b/Lib/test/test_importlib/frozen/test_finder.py --- a/Lib/test/test_importlib/frozen/test_finder.py +++ b/Lib/test/test_importlib/frozen/test_finder.py @@ -1,15 +1,17 @@ -from importlib import machinery from .. import abc +from .. import util + +machinery = util.import_importlib('importlib.machinery') import unittest -class FinderTests(unittest.TestCase, abc.FinderTests): +class FinderTests(abc.FinderTests): """Test finding frozen modules.""" def find(self, name, path=None): - finder = machinery.FrozenImporter + finder = self.machinery.FrozenImporter return finder.find_module(name, path) def test_module(self): @@ -37,11 +39,9 @@ loader = self.find('') self.assertIsNone(loader) - -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) +Frozen_FinderTests, Source_FinderTests = util.test_both(FinderTests, + machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/frozen/test_loader.py b/Lib/test/test_importlib/frozen/test_loader.py --- a/Lib/test/test_importlib/frozen/test_loader.py +++ b/Lib/test/test_importlib/frozen/test_loader.py @@ -1,20 +1,21 @@ from .. import abc from .. import util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import unittest from test.support import captured_stdout import types -class LoaderTests(unittest.TestCase, abc.LoaderTests): +class LoaderTests(abc.LoaderTests): def test_module(self): with util.uncache('__hello__'), captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__hello__') + module = self.machinery.FrozenImporter.load_module('__hello__') check = {'__name__': '__hello__', '__package__': '', - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): self.assertEqual(getattr(module, attr), value) @@ -23,11 +24,11 @@ def test_package(self): with util.uncache('__phello__'), captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__phello__') + module = self.machinery.FrozenImporter.load_module('__phello__') check = {'__name__': '__phello__', '__package__': '__phello__', '__path__': [], - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): attr_value = getattr(module, attr) @@ -40,10 +41,10 @@ def test_lacking_parent(self): with util.uncache('__phello__', '__phello__.spam'), \ captured_stdout() as stdout: - module = machinery.FrozenImporter.load_module('__phello__.spam') + module = self.machinery.FrozenImporter.load_module('__phello__.spam') check = {'__name__': '__phello__.spam', '__package__': '__phello__', - '__loader__': machinery.FrozenImporter, + '__loader__': self.machinery.FrozenImporter, } for attr, value in check.items(): attr_value = getattr(module, attr) @@ -55,15 +56,15 @@ def test_module_reuse(self): with util.uncache('__hello__'), captured_stdout() as stdout: - module1 = machinery.FrozenImporter.load_module('__hello__') - module2 = 
machinery.FrozenImporter.load_module('__hello__') + module1 = self.machinery.FrozenImporter.load_module('__hello__') + module2 = self.machinery.FrozenImporter.load_module('__hello__') self.assertIs(module1, module2) self.assertEqual(stdout.getvalue(), 'Hello world!\nHello world!\n') def test_module_repr(self): with util.uncache('__hello__'), captured_stdout(): - module = machinery.FrozenImporter.load_module('__hello__') + module = self.machinery.FrozenImporter.load_module('__hello__') self.assertEqual(repr(module), "") @@ -72,13 +73,16 @@ pass def test_unloadable(self): - assert machinery.FrozenImporter.find_module('_not_real') is None + assert self.machinery.FrozenImporter.find_module('_not_real') is None with self.assertRaises(ImportError) as cm: - machinery.FrozenImporter.load_module('_not_real') + self.machinery.FrozenImporter.load_module('_not_real') self.assertEqual(cm.exception.name, '_not_real') +Frozen_LoaderTests, Source_LoaderTests = util.test_both(LoaderTests, + machinery=machinery) -class InspectLoaderTests(unittest.TestCase): + +class InspectLoaderTests: """Tests for the InspectLoader methods for FrozenImporter.""" @@ -86,7 +90,7 @@ # Make sure that the code object is good. name = '__hello__' with captured_stdout() as stdout: - code = machinery.FrozenImporter.get_code(name) + code = self.machinery.FrozenImporter.get_code(name) mod = types.ModuleType(name) exec(code, mod.__dict__) self.assertTrue(hasattr(mod, 'initialized')) @@ -94,7 +98,7 @@ def test_get_source(self): # Should always return None. - result = machinery.FrozenImporter.get_source('__hello__') + result = self.machinery.FrozenImporter.get_source('__hello__') self.assertIsNone(result) def test_is_package(self): @@ -102,22 +106,20 @@ test_for = (('__hello__', False), ('__phello__', True), ('__phello__.spam', False)) for name, is_package in test_for: - result = machinery.FrozenImporter.is_package(name) + result = self.machinery.FrozenImporter.is_package(name) self.assertEqual(bool(result), is_package) def test_failure(self): # Raise ImportError for modules that are not frozen. 
for meth_name in ('get_code', 'get_source', 'is_package'): - method = getattr(machinery.FrozenImporter, meth_name) + method = getattr(self.machinery.FrozenImporter, meth_name) with self.assertRaises(ImportError) as cm: method('importlib') self.assertEqual(cm.exception.name, 'importlib') - -def test_main(): - from test.support import run_unittest - run_unittest(LoaderTests, InspectLoaderTests) +Frozen_ILTests, Source_ILTests = util.test_both(InspectLoaderTests, + machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test___loader__.py b/Lib/test/test_importlib/import_/test___loader__.py --- a/Lib/test/test_importlib/import_/test___loader__.py +++ b/Lib/test/test_importlib/import_/test___loader__.py @@ -16,7 +16,7 @@ return self.module -class LoaderAttributeTests(unittest.TestCase): +class LoaderAttributeTests: def test___loader___missing(self): module = types.ModuleType('blah') @@ -27,7 +27,7 @@ loader = LoaderMock() loader.module = module with util.uncache('blah'), util.import_state(meta_path=[loader]): - module = import_util.import_('blah') + module = self.__import__('blah') self.assertEqual(loader, module.__loader__) def test___loader___is_None(self): @@ -36,9 +36,13 @@ loader = LoaderMock() loader.module = module with util.uncache('blah'), util.import_state(meta_path=[loader]): - returned_module = import_util.import_('blah') + returned_module = self.__import__('blah') self.assertEqual(loader, module.__loader__) +Frozen_Tests, Source_Tests = util.test_both(LoaderAttributeTests, + __import__=import_util.__import__) + + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_importlib/import_/test___package__.py b/Lib/test/test_importlib/import_/test___package__.py --- a/Lib/test/test_importlib/import_/test___package__.py +++ b/Lib/test/test_importlib/import_/test___package__.py @@ -9,7 +9,7 @@ from . import util as import_util -class Using__package__(unittest.TestCase): +class Using__package__: """Use of __package__ supercedes the use of __name__/__path__ to calculate what package a module belongs to. 
The basic algorithm is [__package__]:: @@ -38,8 +38,8 @@ # [__package__] with util.mock_modules('pkg.__init__', 'pkg.fake') as importer: with util.import_state(meta_path=[importer]): - import_util.import_('pkg.fake') - module = import_util.import_('', + self.__import__('pkg.fake') + module = self.__import__('', globals={'__package__': 'pkg.fake'}, fromlist=['attr'], level=2) self.assertEqual(module.__name__, 'pkg') @@ -51,8 +51,8 @@ globals_['__package__'] = None with util.mock_modules('pkg.__init__', 'pkg.fake') as importer: with util.import_state(meta_path=[importer]): - import_util.import_('pkg.fake') - module = import_util.import_('', globals= globals_, + self.__import__('pkg.fake') + module = self.__import__('', globals= globals_, fromlist=['attr'], level=2) self.assertEqual(module.__name__, 'pkg') @@ -63,15 +63,17 @@ def test_bad__package__(self): globals = {'__package__': ''} with self.assertRaises(SystemError): - import_util.import_('', globals, {}, ['relimport'], 1) + self.__import__('', globals, {}, ['relimport'], 1) def test_bunk__package__(self): globals = {'__package__': 42} with self.assertRaises(TypeError): - import_util.import_('', globals, {}, ['relimport'], 1) + self.__import__('', globals, {}, ['relimport'], 1) +Frozen_UsingPackage, Source_UsingPackage = util.test_both( + Using__package__, __import__=import_util.__import__) - at import_util.importlib_only + class Setting__package__(unittest.TestCase): """Because __package__ is a new feature, it is not always set by a loader. @@ -84,12 +86,14 @@ """ + __import__ = import_util.__import__[1] + # [top-level] def test_top_level(self): with util.mock_modules('top_level') as mock: with util.import_state(meta_path=[mock]): del mock['top_level'].__package__ - module = import_util.import_('top_level') + module = self.__import__('top_level') self.assertEqual(module.__package__, '') # [package] @@ -97,7 +101,7 @@ with util.mock_modules('pkg.__init__') as mock: with util.import_state(meta_path=[mock]): del mock['pkg'].__package__ - module = import_util.import_('pkg') + module = self.__import__('pkg') self.assertEqual(module.__package__, 'pkg') # [submodule] @@ -105,15 +109,10 @@ with util.mock_modules('pkg.__init__', 'pkg.mod') as mock: with util.import_state(meta_path=[mock]): del mock['pkg.mod'].__package__ - pkg = import_util.import_('pkg.mod') + pkg = self.__import__('pkg.mod') module = getattr(pkg, 'mod') self.assertEqual(module.__package__, 'pkg') -def test_main(): - from test.support import run_unittest - run_unittest(Using__package__, Setting__package__) - - if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_api.py b/Lib/test/test_importlib/import_/test_api.py --- a/Lib/test/test_importlib/import_/test_api.py +++ b/Lib/test/test_importlib/import_/test_api.py @@ -1,5 +1,5 @@ -from .. import util as importlib_test_util -from . import util +from .. import util +from . import util as import_util import sys import types import unittest @@ -17,7 +17,7 @@ raise ImportError('I cannot be loaded!') -class APITest(unittest.TestCase): +class APITest: """Test API-specific details for __import__ (e.g. raising the right exception when passing in an int for the module name).""" @@ -25,24 +25,24 @@ def test_name_requires_rparition(self): # Raise TypeError if a non-string is passed in for the module name. with self.assertRaises(TypeError): - util.import_(42) + self.__import__(42) def test_negative_level(self): # Raise ValueError when a negative level is specified. 
# PEP 328 did away with sys.module None entries and the ambiguity of # absolute/relative imports. with self.assertRaises(ValueError): - util.import_('os', globals(), level=-1) + self.__import__('os', globals(), level=-1) def test_nonexistent_fromlist_entry(self): # If something in fromlist doesn't exist, that's okay. # issue15715 mod = types.ModuleType('fine') mod.__path__ = ['XXX'] - with importlib_test_util.import_state(meta_path=[BadLoaderFinder]): - with importlib_test_util.uncache('fine'): + with util.import_state(meta_path=[BadLoaderFinder]): + with util.uncache('fine'): sys.modules['fine'] = mod - util.import_('fine', fromlist=['not here']) + self.__import__('fine', fromlist=['not here']) def test_fromlist_load_error_propagates(self): # If something in fromlist triggers an exception not related to not @@ -50,18 +50,15 @@ # issue15316 mod = types.ModuleType('fine') mod.__path__ = ['XXX'] - with importlib_test_util.import_state(meta_path=[BadLoaderFinder]): - with importlib_test_util.uncache('fine'): + with util.import_state(meta_path=[BadLoaderFinder]): + with util.uncache('fine'): sys.modules['fine'] = mod with self.assertRaises(ImportError): - util.import_('fine', fromlist=['bogus']) + self.__import__('fine', fromlist=['bogus']) - - -def test_main(): - from test.support import run_unittest - run_unittest(APITest) +Frozen_APITests, Source_APITests = util.test_both( + APITest, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_caching.py b/Lib/test/test_importlib/import_/test_caching.py --- a/Lib/test/test_importlib/import_/test_caching.py +++ b/Lib/test/test_importlib/import_/test_caching.py @@ -6,7 +6,7 @@ import unittest -class UseCache(unittest.TestCase): +class UseCache: """When it comes to sys.modules, import prefers it over anything else. @@ -21,12 +21,13 @@ ImportError is raised [None in cache]. """ + def test_using_cache(self): # [use cache] module_to_use = "some module found!" with util.uncache('some_module'): sys.modules['some_module'] = module_to_use - module = import_util.import_('some_module') + module = self.__import__('some_module') self.assertEqual(id(module_to_use), id(module)) def test_None_in_cache(self): @@ -35,7 +36,7 @@ with util.uncache(name): sys.modules[name] = None with self.assertRaises(ImportError) as cm: - import_util.import_(name) + self.__import__(name) self.assertEqual(cm.exception.name, name) def create_mock(self, *names, return_=None): @@ -47,42 +48,43 @@ mock.load_module = MethodType(load_module, mock) return mock +Frozen_UseCache, Source_UseCache = util.test_both( + UseCache, __import__=import_util.__import__) + + +class ImportlibUseCache(UseCache, unittest.TestCase): + + __import__ = import_util.__import__[1] + # __import__ inconsistent between loaders and built-in import when it comes # to when to use the module in sys.modules and when not to. - @import_util.importlib_only def test_using_cache_after_loader(self): # [from cache on return] with self.create_mock('module') as mock: with util.import_state(meta_path=[mock]): - module = import_util.import_('module') + module = self.__import__('module') self.assertEqual(id(module), id(sys.modules['module'])) # See test_using_cache_after_loader() for reasoning. 
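A note on the conversion pattern visible here and throughout this changeset: plain (non-TestCase) base classes such as UseCache are run twice via util.test_both(), once against the frozen importlib and once against the pure-Python source importlib, with each keyword argument being a (frozen, source) pair whose matching element becomes a class attribute (self.__import__, self.machinery, and so on). The helper itself is not part of the quoted hunks; the following is only a minimal sketch of how such a helper could be written, with the names and details assumed rather than taken from CPython:

import unittest

def test_both(test_class, **kwargs):
    # Sketch only: build Frozen_/Source_ TestCase variants of a plain test
    # class.  Each keyword value is assumed to be a (frozen, source) pair;
    # the matching element is bound as a class attribute so test methods
    # can use self.machinery, self.__import__, etc. in either variant.
    def make(prefix, index):
        attrs = {name: pair[index] for name, pair in kwargs.items()}
        return type(prefix + test_class.__name__,
                    (test_class, unittest.TestCase), attrs)
    return make('Frozen_', 0), make('Source_', 1)

Importlib-only cases such as ImportlibUseCache above then simply pick one element of the pair directly, e.g. __import__ = import_util.__import__[1], instead of using the old importlib_only skip decorator.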
- @import_util.importlib_only def test_using_cache_for_assigning_to_attribute(self): # [from cache to attribute] with self.create_mock('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertTrue(hasattr(module, 'module')) self.assertEqual(id(module.module), id(sys.modules['pkg.module'])) # See test_using_cache_after_loader() for reasoning. - @import_util.importlib_only def test_using_cache_for_fromlist(self): # [from cache for fromlist] with self.create_mock('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg', fromlist=['module']) + module = self.__import__('pkg', fromlist=['module']) self.assertTrue(hasattr(module, 'module')) self.assertEqual(id(module.module), id(sys.modules['pkg.module'])) -def test_main(): - from test.support import run_unittest - run_unittest(UseCache) - if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_fromlist.py b/Lib/test/test_importlib/import_/test_fromlist.py --- a/Lib/test/test_importlib/import_/test_fromlist.py +++ b/Lib/test/test_importlib/import_/test_fromlist.py @@ -3,7 +3,8 @@ from . import util as import_util import unittest -class ReturnValue(unittest.TestCase): + +class ReturnValue: """The use of fromlist influences what import returns. @@ -18,18 +19,21 @@ # [import return] with util.mock_modules('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertEqual(module.__name__, 'pkg') def test_return_from_from_import(self): # [from return] with util.mock_modules('pkg.__init__', 'pkg.module')as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.module', fromlist=['attr']) + module = self.__import__('pkg.module', fromlist=['attr']) self.assertEqual(module.__name__, 'pkg.module') +Frozen_ReturnValue, Source_ReturnValue = util.test_both( + ReturnValue, __import__=import_util.__import__) -class HandlingFromlist(unittest.TestCase): + +class HandlingFromlist: """Using fromlist triggers different actions based on what is being asked of it. 
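The behaviour the ReturnValue tests above (and the fromlist tests that follow) pin down can also be observed with the builtin directly. A small illustration, assuming some real importable package named pkg with a submodule module, which is exactly what the mocked importers in these tests stand in for:

# Without fromlist, __import__ returns the top-level package...
top = __import__('pkg.module')
assert top.__name__ == 'pkg'

# ...while a non-empty fromlist returns the submodule itself, which is
# what a statement like "from pkg.module import attr" needs.
sub = __import__('pkg.module', fromlist=['attr'])
assert sub.__name__ == 'pkg.module'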
@@ -48,14 +52,14 @@ # [object case] with util.mock_modules('module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('module', fromlist=['attr']) + module = self.__import__('module', fromlist=['attr']) self.assertEqual(module.__name__, 'module') def test_nonexistent_object(self): # [bad object] with util.mock_modules('module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('module', fromlist=['non_existent']) + module = self.__import__('module', fromlist=['non_existent']) self.assertEqual(module.__name__, 'module') self.assertTrue(not hasattr(module, 'non_existent')) @@ -63,7 +67,7 @@ # [module] with util.mock_modules('pkg.__init__', 'pkg.module') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg', fromlist=['module']) + module = self.__import__('pkg', fromlist=['module']) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) self.assertEqual(module.module.__name__, 'pkg.module') @@ -78,13 +82,13 @@ module_code={'pkg.mod': module_code}) as importer: with util.import_state(meta_path=[importer]): with self.assertRaises(ImportError) as exc: - import_util.import_('pkg', fromlist=['mod']) + self.__import__('pkg', fromlist=['mod']) self.assertEqual('i_do_not_exist', exc.exception.name) def test_empty_string(self): with util.mock_modules('pkg.__init__', 'pkg.mod') as importer: with util.import_state(meta_path=[importer]): - module = import_util.import_('pkg.mod', fromlist=['']) + module = self.__import__('pkg.mod', fromlist=['']) self.assertEqual(module.__name__, 'pkg.mod') def basic_star_test(self, fromlist=['*']): @@ -92,7 +96,7 @@ with util.mock_modules('pkg.__init__', 'pkg.module') as mock: with util.import_state(meta_path=[mock]): mock['pkg'].__all__ = ['module'] - module = import_util.import_('pkg', fromlist=fromlist) + module = self.__import__('pkg', fromlist=fromlist) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) self.assertEqual(module.module.__name__, 'pkg.module') @@ -110,17 +114,16 @@ with context as mock: with util.import_state(meta_path=[mock]): mock['pkg'].__all__ = ['module1'] - module = import_util.import_('pkg', fromlist=['module2', '*']) + module = self.__import__('pkg', fromlist=['module2', '*']) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module1')) self.assertTrue(hasattr(module, 'module2')) self.assertEqual(module.module1.__name__, 'pkg.module1') self.assertEqual(module.module2.__name__, 'pkg.module2') +Frozen_FromList, Source_FromList = util.test_both( + HandlingFromlist, __import__=import_util.__import__) -def test_main(): - from test.support import run_unittest - run_unittest(ReturnValue, HandlingFromlist) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_meta_path.py b/Lib/test/test_importlib/import_/test_meta_path.py --- a/Lib/test/test_importlib/import_/test_meta_path.py +++ b/Lib/test/test_importlib/import_/test_meta_path.py @@ -7,7 +7,7 @@ import warnings -class CallingOrder(unittest.TestCase): +class CallingOrder: """Calls to the importers on sys.meta_path happen in order that they are specified in the sequence, starting with the first importer @@ -24,7 +24,7 @@ first.modules[mod] = 42 second.modules[mod] = -13 with util.import_state(meta_path=[first, second]): - self.assertEqual(import_util.import_(mod), 42) + self.assertEqual(self.__import__(mod), 42) def test_continuing(self): # 
[continuing] @@ -34,7 +34,7 @@ first.find_module = lambda self, fullname, path=None: None second.modules[mod_name] = 42 with util.import_state(meta_path=[first, second]): - self.assertEqual(import_util.import_(mod_name), 42) + self.assertEqual(self.__import__(mod_name), 42) def test_empty(self): # Raise an ImportWarning if sys.meta_path is empty. @@ -51,8 +51,11 @@ self.assertEqual(len(w), 1) self.assertTrue(issubclass(w[-1].category, ImportWarning)) +Frozen_CallingOrder, Source_CallingOrder = util.test_both( + CallingOrder, __import__=import_util.__import__) -class CallSignature(unittest.TestCase): + +class CallSignature: """If there is no __path__ entry on the parent module, then 'path' is None [no path]. Otherwise, the value for __path__ is passed in for the 'path' @@ -74,7 +77,7 @@ log, wrapped_call = self.log(importer.find_module) importer.find_module = MethodType(wrapped_call, importer) with util.import_state(meta_path=[importer]): - import_util.import_(mod_name) + self.__import__(mod_name) assert len(log) == 1 args = log[0][0] kwargs = log[0][1] @@ -95,7 +98,7 @@ log, wrapped_call = self.log(importer.find_module) importer.find_module = MethodType(wrapped_call, importer) with util.import_state(meta_path=[importer]): - import_util.import_(mod_name) + self.__import__(mod_name) assert len(log) == 2 args = log[1][0] kwargs = log[1][1] @@ -104,12 +107,9 @@ self.assertEqual(args[0], mod_name) self.assertIs(args[1], path) - - -def test_main(): - from test.support import run_unittest - run_unittest(CallingOrder, CallSignature) +Frozen_CallSignature, Source_CallSignature = util.test_both( + CallSignature, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_packages.py b/Lib/test/test_importlib/import_/test_packages.py --- a/Lib/test/test_importlib/import_/test_packages.py +++ b/Lib/test/test_importlib/import_/test_packages.py @@ -6,21 +6,21 @@ from test import support -class ParentModuleTests(unittest.TestCase): +class ParentModuleTests: """Importing a submodule should import the parent modules.""" def test_import_parent(self): with util.mock_modules('pkg.__init__', 'pkg.module') as mock: with util.import_state(meta_path=[mock]): - module = import_util.import_('pkg.module') + module = self.__import__('pkg.module') self.assertIn('pkg', sys.modules) def test_bad_parent(self): with util.mock_modules('pkg.module') as mock: with util.import_state(meta_path=[mock]): with self.assertRaises(ImportError) as cm: - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertEqual(cm.exception.name, 'pkg') def test_raising_parent_after_importing_child(self): @@ -32,11 +32,11 @@ with mock: with util.import_state(meta_path=[mock]): with self.assertRaises(ZeroDivisionError): - import_util.import_('pkg') + self.__import__('pkg') self.assertNotIn('pkg', sys.modules) self.assertIn('pkg.module', sys.modules) with self.assertRaises(ZeroDivisionError): - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertNotIn('pkg', sys.modules) self.assertIn('pkg.module', sys.modules) @@ -51,10 +51,10 @@ with self.assertRaises((ZeroDivisionError, ImportError)): # This raises ImportError on the "from . import module" # line, not sure why. 
- import_util.import_('pkg') + self.__import__('pkg') self.assertNotIn('pkg', sys.modules) with self.assertRaises((ZeroDivisionError, ImportError)): - import_util.import_('pkg.module') + self.__import__('pkg.module') self.assertNotIn('pkg', sys.modules) # XXX False #self.assertIn('pkg.module', sys.modules) @@ -71,10 +71,10 @@ with self.assertRaises((ZeroDivisionError, ImportError)): # This raises ImportError on the "from ..subpkg import module" # line, not sure why. - import_util.import_('pkg.subpkg') + self.__import__('pkg.subpkg') self.assertNotIn('pkg.subpkg', sys.modules) with self.assertRaises((ZeroDivisionError, ImportError)): - import_util.import_('pkg.subpkg.module') + self.__import__('pkg.subpkg.module') self.assertNotIn('pkg.subpkg', sys.modules) # XXX False #self.assertIn('pkg.subpkg.module', sys.modules) @@ -83,7 +83,7 @@ # Try to import a submodule from a non-package should raise ImportError. assert not hasattr(sys, '__path__') with self.assertRaises(ImportError) as cm: - import_util.import_('sys.no_submodules_here') + self.__import__('sys.no_submodules_here') self.assertEqual(cm.exception.name, 'sys.no_submodules_here') def test_module_not_package_but_side_effects(self): @@ -98,15 +98,13 @@ with mock_modules as mock: with util.import_state(meta_path=[mock]): try: - submodule = import_util.import_(subname) + submodule = self.__import__(subname) finally: support.unload(subname) - -def test_main(): - from test.support import run_unittest - run_unittest(ParentModuleTests) +Frozen_ParentTests, Source_ParentTests = util.test_both( + ParentModuleTests, __import__=import_util.__import__) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_path.py b/Lib/test/test_importlib/import_/test_path.py --- a/Lib/test/test_importlib/import_/test_path.py +++ b/Lib/test/test_importlib/import_/test_path.py @@ -1,8 +1,9 @@ -from importlib import _bootstrap -from importlib import machinery -from importlib import import_module from .. import util from . import util as import_util + +importlib = util.import_importlib('importlib') +machinery = util.import_importlib('importlib.machinery') + import os import sys from types import ModuleType @@ -11,7 +12,7 @@ import zipimport -class FinderTests(unittest.TestCase): +class FinderTests: """Tests for PathFinder.""" @@ -19,7 +20,7 @@ # Test None returned upon not finding a suitable finder. module = '' with util.import_state(): - self.assertIsNone(machinery.PathFinder.find_module(module)) + self.assertIsNone(self.machinery.PathFinder.find_module(module)) def test_sys_path(self): # Test that sys.path is used when 'path' is None. 
@@ -29,7 +30,7 @@ importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}, path=[path]): - loader = machinery.PathFinder.find_module(module) + loader = self.machinery.PathFinder.find_module(module) self.assertIs(loader, importer) def test_path(self): @@ -39,7 +40,7 @@ path = '' importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}): - loader = machinery.PathFinder.find_module(module, [path]) + loader = self.machinery.PathFinder.find_module(module, [path]) self.assertIs(loader, importer) def test_empty_list(self): @@ -49,7 +50,7 @@ importer = util.mock_modules(module) with util.import_state(path_importer_cache={path: importer}, path=[path]): - self.assertIsNone(machinery.PathFinder.find_module('module', [])) + self.assertIsNone(self.machinery.PathFinder.find_module('module', [])) def test_path_hooks(self): # Test that sys.path_hooks is used. @@ -59,7 +60,7 @@ importer = util.mock_modules(module) hook = import_util.mock_path_hook(path, importer=importer) with util.import_state(path_hooks=[hook]): - loader = machinery.PathFinder.find_module(module, [path]) + loader = self.machinery.PathFinder.find_module(module, [path]) self.assertIs(loader, importer) self.assertIn(path, sys.path_importer_cache) self.assertIs(sys.path_importer_cache[path], importer) @@ -72,7 +73,7 @@ path=[path_entry]): with warnings.catch_warnings(record=True) as w: warnings.simplefilter('always') - self.assertIsNone(machinery.PathFinder.find_module('os')) + self.assertIsNone(self.machinery.PathFinder.find_module('os')) self.assertIsNone(sys.path_importer_cache[path_entry]) self.assertEqual(len(w), 1) self.assertTrue(issubclass(w[-1].category, ImportWarning)) @@ -84,7 +85,7 @@ importer = util.mock_modules(module) hook = import_util.mock_path_hook(os.getcwd(), importer=importer) with util.import_state(path=[path], path_hooks=[hook]): - loader = machinery.PathFinder.find_module(module) + loader = self.machinery.PathFinder.find_module(module) self.assertIs(loader, importer) self.assertIn(os.getcwd(), sys.path_importer_cache) @@ -96,8 +97,8 @@ new_path_importer_cache = sys.path_importer_cache.copy() new_path_importer_cache.pop(None, None) new_path_hooks = [zipimport.zipimporter, - _bootstrap.FileFinder.path_hook( - *_bootstrap._get_supported_file_loaders())] + self.machinery.FileFinder.path_hook( + *self.importlib._bootstrap._get_supported_file_loaders())] missing = object() email = sys.modules.pop('email', missing) try: @@ -105,16 +106,15 @@ path=new_path, path_importer_cache=new_path_importer_cache, path_hooks=new_path_hooks): - module = import_module('email') + module = self.importlib.import_module('email') self.assertIsInstance(module, ModuleType) finally: if email is not missing: sys.modules['email'] = email +Frozen_FinderTests, Source_FinderTests = util.test_both( + FinderTests, importlib=importlib, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/test_relative_imports.py b/Lib/test/test_importlib/import_/test_relative_imports.py --- a/Lib/test/test_importlib/import_/test_relative_imports.py +++ b/Lib/test/test_importlib/import_/test_relative_imports.py @@ -4,7 +4,7 @@ import sys import unittest -class RelativeImports(unittest.TestCase): +class RelativeImports: """PEP 328 introduced relative imports. 
This allows for imports to occur from within a package without having to specify the actual package name. @@ -76,8 +76,8 @@ create = 'pkg.__init__', 'pkg.mod2' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.mod1'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['mod2'], level=1) + self.__import__('pkg') # For __import__(). + module = self.__import__('', global_, fromlist=['mod2'], level=1) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'mod2')) self.assertEqual(module.mod2.attr, 'pkg.mod2') @@ -88,8 +88,8 @@ create = 'pkg.__init__', 'pkg.mod2' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.mod1'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('mod2', global_, fromlist=['attr'], + self.__import__('pkg') # For __import__(). + module = self.__import__('mod2', global_, fromlist=['attr'], level=1) self.assertEqual(module.__name__, 'pkg.mod2') self.assertEqual(module.attr, 'pkg.mod2') @@ -101,8 +101,8 @@ globals_ = ({'__package__': 'pkg'}, {'__name__': 'pkg', '__path__': ['blah']}) def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['module'], + self.__import__('pkg') # For __import__(). + module = self.__import__('', global_, fromlist=['module'], level=1) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'module')) @@ -114,8 +114,8 @@ create = 'pkg.__init__', 'pkg.module' globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.module'} def callback(global_): - import_util.import_('pkg') # For __import__(). - module = import_util.import_('', global_, fromlist=['attr'], level=1) + self.__import__('pkg') # For __import__(). 
+ module = self.__import__('', global_, fromlist=['attr'], level=1) self.assertEqual(module.__name__, 'pkg') self.relative_import_test(create, globals_, callback) @@ -126,7 +126,7 @@ globals_ = ({'__package__': 'pkg.subpkg1'}, {'__name__': 'pkg.subpkg1', '__path__': ['blah']}) def callback(global_): - module = import_util.import_('', global_, fromlist=['subpkg2'], + module = self.__import__('', global_, fromlist=['subpkg2'], level=2) self.assertEqual(module.__name__, 'pkg') self.assertTrue(hasattr(module, 'subpkg2')) @@ -142,8 +142,8 @@ {'__name__': 'pkg.pkg1.pkg2.pkg3.pkg4.pkg5', '__path__': ['blah']}) def callback(global_): - import_util.import_(globals_[0]['__package__']) - module = import_util.import_('', global_, fromlist=['attr'], level=6) + self.__import__(globals_[0]['__package__']) + module = self.__import__('', global_, fromlist=['attr'], level=6) self.assertEqual(module.__name__, 'pkg') self.relative_import_test(create, globals_, callback) @@ -153,9 +153,9 @@ globals_ = ({'__package__': 'pkg'}, {'__name__': 'pkg', '__path__': ['blah']}) def callback(global_): - import_util.import_('pkg') + self.__import__('pkg') with self.assertRaises(ValueError): - import_util.import_('', global_, fromlist=['top_level'], + self.__import__('', global_, fromlist=['top_level'], level=2) self.relative_import_test(create, globals_, callback) @@ -164,16 +164,16 @@ create = ['top_level', 'pkg.__init__', 'pkg.module'] globals_ = {'__package__': 'pkg'}, {'__name__': 'pkg.module'} def callback(global_): - import_util.import_('pkg') + self.__import__('pkg') with self.assertRaises(ValueError): - import_util.import_('', global_, fromlist=['top_level'], + self.__import__('', global_, fromlist=['top_level'], level=2) self.relative_import_test(create, globals_, callback) def test_empty_name_w_level_0(self): # [empty name] with self.assertRaises(ValueError): - import_util.import_('') + self.__import__('') def test_import_from_different_package(self): # Test importing from a different package than the caller. @@ -186,8 +186,8 @@ '__runpy_pkg__.uncle.cousin.nephew'] globals_ = {'__package__': '__runpy_pkg__.__runpy_pkg__'} def callback(global_): - import_util.import_('__runpy_pkg__.__runpy_pkg__') - module = import_util.import_('uncle.cousin', globals_, {}, + self.__import__('__runpy_pkg__.__runpy_pkg__') + module = self.__import__('uncle.cousin', globals_, {}, fromlist=['nephew'], level=2) self.assertEqual(module.__name__, '__runpy_pkg__.uncle.cousin') @@ -198,20 +198,19 @@ create = ['crash.__init__', 'crash.mod'] globals_ = [{'__package__': 'crash', '__name__': 'crash'}] def callback(global_): - import_util.import_('crash') - mod = import_util.import_('mod', global_, {}, [], 1) + self.__import__('crash') + mod = self.__import__('mod', global_, {}, [], 1) self.assertEqual(mod.__name__, 'crash.mod') self.relative_import_test(create, globals_, callback) def test_relative_import_no_globals(self): # No globals for a relative import is an error. 
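Stripped of the mock importers, the level/globals handling these RelativeImports tests exercise looks roughly like the sketch below; it assumes a real importable package pkg containing a submodule mod2 and pretends to be running from inside pkg.mod1:

import importlib

importlib.import_module('pkg')      # same warm-up import the tests perform
caller_globals = {'__package__': 'pkg', '__name__': 'pkg.mod1'}

# Equivalent of "from . import mod2" evaluated inside pkg.mod1: a relative
# "from" import returns the package, with the submodule set as an attribute.
parent = __import__('', caller_globals, {}, ['mod2'], 1)
assert parent.__name__ == 'pkg'
assert parent.mod2.__name__ == 'pkg.mod2'

With no globals at all, by contrast, there is nothing to resolve the relative name against, which is the error case checked next: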
with self.assertRaises(KeyError): - import_util.import_('sys', level=1) + self.__import__('sys', level=1) +Frozen_RelativeImports, Source_RelativeImports = util.test_both( + RelativeImports, __import__=import_util.__import__) -def test_main(): - from test.support import run_unittest - run_unittest(RelativeImports) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/import_/util.py b/Lib/test/test_importlib/import_/util.py --- a/Lib/test/test_importlib/import_/util.py +++ b/Lib/test/test_importlib/import_/util.py @@ -1,22 +1,14 @@ +from .. import util + +frozen_importlib, source_importlib = util.import_importlib('importlib') + +import builtins import functools import importlib import unittest -using___import__ = False - - -def import_(*args, **kwargs): - """Delegate to allow for injecting different implementations of import.""" - if using___import__: - return __import__(*args, **kwargs) - else: - return importlib.__import__(*args, **kwargs) - - -def importlib_only(fxn): - """Decorator to skip a test if using __builtins__.__import__.""" - return unittest.skipIf(using___import__, "importlib-specific test")(fxn) +__import__ = staticmethod(builtins.__import__), staticmethod(source_importlib.__import__) def mock_path_hook(*entries, importer): diff --git a/Lib/test/test_importlib/source/test_case_sensitivity.py b/Lib/test/test_importlib/source/test_case_sensitivity.py --- a/Lib/test/test_importlib/source/test_case_sensitivity.py +++ b/Lib/test/test_importlib/source/test_case_sensitivity.py @@ -2,8 +2,9 @@ from .. import util from . import util as source_util -from importlib import _bootstrap -from importlib import machinery +importlib = util.import_importlib('importlib') +machinery = util.import_importlib('importlib.machinery') + import os import sys from test import support as test_support @@ -11,7 +12,7 @@ @util.case_insensitive_tests -class CaseSensitivityTest(unittest.TestCase): +class CaseSensitivityTest: """PEP 235 dictates that on case-preserving, case-insensitive file systems that imports are case-sensitive unless the PYTHONCASEOK environment @@ -21,11 +22,11 @@ assert name != name.lower() def find(self, path): - finder = machinery.FileFinder(path, - (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES), - (machinery.SourcelessFileLoader, - machinery.BYTECODE_SUFFIXES)) + finder = self.machinery.FileFinder(path, + (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES), + (self.machinery.SourcelessFileLoader, + self.machinery.BYTECODE_SUFFIXES)) return finder.find_module(self.name) def sensitivity_test(self): @@ -41,7 +42,7 @@ def test_sensitive(self): with test_support.EnvironmentVarGuard() as env: env.unset('PYTHONCASEOK') - if b'PYTHONCASEOK' in _bootstrap._os.environ: + if b'PYTHONCASEOK' in self.importlib._bootstrap._os.environ: self.skipTest('os.environ changes not reflected in ' '_os.environ') sensitive, insensitive = self.sensitivity_test() @@ -52,7 +53,7 @@ def test_insensitive(self): with test_support.EnvironmentVarGuard() as env: env.set('PYTHONCASEOK', '1') - if b'PYTHONCASEOK' not in _bootstrap._os.environ: + if b'PYTHONCASEOK' not in self.importlib._bootstrap._os.environ: self.skipTest('os.environ changes not reflected in ' '_os.environ') sensitive, insensitive = self.sensitivity_test() @@ -61,10 +62,9 @@ self.assertTrue(hasattr(insensitive, 'load_module')) self.assertIn(self.name, insensitive.get_filename(self.name)) - -def test_main(): - test_support.run_unittest(CaseSensitivityTest) +Frozen_CaseSensitivityTest, 
Source_CaseSensitivityTest = util.test_both( + CaseSensitivityTest, importlib=importlib, machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_file_loader.py b/Lib/test/test_importlib/source/test_file_loader.py --- a/Lib/test/test_importlib/source/test_file_loader.py +++ b/Lib/test/test_importlib/source/test_file_loader.py @@ -1,11 +1,12 @@ -from importlib import machinery -import importlib -import importlib.abc -import importlib.util from .. import abc from .. import util from . import util as source_util +importlib = util.import_importlib('importlib') +importlib_abc = util.import_importlib('importlib.abc') +machinery = util.import_importlib('importlib.machinery') +importlib_util = util.import_importlib('importlib.util') + import errno import marshal import os @@ -19,7 +20,7 @@ from test.support import make_legacy_pyc, unload -class SimpleTest(unittest.TestCase, abc.LoaderTests): +class SimpleTest(abc.LoaderTests): """Should have no issue importing a source module [basic]. And if there is a syntax error, it should raise a SyntaxError [syntax error]. @@ -27,7 +28,7 @@ """ def test_load_module_API(self): - class Tester(importlib.abc.FileLoader): + class Tester(self.abc.FileLoader): def get_source(self, _): return 'attr = 42' def is_package(self, _): return False @@ -37,7 +38,7 @@ def test_get_filename_API(self): # If fullname is not set then assume self.path is desired. - class Tester(importlib.abc.FileLoader): + class Tester(self.abc.FileLoader): def get_code(self, _): pass def get_source(self, _): pass def is_package(self, _): pass @@ -55,7 +56,7 @@ # [basic] def test_module(self): with source_util.create_modules('_temp') as mapping: - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) module = loader.load_module('_temp') self.assertIn('_temp', sys.modules) check = {'__name__': '_temp', '__file__': mapping['_temp'], @@ -65,7 +66,7 @@ def test_package(self): with source_util.create_modules('_pkg.__init__') as mapping: - loader = machinery.SourceFileLoader('_pkg', + loader = self.machinery.SourceFileLoader('_pkg', mapping['_pkg.__init__']) module = loader.load_module('_pkg') self.assertIn('_pkg', sys.modules) @@ -78,7 +79,7 @@ def test_lacking_parent(self): with source_util.create_modules('_pkg.__init__', '_pkg.mod')as mapping: - loader = machinery.SourceFileLoader('_pkg.mod', + loader = self.machinery.SourceFileLoader('_pkg.mod', mapping['_pkg.mod']) module = loader.load_module('_pkg.mod') self.assertIn('_pkg.mod', sys.modules) @@ -93,7 +94,7 @@ def test_module_reuse(self): with source_util.create_modules('_temp') as mapping: - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) module = loader.load_module('_temp') module_id = id(module) module_dict_id = id(module.__dict__) @@ -118,7 +119,7 @@ setattr(orig_module, attr, value) with open(mapping[name], 'w') as file: file.write('+++ bad syntax +++') - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) with self.assertRaises(SyntaxError): loader.load_module(name) for attr in attributes: @@ -129,7 +130,7 @@ with source_util.create_modules('_temp') as mapping: with open(mapping['_temp'], 'w') as file: file.write('=') - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = 
self.machinery.SourceFileLoader('_temp', mapping['_temp']) with self.assertRaises(SyntaxError): loader.load_module('_temp') self.assertNotIn('_temp', sys.modules) @@ -142,14 +143,14 @@ file.write("# test file for importlib") try: with util.uncache('_temp'): - loader = machinery.SourceFileLoader('_temp', file_path) + loader = self.machinery.SourceFileLoader('_temp', file_path) mod = loader.load_module('_temp') self.assertEqual(file_path, mod.__file__) - self.assertEqual(importlib.util.cache_from_source(file_path), + self.assertEqual(self.util.cache_from_source(file_path), mod.__cached__) finally: os.unlink(file_path) - pycache = os.path.dirname(importlib.util.cache_from_source(file_path)) + pycache = os.path.dirname(self.util.cache_from_source(file_path)) if os.path.exists(pycache): shutil.rmtree(pycache) @@ -158,7 +159,7 @@ # truncated rather than raise an OverflowError. with source_util.create_modules('_temp') as mapping: source = mapping['_temp'] - compiled = importlib.util.cache_from_source(source) + compiled = self.util.cache_from_source(source) with open(source, 'w') as f: f.write("x = 5") try: @@ -169,7 +170,7 @@ if e.errno != getattr(errno, 'EOVERFLOW', None): raise self.skipTest("cannot set modification time to large integer ({})".format(e)) - loader = machinery.SourceFileLoader('_temp', mapping['_temp']) + loader = self.machinery.SourceFileLoader('_temp', mapping['_temp']) mod = loader.load_module('_temp') # Sanity checks. self.assertEqual(mod.__cached__, compiled) @@ -178,12 +179,16 @@ os.stat(compiled) def test_unloadable(self): - loader = machinery.SourceFileLoader('good name', {}) + loader = self.machinery.SourceFileLoader('good name', {}) with self.assertRaises(ImportError): loader.load_module('bad name') +Frozen_SimpleTest, Source_SimpleTest = util.test_both( + SimpleTest, importlib=importlib, machinery=machinery, abc=importlib_abc, + util=importlib_util) -class BadBytecodeTest(unittest.TestCase): + +class BadBytecodeTest: def import_(self, file, module_name): loader = self.loader(module_name, file) @@ -200,7 +205,7 @@ pass py_compile.compile(mapping[name]) if not del_source: - bytecode_path = importlib.util.cache_from_source(mapping[name]) + bytecode_path = self.util.cache_from_source(mapping[name]) else: os.unlink(mapping[name]) bytecode_path = make_legacy_pyc(mapping[name]) @@ -289,7 +294,9 @@ class SourceLoaderBadBytecodeTest(BadBytecodeTest): - loader = machinery.SourceFileLoader + @classmethod + def setUpClass(cls): + cls.loader = cls.machinery.SourceFileLoader @source_util.writes_bytecode_files def test_empty_file(self): @@ -329,7 +336,7 @@ self.import_(mapping[name], name) with open(bytecode_path, 'rb') as bytecode_file: self.assertEqual(bytecode_file.read(4), - importlib.util.MAGIC_NUMBER) + self.util.MAGIC_NUMBER) self._test_bad_magic(test) @@ -379,13 +386,13 @@ zeros = b'\x00\x00\x00\x00' with source_util.create_modules('_temp') as mapping: py_compile.compile(mapping['_temp']) - bytecode_path = importlib.util.cache_from_source(mapping['_temp']) + bytecode_path = self.util.cache_from_source(mapping['_temp']) with open(bytecode_path, 'r+b') as bytecode_file: bytecode_file.seek(4) bytecode_file.write(zeros) self.import_(mapping['_temp'], '_temp') source_mtime = os.path.getmtime(mapping['_temp']) - source_timestamp = importlib._w_long(source_mtime) + source_timestamp = self.importlib._w_long(source_mtime) with open(bytecode_path, 'rb') as bytecode_file: bytecode_file.seek(4) self.assertEqual(bytecode_file.read(4), source_timestamp) @@ -397,7 +404,7 @@ with 
source_util.create_modules('_temp') as mapping: # Create bytecode that will need to be re-created. py_compile.compile(mapping['_temp']) - bytecode_path = importlib.util.cache_from_source(mapping['_temp']) + bytecode_path = self.util.cache_from_source(mapping['_temp']) with open(bytecode_path, 'r+b') as bytecode_file: bytecode_file.seek(0) bytecode_file.write(b'\x00\x00\x00\x00') @@ -411,10 +418,16 @@ # Make writable for eventual clean-up. os.chmod(bytecode_path, stat.S_IWUSR) +Frozen_SourceBadBytecode, Source_SourceBadBytecode = util.test_both( + SourceLoaderBadBytecodeTest, importlib=importlib, machinery=machinery, + abc=importlib_abc, util=importlib_util) + class SourcelessLoaderBadBytecodeTest(BadBytecodeTest): - loader = machinery.SourcelessFileLoader + @classmethod + def setUpClass(cls): + cls.loader = cls.machinery.SourcelessFileLoader def test_empty_file(self): def test(name, mapping, bytecode_path): @@ -469,6 +482,9 @@ def test_non_code_marshal(self): self._test_non_code_marshal(del_source=True) +Frozen_SourcelessBadBytecode, Source_SourcelessBadBytecode = util.test_both( + SourcelessLoaderBadBytecodeTest, importlib=importlib, + machinery=machinery, abc=importlib_abc, util=importlib_util) if __name__ == '__main__': diff --git a/Lib/test/test_importlib/source/test_finder.py b/Lib/test/test_importlib/source/test_finder.py --- a/Lib/test/test_importlib/source/test_finder.py +++ b/Lib/test/test_importlib/source/test_finder.py @@ -1,7 +1,9 @@ from .. import abc +from .. import util from . import util as source_util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import errno import os import py_compile @@ -13,7 +15,7 @@ import warnings -class FinderTests(unittest.TestCase, abc.FinderTests): +class FinderTests(abc.FinderTests): """For a top-level module, it should just be found directly in the directory being searched. This is true for a directory with source @@ -38,11 +40,11 @@ """ def get_finder(self, root): - loader_details = [(machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES), - (machinery.SourcelessFileLoader, - machinery.BYTECODE_SUFFIXES)] - return machinery.FileFinder(root, *loader_details) + loader_details = [(self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES), + (self.machinery.SourcelessFileLoader, + self.machinery.BYTECODE_SUFFIXES)] + return self.machinery.FileFinder(root, *loader_details) def import_(self, root, module): return self.get_finder(root).find_module(module) @@ -123,8 +125,8 @@ def test_empty_string_for_dir(self): # The empty string from sys.path means to search in the cwd. - finder = machinery.FileFinder('', (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + finder = self.machinery.FileFinder('', (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) with open('mod.py', 'w') as file: file.write("# test file for importlib") try: @@ -135,8 +137,8 @@ def test_invalidate_caches(self): # invalidate_caches() should reset the mtime. 
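The get_finder() helper above is a thin wrapper around the public finder API; used directly it looks roughly like this (the directory path and module name are made up for illustration):

import importlib.machinery as machinery

loader_details = [(machinery.SourceFileLoader, machinery.SOURCE_SUFFIXES),
                  (machinery.SourcelessFileLoader, machinery.BYTECODE_SUFFIXES)]
finder = machinery.FileFinder('/some/directory', *loader_details)

# find_module() is the call these FinderTests exercise: it returns a loader
# when the directory contains e.g. spam.py (or a matching bytecode file),
# and None otherwise.
loader = finder.find_module('spam')
if loader is not None:
    module = loader.load_module('spam')

invalidate_caches(), checked next, resets the finder's cached directory mtime so that files added after the first scan are picked up again.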
- finder = machinery.FileFinder('', (machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + finder = self.machinery.FileFinder('', (self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) finder._path_mtime = 42 finder.invalidate_caches() self.assertEqual(finder._path_mtime, -1) @@ -180,11 +182,9 @@ finder = self.get_finder(file_obj.name) self.assertEqual((None, []), finder.find_loader('doesnotexist')) +Frozen_FinderTests, Source_FinderTests = util.test_both(FinderTests, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(FinderTests) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_path_hook.py b/Lib/test/test_importlib/source/test_path_hook.py --- a/Lib/test/test_importlib/source/test_path_hook.py +++ b/Lib/test/test_importlib/source/test_path_hook.py @@ -1,16 +1,18 @@ +from .. import util from . import util as source_util -from importlib import machinery +machinery = util.import_importlib('importlib.machinery') + import unittest -class PathHookTest(unittest.TestCase): +class PathHookTest: """Test the path hook for source.""" def path_hook(self): - return machinery.FileFinder.path_hook((machinery.SourceFileLoader, - machinery.SOURCE_SUFFIXES)) + return self.machinery.FileFinder.path_hook((self.machinery.SourceFileLoader, + self.machinery.SOURCE_SUFFIXES)) def test_success(self): with source_util.create_modules('dummy') as mapping: @@ -21,11 +23,8 @@ # The empty string represents the cwd. self.assertTrue(hasattr(self.path_hook()(''), 'find_module')) - -def test_main(): - from test.support import run_unittest - run_unittest(PathHookTest) +Frozen_PathHookTest, Source_PathHooktest = util.test_both(PathHookTest, machinery=machinery) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_importlib/source/test_source_encoding.py b/Lib/test/test_importlib/source/test_source_encoding.py --- a/Lib/test/test_importlib/source/test_source_encoding.py +++ b/Lib/test/test_importlib/source/test_source_encoding.py @@ -1,6 +1,8 @@ +from .. import util from . import util as source_util -from importlib import _bootstrap +machinery = util.import_importlib('importlib.machinery') + import codecs import re import sys @@ -13,7 +15,7 @@ CODING_RE = re.compile(r'^[ \t\f]*#.*coding[:=][ \t]*([-\w.]+)', re.ASCII) -class EncodingTest(unittest.TestCase): +class EncodingTest: """PEP 3120 makes UTF-8 the default encoding for source code [default encoding]. 
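The CODING_RE pattern shown a few lines up is the helper these encoding tests define for PEP 263 style declarations; for instance:

import re

CODING_RE = re.compile(r'^[ \t\f]*#.*coding[:=][ \t]*([-\w.]+)', re.ASCII)

assert CODING_RE.match('# -*- coding: latin-1 -*-').group(1) == 'latin-1'
assert CODING_RE.match('print("no declaration here")') is None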
@@ -35,7 +37,7 @@ with source_util.create_modules(self.module_name) as mapping: with open(mapping[self.module_name], 'wb') as file: file.write(source) - loader = _bootstrap.SourceFileLoader(self.module_name, + loader = self.machinery.SourceFileLoader(self.module_name, mapping[self.module_name]) return loader.load_module(self.module_name) @@ -84,8 +86,10 @@ with self.assertRaises(SyntaxError): self.run_test(source) +Frozen_EncodingTest, Source_EncodingTest = util.test_both(EncodingTest, machinery=machinery) -class LineEndingTest(unittest.TestCase): + +class LineEndingTest: r"""Source written with the three types of line endings (\n, \r\n, \r) need to be readable [cr][crlf][lf].""" @@ -97,7 +101,7 @@ with source_util.create_modules(module_name) as mapping: with open(mapping[module_name], 'wb') as file: file.write(source) - loader = _bootstrap.SourceFileLoader(module_name, + loader = self.machinery.SourceFileLoader(module_name, mapping[module_name]) return loader.load_module(module_name) @@ -113,11 +117,9 @@ def test_lf(self): self.run_test(b'\n') +Frozen_LineEndings, Source_LineEndings = util.test_both(LineEndingTest, machinery=machinery) -def test_main(): - from test.support import run_unittest - run_unittest(EncodingTest, LineEndingTest) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -11,6 +11,7 @@ import tempfile import time import re +import selectors import sysconfig import warnings import select @@ -2179,15 +2180,16 @@ os.rmdir(dir) - at unittest.skipUnless(getattr(subprocess, '_has_poll', False), - "poll system call not supported") + at unittest.skipUnless(hasattr(selectors, 'PollSelector'), + "Test needs selectors.PollSelector") class ProcessTestCaseNoPoll(ProcessTestCase): def setUp(self): - subprocess._has_poll = False + self.orig_selector = subprocess._PopenSelector + subprocess._PopenSelector = selectors.SelectSelector ProcessTestCase.setUp(self) def tearDown(self): - subprocess._has_poll = True + subprocess._PopenSelector = self.orig_selector ProcessTestCase.tearDown(self) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 20:51:29 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 8 Nov 2013 20:51:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=5BPEP_451=5D_Mark_as_Accepte?= =?utf-8?q?d!?= Message-ID: <3dGXDK6MzYz7Lkx@mail.python.org> http://hg.python.org/peps/rev/89a1e993d3ba changeset: 5258:89a1e993d3ba user: Eric Snow date: Fri Nov 08 12:46:55 2013 -0700 summary: [PEP 451] Mark as Accepted! 
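On the test_subprocess hunk above: rather than flipping the removed _has_poll flag, the no-poll test case now swaps subprocess._PopenSelector for selectors.SelectSelector. The select-vs-poll split it relies on comes straight from the selectors module: SelectSelector is always available, while PollSelector exists only where poll() does, hence the new skipUnless check. A small self-contained sketch of that API (POSIX example; on Windows select() only accepts sockets, not pipe descriptors):

import os
import selectors

r, w = os.pipe()
os.write(w, b'x')

# SelectSelector is the portable fallback the test forces subprocess to use;
# PollSelector is only present on platforms that provide poll().
sel = selectors.SelectSelector()
sel.register(r, selectors.EVENT_READ)
ready = sel.select(timeout=1.0)
assert ready and ready[0][0].fd == r

sel.close()
os.close(r)
os.close(w)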
files: pep-0451.txt | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -5,13 +5,13 @@ Author: Eric Snow BDFL-Delegate: Brett Cannon , Nick Coghlan Discussions-To: import-sig at python.org -Status: Draft +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 8-Aug-2013 Python-Version: 3.4 Post-History: 8-Aug-2013, 28-Aug-2013, 18-Sep-2013, 24-Sep-2013, 4-Oct-2013 -Resolution: +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130104.html Abstract -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 8 21:14:18 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NDM3?= =?utf-8?q?=3A_Fix_convert=5Fop=5Fcmp=28=29_of_decimal=2EDecimal_rich_comp?= =?utf-8?q?arator=2C_handle?= Message-ID: <3dGXkf3hBzz7LkW@mail.python.org> http://hg.python.org/cpython/rev/08c81db35959 changeset: 87015:08c81db35959 branch: 3.3 parent: 86989:9b9d188ed549 user: Victor Stinner date: Tue Oct 29 19:26:11 2013 +0100 summary: Issue #19437: Fix convert_op_cmp() of decimal.Decimal rich comparator, handle PyObject_IsInstance() failure files: Modules/_decimal/_decimal.c | 27 +++++++++++++++--------- 1 files changed, 17 insertions(+), 10 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3009,18 +3009,25 @@ *wcmp = Py_NotImplemented; } } - else if (PyObject_IsInstance(w, Rational)) { - *wcmp = numerator_as_decimal(w, context); - if (*wcmp && !mpd_isspecial(MPD(v))) { - *vcmp = multiply_by_denominator(v, w, context); - if (*vcmp == NULL) { - Py_CLEAR(*wcmp); + else { + int is_instance = PyObject_IsInstance(w, Rational); + if (is_instance < 0) { + *wcmp = NULL; + return 0; + } + if (is_instance) { + *wcmp = numerator_as_decimal(w, context); + if (*wcmp && !mpd_isspecial(MPD(v))) { + *vcmp = multiply_by_denominator(v, w, context); + if (*vcmp == NULL) { + Py_CLEAR(*wcmp); + } } } - } - else { - Py_INCREF(Py_NotImplemented); - *wcmp = Py_NotImplemented; + else { + Py_INCREF(Py_NotImplemented); + *wcmp = Py_NotImplemented; + } } if (*wcmp == NULL || *wcmp == Py_NotImplemented) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 21:14:19 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NDM3?= =?utf-8?q?=3A_Fix_dec=5Fformat=28=29_of_the_=5Fdecimal_module=2C_handle_d?= =?utf-8?b?ZWNfc3RyZHVwKCk=?= Message-ID: <3dGXkg5X1Jz7Ll7@mail.python.org> http://hg.python.org/cpython/rev/3ff670a83c73 changeset: 87016:3ff670a83c73 branch: 3.3 user: Victor Stinner date: Tue Oct 29 20:33:14 2013 +0100 summary: Issue #19437: Fix dec_format() of the _decimal module, handle dec_strdup() failure (memory allocation failure): raise a MemoryError exception files: Modules/_decimal/_decimal.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3187,6 +3187,7 @@ replace_fillchar = 1; fmt = dec_strdup(fmt, size); if (fmt == NULL) { + PyErr_NoMemory(); return NULL; } fmt[0] = '_'; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 21:14:21 2013 
From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Change_style_t?= =?utf-8?q?o_match_the_surrounding_code_=28no_early_returns=29=2E?= Message-ID: <3dGXkj0KFmz7LlC@mail.python.org> http://hg.python.org/cpython/rev/88b85b53f067 changeset: 87017:88b85b53f067 branch: 3.3 user: Stefan Krah date: Fri Nov 08 17:48:58 2013 +0100 summary: Change style to match the surrounding code (no early returns). files: Modules/_decimal/_decimal.c | 7 +++---- 1 files changed, 3 insertions(+), 4 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3010,12 +3010,11 @@ } } else { - int is_instance = PyObject_IsInstance(w, Rational); - if (is_instance < 0) { + int is_rational = PyObject_IsInstance(w, Rational); + if (is_rational < 0) { *wcmp = NULL; - return 0; } - if (is_instance) { + else if (is_rational > 0) { *wcmp = numerator_as_decimal(w, context); if (*wcmp && !mpd_isspecial(MPD(v))) { *vcmp = multiply_by_denominator(v, w, context); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 21:14:22 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogTW92ZSBQeUVycl9O?= =?utf-8?q?oMemory=28=29_closer_to_the_failure=2E?= Message-ID: <3dGXkk3PQhz7Lk5@mail.python.org> http://hg.python.org/cpython/rev/73b6225ae9cd changeset: 87018:73b6225ae9cd branch: 3.3 user: Stefan Krah date: Fri Nov 08 18:05:02 2013 +0100 summary: Move PyErr_NoMemory() closer to the failure. files: Modules/_decimal/_decimal.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -3108,6 +3108,7 @@ { char *dest = PyMem_Malloc(size+1); if (dest == NULL) { + PyErr_NoMemory(); return NULL; } @@ -3186,7 +3187,6 @@ replace_fillchar = 1; fmt = dec_strdup(fmt, size); if (fmt == NULL) { - PyErr_NoMemory(); return NULL; } fmt[0] = '_'; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 21:14:23 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogVmFsZ3JpbmQ6IHN1?= =?utf-8?q?ppress_false_positive_in_=5FPyOS=5FGetOpt_=28getopt=2Ec=3A84=29?= =?utf-8?q?_=28Invalid_read?= Message-ID: <3dGXkl6BCHz7LlN@mail.python.org> http://hg.python.org/cpython/rev/ec3ba0c13222 changeset: 87019:ec3ba0c13222 branch: 3.3 user: Stefan Krah date: Fri Nov 08 20:18:09 2013 +0100 summary: Valgrind: suppress false positive in _PyOS_GetOpt (getopt.c:84) (Invalid read of size 8: wcscmp (wcscmp.S:464)) files: Misc/valgrind-python.supp | 9 +++++++++ 1 files changed, 9 insertions(+), 0 deletions(-) diff --git a/Misc/valgrind-python.supp b/Misc/valgrind-python.supp --- a/Misc/valgrind-python.supp +++ b/Misc/valgrind-python.supp @@ -456,6 +456,15 @@ fun:PyUnicode_FSConverter } +{ + wcscmp_false_positive + Memcheck:Addr8 + fun:wcscmp + fun:_PyOS_GetOpt + fun:Py_Main + fun:main +} + # Additional suppressions for the unified decimal tests: { test_decimal -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 8 21:14:25 2013 From: python-checkins at python.org (stefan.krah) Date: Fri, 8 Nov 2013 21:14:25 +0100 (CET) Subject: 
[Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge=2E?= Message-ID: <3dGXkn10Ybz7LlX@mail.python.org> http://hg.python.org/cpython/rev/6952c14a3f75 changeset: 87020:6952c14a3f75 parent: 87014:36912c3e8de4 parent: 87019:ec3ba0c13222 user: Stefan Krah date: Fri Nov 08 21:08:46 2013 +0100 summary: Null merge. files: -- Repository URL: http://hg.python.org/cpython From martin at v.loewis.de Sat Nov 9 00:14:52 2013 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Sat, 09 Nov 2013 00:14:52 +0100 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: <527D706C.6020905@v.loewis.de> Am 08.11.13 10:25, schrieb Nick Coghlan: > Likely a mistake - the stable ABI is hard to review properly (since it > can depend on non local preprocessor checks, so a mistake may not be > obvious in a diff), we don't currently have a systematic approach to > handling changes and there's no automated test to catch inadvertent > additions or (worse) removals :( Actually, there is, for Windows. The .def file is an explicit list of symbols; additions can easily be reviewed. Everything that's not added there is not in the stable ABI (whether or not it is included in the #ifdef). Removals cause linker failures. It would be possible to automatically find out whether any functions declared under the stable API are actually exported also, by parsing the preprocessor output for Python.h Regards, Martin From python-checkins at python.org Sat Nov 9 03:14:22 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 9 Nov 2013 03:14:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_remove_filters_on_?= =?utf-8?q?tracing?= Message-ID: <3dGhk62YFKz7LlR@mail.python.org> http://hg.python.org/peps/rev/e025eac2d050 changeset: 5259:e025eac2d050 user: Victor Stinner date: Sat Nov 09 03:13:58 2013 +0100 summary: PEP 454: remove filters on tracing * Remove add_filter(), get_filters(), clear_filters() * Remove the default filter on tracemalloc.py * Give more explanation on the impacts of set_traceback_limit() files: pep-0454.txt | 114 +++++++++++--------------------------- 1 files changed, 33 insertions(+), 81 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -99,10 +99,6 @@ tracemalloc=25`` command line option. The ``set_traceback_limit()`` function can be used at runtime to set the limit. -By default, Python memory blocks allocated in the ``tracemalloc`` module -are ignored using a filter. Use ``clear_filters()`` to trace also these -memory allocations. - Main functions -------------- @@ -136,22 +132,17 @@ ``start()`` function: - Start tracing Python memory allocations. - - The function installs hooks on Python memory allocators. These hooks - have important overhead in term of performances and memory usage: - see `Filter functions`_ to limit the overhead. + Start tracing Python memory allocations: install hooks on Python + memory allocators. See also ``stop()`` and ``is_tracing()`` functions. ``stop()`` function: - Stop tracing Python memory allocations and clear traces of memory - blocks allocated by Python. - - The function uninstalls hooks on Python memory allocators, so the - overhead of the module becomes null. + Stop tracing Python memory allocations: uninstall hooks on Python + memory allocators. 
Clear also traces of memory blocks allocated by + Python Call ``take_snapshot()`` function to take a snapshot of traces before clearing them. @@ -165,8 +156,7 @@ Return a new ``Snapshot`` instance. The snapshot does not include memory blocks allocated before the - ``tracemalloc`` module started to trace memory allocations nor - memory blocks ignored by filters (see ``get_filters()``). + ``tracemalloc`` module started to trace memory allocations. Tracebacks of traces are limited to ``get_traceback_limit()`` frames. Use ``set_traceback_limit()`` to store more frames. @@ -209,10 +199,14 @@ Set the maximum number of frames stored in the traceback of a trace. - Storing the traceback of each memory allocation has an important - overhead on the memory usage. Use the ``get_tracemalloc_memory()`` - function to measure the overhead and the ``add_filter()`` function - to select which memory allocations are traced. + Storing more frames allocates more memory of the ``tracemalloc`` + module and makes Python slower. Use the ``get_tracemalloc_memory()`` + function to measure how much memory is used by the ``tracemalloc`` + module. + + Storing more than ``1`` frame is only useful to compute statistics + grouped by ``'traceback'`` or to compute cumulative statistics: see + the ``Snapshot.statistics()`` method. If the limit is set to ``0`` frame, a traceback with a frame will be used for all traces: filename ``''`` at line number ``0``. @@ -224,70 +218,25 @@ command line option can be used to set the limit at startup. -Filter functions ----------------- - -Tracing all Python memroy allocations has an important overhead on performances -and on the memory usage. - -To limit the overhead, some files can be excluded or tracing can be restricted -to a set of files using filters. Examples: ``add_filter(Filter(True, -subprocess.__file__))`` only traces memory allocations in the ``subprocess`` -module, and ``add_filter(Filter(False, tracemalloc.__file__))`` ignores -memory allocations in the ``tracemalloc`` module - -By default, there is one exclusive filter to ignore Python memory blocks -allocated by the ``tracemalloc`` module. - -Use the ``get_tracemalloc_memory()`` function to measure the memory usage. -See also the ``set_traceback_limit()`` function to configure how many -frames are stored. - -``add_filter(filter)`` function: - - Add a new filter on Python memory allocations, *filter* is a - ``Filter`` instance. - - All inclusive filters are applied at once, a memory allocation is - ignored if no inclusive filters match its trace. A memory allocation - is ignored if at least one exclusive filter matchs its trace. - - The new filter is not applied on already collected traces. Use the - ``clear_traces()`` function to ensure that all traces match the new - filter. - - -``clear_filters()`` function: - - Clear the filter list. - - See also the ``get_filters()`` function. - - -``get_filters()`` function: - - Get the filters on Python memory allocations. Return a list of - ``Filter`` instances. - - See also the ``clear_filters()`` function. - - Filter ------ ``Filter(inclusive: bool, filename_pattern: str, lineno: int=None, all_frames: bool=False)`` class: - Filter to select which memory allocations are traced. Filters can be - used to reduce the memory usage of the ``tracemalloc`` module, which - can be read using the ``get_tracemalloc_memory()`` function. + Filter on traces of memory blocks. - The ``'*'`` joker character can be used in *filename_pattern* to - match any substring, including empty string. 
The ``'.pyc'`` and - ``'.pyo'`` file extensions are replaced with ``'.py'``. On Windows, - the comparison is case insensitive and the alternative separator - ``'/'`` is replaced with the standard separator ``'\'``. + See the ``fnmatch.fnmatch()`` function for the syntax of + *filename_pattern*. The ``'.pyc'`` and ``'.pyo'`` file extensions + are replaced with ``'.py'``. - Use ``Filter(False, "")`` to exclude empty tracebacks. + Examples: + + * ``Filter(True, subprocess.__file__)`` only includes traces of the + ``subprocess`` module + * ``Filter(False, tracemalloc.__file__)`` + excludes traces of the ``tracemalloc`` module. + * ``Filter(False, + "")`` excludes empty tracebacks ``inclusive`` attribute: @@ -348,11 +297,14 @@ ``apply_filters(filters)`` method: - Create a new ``Snapshot`` instance with the filtered ``traces`` - sequence, *filters* is a list of ``Filter`` instances. + Create a new ``Snapshot`` instance with a filtered ``traces`` + sequence, *filters* is a list of ``Filter`` instances. If *filters* + is an empty list, return a new ``Snapshot`` instance with a copy of + the traces. - If *filters* is an empty list, return a new ``Snapshot`` instance - with a copy of the traces. + All inclusive filters are applied at once, a trace is ignored if no + inclusive filters match it. A trace is ignored if at least one + exclusive filter matchs it. ``compare_to(old_snapshot: Snapshot, group_by: str, cumulative: bool=False)`` method: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 9 03:24:43 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 9 Nov 2013 03:24:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_merge_two_=22funct?= =?utf-8?q?ions=22_sections?= Message-ID: <3dGhy36Kt5z7LlR@mail.python.org> http://hg.python.org/peps/rev/e53f7c4949a5 changeset: 5260:e53f7c4949a5 user: Victor Stinner date: Sat Nov 09 03:24:36 2013 +0100 summary: PEP 454: merge two "functions" sections files: pep-0454.txt | 133 ++++++++++++++++++-------------------- 1 files changed, 62 insertions(+), 71 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -100,8 +100,8 @@ function can be used at runtime to set the limit. -Main functions --------------- +Functions +--------- ``clear_traces()`` function: @@ -110,71 +110,6 @@ See also ``stop()``. -``get_traced_memory()`` function: - - Get the current size and maximum size of memory blocks traced by the - ``tracemalloc`` module as a tuple: ``(size: int, max_size: int)``. - - -``get_tracemalloc_memory()`` function: - - Get the memory usage in bytes of the ``tracemalloc`` module used to - store traces of memory blocks. Return an ``int``. - - -``is_tracing()`` function: - - ``True`` if the ``tracemalloc`` module is tracing Python memory - allocations, ``False`` otherwise. - - See also ``start()`` and ``stop()`` functions. - - -``start()`` function: - - Start tracing Python memory allocations: install hooks on Python - memory allocators. - - See also ``stop()`` and ``is_tracing()`` functions. - - -``stop()`` function: - - Stop tracing Python memory allocations: uninstall hooks on Python - memory allocators. Clear also traces of memory blocks allocated by - Python - - Call ``take_snapshot()`` function to take a snapshot of traces - before clearing them. - - See also ``start()`` and ``is_tracing()`` functions. - - -``take_snapshot()`` function: - - Take a snapshot of traces of memory blocks allocated by Python. - Return a new ``Snapshot`` instance. 
- - The snapshot does not include memory blocks allocated before the - ``tracemalloc`` module started to trace memory allocations. - - Tracebacks of traces are limited to ``get_traceback_limit()`` - frames. Use ``set_traceback_limit()`` to store more frames. - - The ``tracemalloc`` module must be tracing memory allocations to - take a snapshot, see the the ``start()`` function. - - See also the ``get_object_traceback()`` function. - - -Trace functions ---------------- - -When Python allocates a memory block, ``tracemalloc`` attachs a "trace" to -the memory block to store its size in bytes and the traceback where the -allocation occurred. See the ``Trace`` class. - - ``get_object_traceback(obj)`` function: Get the traceback where the Python object *obj* was allocated. @@ -195,6 +130,26 @@ Use the ``set_traceback_limit()`` function to change the limit. +``get_traced_memory()`` function: + + Get the current size and maximum size of memory blocks traced by the + ``tracemalloc`` module as a tuple: ``(size: int, max_size: int)``. + + +``get_tracemalloc_memory()`` function: + + Get the memory usage in bytes of the ``tracemalloc`` module used to + store traces of memory blocks. Return an ``int``. + + +``is_tracing()`` function: + + ``True`` if the ``tracemalloc`` module is tracing Python memory + allocations, ``False`` otherwise. + + See also ``start()`` and ``stop()`` functions. + + ``set_traceback_limit(nframe: int)`` function: Set the maximum number of frames stored in the traceback of a trace. @@ -218,6 +173,43 @@ command line option can be used to set the limit at startup. +``start()`` function: + + Start tracing Python memory allocations: install hooks on Python + memory allocators. + + See also ``stop()`` and ``is_tracing()`` functions. + + +``stop()`` function: + + Stop tracing Python memory allocations: uninstall hooks on Python + memory allocators. Clear also traces of memory blocks allocated by + Python + + Call ``take_snapshot()`` function to take a snapshot of traces + before clearing them. + + See also ``start()`` and ``is_tracing()`` functions. + + +``take_snapshot()`` function: + + Take a snapshot of traces of memory blocks allocated by Python. + Return a new ``Snapshot`` instance. + + The snapshot does not include memory blocks allocated before the + ``tracemalloc`` module started to trace memory allocations. + + Tracebacks of traces are limited to ``get_traceback_limit()`` + frames. Use ``set_traceback_limit()`` to store more frames. + + The ``tracemalloc`` module must be tracing memory allocations to + take a snapshot, see the the ``start()`` function. + + See also the ``get_object_traceback()`` function. + + Filter ------ @@ -233,10 +225,9 @@ * ``Filter(True, subprocess.__file__)`` only includes traces of the ``subprocess`` module - * ``Filter(False, tracemalloc.__file__)`` - excludes traces of the ``tracemalloc`` module. 
- * ``Filter(False, - "")`` excludes empty tracebacks + * ``Filter(False, tracemalloc.__file__)`` excludes traces of the + ``tracemalloc`` module + * ``Filter(False, "")`` excludes empty tracebacks ``inclusive`` attribute: -- Repository URL: http://hg.python.org/peps From solipsis at pitrou.net Sat Nov 9 04:49:43 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 09 Nov 2013 04:49:43 +0100 Subject: [Python-checkins] Daily reference leaks (6952c14a3f75): sum=0 Message-ID: results for 6952c14a3f75 on branch "default" -------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogUHY3P_', '-x'] From python-checkins at python.org Sat Nov 9 19:50:59 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 9 Nov 2013 19:50:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_Statistic_doesn=27?= =?utf-8?q?t_have_size=5Fdif/count=5Fdiff_anymore?= Message-ID: <3dH6r35Lq4z7Lk1@mail.python.org> http://hg.python.org/peps/rev/333cdbce5275 changeset: 5261:333cdbce5275 user: Victor Stinner date: Sat Nov 09 19:50:51 2013 +0100 summary: PEP 454: Statistic doesn't have size_dif/count_diff anymore files: pep-0454.txt | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -371,9 +371,6 @@ Statistic on memory allocations. - ``size_diff`` and ``count_diff`` attributes are the difference - between two ``Snapshot`` instance. - ``Snapshot.statistics()`` returns a list of ``Statistic`` instances. See also the ``StatisticDiff`` class. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 9 20:18:00 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 20:18:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_Tkinter_te?= =?utf-8?q?sts_on_Tk_8=2E5_with_patchlevel_=3C_8=2E5=2E11_=28issue_=231908?= =?utf-8?b?NSku?= Message-ID: <3dH7RD5McRz7Lkd@mail.python.org> http://hg.python.org/cpython/rev/be8f9beca8aa changeset: 87021:be8f9beca8aa branch: 2.7 parent: 86988:695f988824bb user: Serhiy Storchaka date: Sat Nov 09 21:15:26 2013 +0200 summary: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.11 (issue #19085). 
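
The patch below keys the version checks off the full ``info patchlevel`` value rather than just ``tcl_version``: the string is parsed into a tuple of ints so it can be compared against a threshold such as ``(8, 5, 11)``. A rough standalone sketch of that idea (the patchlevel strings are hard-coded here; the helper added below queries a live Tcl interpreter instead):

    def parse_patchlevel(s):
        # "8.5.9" -> (8, 5, 9); non-numeric components become -1
        parts = []
        for x in s.split('.'):
            try:
                parts.append(int(x, 10))
            except ValueError:
                parts.append(-1)
        return tuple(parts)

    for raw in ('8.5.9', '8.5.11', '8.6.0'):
        level = parse_patchlevel(raw)
        print(raw, level, level < (8, 5, 11))
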
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 11 ++- Lib/lib-tk/test/test_ttk/support.py | 15 +++++ Lib/lib-tk/test/widget_tests.py | 27 ++------- 3 files changed, 31 insertions(+), 22 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -3,7 +3,8 @@ import os from test.test_support import requires, run_unittest -from test_ttk.support import tcl_version, requires_tcl, widget_eq +from test_ttk.support import (tcl_version, requires_tcl, get_tk_patchlevel, + widget_eq) from widget_tests import ( add_standard_options, noconv, noconv_meth, int_round, pixels_round, AbstractWidgetTest, StandardOptionsTests, @@ -536,7 +537,7 @@ def test_selectborderwidth(self): widget = self.create() self.checkPixelsParam(widget, 'selectborderwidth', - 1.3, 2.6, -2, '10p', conv=False, + 1.3, 2.6, -2, '10p', conv=noconv, keep_orig=tcl_version >= (8, 5)) def test_spacing1(self): @@ -577,7 +578,11 @@ def test_tabs(self): widget = self.create() - self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) + if get_tk_patchlevel() < (8, 5, 11): + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i'), + expected=('10.2', '20.7', '1i', '2i')) + else: + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', expected=('10.2', '20.7', '1i', '2i')) self.checkParam(widget, 'tabs', '2c left 4c 6c center', diff --git a/Lib/lib-tk/test/test_ttk/support.py b/Lib/lib-tk/test/test_ttk/support.py --- a/Lib/lib-tk/test/test_ttk/support.py +++ b/Lib/lib-tk/test/test_ttk/support.py @@ -41,6 +41,21 @@ return unittest.skipUnless(tcl_version >= version, 'requires Tcl version >= ' + '.'.join(map(str, version))) +_tk_patchlevel = None +def get_tk_patchlevel(): + global _tk_patchlevel + if _tk_patchlevel is None: + tcl = Tkinter.Tcl() + patchlevel = [] + for x in tcl.call('info', 'patchlevel').split('.'): + try: + x = int(x, 10) + except ValueError: + x = -1 + patchlevel.append(x) + _tk_patchlevel = tuple(patchlevel) + return _tk_patchlevel + units = { 'c': 72 / 2.54, # centimeters 'i': 72, # inches diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -2,31 +2,23 @@ import Tkinter from ttk import setup_master, Scale -from test_ttk.support import tcl_version, requires_tcl, pixels_conv, tcl_obj_eq +from test_ttk.support import (tcl_version, requires_tcl, get_tk_patchlevel, + pixels_conv, tcl_obj_eq) -noconv = str if tcl_version < (8, 5) else False +noconv = noconv_meth = False +if get_tk_patchlevel() < (8, 5, 11): + noconv = str noconv_meth = noconv and staticmethod(noconv) def int_round(x): return int(round(x)) pixels_round = int_round -if tcl_version[:2] == (8, 5): +if get_tk_patchlevel()[:3] == (8, 5, 11): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - _pixels_round = None - def pixels_round(x): - global _pixels_round - if _pixels_round is None: - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - _pixels_round = int - else: - _pixels_round = int_round - return _pixels_round(x) + pixels_round = int _sentinel = object() @@ -424,10 +416,7 @@ def test_wraplength(self): widget = self.create() - if tcl_version < (8, 5): - self.checkPixelsParam(widget, 'wraplength', 100) - else: - 
self.checkParams(widget, 'wraplength', 100) + self.checkPixelsParam(widget, 'wraplength', 100) def test_xscrollcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 9 20:18:02 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 20:18:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_Tkinter_te?= =?utf-8?q?sts_on_Tk_8=2E5_with_patchlevel_=3C_8=2E5=2E11_=28issue_=231908?= =?utf-8?b?NSku?= Message-ID: <3dH7RG18JQz7Lkd@mail.python.org> http://hg.python.org/cpython/rev/204e66190dbb changeset: 87022:204e66190dbb branch: 3.3 parent: 87019:ec3ba0c13222 user: Serhiy Storchaka date: Sat Nov 09 21:16:19 2013 +0200 summary: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.11 (issue #19085). files: Lib/tkinter/test/support.py | 15 +++++ Lib/tkinter/test/test_tkinter/test_widgets.py | 11 ++- Lib/tkinter/test/widget_tests.py | 28 ++------- 3 files changed, 31 insertions(+), 23 deletions(-) diff --git a/Lib/tkinter/test/support.py b/Lib/tkinter/test/support.py --- a/Lib/tkinter/test/support.py +++ b/Lib/tkinter/test/support.py @@ -86,6 +86,21 @@ return unittest.skipUnless(tcl_version >= version, 'requires Tcl version >= ' + '.'.join(map(str, version))) +_tk_patchlevel = None +def get_tk_patchlevel(): + global _tk_patchlevel + if _tk_patchlevel is None: + tcl = tkinter.Tcl() + patchlevel = [] + for x in tcl.call('info', 'patchlevel').split('.'): + try: + x = int(x, 10) + except ValueError: + x = -1 + patchlevel.append(x) + _tk_patchlevel = tuple(patchlevel) + return _tk_patchlevel + units = { 'c': 72 / 2.54, # centimeters 'i': 72, # inches diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -3,7 +3,8 @@ import os from test.support import requires -from tkinter.test.support import tcl_version, requires_tcl, widget_eq +from tkinter.test.support import (tcl_version, requires_tcl, + get_tk_patchlevel, widget_eq) from tkinter.test.widget_tests import ( add_standard_options, noconv, pixels_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) @@ -539,7 +540,7 @@ def test_selectborderwidth(self): widget = self.create() self.checkPixelsParam(widget, 'selectborderwidth', - 1.3, 2.6, -2, '10p', conv=False, + 1.3, 2.6, -2, '10p', conv=noconv, keep_orig=tcl_version >= (8, 5)) def test_spacing1(self): @@ -580,7 +581,11 @@ def test_tabs(self): widget = self.create() - self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) + if get_tk_patchlevel() < (8, 5, 11): + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i'), + expected=('10.2', '20.7', '1i', '2i')) + else: + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', expected=('10.2', '20.7', '1i', '2i')) self.checkParam(widget, 'tabs', '2c left 4c 6c center', diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -2,28 +2,19 @@ import tkinter from tkinter.ttk import setup_master, Scale -from tkinter.test.support import (tcl_version, requires_tcl, pixels_conv, - tcl_obj_eq) +from tkinter.test.support import (tcl_version, requires_tcl, get_tk_patchlevel, + pixels_conv, tcl_obj_eq) -noconv = str if tcl_version < (8, 5) else False +noconv = False +if get_tk_patchlevel() < (8, 5, 11): + 
noconv = str pixels_round = round -if tcl_version[:2] == (8, 5): +if get_tk_patchlevel()[:3] == (8, 5, 11): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - _pixels_round = None - def pixels_round(x): - global _pixels_round - if _pixels_round is None: - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - _pixels_round = int - else: - _pixels_round = round - return _pixels_round(x) + pixels_round = int _sentinel = object() @@ -406,10 +397,7 @@ def test_wraplength(self): widget = self.create() - if tcl_version < (8, 5): - self.checkPixelsParam(widget, 'wraplength', 100) - else: - self.checkParams(widget, 'wraplength', 100) + self.checkPixelsParam(widget, 'wraplength', 100) def test_xscrollcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 9 20:18:03 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 20:18:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_Tkinter_tests_on_Tk_8=2E5_with_patchlevel_=3C_8=2E5?= =?utf-8?b?LjExIChpc3N1ZSAjMTkwODUpLg==?= Message-ID: <3dH7RH47qzz7Lrw@mail.python.org> http://hg.python.org/cpython/rev/2834e410d1ae changeset: 87023:2834e410d1ae parent: 87020:6952c14a3f75 parent: 87022:204e66190dbb user: Serhiy Storchaka date: Sat Nov 09 21:17:37 2013 +0200 summary: Fix Tkinter tests on Tk 8.5 with patchlevel < 8.5.11 (issue #19085). files: Lib/tkinter/test/support.py | 15 +++++ Lib/tkinter/test/test_tkinter/test_widgets.py | 11 ++- Lib/tkinter/test/widget_tests.py | 28 ++------- 3 files changed, 31 insertions(+), 23 deletions(-) diff --git a/Lib/tkinter/test/support.py b/Lib/tkinter/test/support.py --- a/Lib/tkinter/test/support.py +++ b/Lib/tkinter/test/support.py @@ -86,6 +86,21 @@ return unittest.skipUnless(tcl_version >= version, 'requires Tcl version >= ' + '.'.join(map(str, version))) +_tk_patchlevel = None +def get_tk_patchlevel(): + global _tk_patchlevel + if _tk_patchlevel is None: + tcl = tkinter.Tcl() + patchlevel = [] + for x in tcl.call('info', 'patchlevel').split('.'): + try: + x = int(x, 10) + except ValueError: + x = -1 + patchlevel.append(x) + _tk_patchlevel = tuple(patchlevel) + return _tk_patchlevel + units = { 'c': 72 / 2.54, # centimeters 'i': 72, # inches diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -3,7 +3,8 @@ import os from test.support import requires -from tkinter.test.support import tcl_version, requires_tcl, widget_eq +from tkinter.test.support import (tcl_version, requires_tcl, + get_tk_patchlevel, widget_eq) from tkinter.test.widget_tests import ( add_standard_options, noconv, pixels_round, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) @@ -539,7 +540,7 @@ def test_selectborderwidth(self): widget = self.create() self.checkPixelsParam(widget, 'selectborderwidth', - 1.3, 2.6, -2, '10p', conv=False, + 1.3, 2.6, -2, '10p', conv=noconv, keep_orig=tcl_version >= (8, 5)) def test_spacing1(self): @@ -580,7 +581,11 @@ def test_tabs(self): widget = self.create() - self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) + if get_tk_patchlevel() < (8, 5, 11): + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i'), + expected=('10.2', '20.7', '1i', '2i')) 
+ else: + self.checkParam(widget, 'tabs', (10.2, 20.7, '1i', '2i')) self.checkParam(widget, 'tabs', '10.2 20.7 1i 2i', expected=('10.2', '20.7', '1i', '2i')) self.checkParam(widget, 'tabs', '2c left 4c 6c center', diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -2,28 +2,19 @@ import tkinter from tkinter.ttk import setup_master, Scale -from tkinter.test.support import (tcl_version, requires_tcl, pixels_conv, - tcl_obj_eq) +from tkinter.test.support import (tcl_version, requires_tcl, get_tk_patchlevel, + pixels_conv, tcl_obj_eq) -noconv = str if tcl_version < (8, 5) else False +noconv = False +if get_tk_patchlevel() < (8, 5, 11): + noconv = str pixels_round = round -if tcl_version[:2] == (8, 5): +if get_tk_patchlevel()[:3] == (8, 5, 11): # Issue #19085: Workaround a bug in Tk # http://core.tcl.tk/tk/info/3497848 - _pixels_round = None - def pixels_round(x): - global _pixels_round - if _pixels_round is None: - root = setup_master() - patchlevel = root.call('info', 'patchlevel') - patchlevel = tuple(map(int, patchlevel.split('.'))) - if patchlevel < (8, 5, 12): - _pixels_round = int - else: - _pixels_round = round - return _pixels_round(x) + pixels_round = int _sentinel = object() @@ -406,10 +397,7 @@ def test_wraplength(self): widget = self.create() - if tcl_version < (8, 5): - self.checkPixelsParam(widget, 'wraplength', 100) - else: - self.checkParams(widget, 'wraplength', 100) + self.checkPixelsParam(widget, 'wraplength', 100) def test_xscrollcommand(self): widget = self.create() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 9 22:17:11 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 22:17:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE1NzUw?= =?utf-8?q?20=3A_Fixed_support_of_24-bit_wave_files_on_big-endian_platform?= =?utf-8?q?s=2E?= Message-ID: <3dHB4l2mXnz7Lr2@mail.python.org> http://hg.python.org/cpython/rev/5fbcb4aa48fa changeset: 87024:5fbcb4aa48fa branch: 2.7 parent: 87021:be8f9beca8aa user: Serhiy Storchaka date: Sat Nov 09 23:09:44 2013 +0200 summary: Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
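
The core of the fix below is ``_byteswap3()``, which flips each 3-byte (24-bit) sample between little-endian and big-endian with extended slice assignment. A tiny illustration of that trick, with made-up sample bytes:

    data = b'\x01\x02\x03\x04\x05\x06'   # two 24-bit samples
    ba = bytearray(data)
    ba[::3] = data[2::3]    # first byte of every 3-byte group <- third byte
    ba[2::3] = data[::3]    # third byte of every 3-byte group <- first byte
    print(bytes(ba) == b'\x03\x02\x01\x06\x05\x04')   # True: each sample reversed
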
files: Lib/test/test_wave.py | 3 --- Lib/wave.py | 21 +++++++++++++-------- Misc/NEWS | 2 ++ 3 files changed, 15 insertions(+), 11 deletions(-) diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -49,9 +49,6 @@ frames = audiotests.byteswap2(frames) - at unittest.skipIf(sys.byteorder == 'big', - '24-bit wave files are supported only on little-endian ' - 'platforms') class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, unittest.TestCase): diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -82,14 +82,15 @@ _array_fmts = None, 'b', 'h', None, 'i' -# Determine endian-ness import struct -if struct.pack("h", 1) == "\000\001": - big_endian = 1 -else: - big_endian = 0 +import sys +from chunk import Chunk -from chunk import Chunk +def _byteswap3(data): + ba = bytearray(data) + ba[::3] = data[2::3] + ba[2::3] = data[::3] + return bytes(ba) class Wave_read: """Variables used in this class: @@ -231,7 +232,7 @@ self._data_seek_needed = 0 if nframes == 0: return '' - if self._sampwidth > 1 and big_endian: + if self._sampwidth in (2, 4) and sys.byteorder == 'big': # unfortunately the fromfile() method does not take # something that only looks like a file object, so # we have to reach into the innards of the chunk object @@ -252,6 +253,8 @@ data = data.tostring() else: data = self._data_chunk.read(nframes * self._framesize) + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) if self._convert and data: data = self._convert(data) self._soundpos = self._soundpos + len(data) // (self._nchannels * self._sampwidth) @@ -419,7 +422,7 @@ nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: data = self._convert(data) - if self._sampwidth > 1 and big_endian: + if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array data = array.array(_array_fmts[self._sampwidth], data) assert data.itemsize == self._sampwidth @@ -427,6 +430,8 @@ data.tofile(self._file) self._datawritten = self._datawritten + len(data) * self._sampwidth else: + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) self._file.write(data) self._datawritten = self._datawritten + len(data) self._nframeswritten = self._nframeswritten + nframes diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,8 @@ Library ------- +- Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. + - Issue #19480: HTMLParser now accepts all valid start-tag names as defined by the HTML5 standard. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 9 22:17:12 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 22:17:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE1NzUw?= =?utf-8?q?20=3A_Fixed_support_of_24-bit_wave_files_on_big-endian_platform?= =?utf-8?q?s=2E?= Message-ID: <3dHB4m4kYHz7LsW@mail.python.org> http://hg.python.org/cpython/rev/79b8b7c5fe8a changeset: 87025:79b8b7c5fe8a branch: 3.3 parent: 87022:204e66190dbb user: Serhiy Storchaka date: Sat Nov 09 23:12:06 2013 +0200 summary: Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
files: Lib/test/test_wave.py | 3 --- Lib/wave.py | 21 +++++++++++++-------- Misc/NEWS | 2 ++ 3 files changed, 15 insertions(+), 11 deletions(-) diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -49,9 +49,6 @@ frames = audiotests.byteswap2(frames) - at unittest.skipIf(sys.byteorder == 'big', - '24-bit wave files are supported only on little-endian ' - 'platforms') class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, unittest.TestCase): diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -82,14 +82,15 @@ _array_fmts = None, 'b', 'h', None, 'i' -# Determine endian-ness import struct -if struct.pack("h", 1) == b"\000\001": - big_endian = 1 -else: - big_endian = 0 +import sys +from chunk import Chunk -from chunk import Chunk +def _byteswap3(data): + ba = bytearray(data) + ba[::3] = data[2::3] + ba[2::3] = data[::3] + return bytes(ba) class Wave_read: """Variables used in this class: @@ -231,7 +232,7 @@ self._data_seek_needed = 0 if nframes == 0: return b'' - if self._sampwidth > 1 and big_endian: + if self._sampwidth in (2, 4) and sys.byteorder == 'big': # unfortunately the fromfile() method does not take # something that only looks like a file object, so # we have to reach into the innards of the chunk object @@ -252,6 +253,8 @@ data = data.tobytes() else: data = self._data_chunk.read(nframes * self._framesize) + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) if self._convert and data: data = self._convert(data) self._soundpos = self._soundpos + len(data) // (self._nchannels * self._sampwidth) @@ -419,7 +422,7 @@ nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: data = self._convert(data) - if self._sampwidth > 1 and big_endian: + if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array data = array.array(_array_fmts[self._sampwidth], data) assert data.itemsize == self._sampwidth @@ -427,6 +430,8 @@ data.tofile(self._file) self._datawritten = self._datawritten + len(data) * self._sampwidth else: + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) self._file.write(data) self._datawritten = self._datawritten + len(data) self._nframeswritten = self._nframeswritten + nframes diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,8 @@ Library ------- +- Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. + - Issue #19480: HTMLParser now accepts all valid start-tag names as defined by the HTML5 standard. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 9 22:17:13 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 9 Nov 2013 22:17:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=231575020=3A_Fixed_support_of_24-bit_wave_files_o?= =?utf-8?q?n_big-endian_platforms=2E?= Message-ID: <3dHB4n6hZJz7LtL@mail.python.org> http://hg.python.org/cpython/rev/1ee45eb6aab9 changeset: 87026:1ee45eb6aab9 parent: 87023:2834e410d1ae parent: 87025:79b8b7c5fe8a user: Serhiy Storchaka date: Sat Nov 09 23:15:52 2013 +0200 summary: Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
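
With the 3-byte case handled, a 24-bit file written and read back through the module comes out identical regardless of the platform's byte order. A minimal usage sketch (the file name and sample bytes are invented for the example):

    import wave

    frames = b'\x01\x02\x03\x04\x05\x06'   # two mono 24-bit frames in native byte order
    w = wave.open('demo24.wav', 'wb')      # hypothetical output file
    w.setnchannels(1)
    w.setsampwidth(3)                      # 3 bytes per sample = 24 bits
    w.setframerate(8000)
    w.writeframes(frames)
    w.close()

    r = wave.open('demo24.wav', 'rb')
    assert r.readframes(r.getnframes()) == frames
    r.close()
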
files: Lib/test/test_wave.py | 3 --- Lib/wave.py | 14 ++++++++++++-- Misc/NEWS | 2 ++ 3 files changed, 14 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -49,9 +49,6 @@ frames = audiotests.byteswap2(frames) - at unittest.skipIf(sys.byteorder == 'big', - '24-bit wave files are supported only on little-endian ' - 'platforms') class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, unittest.TestCase): diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -87,6 +87,12 @@ from chunk import Chunk from collections import namedtuple +def _byteswap3(data): + ba = bytearray(data) + ba[::3] = data[2::3] + ba[2::3] = data[::3] + return bytes(ba) + _wave_params = namedtuple('_wave_params', 'nchannels sampwidth framerate nframes comptype compname') @@ -237,7 +243,7 @@ self._data_seek_needed = 0 if nframes == 0: return b'' - if self._sampwidth > 1 and sys.byteorder == 'big': + if self._sampwidth in (2, 4) and sys.byteorder == 'big': # unfortunately the fromfile() method does not take # something that only looks like a file object, so # we have to reach into the innards of the chunk object @@ -258,6 +264,8 @@ data = data.tobytes() else: data = self._data_chunk.read(nframes * self._framesize) + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) if self._convert and data: data = self._convert(data) self._soundpos = self._soundpos + len(data) // (self._nchannels * self._sampwidth) @@ -431,7 +439,7 @@ nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: data = self._convert(data) - if self._sampwidth > 1 and sys.byteorder == 'big': + if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array data = array.array(_array_fmts[self._sampwidth], data) assert data.itemsize == self._sampwidth @@ -439,6 +447,8 @@ data.tofile(self._file) self._datawritten = self._datawritten + len(data) * self._sampwidth else: + if self._sampwidth == 3 and sys.byteorder == 'big': + data = _byteswap3(data) self._file.write(data) self._datawritten = self._datawritten + len(data) self._nframeswritten = self._nframeswritten + nframes diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,8 @@ Library ------- +- Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
+ - Issue #19378: Fixed a number of cases in the dis module where the new "file" parameter was not being honoured correctly -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sun Nov 10 04:49:41 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 10 Nov 2013 04:49:41 +0100 Subject: [Python-checkins] Daily reference leaks (1ee45eb6aab9): sum=4 Message-ID: results for 1ee45eb6aab9 on branch "default" -------------------------------------------- test_asyncio leaked [0, 4, 0] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogm_GssV', '-x'] From python-checkins at python.org Sun Nov 10 08:38:59 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 08:38:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_update_for_2=2E7=2E6?= Message-ID: <3dHRtC6G89z7LjM@mail.python.org> http://hg.python.org/peps/rev/1bd1cd100407 changeset: 5262:1bd1cd100407 user: Benjamin Peterson date: Sun Nov 10 02:38:37 2013 -0500 summary: update for 2.7.6 files: pep-0373.txt | 6 ++---- 1 files changed, 2 insertions(+), 4 deletions(-) diff --git a/pep-0373.txt b/pep-0373.txt --- a/pep-0373.txt +++ b/pep-0373.txt @@ -56,10 +56,6 @@ Planned future release dates: -- 2.7.6rc1 October 26 2013 -- 2.7.6 November 2 2013 - - - 2.7.7 May 2014 - 2.7.8 November 2014 - 2.7.9 May 2015 @@ -74,6 +70,8 @@ - 2.7.4rc1 2013-03-23 - 2.7.4 2013-04-06 - 2.7.5 2013-05-12 +- 2.7.6rc1 2013-10-26 +- 2.7.6 2012-11-10 Possible features for 2.7 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 10 08:47:26 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 08:47:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogMi43LjYgZmluYWw=?= Message-ID: <3dHS3y4gsPz7Ljm@mail.python.org> http://hg.python.org/cpython/rev/3a1db0d2747e changeset: 87027:3a1db0d2747e branch: 2.7 tag: v2.7.6 parent: 86942:ba31940588b6 user: Benjamin Peterson date: Sun Nov 10 02:36:30 2013 -0500 summary: 2.7.6 final files: Include/patchlevel.h | 6 +++--- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 2 +- Misc/RPM/python-2.7.spec | 2 +- 5 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -23,11 +23,11 @@ #define PY_MAJOR_VERSION 2 #define PY_MINOR_VERSION 7 #define PY_MICRO_VERSION 6 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 1 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "2.7.6rc1" +#define PY_VERSION "2.7.6" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository). Empty diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -15,5 +15,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "2.7.6rc1" +__version__ = "2.7.6" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "2.7.6rc1" +IDLE_VERSION = "2.7.6" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -4,7 +4,7 @@ Whats' New in Python 2.7.6? 
=========================== -*Release date: 2013-11-02* +*Release date: 2013-11-10* Library ------- diff --git a/Misc/RPM/python-2.7.spec b/Misc/RPM/python-2.7.spec --- a/Misc/RPM/python-2.7.spec +++ b/Misc/RPM/python-2.7.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 2.7.6rc1 +%define version 2.7.6 %define libvers 2.7 #--end constants-- %define release 1pydotorg -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 08:47:27 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 08:47:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Added_tag_v2?= =?utf-8?q?=2E7=2E6_for_changeset_3a1db0d2747e?= Message-ID: <3dHS3z6R9Vz7Ljm@mail.python.org> http://hg.python.org/cpython/rev/99d03261c1ba changeset: 87028:99d03261c1ba branch: 2.7 user: Benjamin Peterson date: Sun Nov 10 02:36:35 2013 -0500 summary: Added tag v2.7.6 for changeset 3a1db0d2747e files: .hgtags | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -161,3 +161,4 @@ ab05e7dd27889b93f20d97bae86170aabfe45ace v2.7.5 a0025037f11a73df5a7dd03e5a4027adad4cb94e v2.6.9rc1 4913d0e9be30666218cc4d713937e81c0e7f346a v2.7.6rc1 +3a1db0d2747ec2d47a8693ed5650f3567161a200 v2.7.6 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 08:47:29 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 08:47:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_merge_2=2E7=2E6_release_branch?= Message-ID: <3dHS4119rvz7Ljm@mail.python.org> http://hg.python.org/cpython/rev/f373f5f47991 changeset: 87029:f373f5f47991 branch: 2.7 parent: 87024:5fbcb4aa48fa parent: 87028:99d03261c1ba user: Benjamin Peterson date: Sun Nov 10 02:46:48 2013 -0500 summary: merge 2.7.6 release branch files: .hgtags | 1 + Include/patchlevel.h | 6 +++--- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 2 +- Misc/RPM/python-2.7.spec | 2 +- 6 files changed, 8 insertions(+), 7 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -162,3 +162,4 @@ a0025037f11a73df5a7dd03e5a4027adad4cb94e v2.6.9rc1 fcb3ec2842f99454322492dd0ec2cf01322df093 v2.6.9 4913d0e9be30666218cc4d713937e81c0e7f346a v2.7.6rc1 +3a1db0d2747ec2d47a8693ed5650f3567161a200 v2.7.6 diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -23,11 +23,11 @@ #define PY_MAJOR_VERSION 2 #define PY_MINOR_VERSION 7 #define PY_MICRO_VERSION 6 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 1 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "2.7.6rc1" +#define PY_VERSION "2.7.6" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository). Empty diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -15,5 +15,5 @@ # Updated automatically by the Python release process. 
# #--start constants-- -__version__ = "2.7.6rc1" +__version__ = "2.7.6" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "2.7.6rc1" +IDLE_VERSION = "2.7.6" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -46,7 +46,7 @@ Whats' New in Python 2.7.6? =========================== -*Release date: 2013-11-02* +*Release date: 2013-11-10* Library ------- diff --git a/Misc/RPM/python-2.7.spec b/Misc/RPM/python-2.7.spec --- a/Misc/RPM/python-2.7.spec +++ b/Misc/RPM/python-2.7.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 2.7.6rc1 +%define version 2.7.6 %define libvers 2.7 #--end constants-- %define release 1pydotorg -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 08:47:30 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 08:47:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogYSBwb3N0LTIuNy42?= =?utf-8?q?_world?= Message-ID: <3dHS422wb1z7Lpw@mail.python.org> http://hg.python.org/cpython/rev/c163e5011bd3 changeset: 87030:c163e5011bd3 branch: 2.7 user: Benjamin Peterson date: Sun Nov 10 02:47:17 2013 -0500 summary: a post-2.7.6 world files: Include/patchlevel.h | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -27,7 +27,7 @@ #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "2.7.6" +#define PY_VERSION "2.7.6+" /*--end constants--*/ /* Subversion Revision number of this file (not of the repository). Empty -- Repository URL: http://hg.python.org/cpython From victor.stinner at gmail.com Fri Nov 8 00:43:37 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 8 Nov 2013 00:43:37 +0100 Subject: [Python-checkins] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: Hi, [PyRun_InteractiveOneObject()] 2013/11/8 Nick Coghlan : > New C APIs should either be documented or have an underscore prefix. I created the issue #19518 to add documentation (but also to add even more functions :-)). > Also, if they're part of the stable ABI, they need a version guard. As most PyRun_xxx() functions, the function is declared in a "#ifndef Py_LIMITED_API" block. It means that it is not part of the stable ABI, is that correct? > Wishlist item: an automated ABI checker that can diff the exported symbols > against a reference list (Could ctypes or cffi be used for that?) Yeah, it would be nice to have a tool to list changes on the stable ABI :-) * * * For a previous issue, I also added various other functions like PyParser_ASTFromStringObject() or PyParser_ASTFromFileObject(). As PyRun_InteractiveOneObject(), they are declared in a "#ifndef Py_LIMITED_API" block in pythonrun.h. These new functions are not documented, because PyParser_AST*() functions are not documented. http://bugs.python.org/issue11619 http://hg.python.org/cpython/rev/df2fdd42b375 The patch parser_unicode.patch was attached to the issue during 2 years. First I didn't want to commit it becaues it added too many new functions. But I pushed by change because some Windows users like using the full Unicode range in the name of their scripts and modules! If it's not too late, I would appreciate a review of the stable ABI of this changeset. 
Victor From victor.stinner at gmail.com Fri Nov 8 00:46:40 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 8 Nov 2013 00:46:40 +0100 Subject: [Python-checkins] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: 2013/11/8 Nick Coghlan : >> http://hg.python.org/cpython/rev/69071054b42f >> changeset: 86968:69071054b42f >> user: Victor Stinner >> date: Wed Nov 06 18:58:22 2013 +0100 >> summary: >> Issue #19512: Add a new _PyDict_DelItemId() function, similar to >> PyDict_DelItemString() but using an identifier for the key >> ... > > As a private API, this shouldn't be part of the stable ABI. When I don't know if a function should be made public or not, I compare with other functions. In Python 3.3, _PyDict_GetItemIdWithError(), _PyDict_GetItemId() and _PyDict_SetItemId() are part of the stable ABI if I read correctly dictobject.h. _PyObject_GetAttrId() is also part of the stable ABI. Was it a mistake, or did I misunderstand how stable functions are declared? Victor From victor.stinner at gmail.com Fri Nov 8 00:55:54 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 8 Nov 2013 00:55:54 +0100 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: 2013/11/8 Nick Coghlan : >> [PyRun_InteractiveOneObject()] >> (...) >> > Also, if they're part of the stable ABI, they need a version guard. >> >> As most PyRun_xxx() functions, the function is declared in a "#ifndef >> Py_LIMITED_API" block. It means that it is not part of the stable ABI, >> is that correct? > > Yeah, that's fine - I'm just replying on my phone at the moment, so checking > the broader file context is a pain :) About the 72523 functions PyRun_xxx(), I don't understand something. The PyRun_FileEx() is documented in the Python "very high" C API to use Python. But this function is not part of the stable ABI. So there is no "very high" function in the stable ABI to use Python? Another question: it's not documented if a function is part or not part of the stable ABI. So as an user of the API, it is hard to check if a function is part of the stable ABI or not. Victor From victor.stinner at gmail.com Fri Nov 8 00:58:43 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 8 Nov 2013 00:58:43 +0100 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add PyRun_InteractiveOneObject() function In-Reply-To: References: <3dFFXL6Cxgz7LsB@mail.python.org> Message-ID: 2013/11/8 Victor Stinner : > Another question: it's not documented if a function is part or not > part of the stable ABI. So as an user of the API, it is hard to check > if a function is part of the stable ABI or not. A solution for that it maybe to split the documentation of the C API in two parts: * Stable C ABI * Private C ABI If it's stupid, at least functions of the stable ABI should be marked in the documentation. 
Victor From victor.stinner at gmail.com Fri Nov 8 12:19:19 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 8 Nov 2013 12:19:19 +0100 Subject: [Python-checkins] [Python-Dev] cpython: Issue #19512: Add a new _PyDict_DelItemId() function, similar to In-Reply-To: References: <3dFG004BDdz7Lpy@mail.python.org> Message-ID: 2013/11/8 Nick Coghlan : >> In Python 3.3, _PyDict_GetItemIdWithError(), _PyDict_GetItemId() and >> _PyDict_SetItemId() are part of the stable ABI if I read correctly >> dictobject.h. _PyObject_GetAttrId() is also part of the stable ABI. >> Was it a mistake, or did I misunderstand how stable functions are >> declared? > > Likely a mistake - the stable ABI is hard to review properly (since it can > depend on non local preprocessor checks, so a mistake may not be obvious in > a diff), we don't currently have a systematic approach to handling changes > and there's no automated test to catch inadvertent additions or (worse) > removals :( Would it be possible to remove them from the stable ABI in Python 3.4? They are marked as private using the "_Py" prefix... > This may be a good thing for us to look at more generally when things settle > down a bit after the beta 1 feature freeze. I created the following issue to not forget it: http://bugs.python.org/issue19526 Victor From python-checkins at python.org Sun Nov 10 19:46:04 2013 From: python-checkins at python.org (andrew.kuchling) Date: Sun, 10 Nov 2013 19:46:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Ignore_two_constructed_dir?= =?utf-8?q?ectories_in_Tools/unicode?= Message-ID: <3dHkgw2cmYz7Lsv@mail.python.org> http://hg.python.org/cpython/rev/204b59a918f7 changeset: 87031:204b59a918f7 parent: 87026:1ee45eb6aab9 user: Andrew Kuchling date: Sun Nov 10 13:43:47 2013 -0500 summary: Ignore two constructed directories in Tools/unicode files: .hgignore | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -87,6 +87,8 @@ PCbuild/x64-temp-* PCbuild/amd64 PCbuild/ipch +Tools/unicode/build/ +Tools/unicode/MAPPINGS/ BuildLog.htm __pycache__ Modules/_freeze_importlib -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 19:46:05 2013 From: python-checkins at python.org (andrew.kuchling) Date: Sun, 10 Nov 2013 19:46:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=231097797=3A_Add_CP273_co?= =?utf-8?q?dec=2C_and_exercise_it_in_the_test_suite?= Message-ID: <3dHkgx6mzFz7Ltb@mail.python.org> http://hg.python.org/cpython/rev/7d9d1bcd7d18 changeset: 87032:7d9d1bcd7d18 user: Andrew Kuchling date: Sun Nov 10 13:44:30 2013 -0500 summary: #1097797: Add CP273 codec, and exercise it in the test suite files: Lib/encodings/cp273.py | 307 +++++++++++++++++++++++++++ Lib/test/test_unicode.py | 4 +- 2 files changed, 309 insertions(+), 2 deletions(-) diff --git a/Lib/encodings/cp273.py b/Lib/encodings/cp273.py new file mode 100644 --- /dev/null +++ b/Lib/encodings/cp273.py @@ -0,0 +1,307 @@ +""" Python Character Mapping Codec cp273 generated from 'python-mappings/CP273.TXT' with gencodec.py. 
+ +"""#" + +import codecs + +### Codec APIs + +class Codec(codecs.Codec): + + def encode(self,input,errors='strict'): + return codecs.charmap_encode(input,errors,encoding_table) + + def decode(self,input,errors='strict'): + return codecs.charmap_decode(input,errors,decoding_table) + +class IncrementalEncoder(codecs.IncrementalEncoder): + def encode(self, input, final=False): + return codecs.charmap_encode(input,self.errors,encoding_table)[0] + +class IncrementalDecoder(codecs.IncrementalDecoder): + def decode(self, input, final=False): + return codecs.charmap_decode(input,self.errors,decoding_table)[0] + +class StreamWriter(Codec,codecs.StreamWriter): + pass + +class StreamReader(Codec,codecs.StreamReader): + pass + +### encodings module API + +def getregentry(): + return codecs.CodecInfo( + name='cp273', + encode=Codec().encode, + decode=Codec().decode, + incrementalencoder=IncrementalEncoder, + incrementaldecoder=IncrementalDecoder, + streamreader=StreamReader, + streamwriter=StreamWriter, + ) + + +### Decoding Table + +decoding_table = ( + '\x00' # 0x00 -> NULL (NUL) + '\x01' # 0x01 -> START OF HEADING (SOH) + '\x02' # 0x02 -> START OF TEXT (STX) + '\x03' # 0x03 -> END OF TEXT (ETX) + '\x9c' # 0x04 -> STRING TERMINATOR (ST) + '\t' # 0x05 -> CHARACTER TABULATION (HT) + '\x86' # 0x06 -> START OF SELECTED AREA (SSA) + '\x7f' # 0x07 -> DELETE (DEL) + '\x97' # 0x08 -> END OF GUARDED AREA (EPA) + '\x8d' # 0x09 -> REVERSE LINE FEED (RI) + '\x8e' # 0x0A -> SINGLE-SHIFT TWO (SS2) + '\x0b' # 0x0B -> LINE TABULATION (VT) + '\x0c' # 0x0C -> FORM FEED (FF) + '\r' # 0x0D -> CARRIAGE RETURN (CR) + '\x0e' # 0x0E -> SHIFT OUT (SO) + '\x0f' # 0x0F -> SHIFT IN (SI) + '\x10' # 0x10 -> DATALINK ESCAPE (DLE) + '\x11' # 0x11 -> DEVICE CONTROL ONE (DC1) + '\x12' # 0x12 -> DEVICE CONTROL TWO (DC2) + '\x13' # 0x13 -> DEVICE CONTROL THREE (DC3) + '\x9d' # 0x14 -> OPERATING SYSTEM COMMAND (OSC) + '\x85' # 0x15 -> NEXT LINE (NEL) + '\x08' # 0x16 -> BACKSPACE (BS) + '\x87' # 0x17 -> END OF SELECTED AREA (ESA) + '\x18' # 0x18 -> CANCEL (CAN) + '\x19' # 0x19 -> END OF MEDIUM (EM) + '\x92' # 0x1A -> PRIVATE USE TWO (PU2) + '\x8f' # 0x1B -> SINGLE-SHIFT THREE (SS3) + '\x1c' # 0x1C -> FILE SEPARATOR (IS4) + '\x1d' # 0x1D -> GROUP SEPARATOR (IS3) + '\x1e' # 0x1E -> RECORD SEPARATOR (IS2) + '\x1f' # 0x1F -> UNIT SEPARATOR (IS1) + '\x80' # 0x20 -> PADDING CHARACTER (PAD) + '\x81' # 0x21 -> HIGH OCTET PRESET (HOP) + '\x82' # 0x22 -> BREAK PERMITTED HERE (BPH) + '\x83' # 0x23 -> NO BREAK HERE (NBH) + '\x84' # 0x24 -> INDEX (IND) + '\n' # 0x25 -> LINE FEED (LF) + '\x17' # 0x26 -> END OF TRANSMISSION BLOCK (ETB) + '\x1b' # 0x27 -> ESCAPE (ESC) + '\x88' # 0x28 -> CHARACTER TABULATION SET (HTS) + '\x89' # 0x29 -> CHARACTER TABULATION WITH JUSTIFICATION (HTJ) + '\x8a' # 0x2A -> LINE TABULATION SET (VTS) + '\x8b' # 0x2B -> PARTIAL LINE FORWARD (PLD) + '\x8c' # 0x2C -> PARTIAL LINE BACKWARD (PLU) + '\x05' # 0x2D -> ENQUIRY (ENQ) + '\x06' # 0x2E -> ACKNOWLEDGE (ACK) + '\x07' # 0x2F -> BELL (BEL) + '\x90' # 0x30 -> DEVICE CONTROL STRING (DCS) + '\x91' # 0x31 -> PRIVATE USE ONE (PU1) + '\x16' # 0x32 -> SYNCHRONOUS IDLE (SYN) + '\x93' # 0x33 -> SET TRANSMIT STATE (STS) + '\x94' # 0x34 -> CANCEL CHARACTER (CCH) + '\x95' # 0x35 -> MESSAGE WAITING (MW) + '\x96' # 0x36 -> START OF GUARDED AREA (SPA) + '\x04' # 0x37 -> END OF TRANSMISSION (EOT) + '\x98' # 0x38 -> START OF STRING (SOS) + '\x99' # 0x39 -> SINGLE GRAPHIC CHARACTER INTRODUCER (SGCI) + '\x9a' # 0x3A -> SINGLE CHARACTER INTRODUCER (SCI) + '\x9b' # 0x3B -> CONTROL SEQUENCE 
INTRODUCER (CSI) + '\x14' # 0x3C -> DEVICE CONTROL FOUR (DC4) + '\x15' # 0x3D -> NEGATIVE ACKNOWLEDGE (NAK) + '\x9e' # 0x3E -> PRIVACY MESSAGE (PM) + '\x1a' # 0x3F -> SUBSTITUTE (SUB) + ' ' # 0x40 -> SPACE + '\xa0' # 0x41 -> NO-BREAK SPACE + '\xe2' # 0x42 -> LATIN SMALL LETTER A WITH CIRCUMFLEX + '{' # 0x43 -> LEFT CURLY BRACKET + '\xe0' # 0x44 -> LATIN SMALL LETTER A WITH GRAVE + '\xe1' # 0x45 -> LATIN SMALL LETTER A WITH ACUTE + '\xe3' # 0x46 -> LATIN SMALL LETTER A WITH TILDE + '\xe5' # 0x47 -> LATIN SMALL LETTER A WITH RING ABOVE + '\xe7' # 0x48 -> LATIN SMALL LETTER C WITH CEDILLA + '\xf1' # 0x49 -> LATIN SMALL LETTER N WITH TILDE + '\xc4' # 0x4A -> LATIN CAPITAL LETTER A WITH DIAERESIS + '.' # 0x4B -> FULL STOP + '<' # 0x4C -> LESS-THAN SIGN + '(' # 0x4D -> LEFT PARENTHESIS + '+' # 0x4E -> PLUS SIGN + '!' # 0x4F -> EXCLAMATION MARK + '&' # 0x50 -> AMPERSAND + '\xe9' # 0x51 -> LATIN SMALL LETTER E WITH ACUTE + '\xea' # 0x52 -> LATIN SMALL LETTER E WITH CIRCUMFLEX + '\xeb' # 0x53 -> LATIN SMALL LETTER E WITH DIAERESIS + '\xe8' # 0x54 -> LATIN SMALL LETTER E WITH GRAVE + '\xed' # 0x55 -> LATIN SMALL LETTER I WITH ACUTE + '\xee' # 0x56 -> LATIN SMALL LETTER I WITH CIRCUMFLEX + '\xef' # 0x57 -> LATIN SMALL LETTER I WITH DIAERESIS + '\xec' # 0x58 -> LATIN SMALL LETTER I WITH GRAVE + '~' # 0x59 -> TILDE + '\xdc' # 0x5A -> LATIN CAPITAL LETTER U WITH DIAERESIS + '$' # 0x5B -> DOLLAR SIGN + '*' # 0x5C -> ASTERISK + ')' # 0x5D -> RIGHT PARENTHESIS + ';' # 0x5E -> SEMICOLON + '^' # 0x5F -> CIRCUMFLEX ACCENT + '-' # 0x60 -> HYPHEN-MINUS + '/' # 0x61 -> SOLIDUS + '\xc2' # 0x62 -> LATIN CAPITAL LETTER A WITH CIRCUMFLEX + '[' # 0x63 -> LEFT SQUARE BRACKET + '\xc0' # 0x64 -> LATIN CAPITAL LETTER A WITH GRAVE + '\xc1' # 0x65 -> LATIN CAPITAL LETTER A WITH ACUTE + '\xc3' # 0x66 -> LATIN CAPITAL LETTER A WITH TILDE + '\xc5' # 0x67 -> LATIN CAPITAL LETTER A WITH RING ABOVE + '\xc7' # 0x68 -> LATIN CAPITAL LETTER C WITH CEDILLA + '\xd1' # 0x69 -> LATIN CAPITAL LETTER N WITH TILDE + '\xf6' # 0x6A -> LATIN SMALL LETTER O WITH DIAERESIS + ',' # 0x6B -> COMMA + '%' # 0x6C -> PERCENT SIGN + '_' # 0x6D -> LOW LINE + '>' # 0x6E -> GREATER-THAN SIGN + '?' 
# 0x6F -> QUESTION MARK + '\xf8' # 0x70 -> LATIN SMALL LETTER O WITH STROKE + '\xc9' # 0x71 -> LATIN CAPITAL LETTER E WITH ACUTE + '\xca' # 0x72 -> LATIN CAPITAL LETTER E WITH CIRCUMFLEX + '\xcb' # 0x73 -> LATIN CAPITAL LETTER E WITH DIAERESIS + '\xc8' # 0x74 -> LATIN CAPITAL LETTER E WITH GRAVE + '\xcd' # 0x75 -> LATIN CAPITAL LETTER I WITH ACUTE + '\xce' # 0x76 -> LATIN CAPITAL LETTER I WITH CIRCUMFLEX + '\xcf' # 0x77 -> LATIN CAPITAL LETTER I WITH DIAERESIS + '\xcc' # 0x78 -> LATIN CAPITAL LETTER I WITH GRAVE + '`' # 0x79 -> GRAVE ACCENT + ':' # 0x7A -> COLON + '#' # 0x7B -> NUMBER SIGN + '\xa7' # 0x7C -> SECTION SIGN + "'" # 0x7D -> APOSTROPHE + '=' # 0x7E -> EQUALS SIGN + '"' # 0x7F -> QUOTATION MARK + '\xd8' # 0x80 -> LATIN CAPITAL LETTER O WITH STROKE + 'a' # 0x81 -> LATIN SMALL LETTER A + 'b' # 0x82 -> LATIN SMALL LETTER B + 'c' # 0x83 -> LATIN SMALL LETTER C + 'd' # 0x84 -> LATIN SMALL LETTER D + 'e' # 0x85 -> LATIN SMALL LETTER E + 'f' # 0x86 -> LATIN SMALL LETTER F + 'g' # 0x87 -> LATIN SMALL LETTER G + 'h' # 0x88 -> LATIN SMALL LETTER H + 'i' # 0x89 -> LATIN SMALL LETTER I + '\xab' # 0x8A -> LEFT-POINTING DOUBLE ANGLE QUOTATION MARK + '\xbb' # 0x8B -> RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK + '\xf0' # 0x8C -> LATIN SMALL LETTER ETH (Icelandic) + '\xfd' # 0x8D -> LATIN SMALL LETTER Y WITH ACUTE + '\xfe' # 0x8E -> LATIN SMALL LETTER THORN (Icelandic) + '\xb1' # 0x8F -> PLUS-MINUS SIGN + '\xb0' # 0x90 -> DEGREE SIGN + 'j' # 0x91 -> LATIN SMALL LETTER J + 'k' # 0x92 -> LATIN SMALL LETTER K + 'l' # 0x93 -> LATIN SMALL LETTER L + 'm' # 0x94 -> LATIN SMALL LETTER M + 'n' # 0x95 -> LATIN SMALL LETTER N + 'o' # 0x96 -> LATIN SMALL LETTER O + 'p' # 0x97 -> LATIN SMALL LETTER P + 'q' # 0x98 -> LATIN SMALL LETTER Q + 'r' # 0x99 -> LATIN SMALL LETTER R + '\xaa' # 0x9A -> FEMININE ORDINAL INDICATOR + '\xba' # 0x9B -> MASCULINE ORDINAL INDICATOR + '\xe6' # 0x9C -> LATIN SMALL LETTER AE + '\xb8' # 0x9D -> CEDILLA + '\xc6' # 0x9E -> LATIN CAPITAL LETTER AE + '\xa4' # 0x9F -> CURRENCY SIGN + '\xb5' # 0xA0 -> MICRO SIGN + '\xdf' # 0xA1 -> LATIN SMALL LETTER SHARP S (German) + 's' # 0xA2 -> LATIN SMALL LETTER S + 't' # 0xA3 -> LATIN SMALL LETTER T + 'u' # 0xA4 -> LATIN SMALL LETTER U + 'v' # 0xA5 -> LATIN SMALL LETTER V + 'w' # 0xA6 -> LATIN SMALL LETTER W + 'x' # 0xA7 -> LATIN SMALL LETTER X + 'y' # 0xA8 -> LATIN SMALL LETTER Y + 'z' # 0xA9 -> LATIN SMALL LETTER Z + '\xa1' # 0xAA -> INVERTED EXCLAMATION MARK + '\xbf' # 0xAB -> INVERTED QUESTION MARK + '\xd0' # 0xAC -> LATIN CAPITAL LETTER ETH (Icelandic) + '\xdd' # 0xAD -> LATIN CAPITAL LETTER Y WITH ACUTE + '\xde' # 0xAE -> LATIN CAPITAL LETTER THORN (Icelandic) + '\xae' # 0xAF -> REGISTERED SIGN + '\xa2' # 0xB0 -> CENT SIGN + '\xa3' # 0xB1 -> POUND SIGN + '\xa5' # 0xB2 -> YEN SIGN + '\xb7' # 0xB3 -> MIDDLE DOT + '\xa9' # 0xB4 -> COPYRIGHT SIGN + '@' # 0xB5 -> COMMERCIAL AT + '\xb6' # 0xB6 -> PILCROW SIGN + '\xbc' # 0xB7 -> VULGAR FRACTION ONE QUARTER + '\xbd' # 0xB8 -> VULGAR FRACTION ONE HALF + '\xbe' # 0xB9 -> VULGAR FRACTION THREE QUARTERS + '\xac' # 0xBA -> NOT SIGN + '|' # 0xBB -> VERTICAL LINE + '\u203e' # 0xBC -> OVERLINE + '\xa8' # 0xBD -> DIAERESIS + '\xb4' # 0xBE -> ACUTE ACCENT + '\xd7' # 0xBF -> MULTIPLICATION SIGN + '\xe4' # 0xC0 -> LATIN SMALL LETTER A WITH DIAERESIS + 'A' # 0xC1 -> LATIN CAPITAL LETTER A + 'B' # 0xC2 -> LATIN CAPITAL LETTER B + 'C' # 0xC3 -> LATIN CAPITAL LETTER C + 'D' # 0xC4 -> LATIN CAPITAL LETTER D + 'E' # 0xC5 -> LATIN CAPITAL LETTER E + 'F' # 0xC6 -> LATIN CAPITAL LETTER F + 'G' # 0xC7 -> LATIN CAPITAL 
LETTER G + 'H' # 0xC8 -> LATIN CAPITAL LETTER H + 'I' # 0xC9 -> LATIN CAPITAL LETTER I + '\xad' # 0xCA -> SOFT HYPHEN + '\xf4' # 0xCB -> LATIN SMALL LETTER O WITH CIRCUMFLEX + '\xa6' # 0xCC -> BROKEN BAR + '\xf2' # 0xCD -> LATIN SMALL LETTER O WITH GRAVE + '\xf3' # 0xCE -> LATIN SMALL LETTER O WITH ACUTE + '\xf5' # 0xCF -> LATIN SMALL LETTER O WITH TILDE + '\xfc' # 0xD0 -> LATIN SMALL LETTER U WITH DIAERESIS + 'J' # 0xD1 -> LATIN CAPITAL LETTER J + 'K' # 0xD2 -> LATIN CAPITAL LETTER K + 'L' # 0xD3 -> LATIN CAPITAL LETTER L + 'M' # 0xD4 -> LATIN CAPITAL LETTER M + 'N' # 0xD5 -> LATIN CAPITAL LETTER N + 'O' # 0xD6 -> LATIN CAPITAL LETTER O + 'P' # 0xD7 -> LATIN CAPITAL LETTER P + 'Q' # 0xD8 -> LATIN CAPITAL LETTER Q + 'R' # 0xD9 -> LATIN CAPITAL LETTER R + '\xb9' # 0xDA -> SUPERSCRIPT ONE + '\xfb' # 0xDB -> LATIN SMALL LETTER U WITH CIRCUMFLEX + '}' # 0xDC -> RIGHT CURLY BRACKET + '\xf9' # 0xDD -> LATIN SMALL LETTER U WITH GRAVE + '\xfa' # 0xDE -> LATIN SMALL LETTER U WITH ACUTE + '\xff' # 0xDF -> LATIN SMALL LETTER Y WITH DIAERESIS + '\xd6' # 0xE0 -> LATIN CAPITAL LETTER O WITH DIAERESIS + '\xf7' # 0xE1 -> DIVISION SIGN + 'S' # 0xE2 -> LATIN CAPITAL LETTER S + 'T' # 0xE3 -> LATIN CAPITAL LETTER T + 'U' # 0xE4 -> LATIN CAPITAL LETTER U + 'V' # 0xE5 -> LATIN CAPITAL LETTER V + 'W' # 0xE6 -> LATIN CAPITAL LETTER W + 'X' # 0xE7 -> LATIN CAPITAL LETTER X + 'Y' # 0xE8 -> LATIN CAPITAL LETTER Y + 'Z' # 0xE9 -> LATIN CAPITAL LETTER Z + '\xb2' # 0xEA -> SUPERSCRIPT TWO + '\xd4' # 0xEB -> LATIN CAPITAL LETTER O WITH CIRCUMFLEX + '\\' # 0xEC -> REVERSE SOLIDUS + '\xd2' # 0xED -> LATIN CAPITAL LETTER O WITH GRAVE + '\xd3' # 0xEE -> LATIN CAPITAL LETTER O WITH ACUTE + '\xd5' # 0xEF -> LATIN CAPITAL LETTER O WITH TILDE + '0' # 0xF0 -> DIGIT ZERO + '1' # 0xF1 -> DIGIT ONE + '2' # 0xF2 -> DIGIT TWO + '3' # 0xF3 -> DIGIT THREE + '4' # 0xF4 -> DIGIT FOUR + '5' # 0xF5 -> DIGIT FIVE + '6' # 0xF6 -> DIGIT SIX + '7' # 0xF7 -> DIGIT SEVEN + '8' # 0xF8 -> DIGIT EIGHT + '9' # 0xF9 -> DIGIT NINE + '\xb3' # 0xFA -> SUPERSCRIPT THREE + '\xdb' # 0xFB -> LATIN CAPITAL LETTER U WITH CIRCUMFLEX + ']' # 0xFC -> RIGHT SQUARE BRACKET + '\xd9' # 0xFD -> LATIN CAPITAL LETTER U WITH GRAVE + '\xda' # 0xFE -> LATIN CAPITAL LETTER U WITH ACUTE + '\x9f' # 0xFF -> APPLICATION PROGRAM COMMAND (APC) +) + +### Encoding table +encoding_table=codecs.charmap_build(decoding_table) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1831,7 +1831,7 @@ # 0-127 s = bytes(range(128)) for encoding in ( - 'cp037', 'cp1026', + 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', 'cp863', 'cp865', 'cp866', @@ -1859,7 +1859,7 @@ # 128-255 s = bytes(range(128, 256)) for encoding in ( - 'cp037', 'cp1026', + 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', 'cp863', 'cp865', 'cp866', -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 19:48:03 2013 From: python-checkins at python.org (andrew.kuchling) Date: Sun, 10 Nov 2013 19:48:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_news_entry_for_=231097?= =?utf-8?q?797=3B_whitespace_cleanup?= Message-ID: <3dHkkC0HZlz7LwL@mail.python.org> http://hg.python.org/cpython/rev/fa2581bbef44 changeset: 87033:fa2581bbef44 user: Andrew Kuchling date: Sun Nov 10 13:47:57 2013 -0500 summary: Add news entry for 
#1097797; whitespace cleanup files: Misc/NEWS | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,7 +10,7 @@ Core and Builtins ----------------- -- Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. +- Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. Patch by Andrei Dorian Duma. - Issue #17936: Fix O(n**2) behaviour when adding or removing many subclasses @@ -34,6 +34,9 @@ Library ------- +- Issue #1097797: Added CP273 encoding, used on IBM mainframes in + Germany and Austria. Mapping provided by Michael Bierenfeld. + - Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. - Issue #19378: Fixed a number of cases in the dis module where the new -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:02:51 2013 From: python-checkins at python.org (jason.coombs) Date: Sun, 10 Nov 2013 20:02:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=237171=3A_Add_Windo?= =?utf-8?q?ws_implementation_of_=60=60inet=5Fntop=60=60_and_=60=60inet=5Fp?= =?utf-8?q?ton=60=60_to?= Message-ID: <3dHl3H0F65z7Ljf@mail.python.org> http://hg.python.org/cpython/rev/17b160baa20f changeset: 87034:17b160baa20f parent: 87026:1ee45eb6aab9 user: Atsuo Ishimoto date: Mon Jul 16 15:16:54 2012 +0900 summary: Issue #7171: Add Windows implementation of ``inet_ntop`` and ``inet_pton`` to socket module. files: Doc/library/socket.rst | 4 +- Lib/test/test_socket.py | 16 ++++ Modules/socketmodule.c | 108 +++++++++++++++++++++++++++- 3 files changed, 124 insertions(+), 4 deletions(-) diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -594,7 +594,7 @@ both the value of *address_family* and the underlying implementation of :c:func:`inet_pton`. - Availability: Unix (maybe not all platforms). + Availability: Unix (maybe not all platforms), Windows. .. function:: inet_ntop(address_family, packed_ip) @@ -610,7 +610,7 @@ specified address family, :exc:`ValueError` will be raised. A :exc:`OSError` is raised for errors from the call to :func:`inet_ntop`. - Availability: Unix (maybe not all platforms). + Availability: Unix (maybe not all platforms), Windows. .. 
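
Behaviour-wise the two functions are a straight string/packed round-trip, now also available on Windows; for instance (the IPv6 call assumes IPv6 support is installed, which the tests below also have to account for):

    import socket

    packed = socket.inet_pton(socket.AF_INET, '192.0.2.1')
    print(repr(packed))                                # b'\xc0\x00\x02\x01'
    print(socket.inet_ntop(socket.AF_INET, packed))    # 192.0.2.1

    packed6 = socket.inet_pton(socket.AF_INET6, '2001:db8::1')
    print(socket.inet_ntop(socket.AF_INET6, packed6))  # typically 2001:db8::1
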
diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -980,6 +980,14 @@ return except ImportError: return + + if sys.platform == "win32": + try: + inet_pton(AF_INET6, '::') + except OSError as e: + if e.winerror == 10022: + return # IPv6 might not be installed on this PC + f = lambda a: inet_pton(AF_INET6, a) assertInvalid = lambda a: self.assertRaises( (OSError, ValueError), f, a @@ -1058,6 +1066,14 @@ return except ImportError: return + + if sys.platform == "win32": + try: + inet_ntop(AF_INET6, b'\x00' * 16) + except OSError as e: + if e.winerror == 10022: + return # IPv6 might not be installed on this PC + f = lambda a: inet_ntop(AF_INET6, a) assertInvalid = lambda a: self.assertRaises( (OSError, ValueError), f, a diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -5023,7 +5023,7 @@ return PyUnicode_FromString(inet_ntoa(packed_addr)); } -#ifdef HAVE_INET_PTON +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) PyDoc_STRVAR(inet_pton_doc, "inet_pton(af, ip) -> packed IP address string\n\ @@ -5031,6 +5031,10 @@ Convert an IP address from string format to a packed string suitable\n\ for use with low-level network functions."); +#endif + +#ifdef HAVE_INET_PTON + static PyObject * socket_inet_pton(PyObject *self, PyObject *args) { @@ -5075,12 +5079,52 @@ return NULL; } } +#elif defined(MS_WINDOWS) + +static PyObject * +socket_inet_pton(PyObject *self, PyObject *args) +{ + int af; + char* ip; + struct sockaddr_in6 addr; + INT ret, size; + + if (!PyArg_ParseTuple(args, "is:inet_pton", &af, &ip)) { + return NULL; + } + + size = sizeof(addr); + ret = WSAStringToAddressA(ip, af, NULL, (LPSOCKADDR)&addr, &size); + + if (ret) { + PyErr_SetExcFromWindowsErr(PyExc_OSError, WSAGetLastError()); + return NULL; + } else if(af == AF_INET) { + struct sockaddr_in *addr4 = (struct sockaddr_in*)&addr; + return PyBytes_FromStringAndSize((const char *)&(addr4->sin_addr), + sizeof(addr4->sin_addr)); + } else if (af == AF_INET6) { + return PyBytes_FromStringAndSize((const char *)&(addr.sin6_addr), + sizeof(addr.sin6_addr)); + } else { + PyErr_SetString(PyExc_OSError, "unknown address family"); + return NULL; + } +} + +#endif + +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) PyDoc_STRVAR(inet_ntop_doc, "inet_ntop(af, packed_ip) -> string formatted IP address\n\ \n\ Convert a packed IP address of the given family to string format."); +#endif + + +#ifdef HAVE_INET_PTON static PyObject * socket_inet_ntop(PyObject *self, PyObject *args) { @@ -5134,6 +5178,66 @@ return NULL; } +#elif defined(MS_WINDOWS) + +static PyObject * +socket_inet_ntop(PyObject *self, PyObject *args) +{ + int af; + char* packed; + int len; + struct sockaddr_in6 addr; + DWORD addrlen, ret, retlen; + char ip[MAX(INET_ADDRSTRLEN, INET6_ADDRSTRLEN) + 1]; + + /* Guarantee NUL-termination for PyUnicode_FromString() below */ + memset((void *) &ip[0], '\0', sizeof(ip)); + + if (!PyArg_ParseTuple(args, "iy#:inet_ntop", &af, &packed, &len)) { + return NULL; + } + + if (af == AF_INET) { + struct sockaddr_in * addr4 = (struct sockaddr_in *)&addr; + + if (len != sizeof(struct in_addr)) { + PyErr_SetString(PyExc_ValueError, + "invalid length of packed IP address string"); + return NULL; + } + memset(addr4, 0, sizeof(struct sockaddr_in)); + addr4->sin_family = AF_INET; + memcpy(&(addr4->sin_addr), packed, sizeof(addr4->sin_addr)); + addrlen = sizeof(struct sockaddr_in); + } else if (af == AF_INET6) { + if (len != 
sizeof(struct in6_addr)) { + PyErr_SetString(PyExc_ValueError, + "invalid length of packed IP address string"); + return NULL; + } + + memset(&addr, 0, sizeof(addr)); + addr.sin6_family = AF_INET6; + memcpy(&(addr.sin6_addr), packed, sizeof(addr.sin6_addr)); + addrlen = sizeof(addr); + } else { + PyErr_Format(PyExc_ValueError, + "unknown address family %d", af); + return NULL; + } + + retlen = sizeof(ip); + ret = WSAAddressToStringA((struct sockaddr*)&addr, addrlen, NULL, + ip, &retlen); + + if (ret) { + PyErr_SetExcFromWindowsErr(PyExc_OSError, WSAGetLastError()); + return NULL; + } else { + return PyUnicode_FromString(ip); + } +} + #endif /* HAVE_INET_PTON */ /* Python interface to getaddrinfo(host, port). */ @@ -5600,7 +5704,7 @@ METH_VARARGS, inet_aton_doc}, {"inet_ntoa", socket_inet_ntoa, METH_VARARGS, inet_ntoa_doc}, -#ifdef HAVE_INET_PTON +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) {"inet_pton", socket_inet_pton, METH_VARARGS, inet_pton_doc}, {"inet_ntop", socket_inet_ntop, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:02:52 2013 From: python-checkins at python.org (jason.coombs) Date: Sun, 10 Nov 2013 20:02:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=237171=3A_Update_sy?= =?utf-8?q?ntax_to_replace_MAX_in_favor_of_Py=5FMAX_=28matching?= Message-ID: <3dHl3J24dBz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/a21f506d04c9 changeset: 87035:a21f506d04c9 user: Jason R. Coombs date: Sun Nov 10 13:43:22 2013 -0500 summary: Issue #7171: Update syntax to replace MAX in favor of Py_MAX (matching implementation for Unix). files: Modules/socketmodule.c | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -5188,7 +5188,11 @@ int len; struct sockaddr_in6 addr; DWORD addrlen, ret, retlen; - char ip[MAX(INET_ADDRSTRLEN, INET6_ADDRSTRLEN) + 1]; +#ifdef ENABLE_IPV6 + char ip[Py_MAX(INET_ADDRSTRLEN, INET6_ADDRSTRLEN) + 1]; +#else + char ip[INET_ADDRSTRLEN + 1]; +#endif /* Guarantee NUL-termination for PyUnicode_FromString() below */ memset((void *) &ip[0], '\0', sizeof(ip)); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:02:53 2013 From: python-checkins at python.org (jason.coombs) Date: Sun, 10 Nov 2013 20:02:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Normalize_whitespace?= Message-ID: <3dHl3K3q8tz7Lxc@mail.python.org> http://hg.python.org/cpython/rev/892ec51a162f changeset: 87036:892ec51a162f user: Jason R. 
Coombs date: Sun Nov 10 14:02:04 2013 -0500 summary: Normalize whitespace files: Lib/test/test_socket.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -980,14 +980,14 @@ return except ImportError: return - + if sys.platform == "win32": try: inet_pton(AF_INET6, '::') except OSError as e: if e.winerror == 10022: return # IPv6 might not be installed on this PC - + f = lambda a: inet_pton(AF_INET6, a) assertInvalid = lambda a: self.assertRaises( (OSError, ValueError), f, a -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:02:55 2013 From: python-checkins at python.org (jason.coombs) Date: Sun, 10 Nov 2013 20:02:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge?= Message-ID: <3dHl3M0pk7z7M34@mail.python.org> http://hg.python.org/cpython/rev/466549b1cad8 changeset: 87037:466549b1cad8 parent: 87036:892ec51a162f parent: 87033:fa2581bbef44 user: Jason R. Coombs date: Sun Nov 10 14:02:40 2013 -0500 summary: Merge files: .hgignore | 2 + Lib/encodings/cp273.py | 307 +++++++++++++++++++++++++++ Lib/test/test_unicode.py | 4 +- Misc/NEWS | 5 +- 4 files changed, 315 insertions(+), 3 deletions(-) diff --git a/.hgignore b/.hgignore --- a/.hgignore +++ b/.hgignore @@ -87,6 +87,8 @@ PCbuild/x64-temp-* PCbuild/amd64 PCbuild/ipch +Tools/unicode/build/ +Tools/unicode/MAPPINGS/ BuildLog.htm __pycache__ Modules/_freeze_importlib diff --git a/Lib/encodings/cp273.py b/Lib/encodings/cp273.py new file mode 100644 --- /dev/null +++ b/Lib/encodings/cp273.py @@ -0,0 +1,307 @@ +""" Python Character Mapping Codec cp273 generated from 'python-mappings/CP273.TXT' with gencodec.py. 
+ +"""#" + +import codecs + +### Codec APIs + +class Codec(codecs.Codec): + + def encode(self,input,errors='strict'): + return codecs.charmap_encode(input,errors,encoding_table) + + def decode(self,input,errors='strict'): + return codecs.charmap_decode(input,errors,decoding_table) + +class IncrementalEncoder(codecs.IncrementalEncoder): + def encode(self, input, final=False): + return codecs.charmap_encode(input,self.errors,encoding_table)[0] + +class IncrementalDecoder(codecs.IncrementalDecoder): + def decode(self, input, final=False): + return codecs.charmap_decode(input,self.errors,decoding_table)[0] + +class StreamWriter(Codec,codecs.StreamWriter): + pass + +class StreamReader(Codec,codecs.StreamReader): + pass + +### encodings module API + +def getregentry(): + return codecs.CodecInfo( + name='cp273', + encode=Codec().encode, + decode=Codec().decode, + incrementalencoder=IncrementalEncoder, + incrementaldecoder=IncrementalDecoder, + streamreader=StreamReader, + streamwriter=StreamWriter, + ) + + +### Decoding Table + +decoding_table = ( + '\x00' # 0x00 -> NULL (NUL) + '\x01' # 0x01 -> START OF HEADING (SOH) + '\x02' # 0x02 -> START OF TEXT (STX) + '\x03' # 0x03 -> END OF TEXT (ETX) + '\x9c' # 0x04 -> STRING TERMINATOR (ST) + '\t' # 0x05 -> CHARACTER TABULATION (HT) + '\x86' # 0x06 -> START OF SELECTED AREA (SSA) + '\x7f' # 0x07 -> DELETE (DEL) + '\x97' # 0x08 -> END OF GUARDED AREA (EPA) + '\x8d' # 0x09 -> REVERSE LINE FEED (RI) + '\x8e' # 0x0A -> SINGLE-SHIFT TWO (SS2) + '\x0b' # 0x0B -> LINE TABULATION (VT) + '\x0c' # 0x0C -> FORM FEED (FF) + '\r' # 0x0D -> CARRIAGE RETURN (CR) + '\x0e' # 0x0E -> SHIFT OUT (SO) + '\x0f' # 0x0F -> SHIFT IN (SI) + '\x10' # 0x10 -> DATALINK ESCAPE (DLE) + '\x11' # 0x11 -> DEVICE CONTROL ONE (DC1) + '\x12' # 0x12 -> DEVICE CONTROL TWO (DC2) + '\x13' # 0x13 -> DEVICE CONTROL THREE (DC3) + '\x9d' # 0x14 -> OPERATING SYSTEM COMMAND (OSC) + '\x85' # 0x15 -> NEXT LINE (NEL) + '\x08' # 0x16 -> BACKSPACE (BS) + '\x87' # 0x17 -> END OF SELECTED AREA (ESA) + '\x18' # 0x18 -> CANCEL (CAN) + '\x19' # 0x19 -> END OF MEDIUM (EM) + '\x92' # 0x1A -> PRIVATE USE TWO (PU2) + '\x8f' # 0x1B -> SINGLE-SHIFT THREE (SS3) + '\x1c' # 0x1C -> FILE SEPARATOR (IS4) + '\x1d' # 0x1D -> GROUP SEPARATOR (IS3) + '\x1e' # 0x1E -> RECORD SEPARATOR (IS2) + '\x1f' # 0x1F -> UNIT SEPARATOR (IS1) + '\x80' # 0x20 -> PADDING CHARACTER (PAD) + '\x81' # 0x21 -> HIGH OCTET PRESET (HOP) + '\x82' # 0x22 -> BREAK PERMITTED HERE (BPH) + '\x83' # 0x23 -> NO BREAK HERE (NBH) + '\x84' # 0x24 -> INDEX (IND) + '\n' # 0x25 -> LINE FEED (LF) + '\x17' # 0x26 -> END OF TRANSMISSION BLOCK (ETB) + '\x1b' # 0x27 -> ESCAPE (ESC) + '\x88' # 0x28 -> CHARACTER TABULATION SET (HTS) + '\x89' # 0x29 -> CHARACTER TABULATION WITH JUSTIFICATION (HTJ) + '\x8a' # 0x2A -> LINE TABULATION SET (VTS) + '\x8b' # 0x2B -> PARTIAL LINE FORWARD (PLD) + '\x8c' # 0x2C -> PARTIAL LINE BACKWARD (PLU) + '\x05' # 0x2D -> ENQUIRY (ENQ) + '\x06' # 0x2E -> ACKNOWLEDGE (ACK) + '\x07' # 0x2F -> BELL (BEL) + '\x90' # 0x30 -> DEVICE CONTROL STRING (DCS) + '\x91' # 0x31 -> PRIVATE USE ONE (PU1) + '\x16' # 0x32 -> SYNCHRONOUS IDLE (SYN) + '\x93' # 0x33 -> SET TRANSMIT STATE (STS) + '\x94' # 0x34 -> CANCEL CHARACTER (CCH) + '\x95' # 0x35 -> MESSAGE WAITING (MW) + '\x96' # 0x36 -> START OF GUARDED AREA (SPA) + '\x04' # 0x37 -> END OF TRANSMISSION (EOT) + '\x98' # 0x38 -> START OF STRING (SOS) + '\x99' # 0x39 -> SINGLE GRAPHIC CHARACTER INTRODUCER (SGCI) + '\x9a' # 0x3A -> SINGLE CHARACTER INTRODUCER (SCI) + '\x9b' # 0x3B -> CONTROL SEQUENCE 
INTRODUCER (CSI) + '\x14' # 0x3C -> DEVICE CONTROL FOUR (DC4) + '\x15' # 0x3D -> NEGATIVE ACKNOWLEDGE (NAK) + '\x9e' # 0x3E -> PRIVACY MESSAGE (PM) + '\x1a' # 0x3F -> SUBSTITUTE (SUB) + ' ' # 0x40 -> SPACE + '\xa0' # 0x41 -> NO-BREAK SPACE + '\xe2' # 0x42 -> LATIN SMALL LETTER A WITH CIRCUMFLEX + '{' # 0x43 -> LEFT CURLY BRACKET + '\xe0' # 0x44 -> LATIN SMALL LETTER A WITH GRAVE + '\xe1' # 0x45 -> LATIN SMALL LETTER A WITH ACUTE + '\xe3' # 0x46 -> LATIN SMALL LETTER A WITH TILDE + '\xe5' # 0x47 -> LATIN SMALL LETTER A WITH RING ABOVE + '\xe7' # 0x48 -> LATIN SMALL LETTER C WITH CEDILLA + '\xf1' # 0x49 -> LATIN SMALL LETTER N WITH TILDE + '\xc4' # 0x4A -> LATIN CAPITAL LETTER A WITH DIAERESIS + '.' # 0x4B -> FULL STOP + '<' # 0x4C -> LESS-THAN SIGN + '(' # 0x4D -> LEFT PARENTHESIS + '+' # 0x4E -> PLUS SIGN + '!' # 0x4F -> EXCLAMATION MARK + '&' # 0x50 -> AMPERSAND + '\xe9' # 0x51 -> LATIN SMALL LETTER E WITH ACUTE + '\xea' # 0x52 -> LATIN SMALL LETTER E WITH CIRCUMFLEX + '\xeb' # 0x53 -> LATIN SMALL LETTER E WITH DIAERESIS + '\xe8' # 0x54 -> LATIN SMALL LETTER E WITH GRAVE + '\xed' # 0x55 -> LATIN SMALL LETTER I WITH ACUTE + '\xee' # 0x56 -> LATIN SMALL LETTER I WITH CIRCUMFLEX + '\xef' # 0x57 -> LATIN SMALL LETTER I WITH DIAERESIS + '\xec' # 0x58 -> LATIN SMALL LETTER I WITH GRAVE + '~' # 0x59 -> TILDE + '\xdc' # 0x5A -> LATIN CAPITAL LETTER U WITH DIAERESIS + '$' # 0x5B -> DOLLAR SIGN + '*' # 0x5C -> ASTERISK + ')' # 0x5D -> RIGHT PARENTHESIS + ';' # 0x5E -> SEMICOLON + '^' # 0x5F -> CIRCUMFLEX ACCENT + '-' # 0x60 -> HYPHEN-MINUS + '/' # 0x61 -> SOLIDUS + '\xc2' # 0x62 -> LATIN CAPITAL LETTER A WITH CIRCUMFLEX + '[' # 0x63 -> LEFT SQUARE BRACKET + '\xc0' # 0x64 -> LATIN CAPITAL LETTER A WITH GRAVE + '\xc1' # 0x65 -> LATIN CAPITAL LETTER A WITH ACUTE + '\xc3' # 0x66 -> LATIN CAPITAL LETTER A WITH TILDE + '\xc5' # 0x67 -> LATIN CAPITAL LETTER A WITH RING ABOVE + '\xc7' # 0x68 -> LATIN CAPITAL LETTER C WITH CEDILLA + '\xd1' # 0x69 -> LATIN CAPITAL LETTER N WITH TILDE + '\xf6' # 0x6A -> LATIN SMALL LETTER O WITH DIAERESIS + ',' # 0x6B -> COMMA + '%' # 0x6C -> PERCENT SIGN + '_' # 0x6D -> LOW LINE + '>' # 0x6E -> GREATER-THAN SIGN + '?' 
# 0x6F -> QUESTION MARK + '\xf8' # 0x70 -> LATIN SMALL LETTER O WITH STROKE + '\xc9' # 0x71 -> LATIN CAPITAL LETTER E WITH ACUTE + '\xca' # 0x72 -> LATIN CAPITAL LETTER E WITH CIRCUMFLEX + '\xcb' # 0x73 -> LATIN CAPITAL LETTER E WITH DIAERESIS + '\xc8' # 0x74 -> LATIN CAPITAL LETTER E WITH GRAVE + '\xcd' # 0x75 -> LATIN CAPITAL LETTER I WITH ACUTE + '\xce' # 0x76 -> LATIN CAPITAL LETTER I WITH CIRCUMFLEX + '\xcf' # 0x77 -> LATIN CAPITAL LETTER I WITH DIAERESIS + '\xcc' # 0x78 -> LATIN CAPITAL LETTER I WITH GRAVE + '`' # 0x79 -> GRAVE ACCENT + ':' # 0x7A -> COLON + '#' # 0x7B -> NUMBER SIGN + '\xa7' # 0x7C -> SECTION SIGN + "'" # 0x7D -> APOSTROPHE + '=' # 0x7E -> EQUALS SIGN + '"' # 0x7F -> QUOTATION MARK + '\xd8' # 0x80 -> LATIN CAPITAL LETTER O WITH STROKE + 'a' # 0x81 -> LATIN SMALL LETTER A + 'b' # 0x82 -> LATIN SMALL LETTER B + 'c' # 0x83 -> LATIN SMALL LETTER C + 'd' # 0x84 -> LATIN SMALL LETTER D + 'e' # 0x85 -> LATIN SMALL LETTER E + 'f' # 0x86 -> LATIN SMALL LETTER F + 'g' # 0x87 -> LATIN SMALL LETTER G + 'h' # 0x88 -> LATIN SMALL LETTER H + 'i' # 0x89 -> LATIN SMALL LETTER I + '\xab' # 0x8A -> LEFT-POINTING DOUBLE ANGLE QUOTATION MARK + '\xbb' # 0x8B -> RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK + '\xf0' # 0x8C -> LATIN SMALL LETTER ETH (Icelandic) + '\xfd' # 0x8D -> LATIN SMALL LETTER Y WITH ACUTE + '\xfe' # 0x8E -> LATIN SMALL LETTER THORN (Icelandic) + '\xb1' # 0x8F -> PLUS-MINUS SIGN + '\xb0' # 0x90 -> DEGREE SIGN + 'j' # 0x91 -> LATIN SMALL LETTER J + 'k' # 0x92 -> LATIN SMALL LETTER K + 'l' # 0x93 -> LATIN SMALL LETTER L + 'm' # 0x94 -> LATIN SMALL LETTER M + 'n' # 0x95 -> LATIN SMALL LETTER N + 'o' # 0x96 -> LATIN SMALL LETTER O + 'p' # 0x97 -> LATIN SMALL LETTER P + 'q' # 0x98 -> LATIN SMALL LETTER Q + 'r' # 0x99 -> LATIN SMALL LETTER R + '\xaa' # 0x9A -> FEMININE ORDINAL INDICATOR + '\xba' # 0x9B -> MASCULINE ORDINAL INDICATOR + '\xe6' # 0x9C -> LATIN SMALL LETTER AE + '\xb8' # 0x9D -> CEDILLA + '\xc6' # 0x9E -> LATIN CAPITAL LETTER AE + '\xa4' # 0x9F -> CURRENCY SIGN + '\xb5' # 0xA0 -> MICRO SIGN + '\xdf' # 0xA1 -> LATIN SMALL LETTER SHARP S (German) + 's' # 0xA2 -> LATIN SMALL LETTER S + 't' # 0xA3 -> LATIN SMALL LETTER T + 'u' # 0xA4 -> LATIN SMALL LETTER U + 'v' # 0xA5 -> LATIN SMALL LETTER V + 'w' # 0xA6 -> LATIN SMALL LETTER W + 'x' # 0xA7 -> LATIN SMALL LETTER X + 'y' # 0xA8 -> LATIN SMALL LETTER Y + 'z' # 0xA9 -> LATIN SMALL LETTER Z + '\xa1' # 0xAA -> INVERTED EXCLAMATION MARK + '\xbf' # 0xAB -> INVERTED QUESTION MARK + '\xd0' # 0xAC -> LATIN CAPITAL LETTER ETH (Icelandic) + '\xdd' # 0xAD -> LATIN CAPITAL LETTER Y WITH ACUTE + '\xde' # 0xAE -> LATIN CAPITAL LETTER THORN (Icelandic) + '\xae' # 0xAF -> REGISTERED SIGN + '\xa2' # 0xB0 -> CENT SIGN + '\xa3' # 0xB1 -> POUND SIGN + '\xa5' # 0xB2 -> YEN SIGN + '\xb7' # 0xB3 -> MIDDLE DOT + '\xa9' # 0xB4 -> COPYRIGHT SIGN + '@' # 0xB5 -> COMMERCIAL AT + '\xb6' # 0xB6 -> PILCROW SIGN + '\xbc' # 0xB7 -> VULGAR FRACTION ONE QUARTER + '\xbd' # 0xB8 -> VULGAR FRACTION ONE HALF + '\xbe' # 0xB9 -> VULGAR FRACTION THREE QUARTERS + '\xac' # 0xBA -> NOT SIGN + '|' # 0xBB -> VERTICAL LINE + '\u203e' # 0xBC -> OVERLINE + '\xa8' # 0xBD -> DIAERESIS + '\xb4' # 0xBE -> ACUTE ACCENT + '\xd7' # 0xBF -> MULTIPLICATION SIGN + '\xe4' # 0xC0 -> LATIN SMALL LETTER A WITH DIAERESIS + 'A' # 0xC1 -> LATIN CAPITAL LETTER A + 'B' # 0xC2 -> LATIN CAPITAL LETTER B + 'C' # 0xC3 -> LATIN CAPITAL LETTER C + 'D' # 0xC4 -> LATIN CAPITAL LETTER D + 'E' # 0xC5 -> LATIN CAPITAL LETTER E + 'F' # 0xC6 -> LATIN CAPITAL LETTER F + 'G' # 0xC7 -> LATIN CAPITAL 
LETTER G + 'H' # 0xC8 -> LATIN CAPITAL LETTER H + 'I' # 0xC9 -> LATIN CAPITAL LETTER I + '\xad' # 0xCA -> SOFT HYPHEN + '\xf4' # 0xCB -> LATIN SMALL LETTER O WITH CIRCUMFLEX + '\xa6' # 0xCC -> BROKEN BAR + '\xf2' # 0xCD -> LATIN SMALL LETTER O WITH GRAVE + '\xf3' # 0xCE -> LATIN SMALL LETTER O WITH ACUTE + '\xf5' # 0xCF -> LATIN SMALL LETTER O WITH TILDE + '\xfc' # 0xD0 -> LATIN SMALL LETTER U WITH DIAERESIS + 'J' # 0xD1 -> LATIN CAPITAL LETTER J + 'K' # 0xD2 -> LATIN CAPITAL LETTER K + 'L' # 0xD3 -> LATIN CAPITAL LETTER L + 'M' # 0xD4 -> LATIN CAPITAL LETTER M + 'N' # 0xD5 -> LATIN CAPITAL LETTER N + 'O' # 0xD6 -> LATIN CAPITAL LETTER O + 'P' # 0xD7 -> LATIN CAPITAL LETTER P + 'Q' # 0xD8 -> LATIN CAPITAL LETTER Q + 'R' # 0xD9 -> LATIN CAPITAL LETTER R + '\xb9' # 0xDA -> SUPERSCRIPT ONE + '\xfb' # 0xDB -> LATIN SMALL LETTER U WITH CIRCUMFLEX + '}' # 0xDC -> RIGHT CURLY BRACKET + '\xf9' # 0xDD -> LATIN SMALL LETTER U WITH GRAVE + '\xfa' # 0xDE -> LATIN SMALL LETTER U WITH ACUTE + '\xff' # 0xDF -> LATIN SMALL LETTER Y WITH DIAERESIS + '\xd6' # 0xE0 -> LATIN CAPITAL LETTER O WITH DIAERESIS + '\xf7' # 0xE1 -> DIVISION SIGN + 'S' # 0xE2 -> LATIN CAPITAL LETTER S + 'T' # 0xE3 -> LATIN CAPITAL LETTER T + 'U' # 0xE4 -> LATIN CAPITAL LETTER U + 'V' # 0xE5 -> LATIN CAPITAL LETTER V + 'W' # 0xE6 -> LATIN CAPITAL LETTER W + 'X' # 0xE7 -> LATIN CAPITAL LETTER X + 'Y' # 0xE8 -> LATIN CAPITAL LETTER Y + 'Z' # 0xE9 -> LATIN CAPITAL LETTER Z + '\xb2' # 0xEA -> SUPERSCRIPT TWO + '\xd4' # 0xEB -> LATIN CAPITAL LETTER O WITH CIRCUMFLEX + '\\' # 0xEC -> REVERSE SOLIDUS + '\xd2' # 0xED -> LATIN CAPITAL LETTER O WITH GRAVE + '\xd3' # 0xEE -> LATIN CAPITAL LETTER O WITH ACUTE + '\xd5' # 0xEF -> LATIN CAPITAL LETTER O WITH TILDE + '0' # 0xF0 -> DIGIT ZERO + '1' # 0xF1 -> DIGIT ONE + '2' # 0xF2 -> DIGIT TWO + '3' # 0xF3 -> DIGIT THREE + '4' # 0xF4 -> DIGIT FOUR + '5' # 0xF5 -> DIGIT FIVE + '6' # 0xF6 -> DIGIT SIX + '7' # 0xF7 -> DIGIT SEVEN + '8' # 0xF8 -> DIGIT EIGHT + '9' # 0xF9 -> DIGIT NINE + '\xb3' # 0xFA -> SUPERSCRIPT THREE + '\xdb' # 0xFB -> LATIN CAPITAL LETTER U WITH CIRCUMFLEX + ']' # 0xFC -> RIGHT SQUARE BRACKET + '\xd9' # 0xFD -> LATIN CAPITAL LETTER U WITH GRAVE + '\xda' # 0xFE -> LATIN CAPITAL LETTER U WITH ACUTE + '\x9f' # 0xFF -> APPLICATION PROGRAM COMMAND (APC) +) + +### Encoding table +encoding_table=codecs.charmap_build(decoding_table) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1831,7 +1831,7 @@ # 0-127 s = bytes(range(128)) for encoding in ( - 'cp037', 'cp1026', + 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', 'cp863', 'cp865', 'cp866', @@ -1859,7 +1859,7 @@ # 128-255 s = bytes(range(128, 256)) for encoding in ( - 'cp037', 'cp1026', + 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', 'cp863', 'cp865', 'cp866', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,7 +10,7 @@ Core and Builtins ----------------- -- Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. +- Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. Patch by Andrei Dorian Duma. - Issue #17936: Fix O(n**2) behaviour when adding or removing many subclasses @@ -34,6 +34,9 @@ Library ------- +- Issue #1097797: Added CP273 encoding, used on IBM mainframes in + Germany and Austria. Mapping provided by Michael Bierenfeld. 
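As a quick illustration (not part of the diff) of what the newly merged cp273 codec provides, a sketch using byte values taken from the decoding table shown above (0x81 -> 'a', 0xC0 -> U+00E4, 0xA1 -> U+00DF):

    # Decode EBCDIC CP273 bytes to text and encode them back.
    assert b"\x81\xc0\xa1".decode("cp273") == "a\xe4\xdf"   # 'a', 'ä', 'ß'
    assert "a\xe4\xdf".encode("cp273") == b"\x81\xc0\xa1"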
+ - Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. - Issue #19378: Fixed a number of cases in the dis module where the new -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:06:12 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 10 Nov 2013 20:06:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319261=3A_Added_su?= =?utf-8?q?pport_for_writing_24-bit_samples_in_the_sunau_module=2E?= Message-ID: <3dHl784kVrz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/d2cc6254d399 changeset: 87038:d2cc6254d399 parent: 87033:fa2581bbef44 user: Serhiy Storchaka date: Sun Nov 10 21:02:53 2013 +0200 summary: Issue #19261: Added support for writing 24-bit samples in the sunau module. files: Doc/library/sunau.rst | 3 + Lib/sunau.py | 5 ++- Lib/test/audiodata/pluck-pcm24.au | Bin Lib/test/test_sunau.py | 28 ++++++++++++++++++ Misc/NEWS | 2 + 5 files changed, 37 insertions(+), 1 deletions(-) diff --git a/Doc/library/sunau.rst b/Doc/library/sunau.rst --- a/Doc/library/sunau.rst +++ b/Doc/library/sunau.rst @@ -212,6 +212,9 @@ Set the sample width (in bytes.) + .. versionchanged:: 3.4 + Added support for 24-bit samples. + .. method:: AU_write.setframerate(n) diff --git a/Lib/sunau.py b/Lib/sunau.py --- a/Lib/sunau.py +++ b/Lib/sunau.py @@ -352,7 +352,7 @@ def setsampwidth(self, sampwidth): if self._nframeswritten: raise Error('cannot change parameters after starting to write') - if sampwidth not in (1, 2, 4): + if sampwidth not in (1, 2, 3, 4): raise Error('bad sample width') self._sampwidth = sampwidth @@ -465,6 +465,9 @@ elif self._sampwidth == 2: encoding = AUDIO_FILE_ENCODING_LINEAR_16 self._framesize = 2 + elif self._sampwidth == 3: + encoding = AUDIO_FILE_ENCODING_LINEAR_24 + self._framesize = 3 elif self._sampwidth == 4: encoding = AUDIO_FILE_ENCODING_LINEAR_32 self._framesize = 4 diff --git a/Lib/test/audiodata/pluck-pcm24.au b/Lib/test/audiodata/pluck-pcm24.au new file mode 100644 index 0000000000000000000000000000000000000000..0bb230418a3844b81637c7b4b1ea3f18f3bfe2d6 GIT binary patch [stripped] diff --git a/Lib/test/test_sunau.py b/Lib/test/test_sunau.py --- a/Lib/test/test_sunau.py +++ b/Lib/test/test_sunau.py @@ -47,6 +47,34 @@ """) +class SunauPCM24Test(audiotests.AudioWriteTests, + audiotests.AudioTestsWithSourceFile, + unittest.TestCase): + module = sunau + sndfilename = 'pluck-pcm24.au' + sndfilenframes = 3307 + nchannels = 2 + sampwidth = 3 + framerate = 11025 + nframes = 48 + comptype = 'NONE' + compname = 'not compressed' + frames = bytes.fromhex("""\ + 022D65FFEB9D 4B5A0F00FA54 3113C304EE2B 80DCD6084303 \ + CBDEC006B261 48A99803F2F8 BFE82401B07D 036BFBFE7B5D \ + B85756FA3EC9 B4B055F3502B 299830EBCB62 1A5CA7E6D99A \ + EDFA3EE491BD C625EBE27884 0E05A9E0B6CF EF2929E02922 \ + 5758D8E27067 FB3557E83E16 1377BFEF8402 D82C5BF7272A \ + 978F16FB7745 F5F865FC1013 086635FB9C4E DF30FCFB40EE \ + 117FE0FA3438 3EE6B8FB5AC3 BC77A3FCB2F4 66D6DAFF5F32 \ + CF13B9041275 431D69097A8C C1BB600EC74E 5120B912A2BA \ + EEDF641754C0 8207001664B7 7FFFFF14453F 8000001294E6 \ + 499C1B0EB3B2 52B73E0DBCA0 EFB2B20F5FD8 CE3CDB0FBE12 \ + E4B49C0CEA2D 6344A80A5A7C 08C8FE0A1FFE 2BB9860B0A0E \ + 51486F0E44E1 8BCC64113B05 B6F4EC0EEB36 4413170A5B48 \ + """) + + class SunauPCM32Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,8 @@ Library ------- +- Issue #19261: Added support for writing 24-bit 
samples in the sunau module. + - Issue #1097797: Added CP273 encoding, used on IBM mainframes in Germany and Austria. Mapping provided by Michael Bierenfeld. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:06:14 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 10 Nov 2013 20:06:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dHl7B0n8Vz7M2q@mail.python.org> http://hg.python.org/cpython/rev/80aa6a1e38e7 changeset: 87039:80aa6a1e38e7 parent: 87038:d2cc6254d399 parent: 87037:466549b1cad8 user: Serhiy Storchaka date: Sun Nov 10 21:05:38 2013 +0200 summary: Merge heads files: Doc/library/socket.rst | 4 +- Lib/test/test_socket.py | 16 ++++ Modules/socketmodule.c | 112 +++++++++++++++++++++++++++- 3 files changed, 128 insertions(+), 4 deletions(-) diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -594,7 +594,7 @@ both the value of *address_family* and the underlying implementation of :c:func:`inet_pton`. - Availability: Unix (maybe not all platforms). + Availability: Unix (maybe not all platforms), Windows. .. function:: inet_ntop(address_family, packed_ip) @@ -610,7 +610,7 @@ specified address family, :exc:`ValueError` will be raised. A :exc:`OSError` is raised for errors from the call to :func:`inet_ntop`. - Availability: Unix (maybe not all platforms). + Availability: Unix (maybe not all platforms), Windows. .. diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -980,6 +980,14 @@ return except ImportError: return + + if sys.platform == "win32": + try: + inet_pton(AF_INET6, '::') + except OSError as e: + if e.winerror == 10022: + return # IPv6 might not be installed on this PC + f = lambda a: inet_pton(AF_INET6, a) assertInvalid = lambda a: self.assertRaises( (OSError, ValueError), f, a @@ -1058,6 +1066,14 @@ return except ImportError: return + + if sys.platform == "win32": + try: + inet_ntop(AF_INET6, b'\x00' * 16) + except OSError as e: + if e.winerror == 10022: + return # IPv6 might not be installed on this PC + f = lambda a: inet_ntop(AF_INET6, a) assertInvalid = lambda a: self.assertRaises( (OSError, ValueError), f, a diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -5023,7 +5023,7 @@ return PyUnicode_FromString(inet_ntoa(packed_addr)); } -#ifdef HAVE_INET_PTON +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) PyDoc_STRVAR(inet_pton_doc, "inet_pton(af, ip) -> packed IP address string\n\ @@ -5031,6 +5031,10 @@ Convert an IP address from string format to a packed string suitable\n\ for use with low-level network functions."); +#endif + +#ifdef HAVE_INET_PTON + static PyObject * socket_inet_pton(PyObject *self, PyObject *args) { @@ -5075,12 +5079,52 @@ return NULL; } } +#elif defined(MS_WINDOWS) + +static PyObject * +socket_inet_pton(PyObject *self, PyObject *args) +{ + int af; + char* ip; + struct sockaddr_in6 addr; + INT ret, size; + + if (!PyArg_ParseTuple(args, "is:inet_pton", &af, &ip)) { + return NULL; + } + + size = sizeof(addr); + ret = WSAStringToAddressA(ip, af, NULL, (LPSOCKADDR)&addr, &size); + + if (ret) { + PyErr_SetExcFromWindowsErr(PyExc_OSError, WSAGetLastError()); + return NULL; + } else if(af == AF_INET) { + struct sockaddr_in *addr4 = (struct sockaddr_in*)&addr; + return 
PyBytes_FromStringAndSize((const char *)&(addr4->sin_addr), + sizeof(addr4->sin_addr)); + } else if (af == AF_INET6) { + return PyBytes_FromStringAndSize((const char *)&(addr.sin6_addr), + sizeof(addr.sin6_addr)); + } else { + PyErr_SetString(PyExc_OSError, "unknown address family"); + return NULL; + } +} + +#endif + +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) PyDoc_STRVAR(inet_ntop_doc, "inet_ntop(af, packed_ip) -> string formatted IP address\n\ \n\ Convert a packed IP address of the given family to string format."); +#endif + + +#ifdef HAVE_INET_PTON static PyObject * socket_inet_ntop(PyObject *self, PyObject *args) { @@ -5134,6 +5178,70 @@ return NULL; } +#elif defined(MS_WINDOWS) + +static PyObject * +socket_inet_ntop(PyObject *self, PyObject *args) +{ + int af; + char* packed; + int len; + struct sockaddr_in6 addr; + DWORD addrlen, ret, retlen; +#ifdef ENABLE_IPV6 + char ip[Py_MAX(INET_ADDRSTRLEN, INET6_ADDRSTRLEN) + 1]; +#else + char ip[INET_ADDRSTRLEN + 1]; +#endif + + /* Guarantee NUL-termination for PyUnicode_FromString() below */ + memset((void *) &ip[0], '\0', sizeof(ip)); + + if (!PyArg_ParseTuple(args, "iy#:inet_ntop", &af, &packed, &len)) { + return NULL; + } + + if (af == AF_INET) { + struct sockaddr_in * addr4 = (struct sockaddr_in *)&addr; + + if (len != sizeof(struct in_addr)) { + PyErr_SetString(PyExc_ValueError, + "invalid length of packed IP address string"); + return NULL; + } + memset(addr4, 0, sizeof(struct sockaddr_in)); + addr4->sin_family = AF_INET; + memcpy(&(addr4->sin_addr), packed, sizeof(addr4->sin_addr)); + addrlen = sizeof(struct sockaddr_in); + } else if (af == AF_INET6) { + if (len != sizeof(struct in6_addr)) { + PyErr_SetString(PyExc_ValueError, + "invalid length of packed IP address string"); + return NULL; + } + + memset(&addr, 0, sizeof(addr)); + addr.sin6_family = AF_INET6; + memcpy(&(addr.sin6_addr), packed, sizeof(addr.sin6_addr)); + addrlen = sizeof(addr); + } else { + PyErr_Format(PyExc_ValueError, + "unknown address family %d", af); + return NULL; + } + + retlen = sizeof(ip); + ret = WSAAddressToStringA((struct sockaddr*)&addr, addrlen, NULL, + ip, &retlen); + + if (ret) { + PyErr_SetExcFromWindowsErr(PyExc_OSError, WSAGetLastError()); + return NULL; + } else { + return PyUnicode_FromString(ip); + } +} + #endif /* HAVE_INET_PTON */ /* Python interface to getaddrinfo(host, port). */ @@ -5600,7 +5708,7 @@ METH_VARARGS, inet_aton_doc}, {"inet_ntoa", socket_inet_ntoa, METH_VARARGS, inet_ntoa_doc}, -#ifdef HAVE_INET_PTON +#if defined(HAVE_INET_PTON) || defined(MS_WINDOWS) {"inet_pton", socket_inet_pton, METH_VARARGS, inet_pton_doc}, {"inet_ntop", socket_inet_ntop, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:15:06 2013 From: python-checkins at python.org (jason.coombs) Date: Sun, 10 Nov 2013 20:15:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_Misc/NEWS_for_Issue?= =?utf-8?q?_=237171?= Message-ID: <3dHlKQ0VFrz7Lkj@mail.python.org> http://hg.python.org/cpython/rev/31fe38f95c82 changeset: 87040:31fe38f95c82 user: Jason R. Coombs date: Sun Nov 10 14:13:44 2013 -0500 summary: Update Misc/NEWS for Issue #7171 files: Misc/NEWS | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,9 @@ Library ------- +- Issue #7171: Add Windows implementation of ``inet_ntop`` and ``inet_pton`` + to socket module. Patch by Atsuo Ishimoto. 
+ - Issue #19261: Added support for writing 24-bit samples in the sunau module. - Issue #1097797: Added CP273 encoding, used on IBM mainframes in -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 20:45:10 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 10 Nov 2013 20:45:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316685=3A_Added_su?= =?utf-8?q?pport_for_any_bytes-like_objects_in_the_audioop_module=2E?= Message-ID: <3dHm061MzRz7M3Q@mail.python.org> http://hg.python.org/cpython/rev/bab0cbf86835 changeset: 87041:bab0cbf86835 user: Serhiy Storchaka date: Sun Nov 10 21:44:36 2013 +0200 summary: Issue #16685: Added support for any bytes-like objects in the audioop module. Removed support for strings. files: Doc/library/audioop.rst | 8 +- Lib/test/test_audioop.py | 102 +++- Misc/NEWS | 3 + Modules/audioop.c | 668 +++++++++++++++----------- 4 files changed, 486 insertions(+), 295 deletions(-) diff --git a/Doc/library/audioop.rst b/Doc/library/audioop.rst --- a/Doc/library/audioop.rst +++ b/Doc/library/audioop.rst @@ -7,12 +7,16 @@ The :mod:`audioop` module contains some useful operations on sound fragments. It operates on sound fragments consisting of signed integer samples 8, 16, 24 -or 32 bits wide, stored in bytes objects. All scalar items are integers, -unless specified otherwise. +or 32 bits wide, stored in :term:`bytes-like object`\ s. All scalar items are +integers, unless specified otherwise. .. versionchanged:: 3.4 Support for 24-bit samples was added. +.. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted by all functions in this + module. Strings no more supported. + .. index:: single: Intel/DVI ADPCM single: ADPCM, Intel/DVI diff --git a/Lib/test/test_audioop.py b/Lib/test/test_audioop.py --- a/Lib/test/test_audioop.py +++ b/Lib/test/test_audioop.py @@ -34,6 +34,8 @@ def test_max(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.max(b'', w), 0) + self.assertEqual(audioop.max(bytearray(), w), 0) + self.assertEqual(audioop.max(memoryview(b''), w), 0) p = packs[w] self.assertEqual(audioop.max(p(5), w), 5) self.assertEqual(audioop.max(p(5, -8, -1), w), 8) @@ -45,6 +47,10 @@ for w in 1, 2, 3, 4: self.assertEqual(audioop.minmax(b'', w), (0x7fffffff, -0x80000000)) + self.assertEqual(audioop.minmax(bytearray(), w), + (0x7fffffff, -0x80000000)) + self.assertEqual(audioop.minmax(memoryview(b''), w), + (0x7fffffff, -0x80000000)) p = packs[w] self.assertEqual(audioop.minmax(p(5), w), (5, 5)) self.assertEqual(audioop.minmax(p(5, -8, -1), w), (-8, 5)) @@ -58,6 +64,8 @@ def test_maxpp(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.maxpp(b'', w), 0) + self.assertEqual(audioop.maxpp(bytearray(), w), 0) + self.assertEqual(audioop.maxpp(memoryview(b''), w), 0) self.assertEqual(audioop.maxpp(packs[w](*range(100)), w), 0) self.assertEqual(audioop.maxpp(packs[w](9, 10, 5, 5, 0, 1), w), 10) self.assertEqual(audioop.maxpp(datas[w], w), @@ -66,6 +74,8 @@ def test_avg(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.avg(b'', w), 0) + self.assertEqual(audioop.avg(bytearray(), w), 0) + self.assertEqual(audioop.avg(memoryview(b''), w), 0) p = packs[w] self.assertEqual(audioop.avg(p(5), w), 5) self .assertEqual(audioop.avg(p(5, 8), w), 6) @@ -82,6 +92,8 @@ def test_avgpp(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.avgpp(b'', w), 0) + self.assertEqual(audioop.avgpp(bytearray(), w), 0) + self.assertEqual(audioop.avgpp(memoryview(b''), w), 0) 
self.assertEqual(audioop.avgpp(packs[w](*range(100)), w), 0) self.assertEqual(audioop.avgpp(packs[w](9, 10, 5, 5, 0, 1), w), 10) self.assertEqual(audioop.avgpp(datas[1], 1), 196) @@ -92,6 +104,8 @@ def test_rms(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.rms(b'', w), 0) + self.assertEqual(audioop.rms(bytearray(), w), 0) + self.assertEqual(audioop.rms(memoryview(b''), w), 0) p = packs[w] self.assertEqual(audioop.rms(p(*range(100)), w), 57) self.assertAlmostEqual(audioop.rms(p(maxvalues[w]) * 5, w), @@ -106,6 +120,8 @@ def test_cross(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.cross(b'', w), -1) + self.assertEqual(audioop.cross(bytearray(), w), -1) + self.assertEqual(audioop.cross(memoryview(b''), w), -1) p = packs[w] self.assertEqual(audioop.cross(p(0, 1, 2), w), 0) self.assertEqual(audioop.cross(p(1, 2, -3, -4), w), 1) @@ -116,6 +132,8 @@ def test_add(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.add(b'', b'', w), b'') + self.assertEqual(audioop.add(bytearray(), bytearray(), w), b'') + self.assertEqual(audioop.add(memoryview(b''), memoryview(b''), w), b'') self.assertEqual(audioop.add(datas[w], b'\0' * len(datas[w]), w), datas[w]) self.assertEqual(audioop.add(datas[1], datas[1], 1), @@ -133,6 +151,8 @@ for w in 1, 2, 3, 4: for bias in 0, 1, -1, 127, -128, 0x7fffffff, -0x80000000: self.assertEqual(audioop.bias(b'', w, bias), b'') + self.assertEqual(audioop.bias(bytearray(), w, bias), b'') + self.assertEqual(audioop.bias(memoryview(b''), w, bias), b'') self.assertEqual(audioop.bias(datas[1], 1, 1), b'\x01\x13\x46\xbc\x80\x81\x00') self.assertEqual(audioop.bias(datas[1], 1, -1), @@ -176,6 +196,10 @@ def test_lin2lin(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.lin2lin(datas[w], w, w), datas[w]) + self.assertEqual(audioop.lin2lin(bytearray(datas[w]), w, w), + datas[w]) + self.assertEqual(audioop.lin2lin(memoryview(datas[w]), w, w), + datas[w]) self.assertEqual(audioop.lin2lin(datas[1], 1, 2), packs[2](0, 0x1200, 0x4500, -0x4500, 0x7f00, -0x8000, -0x100)) @@ -211,6 +235,10 @@ def test_adpcm2lin(self): self.assertEqual(audioop.adpcm2lin(b'\x07\x7f\x7f', 1, None), (b'\x00\x00\x00\xff\x00\xff', (-179, 40))) + self.assertEqual(audioop.adpcm2lin(bytearray(b'\x07\x7f\x7f'), 1, None), + (b'\x00\x00\x00\xff\x00\xff', (-179, 40))) + self.assertEqual(audioop.adpcm2lin(memoryview(b'\x07\x7f\x7f'), 1, None), + (b'\x00\x00\x00\xff\x00\xff', (-179, 40))) self.assertEqual(audioop.adpcm2lin(b'\x07\x7f\x7f', 2, None), (packs[2](0, 0xb, 0x29, -0x16, 0x72, -0xb3), (-179, 40))) self.assertEqual(audioop.adpcm2lin(b'\x07\x7f\x7f', 3, None), @@ -228,6 +256,10 @@ def test_lin2adpcm(self): self.assertEqual(audioop.lin2adpcm(datas[1], 1, None), (b'\x07\x7f\x7f', (-221, 39))) + self.assertEqual(audioop.lin2adpcm(bytearray(datas[1]), 1, None), + (b'\x07\x7f\x7f', (-221, 39))) + self.assertEqual(audioop.lin2adpcm(memoryview(datas[1]), 1, None), + (b'\x07\x7f\x7f', (-221, 39))) for w in 2, 3, 4: self.assertEqual(audioop.lin2adpcm(datas[w], w, None), (b'\x07\x7f\x7f', (31, 39))) @@ -240,6 +272,10 @@ def test_lin2alaw(self): self.assertEqual(audioop.lin2alaw(datas[1], 1), b'\xd5\x87\xa4\x24\xaa\x2a\x5a') + self.assertEqual(audioop.lin2alaw(bytearray(datas[1]), 1), + b'\xd5\x87\xa4\x24\xaa\x2a\x5a') + self.assertEqual(audioop.lin2alaw(memoryview(datas[1]), 1), + b'\xd5\x87\xa4\x24\xaa\x2a\x5a') for w in 2, 3, 4: self.assertEqual(audioop.lin2alaw(datas[w], w), b'\xd5\x87\xa4\x24\xaa\x2a\x55') @@ -250,8 +286,10 @@ src = [-688, -720, -2240, -4032, -9, -3, -1, -27, -244, -82, -106, 688, 720, 2240, 
4032, 9, 3, 1, 27, 244, 82, 106] for w in 1, 2, 3, 4: - self.assertEqual(audioop.alaw2lin(encoded, w), - packs[w](*(x << (w * 8) >> 13 for x in src))) + decoded = packs[w](*(x << (w * 8) >> 13 for x in src)) + self.assertEqual(audioop.alaw2lin(encoded, w), decoded) + self.assertEqual(audioop.alaw2lin(bytearray(encoded), w), decoded) + self.assertEqual(audioop.alaw2lin(memoryview(encoded), w), decoded) encoded = bytes(range(256)) for w in 2, 3, 4: @@ -261,6 +299,10 @@ def test_lin2ulaw(self): self.assertEqual(audioop.lin2ulaw(datas[1], 1), b'\xff\xad\x8e\x0e\x80\x00\x67') + self.assertEqual(audioop.lin2ulaw(bytearray(datas[1]), 1), + b'\xff\xad\x8e\x0e\x80\x00\x67') + self.assertEqual(audioop.lin2ulaw(memoryview(datas[1]), 1), + b'\xff\xad\x8e\x0e\x80\x00\x67') for w in 2, 3, 4: self.assertEqual(audioop.lin2ulaw(datas[w], w), b'\xff\xad\x8e\x0e\x80\x00\x7e') @@ -271,8 +313,10 @@ src = [-8031, -4447, -1471, -495, -163, -53, -18, -6, -2, 0, 8031, 4447, 1471, 495, 163, 53, 18, 6, 2, 0] for w in 1, 2, 3, 4: - self.assertEqual(audioop.ulaw2lin(encoded, w), - packs[w](*(x << (w * 8) >> 14 for x in src))) + decoded = packs[w](*(x << (w * 8) >> 14 for x in src)) + self.assertEqual(audioop.ulaw2lin(encoded, w), decoded) + self.assertEqual(audioop.ulaw2lin(bytearray(encoded), w), decoded) + self.assertEqual(audioop.ulaw2lin(memoryview(encoded), w), decoded) # Current u-law implementation has two codes fo 0: 0x7f and 0xff. encoded = bytes(range(127)) + bytes(range(128, 256)) @@ -283,6 +327,8 @@ def test_mul(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.mul(b'', w, 2), b'') + self.assertEqual(audioop.mul(bytearray(), w, 2), b'') + self.assertEqual(audioop.mul(memoryview(b''), w, 2), b'') self.assertEqual(audioop.mul(datas[w], w, 0), b'\0' * len(datas[w])) self.assertEqual(audioop.mul(datas[w], w, 1), @@ -302,6 +348,10 @@ for w in 1, 2, 3, 4: self.assertEqual(audioop.ratecv(b'', w, 1, 8000, 8000, None), (b'', (-1, ((0, 0),)))) + self.assertEqual(audioop.ratecv(bytearray(), w, 1, 8000, 8000, None), + (b'', (-1, ((0, 0),)))) + self.assertEqual(audioop.ratecv(memoryview(b''), w, 1, 8000, 8000, None), + (b'', (-1, ((0, 0),)))) self.assertEqual(audioop.ratecv(b'', w, 5, 8000, 8000, None), (b'', (-1, ((0, 0),) * 5))) self.assertEqual(audioop.ratecv(b'', w, 1, 8000, 16000, None), @@ -326,6 +376,8 @@ def test_reverse(self): for w in 1, 2, 3, 4: self.assertEqual(audioop.reverse(b'', w), b'') + self.assertEqual(audioop.reverse(bytearray(), w), b'') + self.assertEqual(audioop.reverse(memoryview(b''), w), b'') self.assertEqual(audioop.reverse(packs[w](0, 1, 2), w), packs[w](2, 1, 0)) @@ -340,6 +392,10 @@ for k in range(w): data2[k+w::2*w] = data1[k::w] self.assertEqual(audioop.tomono(data2, w, 0.5, 0.5), data1) + self.assertEqual(audioop.tomono(bytearray(data2), w, 0.5, 0.5), + data1) + self.assertEqual(audioop.tomono(memoryview(data2), w, 0.5, 0.5), + data1) def test_tostereo(self): for w in 1, 2, 3, 4: @@ -352,14 +408,25 @@ for k in range(w): data2[k+w::2*w] = data1[k::w] self.assertEqual(audioop.tostereo(data1, w, 1, 1), data2) + self.assertEqual(audioop.tostereo(bytearray(data1), w, 1, 1), data2) + self.assertEqual(audioop.tostereo(memoryview(data1), w, 1, 1), + data2) def test_findfactor(self): self.assertEqual(audioop.findfactor(datas[2], datas[2]), 1.0) + self.assertEqual(audioop.findfactor(bytearray(datas[2]), + bytearray(datas[2])), 1.0) + self.assertEqual(audioop.findfactor(memoryview(datas[2]), + memoryview(datas[2])), 1.0) self.assertEqual(audioop.findfactor(b'\0' * len(datas[2]), datas[2]), 0.0) 
def test_findfit(self): self.assertEqual(audioop.findfit(datas[2], datas[2]), (0, 1.0)) + self.assertEqual(audioop.findfit(bytearray(datas[2]), + bytearray(datas[2])), (0, 1.0)) + self.assertEqual(audioop.findfit(memoryview(datas[2]), + memoryview(datas[2])), (0, 1.0)) self.assertEqual(audioop.findfit(datas[2], packs[2](1, 2, 0)), (1, 8038.8)) self.assertEqual(audioop.findfit(datas[2][:-2] * 5 + datas[2], datas[2]), @@ -367,11 +434,15 @@ def test_findmax(self): self.assertEqual(audioop.findmax(datas[2], 1), 5) + self.assertEqual(audioop.findmax(bytearray(datas[2]), 1), 5) + self.assertEqual(audioop.findmax(memoryview(datas[2]), 1), 5) def test_getsample(self): for w in 1, 2, 3, 4: data = packs[w](0, 1, -1, maxvalues[w], minvalues[w]) self.assertEqual(audioop.getsample(data, w, 0), 0) + self.assertEqual(audioop.getsample(bytearray(data), w, 0), 0) + self.assertEqual(audioop.getsample(memoryview(data), w, 0), 0) self.assertEqual(audioop.getsample(data, w, 1), 1) self.assertEqual(audioop.getsample(data, w, 2), -1) self.assertEqual(audioop.getsample(data, w, 3), maxvalues[w]) @@ -406,6 +477,29 @@ self.assertRaises(audioop.error, audioop.lin2alaw, data, size) self.assertRaises(audioop.error, audioop.lin2adpcm, data, size, state) + def test_string(self): + data = 'abcd' + size = 2 + self.assertRaises(TypeError, audioop.getsample, data, size, 0) + self.assertRaises(TypeError, audioop.max, data, size) + self.assertRaises(TypeError, audioop.minmax, data, size) + self.assertRaises(TypeError, audioop.avg, data, size) + self.assertRaises(TypeError, audioop.rms, data, size) + self.assertRaises(TypeError, audioop.avgpp, data, size) + self.assertRaises(TypeError, audioop.maxpp, data, size) + self.assertRaises(TypeError, audioop.cross, data, size) + self.assertRaises(TypeError, audioop.mul, data, size, 1.0) + self.assertRaises(TypeError, audioop.tomono, data, size, 0.5, 0.5) + self.assertRaises(TypeError, audioop.tostereo, data, size, 0.5, 0.5) + self.assertRaises(TypeError, audioop.add, data, data, size) + self.assertRaises(TypeError, audioop.bias, data, size, 0) + self.assertRaises(TypeError, audioop.reverse, data, size) + self.assertRaises(TypeError, audioop.lin2lin, data, size, size) + self.assertRaises(TypeError, audioop.ratecv, data, size, 1, 1, 1, None) + self.assertRaises(TypeError, audioop.lin2ulaw, data, size) + self.assertRaises(TypeError, audioop.lin2alaw, data, size) + self.assertRaises(TypeError, audioop.lin2adpcm, data, size, None) + def test_wrongsize(self): data = b'abcdefgh' state = None diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,9 @@ Library ------- +- Issue #16685: Added support for any bytes-like objects in the audioop module. + Removed support for strings. + - Issue #7171: Add Windows implementation of ``inet_ntop`` and ``inet_pton`` to socket module. Patch by Atsuo Ishimoto. 
diff --git a/Modules/audioop.c b/Modules/audioop.c --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -393,110 +393,129 @@ static PyObject * audioop_getsample(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size; + int val; - if ( !PyArg_ParseTuple(args, "s#in:getsample", &cp, &len, &size, &i) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*in:getsample", &view, &size, &i)) return NULL; - if ( i < 0 || i >= len/size ) { + if (!audioop_check_parameters(view.len, size)) + goto error; + if (i < 0 || i >= view.len/size) { PyErr_SetString(AudioopError, "Index out of range"); - return 0; + goto error; } - return PyLong_FromLong(GETRAWSAMPLE(size, cp, i*size)); + val = GETRAWSAMPLE(size, view.buf, i*size); + PyBuffer_Release(&view); + return PyLong_FromLong(val); + + error: + PyBuffer_Release(&view); + return NULL; } static PyObject * audioop_max(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size; unsigned int absval, max = 0; - if ( !PyArg_ParseTuple(args, "s#i:max", &cp, &len, &size) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:max", &view, &size)) return NULL; - for (i = 0; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i); + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } + for (i = 0; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i); if (val < 0) absval = (-val); else absval = val; if (absval > max) max = absval; } + PyBuffer_Release(&view); return PyLong_FromUnsignedLong(max); } static PyObject * audioop_minmax(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size; /* -1 trick below is needed on Windows to support -0x80000000 without a warning */ int min = 0x7fffffff, max = -0x7FFFFFFF-1; - if (!PyArg_ParseTuple(args, "s#i:minmax", &cp, &len, &size)) + if (!PyArg_ParseTuple(args, "y*i:minmax", &view, &size)) return NULL; - if (!audioop_check_parameters(len, size)) + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); return NULL; - for (i = 0; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i); + } + for (i = 0; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i); if (val > max) max = val; if (val < min) min = val; } + PyBuffer_Release(&view); return Py_BuildValue("(ii)", min, max); } static PyObject * audioop_avg(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size, avg; double sum = 0.0; - if ( !PyArg_ParseTuple(args, "s#i:avg", &cp, &len, &size) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:avg", &view, &size)) return NULL; - for (i = 0; i < len; i += size) - sum += GETRAWSAMPLE(size, cp, i); - if ( len == 0 ) + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } + for (i = 0; i < view.len; i += size) + sum += GETRAWSAMPLE(size, view.buf, i); + if (view.len == 0) avg = 0; else - avg = (int)floor(sum / (double)(len/size)); + avg = (int)floor(sum / (double)(view.len/size)); + PyBuffer_Release(&view); return PyLong_FromLong(avg); } static PyObject * audioop_rms(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size; unsigned int res; double sum_squares = 0.0; - if ( 
!PyArg_ParseTuple(args, "s#i:rms", &cp, &len, &size) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:rms", &view, &size)) return NULL; - for (i = 0; i < len; i += size) { - double val = GETRAWSAMPLE(size, cp, i); + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } + for (i = 0; i < view.len; i += size) { + double val = GETRAWSAMPLE(size, view.buf, i); sum_squares += val*val; } - if ( len == 0 ) + if (view.len == 0) res = 0; else - res = (unsigned int)sqrt(sum_squares / (double)(len/size)); + res = (unsigned int)sqrt(sum_squares / (double)(view.len/size)); + PyBuffer_Release(&view); return PyLong_FromUnsignedLong(res); } -static double _sum2(short *a, short *b, Py_ssize_t len) +static double _sum2(const short *a, const short *b, Py_ssize_t len) { Py_ssize_t i; double sum = 0.0; @@ -542,29 +561,28 @@ static PyObject * audioop_findfit(PyObject *self, PyObject *args) { - short *cp1, *cp2; + Py_buffer view1; + Py_buffer view2; + const short *cp1, *cp2; Py_ssize_t len1, len2; Py_ssize_t j, best_j; double aj_m1, aj_lm1; double sum_ri_2, sum_aij_2, sum_aij_ri, result, best_result, factor; - /* Passing a short** for an 's' argument is correct only - if the string contents is aligned for interpretation - as short[]. Due to the definition of PyBytesObject, - this is currently (Python 2.6) the case. */ - if ( !PyArg_ParseTuple(args, "s#s#:findfit", - (char**)&cp1, &len1, (char**)&cp2, &len2) ) - return 0; - if ( len1 & 1 || len2 & 1 ) { + if (!PyArg_ParseTuple(args, "y*y*:findfit", &view1, &view2)) + return NULL; + if (view1.len & 1 || view2.len & 1) { PyErr_SetString(AudioopError, "Strings should be even-sized"); - return 0; + goto error; } - len1 >>= 1; - len2 >>= 1; + cp1 = (const short *)view1.buf; + len1 = view1.len >> 1; + cp2 = (const short *)view2.buf; + len2 = view2.len >> 1; - if ( len1 < len2 ) { + if (len1 < len2) { PyErr_SetString(AudioopError, "First sample should be longer"); - return 0; + goto error; } sum_ri_2 = _sum2(cp2, cp2, len2); sum_aij_2 = _sum2(cp1, cp1, len2); @@ -594,7 +612,14 @@ factor = _sum2(cp1+best_j, cp2, len2) / sum_ri_2; + PyBuffer_Release(&view1); + PyBuffer_Release(&view2); return Py_BuildValue("(nf)", best_j, factor); + + error: + PyBuffer_Release(&view1); + PyBuffer_Release(&view2); + return NULL; } /* @@ -604,28 +629,38 @@ static PyObject * audioop_findfactor(PyObject *self, PyObject *args) { - short *cp1, *cp2; - Py_ssize_t len1, len2; + Py_buffer view1; + Py_buffer view2; + const short *cp1, *cp2; + Py_ssize_t len; double sum_ri_2, sum_aij_ri, result; - if ( !PyArg_ParseTuple(args, "s#s#:findfactor", - (char**)&cp1, &len1, (char**)&cp2, &len2) ) - return 0; - if ( len1 & 1 || len2 & 1 ) { + if (!PyArg_ParseTuple(args, "y*y*:findfactor", &view1, &view2)) + return NULL; + if (view1.len & 1 || view2.len & 1) { PyErr_SetString(AudioopError, "Strings should be even-sized"); - return 0; + goto error; } - if ( len1 != len2 ) { + if (view1.len != view2.len) { PyErr_SetString(AudioopError, "Samples should be same size"); - return 0; + goto error; } - len2 >>= 1; - sum_ri_2 = _sum2(cp2, cp2, len2); - sum_aij_ri = _sum2(cp1, cp2, len2); + cp1 = (const short *)view1.buf; + cp2 = (const short *)view2.buf; + len = view1.len >> 1; + sum_ri_2 = _sum2(cp2, cp2, len); + sum_aij_ri = _sum2(cp1, cp2, len); result = sum_aij_ri / sum_ri_2; + PyBuffer_Release(&view1); + PyBuffer_Release(&view2); return PyFloat_FromDouble(result); + + error: + PyBuffer_Release(&view1); + 
PyBuffer_Release(&view2); + return NULL; } /* @@ -635,24 +670,25 @@ static PyObject * audioop_findmax(PyObject *self, PyObject *args) { - short *cp1; + Py_buffer view; + const short *cp1; Py_ssize_t len1, len2; Py_ssize_t j, best_j; double aj_m1, aj_lm1; double result, best_result; - if ( !PyArg_ParseTuple(args, "s#n:findmax", - (char**)&cp1, &len1, &len2) ) - return 0; - if ( len1 & 1 ) { + if (!PyArg_ParseTuple(args, "y*n:findmax", &view, &len2)) + return NULL; + if (view.len & 1) { PyErr_SetString(AudioopError, "Strings should be even-sized"); - return 0; + goto error; } - len1 >>= 1; + cp1 = (const short *)view.buf; + len1 = view.len >> 1; - if ( len2 < 0 || len1 < len2 ) { + if (len2 < 0 || len1 < len2) { PyErr_SetString(AudioopError, "Input sample should be longer"); - return 0; + goto error; } result = _sum2(cp1, cp1, len2); @@ -673,30 +709,39 @@ } + PyBuffer_Release(&view); return PyLong_FromSsize_t(best_j); + + error: + PyBuffer_Release(&view); + return NULL; } static PyObject * audioop_avgpp(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size, prevval, prevextremevalid = 0, prevextreme = 0; double sum = 0.0; unsigned int avg; int diff, prevdiff, nextreme = 0; - if ( !PyArg_ParseTuple(args, "s#i:avgpp", &cp, &len, &size) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:avgpp", &view, &size)) return NULL; - if (len <= size) + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } + if (view.len <= size) { + PyBuffer_Release(&view); return PyLong_FromLong(0); - prevval = GETRAWSAMPLE(size, cp, 0); + } + prevval = GETRAWSAMPLE(size, view.buf, 0); prevdiff = 17; /* Anything != 0, 1 */ - for (i = size; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i); + for (i = size; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i); if (val != prevval) { diff = val < prevval; if (prevdiff == !diff) { @@ -723,29 +768,34 @@ avg = 0; else avg = (unsigned int)(sum / (double)nextreme); + PyBuffer_Release(&view); return PyLong_FromUnsignedLong(avg); } static PyObject * audioop_maxpp(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size, prevval, prevextremevalid = 0, prevextreme = 0; unsigned int max = 0, extremediff; int diff, prevdiff; - if ( !PyArg_ParseTuple(args, "s#i:maxpp", &cp, &len, &size) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:maxpp", &view, &size)) return NULL; - if (len <= size) + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } + if (view.len <= size) { + PyBuffer_Release(&view); return PyLong_FromLong(0); - prevval = GETRAWSAMPLE(size, cp, 0); + } + prevval = GETRAWSAMPLE(size, view.buf, 0); prevdiff = 17; /* Anything != 0, 1 */ - for (i = size; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i); + for (i = size; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i); if (val != prevval) { diff = val < prevval; if (prevdiff == !diff) { @@ -769,60 +819,67 @@ prevdiff = diff; } } + PyBuffer_Release(&view); return PyLong_FromUnsignedLong(max); } static PyObject * audioop_cross(PyObject *self, PyObject *args) { - signed char *cp; - Py_ssize_t len, i; + Py_buffer view; + Py_ssize_t i; int size; int prevval; Py_ssize_t ncross; - if ( !PyArg_ParseTuple(args, "s#i:cross", &cp, &len, &size) ) - return 0; - if 
(!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:cross", &view, &size)) return NULL; + if (!audioop_check_parameters(view.len, size)) { + PyBuffer_Release(&view); + return NULL; + } ncross = -1; prevval = 17; /* Anything <> 0,1 */ - for (i = 0; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i) < 0; + for (i = 0; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i) < 0; if (val != prevval) ncross++; prevval = val; } + PyBuffer_Release(&view); return PyLong_FromSsize_t(ncross); } static PyObject * audioop_mul(PyObject *self, PyObject *args) { - signed char *cp, *ncp; - Py_ssize_t len, i; + Py_buffer view; + signed char *ncp; + Py_ssize_t i; int size; double factor, maxval, minval; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#id:mul", &cp, &len, &size, &factor ) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*id:mul", &view, &size, &factor)) return NULL; + if (!audioop_check_parameters(view.len, size)) + goto exit; maxval = (double) maxvals[size]; minval = (double) minvals[size]; - rv = PyBytes_FromStringAndSize(NULL, len); - if ( rv == 0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, view.len); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); - for (i = 0; i < len; i += size) { - double val = GETRAWSAMPLE(size, cp, i); + for (i = 0; i < view.len; i += size) { + double val = GETRAWSAMPLE(size, view.buf, i); val *= factor; val = floor(fbound(val, minval, maxval)); SETRAWSAMPLE(size, ncp, i, (int)val); } + exit: + PyBuffer_Release(&view); return rv; } @@ -834,31 +891,26 @@ Py_ssize_t len, i; int size; double fac1, fac2, maxval, minval; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s*idd:tomono", - &pcp, &size, &fac1, &fac2 ) ) - return 0; + if (!PyArg_ParseTuple(args, "y*idd:tomono", + &pcp, &size, &fac1, &fac2)) + return NULL; cp = pcp.buf; len = pcp.len; - if (!audioop_check_parameters(len, size)) { - PyBuffer_Release(&pcp); - return NULL; - } + if (!audioop_check_parameters(len, size)) + goto exit; if (((len / size) & 1) != 0) { PyErr_SetString(AudioopError, "not a whole number of frames"); - PyBuffer_Release(&pcp); - return NULL; + goto exit; } maxval = (double) maxvals[size]; minval = (double) minvals[size]; rv = PyBytes_FromStringAndSize(NULL, len/2); - if ( rv == 0 ) { - PyBuffer_Release(&pcp); - return 0; - } + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); for (i = 0; i < len; i += size*2) { @@ -868,6 +920,7 @@ val = floor(fbound(val, minval, maxval)); SETRAWSAMPLE(size, ncp, i/2, val); } + exit: PyBuffer_Release(&pcp); return rv; } @@ -875,71 +928,76 @@ static PyObject * audioop_tostereo(PyObject *self, PyObject *args) { - signed char *cp, *ncp; - Py_ssize_t len, i; + Py_buffer view; + signed char *ncp; + Py_ssize_t i; int size; double fac1, fac2, maxval, minval; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#idd:tostereo", - &cp, &len, &size, &fac1, &fac2 ) ) - return 0; - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*idd:tostereo", + &view, &size, &fac1, &fac2)) return NULL; + if (!audioop_check_parameters(view.len, size)) + goto exit; maxval = (double) maxvals[size]; minval = (double) minvals[size]; - if (len > PY_SSIZE_T_MAX/2) { + if (view.len > PY_SSIZE_T_MAX/2) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit; } - rv = PyBytes_FromStringAndSize(NULL, len*2); - if ( rv == 
0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, view.len*2); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); - for (i = 0; i < len; i += size) { - double val = GETRAWSAMPLE(size, cp, i); + for (i = 0; i < view.len; i += size) { + double val = GETRAWSAMPLE(size, view.buf, i); int val1 = (int)floor(fbound(val*fac1, minval, maxval)); int val2 = (int)floor(fbound(val*fac2, minval, maxval)); SETRAWSAMPLE(size, ncp, i*2, val1); SETRAWSAMPLE(size, ncp, i*2 + size, val2); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_add(PyObject *self, PyObject *args) { - signed char *cp1, *cp2, *ncp; - Py_ssize_t len1, len2, i; + Py_buffer view1; + Py_buffer view2; + signed char *ncp; + Py_ssize_t i; int size, minval, maxval, newval; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#s#i:add", - &cp1, &len1, &cp2, &len2, &size ) ) - return 0; - if (!audioop_check_parameters(len1, size)) + if (!PyArg_ParseTuple(args, "y*y*i:add", + &view1, &view2, &size)) return NULL; - if ( len1 != len2 ) { + if (!audioop_check_parameters(view1.len, size)) + goto exit; + if (view1.len != view2.len) { PyErr_SetString(AudioopError, "Lengths should be the same"); - return 0; + goto exit; } maxval = maxvals[size]; minval = minvals[size]; - rv = PyBytes_FromStringAndSize(NULL, len1); - if ( rv == 0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, view1.len); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); - for (i = 0; i < len1; i += size) { - int val1 = GETRAWSAMPLE(size, cp1, i); - int val2 = GETRAWSAMPLE(size, cp2, i); + for (i = 0; i < view1.len; i += size) { + int val1 = GETRAWSAMPLE(size, view1.buf, i); + int val2 = GETRAWSAMPLE(size, view2.buf, i); if (size < 4) { newval = val1 + val2; @@ -957,42 +1015,46 @@ SETRAWSAMPLE(size, ncp, i, newval); } + exit: + PyBuffer_Release(&view1); + PyBuffer_Release(&view2); return rv; } static PyObject * audioop_bias(PyObject *self, PyObject *args) { - signed char *cp, *ncp; - Py_ssize_t len, i; + Py_buffer view; + signed char *ncp; + Py_ssize_t i; int size, bias; unsigned int val = 0, mask; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#ii:bias", - &cp, &len, &size , &bias) ) - return 0; - - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*ii:bias", + &view, &size, &bias)) return NULL; - rv = PyBytes_FromStringAndSize(NULL, len); - if ( rv == 0 ) - return 0; + if (!audioop_check_parameters(view.len, size)) + goto exit; + + rv = PyBytes_FromStringAndSize(NULL, view.len); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); mask = masks[size]; - for (i = 0; i < len; i += size) { + for (i = 0; i < view.len; i += size) { if (size == 1) - val = GETINTX(unsigned char, cp, i); + val = GETINTX(unsigned char, view.buf, i); else if (size == 2) - val = GETINTX(unsigned short, cp, i); + val = GETINTX(unsigned short, view.buf, i); else if (size == 3) - val = ((unsigned int)GETINT24(cp, i)) & 0xffffffu; + val = ((unsigned int)GETINT24(view.buf, i)) & 0xffffffu; else { assert(size == 4); - val = GETINTX(PY_UINT32_T, cp, i); + val = GETINTX(PY_UINT32_T, view.buf, i); } val += (unsigned int)bias; @@ -1010,69 +1072,75 @@ SETINTX(PY_UINT32_T, ncp, i, val); } } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_reverse(PyObject *self, PyObject *args) { - signed char *cp; + Py_buffer view; unsigned char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size; - PyObject *rv; + PyObject *rv = NULL; - if ( 
!PyArg_ParseTuple(args, "s#i:reverse", - &cp, &len, &size) ) - return 0; - - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:reverse", + &view, &size)) return NULL; - rv = PyBytes_FromStringAndSize(NULL, len); - if ( rv == 0 ) - return 0; + if (!audioop_check_parameters(view.len, size)) + goto exit; + + rv = PyBytes_FromStringAndSize(NULL, view.len); + if (rv == NULL) + goto exit; ncp = (unsigned char *)PyBytes_AsString(rv); - for (i = 0; i < len; i += size) { - int val = GETRAWSAMPLE(size, cp, i); - SETRAWSAMPLE(size, ncp, len - i - size, val); + for (i = 0; i < view.len; i += size) { + int val = GETRAWSAMPLE(size, view.buf, i); + SETRAWSAMPLE(size, ncp, view.len - i - size, val); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_lin2lin(PyObject *self, PyObject *args) { - signed char *cp; + Py_buffer view; unsigned char *ncp; - Py_ssize_t len, i, j; + Py_ssize_t i, j; int size, size2; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#ii:lin2lin", - &cp, &len, &size, &size2) ) - return 0; - - if (!audioop_check_parameters(len, size)) - return NULL; - if (!audioop_check_size(size2)) + if (!PyArg_ParseTuple(args, "y*ii:lin2lin", + &view, &size, &size2)) return NULL; - if (len/size > PY_SSIZE_T_MAX/size2) { + if (!audioop_check_parameters(view.len, size)) + goto exit; + if (!audioop_check_size(size2)) + goto exit; + + if (view.len/size > PY_SSIZE_T_MAX/size2) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit; } - rv = PyBytes_FromStringAndSize(NULL, (len/size)*size2); - if ( rv == 0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, (view.len/size)*size2); + if (rv == NULL) + goto exit; ncp = (unsigned char *)PyBytes_AsString(rv); - for (i = j = 0; i < len; i += size, j += size2) { - int val = GETSAMPLE32(size, cp, i); + for (i = j = 0; i < view.len; i += size, j += size2) { + int val = GETSAMPLE32(size, view.buf, i); SETSAMPLE32(size2, ncp, j, val); } + exit: + PyBuffer_Release(&view); return rv; } @@ -1090,6 +1158,7 @@ static PyObject * audioop_ratecv(PyObject *self, PyObject *args) { + Py_buffer view; char *cp, *ncp; Py_ssize_t len; int size, nchannels, inrate, outrate, weightA, weightB; @@ -1099,15 +1168,15 @@ weightA = 1; weightB = 0; - if (!PyArg_ParseTuple(args, "s#iiiiO|ii:ratecv", &cp, &len, &size, + if (!PyArg_ParseTuple(args, "y*iiiiO|ii:ratecv", &view, &size, &nchannels, &inrate, &outrate, &state, &weightA, &weightB)) return NULL; if (!audioop_check_size(size)) - return NULL; + goto exit2; if (nchannels < 1) { PyErr_SetString(AudioopError, "# of channels should be >= 1"); - return NULL; + goto exit2; } if (size > INT_MAX / nchannels) { /* This overflow test is rigorously correct because @@ -1115,21 +1184,21 @@ from the docs for the error msg. 
*/ PyErr_SetString(PyExc_OverflowError, "width * nchannels too big for a C int"); - return NULL; + goto exit2; } bytes_per_frame = size * nchannels; if (weightA < 1 || weightB < 0) { PyErr_SetString(AudioopError, "weightA should be >= 1, weightB should be >= 0"); - return NULL; + goto exit2; } - if (len % bytes_per_frame != 0) { + if (view.len % bytes_per_frame != 0) { PyErr_SetString(AudioopError, "not a whole number of frames"); - return NULL; + goto exit2; } if (inrate <= 0 || outrate <= 0) { PyErr_SetString(AudioopError, "sampling rate not > 0"); - return NULL; + goto exit2; } /* divide inrate and outrate by their greatest common divisor */ d = gcd(inrate, outrate); @@ -1143,7 +1212,7 @@ if ((size_t)nchannels > PY_SIZE_MAX/sizeof(int)) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit2; } prev_i = (int *) PyMem_Malloc(nchannels * sizeof(int)); cur_i = (int *) PyMem_Malloc(nchannels * sizeof(int)); @@ -1152,7 +1221,7 @@ goto exit; } - len /= bytes_per_frame; /* # of frames */ + len = view.len / bytes_per_frame; /* # of frames */ if (state == Py_None) { d = -outrate; @@ -1202,6 +1271,7 @@ goto exit; } ncp = PyBytes_AsString(str); + cp = view.buf; for (;;) { while (d < 0) { @@ -1257,152 +1327,166 @@ exit: PyMem_Free(prev_i); PyMem_Free(cur_i); + exit2: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_lin2ulaw(PyObject *self, PyObject *args) { - signed char *cp; + Py_buffer view; unsigned char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#i:lin2ulaw", - &cp, &len, &size) ) - return 0 ; - - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:lin2ulaw", + &view, &size)) return NULL; - rv = PyBytes_FromStringAndSize(NULL, len/size); - if ( rv == 0 ) - return 0; + if (!audioop_check_parameters(view.len, size)) + goto exit; + + rv = PyBytes_FromStringAndSize(NULL, view.len/size); + if (rv == NULL) + goto exit; ncp = (unsigned char *)PyBytes_AsString(rv); - for (i = 0; i < len; i += size) { - int val = GETSAMPLE32(size, cp, i); + for (i = 0; i < view.len; i += size) { + int val = GETSAMPLE32(size, view.buf, i); *ncp++ = st_14linear2ulaw(val >> 18); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_ulaw2lin(PyObject *self, PyObject *args) { + Py_buffer view; unsigned char *cp; signed char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#i:ulaw2lin", - &cp, &len, &size) ) - return 0; + if (!PyArg_ParseTuple(args, "y*i:ulaw2lin", + &view, &size)) + return NULL; if (!audioop_check_size(size)) - return NULL; + goto exit; - if (len > PY_SSIZE_T_MAX/size) { + if (view.len > PY_SSIZE_T_MAX/size) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit; } - rv = PyBytes_FromStringAndSize(NULL, len*size); - if ( rv == 0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, view.len*size); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); - for (i = 0; i < len*size; i += size) { + cp = view.buf; + for (i = 0; i < view.len*size; i += size) { int val = st_ulaw2linear16(*cp++) << 16; SETSAMPLE32(size, ncp, i, val); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_lin2alaw(PyObject *self, PyObject *args) { - signed char *cp; + Py_buffer view; unsigned char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size; - PyObject *rv; + PyObject *rv = NULL; 
- if ( !PyArg_ParseTuple(args, "s#i:lin2alaw", - &cp, &len, &size) ) - return 0; - - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*i:lin2alaw", + &view, &size)) return NULL; - rv = PyBytes_FromStringAndSize(NULL, len/size); - if ( rv == 0 ) - return 0; + if (!audioop_check_parameters(view.len, size)) + goto exit; + + rv = PyBytes_FromStringAndSize(NULL, view.len/size); + if (rv == NULL) + goto exit; ncp = (unsigned char *)PyBytes_AsString(rv); - for (i = 0; i < len; i += size) { - int val = GETSAMPLE32(size, cp, i); + for (i = 0; i < view.len; i += size) { + int val = GETSAMPLE32(size, view.buf, i); *ncp++ = st_linear2alaw(val >> 19); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_alaw2lin(PyObject *self, PyObject *args) { + Py_buffer view; unsigned char *cp; signed char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size, val; - PyObject *rv; + PyObject *rv = NULL; - if ( !PyArg_ParseTuple(args, "s#i:alaw2lin", - &cp, &len, &size) ) - return 0; + if (!PyArg_ParseTuple(args, "y*i:alaw2lin", + &view, &size)) + return NULL; if (!audioop_check_size(size)) - return NULL; + goto exit; - if (len > PY_SSIZE_T_MAX/size) { + if (view.len > PY_SSIZE_T_MAX/size) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit; } - rv = PyBytes_FromStringAndSize(NULL, len*size); - if ( rv == 0 ) - return 0; + rv = PyBytes_FromStringAndSize(NULL, view.len*size); + if (rv == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(rv); + cp = view.buf; - for (i = 0; i < len*size; i += size) { + for (i = 0; i < view.len*size; i += size) { val = st_alaw2linear16(*cp++) << 16; SETSAMPLE32(size, ncp, i, val); } + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_lin2adpcm(PyObject *self, PyObject *args) { - signed char *cp; + Py_buffer view; signed char *ncp; - Py_ssize_t len, i; + Py_ssize_t i; int size, step, valpred, delta, index, sign, vpdiff, diff; - PyObject *rv, *state, *str; + PyObject *rv = NULL, *state, *str; int outputbuffer = 0, bufferstep; - if ( !PyArg_ParseTuple(args, "s#iO:lin2adpcm", - &cp, &len, &size, &state) ) - return 0; - - if (!audioop_check_parameters(len, size)) + if (!PyArg_ParseTuple(args, "y*iO:lin2adpcm", + &view, &size, &state)) return NULL; - str = PyBytes_FromStringAndSize(NULL, len/(size*2)); - if ( str == 0 ) - return 0; + if (!audioop_check_parameters(view.len, size)) + goto exit; + + str = PyBytes_FromStringAndSize(NULL, view.len/(size*2)); + if (str == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(str); /* Decode state, should have (value, step) */ @@ -1410,14 +1494,14 @@ /* First time, it seems. 
Set defaults */ valpred = 0; index = 0; - } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) - return 0; + } else if (!PyArg_ParseTuple(state, "ii", &valpred, &index)) + goto exit; step = stepsizeTable[index]; bufferstep = 1; - for (i = 0; i < len; i += size) { - int val = GETSAMPLE32(size, cp, i) >> 16; + for (i = 0; i < view.len; i += size) { + int val = GETSAMPLE32(size, view.buf, i) >> 16; /* Step 1 - compute difference with previous value */ if (val < valpred) { @@ -1488,44 +1572,48 @@ } rv = Py_BuildValue("(O(ii))", str, valpred, index); Py_DECREF(str); + exit: + PyBuffer_Release(&view); return rv; } static PyObject * audioop_adpcm2lin(PyObject *self, PyObject *args) { + Py_buffer view; signed char *cp; signed char *ncp; - Py_ssize_t len, i, outlen; + Py_ssize_t i, outlen; int size, valpred, step, delta, index, sign, vpdiff; - PyObject *rv, *str, *state; + PyObject *rv = NULL, *str, *state; int inputbuffer = 0, bufferstep; - if ( !PyArg_ParseTuple(args, "s#iO:adpcm2lin", - &cp, &len, &size, &state) ) - return 0; + if (!PyArg_ParseTuple(args, "y*iO:adpcm2lin", + &view, &size, &state)) + return NULL; if (!audioop_check_size(size)) - return NULL; + goto exit; /* Decode state, should have (value, step) */ if ( state == Py_None ) { /* First time, it seems. Set defaults */ valpred = 0; index = 0; - } else if ( !PyArg_ParseTuple(state, "ii", &valpred, &index) ) - return 0; + } else if (!PyArg_ParseTuple(state, "ii", &valpred, &index)) + goto exit; - if (len > (PY_SSIZE_T_MAX/2)/size) { + if (view.len > (PY_SSIZE_T_MAX/2)/size) { PyErr_SetString(PyExc_MemoryError, "not enough memory for output buffer"); - return 0; + goto exit; } - outlen = len*size*2; + outlen = view.len*size*2; str = PyBytes_FromStringAndSize(NULL, outlen); - if ( str == 0 ) - return 0; + if (str == NULL) + goto exit; ncp = (signed char *)PyBytes_AsString(str); + cp = view.buf; step = stepsizeTable[index]; bufferstep = 0; @@ -1580,6 +1668,8 @@ rv = Py_BuildValue("(O(ii))", str, valpred, index); Py_DECREF(str); + exit: + PyBuffer_Release(&view); return rv; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 10 21:00:23 2013 From: python-checkins at python.org (benjamin.peterson) Date: Sun, 10 Nov 2013 21:00:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_fix_Zephyr_AST_link?= Message-ID: <3dHmKg3JQHz7M4S@mail.python.org> http://hg.python.org/peps/rev/c0d120cf0aac changeset: 5263:c0d120cf0aac user: Benjamin Peterson date: Sun Nov 10 15:00:13 2013 -0500 summary: fix Zephyr AST link files: pep-0339.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0339.txt b/pep-0339.txt --- a/pep-0339.txt +++ b/pep-0339.txt @@ -559,7 +559,7 @@ 213--227, 1997. .. _The Zephyr Abstract Syntax Description Language.: - http://www.cs.princeton.edu/~danwang/Papers/dsl97/dsl97.html + http://www.cs.princeton.edu/research/techreps/TR-554-97 .. 
_SPARK: http://pages.cpsc.ucalgary.ca/~aycock/spark/ -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 10 23:56:48 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 10 Nov 2013 23:56:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_set=5Ftraceback=5F?= =?utf-8?q?limit=28=29_requires_nframe_=3E=3D_1?= Message-ID: <3dHrFD6xlDz7Ljf@mail.python.org> http://hg.python.org/peps/rev/72fc6b519f12 changeset: 5264:72fc6b519f12 user: Victor Stinner date: Sun Nov 10 23:54:37 2013 +0100 summary: PEP 454: set_traceback_limit() requires nframe >= 1 files: pep-0454.txt | 12 ++++-------- 1 files changed, 4 insertions(+), 8 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -153,6 +153,7 @@ ``set_traceback_limit(nframe: int)`` function: Set the maximum number of frames stored in the traceback of a trace. + *nframe* must be greater or equal to ``1``. Storing more frames allocates more memory of the ``tracemalloc`` module and makes Python slower. Use the ``get_tracemalloc_memory()`` @@ -163,9 +164,6 @@ grouped by ``'traceback'`` or to compute cumulative statistics: see the ``Snapshot.statistics()`` method. - If the limit is set to ``0`` frame, a traceback with a frame will be - used for all traces: filename ``''`` at line number ``0``. - Use the ``get_traceback_limit()`` function to get the current limit. The ``PYTHONTRACEMALLOC`` environment variable @@ -456,11 +454,9 @@ Sequence of ``Frame`` instances sorted from the most recent frame to the oldest frame. - A traceback contains at least ``1`` frame. If the ``tracemalloc`` - module failed to get a frame, the filename ``""`` and the - line number ``0`` are used. If it failed to get the traceback or if - the traceback limit is ``0``, the traceback only contains a frame: - filename ``''`` at line number ``0``. + A traceback contains at least ``1`` frame. If the ``tracemalloc`` module + failed to get a frame, the filename ``""`` at line number ``0`` is + used. When a snapshot is taken, tracebacks of traces are limited to ``get_traceback_limit()`` frames. See the ``take_snapshot()`` -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 11 00:22:21 2013 From: python-checkins at python.org (jason.coombs) Date: Mon, 11 Nov 2013 00:22:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Issue_19544_an?= =?utf-8?q?d_Issue_=237457=3A_Restore_the_read=5Fpkg=5Ffile_method_to?= Message-ID: <3dHrpj23m5z7Llb@mail.python.org> http://hg.python.org/cpython/rev/e19441e540ca changeset: 87042:e19441e540ca branch: 3.3 parent: 87025:79b8b7c5fe8a user: Jason R. Coombs date: Sun Nov 10 18:15:03 2013 -0500 summary: Issue 19544 and Issue #7457: Restore the read_pkg_file method to distutils.dist.DistributionMetadata accidentally removed in the undo of distutils2. files: Doc/distutils/examples.rst | 42 +++++++++ Lib/distutils/dist.py | 90 ++++++++++++++++--- Lib/distutils/tests/test_dist.py | 29 ++++++- Misc/NEWS | 4 + 4 files changed, 147 insertions(+), 18 deletions(-) diff --git a/Doc/distutils/examples.rst b/Doc/distutils/examples.rst --- a/Doc/distutils/examples.rst +++ b/Doc/distutils/examples.rst @@ -284,6 +284,48 @@ warning: check: Title underline too short. (line 2) warning: check: Could not finish the parsing. 
+Reading the metadata +===================== + +The :func:`distutils.core.setup` function provides a command-line interface +that allows you to query the metadata fields of a project through the +`setup.py` script of a given project:: + + $ python setup.py --name + distribute + +This call reads the `name` metadata by running the +:func:`distutils.core.setup` function. Although, when a source or binary +distribution is created with Distutils, the metadata fields are written +in a static file called :file:`PKG-INFO`. When a Distutils-based project is +installed in Python, the :file:`PKG-INFO` file is copied alongside the modules +and packages of the distribution under :file:`NAME-VERSION-pyX.X.egg-info`, +where `NAME` is the name of the project, `VERSION` its version as defined +in the Metadata, and `pyX.X` the major and minor version of Python like +`2.7` or `3.2`. + +You can read back this static file, by using the +:class:`distutils.dist.DistributionMetadata` class and its +:func:`read_pkg_file` method:: + + >>> from distutils.dist import DistributionMetadata + >>> metadata = DistributionMetadata() + >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) + >>> metadata.name + 'distribute' + >>> metadata.version + '0.6.8' + >>> metadata.description + 'Easily download, build, install, upgrade, and uninstall Python packages' + +Notice that the class can also be instanciated with a metadata file path to +loads its values:: + + >>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info' + >>> DistributionMetadata(pkg_info_path).name + 'distribute' + + .. % \section{Multiple extension modules} .. % \label{multiple-ext} diff --git a/Lib/distutils/dist.py b/Lib/distutils/dist.py --- a/Lib/distutils/dist.py +++ b/Lib/distutils/dist.py @@ -5,6 +5,7 @@ """ import sys, os, re +from email import message_from_file try: import warnings @@ -999,25 +1000,80 @@ "provides", "requires", "obsoletes", ) - def __init__ (self): - self.name = None - self.version = None - self.author = None - self.author_email = None + def __init__(self, path=None): + if path is not None: + self.read_pkg_file(open(path)) + else: + self.name = None + self.version = None + self.author = None + self.author_email = None + self.maintainer = None + self.maintainer_email = None + self.url = None + self.license = None + self.description = None + self.long_description = None + self.keywords = None + self.platforms = None + self.classifiers = None + self.download_url = None + # PEP 314 + self.provides = None + self.requires = None + self.obsoletes = None + + def read_pkg_file(self, file): + """Reads the metadata values from a file object.""" + msg = message_from_file(file) + + def _read_field(name): + value = msg[name] + if value == 'UNKNOWN': + return None + return value + + def _read_list(name): + values = msg.get_all(name, None) + if values == []: + return None + return values + + metadata_version = msg['metadata-version'] + self.name = _read_field('name') + self.version = _read_field('version') + self.description = _read_field('summary') + # we are filling author only. 
+ self.author = _read_field('author') self.maintainer = None + self.author_email = _read_field('author-email') self.maintainer_email = None - self.url = None - self.license = None - self.description = None - self.long_description = None - self.keywords = None - self.platforms = None - self.classifiers = None - self.download_url = None - # PEP 314 - self.provides = None - self.requires = None - self.obsoletes = None + self.url = _read_field('home-page') + self.license = _read_field('license') + + if 'download-url' in msg: + self.download_url = _read_field('download-url') + else: + self.download_url = None + + self.long_description = _read_field('description') + self.description = _read_field('summary') + + if 'keywords' in msg: + self.keywords = _read_field('keywords').split(',') + + self.platforms = _read_list('platform') + self.classifiers = _read_list('classifier') + + # PEP 314 - these fields only exist in 1.1 + if metadata_version == '1.1': + self.requires = _read_list('requires') + self.provides = _read_list('provides') + self.obsoletes = _read_list('obsoletes') + else: + self.requires = None + self.provides = None + self.obsoletes = None def write_pkg_info(self, base_dir): """Write the PKG-INFO file into the release tree. diff --git a/Lib/distutils/tests/test_dist.py b/Lib/distutils/tests/test_dist.py --- a/Lib/distutils/tests/test_dist.py +++ b/Lib/distutils/tests/test_dist.py @@ -8,7 +8,7 @@ from unittest import mock -from distutils.dist import Distribution, fix_help_options +from distutils.dist import Distribution, fix_help_options, DistributionMetadata from distutils.cmd import Command from test.support import TESTFN, captured_stdout, run_unittest @@ -388,6 +388,33 @@ self.assertTrue(output) + def test_read_metadata(self): + attrs = {"name": "package", + "version": "1.0", + "long_description": "desc", + "description": "xxx", + "download_url": "http://example.com", + "keywords": ['one', 'two'], + "requires": ['foo']} + + dist = Distribution(attrs) + metadata = dist.metadata + + # write it then reloads it + PKG_INFO = io.StringIO() + metadata.write_pkg_file(PKG_INFO) + PKG_INFO.seek(0) + metadata.read_pkg_file(PKG_INFO) + + self.assertEquals(metadata.name, "package") + self.assertEquals(metadata.version, "1.0") + self.assertEquals(metadata.description, "xxx") + self.assertEquals(metadata.download_url, 'http://example.com') + self.assertEquals(metadata.keywords, ['one', 'two']) + self.assertEquals(metadata.platforms, ['UNKNOWN']) + self.assertEquals(metadata.obsoletes, None) + self.assertEquals(metadata.requires, ['foo']) + def test_suite(): suite = unittest.TestSuite() suite.addTest(unittest.makeSuite(DistributionTestCase)) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,10 @@ Library ------- +- Issue #19544 and Issue #7457: Restore the read_pkg_file method to + distutils.dist.DistributionMetadata accidentally removed in the undo of + distutils2. + - Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
- Issue #19480: HTMLParser now accepts all valid start-tag names as defined -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 00:22:22 2013 From: python-checkins at python.org (jason.coombs) Date: Mon, 11 Nov 2013 00:22:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E3_for_Issue_=2319544_and_Issue_=237457?= Message-ID: <3dHrpk5BNBz7M5P@mail.python.org> http://hg.python.org/cpython/rev/28059d8b395b changeset: 87043:28059d8b395b parent: 87041:bab0cbf86835 parent: 87042:e19441e540ca user: Jason R. Coombs date: Sun Nov 10 18:21:49 2013 -0500 summary: Merge with 3.3 for Issue #19544 and Issue #7457 files: Doc/distutils/examples.rst | 42 +++++++++ Lib/distutils/dist.py | 90 ++++++++++++++++--- Lib/distutils/tests/test_dist.py | 29 ++++++- Misc/NEWS | 4 + 4 files changed, 147 insertions(+), 18 deletions(-) diff --git a/Doc/distutils/examples.rst b/Doc/distutils/examples.rst --- a/Doc/distutils/examples.rst +++ b/Doc/distutils/examples.rst @@ -284,6 +284,48 @@ warning: check: Title underline too short. (line 2) warning: check: Could not finish the parsing. +Reading the metadata +===================== + +The :func:`distutils.core.setup` function provides a command-line interface +that allows you to query the metadata fields of a project through the +`setup.py` script of a given project:: + + $ python setup.py --name + distribute + +This call reads the `name` metadata by running the +:func:`distutils.core.setup` function. Although, when a source or binary +distribution is created with Distutils, the metadata fields are written +in a static file called :file:`PKG-INFO`. When a Distutils-based project is +installed in Python, the :file:`PKG-INFO` file is copied alongside the modules +and packages of the distribution under :file:`NAME-VERSION-pyX.X.egg-info`, +where `NAME` is the name of the project, `VERSION` its version as defined +in the Metadata, and `pyX.X` the major and minor version of Python like +`2.7` or `3.2`. + +You can read back this static file, by using the +:class:`distutils.dist.DistributionMetadata` class and its +:func:`read_pkg_file` method:: + + >>> from distutils.dist import DistributionMetadata + >>> metadata = DistributionMetadata() + >>> metadata.read_pkg_file(open('distribute-0.6.8-py2.7.egg-info')) + >>> metadata.name + 'distribute' + >>> metadata.version + '0.6.8' + >>> metadata.description + 'Easily download, build, install, upgrade, and uninstall Python packages' + +Notice that the class can also be instanciated with a metadata file path to +loads its values:: + + >>> pkg_info_path = 'distribute-0.6.8-py2.7.egg-info' + >>> DistributionMetadata(pkg_info_path).name + 'distribute' + + .. % \section{Multiple extension modules} .. 
% \label{multiple-ext} diff --git a/Lib/distutils/dist.py b/Lib/distutils/dist.py --- a/Lib/distutils/dist.py +++ b/Lib/distutils/dist.py @@ -5,6 +5,7 @@ """ import sys, os, re +from email import message_from_file try: import warnings @@ -999,25 +1000,80 @@ "provides", "requires", "obsoletes", ) - def __init__ (self): - self.name = None - self.version = None - self.author = None - self.author_email = None + def __init__(self, path=None): + if path is not None: + self.read_pkg_file(open(path)) + else: + self.name = None + self.version = None + self.author = None + self.author_email = None + self.maintainer = None + self.maintainer_email = None + self.url = None + self.license = None + self.description = None + self.long_description = None + self.keywords = None + self.platforms = None + self.classifiers = None + self.download_url = None + # PEP 314 + self.provides = None + self.requires = None + self.obsoletes = None + + def read_pkg_file(self, file): + """Reads the metadata values from a file object.""" + msg = message_from_file(file) + + def _read_field(name): + value = msg[name] + if value == 'UNKNOWN': + return None + return value + + def _read_list(name): + values = msg.get_all(name, None) + if values == []: + return None + return values + + metadata_version = msg['metadata-version'] + self.name = _read_field('name') + self.version = _read_field('version') + self.description = _read_field('summary') + # we are filling author only. + self.author = _read_field('author') self.maintainer = None + self.author_email = _read_field('author-email') self.maintainer_email = None - self.url = None - self.license = None - self.description = None - self.long_description = None - self.keywords = None - self.platforms = None - self.classifiers = None - self.download_url = None - # PEP 314 - self.provides = None - self.requires = None - self.obsoletes = None + self.url = _read_field('home-page') + self.license = _read_field('license') + + if 'download-url' in msg: + self.download_url = _read_field('download-url') + else: + self.download_url = None + + self.long_description = _read_field('description') + self.description = _read_field('summary') + + if 'keywords' in msg: + self.keywords = _read_field('keywords').split(',') + + self.platforms = _read_list('platform') + self.classifiers = _read_list('classifier') + + # PEP 314 - these fields only exist in 1.1 + if metadata_version == '1.1': + self.requires = _read_list('requires') + self.provides = _read_list('provides') + self.obsoletes = _read_list('obsoletes') + else: + self.requires = None + self.provides = None + self.obsoletes = None def write_pkg_info(self, base_dir): """Write the PKG-INFO file into the release tree. 
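For orientation, a minimal round-trip of the restored API, mirroring the test added below (a sketch only, not part of the patch; the attribute values are invented and it assumes an interpreter that already carries this change):

    import io
    from distutils.dist import Distribution

    dist = Distribution({'name': 'example', 'version': '0.1',
                         'description': 'A tiny example package'})

    # write_pkg_file() emits a PKG-INFO document; read_pkg_file() parses one
    # back into the metadata attributes via email.message_from_file().
    buf = io.StringIO()
    dist.metadata.write_pkg_file(buf)
    buf.seek(0)
    dist.metadata.read_pkg_file(buf)

    print(dist.metadata.name)         # example
    print(dist.metadata.version)      # 0.1
    print(dist.metadata.description)  # A tiny example package

Passing a path to DistributionMetadata(path) is simply a shortcut for calling read_pkg_file() on the opened file, as the examples.rst section quoted above describes.
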
diff --git a/Lib/distutils/tests/test_dist.py b/Lib/distutils/tests/test_dist.py --- a/Lib/distutils/tests/test_dist.py +++ b/Lib/distutils/tests/test_dist.py @@ -8,7 +8,7 @@ from unittest import mock -from distutils.dist import Distribution, fix_help_options +from distutils.dist import Distribution, fix_help_options, DistributionMetadata from distutils.cmd import Command from test.support import TESTFN, captured_stdout, run_unittest @@ -388,6 +388,33 @@ self.assertTrue(output) + def test_read_metadata(self): + attrs = {"name": "package", + "version": "1.0", + "long_description": "desc", + "description": "xxx", + "download_url": "http://example.com", + "keywords": ['one', 'two'], + "requires": ['foo']} + + dist = Distribution(attrs) + metadata = dist.metadata + + # write it then reloads it + PKG_INFO = io.StringIO() + metadata.write_pkg_file(PKG_INFO) + PKG_INFO.seek(0) + metadata.read_pkg_file(PKG_INFO) + + self.assertEquals(metadata.name, "package") + self.assertEquals(metadata.version, "1.0") + self.assertEquals(metadata.description, "xxx") + self.assertEquals(metadata.download_url, 'http://example.com') + self.assertEquals(metadata.keywords, ['one', 'two']) + self.assertEquals(metadata.platforms, ['UNKNOWN']) + self.assertEquals(metadata.obsoletes, None) + self.assertEquals(metadata.requires, ['foo']) + def test_suite(): suite = unittest.TestSuite() suite.addTest(unittest.makeSuite(DistributionTestCase)) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,10 @@ Library ------- +- Issue #19544 and Issue #7457: Restore the read_pkg_file method to + distutils.dist.DistributionMetadata accidentally removed in the undo of + distutils2. + - Issue #16685: Added support for any bytes-like objects in the audioop module. Removed support for strings. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 01:00:39 2013 From: python-checkins at python.org (jason.coombs) Date: Mon, 11 Nov 2013 01:00:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTQ0?= =?utf-8?q?_and_Issue_=236286=3A_Restore_use_of_urllib_over_http_allowing_?= =?utf-8?q?use_of?= Message-ID: <3dHsfv3bkXz7LmR@mail.python.org> http://hg.python.org/cpython/rev/5e98c4e9c909 changeset: 87044:5e98c4e9c909 branch: 3.3 parent: 87042:e19441e540ca user: Jason R. Coombs date: Sun Nov 10 18:50:10 2013 -0500 summary: Issue #19544 and Issue #6286: Restore use of urllib over http allowing use of http_proxy for Distutils upload command, a feature accidentally lost in the rollback of distutils2. 
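Why the switch restores proxy support: urlopen() builds its default opener with a ProxyHandler populated from the process environment, which the previous http.client-based code never consulted. A rough sketch of the mechanism (hypothetical proxy URL; body and headers reduced to placeholders):

    import os
    import urllib.request

    os.environ['http_proxy'] = 'http://proxy.example.invalid:3128'  # illustrative
    print(urllib.request.getproxies())
    # e.g. {'http': 'http://proxy.example.invalid:3128'}

    request = urllib.request.Request(
        'http://pypi.python.org/pypi',
        data=b'<multipart/form-data body>',
        headers={'Content-type': 'multipart/form-data; boundary=X',
                 'Authorization': 'Basic <credentials>'},
    )
    # urllib.request.urlopen(request) would now send the POST through the
    # configured proxy; http.client.HTTPConnection knows nothing about it.
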
files: Lib/distutils/command/upload.py | 58 ++++++++-------- Lib/distutils/tests/test_upload.py | 63 +++++++---------- Misc/NEWS | 4 + 3 files changed, 58 insertions(+), 67 deletions(-) diff --git a/Lib/distutils/command/upload.py b/Lib/distutils/command/upload.py --- a/Lib/distutils/command/upload.py +++ b/Lib/distutils/command/upload.py @@ -10,10 +10,9 @@ import os, io import socket import platform -import configparser -import http.client as httpclient from base64 import standard_b64encode -import urllib.parse +from urllib.request import urlopen, Request, HTTPError +from urllib.parse import urlparse # this keeps compatibility for 2.3 and 2.4 if sys.version < "2.5": @@ -66,6 +65,15 @@ self.upload_file(command, pyversion, filename) def upload_file(self, command, pyversion, filename): + # Makes sure the repository URL is compliant + schema, netloc, url, params, query, fragments = \ + urlparse(self.repository) + if params or query or fragments: + raise AssertionError("Incompatible url %s" % self.repository) + + if schema not in ('http', 'https'): + raise AssertionError("unsupported schema " + schema) + # Sign if requested if self.sign: gpg_args = ["gpg", "--detach-sign", "-a", filename] @@ -162,41 +170,31 @@ self.announce("Submitting %s to %s" % (filename, self.repository), log.INFO) # build the Request - # We can't use urllib since we need to send the Basic - # auth right with the first request - # TODO(jhylton): Can we fix urllib? - schema, netloc, url, params, query, fragments = \ - urllib.parse.urlparse(self.repository) - assert not params and not query and not fragments - if schema == 'http': - http = httpclient.HTTPConnection(netloc) - elif schema == 'https': - http = httpclient.HTTPSConnection(netloc) - else: - raise AssertionError("unsupported schema "+schema) + headers = {'Content-type': + 'multipart/form-data; boundary=%s' % boundary, + 'Content-length': str(len(body)), + 'Authorization': auth} - data = '' - loglevel = log.INFO + request = Request(self.repository, data=body, + headers=headers) + # send the data try: - http.connect() - http.putrequest("POST", url) - http.putheader('Content-type', - 'multipart/form-data; boundary=%s'%boundary) - http.putheader('Content-length', str(len(body))) - http.putheader('Authorization', auth) - http.endheaders() - http.send(body) + result = urlopen(request) + status = result.getcode() + reason = result.msg except socket.error as e: self.announce(str(e), log.ERROR) return + except HTTPError as e: + status = e.code + reason = e.msg - r = http.getresponse() - if r.status == 200: - self.announce('Server response (%s): %s' % (r.status, r.reason), + if status == 200: + self.announce('Server response (%s): %s' % (status, reason), log.INFO) else: - self.announce('Upload failed (%s): %s' % (r.status, r.reason), + self.announce('Upload failed (%s): %s' % (status, reason), log.ERROR) if self.show_response: - msg = '\n'.join(('-' * 75, r.read(), '-' * 75)) + msg = '\n'.join(('-' * 75, result.read(), '-' * 75)) self.announce(msg, log.INFO) diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -1,9 +1,9 @@ """Tests for distutils.command.upload.""" import os import unittest -import http.client as httpclient from test.support import run_unittest +from distutils.command import upload as upload_mod from distutils.command.upload import upload from distutils.core import Distribution @@ -37,48 +37,37 @@ [server1] username:me """ -class 
Response(object): - def __init__(self, status=200, reason='OK'): - self.status = status - self.reason = reason -class FakeConnection(object): +class FakeOpen(object): - def __init__(self): - self.requests = [] - self.headers = [] - self.body = '' + def __init__(self, url): + self.url = url + if not isinstance(url, str): + self.req = url + else: + self.req = None + self.msg = 'OK' - def __call__(self, netloc): - return self + def getcode(self): + return 200 - def connect(self): - pass - endheaders = connect - - def putrequest(self, method, url): - self.requests.append((method, url)) - - def putheader(self, name, value): - self.headers.append((name, value)) - - def send(self, body): - self.body = body - - def getresponse(self): - return Response() class uploadTestCase(PyPIRCCommandTestCase): def setUp(self): super(uploadTestCase, self).setUp() - self.old_class = httpclient.HTTPConnection - self.conn = httpclient.HTTPConnection = FakeConnection() + self.old_open = upload_mod.urlopen + upload_mod.urlopen = self._urlopen + self.last_open = None def tearDown(self): - httpclient.HTTPConnection = self.old_class + upload_mod.urlopen = self.old_open super(uploadTestCase, self).tearDown() + def _urlopen(self, url): + self.last_open = FakeOpen(url) + return self.last_open + def test_finalize_options(self): # new format @@ -122,14 +111,14 @@ cmd.ensure_finalized() cmd.run() - # what did we send ? - headers = dict(self.conn.headers) + # what did we send ? + headers = dict(self.last_open.req.headers) self.assertEqual(headers['Content-length'], '2087') - self.assertTrue(headers['Content-type'].startswith('multipart/form-data')) - self.assertFalse('\n' in headers['Authorization']) - - self.assertEqual(self.conn.requests, [('POST', '/pypi')]) - self.assertTrue((b'xxx') in self.conn.body) + self.assert_(headers['Content-type'].startswith('multipart/form-data')) + self.assertEquals(self.last_open.req.get_method(), 'POST') + self.assertEquals(self.last_open.req.get_full_url(), + 'http://pypi.python.org/pypi') + self.assert_(b'xxx' in self.last_open.req.data) def test_suite(): return unittest.makeSuite(uploadTestCase) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,10 @@ Library ------- +- Issue #19544 and Issue #6286: Restore use of urllib over http allowing use + of http_proxy for Distutils upload command, a feature accidentally lost + in the rollback of distutils2. + - Issue #19544 and Issue #7457: Restore the read_pkg_file method to distutils.dist.DistributionMetadata accidentally removed in the undo of distutils2. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 01:00:40 2013 From: python-checkins at python.org (jason.coombs) Date: Mon, 11 Nov 2013 01:00:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E3_for_Issue_=2319544_and_Issue_=236286=2E?= =?utf-8?q?_Merge_is_untested=2E_I_was?= Message-ID: <3dHsfw6nYmz7M5n@mail.python.org> http://hg.python.org/cpython/rev/b1244046f37a changeset: 87045:b1244046f37a parent: 87043:28059d8b395b parent: 87044:5e98c4e9c909 user: Jason R. Coombs date: Sun Nov 10 18:59:44 2013 -0500 summary: Merge with 3.3 for Issue #19544 and Issue #6286. Merge is untested. I was unable to test due to bab0cbf86835. 
files: Lib/distutils/command/upload.py | 58 +++++++------- Lib/distutils/tests/test_upload.py | 69 +++++++---------- Misc/NEWS | 4 + 3 files changed, 61 insertions(+), 70 deletions(-) diff --git a/Lib/distutils/command/upload.py b/Lib/distutils/command/upload.py --- a/Lib/distutils/command/upload.py +++ b/Lib/distutils/command/upload.py @@ -10,10 +10,9 @@ import os, io import socket import platform -import configparser -import http.client as httpclient from base64 import standard_b64encode -import urllib.parse +from urllib.request import urlopen, Request, HTTPError +from urllib.parse import urlparse # this keeps compatibility for 2.3 and 2.4 if sys.version < "2.5": @@ -66,6 +65,15 @@ self.upload_file(command, pyversion, filename) def upload_file(self, command, pyversion, filename): + # Makes sure the repository URL is compliant + schema, netloc, url, params, query, fragments = \ + urlparse(self.repository) + if params or query or fragments: + raise AssertionError("Incompatible url %s" % self.repository) + + if schema not in ('http', 'https'): + raise AssertionError("unsupported schema " + schema) + # Sign if requested if self.sign: gpg_args = ["gpg", "--detach-sign", "-a", filename] @@ -162,41 +170,31 @@ self.announce("Submitting %s to %s" % (filename, self.repository), log.INFO) # build the Request - # We can't use urllib since we need to send the Basic - # auth right with the first request - # TODO(jhylton): Can we fix urllib? - schema, netloc, url, params, query, fragments = \ - urllib.parse.urlparse(self.repository) - assert not params and not query and not fragments - if schema == 'http': - http = httpclient.HTTPConnection(netloc) - elif schema == 'https': - http = httpclient.HTTPSConnection(netloc) - else: - raise AssertionError("unsupported schema "+schema) + headers = {'Content-type': + 'multipart/form-data; boundary=%s' % boundary, + 'Content-length': str(len(body)), + 'Authorization': auth} - data = '' - loglevel = log.INFO + request = Request(self.repository, data=body, + headers=headers) + # send the data try: - http.connect() - http.putrequest("POST", url) - http.putheader('Content-type', - 'multipart/form-data; boundary=%s'%boundary) - http.putheader('Content-length', str(len(body))) - http.putheader('Authorization', auth) - http.endheaders() - http.send(body) + result = urlopen(request) + status = result.getcode() + reason = result.msg except OSError as e: self.announce(str(e), log.ERROR) return + except HTTPError as e: + status = e.code + reason = e.msg - r = http.getresponse() - if r.status == 200: - self.announce('Server response (%s): %s' % (r.status, r.reason), + if status == 200: + self.announce('Server response (%s): %s' % (status, reason), log.INFO) else: - self.announce('Upload failed (%s): %s' % (r.status, r.reason), + self.announce('Upload failed (%s): %s' % (status, reason), log.ERROR) if self.show_response: - msg = '\n'.join(('-' * 75, r.read(), '-' * 75)) + msg = '\n'.join(('-' * 75, result.read(), '-' * 75)) self.announce(msg, log.INFO) diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -1,9 +1,9 @@ """Tests for distutils.command.upload.""" import os import unittest -import http.client as httpclient from test.support import run_unittest +from distutils.command import upload as upload_mod from distutils.command.upload import upload from distutils.core import Distribution @@ -37,47 +37,36 @@ [server1] username:me """ -class Response(object): - 
def __init__(self, status=200, reason='OK'): - self.status = status - self.reason = reason -class FakeConnection(object): +class FakeOpen(object): - def __init__(self): - self.requests = [] - self.headers = [] - self.body = '' + def __init__(self, url): + self.url = url + if not isinstance(url, str): + self.req = url + else: + self.req = None + self.msg = 'OK' - def __call__(self, netloc): - return self + def getcode(self): + return 200 - def connect(self): - pass - endheaders = connect - - def putrequest(self, method, url): - self.requests.append((method, url)) - - def putheader(self, name, value): - self.headers.append((name, value)) - - def send(self, body): - self.body = body - - def getresponse(self): - return Response() class uploadTestCase(PyPIRCCommandTestCase): def setUp(self): super(uploadTestCase, self).setUp() - if hasattr(httpclient, 'HTTPSConnection'): - self.addCleanup(setattr, httpclient, 'HTTPSConnection', - httpclient.HTTPSConnection) - else: - self.addCleanup(delattr, httpclient, 'HTTPSConnection') - self.conn = httpclient.HTTPSConnection = FakeConnection() + self.old_open = upload_mod.urlopen + upload_mod.urlopen = self._urlopen + self.last_open = None + + def tearDown(self): + upload_mod.urlopen = self.old_open + super(uploadTestCase, self).tearDown() + + def _urlopen(self, url): + self.last_open = FakeOpen(url) + return self.last_open def test_finalize_options(self): @@ -122,14 +111,14 @@ cmd.ensure_finalized() cmd.run() - # what did we send ? - headers = dict(self.conn.headers) + # what did we send ? + headers = dict(self.last_open.req.headers) self.assertEqual(headers['Content-length'], '2087') - self.assertTrue(headers['Content-type'].startswith('multipart/form-data')) - self.assertFalse('\n' in headers['Authorization']) - - self.assertEqual(self.conn.requests, [('POST', '/pypi')]) - self.assertTrue((b'xxx') in self.conn.body) + self.assert_(headers['Content-type'].startswith('multipart/form-data')) + self.assertEquals(self.last_open.req.get_method(), 'POST') + self.assertEquals(self.last_open.req.get_full_url(), + 'http://pypi.python.org/pypi') + self.assert_(b'xxx' in self.last_open.req.data) def test_suite(): return unittest.makeSuite(uploadTestCase) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,10 @@ Library ------- +- Issue #19544 and Issue #6286: Restore use of urllib over http allowing use + of http_proxy for Distutils upload command, a feature accidentally lost + in the rollback of distutils2. + - Issue #19544 and Issue #7457: Restore the read_pkg_file method to distutils.dist.DistributionMetadata accidentally removed in the undo of distutils2. 
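The accompanying test rewrite relies on a simple monkeypatching pattern: because upload.py now does ``from urllib.request import urlopen``, the tests replace the module-level reference (upload_mod.urlopen) with a fake that records the Request object. A self-contained toy version of the same technique (not the distutils test itself; the submit() helper and all names are invented for illustration):

    import unittest
    from unittest import mock
    from urllib.request import Request

    def submit(url, body):
        # Stand-in for upload_file(): builds a Request and sends it.
        import urllib.request
        return urllib.request.urlopen(Request(url, data=body))

    class FakeResult:
        msg = 'OK'
        def getcode(self):
            return 200

    class UploadPatternTest(unittest.TestCase):
        def test_records_request(self):
            seen = []
            def fake_urlopen(req):
                seen.append(req)     # capture the Request for later assertions
                return FakeResult()
            # The real tests patch distutils.command.upload.urlopen instead,
            # since that module imported the name directly.
            with mock.patch('urllib.request.urlopen', fake_urlopen):
                result = submit('http://pypi.python.org/pypi', b'payload')
            self.assertEqual(result.getcode(), 200)
            self.assertEqual(seen[0].get_full_url(), 'http://pypi.python.org/pypi')
            self.assertEqual(seen[0].data, b'payload')

    if __name__ == '__main__':
        unittest.main()
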
-- Repository URL: http://hg.python.org/cpython From tjreedy at udel.edu Mon Nov 11 00:48:30 2013 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 10 Nov 2013 18:48:30 -0500 Subject: [Python-checkins] cpython: #1097797: Add CP273 codec, and exercise it in the test suite In-Reply-To: <3dHkgx6mzFz7Ltb@mail.python.org> References: <3dHkgx6mzFz7Ltb@mail.python.org> Message-ID: <52801B4E.1020204@udel.edu> On 11/10/2013 1:46 PM, andrew.kuchling wrote: > http://hg.python.org/cpython/rev/7d9d1bcd7d18 > changeset: 87032:7d9d1bcd7d18 > user: Andrew Kuchling > date: Sun Nov 10 13:44:30 2013 -0500 > summary: > #1097797: Add CP273 codec, and exercise it in the test suite > > files: > Lib/encodings/cp273.py | 307 +++++++++++++++++++++++++++ > Lib/test/test_unicode.py | 4 +- > 2 files changed, 309 insertions(+), 2 deletions(-) > > > diff --git a/Lib/encodings/cp273.py b/Lib/encodings/cp273.py > new file mode 100644 > --- /dev/null > +++ b/Lib/encodings/cp273.py > @@ -0,0 +1,307 @@ > +""" Python Character Mapping Codec cp273 generated from 'python-mappings/CP273.TXT' with gencodec.py. > + > +"""#" > + > +import codecs > + > +### Codec APIs > + > +class Codec(codecs.Codec): > + > + def encode(self,input,errors='strict'): > + return codecs.charmap_encode(input,errors,encoding_table) You left out the spaces after the call commas (PEP8) > + > + def decode(self,input,errors='strict'): > + return codecs.charmap_decode(input,errors,decoding_table) > + > +class IncrementalEncoder(codecs.IncrementalEncoder): > + def encode(self, input, final=False): as included above > + return codecs.charmap_encode(input,self.errors,encoding_table)[0] > +class IncrementalDecoder(codecs.IncrementalDecoder): > + def decode(self, input, final=False): and here. > + return codecs.charmap_decode(input,self.errors,decoding_table)[0] From python-checkins at python.org Mon Nov 11 02:30:31 2013 From: python-checkins at python.org (jason.coombs) Date: Mon, 11 Nov 2013 02:30:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_failing_test_incorrect?= =?utf-8?q?ly_merged_in_b1244046f37a?= Message-ID: <3dHvfb3j8Nz7LjT@mail.python.org> http://hg.python.org/cpython/rev/394ed9deebd4 changeset: 87046:394ed9deebd4 user: Jason R. 
Coombs date: Sun Nov 10 20:28:18 2013 -0500 summary: Fix failing test incorrectly merged in b1244046f37a files: Lib/distutils/tests/test_upload.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -117,7 +117,7 @@ self.assert_(headers['Content-type'].startswith('multipart/form-data')) self.assertEquals(self.last_open.req.get_method(), 'POST') self.assertEquals(self.last_open.req.get_full_url(), - 'http://pypi.python.org/pypi') + 'https://pypi.python.org/pypi') self.assert_(b'xxx' in self.last_open.req.data) def test_suite(): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 03:46:57 2013 From: python-checkins at python.org (andrew.kuchling) Date: Mon, 11 Nov 2013 03:46:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_some_PEP8-formatting_p?= =?utf-8?q?roblems_in_the_generated_code?= Message-ID: <3dHxLn5RSgz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/6f74b664d619 changeset: 87047:6f74b664d619 user: Andrew Kuchling date: Sun Nov 10 21:45:24 2013 -0500 summary: Fix some PEP8-formatting problems in the generated code files: Tools/unicode/gencodec.py | 18 +++++++++--------- 1 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Tools/unicode/gencodec.py b/Tools/unicode/gencodec.py --- a/Tools/unicode/gencodec.py +++ b/Tools/unicode/gencodec.py @@ -290,27 +290,27 @@ class Codec(codecs.Codec): - def encode(self,input,errors='strict'): - return codecs.charmap_encode(input,errors,encoding_%s) + def encode(self, input, errors='strict'): + return codecs.charmap_encode(input, errors, encoding_%s) - def decode(self,input,errors='strict'): - return codecs.charmap_decode(input,errors,decoding_%s) + def decode(self, input, errors='strict'): + return codecs.charmap_decode(input, errors, decoding_%s) ''' % (encodingname, name, suffix, suffix)] l.append('''\ class IncrementalEncoder(codecs.IncrementalEncoder): def encode(self, input, final=False): - return codecs.charmap_encode(input,self.errors,encoding_%s)[0] + return codecs.charmap_encode(input, self.errors, encoding_%s)[0] class IncrementalDecoder(codecs.IncrementalDecoder): def decode(self, input, final=False): - return codecs.charmap_decode(input,self.errors,decoding_%s)[0]''' % + return codecs.charmap_decode(input, self.errors, decoding_%s)[0]''' % (suffix, suffix)) l.append(''' -class StreamWriter(Codec,codecs.StreamWriter): +class StreamWriter(Codec, codecs.StreamWriter): pass -class StreamReader(Codec,codecs.StreamReader): +class StreamReader(Codec, codecs.StreamReader): pass ### encodings module API @@ -343,7 +343,7 @@ if decoding_table_code: l.append(''' ### Encoding table -encoding_table=codecs.charmap_build(decoding_table) +encoding_table = codecs.charmap_build(decoding_table) ''') else: l.append(''' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 03:46:59 2013 From: python-checkins at python.org (andrew.kuchling) Date: Mon, 11 Nov 2013 03:46:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=231097797=3A_add_the_orig?= =?utf-8?q?inal_mapping_file?= Message-ID: <3dHxLq1BDTz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/93645b0b6750 changeset: 87048:93645b0b6750 user: Andrew Kuchling date: Sun Nov 10 21:46:02 2013 -0500 summary: #1097797: add the original mapping file files: Tools/unicode/python-mappings/CP273.TXT | 258 
++++++++++++ 1 files changed, 258 insertions(+), 0 deletions(-) diff --git a/Tools/unicode/python-mappings/CP273.TXT b/Tools/unicode/python-mappings/CP273.TXT new file mode 100644 --- /dev/null +++ b/Tools/unicode/python-mappings/CP273.TXT @@ -0,0 +1,258 @@ +0x00 0x0000 #NULL (NUL) +0x01 0x0001 #START OF HEADING (SOH) +0x02 0x0002 #START OF TEXT (STX) +0x03 0x0003 #END OF TEXT (ETX) +0x04 0x009C #STRING TERMINATOR (ST) +0x05 0x0009 #CHARACTER TABULATION (HT) +0x06 0x0086 #START OF SELECTED AREA (SSA) +0x07 0x007F #DELETE (DEL) +0x08 0x0097 #END OF GUARDED AREA (EPA) +0x09 0x008D #REVERSE LINE FEED (RI) +0x0A 0x008E #SINGLE-SHIFT TWO (SS2) +0x0B 0x000B #LINE TABULATION (VT) +0x0C 0x000C #FORM FEED (FF) +0x0D 0x000D #CARRIAGE RETURN (CR) +0x0E 0x000E #SHIFT OUT (SO) +0x0F 0x000F #SHIFT IN (SI) +0x10 0x0010 #DATALINK ESCAPE (DLE) +0x11 0x0011 #DEVICE CONTROL ONE (DC1) +0x12 0x0012 #DEVICE CONTROL TWO (DC2) +0x13 0x0013 #DEVICE CONTROL THREE (DC3) +0x14 0x009D #OPERATING SYSTEM COMMAND (OSC) +0x15 0x0085 #NEXT LINE (NEL) +0x16 0x0008 #BACKSPACE (BS) +0x17 0x0087 #END OF SELECTED AREA (ESA) +0x18 0x0018 #CANCEL (CAN) +0x19 0x0019 #END OF MEDIUM (EM) +0x1A 0x0092 #PRIVATE USE TWO (PU2) +0x1B 0x008F #SINGLE-SHIFT THREE (SS3) +0x1C 0x001C #FILE SEPARATOR (IS4) +0x1D 0x001D #GROUP SEPARATOR (IS3) +0x1E 0x001E #RECORD SEPARATOR (IS2) +0x1F 0x001F #UNIT SEPARATOR (IS1) +0x20 0x0080 #PADDING CHARACTER (PAD) +0x21 0x0081 #HIGH OCTET PRESET (HOP) +0x22 0x0082 #BREAK PERMITTED HERE (BPH) +0x23 0x0083 #NO BREAK HERE (NBH) +0x24 0x0084 #INDEX (IND) +0x25 0x000A #LINE FEED (LF) +0x26 0x0017 #END OF TRANSMISSION BLOCK (ETB) +0x27 0x001B #ESCAPE (ESC) +0x28 0x0088 #CHARACTER TABULATION SET (HTS) +0x29 0x0089 #CHARACTER TABULATION WITH JUSTIFICATION (HTJ) +0x2A 0x008A #LINE TABULATION SET (VTS) +0x2B 0x008B #PARTIAL LINE FORWARD (PLD) +0x2C 0x008C #PARTIAL LINE BACKWARD (PLU) +0x2D 0x0005 #ENQUIRY (ENQ) +0x2E 0x0006 #ACKNOWLEDGE (ACK) +0x2F 0x0007 #BELL (BEL) +0x30 0x0090 #DEVICE CONTROL STRING (DCS) +0x31 0x0091 #PRIVATE USE ONE (PU1) +0x32 0x0016 #SYNCHRONOUS IDLE (SYN) +0x33 0x0093 #SET TRANSMIT STATE (STS) +0x34 0x0094 #CANCEL CHARACTER (CCH) +0x35 0x0095 #MESSAGE WAITING (MW) +0x36 0x0096 #START OF GUARDED AREA (SPA) +0x37 0x0004 #END OF TRANSMISSION (EOT) +0x38 0x0098 #START OF STRING (SOS) +0x39 0x0099 #SINGLE GRAPHIC CHARACTER INTRODUCER (SGCI) +0x3A 0x009A #SINGLE CHARACTER INTRODUCER (SCI) +0x3B 0x009B #CONTROL SEQUENCE INTRODUCER (CSI) +0x3C 0x0014 #DEVICE CONTROL FOUR (DC4) +0x3D 0x0015 #NEGATIVE ACKNOWLEDGE (NAK) +0x3E 0x009E #PRIVACY MESSAGE (PM) +0x3F 0x001A #SUBSTITUTE (SUB) +0x40 0x0020 #SPACE +0x41 0x00A0 #NO-BREAK SPACE +0x42 0x00E2 #LATIN SMALL LETTER A WITH CIRCUMFLEX +0x43 0x007B #LEFT CURLY BRACKET +0x44 0x00E0 #LATIN SMALL LETTER A WITH GRAVE +0x45 0x00E1 #LATIN SMALL LETTER A WITH ACUTE +0x46 0x00E3 #LATIN SMALL LETTER A WITH TILDE +0x47 0x00E5 #LATIN SMALL LETTER A WITH RING ABOVE +0x48 0x00E7 #LATIN SMALL LETTER C WITH CEDILLA +0x49 0x00F1 #LATIN SMALL LETTER N WITH TILDE +0x4A 0x00C4 #LATIN CAPITAL LETTER A WITH DIAERESIS +0x4B 0x002E #FULL STOP +0x4C 0x003C #LESS-THAN SIGN +0x4D 0x0028 #LEFT PARENTHESIS +0x4E 0x002B #PLUS SIGN +0x4F 0x0021 #EXCLAMATION MARK +0x50 0x0026 #AMPERSAND +0x51 0x00E9 #LATIN SMALL LETTER E WITH ACUTE +0x52 0x00EA #LATIN SMALL LETTER E WITH CIRCUMFLEX +0x53 0x00EB #LATIN SMALL LETTER E WITH DIAERESIS +0x54 0x00E8 #LATIN SMALL LETTER E WITH GRAVE +0x55 0x00ED #LATIN SMALL LETTER I WITH ACUTE +0x56 0x00EE #LATIN SMALL LETTER I WITH CIRCUMFLEX +0x57 0x00EF 
#LATIN SMALL LETTER I WITH DIAERESIS +0x58 0x00EC #LATIN SMALL LETTER I WITH GRAVE +0x59 0x007E #TILDE +0x5A 0x00DC #LATIN CAPITAL LETTER U WITH DIAERESIS +0x5B 0x0024 #DOLLAR SIGN +0x5C 0x002A #ASTERISK +0x5D 0x0029 #RIGHT PARENTHESIS +0x5E 0x003B #SEMICOLON +0x5F 0x005E #CIRCUMFLEX ACCENT +0x60 0x002D #HYPHEN-MINUS +0x61 0x002F #SOLIDUS +0x62 0x00C2 #LATIN CAPITAL LETTER A WITH CIRCUMFLEX +0x63 0x005B #LEFT SQUARE BRACKET +0x64 0x00C0 #LATIN CAPITAL LETTER A WITH GRAVE +0x65 0x00C1 #LATIN CAPITAL LETTER A WITH ACUTE +0x66 0x00C3 #LATIN CAPITAL LETTER A WITH TILDE +0x67 0x00C5 #LATIN CAPITAL LETTER A WITH RING ABOVE +0x68 0x00C7 #LATIN CAPITAL LETTER C WITH CEDILLA +0x69 0x00D1 #LATIN CAPITAL LETTER N WITH TILDE +0x6A 0x00F6 #LATIN SMALL LETTER O WITH DIAERESIS +0x6B 0x002C #COMMA +0x6C 0x0025 #PERCENT SIGN +0x6D 0x005F #LOW LINE +0x6E 0x003E #GREATER-THAN SIGN +0x6F 0x003F #QUESTION MARK +0x70 0x00F8 #LATIN SMALL LETTER O WITH STROKE +0x71 0x00C9 #LATIN CAPITAL LETTER E WITH ACUTE +0x72 0x00CA #LATIN CAPITAL LETTER E WITH CIRCUMFLEX +0x73 0x00CB #LATIN CAPITAL LETTER E WITH DIAERESIS +0x74 0x00C8 #LATIN CAPITAL LETTER E WITH GRAVE +0x75 0x00CD #LATIN CAPITAL LETTER I WITH ACUTE +0x76 0x00CE #LATIN CAPITAL LETTER I WITH CIRCUMFLEX +0x77 0x00CF #LATIN CAPITAL LETTER I WITH DIAERESIS +0x78 0x00CC #LATIN CAPITAL LETTER I WITH GRAVE +0x79 0x0060 #GRAVE ACCENT +0x7A 0x003A #COLON +0x7B 0x0023 #NUMBER SIGN +0x7C 0x00A7 #SECTION SIGN +0x7D 0x0027 #APOSTROPHE +0x7E 0x003D #EQUALS SIGN +0x7F 0x0022 #QUOTATION MARK +0x80 0x00D8 #LATIN CAPITAL LETTER O WITH STROKE +0x81 0x0061 #LATIN SMALL LETTER A +0x82 0x0062 #LATIN SMALL LETTER B +0x83 0x0063 #LATIN SMALL LETTER C +0x84 0x0064 #LATIN SMALL LETTER D +0x85 0x0065 #LATIN SMALL LETTER E +0x86 0x0066 #LATIN SMALL LETTER F +0x87 0x0067 #LATIN SMALL LETTER G +0x88 0x0068 #LATIN SMALL LETTER H +0x89 0x0069 #LATIN SMALL LETTER I +0x8A 0x00AB #LEFT-POINTING DOUBLE ANGLE QUOTATION MARK +0x8B 0x00BB #RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK +0x8C 0x00F0 #LATIN SMALL LETTER ETH (Icelandic) +0x8D 0x00FD #LATIN SMALL LETTER Y WITH ACUTE +0x8E 0x00FE #LATIN SMALL LETTER THORN (Icelandic) +0x8F 0x00B1 #PLUS-MINUS SIGN +0x90 0x00B0 #DEGREE SIGN +0x91 0x006A #LATIN SMALL LETTER J +0x92 0x006B #LATIN SMALL LETTER K +0x93 0x006C #LATIN SMALL LETTER L +0x94 0x006D #LATIN SMALL LETTER M +0x95 0x006E #LATIN SMALL LETTER N +0x96 0x006F #LATIN SMALL LETTER O +0x97 0x0070 #LATIN SMALL LETTER P +0x98 0x0071 #LATIN SMALL LETTER Q +0x99 0x0072 #LATIN SMALL LETTER R +0x9A 0x00AA #FEMININE ORDINAL INDICATOR +0x9B 0x00BA #MASCULINE ORDINAL INDICATOR +0x9C 0x00E6 #LATIN SMALL LETTER AE +0x9D 0x00B8 #CEDILLA +0x9E 0x00C6 #LATIN CAPITAL LETTER AE +0x9F 0x00A4 #CURRENCY SIGN +0xA0 0x00B5 #MICRO SIGN +0xA1 0x00DF #LATIN SMALL LETTER SHARP S (German) +0xA2 0x0073 #LATIN SMALL LETTER S +0xA3 0x0074 #LATIN SMALL LETTER T +0xA4 0x0075 #LATIN SMALL LETTER U +0xA5 0x0076 #LATIN SMALL LETTER V +0xA6 0x0077 #LATIN SMALL LETTER W +0xA7 0x0078 #LATIN SMALL LETTER X +0xA8 0x0079 #LATIN SMALL LETTER Y +0xA9 0x007A #LATIN SMALL LETTER Z +0xAA 0x00A1 #INVERTED EXCLAMATION MARK +0xAB 0x00BF #INVERTED QUESTION MARK +0xAC 0x00D0 #LATIN CAPITAL LETTER ETH (Icelandic) +0xAD 0x00DD #LATIN CAPITAL LETTER Y WITH ACUTE +0xAE 0x00DE #LATIN CAPITAL LETTER THORN (Icelandic) +0xAF 0x00AE #REGISTERED SIGN +0xB0 0x00A2 #CENT SIGN +0xB1 0x00A3 #POUND SIGN +0xB2 0x00A5 #YEN SIGN +0xB3 0x00B7 #MIDDLE DOT +0xB4 0x00A9 #COPYRIGHT SIGN +0xB5 0x0040 #COMMERCIAL AT +0xB6 0x00B6 #PILCROW SIGN +0xB7 0x00BC #VULGAR 
FRACTION ONE QUARTER +0xB8 0x00BD #VULGAR FRACTION ONE HALF +0xB9 0x00BE #VULGAR FRACTION THREE QUARTERS +0xBA 0x00AC #NOT SIGN +0xBB 0x007C #VERTICAL LINE +0xBC 0x203E #OVERLINE +0xBD 0x00A8 #DIAERESIS +0xBE 0x00B4 #ACUTE ACCENT +0xBF 0x00D7 #MULTIPLICATION SIGN +0xC0 0x00E4 #LATIN SMALL LETTER A WITH DIAERESIS +0xC1 0x0041 #LATIN CAPITAL LETTER A +0xC2 0x0042 #LATIN CAPITAL LETTER B +0xC3 0x0043 #LATIN CAPITAL LETTER C +0xC4 0x0044 #LATIN CAPITAL LETTER D +0xC5 0x0045 #LATIN CAPITAL LETTER E +0xC6 0x0046 #LATIN CAPITAL LETTER F +0xC7 0x0047 #LATIN CAPITAL LETTER G +0xC8 0x0048 #LATIN CAPITAL LETTER H +0xC9 0x0049 #LATIN CAPITAL LETTER I +0xCA 0x00AD #SOFT HYPHEN +0xCB 0x00F4 #LATIN SMALL LETTER O WITH CIRCUMFLEX +0xCC 0x00A6 #BROKEN BAR +0xCD 0x00F2 #LATIN SMALL LETTER O WITH GRAVE +0xCE 0x00F3 #LATIN SMALL LETTER O WITH ACUTE +0xCF 0x00F5 #LATIN SMALL LETTER O WITH TILDE +0xD0 0x00FC #LATIN SMALL LETTER U WITH DIAERESIS +0xD1 0x004A #LATIN CAPITAL LETTER J +0xD2 0x004B #LATIN CAPITAL LETTER K +0xD3 0x004C #LATIN CAPITAL LETTER L +0xD4 0x004D #LATIN CAPITAL LETTER M +0xD5 0x004E #LATIN CAPITAL LETTER N +0xD6 0x004F #LATIN CAPITAL LETTER O +0xD7 0x0050 #LATIN CAPITAL LETTER P +0xD8 0x0051 #LATIN CAPITAL LETTER Q +0xD9 0x0052 #LATIN CAPITAL LETTER R +0xDA 0x00B9 #SUPERSCRIPT ONE +0xDB 0x00FB #LATIN SMALL LETTER U WITH CIRCUMFLEX +0xDC 0x007D #RIGHT CURLY BRACKET +0xDD 0x00F9 #LATIN SMALL LETTER U WITH GRAVE +0xDE 0x00FA #LATIN SMALL LETTER U WITH ACUTE +0xDF 0x00FF #LATIN SMALL LETTER Y WITH DIAERESIS +0xE0 0x00D6 #LATIN CAPITAL LETTER O WITH DIAERESIS +0xE1 0x00F7 #DIVISION SIGN +0xE2 0x0053 #LATIN CAPITAL LETTER S +0xE3 0x0054 #LATIN CAPITAL LETTER T +0xE4 0x0055 #LATIN CAPITAL LETTER U +0xE5 0x0056 #LATIN CAPITAL LETTER V +0xE6 0x0057 #LATIN CAPITAL LETTER W +0xE7 0x0058 #LATIN CAPITAL LETTER X +0xE8 0x0059 #LATIN CAPITAL LETTER Y +0xE9 0x005A #LATIN CAPITAL LETTER Z +0xEA 0x00B2 #SUPERSCRIPT TWO +0xEB 0x00D4 #LATIN CAPITAL LETTER O WITH CIRCUMFLEX +0xEC 0x005C #REVERSE SOLIDUS +0xED 0x00D2 #LATIN CAPITAL LETTER O WITH GRAVE +0xEE 0x00D3 #LATIN CAPITAL LETTER O WITH ACUTE +0xEF 0x00D5 #LATIN CAPITAL LETTER O WITH TILDE +0xF0 0x0030 #DIGIT ZERO +0xF1 0x0031 #DIGIT ONE +0xF2 0x0032 #DIGIT TWO +0xF3 0x0033 #DIGIT THREE +0xF4 0x0034 #DIGIT FOUR +0xF5 0x0035 #DIGIT FIVE +0xF6 0x0036 #DIGIT SIX +0xF7 0x0037 #DIGIT SEVEN +0xF8 0x0038 #DIGIT EIGHT +0xF9 0x0039 #DIGIT NINE +0xFA 0x00B3 #SUPERSCRIPT THREE +0xFB 0x00DB #LATIN CAPITAL LETTER U WITH CIRCUMFLEX +0xFC 0x005D #RIGHT SQUARE BRACKET +0xFD 0x00D9 #LATIN CAPITAL LETTER U WITH GRAVE +0xFE 0x00DA #LATIN CAPITAL LETTER U WITH ACUTE +0xFF 0x009F #APPLICATION PROGRAM COMMAND (APC) + + -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Mon Nov 11 04:49:28 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 11 Nov 2013 04:49:28 +0100 Subject: [Python-checkins] Daily reference leaks (394ed9deebd4): sum=4 Message-ID: results for 394ed9deebd4 on branch "default" -------------------------------------------- test_asyncio leaked [4, 0, 0] memory blocks, sum=4 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogwWDM4x', '-x'] From python-checkins at python.org Mon Nov 11 06:52:12 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 11 Nov 2013 06:52:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fixed_compile_error_on_Win?= =?utf-8?q?dows_caused_by_arithmetic_with_void_*_pointers?= Message-ID: 
<3dJ1SX1BpNz7Lkr@mail.python.org> http://hg.python.org/cpython/rev/35cd00465624 changeset: 87049:35cd00465624 user: Serhiy Storchaka date: Mon Nov 11 07:47:35 2013 +0200 summary: Fixed compile error on Windows caused by arithmetic with void * pointers (issue #16685). files: Modules/audioop.c | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Modules/audioop.c b/Modules/audioop.c --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -287,9 +287,9 @@ 15289, 16818, 18500, 20350, 22385, 24623, 27086, 29794, 32767 }; -#define GETINTX(T, cp, i) (*(T *)((cp) + (i))) -#define SETINTX(T, cp, i, val) do { \ - *(T *)((cp) + (i)) = (T)(val); \ +#define GETINTX(T, cp, i) (*(T *)((unsigned char *)(cp) + (i))) +#define SETINTX(T, cp, i, val) do { \ + *(T *)((unsigned char *)(cp) + (i)) = (T)(val); \ } while (0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 12:32:23 2013 From: python-checkins at python.org (kristjan.jonsson) Date: Mon, 11 Nov 2013 12:32:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=238799=3A_Reduce_ti?= =?utf-8?q?ming_sensitivity_of_condition_test_by_explicitly?= Message-ID: <3dJ9132zHDz7Lnt@mail.python.org> http://hg.python.org/cpython/rev/9e48c9538a7f changeset: 87050:9e48c9538a7f user: Kristjan Valur Jonsson date: Mon Nov 11 11:29:04 2013 +0000 summary: Issue #8799: Reduce timing sensitivity of condition test by explicitly delaying the main thread so that it doesn't race ahead of the workers. files: Lib/test/lock_tests.py | 15 +++++++++++++++ 1 files changed, 15 insertions(+), 0 deletions(-) diff --git a/Lib/test/lock_tests.py b/Lib/test/lock_tests.py --- a/Lib/test/lock_tests.py +++ b/Lib/test/lock_tests.py @@ -418,6 +418,17 @@ self.assertRaises(RuntimeError, cond.notify) def _check_notify(self, cond): + # Note that this test is sensitive to timing. If the worker threads + # don't execute in a timely fashion, the main thread may think they + # are further along than they are. The main thread therefore issues + # _wait() statements to try to make sure that it doesn't race ahead + # of the workers. + # Secondly, this test assumes that condition variables are not subject + # to spurious wakeups. The absence of spurious wakeups is an implementation + # detail of Condition Variables in current CPython, but in general, not + # a guaranteed property of condition variables as a programming + # construct. In particular, it is possible that this can no longer + # be conveniently guaranteed should their implementation ever change. N = 5 results1 = [] results2 = [] @@ -445,6 +456,9 @@ _wait() self.assertEqual(results1, [(True, 1)] * 3) self.assertEqual(results2, []) + # first wait, to ensure all workers settle into cond.wait() before + # we continue.
See issue #8799 + _wait() # Notify 5 threads: they might be in their first or second wait cond.acquire() cond.notify(5) @@ -455,6 +469,7 @@ _wait() self.assertEqual(results1, [(True, 1)] * 3 + [(True, 2)] * 2) self.assertEqual(results2, [(True, 2)] * 3) + _wait() # make sure all workers settle into cond.wait() # Notify all threads: they are all in their second wait cond.acquire() cond.notify_all() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 13:15:28 2013 From: python-checkins at python.org (nick.coghlan) Date: Mon, 11 Nov 2013 13:15:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319406=3A_Initial_?= =?utf-8?q?implementation_of_ensurepip?= Message-ID: <3dJ9ym2HGcz7LwB@mail.python.org> http://hg.python.org/cpython/rev/6a6b1ee306e3 changeset: 87051:6a6b1ee306e3 user: Nick Coghlan date: Mon Nov 11 22:11:55 2013 +1000 summary: Close #19406: Initial implementation of ensurepip Patch by Donald Stufft and Nick Coghlan files: .hgeol | 1 + Doc/library/development.rst | 1 - Doc/library/distribution.rst | 14 + Doc/library/ensurepip.rst | 125 ++++++++++ Doc/library/index.rst | 1 + Doc/library/python.rst | 1 - Doc/whatsnew/3.4.rst | 29 ++ Lib/ensurepip/__init__.py | 92 +++++++ Lib/ensurepip/__main__.py | 66 +++++ Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/setuptools-1.3.2-py2.py3-none-any.whl | Bin Lib/test/test_ensurepip.py | 123 +++++++++ Makefile.pre.in | 1 + Misc/ACKS | 1 + Misc/NEWS | 3 + Tools/scripts/checkpip.py | 32 ++ 16 files changed, 488 insertions(+), 2 deletions(-) diff --git a/.hgeol b/.hgeol --- a/.hgeol +++ b/.hgeol @@ -26,6 +26,7 @@ **.psd = BIN **.tar = BIN **.wav = BIN +**.whl = BIN **.xar = BIN **.zip = BIN diff --git a/Doc/library/development.rst b/Doc/library/development.rst --- a/Doc/library/development.rst +++ b/Doc/library/development.rst @@ -23,4 +23,3 @@ unittest.mock-examples.rst 2to3.rst test.rst - venv.rst diff --git a/Doc/library/distribution.rst b/Doc/library/distribution.rst new file mode 100644 --- /dev/null +++ b/Doc/library/distribution.rst @@ -0,0 +1,14 @@ +*********************************** +Software Packaging and Distribution +*********************************** + +These libraries help you with publishing and installing Python software. +While these modules are designed to work in conjunction with the +`Python Package Index `__, they can also be used +with a local index server, or without any index server at all. + +.. toctree:: + + distutils.rst + ensurepip.rst + venv.rst diff --git a/Doc/library/ensurepip.rst b/Doc/library/ensurepip.rst new file mode 100644 --- /dev/null +++ b/Doc/library/ensurepip.rst @@ -0,0 +1,125 @@ +:mod:`ensurepip` --- Bootstrapping the ``pip`` installer +======================================================== + +.. module:: ensurepip + :synopsis: Bootstrapping the ``pip`` installer into an existing Python + installation or virtual environment. + +The :mod:`ensurepip` package provides support for bootstrapping the ``pip`` +installer into an existing Python installation or virtual environment. This +bootstrapping approach reflects the fact that ``pip`` is an independent +project with its own release cycle, and the latest available stable version +is bundled with maintenance and feature releases of the CPython reference +interpreter. 
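(A quick illustrative aside, not part of the patch itself: the interface documented in this file comes down to running ``python -m ensurepip`` [optionally with ``--upgrade``], or calling the module from Python. A minimal sketch, using only the ``version()`` and ``bootstrap()`` functions that this changeset adds; note that ``bootstrap()`` really installs pip into the current environment:)

    import ensurepip

    print(ensurepip.version())         # version of the bundled pip, e.g. "1.5.dev1"
    ensurepip.bootstrap(upgrade=True)  # install or upgrade pip in this environment
                                       # (arguments are keyword-only, per Lib/ensurepip/__init__.py)
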
+ +In most cases, end users of Python shouldn't need to invoke this module +directly (as ``pip`` should be bootstrapped by default), but it may be +needed if installing ``pip`` was skipped when installing Python (or +when creating a virtual environment) or after explicitly uninstalling +``pip``. + +.. versionadded:: 3.4 + +.. note:: + + This module *does not* access the internet. All of the components + needed to bootstrap ``pip`` are included as internal parts of the + package. + +.. seealso:: + + :ref:`install-index` + The end user guide for installing Python packages + + :pep:`453`: Explicit bootstrapping of pip in Python installations + The original rationale and specification for this module. + + +Command line interface +---------------------- + +The command line interface is invoked using the interpreter's ``-m`` switch. + +The simplest possible invocation is:: + + python -m ensurepip + +This invocation will install ``pip`` if it is not already installed, +but otherwise does nothing. To ensure the installed version of ``pip`` +is at least as recent as the one bundled with ``ensurepip``, pass the +``--upgrade`` option:: + + python -m ensurepip --upgrade + +By default, ``pip`` is installed into the current virtual environment +(if one is active) or into the system site packages (if there is no +active virtual environment). The installation location can be controlled +through two additional command line options: + +* ``--root ``: Installs ``pip`` relative to the given root directory + rather than the root of the currently active virtual environment (if any) + or the default root for the current Python installation. +* ``--user``: Installs ``pip`` into the user site packages directory rather + than globally for the current Python installation (this option is not + permitted inside an active virtual environment). + +By default, the scripts ``pipX`` and ``pipX.Y`` will be installed (where +X.Y stands for the version of Python used to invoke ``ensurepip``). The +scripts installed can be controlled through two additional command line +options: + +* ``--altinstall``: if an alternate installation is requested, the ``pipX`` + script will *not* be installed. + +* ``--default-pip``: if a "default pip" installation is requested, the + ``pip`` script will be installed in addition to the two regular scripts. + +Providing both of the script selection options will trigger an exception. + + +Module API +---------- + +:mod:`ensurepip` exposes two functions for programmatic use: + +.. function:: version() + + Returns a string specifying the bundled version of pip that will be + installed when bootstrapping an environment. + +.. function:: bootstrap(root=None, upgrade=False, user=False, \ + altinstall=False, default_pip=False, \ + verbosity=0) + + Bootstraps ``pip`` into the current or designated environment. + + *root* specifies an alternative root directory to install relative to. + If *root* is None, then installation uses the default install location + for the current environment. + + *upgrade* indicates whether or not to upgrade an existing installation + of an earlier version of ``pip`` to the bundled version. + + *user* indicates whether to use the user scheme rather than installing + globally. + + By default, the scripts ``pipX`` and ``pipX.Y`` will be installed (where + X.Y stands for the current version of Python). + + If *altinstall* is set, then ``pipX`` will *not* be installed. + + If *default_pip* is set, then ``pip`` will be installed in addition to + the two regular scripts. 
+ + Setting both *altinstall* and *default_pip* will trigger + :exc:`ValueError`. + + *verbosity* controls the level of output to :data:`sys.stdout` from the + bootstrapping operation. + + .. note:: + + The bootstrapping process may install additional modules required by + ``pip``, but other software should not assume those dependencies will + always be present by default (as the dependencies may be removed in a + future version of ``pip``). diff --git a/Doc/library/index.rst b/Doc/library/index.rst --- a/Doc/library/index.rst +++ b/Doc/library/index.rst @@ -65,6 +65,7 @@ tk.rst development.rst debug.rst + distribution.rst python.rst custominterp.rst modules.rst diff --git a/Doc/library/python.rst b/Doc/library/python.rst --- a/Doc/library/python.rst +++ b/Doc/library/python.rst @@ -25,4 +25,3 @@ inspect.rst site.rst fpectl.rst - distutils.rst diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -92,6 +92,7 @@ * :mod:`asyncio`: New provisonal API for asynchronous IO (:pep:`3156`). * :mod:`enum`: Support for enumeration types (:pep:`435`). +* :mod:`ensurepip`: Bootstrapping the pip installer (:pep:`453`). * :mod:`selectors`: High-level and efficient I/O multiplexing, built upon the :mod:`select` module primitives. * :mod:`statistics`: A basic numerically stable statistics library (:pep:`450`). @@ -123,6 +124,34 @@ Please read on for a comprehensive list of user-facing changes. + +PEP 453: Explicit bootstrapping of pip in Python installations +============================================================== + +The new :mod:`ensurepip` module (defined in :pep:`453`) provides a standard +cross-platform mechanism to boostrap the pip installer into Python +installations and virtual environments. + +.. note:: + + Only the first phase of PEP 453 has been implemented at this point. + This section will be fleshed out with additional details once those + other changes are implemented. + + Refer to :issue:`19347` for the progress on additional steps: + + * ``make install`` and ``make altinstall`` integration + * Windows installer integration + * Mac OS X installer integration + * :mod:`venv` module and :command:`pyvenv` integration + +.. seealso:: + + :pep:`453` - Explicit bootstrapping of pip in Python installations + PEP written by Donald Stufft and Nick Coghlan, implemented by + Donald Stufft, Nick Coghlan (and ...). + + .. _pep-446: PEP 446: Make newly created file descriptors non-inheritable diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py new file mode 100644 --- /dev/null +++ b/Lib/ensurepip/__init__.py @@ -0,0 +1,92 @@ +import os +import os.path +import pkgutil +import sys +import tempfile + +# TODO: Remove the --pre flag when a pip 1.5 final copy is available + + +__all__ = ["version", "bootstrap"] + + +_SETUPTOOLS_VERSION = "1.3.2" + +_PIP_VERSION = "1.5.dev1" + +_PROJECTS = [ + ("setuptools", _SETUPTOOLS_VERSION), + ("pip", _PIP_VERSION), +] + + +def _run_pip(args, additional_paths): + # Add our bundled software to the sys.path so we can import it + sys.path = additional_paths + sys.path + + # Install the bundled software + import pip + pip.main(args) + + +def version(): + """ + Returns a string specifying the bundled version of pip. + """ + return _PIP_VERSION + + +def bootstrap(*, root=None, upgrade=False, user=False, + altinstall=False, default_pip=False, + verbosity=0): + """ + Bootstrap pip into the current Python installation (or the given root + directory). 
+ """ + if altinstall and default_pip: + raise ValueError("Cannot use altinstall and default_pip together") + + # By default, installing pip and setuptools installs all of the + # following scripts (X.Y == running Python version): + # + # pip, pipX, pipX.Y, easy_install, easy_install-X.Y + # + # pip 1.5+ allows ensurepip to request that some of those be left out + if altinstall: + # omit pip, pipX and easy_install + os.environ["ENSUREPIP_OPTIONS"] = "altinstall" + elif not default_pip: + # omit pip and easy_install + os.environ["ENSUREPIP_OPTIONS"] = "install" + + with tempfile.TemporaryDirectory() as tmpdir: + # Put our bundled wheels into a temporary directory and construct the + # additional paths that need added to sys.path + additional_paths = [] + for project, version in _PROJECTS: + wheel_name = "{}-{}-py2.py3-none-any.whl".format(project, version) + whl = pkgutil.get_data( + "ensurepip", + "_bundled/{}".format(wheel_name), + ) + with open(os.path.join(tmpdir, wheel_name), "wb") as fp: + fp.write(whl) + + additional_paths.append(os.path.join(tmpdir, wheel_name)) + + # Construct the arguments to be passed to the pip command + args = [ + "install", "--no-index", "--find-links", tmpdir, + # Temporary until pip 1.5 is final + "--pre", + ] + if root: + args += ["--root", root] + if upgrade: + args += ["--upgrade"] + if user: + args += ["--user"] + if verbosity: + args += ["-" + "v" * verbosity] + + _run_pip(args + [p[0] for p in _PROJECTS], additional_paths) diff --git a/Lib/ensurepip/__main__.py b/Lib/ensurepip/__main__.py new file mode 100644 --- /dev/null +++ b/Lib/ensurepip/__main__.py @@ -0,0 +1,66 @@ +import argparse +import ensurepip + + +def main(): + parser = argparse.ArgumentParser(prog="python -m ensurepip") + parser.add_argument( + "--version", + action="version", + version="pip {}".format(ensurepip.version()), + help="Show the version of pip that is bundled with this Python.", + ) + parser.add_argument( + "-v", "--verbose", + action="count", + default=0, + dest="verbosity", + help=("Give more output. 
Option is additive, and can be used up to 3 " + "times."), + ) + parser.add_argument( + "-U", "--upgrade", + action="store_true", + default=False, + help="Upgrade pip and dependencies, even if already installed.", + ) + parser.add_argument( + "--user", + action="store_true", + default=False, + help="Install using the user scheme.", + ) + parser.add_argument( + "--root", + default=None, + help="Install everything relative to this alternate root directory.", + ) + parser.add_argument( + "--altinstall", + action="store_true", + default=False, + help=("Make an alternate install, installing only the X.Y versioned" + "scripts (Default: pipX, pipX.Y, easy_install-X.Y)"), + ) + parser.add_argument( + "--default-pip", + action="store_true", + default=False, + help=("Make a default pip install, installing the unqualified pip " + "and easy_install in addition to the versioned scripts"), + ) + + args = parser.parse_args() + + ensurepip.bootstrap( + root=args.root, + upgrade=args.upgrade, + user=args.user, + verbosity=args.verbosity, + altinstall=args.altinstall, + default_pip=args.default_pip, + ) + + +if __name__ == "__main__": + main() diff --git a/Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl new file mode 100644 index 0000000000000000000000000000000000000000..65e354860e048066407bb142f03c419cfbae3d52 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/setuptools-1.3.2-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/setuptools-1.3.2-py2.py3-none-any.whl new file mode 100644 index 0000000000000000000000000000000000000000..81962b10814b495bb8f7f0edc34e5274a0580865 GIT binary patch [stripped] diff --git a/Lib/test/test_ensurepip.py b/Lib/test/test_ensurepip.py new file mode 100644 --- /dev/null +++ b/Lib/test/test_ensurepip.py @@ -0,0 +1,123 @@ +import unittest +import unittest.mock +import ensurepip +import test.support + + +class TestEnsurePipVersion(unittest.TestCase): + + def test_returns_version(self): + self.assertEqual(ensurepip._PIP_VERSION, ensurepip.version()) + + +class TestBootstrap(unittest.TestCase): + + def setUp(self): + run_pip_patch = unittest.mock.patch("ensurepip._run_pip") + self.run_pip = run_pip_patch.start() + self.addCleanup(run_pip_patch.stop) + + os_environ_patch = unittest.mock.patch("ensurepip.os.environ", {}) + self.os_environ = os_environ_patch.start() + self.addCleanup(os_environ_patch.stop) + + def test_basic_bootstrapping(self): + ensurepip.bootstrap() + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + additional_paths = self.run_pip.call_args[0][1] + self.assertEqual(len(additional_paths), 2) + + def test_bootstrapping_with_root(self): + ensurepip.bootstrap(root="/foo/bar/") + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "--root", "/foo/bar/", + "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + def test_bootstrapping_with_user(self): + ensurepip.bootstrap(user=True) + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "--user", "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + def test_bootstrapping_with_upgrade(self): + ensurepip.bootstrap(upgrade=True) + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "--upgrade", "setuptools", "pip", + ], + 
unittest.mock.ANY, + ) + + def test_bootstrapping_with_verbosity_1(self): + ensurepip.bootstrap(verbosity=1) + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "-v", "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + def test_bootstrapping_with_verbosity_2(self): + ensurepip.bootstrap(verbosity=2) + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "-vv", "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + def test_bootstrapping_with_verbosity_3(self): + ensurepip.bootstrap(verbosity=3) + + self.run_pip.assert_called_once_with( + [ + "install", "--no-index", "--find-links", + unittest.mock.ANY, "--pre", "-vvv", "setuptools", "pip", + ], + unittest.mock.ANY, + ) + + def test_bootstrapping_with_regular_install(self): + ensurepip.bootstrap() + self.assertEqual(self.os_environ["ENSUREPIP_OPTIONS"], "install") + + def test_bootstrapping_with_alt_install(self): + ensurepip.bootstrap(altinstall=True) + self.assertEqual(self.os_environ["ENSUREPIP_OPTIONS"], "altinstall") + + def test_bootstrapping_with_default_pip(self): + ensurepip.bootstrap(default_pip=True) + self.assertNotIn("ENSUREPIP_OPTIONS", self.os_environ) + + def test_altinstall_default_pip_conflict(self): + with self.assertRaises(ValueError): + ensurepip.bootstrap(altinstall=True, default_pip=True) + + +if __name__ == "__main__": + test.support.run_unittest(__name__) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1091,6 +1091,7 @@ test/test_asyncio \ collections concurrent concurrent/futures encodings \ email email/mime test/test_email test/test_email/data \ + ensurepip ensurepip/_bundled \ html json test/test_json http dbm xmlrpc \ sqlite3 sqlite3/test \ logging csv wsgiref urllib \ diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1244,6 +1244,7 @@ Serhiy Storchaka Ken Stox Dan Stromberg +Donald Stufft Daniel Stutzbach Andreas St?hrk Colin Su diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,9 @@ Library ------- +- Issue #19406: implementation of the ensurepip module (part of PEP 453). + Patch by Donald Stufft and Nick Coghlan. + - Issue #19544 and Issue #6286: Restore use of urllib over http allowing use of http_proxy for Distutils upload command, a feature accidentally lost in the rollback of distutils2. diff --git a/Tools/scripts/checkpip.py b/Tools/scripts/checkpip.py new file mode 100644 --- /dev/null +++ b/Tools/scripts/checkpip.py @@ -0,0 +1,32 @@ +#/usr/bin/env python3 +""" +Checks that the version of the projects bundled in ensurepip are the latest +versions available. 
+""" +import ensurepip +import json +import urllib.request +import sys + + +def main(): + outofdate = False + + for project, version in ensurepip._PROJECTS: + data = json.loads(urllib.request.urlopen( + "https://pypi.python.org/pypi/{}/json".format(project), + cadefault=True, + ).read().decode("utf8")) + upstream_version = data["info"]["version"] + + if version != upstream_version: + outofdate = True + print("The latest version of {} on PyPI is {}, but ensurepip " + "has {}".format(project, upstream_version, version)) + + if outofdate: + sys.exit(1) + + +if __name__ == "__main__": + main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 15:01:28 2013 From: python-checkins at python.org (eric.smith) Date: Mon, 11 Nov 2013 15:01:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Fixed_typo=2E?= Message-ID: <3dJDK42ZVvz7Lk2@mail.python.org> http://hg.python.org/peps/rev/d21893d88a8a changeset: 5265:d21893d88a8a user: Eric V. Smith date: Mon Nov 11 09:01:17 2013 -0500 summary: Fixed typo. files: pep-3156.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -128,7 +128,7 @@ event loop methods documented as accepting coroutine arguments *must* accept both Futures and coroutines for such arguments. (A convenience function, ``async()``, exists to convert an argument that is either a -conroutine or a Future into a Future.) +coroutine or a Future into a Future.) For users (like myself) who don't like using callbacks, a scheduler is provided for writing asynchronous I/O code as coroutines using the PEP -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 11 16:09:04 2013 From: python-checkins at python.org (tim.golden) Date: Mon, 11 Nov 2013 16:09:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Remove_outdate?= =?utf-8?q?d_comment?= Message-ID: <3dJFq4721Cz7LlM@mail.python.org> http://hg.python.org/cpython/rev/2908063f3287 changeset: 87052:2908063f3287 branch: 3.3 parent: 87044:5e98c4e9c909 user: Tim Golden date: Mon Nov 11 15:08:04 2013 +0000 summary: Remove outdated comment files: Lib/subprocess.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -662,7 +662,6 @@ # Various tools for executing commands and looking at their output and status. # -# NB This only works (and is only relevant) for POSIX. def getstatusoutput(cmd): """Return (status, output) of executing cmd in a shell. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 16:09:06 2013 From: python-checkins at python.org (tim.golden) Date: Mon, 11 Nov 2013 16:09:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Remove_outdated_comment?= Message-ID: <3dJFq61pM7z7LnQ@mail.python.org> http://hg.python.org/cpython/rev/0a329f94448d changeset: 87053:0a329f94448d parent: 87051:6a6b1ee306e3 parent: 87052:2908063f3287 user: Tim Golden date: Mon Nov 11 15:08:40 2013 +0000 summary: Remove outdated comment files: Lib/subprocess.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/subprocess.py b/Lib/subprocess.py --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -691,7 +691,6 @@ # Various tools for executing commands and looking at their output and status. # -# NB This only works (and is only relevant) for POSIX. 
def getstatusoutput(cmd): """ Return (status, output) of executing cmd in a shell. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 20:03:54 2013 From: python-checkins at python.org (andrew.kuchling) Date: Mon, 11 Nov 2013 20:03:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=236683=3A_add_a_te?= =?utf-8?q?st_that_exercises_multiple_authentication=2E?= Message-ID: <3dJM225ZhCz7Lnv@mail.python.org> http://hg.python.org/cpython/rev/19912ad231a3 changeset: 87054:19912ad231a3 user: Andrew Kuchling date: Mon Nov 11 14:03:23 2013 -0500 summary: Closes #6683: add a test that exercises multiple authentication. The SMTP server advertises four different authentication methods, and the code will try CRAM-MD5 first, which will fail, but LOGIN succeeds. files: Lib/test/test_smtplib.py | 9 +++++++++ 1 files changed, 9 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_smtplib.py b/Lib/test/test_smtplib.py --- a/Lib/test/test_smtplib.py +++ b/Lib/test/test_smtplib.py @@ -819,6 +819,15 @@ self.assertIn(sim_auth_credentials['cram-md5'], str(err)) smtp.close() + def testAUTH_multiple(self): + # Test that multiple authentication methods are tried. + self.serv.add_feature("AUTH BOGUS PLAIN LOGIN CRAM-MD5") + smtp = smtplib.SMTP(HOST, self.port, local_hostname='localhost', timeout=15) + try: smtp.login(sim_auth[0], sim_auth[1]) + except smtplib.SMTPAuthenticationError as err: + self.assertIn(sim_auth_login_password, str(err)) + smtp.close() + def test_with_statement(self): with smtplib.SMTP(HOST, self.port) as smtp: code, message = smtp.noop() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 11 20:50:25 2013 From: python-checkins at python.org (andrew.kuchling) Date: Mon, 11 Nov 2013 20:50:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2315422=3A_remove_NEWS_it?= =?utf-8?q?em_for_a_change_that_was_later_reverted?= Message-ID: <3dJN3j6jsYz7Ln4@mail.python.org> http://hg.python.org/cpython/rev/267ad2ed4138 changeset: 87055:267ad2ed4138 user: Andrew Kuchling date: Mon Nov 11 14:50:13 2013 -0500 summary: #15422: remove NEWS item for a change that was later reverted files: Misc/NEWS | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2829,9 +2829,6 @@ - Issue #16881: Fix Py_ARRAY_LENGTH macro for GCC < 3.1. -- Issue #15422: Get rid of PyCFunction_New macro. Use PyCFunction_NewEx - function (PyCFunction_New func is still present for backward compatibility). - - Issue #16505: Remove unused Py_TPFLAGS_INT_SUBCLASS. 
- Issue #16086: PyTypeObject.tp_flags and PyType_Spec.flags are now unsigned -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Tue Nov 12 04:49:52 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 12 Nov 2013 04:49:52 +0100 Subject: [Python-checkins] Daily reference leaks (267ad2ed4138): sum=0 Message-ID: results for 267ad2ed4138 on branch "default" -------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogtFUilB', '-x'] From python-checkins at python.org Tue Nov 12 05:45:37 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 12 Nov 2013 05:45:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Correct_a_merg?= =?utf-8?q?e_error_in_Misc/NEWS?= Message-ID: <3dJbxF4rGzz7Lmj@mail.python.org> http://hg.python.org/cpython/rev/b689e48855cc changeset: 87056:b689e48855cc branch: 3.3 parent: 87052:2908063f3287 user: Zachary Ware date: Mon Nov 11 22:30:47 2013 -0600 summary: Correct a merge error in Misc/NEWS files: Misc/NEWS | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -29,10 +29,10 @@ - Issue #6157: Fixed tkinter.Text.debug(). Original patch by Guilherme Polo. - Issue #6160: The bbox() method of tkinter.Spinbox now returns a tuple of + integers instead of a string. Based on patch by Guilherme Polo. - Issue #10197: Rework subprocess.get[status]output to use subprocess functionality and thus to work on Windows. Patch by Nick Coghlan. - integers instead of a string. Based on patch by Guilherme Polo. - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 05:45:38 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 12 Nov 2013 05:45:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dJbxG6hPrz7Lmj@mail.python.org> http://hg.python.org/cpython/rev/45aca50a7f72 changeset: 87057:45aca50a7f72 parent: 87055:267ad2ed4138 parent: 87056:b689e48855cc user: Zachary Ware date: Mon Nov 11 22:44:03 2013 -0600 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 06:00:06 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 12 Nov 2013 06:00:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NDQw?= =?utf-8?q?=3A_Clean_up_test=5Fcapi?= Message-ID: <3dJcFy3J6mz7LlY@mail.python.org> http://hg.python.org/cpython/rev/5198e8f325f5 changeset: 87058:5198e8f325f5 branch: 3.3 parent: 87056:b689e48855cc user: Zachary Ware date: Mon Nov 11 22:47:04 2013 -0600 summary: Issue #19440: Clean up test_capi files: Lib/test/test_capi.py | 20 +++++++------------- Misc/NEWS | 4 ++++ 2 files changed, 11 insertions(+), 13 deletions(-) diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -1,7 +1,6 @@ # Run the _testcapi module tests (tests for the Python/C API): by defn, # these are all functions _testcapi exports whose name begins with 'test_'. 
-from __future__ import with_statement import os import pickle import random @@ -351,17 +350,12 @@ t.start() t.join() - -def test_main(): - support.run_unittest(CAPITest, TestPendingCalls, Test6012, - EmbeddingTest, SkipitemTest, TestThreadState) - - for name in dir(_testcapi): - if name.startswith('test_'): - test = getattr(_testcapi, name) - if support.verbose: - print("internal", name) - test() +class Test_testcapi(unittest.TestCase): + def test__testcapi(self): + for name in dir(_testcapi): + if name.startswith('test_'): + test = getattr(_testcapi, name) + test() if __name__ == "__main__": - test_main() + unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -42,6 +42,10 @@ Tests ----- +- Issue #19440: Clean up test_capi by removing an unnecessary __future__ + import, converting from test_main to unittest.main, and running the + _testcapi module tests within a unittest TestCase. + - Issue #18702: All skipped tests now reported as skipped. - Issue #19085: Added basic tests for all tkinter widget options. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 06:00:07 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 12 Nov 2013 06:00:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319440=3A_Clean_up_test=5Fcapi?= Message-ID: <3dJcFz5D62z7Ln1@mail.python.org> http://hg.python.org/cpython/rev/26108b2761aa changeset: 87059:26108b2761aa parent: 87057:45aca50a7f72 parent: 87058:5198e8f325f5 user: Zachary Ware date: Mon Nov 11 22:59:23 2013 -0600 summary: Issue #19440: Clean up test_capi files: Lib/test/test_capi.py | 22 ++++++++-------------- Misc/NEWS | 4 ++++ 2 files changed, 12 insertions(+), 14 deletions(-) diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -1,7 +1,6 @@ # Run the _testcapi module tests (tests for the Python/C API): by defn, # these are all functions _testcapi exports whose name begins with 'test_'. -from __future__ import with_statement import os import pickle import random @@ -416,18 +415,13 @@ t.start() t.join() - -def test_main(): - support.run_unittest(CAPITest, TestPendingCalls, Test6012, - EmbeddingTests, SkipitemTest, TestThreadState, - SubinterpreterTest) - - for name in dir(_testcapi): - if name.startswith('test_'): - test = getattr(_testcapi, name) - if support.verbose: - print("internal", name) - test() +class Test_testcapi(unittest.TestCase): + def test__testcapi(self): + for name in dir(_testcapi): + if name.startswith('test_'): + with self.subTest("internal", name=name): + test = getattr(_testcapi, name) + test() if __name__ == "__main__": - test_main() + unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -213,6 +213,10 @@ Tests ----- +- Issue #19440: Clean up test_capi by removing an unnecessary __future__ + import, converting from test_main to unittest.main, and running the + _testcapi module tests as subTests of a unittest TestCase method. 
+ - Issue #19378: the main dis module tests are now run with both stdout redirection *and* passing an explicit file parameter -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 13:42:14 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 12 Nov 2013 13:42:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_rename_Snapshot=2E?= =?utf-8?q?apply=5Ffilters=28=29_to_Snapshot=2Efilter=5Ftraces=28=29?= Message-ID: <3dJpWB0XzRz7LqF@mail.python.org> http://hg.python.org/peps/rev/881b7f5fbabe changeset: 5266:881b7f5fbabe user: Victor Stinner date: Tue Nov 12 13:38:36 2013 +0100 summary: PEP 454: rename Snapshot.apply_filters() to Snapshot.filter_traces() files: pep-0454.txt | 24 ++++++++++++------------ 1 files changed, 12 insertions(+), 12 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -284,18 +284,6 @@ The ``take_snapshot()`` function creates a snapshot instance. -``apply_filters(filters)`` method: - - Create a new ``Snapshot`` instance with a filtered ``traces`` - sequence, *filters* is a list of ``Filter`` instances. If *filters* - is an empty list, return a new ``Snapshot`` instance with a copy of - the traces. - - All inclusive filters are applied at once, a trace is ignored if no - inclusive filters match it. A trace is ignored if at least one - exclusive filter matchs it. - - ``compare_to(old_snapshot: Snapshot, group_by: str, cumulative: bool=False)`` method: Compute the differences with an old snapshot. Get statistics as a @@ -317,6 +305,18 @@ Use ``load()`` to reload the snapshot. +``filter_traces(filters)`` method: + + Create a new ``Snapshot`` instance with a filtered ``traces`` + sequence, *filters* is a list of ``Filter`` instances. If *filters* + is an empty list, return a new ``Snapshot`` instance with a copy of + the traces. + + All inclusive filters are applied at once, a trace is ignored if no + inclusive filters match it. A trace is ignored if at least one + exclusive filter matchs it. + + ``load(filename)`` classmethod: Load a snapshot from a file. 
-- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 12 13:58:19 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 13:58:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Issue13674_Cor?= =?utf-8?q?rect_crash_with_strftime_=25y_format_under_Windows?= Message-ID: <3dJpsl3WLTz7Lt1@mail.python.org> http://hg.python.org/cpython/rev/1537f14cc690 changeset: 87060:1537f14cc690 branch: 3.3 parent: 87058:5198e8f325f5 user: Tim Golden date: Tue Nov 12 12:36:54 2013 +0000 summary: Issue13674 Correct crash with strftime %y format under Windows files: Lib/test/test_strftime.py | 26 +++++++++++++++++++++++++- Modules/timemodule.c | 7 +++++++ 2 files changed, 32 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -176,8 +176,32 @@ (e[0], e[2])) print(" Expected %s, but got %s" % (e[1], result)) + +class Y1900Tests(unittest.TestCase): + """A limitation of the MS C runtime library is that it crashes if + a date before 1900 is passed with a format string containing "%y" + """ + + @unittest.skipUnless(sys.platform == "win32", "Only applies to Windows") + def test_y_before_1900_win(self): + with self.assertRaises(ValueError): + time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)) + + @unittest.skipIf(sys.platform == "win32", "Doesn't apply on Windows") + def test_y_before_1900_nonwin(self): + self.assertEquals( + time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)), "99") + + def test_y_1900(self): + self.assertEquals( + time.strftime("%y", (1900, 1, 1, 0, 0, 0, 0, 0, 0)), "00") + + def test_y_after_1900(self): + self.assertEquals( + time.strftime("%y", (2013, 1, 1, 0, 0, 0, 0, 0, 0)), "13") + def test_main(): - support.run_unittest(StrftimeTest) + support.run_unittest(StrftimeTest, Y1900Tests) if __name__ == '__main__': test_main() diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -632,6 +632,13 @@ Py_DECREF(format); return NULL; } + if ((outbuf[1] == 'y') && buf.tm_year < 0) + { + PyErr_SetString(PyExc_ValueError, + "format %y requires year >= 1900 on Windows"); + Py_DECREF(format); + return NULL; + } } #endif -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 13:58:20 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 13:58:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue13674_Correct_crash_with_strftime_=25y_format_under?= =?utf-8?q?_Windows?= Message-ID: <3dJpsm5Fjbz7LqY@mail.python.org> http://hg.python.org/cpython/rev/41a4c55db7f2 changeset: 87061:41a4c55db7f2 parent: 87059:26108b2761aa parent: 87060:1537f14cc690 user: Tim Golden date: Tue Nov 12 12:48:20 2013 +0000 summary: Issue13674 Correct crash with strftime %y format under Windows files: Lib/test/test_strftime.py | 26 ++++++++++++++++++++++++++ Modules/timemodule.c | 7 +++++++ 2 files changed, 33 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -177,5 +177,31 @@ print(" Expected %s, but got %s" % (e[1], result)) +class Y1900Tests(unittest.TestCase): + """A limitation of the MS C runtime library is that it crashes if + a date before 1900 is passed with a format string containing "%y" + """ + + @unittest.skipUnless(sys.platform == "win32", "Only 
applies to Windows") + def test_y_before_1900_win(self): + with self.assertRaises(ValueError): + time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)) + + @unittest.skipIf(sys.platform == "win32", "Doesn't apply on Windows") + def test_y_before_1900_nonwin(self): + self.assertEquals( + time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)), "99") + + def test_y_1900(self): + self.assertEquals( + time.strftime("%y", (1900, 1, 1, 0, 0, 0, 0, 0, 0)), "00") + + def test_y_after_1900(self): + self.assertEquals( + time.strftime("%y", (2013, 1, 1, 0, 0, 0, 0, 0, 0)), "13") + +def test_main(): + support.run_unittest(StrftimeTest, Y1900Tests) + if __name__ == '__main__': unittest.main() diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -642,6 +642,13 @@ Py_DECREF(format); return NULL; } + if ((outbuf[1] == 'y') && buf.tm_year < 0) + { + PyErr_SetString(PyExc_ValueError, + "format %y requires year >= 1900 on Windows"); + Py_DECREF(format); + return NULL; + } } #endif -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 13:58:21 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 13:58:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_mis-merged_artefact?= Message-ID: <3dJpsn6vCqz7M09@mail.python.org> http://hg.python.org/cpython/rev/1b34cea27b27 changeset: 87062:1b34cea27b27 user: Tim Golden date: Tue Nov 12 12:51:37 2013 +0000 summary: Remove mis-merged artefact files: Lib/test/test_strftime.py | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -200,8 +200,5 @@ self.assertEquals( time.strftime("%y", (2013, 1, 1, 0, 0, 0, 0, 0, 0)), "13") -def test_main(): - support.run_unittest(StrftimeTest, Y1900Tests) - if __name__ == '__main__': unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 14:33:53 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 14:33:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2313674_Updated_NEW?= =?utf-8?q?S?= Message-ID: <3dJqfn4QJ4z7Ls0@mail.python.org> http://hg.python.org/cpython/rev/c61147d66843 changeset: 87063:c61147d66843 user: Tim Golden date: Tue Nov 12 13:22:39 2013 +0000 summary: Issue #13674 Updated NEWS files: Misc/NEWS | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -34,6 +34,9 @@ Library ------- +- Issue #13674: Prevented time.strftime from crashing on Windows when given + a year before 1900 and a format of %y. + - Issue #19406: implementation of the ensurepip module (part of PEP 453). Patch by Donald Stufft and Nick Coghlan. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 14:33:54 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 14:33:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzEzNjc0?= =?utf-8?q?_Updated_NEWS?= Message-ID: <3dJqfp6pVRz7Lsq@mail.python.org> http://hg.python.org/cpython/rev/49db4851c63b changeset: 87064:49db4851c63b branch: 3.3 parent: 87060:1537f14cc690 user: Tim Golden date: Tue Nov 12 13:24:03 2013 +0000 summary: Issue #13674 Updated NEWS files: Misc/NEWS | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #13674: Prevented time.strftime from crashing on Windows when given + a year before 1900 and a format of %y. + - Issue #19544 and Issue #6286: Restore use of urllib over http allowing use of http_proxy for Distutils upload command, a feature accidentally lost in the rollback of distutils2. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 14:33:56 2013 From: python-checkins at python.org (tim.golden) Date: Tue, 12 Nov 2013 14:33:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2313674_Null_merge_with_3=2E3?= Message-ID: <3dJqfr2VkXz7Lxp@mail.python.org> http://hg.python.org/cpython/rev/3ff7602ee543 changeset: 87065:3ff7602ee543 parent: 87063:c61147d66843 parent: 87064:49db4851c63b user: Tim Golden date: Tue Nov 12 13:33:17 2013 +0000 summary: Issue #13674 Null merge with 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 16:02:49 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 12 Nov 2013 16:02:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Update_e-mail_?= =?utf-8?q?address?= Message-ID: <3dJsdP0py3z7M6q@mail.python.org> http://hg.python.org/cpython/rev/06f10ba16aa3 changeset: 87066:06f10ba16aa3 branch: 3.3 parent: 87064:49db4851c63b user: Andrew Kuchling date: Tue Nov 12 10:02:35 2013 -0500 summary: Update e-mail address files: Doc/library/gettext.rst | 4 ++-- Doc/library/pickle.rst | 2 +- Doc/library/smtpd.rst | 2 +- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -3,8 +3,8 @@ .. module:: gettext :synopsis: Multilingual internationalization services. -.. moduleauthor:: Barry A. Warsaw -.. sectionauthor:: Barry A. Warsaw +.. moduleauthor:: Barry A. Warsaw +.. sectionauthor:: Barry A. Warsaw **Source code:** :source:`Lib/gettext.py` diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -12,7 +12,7 @@ .. module:: pickle :synopsis: Convert Python objects to streams of bytes and back. .. sectionauthor:: Jim Kerr . -.. sectionauthor:: Barry Warsaw +.. sectionauthor:: Barry Warsaw The :mod:`pickle` module implements a fundamental, but powerful algorithm for diff --git a/Doc/library/smtpd.rst b/Doc/library/smtpd.rst --- a/Doc/library/smtpd.rst +++ b/Doc/library/smtpd.rst @@ -4,7 +4,7 @@ .. module:: smtpd :synopsis: A SMTP server implementation in Python. -.. moduleauthor:: Barry Warsaw +.. moduleauthor:: Barry Warsaw .. 
sectionauthor:: Moshe Zadka **Source code:** :source:`Lib/smtpd.py` -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 16:03:34 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 12 Nov 2013 16:03:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_from_3=2E3?= Message-ID: <3dJsfG2Ys2z7M9F@mail.python.org> http://hg.python.org/cpython/rev/5073fa30f226 changeset: 87067:5073fa30f226 parent: 87065:3ff7602ee543 parent: 87066:06f10ba16aa3 user: Andrew Kuchling date: Tue Nov 12 10:03:20 2013 -0500 summary: Merge from 3.3 files: Doc/library/gettext.rst | 4 ++-- Doc/library/pickle.rst | 2 +- Doc/library/smtpd.rst | 2 +- 3 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -3,8 +3,8 @@ .. module:: gettext :synopsis: Multilingual internationalization services. -.. moduleauthor:: Barry A. Warsaw -.. sectionauthor:: Barry A. Warsaw +.. moduleauthor:: Barry A. Warsaw +.. sectionauthor:: Barry A. Warsaw **Source code:** :source:`Lib/gettext.py` diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -12,7 +12,7 @@ .. module:: pickle :synopsis: Convert Python objects to streams of bytes and back. .. sectionauthor:: Jim Kerr . -.. sectionauthor:: Barry Warsaw +.. sectionauthor:: Barry Warsaw The :mod:`pickle` module implements a fundamental, but powerful algorithm for diff --git a/Doc/library/smtpd.rst b/Doc/library/smtpd.rst --- a/Doc/library/smtpd.rst +++ b/Doc/library/smtpd.rst @@ -4,7 +4,7 @@ .. module:: smtpd :synopsis: A SMTP server implementation in Python. -.. moduleauthor:: Barry Warsaw +.. moduleauthor:: Barry Warsaw .. sectionauthor:: Moshe Zadka **Source code:** :source:`Lib/smtpd.py` -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 16:25:24 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 12 Nov 2013 16:25:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogQ2xvc2VzICMxMjgy?= =?utf-8?q?8=3A_add_docstring_text_noting_this_is_an_internal-only_module?= Message-ID: <3dJt7S0SNGz7Lxn@mail.python.org> http://hg.python.org/cpython/rev/3710514aa92a changeset: 87068:3710514aa92a branch: 3.3 parent: 87066:06f10ba16aa3 user: Andrew Kuchling date: Tue Nov 12 10:25:15 2013 -0500 summary: Closes #12828: add docstring text noting this is an internal-only module files: Lib/xml/dom/minicompat.py | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/xml/dom/minicompat.py b/Lib/xml/dom/minicompat.py --- a/Lib/xml/dom/minicompat.py +++ b/Lib/xml/dom/minicompat.py @@ -1,4 +1,8 @@ -"""Python version compatibility support for minidom.""" +"""Python version compatibility support for minidom. + +This module contains internal implementation details and +should not be imported; use xml.dom.minidom instead. +""" # This module should only be imported using "import *". 
# -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 16:26:28 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 12 Nov 2013 16:26:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_from_3=2E3?= Message-ID: <3dJt8h0Wk7z7Lts@mail.python.org> http://hg.python.org/cpython/rev/cb05beabb656 changeset: 87069:cb05beabb656 parent: 87067:5073fa30f226 parent: 87068:3710514aa92a user: Andrew Kuchling date: Tue Nov 12 10:26:15 2013 -0500 summary: Merge from 3.3 files: Lib/xml/dom/minicompat.py | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/xml/dom/minicompat.py b/Lib/xml/dom/minicompat.py --- a/Lib/xml/dom/minicompat.py +++ b/Lib/xml/dom/minicompat.py @@ -1,4 +1,8 @@ -"""Python version compatibility support for minidom.""" +"""Python version compatibility support for minidom. + +This module contains internal implementation details and +should not be imported; use xml.dom.minidom instead. +""" # This module should only be imported using "import *". # -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 16:45:42 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 12 Nov 2013 16:45:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319466=3A_Clear_th?= =?utf-8?q?e_frames_of_daemon_threads_earlier_during_the_Python?= Message-ID: <3dJtZt44VszPtG@mail.python.org> http://hg.python.org/cpython/rev/c2a13acd5e2b changeset: 87070:c2a13acd5e2b user: Victor Stinner date: Tue Nov 12 16:37:55 2013 +0100 summary: Close #19466: Clear the frames of daemon threads earlier during the Python shutdown to call objects destructors. So "unclosed file" resource warnings are now corretly emitted for daemon threads. 
files: Lib/test/test_threading.py | 50 ++++++++++++++++++++++++++ Misc/NEWS | 4 ++ Python/pythonrun.c | 20 +++++++-- 3 files changed, 69 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -617,6 +617,52 @@ t.join() self.assertRaises(ValueError, bs.release) + def test_locals_at_exit(self): + # Issue #19466: thread locals must not be deleted before destructors + # are called + rc, out, err = assert_python_ok("-c", """if 1: + import threading + + class Atexit: + def __del__(self): + print("thread_dict.atexit = %r" % thread_dict.atexit) + + thread_dict = threading.local() + thread_dict.atexit = "atexit" + + atexit = Atexit() + """) + self.assertEqual(out.rstrip(), b"thread_dict.atexit = 'atexit'") + + def test_warnings_at_exit(self): + # Issue #19466: try to call most destructors at Python shutdown before + # destroying Python thread states + filename = __file__ + rc, out, err = assert_python_ok("-Wd", "-c", """if 1: + import time + import threading + + def open_sleep(): + # a warning will be emitted when the open file will be + # destroyed (without being explicitly closed) while the daemon + # thread is destroyed + fileobj = open(%a, 'rb') + start_event.set() + time.sleep(60.0) + + start_event = threading.Event() + + thread = threading.Thread(target=open_sleep) + thread.daemon = True + thread.start() + + # wait until the thread started + start_event.wait() + """ % filename) + self.assertRegex(err.rstrip(), + b"^sys:1: ResourceWarning: unclosed file ") + + class ThreadJoinOnShutdown(BaseTestCase): def _run_and_join(self, script): @@ -701,6 +747,10 @@ import sys import time import threading + import warnings + + # ignore "unclosed file ..." warnings + warnings.filterwarnings('ignore', '', ResourceWarning) thread_has_run = set() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,10 @@ Core and Builtins ----------------- +- Issue #19466: Clear the frames of daemon threads earlier during the + Python shutdown to call objects destructors. So "unclosed file" resource + warnings are now corretly emitted for daemon threads. + - Issue #19514: Deduplicate some _Py_IDENTIFIER declarations. Patch by Andrei Dorian Duma. diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -576,11 +576,13 @@ _Py_Finalizing = tstate; initialized = 0; - /* Flush stdout+stderr */ - flush_std_files(); - - /* Disable signal handling */ - PyOS_FiniInterrupts(); + /* Destroy the state of all threads except of the current thread: in + practice, only daemon threads should still be alive. Clear frames of + other threads to call objects destructor. Destructors will be called in + the current Python thread. Since _Py_Finalizing has been set, no other + Python threads can lock the GIL at this point (if they try, they will + exit immediatly). */ + _PyThreadState_DeleteExcept(tstate); /* Collect garbage. This may call finalizers; it's nice to call these * before all modules are destroyed. @@ -595,6 +597,7 @@ * XXX I haven't seen a real-life report of either of these. 
*/ PyGC_Collect(); + #ifdef COUNT_ALLOCS /* With COUNT_ALLOCS, it helps to run GC multiple times: each collection might release some types from the type @@ -602,6 +605,13 @@ while (PyGC_Collect() > 0) /* nothing */; #endif + + /* Flush stdout+stderr */ + flush_std_files(); + + /* Disable signal handling */ + PyOS_FiniInterrupts(); + /* Destroy all modules */ PyImport_Cleanup(); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 17:19:01 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 12 Nov 2013 17:19:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319466=3A_Fix_typo?= =?utf-8?q?=2E_Patch_written_by_Vajrasky_Kok=2E?= Message-ID: <3dJvKK2x7jz7Ljq@mail.python.org> http://hg.python.org/cpython/rev/10a8e676b87b changeset: 87071:10a8e676b87b user: Victor Stinner date: Tue Nov 12 17:18:51 2013 +0100 summary: Issue #19466: Fix typo. Patch written by Vajrasky Kok. files: Python/pythonrun.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -581,7 +581,7 @@ other threads to call objects destructor. Destructors will be called in the current Python thread. Since _Py_Finalizing has been set, no other Python threads can lock the GIL at this point (if they try, they will - exit immediatly). */ + exit immediately). */ _PyThreadState_DeleteExcept(tstate); /* Collect garbage. This may call finalizers; it's nice to call these -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 21:42:40 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 12 Nov 2013 21:42:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319515=3A_Remove_i?= =?utf-8?q?dentifiers_duplicated_in_the_same_file=2E?= Message-ID: <3dK19X1Pg0z7LnG@mail.python.org> http://hg.python.org/cpython/rev/518e3b174120 changeset: 87072:518e3b174120 user: Victor Stinner date: Tue Nov 12 21:39:02 2013 +0100 summary: Issue #19515: Remove identifiers duplicated in the same file. Patch written by Andrei Dorian Duma. 
files: Modules/_io/bufferedio.c | 1 - Modules/_io/iobase.c | 5 ++--- Modules/cjkcodecs/multibytecodec.c | 4 ++-- Objects/typeobject.c | 1 - Python/pythonrun.c | 6 ++---- 5 files changed, 6 insertions(+), 11 deletions(-) diff --git a/Modules/_io/bufferedio.c b/Modules/_io/bufferedio.c --- a/Modules/_io/bufferedio.c +++ b/Modules/_io/bufferedio.c @@ -52,7 +52,6 @@ Py_buffer buf; Py_ssize_t len; PyObject *data; - _Py_IDENTIFIER(read); if (!PyArg_ParseTuple(args, "w*:readinto", &buf)) { return NULL; diff --git a/Modules/_io/iobase.c b/Modules/_io/iobase.c --- a/Modules/_io/iobase.c +++ b/Modules/_io/iobase.c @@ -63,6 +63,8 @@ #define IS_CLOSED(self) \ _PyObject_HasAttrId(self, &PyId___IOBase_closed) +_Py_IDENTIFIER(read); + /* Internal methods */ static PyObject * iobase_unsupported(const char *message) @@ -180,7 +182,6 @@ iobase_close(PyObject *self, PyObject *args) { PyObject *res; - _Py_IDENTIFIER(__IOBase_closed); if (IS_CLOSED(self)) Py_RETURN_NONE; @@ -454,7 +455,6 @@ int has_peek = 0; PyObject *buffer, *result; Py_ssize_t old_size = -1; - _Py_IDENTIFIER(read); _Py_IDENTIFIER(peek); if (!PyArg_ParseTuple(args, "|O&:readline", &_PyIO_ConvertSsize_t, &limit)) { @@ -848,7 +848,6 @@ return NULL; while (1) { - _Py_IDENTIFIER(read); PyObject *data = _PyObject_CallMethodId(self, &PyId_read, "i", DEFAULT_BUFFER_SIZE); if (!data) { diff --git a/Modules/cjkcodecs/multibytecodec.c b/Modules/cjkcodecs/multibytecodec.c --- a/Modules/cjkcodecs/multibytecodec.c +++ b/Modules/cjkcodecs/multibytecodec.c @@ -51,6 +51,8 @@ #define MBENC_RESET MBENC_MAX<<1 /* reset after an encoding session */ +_Py_IDENTIFIER(write); + static PyObject * make_tuple(PyObject *object, Py_ssize_t len) { @@ -1569,7 +1571,6 @@ PyObject *unistr) { PyObject *str, *wr; - _Py_IDENTIFIER(write); str = encoder_encode_stateful(STATEFUL_ECTX(self), unistr, 0); if (str == NULL) @@ -1639,7 +1640,6 @@ assert(PyBytes_Check(pwrt)); if (PyBytes_Size(pwrt) > 0) { PyObject *wr; - _Py_IDENTIFIER(write); wr = _PyObject_CallMethodId(self->stream, &PyId_write, "O", pwrt); if (wr == NULL) { diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -5633,7 +5633,6 @@ PyObject *func; PyObject *newargs, *x; Py_ssize_t i, n; - _Py_IDENTIFIER(__new__); func = _PyObject_GetAttrId((PyObject *)type, &PyId___new__); if (func == NULL) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -37,9 +37,11 @@ _Py_IDENTIFIER(builtins); _Py_IDENTIFIER(excepthook); +_Py_IDENTIFIER(flush); _Py_IDENTIFIER(last_traceback); _Py_IDENTIFIER(last_type); _Py_IDENTIFIER(last_value); +_Py_IDENTIFIER(name); _Py_IDENTIFIER(ps1); _Py_IDENTIFIER(ps2); _Py_IDENTIFIER(stdin); @@ -215,7 +217,6 @@ { char *name_utf8, *name_str; PyObject *codec, *name = NULL; - _Py_IDENTIFIER(name); codec = _PyCodec_Lookup(encoding); if (!codec) @@ -512,7 +513,6 @@ PyObject *fout = _PySys_GetObjectId(&PyId_stdout); PyObject *ferr = _PySys_GetObjectId(&PyId_stderr); PyObject *tmp; - _Py_IDENTIFIER(flush); if (fout != NULL && fout != Py_None && !file_is_closed(fout)) { tmp = _PyObject_CallMethodId(fout, &PyId_flush, ""); @@ -1009,7 +1009,6 @@ _Py_IDENTIFIER(open); _Py_IDENTIFIER(isatty); _Py_IDENTIFIER(TextIOWrapper); - _Py_IDENTIFIER(name); _Py_IDENTIFIER(mode); /* stdin is always opened in buffered mode, first because it shouldn't @@ -2130,7 +2129,6 @@ { PyObject *f, *r; PyObject *type, *value, *traceback; - _Py_IDENTIFIER(flush); /* Save the current exception */ PyErr_Fetch(&type, &value, 
&traceback); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 21:44:59 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 12 Nov 2013 21:44:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319515=3A_Remove_d?= =?utf-8?q?uplicated_identifiers_in_zipimport=2Ec?= Message-ID: <3dK1DC380Jz7LnG@mail.python.org> http://hg.python.org/cpython/rev/d2209b9f8971 changeset: 87073:d2209b9f8971 user: Victor Stinner date: Tue Nov 12 21:44:18 2013 +0100 summary: Issue #19515: Remove duplicated identifiers in zipimport.c files: Modules/zipimport.c | 10 ++++------ 1 files changed, 4 insertions(+), 6 deletions(-) diff --git a/Modules/zipimport.c b/Modules/zipimport.c --- a/Modules/zipimport.c +++ b/Modules/zipimport.c @@ -14,6 +14,10 @@ int type; }; +#ifdef ALTSEP +_Py_IDENTIFIER(replace); +#endif + /* zip_searchorder defines how we search for a module in the Zip archive: we first search for a package __init__, then for non-package .pyc, .pyo and .py entries. The .pyc and .pyo entries @@ -66,9 +70,6 @@ PyObject *path, *files, *tmp; PyObject *filename = NULL; Py_ssize_t len, flen; -#ifdef ALTSEP - _Py_IDENTIFIER(replace); -#endif if (!_PyArg_NoKeywords("zipimporter()", kwds)) return -1; @@ -559,9 +560,6 @@ { ZipImporter *self = (ZipImporter *)obj; PyObject *path, *key; -#ifdef ALTSEP - _Py_IDENTIFIER(replace); -#endif PyObject *toc_entry; Py_ssize_t path_start, path_len, len; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 23:09:38 2013 From: python-checkins at python.org (giampaolo.rodola) Date: Tue, 12 Nov 2013 23:09:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Provide_a_more_readable_re?= =?utf-8?q?presentation_of_socket_on_repr=28=29=2E?= Message-ID: <3dK35t3mfqz7LkL@mail.python.org> http://hg.python.org/cpython/rev/c5751f01b09b changeset: 87074:c5751f01b09b parent: 85942:0d079c66dc23 user: Giampaolo Rodola' date: Thu Oct 03 21:01:43 2013 +0200 summary: Provide a more readable representation of socket on repr(). Before: Now: files: Lib/socket.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/socket.py b/Lib/socket.py --- a/Lib/socket.py +++ b/Lib/socket.py @@ -136,7 +136,7 @@ address(es). 
""" closed = getattr(self, '_closed', False) - s = "<%s.%s%s fd=%i, family=%i, type=%i, proto=%i" \ + s = "<%s.%s%s fd=%i, family=%s, type=%s, proto=%i" \ % (self.__class__.__module__, self.__class__.__name__, " [closed]" if closed else "", -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 23:09:39 2013 From: python-checkins at python.org (giampaolo.rodola) Date: Tue, 12 Nov 2013 23:09:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_test=2Esupport=3A_consider?= =?utf-8?q?ing_the_module_is_a_mix_of_utilities_unrelated_with_each?= Message-ID: <3dK35v5Wpcz7LkL@mail.python.org> http://hg.python.org/cpython/rev/63aecf2311cf changeset: 87075:63aecf2311cf parent: 87073:d2209b9f8971 user: Giampaolo Rodola' date: Tue Nov 12 23:08:27 2013 +0100 summary: test.support: considering the module is a mix of utilities unrelated with each other divide __all__ in sub-sections so that it can be used as a quick-reference doc files: Lib/test/support/__init__.py | 54 ++++++++++++++++-------- 1 files changed, 36 insertions(+), 18 deletions(-) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -63,25 +63,43 @@ resource = None __all__ = [ - "Error", "TestFailed", "ResourceDenied", "import_module", "verbose", - "use_resources", "max_memuse", "record_original_stdout", - "get_original_stdout", "unload", "unlink", "rmtree", "forget", + # globals + "PIPE_MAX_SIZE", "verbose", "max_memuse", "use_resources", "failfast", + # exceptions + "Error", "TestFailed", "ResourceDenied", + # imports + "import_module", "import_fresh_module", "CleanImport", + # modules + "unload", "forget", + # io + "record_original_stdout", "get_original_stdout", "captured_stdout", + "captured_stdin", "captured_stderr", + # filesystem + "TESTFN", "SAVEDCWD", "unlink", "rmtree", "temp_cwd", "findfile", + "create_empty_file", "can_symlink", + # unittest "is_resource_enabled", "requires", "requires_freebsd_version", - "requires_linux_version", "requires_mac_ver", "find_unused_port", - "bind_port", "IPV6_ENABLED", "is_jython", "TESTFN", "HOST", "SAVEDCWD", - "temp_cwd", "findfile", "create_empty_file", "sortdict", - "check_syntax_error", "open_urlresource", "check_warnings", "CleanImport", - "EnvironmentVarGuard", "TransientResource", "captured_stdout", - "captured_stdin", "captured_stderr", "time_out", "socket_peer_reset", - "ioerror_peer_reset", "run_with_locale", 'temp_umask', - "transient_internet", "set_memlimit", "bigmemtest", "bigaddrspacetest", - "BasicTestRunner", "run_unittest", "run_doctest", "threading_setup", - "threading_cleanup", "reap_children", "cpython_only", "check_impl_detail", - "get_attribute", "swap_item", "swap_attr", "requires_IEEE_754", - "TestHandler", "Matcher", "can_symlink", "skip_unless_symlink", - "skip_unless_xattr", "import_fresh_module", "requires_zlib", - "PIPE_MAX_SIZE", "failfast", "anticipate_failure", "run_with_tz", - "requires_gzip", "requires_bz2", "requires_lzma", "SuppressCrashReport" + "requires_linux_version", "requires_mac_ver", "check_syntax_error", + "TransientResource", "time_out", "socket_peer_reset", "ioerror_peer_reset", + "transient_internet", "BasicTestRunner", "run_unittest", "run_doctest", + "skip_unless_symlink", "requires_gzip", "requires_bz2", "requires_lzma", + "bigmemtest", "bigaddrspacetest", "cpython_only", "get_attribute", + "requires_IEEE_754", "skip_unless_xattr", "requires_zlib", + "anticipate_failure", + # sys + "is_jython", 
"check_impl_detail", + # network + "HOST", "IPV6_ENABLED", "find_unused_port", "bind_port", "open_urlresource", + # processes + 'temp_umask', "reap_children", + # logging + "TestHandler", + # threads + "threading_setup", "threading_cleanup", + # miscellaneous + "check_warnings", "EnvironmentVarGuard", "run_with_locale", "swap_item", + "swap_attr", "Matcher", "set_memlimit", "SuppressCrashReport", "sortdict", + "run_with_tz", ] class Error(Exception): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 12 23:09:41 2013 From: python-checkins at python.org (giampaolo.rodola) Date: Tue, 12 Nov 2013 23:09:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dK35x0DHlz7Ltq@mail.python.org> http://hg.python.org/cpython/rev/8af2dc11464f changeset: 87076:8af2dc11464f parent: 87075:63aecf2311cf parent: 87074:c5751f01b09b user: Giampaolo Rodola' date: Tue Nov 12 23:09:01 2013 +0100 summary: merge files: Lib/socket.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/socket.py b/Lib/socket.py --- a/Lib/socket.py +++ b/Lib/socket.py @@ -136,7 +136,7 @@ address(es). """ closed = getattr(self, '_closed', False) - s = "<%s.%s%s fd=%i, family=%i, type=%i, proto=%i" \ + s = "<%s.%s%s fd=%i, family=%s, type=%s, proto=%i" \ % (self.__class__.__module__, self.__class__.__name__, " [closed]" if closed else "", -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Wed Nov 13 04:49:33 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 13 Nov 2013 04:49:33 +0100 Subject: [Python-checkins] Daily reference leaks (8af2dc11464f): sum=0 Message-ID: results for 8af2dc11464f on branch "default" -------------------------------------------- Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogU8umsn', '-x'] From python-checkins at python.org Wed Nov 13 13:10:28 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 13 Nov 2013 13:10:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_test=5Fsocket_for_repr?= =?utf-8?q?_update?= Message-ID: <3dKPm45qJxz7Lk2@mail.python.org> http://hg.python.org/cpython/rev/fa7b90829262 changeset: 87077:fa7b90829262 user: Nick Coghlan date: Wed Nov 13 22:10:16 2013 +1000 summary: Fix test_socket for repr update files: Lib/test/test_socket.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -650,8 +650,8 @@ s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) with s: self.assertIn('fd=%i' % s.fileno(), repr(s)) - self.assertIn('family=%i' % socket.AF_INET, repr(s)) - self.assertIn('type=%i' % socket.SOCK_STREAM, repr(s)) + self.assertIn('family=%s' % socket.AF_INET, repr(s)) + self.assertIn('type=%s' % socket.SOCK_STREAM, repr(s)) self.assertIn('proto=0', repr(s)) self.assertNotIn('raddr', repr(s)) s.bind(('127.0.0.1', 0)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 13:25:19 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 13 Nov 2013 13:25:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Avoid_global_side_effect_i?= =?utf-8?q?n_test=5Fensurepip?= Message-ID: <3dKQ5C5tvMz7Lk2@mail.python.org> http://hg.python.org/cpython/rev/10cbc2ca579c changeset: 87078:10cbc2ca579c user: Nick Coghlan date: Wed Nov 13 22:24:58 2013 
+1000 summary: Avoid global side effect in test_ensurepip files: Lib/test/test_ensurepip.py | 11 ++++++++--- 1 files changed, 8 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_ensurepip.py b/Lib/test/test_ensurepip.py --- a/Lib/test/test_ensurepip.py +++ b/Lib/test/test_ensurepip.py @@ -2,6 +2,8 @@ import unittest.mock import ensurepip import test.support +import os +import os.path class TestEnsurePipVersion(unittest.TestCase): @@ -17,9 +19,12 @@ self.run_pip = run_pip_patch.start() self.addCleanup(run_pip_patch.stop) - os_environ_patch = unittest.mock.patch("ensurepip.os.environ", {}) - self.os_environ = os_environ_patch.start() - self.addCleanup(os_environ_patch.stop) + # Avoid side effects on the actual os module + os_patch = unittest.mock.patch("ensurepip.os") + patched_os = os_patch.start() + self.addCleanup(os_patch.stop) + patched_os.path = os.path + self.os_environ = patched_os.environ = os.environ.copy() def test_basic_bootstrapping(self): ensurepip.bootstrap() -- Repository URL: http://hg.python.org/cpython From ncoghlan at gmail.com Wed Nov 13 13:26:07 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 13 Nov 2013 22:26:07 +1000 Subject: [Python-checkins] [Python-Dev] cpython: Provide a more readable representation of socket on repr(). In-Reply-To: References: <3dK35t3mfqz7LkL@mail.python.org> Message-ID: On 13 November 2013 08:52, Victor Stinner wrote: > Hi Giampaolo, > > You forgot to update tests after your change in repr(socket). Tests > are failing on buildbots, just one example: > > ====================================================================== > FAIL: test_repr (test.test_socket.GeneralModuleTests) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/var/lib/buildslave/3.x.murray-gentoo/build/Lib/test/test_socket.py", > line 653, in test_repr > self.assertIn('family=%i' % socket.AF_INET, repr(s)) > AssertionError: 'family=2' not found in " family=AddressFamily.AF_INET, type=SocketType.SOCK_STREAM, proto=0, > laddr=('0.0.0.0', 0)>" I fixed this to turn the buildbots green again before committing the codec error handling changes. Cheers, Nick. 
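[Editorial note: the failure quoted above comes from the socket constants having become enum members in 3.4, so the format code used in the repr and in the test matters. A rough interactive sketch, assuming a platform where AF_INET is 2:

    >>> import socket
    >>> socket.AF_INET                    # now an IntEnum member
    <AddressFamily.AF_INET: 2>
    >>> 'family=%i' % socket.AF_INET      # what the old repr and test produced
    'family=2'
    >>> 'family=%s' % socket.AF_INET      # what the updated repr and test use
    'family=AddressFamily.AF_INET'
]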
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From python-checkins at python.org Wed Nov 13 13:33:28 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 13 Nov 2013 13:33:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_PyIm?= =?utf-8?q?port=5FImportModuleLevelObject=28=29=2C_handle?= Message-ID: <3dKQGc10bvz7Lkd@mail.python.org> http://hg.python.org/cpython/rev/8e40d07d3cd2 changeset: 87079:8e40d07d3cd2 user: Victor Stinner date: Wed Nov 13 12:11:36 2013 +0100 summary: Issue #19437: Fix PyImport_ImportModuleLevelObject(), handle PyUnicode_Substring() failure (ex: MemoryError) files: Python/import.c | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Python/import.c b/Python/import.c --- a/Python/import.c +++ b/Python/import.c @@ -1364,7 +1364,11 @@ goto error; } } + base = PyUnicode_Substring(package, 0, last_dot); + if (base == NULL) + goto error; + if (PyUnicode_GET_LENGTH(name) > 0) { PyObject *borrowed_dot, *seq = NULL; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 13:33:29 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 13 Nov 2013 13:33:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_PyCD?= =?utf-8?q?ata=5FGetContainer=28=29_of_ctypes=2C_handle_PyDict=5FNew=28=29?= =?utf-8?q?_failure?= Message-ID: <3dKQGd2rQjz7LlX@mail.python.org> http://hg.python.org/cpython/rev/a217ea1671a8 changeset: 87080:a217ea1671a8 user: Victor Stinner date: Wed Nov 13 13:23:35 2013 +0100 summary: Issue #19437: Fix PyCData_GetContainer() of ctypes, handle PyDict_New() failure files: Modules/_ctypes/_ctypes.c | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -2331,6 +2331,8 @@ if (self->b_objects == NULL) { if (self->b_length) { self->b_objects = PyDict_New(); + if (self->b_objects == NULL) + return NULL; } else { Py_INCREF(Py_None); self->b_objects = Py_None; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 13:33:30 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 13 Nov 2013 13:33:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_GetK?= =?utf-8?q?eepedObjects=28=29_of_ctypes=2C_handle_PyCData=5FGetContainer?= =?utf-8?b?KCk=?= Message-ID: <3dKQGf5RT9z7Lml@mail.python.org> http://hg.python.org/cpython/rev/dc5ae99bc605 changeset: 87081:dc5ae99bc605 user: Victor Stinner date: Wed Nov 13 13:24:50 2013 +0100 summary: Issue #19437: Fix GetKeepedObjects() of ctypes, handle PyCData_GetContainer() failure files: Modules/_ctypes/_ctypes.c | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -2344,7 +2344,11 @@ static PyObject * GetKeepedObjects(CDataObject *target) { - return PyCData_GetContainer(target)->b_objects; + CDataObject *container; + container = PyCData_GetContainer(target); + if (container == NULL) + return NULL; + return container->b_objects; } static PyObject * -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 13:33:32 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 13 Nov 2013 13:33:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_ctyp?= 
=?utf-8?q?es=2C_handle_PyCData=5FGetContainer=28=29_and_GetKeepedObjects?= =?utf-8?b?KCk=?= Message-ID: <3dKQGh1GZgz7Lwm@mail.python.org> http://hg.python.org/cpython/rev/13203ea0ac5b changeset: 87082:13203ea0ac5b user: Victor Stinner date: Wed Nov 13 13:29:37 2013 +0100 summary: Issue #19437: Fix ctypes, handle PyCData_GetContainer() and GetKeepedObjects() failures files: Modules/_ctypes/_ctypes.c | 20 +++++++++++++++++++- 1 files changed, 19 insertions(+), 1 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -2405,6 +2405,10 @@ return 0; } ob = PyCData_GetContainer(target); + if (ob == NULL) { + Py_DECREF(keep); + return -1; + } if (ob->b_objects == NULL || !PyDict_CheckExact(ob->b_objects)) { Py_XDECREF(ob->b_objects); ob->b_objects = keep; /* refcount consumed */ @@ -2791,6 +2795,9 @@ /* XXX */; value = GetKeepedObjects(src); + if (value == NULL) + return NULL; + Py_INCREF(value); return value; } @@ -2814,6 +2821,9 @@ *(void **)ptr = src->b_ptr; keep = GetKeepedObjects(src); + if (keep == NULL) + return NULL; + /* We are assigning an array object to a field which represents a pointer. This has the same effect as converting an array @@ -4810,6 +4820,9 @@ return -1; keep = GetKeepedObjects(dst); + if (keep == NULL) + return -1; + Py_INCREF(keep); return KeepRef(self, 0, keep); } @@ -5216,9 +5229,14 @@ */ if (CDataObject_Check(src)) { CDataObject *obj = (CDataObject *)src; + CDataObject *container; + /* PyCData_GetContainer will initialize src.b_objects, we need this so it can be shared */ - PyCData_GetContainer(obj); + container = PyCData_GetContainer(obj); + if (container == NULL) + goto failed; + /* But we need a dictionary! */ if (obj->b_objects == Py_None) { Py_DECREF(Py_None); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 14:20:06 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 13 Nov 2013 14:20:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Don=27t_use_deprecated_fun?= =?utf-8?q?ction_PyUnicode=5FGET=5FSIZE=28=29?= Message-ID: <3dKRJQ5nJWz7Lk8@mail.python.org> http://hg.python.org/cpython/rev/28f71af02b69 changeset: 87083:28f71af02b69 user: Victor Stinner date: Wed Nov 13 14:17:30 2013 +0100 summary: Don't use deprecated function PyUnicode_GET_SIZE() Replace it with PyUnicode_GET_LENGTH() or PyUnicode_AsUnicodeAndSize() files: Modules/_elementtree.c | 2 +- Modules/posixmodule.c | 13 ++++++------- Objects/namespaceobject.c | 2 +- 3 files changed, 8 insertions(+), 9 deletions(-) diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -3461,7 +3461,7 @@ if (PyUnicode_CheckExact(buffer)) { /* A unicode object is encoded into bytes using UTF-8 */ - if (PyUnicode_GET_SIZE(buffer) == 0) { + if (PyUnicode_GET_LENGTH(buffer) == 0) { Py_DECREF(buffer); break; } diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -829,19 +829,18 @@ if (unicode) { #ifdef MS_WINDOWS wchar_t *wide; - length = PyUnicode_GET_SIZE(unicode); + + wide = PyUnicode_AsUnicodeAndSize(unicode, &length); + if (!wide) { + Py_DECREF(unicode); + return 0; + } if (length > 32767) { FORMAT_EXCEPTION(PyExc_ValueError, "%s too long for Windows"); Py_DECREF(unicode); return 0; } - wide = PyUnicode_AsUnicode(unicode); - if (!wide) { - Py_DECREF(unicode); - return 0; - } - path->wide = wide; path->narrow = NULL; path->length 
= length; diff --git a/Objects/namespaceobject.c b/Objects/namespaceobject.c --- a/Objects/namespaceobject.c +++ b/Objects/namespaceobject.c @@ -101,7 +101,7 @@ goto error; while ((key = PyIter_Next(keys_iter)) != NULL) { - if (PyUnicode_Check(key) && PyUnicode_GET_SIZE(key) > 0) { + if (PyUnicode_Check(key) && PyUnicode_GET_LENGTH(key) > 0) { PyObject *value, *item; value = PyDict_GetItem(d, key); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 14:51:52 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 13 Nov 2013 14:51:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2317828=3A_better_h?= =?utf-8?q?andling_of_codec_errors?= Message-ID: <3dKS143jbKz7Lk8@mail.python.org> http://hg.python.org/cpython/rev/854a2cea31b9 changeset: 87084:854a2cea31b9 user: Nick Coghlan date: Wed Nov 13 23:49:21 2013 +1000 summary: Close #17828: better handling of codec errors - output type errors now redirect users to the type-neutral convenience functions in the codecs module - stateless errors that occur during encoding and decoding will now be automatically wrapped in exceptions that give the name of the codec involved files: Doc/whatsnew/3.4.rst | 78 +++++++++- Include/pyerrors.h | 22 +++ Lib/test/test_codecs.py | 193 ++++++++++++++++++++++++--- Misc/NEWS | 9 + Objects/exceptions.c | 113 ++++++++++++++++ Objects/unicodeobject.c | 27 ++- Python/codecs.c | 18 ++ 7 files changed, 414 insertions(+), 46 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -102,6 +102,7 @@ * :ref:`PEP 446: Make newly created file descriptors non-inheritable `. * command line option for :ref:`isolated mode `, (:issue:`16499`). +* improvements to handling of non-Unicode codecs Significantly Improved Library Modules: @@ -170,6 +171,70 @@ PEP written and implemented by Victor Stinner. +Improvements to handling of non-Unicode codecs +============================================== + +Since it was first introduced, the :mod:`codecs` module has always been +intended to operate as a type-neutral dynamic encoding and decoding +system. However, its close coupling with the Python text model, especially +the type restricted convenience methods on the builtin :class:`str`, +:class:`bytes` and :class:`bytearray` types, has historically obscured that +fact. + +As a key step in clarifying the situation, the :meth:`codecs.encode` and +:meth:`codecs.decode` convenience functions are now properly documented in +Python 2.7, 3.3 and 3.4. These functions have existed in the :mod:`codecs` +module and have been covered by the regression test suite since Python 2.4, +but were previously only discoverable through runtime introspection. + +Unlike the convenience methods on :class:`str`, :class:`bytes` and +:class:`bytearray`, these convenience functions support arbitrary codecs +in both Python 2 and Python 3, rather than being limited to Unicode text +encodings (in Python 3) or ``basestring`` <-> ``basestring`` conversions +(in Python 2). 
+ +In Python 3.4, the errors raised by the convenience methods when a codec +produces the incorrect output type have also been updated to direct users +towards these general purpose convenience functions:: + + >>> import codecs + + >>> codecs.encode(b"hello", "bz2_codec").decode("bz2_codec") + Traceback (most recent call last): + File "", line 1, in + TypeError: 'bz2_codec' decoder returned 'bytes' instead of 'str'; use codecs.decode() to decode to arbitrary types + + >>> "hello".encode("rot_13") + Traceback (most recent call last): + File "", line 1, in + TypeError: 'rot_13' encoder returned 'str' instead of 'bytes'; use codecs.encode() to encode to arbitrary types + +In a related change, whenever it is feasible without breaking backwards +compatibility, exceptions raised during encoding and decoding operations +will be wrapped in a chained exception of the same type that mentions the +name of the codec responsible for producing the error:: + + >>> b"hello".decode("uu_codec") + ValueError: Missing "begin" line in input data + + The above exception was the direct cause of the following exception: + + Traceback (most recent call last): + File "", line 1, in + ValueError: decoding with 'uu_codec' codec failed (ValueError: Missing "begin" line in input data) + + >>> "hello".encode("bz2_codec") + TypeError: 'str' does not support the buffer interface + + The above exception was the direct cause of the following exception: + + Traceback (most recent call last): + File "", line 1, in + TypeError: encoding with 'bz2_codec' codec failed (TypeError: 'str' does not support the buffer interface) + +(Contributed by Nick Coghlan in :issue:`17827` and :issue:`17828`) + + Other Language Changes ====================== @@ -262,19 +327,6 @@ Added support for 24-bit samples (:issue:`12866`). -codecs ------- - -The :meth:`codecs.encode` and :meth:`codecs.decode` convenience functions are -now properly documented. These functions have existed in the :mod:`codecs` -module since ~2004, but were previously only discoverable through runtime -introspection. - -Unlike the convenience methods on :class:`str`, :class:`bytes` and -:class:`bytearray`, these convenience functions support arbitrary codecs, -rather than being limited to Unicode text encodings. - - colorsys -------- diff --git a/Include/pyerrors.h b/Include/pyerrors.h --- a/Include/pyerrors.h +++ b/Include/pyerrors.h @@ -285,6 +285,28 @@ const char *name, const char *doc, PyObject *base, PyObject *dict); PyAPI_FUNC(void) PyErr_WriteUnraisable(PyObject *); +/* In exceptions.c */ +#ifndef Py_LIMITED_API +/* Helper that attempts to replace the current exception with one of the + * same type but with a prefix added to the exception text. The resulting + * exception description looks like: + * + * prefix (exc_type: original_exc_str) + * + * Only some exceptions can be safely replaced. If the function determines + * it isn't safe to perform the replacement, it will leave the original + * unmodified exception in place. + * + * Returns a borrowed reference to the new exception (if any), NULL if the + * existing exception was left in place. + */ +PyAPI_FUNC(PyObject *) _PyErr_TrySetFromCause( + const char *prefix_format, /* ASCII-encoded string */ + ... 
+ ); +#endif + + /* In sigcheck.c or signalmodule.c */ PyAPI_FUNC(int) PyErr_CheckSignals(void); PyAPI_FUNC(void) PyErr_SetInterrupt(void); diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -1,5 +1,6 @@ import _testcapi import codecs +import contextlib import io import locale import sys @@ -2292,28 +2293,31 @@ def test_basics(self): binput = bytes(range(256)) for encoding in bytes_transform_encodings: - # generic codecs interface - (o, size) = codecs.getencoder(encoding)(binput) - self.assertEqual(size, len(binput)) - (i, size) = codecs.getdecoder(encoding)(o) - self.assertEqual(size, len(o)) - self.assertEqual(i, binput) + with self.subTest(encoding=encoding): + # generic codecs interface + (o, size) = codecs.getencoder(encoding)(binput) + self.assertEqual(size, len(binput)) + (i, size) = codecs.getdecoder(encoding)(o) + self.assertEqual(size, len(o)) + self.assertEqual(i, binput) def test_read(self): for encoding in bytes_transform_encodings: - sin = codecs.encode(b"\x80", encoding) - reader = codecs.getreader(encoding)(io.BytesIO(sin)) - sout = reader.read() - self.assertEqual(sout, b"\x80") + with self.subTest(encoding=encoding): + sin = codecs.encode(b"\x80", encoding) + reader = codecs.getreader(encoding)(io.BytesIO(sin)) + sout = reader.read() + self.assertEqual(sout, b"\x80") def test_readline(self): for encoding in bytes_transform_encodings: if encoding in ['uu_codec', 'zlib_codec']: continue - sin = codecs.encode(b"\x80", encoding) - reader = codecs.getreader(encoding)(io.BytesIO(sin)) - sout = reader.readline() - self.assertEqual(sout, b"\x80") + with self.subTest(encoding=encoding): + sin = codecs.encode(b"\x80", encoding) + reader = codecs.getreader(encoding)(io.BytesIO(sin)) + sout = reader.readline() + self.assertEqual(sout, b"\x80") def test_buffer_api_usage(self): # We check all the transform codecs accept memoryview input @@ -2321,17 +2325,158 @@ # and also that they roundtrip correctly original = b"12345\x80" for encoding in bytes_transform_encodings: - data = original - view = memoryview(data) - data = codecs.encode(data, encoding) - view_encoded = codecs.encode(view, encoding) - self.assertEqual(view_encoded, data) - view = memoryview(data) - data = codecs.decode(data, encoding) - self.assertEqual(data, original) - view_decoded = codecs.decode(view, encoding) - self.assertEqual(view_decoded, data) + with self.subTest(encoding=encoding): + data = original + view = memoryview(data) + data = codecs.encode(data, encoding) + view_encoded = codecs.encode(view, encoding) + self.assertEqual(view_encoded, data) + view = memoryview(data) + data = codecs.decode(data, encoding) + self.assertEqual(data, original) + view_decoded = codecs.decode(view, encoding) + self.assertEqual(view_decoded, data) + def test_type_error_for_text_input(self): + # Check binary -> binary codecs give a good error for str input + bad_input = "bad input type" + for encoding in bytes_transform_encodings: + with self.subTest(encoding=encoding): + msg = "^encoding with '{}' codec failed".format(encoding) + with self.assertRaisesRegex(TypeError, msg) as failure: + bad_input.encode(encoding) + self.assertTrue(isinstance(failure.exception.__cause__, + TypeError)) + + def test_type_error_for_binary_input(self): + # Check str -> str codec gives a good error for binary input + for bad_input in (b"immutable", bytearray(b"mutable")): + with self.subTest(bad_input=bad_input): + msg = "^decoding with 'rot_13' codec failed" + with 
self.assertRaisesRegex(AttributeError, msg) as failure: + bad_input.decode("rot_13") + self.assertTrue(isinstance(failure.exception.__cause__, + AttributeError)) + + def test_bad_decoding_output_type(self): + # Check bytes.decode and bytearray.decode give a good error + # message for binary -> binary codecs + data = b"encode first to ensure we meet any format restrictions" + for encoding in bytes_transform_encodings: + with self.subTest(encoding=encoding): + encoded_data = codecs.encode(data, encoding) + fmt = ("'{}' decoder returned 'bytes' instead of 'str'; " + "use codecs.decode\(\) to decode to arbitrary types") + msg = fmt.format(encoding) + with self.assertRaisesRegex(TypeError, msg): + encoded_data.decode(encoding) + with self.assertRaisesRegex(TypeError, msg): + bytearray(encoded_data).decode(encoding) + + def test_bad_encoding_output_type(self): + # Check str.encode gives a good error message for str -> str codecs + msg = ("'rot_13' encoder returned 'str' instead of 'bytes'; " + "use codecs.encode\(\) to encode to arbitrary types") + with self.assertRaisesRegex(TypeError, msg): + "just an example message".encode("rot_13") + + +# The codec system tries to wrap exceptions in order to ensure the error +# mentions the operation being performed and the codec involved. We +# currently *only* want this to happen for relatively stateless +# exceptions, where the only significant information they contain is their +# type and a single str argument. +class ExceptionChainingTest(unittest.TestCase): + + def setUp(self): + # There's no way to unregister a codec search function, so we just + # ensure we render this one fairly harmless after the test + # case finishes by using the test case repr as the codec name + # The codecs module normalizes codec names, although this doesn't + # appear to be formally documented... 
+ self.codec_name = repr(self).lower().replace(" ", "-") + self.codec_info = None + codecs.register(self.get_codec) + + def get_codec(self, codec_name): + if codec_name != self.codec_name: + return None + return self.codec_info + + def set_codec(self, obj_to_raise): + def raise_obj(*args, **kwds): + raise obj_to_raise + self.codec_info = codecs.CodecInfo(raise_obj, raise_obj, + name=self.codec_name) + + @contextlib.contextmanager + def assertWrapped(self, operation, exc_type, msg): + full_msg = "{} with '{}' codec failed \({}: {}\)".format( + operation, self.codec_name, exc_type.__name__, msg) + with self.assertRaisesRegex(exc_type, full_msg) as caught: + yield caught + + def check_wrapped(self, obj_to_raise, msg): + self.set_codec(obj_to_raise) + with self.assertWrapped("encoding", RuntimeError, msg): + "str_input".encode(self.codec_name) + with self.assertWrapped("encoding", RuntimeError, msg): + codecs.encode("str_input", self.codec_name) + with self.assertWrapped("decoding", RuntimeError, msg): + b"bytes input".decode(self.codec_name) + with self.assertWrapped("decoding", RuntimeError, msg): + codecs.decode(b"bytes input", self.codec_name) + + def test_raise_by_type(self): + self.check_wrapped(RuntimeError, "") + + def test_raise_by_value(self): + msg = "This should be wrapped" + self.check_wrapped(RuntimeError(msg), msg) + + @contextlib.contextmanager + def assertNotWrapped(self, operation, exc_type, msg): + with self.assertRaisesRegex(exc_type, msg) as caught: + yield caught + actual_msg = str(caught.exception) + self.assertNotIn(operation, actual_msg) + self.assertNotIn(self.codec_name, actual_msg) + + def check_not_wrapped(self, obj_to_raise, msg): + self.set_codec(obj_to_raise) + with self.assertNotWrapped("encoding", RuntimeError, msg): + "str input".encode(self.codec_name) + with self.assertNotWrapped("encoding", RuntimeError, msg): + codecs.encode("str input", self.codec_name) + with self.assertNotWrapped("decoding", RuntimeError, msg): + b"bytes input".decode(self.codec_name) + with self.assertNotWrapped("decoding", RuntimeError, msg): + codecs.decode(b"bytes input", self.codec_name) + + def test_init_override_is_not_wrapped(self): + class CustomInit(RuntimeError): + def __init__(self): + pass + self.check_not_wrapped(CustomInit, "") + + def test_new_override_is_not_wrapped(self): + class CustomNew(RuntimeError): + def __new__(cls): + return super().__new__(cls) + self.check_not_wrapped(CustomNew, "") + + def test_instance_attribute_is_not_wrapped(self): + msg = "This should NOT be wrapped" + exc = RuntimeError(msg) + exc.attr = 1 + self.check_not_wrapped(exc, msg) + + def test_non_str_arg_is_not_wrapped(self): + self.check_not_wrapped(RuntimeError(1), "1") + + def test_multiple_args_is_not_wrapped(self): + msg = "\('a', 'b', 'c'\)" + self.check_not_wrapped(RuntimeError('a', 'b', 'c'), msg) @unittest.skipUnless(sys.platform == 'win32', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,15 @@ Core and Builtins ----------------- +- Issue #17828: Output type errors in str.encode(), bytes.decode() and + bytearray.decode() now direct users to codecs.encode() or codecs.decode() + as appropriate. + +- Issue #17828: The interpreter now attempts to chain errors that occur in + codec processing with a replacement exception of the same type that + includes the codec name in the error message. It ensures it only does this + when the creation of the replacement exception won't lose any information. 
+ - Issue #19466: Clear the frames of daemon threads earlier during the Python shutdown to call objects destructors. So "unclosed file" resource warnings are now corretly emitted for daemon threads. diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2591,3 +2591,116 @@ free_preallocated_memerrors(); Py_CLEAR(errnomap); } + +/* Helper to do the equivalent of "raise X from Y" in C, but always using + * the current exception rather than passing one in. + * + * We currently limit this to *only* exceptions that use the BaseException + * tp_init and tp_new methods, since we can be reasonably sure we can wrap + * those correctly without losing data and without losing backwards + * compatibility. + * + * We also aim to rule out *all* exceptions that might be storing additional + * state, whether by having a size difference relative to BaseException, + * additional arguments passed in during construction or by having a + * non-empty instance dict. + * + * We need to be very careful with what we wrap, since changing types to + * a broader exception type would be backwards incompatible for + * existing codecs, and with different init or new method implementations + * may either not support instantiation with PyErr_Format or lose + * information when instantiated that way. + * + * XXX (ncoghlan): This could be made more comprehensive by exploiting the + * fact that exceptions are expected to support pickling. If more builtin + * exceptions (e.g. AttributeError) start to be converted to rich + * exceptions with additional attributes, that's probably a better approach + * to pursue over adding special cases for particular stateful subclasses. + * + * Returns a borrowed reference to the new exception (if any), NULL if the + * existing exception was left in place. + */ +PyObject * +_PyErr_TrySetFromCause(const char *format, ...) +{ + PyObject* msg_prefix; + PyObject *exc, *val, *tb; + PyTypeObject *caught_type; + PyObject *instance_dict; + PyObject *instance_args; + Py_ssize_t num_args; + PyObject *new_exc, *new_val, *new_tb; + va_list vargs; + +#ifdef HAVE_STDARG_PROTOTYPES + va_start(vargs, format); +#else + va_start(vargs); +#endif + + PyErr_Fetch(&exc, &val, &tb); + caught_type = (PyTypeObject *) exc; + /* Ensure type info indicates no extra state is stored at the C level */ + if (caught_type->tp_init != (initproc) BaseException_init || + caught_type->tp_new != BaseException_new || + caught_type->tp_basicsize != _PyExc_BaseException.tp_basicsize || + caught_type->tp_itemsize != _PyExc_BaseException.tp_itemsize + ) { + /* We can't be sure we can wrap this safely, since it may contain + * more state than just the exception type. Accordingly, we just + * leave it alone. 
+ */ + PyErr_Restore(exc, val, tb); + return NULL; + } + + /* Check the args are empty or contain a single string */ + PyErr_NormalizeException(&exc, &val, &tb); + instance_args = ((PyBaseExceptionObject *) val)->args; + num_args = PyTuple_GET_SIZE(instance_args); + if ((num_args > 1) || + (num_args == 1 && + !PyUnicode_CheckExact(PyTuple_GET_ITEM(instance_args, 0)) + ) + ) { + /* More than 1 arg, or the one arg we do have isn't a string + */ + PyErr_Restore(exc, val, tb); + return NULL; + } + + /* Ensure the instance dict is also empty */ + instance_dict = *_PyObject_GetDictPtr(val); + if (instance_dict != NULL && PyObject_Length(instance_dict) > 0) { + /* While we could potentially copy a non-empty instance dictionary + * to the replacement exception, for now we take the more + * conservative path of leaving exceptions with attributes set + * alone. + */ + PyErr_Restore(exc, val, tb); + return NULL; + } + + /* For exceptions that we can wrap safely, we chain the original + * exception to a new one of the exact same type with an + * error message that mentions the additional details and the + * original exception. + * + * It would be nice to wrap OSError and various other exception + * types as well, but that's quite a bit trickier due to the extra + * state potentially stored on OSError instances. + */ + msg_prefix = PyUnicode_FromFormatV(format, vargs); + if (msg_prefix == NULL) + return NULL; + + PyErr_Format(exc, "%U (%s: %S)", + msg_prefix, Py_TYPE(val)->tp_name, val); + Py_DECREF(exc); + Py_XDECREF(tb); + PyErr_Fetch(&new_exc, &new_val, &new_tb); + PyErr_NormalizeException(&new_exc, &new_val, &new_tb); + PyException_SetCause(new_val, val); + PyErr_Restore(new_exc, new_val, new_tb); + return new_val; +} diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -3054,8 +3054,10 @@ goto onError; if (!PyUnicode_Check(unicode)) { PyErr_Format(PyExc_TypeError, - "decoder did not return a str object (type=%.400s)", - Py_TYPE(unicode)->tp_name); + "'%.400s' decoder returned '%.400s' instead of 'str'; " + "use codecs.decode() to decode to arbitrary types", + encoding, + Py_TYPE(unicode)->tp_name, Py_TYPE(unicode)->tp_name); Py_DECREF(unicode); goto onError; } @@ -3113,8 +3115,10 @@ goto onError; if (!PyUnicode_Check(v)) { PyErr_Format(PyExc_TypeError, - "decoder did not return a str object (type=%.400s)", - Py_TYPE(v)->tp_name); + "'%.400s' decoder returned '%.400s' instead of 'str'; " + "use codecs.decode() to decode to arbitrary types", + encoding, + Py_TYPE(unicode)->tp_name, Py_TYPE(unicode)->tp_name); Py_DECREF(v); goto onError; } @@ -3425,7 +3429,8 @@ PyObject *b; error = PyErr_WarnFormat(PyExc_RuntimeWarning, 1, - "encoder %s returned bytearray instead of bytes", + "encoder %s returned bytearray instead of bytes; " + "use codecs.encode() to encode to arbitrary types", encoding); if (error) { Py_DECREF(v); @@ -3438,8 +3443,10 @@ } PyErr_Format(PyExc_TypeError, - "encoder did not return a bytes object (type=%.400s)", - Py_TYPE(v)->tp_name); + "'%.400s' encoder returned '%.400s' instead of 'bytes'; " + "use codecs.encode() to encode to arbitrary types", + encoding, + Py_TYPE(v)->tp_name, Py_TYPE(v)->tp_name); Py_DECREF(v); return NULL; } @@ -3465,8 +3472,10 @@ goto onError; if (!PyUnicode_Check(v)) { PyErr_Format(PyExc_TypeError, - "encoder did not return an str object (type=%.400s)", - Py_TYPE(v)->tp_name); + "'%.400s' encoder returned '%.400s' instead of 'str'; " + "use codecs.encode() to encode to arbitrary types", + 
encoding, + Py_TYPE(v)->tp_name, Py_TYPE(v)->tp_name); Py_DECREF(v); goto onError; } diff --git a/Python/codecs.c b/Python/codecs.c --- a/Python/codecs.c +++ b/Python/codecs.c @@ -332,6 +332,22 @@ return codec_getstreamcodec(encoding, stream, errors, 3); } +/* Helper that tries to ensure the reported exception chain indicates the + * codec that was invoked to trigger the failure without changing the type + * of the exception raised. + */ +static void +wrap_codec_error(const char *operation, + const char *encoding) +{ + /* TrySetFromCause will replace the active exception with a suitably + * updated clone if it can, otherwise it will leave the original + * exception alone. + */ + _PyErr_TrySetFromCause("%s with '%s' codec failed", + operation, encoding); +} + /* Encode an object (e.g. an Unicode object) using the given encoding and return the resulting encoded object (usually a Python string). @@ -376,6 +392,7 @@ Py_XDECREF(result); Py_XDECREF(args); Py_XDECREF(encoder); + wrap_codec_error("encoding", encoding); return NULL; } @@ -422,6 +439,7 @@ Py_XDECREF(args); Py_XDECREF(decoder); Py_XDECREF(result); + wrap_codec_error("decoding", encoding); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 15:25:24 2013 From: python-checkins at python.org (nick.coghlan) Date: Wed, 13 Nov 2013 15:25:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317839=3A_mention_?= =?utf-8?q?base64_change_in_What=27s_New?= Message-ID: <3dKSlm0pS8z7Lrb@mail.python.org> http://hg.python.org/cpython/rev/e53984133740 changeset: 87085:e53984133740 user: Nick Coghlan date: Thu Nov 14 00:24:31 2013 +1000 summary: Issue #17839: mention base64 change in What's New files: Doc/whatsnew/3.4.rst | 8 ++++++++ 1 files changed, 8 insertions(+), 0 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -327,6 +327,14 @@ Added support for 24-bit samples (:issue:`12866`). +base64 +------ + +The encoding and decoding functions in :mod:`base64` now accept any +:term:`bytes-like object` in cases where it previously required a +:class:`bytes` or :class:`bytearray` instance (:issue`17839`) + + colorsys -------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 20:09:08 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 13 Nov 2013 20:09:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Temporary_fix_b?= =?utf-8?q?y_Victor_Stinner_for_issue_19566=2E?= Message-ID: <3dKb384cPjz7M59@mail.python.org> http://hg.python.org/cpython/rev/8e0eeb4cc8fa changeset: 87086:8e0eeb4cc8fa user: Guido van Rossum date: Wed Nov 13 11:08:34 2013 -0800 summary: asyncio: Temporary fix by Victor Stinner for issue 19566. files: Lib/asyncio/unix_events.py | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -593,11 +593,12 @@ (O(1) each time a child terminates). """ def __init__(self, loop): - super().__init__(loop) - self._lock = threading.Lock() self._zombies = {} self._forks = 0 + # Call base class constructor last because it calls back into + # the subclass (set_loop() calls _do_waitpid()). 
+ super().__init__(loop) def close(self): super().close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 23:25:58 2013 From: python-checkins at python.org (ethan.furman) Date: Wed, 13 Nov 2013 23:25:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogcmVtb3ZlZCBFbnVtLl9fZXFf?= =?utf-8?q?=5F_as_it_added_nothing?= Message-ID: <3dKgQG72KRz7M1q@mail.python.org> http://hg.python.org/cpython/rev/ca909a3728d3 changeset: 87087:ca909a3728d3 user: Ethan Furman date: Wed Nov 13 14:25:45 2013 -0800 summary: removed Enum.__eq__ as it added nothing files: Lib/enum.py | 5 ----- Lib/test/test_enum.py | 11 +++++++++++ 2 files changed, 11 insertions(+), 5 deletions(-) diff --git a/Lib/enum.py b/Lib/enum.py --- a/Lib/enum.py +++ b/Lib/enum.py @@ -447,11 +447,6 @@ return (['__class__', '__doc__', '__module__', 'name', 'value'] + added_behavior) - def __eq__(self, other): - if type(other) is self.__class__: - return self is other - return NotImplemented - def __format__(self, format_spec): # mixed-in Enums should use the mixed-in type's __format__, otherwise # we can get strange results with the Enum name showing up instead of diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1030,6 +1030,15 @@ self.assertEqual(list(Color), [Color.red, Color.green, Color.blue]) self.assertEqual(list(map(int, Color)), [1, 2, 3]) + def test_equality(self): + class AlwaysEqual: + def __eq__(self, other): + return True + class OrdinaryEnum(Enum): + a = 1 + self.assertEqual(AlwaysEqual(), OrdinaryEnum.a) + self.assertEqual(OrdinaryEnum.a, AlwaysEqual()) + def test_ordered_mixin(self): class OrderedEnum(Enum): def __ge__(self, other): @@ -1058,6 +1067,8 @@ self.assertLessEqual(Grade.F, Grade.C) self.assertLess(Grade.D, Grade.A) self.assertGreaterEqual(Grade.B, Grade.B) + self.assertEqual(Grade.B, Grade.B) + self.assertNotEqual(Grade.C, Grade.D) def test_extending2(self): class Shade(Enum): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 13 23:33:02 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 13 Nov 2013 23:33:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_456=3A_add_some_of_the_ne?= =?utf-8?q?w_implementation_details_to_the_PEP=27s_text?= Message-ID: <3dKgZQ31fjz7Lqj@mail.python.org> http://hg.python.org/peps/rev/610f7b296939 changeset: 5267:610f7b296939 user: Christian Heimes date: Wed Nov 13 23:32:54 2013 +0100 summary: PEP 456: add some of the new implementation details to the PEP's text files: pep-0456.txt | 110 ++++++++++++++++++++++++++++++++------ 1 files changed, 93 insertions(+), 17 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -222,6 +222,7 @@ can no longer be considered as secure algorithm. It still may be an alternative is hash collision attacks are of no concern. + CityHash -------- @@ -246,6 +247,12 @@ weakness in CityHash. +DJBX33A +------- + +TODO + + HMAC, MD5, SHA-1, SHA-2 ----------------------- @@ -268,6 +275,45 @@ other prominent projects have came to the same conclusion. +Small string optimization +========================= + +Hash functions like SipHash24 have a costly initialization and finalization +code that can dominate speed of the algorithm for very short strings. On the +other hand Python calculates the hash value of short strings quite often. 
A +simple and fast function for especially for hashing of small strings can make +a measurably impact on performance. For example these measurements were taken +during a run of Python's regression tests. Additional measurements of other +code have shown a similar distribution. + +===== ============ ======= +bytes hash() calls portion +===== ============ ======= +1 18709 0.2% +2 737480 9.5% +3 636178 17.6% +4 1518313 36.7% +5 643022 44.9% +6 770478 54.6% +7 525150 61.2% +8 304873 65.1% +9 297272 68.8% +10 68191 69.7% +11 1388484 87.2% +12 480786 93.3% +13 52730 93.9% +14 65309 94.8% +15 44245 95.3% +16 85643 96.4% +Total 7921678 +===== ============ ======= + +However a fast function like DJBX33A is not as secure as SipHash24. A cutoff +at about 5 to 7 bytes should provide a decent safety margin and speed up at +the same time. The PEP's reference implementation provides such a cutoff with +``Py_HASH_CUTOFF`` but disables the optimization by default. + + C API additions =============== @@ -277,28 +323,57 @@ ----------- The ``_Py_HashSecret_t`` type of Python 2.6 to 3.3 has two members with either -32- or 64-bit length each. SipHash requires two 64-bit unsigned integers as keys. -The typedef will be changed to an union with a guaranteed size of 128 bits on -all architectures. On platforms with a 64-bit data type it will have two -``uint64`` members. Because C89 compatible compilers may not have ``uint64`` -the union also has an array of 16 chars. +32- or 64-bit length each. SipHash requires two 64-bit unsigned integers as +keys. The typedef will be changed to an union with a guaranteed size of 24 +bytes on all architectures. The union provides a 128 bit random key for +SipHash24 and FNV as well as an additional value of 64 bit for the optional +small string optimization and pyexpat seed. The additional 64 bit seed ensures +that pyexpat or small string optimization cannot reveal bits of the SipHash24 +seed. + +memory layout on 64 bit systems:: + + cccccccc cccccccc cccccccc uc -- unsigned char[24] + pppppppp ssssssss ........ fnv -- two Py_hash_t + k0k0k0k0 k1k1k1k1 ........ siphash -- two PY_UINT64_T + ........ ........ ssssssss djbx33a -- 16 bytes padding + one Py_hash_t + ........ ........ eeeeeeee pyexpat XML hash salt + +memory layout on 32 bit systems:: + + cccccccc cccccccc cccccccc uc -- unsigned char[24] + ppppssss ........ ........ fnv -- two Py_hash_t + k0k0k0k0 k1k1k1k1 ........ siphash -- two PY_UINT64_T (if available) + ........ ........ ssss.... djbx33a -- 16 bytes padding + one Py_hash_t + ........ ........ eeee.... pyexpat XML hash salt new type definition:: typedef union { - unsigned char uc16[16]; + /* ensure 24 bytes */ + unsigned char uc[24]; + /* two Py_hash_t for FNV */ struct { Py_hash_t prefix; Py_hash_t suffix; - } ht; + } fnv; #ifdef PY_UINT64_T + /* two uint64 for SipHash24 */ struct { PY_UINT64_T k0; PY_UINT64_T k1; - } ui64; + } siphash; #endif + /* a different (!) Py_hash_t for small string optimization */ + struct { + unsigned char padding[16]; + Py_hash_t suffix; + } djbx33a; + struct { + unsigned char padding[16]; + Py_hash_t hashsalt; + } expat; } _Py_HashSecret_t; - PyAPI_DATA(_Py_HashSecret_t) _Py_HashSecret; ``_Py_HashSecret_t`` is initialized in ``Python/random.c:_PyRandom_Init()`` @@ -336,9 +411,9 @@ hash function selection ----------------------- -The value of the macro ``PY_HASH_ALGORITHM`` defines which hash algorithm is -used internally. It may be set to any of the three values ``PY_HASH_SIPHASH24``, -``PY_HASH_FNV`` or ``PY_HASH_EXTERNAL``. 
If ``PY_HASH_ALGORITHM`` is not +The value of the macro ``Py_HASH_ALGORITHM`` defines which hash algorithm is +used internally. It may be set to any of the three values ``Py_HASH_SIPHASH24``, +``Py_HASH_FNV`` or ``Py_HASH_EXTERNAL``. If ``Py_HASH_ALGORITHM`` is not defined at all, then the best available algorithm is selected. On platforms wich don't require aligned memory access (``HAVE_ALIGNED_REQUIRED`` not defined) and an unsigned 64bit integer type ``PY_UINT64_T``, SipHash24 is @@ -346,17 +421,17 @@ as SPARC, FNV is selected as fallback. A hash algorithm can be selected with an autoconf option, for example ``./configure --with-hash-algorithm=fnv``. -The value ``PY_HASH_EXTERNAL`` allows 3rd parties to provide their own +The value ``Py_HASH_EXTERNAL`` allows 3rd parties to provide their own implementation at compile time. Implementation:: - #if PY_HASH_ALGORITHM == PY_HASH_EXTERNAL + #if Py_HASH_ALGORITHM == Py_HASH_EXTERNAL extern PyHash_FuncDef PyHash_Func; - #elif PY_HASH_ALGORITHM == PY_HASH_SIPHASH24 + #elif Py_HASH_ALGORITHM == Py_HASH_SIPHASH24 static PyHash_FuncDef PyHash_Func = {siphash24, "siphash24", 64, 128}; - #elif PY_HASH_ALGORITHM == PY_HASH_FNV + #elif Py_HASH_ALGORITHM == Py_HASH_FNV static PyHash_FuncDef PyHash_Func = {fnv, "fnv", 8 * sizeof(Py_hash_t), 16 * sizeof(Py_hash_t)}; #endif @@ -381,7 +456,8 @@ # new fields: algorithm='siphash24', hash_bits=64, - seed_bits=128) + seed_bits=128, + cutoff=0) Necessary modifications to C code -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 14 00:20:21 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 00:20:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_More_compact_notation?= Message-ID: <3dKhd160sbz7Lxr@mail.python.org> http://hg.python.org/peps/rev/b6778ead0ebb changeset: 5268:b6778ead0ebb user: Christian Heimes date: Wed Nov 13 23:39:54 2013 +0100 summary: More compact notation files: pep-0456.txt | 46 ++++++++++++--------------------------- 1 files changed, 14 insertions(+), 32 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -180,9 +180,7 @@ defines the prototype of the function. (Note: ``k`` is split up into two uint64_t):: - uint64_t siphash24(const void *src, - unsigned long src_sz, - const char k[16]); + uint64_t siphash24(const void *src, unsigned long src_sz, const char k[16]) SipHash requires a 64-bit data type and is not compatible with pure C89 platforms. @@ -199,20 +197,11 @@ Murmur3's function prototypes are:: - void MurmurHash3_x86_32(const void *key, - int len, - uint32_t seed, - void *out); + void MurmurHash3_x86_32(const void *key, int len, uint32_t seed, void *out) - void MurmurHash3_x86_128(const void * key, - int len, - uint32_t seed, - void *out); + void MurmurHash3_x86_128(const void *key, int len, uint32_t seed, void *out) - void MurmurHash3_x64_128(const void *key, - int len, - uint32_t seed, - void *out); + void MurmurHash3_x64_128(const void *key, int len, uint32_t seed, void *out) The 128-bit variants requires a 64-bit data type and are not compatible with pure C89 platforms. The 32-bit variant is fully C89-compatible. 
@@ -234,10 +223,8 @@ The relevant function prototype for 64-bit CityHash with 128-bit seed is:: - uint64 CityHash64WithSeeds(const char *buf, - size_t len, - uint64 seed0, - uint64 seed1) + uint64 CityHash64WithSeeds(const char *buf, size_t len, uint64 seed0, + uint64 seed1) CityHash also offers SSE 4.2 optimizations with CRC32 intrinsic for long inputs. All variants except CityHash32 require 64-bit data types. CityHash32 @@ -253,19 +240,14 @@ TODO -HMAC, MD5, SHA-1, SHA-2 ------------------------ +Other +----- -These hash algorithms are too slow and have high setup and finalization costs. -For these reasons they are not considered fit for this purpose. - - -AES CMAC --------- - -Modern AMD and Intel CPUs have AES-NI (AES instruction set) [aes-ni]_ to speed -up AES encryption. CMAC with AES-NI might be a viable option but it's probably -too slow for daily operation. (testing required) +Crypto algorithms such as HMAC, MD5, SHA-1 or SHA-2 are too slow and have +high setup and finalization costs. For these reasons they are not considered +fit for this purpose. Modern AMD and Intel CPUs have AES-NI (AES instruction +set) [aes-ni]_ to speed up AES encryption. CMAC with AES-NI might be a viable +option but it's probably too slow for daily operation. (testing required) Conclusion @@ -473,7 +455,7 @@ The function is moved to Python/pyhash.c and modified to use the hash function through PyHash_Func.hash(). The function signature is altered to take a ``const void *`` as first argument. ``_Py_HashBytes`` also takes care of -special cases. It maps zero length input to ``0`` and return value of ``-1`` +special cases: it maps zero length input to ``0`` and return value of ``-1`` to ``-2``. bytes_hash() (Objects/bytesobject.c) -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 14 00:20:23 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 00:20:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_document_DJBX33A_and_discussi?= =?utf-8?q?on_about_non-aligned_memory?= Message-ID: <3dKhd30hFnz7Lxr@mail.python.org> http://hg.python.org/peps/rev/bd98d7a7b946 changeset: 5269:bd98d7a7b946 user: Christian Heimes date: Thu Nov 14 00:20:03 2013 +0100 summary: document DJBX33A and discussion about non-aligned memory files: pep-0456.txt | 23 ++++++++++++++++++++++- 1 files changed, 22 insertions(+), 1 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -237,7 +237,10 @@ DJBX33A ------- -TODO +DJBX33A is a very simple multiplication and addition algorithm by Daniel +J. Bernstein. It is fast and has low setup costs but it's not secure against +hash collision attacks. Its properties make it a viable choice for small +string hashing optimization. Other @@ -609,6 +612,22 @@ an unnecessary complication by several core committers [pluggable]_. Subsequent versions of the PEP aim for compile time configuration. +Non-aligned memory access +------------------------- + +The implementation of SipHash24 were critized because it ignores the issue +of non-aligned memory and therefore doesn't work on architectures that +requires alignment of integer types. The PEP deliberately neglects this +special case and doesn't support SipHash24 on such platforms. It's simply +not considered worth the trouble until proven otherwise. All major platforms +like X86, X86_64 and ARMv6+ can handle unaligned memory with minimal or even +no speed impact. [alignmentmyth]_ + +Almost every block is properly aligned anyway. 
At present bytes' and str's +data are always aligned. Only memoryviews can point to unaligned blocks +under rare circumstances. The PEP implementation is optimized and simplified +for the common case. + References ========== @@ -649,6 +668,8 @@ .. [pluggable] https://mail.python.org/pipermail/python-dev/2013-October/129138.html +.. [alignmentmyth] http://lemire.me/blog/archives/2012/05/31/data-alignment-for-speed-myth-or-reality/ + Copyright ========= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 14 00:50:16 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 14 Nov 2013 00:50:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Fix_from_Anthon?= =?utf-8?q?y_Baire_for_CPython_issue_19566_=28replaces_earlier_fix=29=2E?= Message-ID: <3dKjHX232kz7M4v@mail.python.org> http://hg.python.org/cpython/rev/eb42adc53923 changeset: 87088:eb42adc53923 user: Guido van Rossum date: Wed Nov 13 15:50:08 2013 -0800 summary: asyncio: Fix from Anthony Baire for CPython issue 19566 (replaces earlier fix). files: Lib/asyncio/unix_events.py | 69 ++++++---- Lib/test/test_asyncio/test_events.py | 4 +- Lib/test/test_asyncio/test_unix_events.py | 29 ++-- 3 files changed, 60 insertions(+), 42 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -440,10 +440,13 @@ raise NotImplementedError() - def set_loop(self, loop): - """Reattach the watcher to another event loop. + def attach_loop(self, loop): + """Attach the watcher to an event loop. - Note: loop may be None + If the watcher was previously attached to an event loop, then it is + first detached before attaching to the new loop. + + Note: loop may be None. """ raise NotImplementedError() @@ -467,15 +470,11 @@ class BaseChildWatcher(AbstractChildWatcher): - def __init__(self, loop): + def __init__(self): self._loop = None - self._callbacks = {} - - self.set_loop(loop) def close(self): - self.set_loop(None) - self._callbacks.clear() + self.attach_loop(None) def _do_waitpid(self, expected_pid): raise NotImplementedError() @@ -483,7 +482,7 @@ def _do_waitpid_all(self): raise NotImplementedError() - def set_loop(self, loop): + def attach_loop(self, loop): assert loop is None or isinstance(loop, events.AbstractEventLoop) if self._loop is not None: @@ -497,13 +496,6 @@ # during the switch. self._do_waitpid_all() - def remove_child_handler(self, pid): - try: - del self._callbacks[pid] - return True - except KeyError: - return False - def _sig_chld(self): try: self._do_waitpid_all() @@ -535,6 +527,14 @@ big number of children (O(n) each time SIGCHLD is raised) """ + def __init__(self): + super().__init__() + self._callbacks = {} + + def close(self): + self._callbacks.clear() + super().close() + def __enter__(self): return self @@ -547,6 +547,13 @@ # Prevent a race condition in case the child is already terminated. self._do_waitpid(pid) + def remove_child_handler(self, pid): + try: + del self._callbacks[pid] + return True + except KeyError: + return False + def _do_waitpid_all(self): for pid in list(self._callbacks): @@ -592,17 +599,17 @@ There is no noticeable overhead when handling a big number of children (O(1) each time a child terminates). 
""" - def __init__(self, loop): + def __init__(self): + super().__init__() + self._callbacks = {} self._lock = threading.Lock() self._zombies = {} self._forks = 0 - # Call base class constructor last because it calls back into - # the subclass (set_loop() calls _do_waitpid()). - super().__init__(loop) def close(self): + self._callbacks.clear() + self._zombies.clear() super().close() - self._zombies.clear() def __enter__(self): with self._lock: @@ -643,6 +650,13 @@ else: callback(pid, returncode, *args) + def remove_child_handler(self, pid): + try: + del self._callbacks[pid] + return True + except KeyError: + return False + def _do_waitpid_all(self): # Because of signal coalescing, we must keep calling waitpid() as # long as we're able to reap a child. @@ -687,25 +701,24 @@ def _init_watcher(self): with events._lock: if self._watcher is None: # pragma: no branch + self._watcher = SafeChildWatcher() if isinstance(threading.current_thread(), threading._MainThread): - self._watcher = SafeChildWatcher(self._local._loop) - else: - self._watcher = SafeChildWatcher(None) + self._watcher.attach_loop(self._local._loop) def set_event_loop(self, loop): """Set the event loop. As a side effect, if a child watcher was set before, then calling - .set_event_loop() from the main thread will call .set_loop(loop) on the - child watcher. + .set_event_loop() from the main thread will call .attach_loop(loop) on + the child watcher. """ super().set_event_loop(loop) if self._watcher is not None and \ isinstance(threading.current_thread(), threading._MainThread): - self._watcher.set_loop(loop) + self._watcher.attach_loop(loop) def get_child_watcher(self): """Get the child watcher diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1311,7 +1311,9 @@ class UnixEventLoopTestsMixin(EventLoopTestsMixin): def setUp(self): super().setUp() - events.set_child_watcher(unix_events.SafeChildWatcher(self.loop)) + watcher = unix_events.SafeChildWatcher() + watcher.attach_loop(self.loop) + events.set_child_watcher(watcher) def tearDown(self): events.set_child_watcher(None) diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -687,7 +687,7 @@ self.assertRaises( NotImplementedError, watcher.remove_child_handler, f) self.assertRaises( - NotImplementedError, watcher.set_loop, f) + NotImplementedError, watcher.attach_loop, f) self.assertRaises( NotImplementedError, watcher.close) self.assertRaises( @@ -700,7 +700,7 @@ def test_not_implemented(self): f = unittest.mock.Mock() - watcher = unix_events.BaseChildWatcher(None) + watcher = unix_events.BaseChildWatcher() self.assertRaises( NotImplementedError, watcher._do_waitpid, f) @@ -720,10 +720,13 @@ with unittest.mock.patch.object( self.loop, "add_signal_handler") as self.m_add_signal_handler: - self.watcher = self.create_watcher(self.loop) + self.watcher = self.create_watcher() + self.watcher.attach_loop(self.loop) - def tearDown(self): - ChildWatcherTestsMixin.instance = None + def cleanup(): + ChildWatcherTestsMixin.instance = None + + self.addCleanup(cleanup) def waitpid(pid, flags): self = ChildWatcherTestsMixin.instance @@ -1334,7 +1337,7 @@ self.loop, "add_signal_handler") as m_new_add_signal_handler: - self.watcher.set_loop(self.loop) + self.watcher.attach_loop(self.loop) 
m_old_remove_signal_handler.assert_called_once_with( signal.SIGCHLD) @@ -1375,7 +1378,7 @@ with unittest.mock.patch.object( old_loop, "remove_signal_handler") as m_remove_signal_handler: - self.watcher.set_loop(None) + self.watcher.attach_loop(None) m_remove_signal_handler.assert_called_once_with( signal.SIGCHLD) @@ -1395,7 +1398,7 @@ with unittest.mock.patch.object( self.loop, "add_signal_handler") as m_add_signal_handler: - self.watcher.set_loop(self.loop) + self.watcher.attach_loop(self.loop) m_add_signal_handler.assert_called_once_with( signal.SIGCHLD, self.watcher._sig_chld) @@ -1457,13 +1460,13 @@ class SafeChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): - def create_watcher(self, loop): - return unix_events.SafeChildWatcher(loop) + def create_watcher(self): + return unix_events.SafeChildWatcher() class FastChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): - def create_watcher(self, loop): - return unix_events.FastChildWatcher(loop) + def create_watcher(self): + return unix_events.FastChildWatcher() class PolicyTests(unittest.TestCase): @@ -1485,7 +1488,7 @@ def test_get_child_watcher_after_set(self): policy = self.create_policy() - watcher = unix_events.FastChildWatcher(None) + watcher = unix_events.FastChildWatcher() policy.set_child_watcher(watcher) self.assertIs(policy._watcher, watcher) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:39:52 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 01:39:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317828=3A_va=5Fsta?= =?utf-8?q?rt=28=29_must_be_accompanied_by_va=5Fend=28=29?= Message-ID: <3dKkNm0mwGz7LlT@mail.python.org> http://hg.python.org/cpython/rev/99ba1772c469 changeset: 87089:99ba1772c469 user: Christian Heimes date: Thu Nov 14 01:39:35 2013 +0100 summary: Issue #17828: va_start() must be accompanied by va_end() CID 1128793: Missing varargs init or cleanup (VARARGS) files: Objects/exceptions.c | 13 +++++++------ 1 files changed, 7 insertions(+), 6 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2632,12 +2632,6 @@ PyObject *new_exc, *new_val, *new_tb; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES - va_start(vargs, format); -#else - va_start(vargs); -#endif - PyErr_Fetch(&exc, &val, &tb); caught_type = (PyTypeObject *) exc; /* Ensure type info indicates no extra state is stored at the C level */ @@ -2690,7 +2684,14 @@ * types as well, but that's quite a bit trickier due to the extra * state potentially stored on OSError instances. 
*/ + +#ifdef HAVE_STDARG_PROTOTYPES + va_start(vargs, format); +#else + va_start(vargs); +#endif msg_prefix = PyUnicode_FromFormatV(format, vargs); + va_end(vargs); if (msg_prefix == NULL) return NULL; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:47:12 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 01:47:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_fold?= =?utf-8?q?=5Funaryops=5Fon=5Fconstants=28=29_of_the_peephole_optimizer=2C?= =?utf-8?q?_clear?= Message-ID: <3dKkYD0sVFz7M4c@mail.python.org> http://hg.python.org/cpython/rev/eea54395797c changeset: 87090:eea54395797c user: Victor Stinner date: Thu Nov 14 01:21:00 2013 +0100 summary: Issue #19437: Fix fold_unaryops_on_constants() of the peephole optimizer, clear the exception when PyList_Append() fails files: Python/peephole.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Python/peephole.c b/Python/peephole.c --- a/Python/peephole.c +++ b/Python/peephole.c @@ -275,6 +275,7 @@ len_consts = PyList_GET_SIZE(consts); if (PyList_Append(consts, newconst)) { Py_DECREF(newconst); + PyErr_Clear(); return 0; } Py_DECREF(newconst); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:47:13 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 01:47:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Use_an_i?= =?utf-8?q?dentifier_for_=22=5F=5Fname=5F=5F=22_string_in_pickle_to_improv?= =?utf-8?q?e?= Message-ID: <3dKkYF2dDXz7M5l@mail.python.org> http://hg.python.org/cpython/rev/7d0323556c53 changeset: 87091:7d0323556c53 user: Victor Stinner date: Thu Nov 14 01:26:17 2013 +0100 summary: Issue #19437: Use an identifier for "__name__" string in pickle to improve error handling The following code didn't handle correctly the failure of PyUnicode_InternFromString("__name__"). if (newobj_str == NULL) { newobj_str = PyUnicode_InternFromString("__newobj__"); name_str = PyUnicode_InternFromString("__name__"); if (newobj_str == NULL || name_str == NULL) return -1; } files: Modules/_pickle.c | 17 +++++------------ 1 files changed, 5 insertions(+), 12 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -136,6 +136,7 @@ /* For looking up name pairs in copyreg._extension_registry. */ static PyObject *two_tuple = NULL; +_Py_IDENTIFIER(__name__); _Py_IDENTIFIER(modules); static int @@ -2599,7 +2600,6 @@ static int save_global(PicklerObject *self, PyObject *obj, PyObject *name) { - static PyObject *name_str = NULL; PyObject *global_name = NULL; PyObject *module_name = NULL; PyObject *module = NULL; @@ -2608,18 +2608,12 @@ const char global_op = GLOBAL; - if (name_str == NULL) { - name_str = PyUnicode_InternFromString("__name__"); - if (name_str == NULL) - goto error; - } - if (name) { global_name = name; Py_INCREF(global_name); } else { - global_name = PyObject_GetAttr(obj, name_str); + global_name = _PyObject_GetAttrId(obj, &PyId___name__); if (global_name == NULL) goto error; } @@ -3016,17 +3010,16 @@ /* Protocol 2 special case: if callable's name is __newobj__, use NEWOBJ. 
*/ if (use_newobj) { - static PyObject *newobj_str = NULL, *name_str = NULL; + static PyObject *newobj_str = NULL; PyObject *name; if (newobj_str == NULL) { newobj_str = PyUnicode_InternFromString("__newobj__"); - name_str = PyUnicode_InternFromString("__name__"); - if (newobj_str == NULL || name_str == NULL) + if (newobj_str == NULL) return -1; } - name = PyObject_GetAttr(callable, name_str); + name = _PyObject_GetAttrId(callable, &PyId___name__); if (name == NULL) { if (PyErr_ExceptionMatches(PyExc_AttributeError)) PyErr_Clear(); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:47:14 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 01:47:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_arra?= =?utf-8?q?y=2Ebuffer=5Finfo=28=29=2C_handle_PyLong=5FFromVoidPtr=28=29_an?= =?utf-8?q?d?= Message-ID: <3dKkYG4MVqz7M5f@mail.python.org> http://hg.python.org/cpython/rev/285d28c3620a changeset: 87092:285d28c3620a user: Victor Stinner date: Thu Nov 14 01:27:12 2013 +0100 summary: Issue #19437: Fix array.buffer_info(), handle PyLong_FromVoidPtr() and PyLong_FromLong() failure files: Modules/arraymodule.c | 18 +++++++++++++++--- 1 files changed, 15 insertions(+), 3 deletions(-) diff --git a/Modules/arraymodule.c b/Modules/arraymodule.c --- a/Modules/arraymodule.c +++ b/Modules/arraymodule.c @@ -1133,13 +1133,25 @@ static PyObject * array_buffer_info(arrayobject *self, PyObject *unused) { - PyObject* retval = NULL; + PyObject *retval = NULL, *v; + retval = PyTuple_New(2); if (!retval) return NULL; - PyTuple_SET_ITEM(retval, 0, PyLong_FromVoidPtr(self->ob_item)); - PyTuple_SET_ITEM(retval, 1, PyLong_FromLong((long)(Py_SIZE(self)))); + v = PyLong_FromVoidPtr(self->ob_item); + if (v == NULL) { + Py_DECREF(retval); + return NULL; + } + PyTuple_SET_ITEM(retval, 0, v); + + v = PyLong_FromLong((long)(Py_SIZE(self))); + if (v == NULL) { + Py_DECREF(retval); + return NULL; + } + PyTuple_SET_ITEM(retval, 1, v); return retval; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:48:42 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 01:48:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317828=3A_=5FPyObj?= =?utf-8?q?ect=5FGetDictPtr=28=29_may_return_NULL_instead_of_a_PyObject**?= Message-ID: <3dKkZy1VJGz7M5T@mail.python.org> http://hg.python.org/cpython/rev/26121ae22016 changeset: 87093:26121ae22016 parent: 87089:99ba1772c469 user: Christian Heimes date: Thu Nov 14 01:47:14 2013 +0100 summary: Issue #17828: _PyObject_GetDictPtr() may return NULL instead of a PyObject** CID 1128792: Dereference null return value (NULL_RETURNS) files: Objects/exceptions.c | 8 +++++--- 1 files changed, 5 insertions(+), 3 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2626,7 +2626,7 @@ PyObject* msg_prefix; PyObject *exc, *val, *tb; PyTypeObject *caught_type; - PyObject *instance_dict; + PyObject **dictptr; PyObject *instance_args; Py_ssize_t num_args; PyObject *new_exc, *new_val, *new_tb; @@ -2664,8 +2664,10 @@ } /* Ensure the instance dict is also empty */ - instance_dict = *_PyObject_GetDictPtr(val); - if (instance_dict != NULL && PyObject_Length(instance_dict) > 0) { + dictptr = _PyObject_GetDictPtr(val); + if ((dictptr != NULL) && (*dictptr != NULL) && + (PyObject_Length(*dictptr) > 0) + ) { /* While we could potentially copy a 
non-empty instance dictionary * to the replacement exception, for now we take the more * conservative path of leaving exceptions with attributes set -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 01:48:43 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 01:48:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dKkZz3HF4z7M5R@mail.python.org> http://hg.python.org/cpython/rev/784a02ec2a26 changeset: 87094:784a02ec2a26 parent: 87093:26121ae22016 parent: 87092:285d28c3620a user: Christian Heimes date: Thu Nov 14 01:48:32 2013 +0100 summary: merge files: Modules/_pickle.c | 17 +++++------------ Modules/arraymodule.c | 18 +++++++++++++++--- Python/peephole.c | 1 + 3 files changed, 21 insertions(+), 15 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -136,6 +136,7 @@ /* For looking up name pairs in copyreg._extension_registry. */ static PyObject *two_tuple = NULL; +_Py_IDENTIFIER(__name__); _Py_IDENTIFIER(modules); static int @@ -2599,7 +2600,6 @@ static int save_global(PicklerObject *self, PyObject *obj, PyObject *name) { - static PyObject *name_str = NULL; PyObject *global_name = NULL; PyObject *module_name = NULL; PyObject *module = NULL; @@ -2608,18 +2608,12 @@ const char global_op = GLOBAL; - if (name_str == NULL) { - name_str = PyUnicode_InternFromString("__name__"); - if (name_str == NULL) - goto error; - } - if (name) { global_name = name; Py_INCREF(global_name); } else { - global_name = PyObject_GetAttr(obj, name_str); + global_name = _PyObject_GetAttrId(obj, &PyId___name__); if (global_name == NULL) goto error; } @@ -3016,17 +3010,16 @@ /* Protocol 2 special case: if callable's name is __newobj__, use NEWOBJ. 
*/ if (use_newobj) { - static PyObject *newobj_str = NULL, *name_str = NULL; + static PyObject *newobj_str = NULL; PyObject *name; if (newobj_str == NULL) { newobj_str = PyUnicode_InternFromString("__newobj__"); - name_str = PyUnicode_InternFromString("__name__"); - if (newobj_str == NULL || name_str == NULL) + if (newobj_str == NULL) return -1; } - name = PyObject_GetAttr(callable, name_str); + name = _PyObject_GetAttrId(callable, &PyId___name__); if (name == NULL) { if (PyErr_ExceptionMatches(PyExc_AttributeError)) PyErr_Clear(); diff --git a/Modules/arraymodule.c b/Modules/arraymodule.c --- a/Modules/arraymodule.c +++ b/Modules/arraymodule.c @@ -1133,13 +1133,25 @@ static PyObject * array_buffer_info(arrayobject *self, PyObject *unused) { - PyObject* retval = NULL; + PyObject *retval = NULL, *v; + retval = PyTuple_New(2); if (!retval) return NULL; - PyTuple_SET_ITEM(retval, 0, PyLong_FromVoidPtr(self->ob_item)); - PyTuple_SET_ITEM(retval, 1, PyLong_FromLong((long)(Py_SIZE(self)))); + v = PyLong_FromVoidPtr(self->ob_item); + if (v == NULL) { + Py_DECREF(retval); + return NULL; + } + PyTuple_SET_ITEM(retval, 0, v); + + v = PyLong_FromLong((long)(Py_SIZE(self))); + if (v == NULL) { + Py_DECREF(retval); + return NULL; + } + PyTuple_SET_ITEM(retval, 1, v); return retval; } diff --git a/Python/peephole.c b/Python/peephole.c --- a/Python/peephole.c +++ b/Python/peephole.c @@ -275,6 +275,7 @@ len_consts = PyList_GET_SIZE(consts); if (PyList_Append(consts, newconst)) { Py_DECREF(newconst); + PyErr_Clear(); return 0; } Py_DECREF(newconst); -- Repository URL: http://hg.python.org/cpython From ncoghlan at gmail.com Thu Nov 14 01:50:18 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Nov 2013 10:50:18 +1000 Subject: [Python-checkins] cpython: Issue #17828: va_start() must be accompanied by va_end() In-Reply-To: <3dKkNm0mwGz7LlT@mail.python.org> References: <3dKkNm0mwGz7LlT@mail.python.org> Message-ID: On 14 Nov 2013 10:40, "christian.heimes" wrote: > > http://hg.python.org/cpython/rev/99ba1772c469 > changeset: 87089:99ba1772c469 > user: Christian Heimes > date: Thu Nov 14 01:39:35 2013 +0100 > summary: > Issue #17828: va_start() must be accompanied by va_end() > CID 1128793: Missing varargs init or cleanup (VARARGS) Today I learned... :) Thanks! Cheers, Nick. > > files: > Objects/exceptions.c | 13 +++++++------ > 1 files changed, 7 insertions(+), 6 deletions(-) > > > diff --git a/Objects/exceptions.c b/Objects/exceptions.c > --- a/Objects/exceptions.c > +++ b/Objects/exceptions.c > @@ -2632,12 +2632,6 @@ > PyObject *new_exc, *new_val, *new_tb; > va_list vargs; > > -#ifdef HAVE_STDARG_PROTOTYPES > - va_start(vargs, format); > -#else > - va_start(vargs); > -#endif > - > PyErr_Fetch(&exc, &val, &tb); > caught_type = (PyTypeObject *) exc; > /* Ensure type info indicates no extra state is stored at the C level */ > @@ -2690,7 +2684,14 @@ > * types as well, but that's quite a bit trickier due to the extra > * state potentially stored on OSError instances. > */ > + > +#ifdef HAVE_STDARG_PROTOTYPES > + va_start(vargs, format); > +#else > + va_start(vargs); > +#endif > msg_prefix = PyUnicode_FromFormatV(format, vargs); > + va_end(vargs); > if (msg_prefix == NULL) > return NULL; > > > -- > Repository URL: http://hg.python.org/cpython > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From solipsis at pitrou.net Thu Nov 14 04:50:36 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 14 Nov 2013 04:50:36 +0100 Subject: [Python-checkins] Daily reference leaks (784a02ec2a26): sum=522 Message-ID: results for 784a02ec2a26 on branch "default" -------------------------------------------- test_codeccallbacks leaked [40, 40, 40] references, sum=120 test_codeccallbacks leaked [40, 40, 40] memory blocks, sum=120 test_codecs leaked [38, 38, 38] references, sum=114 test_codecs leaked [24, 24, 24] memory blocks, sum=72 test_email leaked [16, 16, 16] references, sum=48 test_email leaked [16, 16, 16] memory blocks, sum=48 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogx2QIb_', '-x'] From python-checkins at python.org Thu Nov 14 05:18:01 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 14 Nov 2013 05:18:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Relax_timing_re?= =?utf-8?q?quirement=2E_Fixes_issue_19579=2E?= Message-ID: <3dKqDT6PcGz7Lnw@mail.python.org> http://hg.python.org/cpython/rev/f91550e7e2b7 changeset: 87095:f91550e7e2b7 user: Guido van Rossum date: Wed Nov 13 20:17:52 2013 -0800 summary: asyncio: Relax timing requirement. Fixes issue 19579. files: Lib/test/test_asyncio/test_base_events.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -170,7 +170,7 @@ f.cancel() # Don't complain about abandoned Future. def test__run_once(self): - h1 = events.TimerHandle(time.monotonic() + 0.1, lambda: True, ()) + h1 = events.TimerHandle(time.monotonic() + 5.0, lambda: True, ()) h2 = events.TimerHandle(time.monotonic() + 10.0, lambda: True, ()) h1.cancel() @@ -181,7 +181,7 @@ self.loop._run_once() t = self.loop._selector.select.call_args[0][0] - self.assertTrue(9.99 < t < 10.1, t) + self.assertTrue(9.9 < t < 10.1, t) self.assertEqual([h2], self.loop._scheduled) self.assertTrue(self.loop._process_events.called) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 05:50:15 2013 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 14 Nov 2013 05:50:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_adjust_style?= Message-ID: <3dKqxg60bkz7LjT@mail.python.org> http://hg.python.org/cpython/rev/e0fb41d41642 changeset: 87096:e0fb41d41642 user: Benjamin Peterson date: Wed Nov 13 23:25:01 2013 -0500 summary: adjust style files: Objects/exceptions.c | 20 ++++++++------------ 1 files changed, 8 insertions(+), 12 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2633,13 +2633,12 @@ va_list vargs; PyErr_Fetch(&exc, &val, &tb); - caught_type = (PyTypeObject *) exc; + caught_type = (PyTypeObject *)exc; /* Ensure type info indicates no extra state is stored at the C level */ - if (caught_type->tp_init != (initproc) BaseException_init || + if (caught_type->tp_init != (initproc)BaseException_init || caught_type->tp_new != BaseException_new || caught_type->tp_basicsize != _PyExc_BaseException.tp_basicsize || - caught_type->tp_itemsize != _PyExc_BaseException.tp_itemsize - ) { + caught_type->tp_itemsize != _PyExc_BaseException.tp_itemsize) { /* We can't be sure we can wrap this safely, since it may contain * more state than just the 
exception type. Accordingly, we just * leave it alone. @@ -2650,13 +2649,11 @@ /* Check the args are empty or contain a single string */ PyErr_NormalizeException(&exc, &val, &tb); - instance_args = ((PyBaseExceptionObject *) val)->args; + instance_args = ((PyBaseExceptionObject *)val)->args; num_args = PyTuple_GET_SIZE(instance_args); - if ((num_args > 1) || + if (num_args > 1 || (num_args == 1 && - !PyUnicode_CheckExact(PyTuple_GET_ITEM(instance_args, 0)) - ) - ) { + !PyUnicode_CheckExact(PyTuple_GET_ITEM(instance_args, 0)))) { /* More than 1 arg, or the one arg we do have isn't a string */ PyErr_Restore(exc, val, tb); @@ -2665,9 +2662,8 @@ /* Ensure the instance dict is also empty */ dictptr = _PyObject_GetDictPtr(val); - if ((dictptr != NULL) && (*dictptr != NULL) && - (PyObject_Length(*dictptr) > 0) - ) { + if (dictptr != NULL && *dictptr != NULL && + PyObject_Length(*dictptr) > 0) { /* While we could potentially copy a non-empty instance dictionary * to the replacement exception, for now we take the more * conservative path of leaving exceptions with attributes set -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 05:50:17 2013 From: python-checkins at python.org (benjamin.peterson) Date: Thu, 14 Nov 2013 05:50:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_fix_refleaks?= Message-ID: <3dKqxj0ZCmz7LjT@mail.python.org> http://hg.python.org/cpython/rev/c27237d57231 changeset: 87097:c27237d57231 user: Benjamin Peterson date: Wed Nov 13 23:49:49 2013 -0500 summary: fix refleaks files: Objects/exceptions.c | 10 +++++++--- 1 files changed, 7 insertions(+), 3 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2683,6 +2683,9 @@ * state potentially stored on OSError instances. */ + Py_DECREF(exc); + Py_XDECREF(tb); + #ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); #else @@ -2690,13 +2693,14 @@ #endif msg_prefix = PyUnicode_FromFormatV(format, vargs); va_end(vargs); - if (msg_prefix == NULL) + if (msg_prefix == NULL) { + Py_DECREF(val); return NULL; + } PyErr_Format(exc, "%U (%s: %S)", msg_prefix, Py_TYPE(val)->tp_name, val); - Py_DECREF(exc); - Py_XDECREF(tb); + Py_DECREF(msg_prefix); PyErr_Fetch(&new_exc, &new_val, &new_tb); PyErr_NormalizeException(&new_exc, &new_val, &new_tb); PyException_SetCause(new_val, val); -- Repository URL: http://hg.python.org/cpython From ncoghlan at gmail.com Thu Nov 14 12:58:16 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Nov 2013 21:58:16 +1000 Subject: [Python-checkins] Daily reference leaks (784a02ec2a26): sum=522 In-Reply-To: References: Message-ID: On 14 Nov 2013 13:52, wrote: > > results for 784a02ec2a26 on branch "default" > -------------------------------------------- > > test_codeccallbacks leaked [40, 40, 40] references, sum=120 > test_codeccallbacks leaked [40, 40, 40] memory blocks, sum=120 > test_codecs leaked [38, 38, 38] references, sum=114 > test_codecs leaked [24, 24, 24] memory blocks, sum=72 > test_email leaked [16, 16, 16] references, sum=48 > test_email leaked [16, 16, 16] memory blocks, sum=48 Hmm, it appears I have a reference leak somewhere. Cheers, Nick. 
> > > Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogx2QIb_', '-x'] > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Thu Nov 14 13:01:32 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Nov 2013 22:01:32 +1000 Subject: [Python-checkins] Daily reference leaks (784a02ec2a26): sum=522 In-Reply-To: References: Message-ID: On 14 Nov 2013 21:58, "Nick Coghlan" wrote: > > > On 14 Nov 2013 13:52, wrote: > > > > results for 784a02ec2a26 on branch "default" > > -------------------------------------------- > > > > test_codeccallbacks leaked [40, 40, 40] references, sum=120 > > test_codeccallbacks leaked [40, 40, 40] memory blocks, sum=120 > > test_codecs leaked [38, 38, 38] references, sum=114 > > test_codecs leaked [24, 24, 24] memory blocks, sum=72 > > test_email leaked [16, 16, 16] references, sum=48 > > test_email leaked [16, 16, 16] memory blocks, sum=48 > > Hmm, it appears I have a reference leak somewhere. Ah, Benjamin fixed it already. Thanks! :) Cheers, Nick. > > Cheers, > Nick. > > > > > > > Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogx2QIb_', '-x'] > > _______________________________________________ > > Python-checkins mailing list > > Python-checkins at python.org > > https://mail.python.org/mailman/listinfo/python-checkins -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Thu Nov 14 13:06:11 2013 From: solipsis at pitrou.net (Antoine Pitrou) Date: Thu, 14 Nov 2013 13:06:11 +0100 Subject: [Python-checkins] Daily reference leaks (784a02ec2a26): sum=522 References: Message-ID: <20131114130611.263ee550@fsol> On Thu, 14 Nov 2013 22:01:32 +1000 Nick Coghlan wrote: > On 14 Nov 2013 21:58, "Nick Coghlan" wrote: > > > > > > On 14 Nov 2013 13:52, wrote: > > > > > > results for 784a02ec2a26 on branch "default" > > > -------------------------------------------- > > > > > > test_codeccallbacks leaked [40, 40, 40] references, sum=120 > > > test_codeccallbacks leaked [40, 40, 40] memory blocks, sum=120 > > > test_codecs leaked [38, 38, 38] references, sum=114 > > > test_codecs leaked [24, 24, 24] memory blocks, sum=72 > > > test_email leaked [16, 16, 16] references, sum=48 > > > test_email leaked [16, 16, 16] memory blocks, sum=48 > > > > Hmm, it appears I have a reference leak somewhere. > > Ah, Benjamin fixed it already. Thanks! :) The reference leak task has been running for quite some time on my personal machine and I believe it has proven useful. I have no problem continuing running it on the same machine (which is mostly sitting idle anyway), but maybe it should rather be hosted on our CI infrastructure? Any suggestions? (the script is quite rough with hardcoded stuff, but beating it into better shape could be a nice target for first-time contributors) Regards Antoine. 
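The leak numbers quoted above come from comparing the interpreter's total
reference count before and after repeated runs of each test, which is only
possible on a ``--with-pydebug`` build where ``sys.gettotalrefcount()``
exists. The sketch below models that check in plain Python; it is an
illustration only, with made-up helper names (``hunt_refleak``, ``leaky``) --
the real ``regrtest -R`` implementation does per-test cleanup and also tracks
memory blocks, as the report above shows. The defaults here mirror the
``-R 3:3`` option in the quoted command line (3 warm-up runs, 3 counted
runs)::

    import sys

    _junk = []

    def leaky():
        _junk.append(object())     # retains one extra reference per call

    def hunt_refleak(testfunc, warmups=3, runs=3):
        if not hasattr(sys, "gettotalrefcount"):
            raise RuntimeError("needs a --with-pydebug build of CPython")
        for _ in range(warmups):   # let caches and interned objects settle
            testfunc()
        deltas = []
        for _ in range(runs):
            before = sys.gettotalrefcount()
            testfunc()
            deltas.append(sys.gettotalrefcount() - before)
        return deltas              # a consistently positive delta is reported
                                   # as "leaked [...] references"

    print(hunt_refleak(leaky))
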
From python-checkins at python.org Thu Nov 14 13:55:24 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 13:55:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?devguide=3A_Coverity_has_reduced_numb?= =?utf-8?q?er_of_builds_to_4_per_week?= Message-ID: <3dL2jS16Zqz7LjW@mail.python.org> http://hg.python.org/devguide/rev/1c2dbbf25d0f changeset: 652:1c2dbbf25d0f user: Christian Heimes date: Thu Nov 14 13:55:17 2013 +0100 summary: Coverity has reduced number of builds to 4 per week https://scan.coverity.com/faq#frequency files: coverity.rst | 11 ++++++----- 1 files changed, 6 insertions(+), 5 deletions(-) diff --git a/coverity.rst b/coverity.rst --- a/coverity.rst +++ b/coverity.rst @@ -37,11 +37,12 @@ Building and uploading analysis =============================== -The process is automated. Twice a day a script runs ``hg pull``, ``hg update`` -``cov-build`` and uploads the latest analysis to Coverity. The build runs on a -dedicated virtual machine on PSF's infrastructure at OSU Open Source Labs. The -process is maintained by Christian Heimes (see `Contact`_). At present only -the tip is analyzed with the 64bit Linux tools. +The process is automated. A script runs ``hg pull``, ``hg update``, +``cov-build`` and uploads the latest analysis to Coverity. Since Coverity has +limited the maximum number of builds per week Python is analyzed every second +day. The build runs on a dedicated virtual machine on PSF's infrastructure at +OSU Open Source Labs. The process is maintained by Christian Heimes (see +`Contact`_). At present only the tip is analyzed with the 64bit Linux tools. Known limitations -- Repository URL: http://hg.python.org/devguide From benjamin at python.org Thu Nov 14 15:14:46 2013 From: benjamin at python.org (Benjamin Peterson) Date: Thu, 14 Nov 2013 09:14:46 -0500 Subject: [Python-checkins] Daily reference leaks (784a02ec2a26): sum=522 In-Reply-To: <20131114130611.263ee550@fsol> References: <20131114130611.263ee550@fsol> Message-ID: 2013/11/14 Antoine Pitrou : > On Thu, 14 Nov 2013 22:01:32 +1000 > Nick Coghlan wrote: >> On 14 Nov 2013 21:58, "Nick Coghlan" wrote: >> > >> > >> > On 14 Nov 2013 13:52, wrote: >> > > >> > > results for 784a02ec2a26 on branch "default" >> > > -------------------------------------------- >> > > >> > > test_codeccallbacks leaked [40, 40, 40] references, sum=120 >> > > test_codeccallbacks leaked [40, 40, 40] memory blocks, sum=120 >> > > test_codecs leaked [38, 38, 38] references, sum=114 >> > > test_codecs leaked [24, 24, 24] memory blocks, sum=72 >> > > test_email leaked [16, 16, 16] references, sum=48 >> > > test_email leaked [16, 16, 16] memory blocks, sum=48 >> > >> > Hmm, it appears I have a reference leak somewhere. >> >> Ah, Benjamin fixed it already. Thanks! :) > > The reference leak task has been running for quite some time on my > personal machine and I believe it has proven useful. I have no problem > continuing running it on the same machine (which is mostly sitting idle > anyway), but maybe it should rather be hosted on our CI infrastructure? > Any suggestions? Thank you very much for running that, btw. I'm sure we would have released a lot of horribly leaking stuff without it. > > (the script is quite rough with hardcoded stuff, but beating it into > better shape could be a nice target for first-time contributors) Perhaps someone can figure out how to run it on one of the the buildbots? 
-- Regards, Benjamin From python-checkins at python.org Thu Nov 14 15:36:39 2013 From: python-checkins at python.org (stefan.krah) Date: Thu, 14 Nov 2013 15:36:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_unused_third_arg_for_t?= =?utf-8?q?he_benefit_of_Valgrind=2E?= Message-ID: <3dL4yH2PVmz7LkR@mail.python.org> http://hg.python.org/cpython/rev/e539761d45e3 changeset: 87098:e539761d45e3 user: Stefan Krah date: Thu Nov 14 15:35:47 2013 +0100 summary: Add unused third arg for the benefit of Valgrind. files: Python/fileutils.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -675,7 +675,7 @@ request = FIONCLEX; else request = FIOCLEX; - err = ioctl(fd, request); + err = ioctl(fd, request, NULL); if (err) { if (raise) PyErr_SetFromErrno(PyExc_OSError); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 16:09:36 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 14 Nov 2013 16:09:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_add_benchmark_numbers_for_sma?= =?utf-8?q?ll_string_optimization?= Message-ID: <3dL5hJ13kTz7LjS@mail.python.org> http://hg.python.org/peps/rev/90d5a5d88de9 changeset: 5270:90d5a5d88de9 user: Christian Heimes date: Thu Nov 14 16:09:27 2013 +0100 summary: add benchmark numbers for small string optimization files: pep-0456.txt | 10 ++++++---- 1 files changed, 6 insertions(+), 4 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -9,7 +9,7 @@ Content-Type: text/x-rst Created: 27-Sep-2013 Python-Version: 3.4 -Post-History: 06-Oct-2013 +Post-History: 06-Oct-2013, 14-Nov-2013 Abstract @@ -267,7 +267,7 @@ code that can dominate speed of the algorithm for very short strings. On the other hand Python calculates the hash value of short strings quite often. A simple and fast function for especially for hashing of small strings can make -a measurably impact on performance. For example these measurements were taken +a measurable impact on performance. For example these measurements were taken during a run of Python's regression tests. Additional measurements of other code have shown a similar distribution. @@ -296,7 +296,9 @@ However a fast function like DJBX33A is not as secure as SipHash24. A cutoff at about 5 to 7 bytes should provide a decent safety margin and speed up at the same time. The PEP's reference implementation provides such a cutoff with -``Py_HASH_CUTOFF`` but disables the optimization by default. +``Py_HASH_CUTOFF`` but disables the optimization by default. Multiple runs of +Python's benchmark suite shows an average speedups between 3% and 5% for +benchmarks such as django_v2, mako and etree with a cutoff of 7 on 64 bit Linux. C API additions @@ -401,7 +403,7 @@ ``Py_HASH_FNV`` or ``Py_HASH_EXTERNAL``. If ``Py_HASH_ALGORITHM`` is not defined at all, then the best available algorithm is selected. On platforms wich don't require aligned memory access (``HAVE_ALIGNED_REQUIRED`` not -defined) and an unsigned 64bit integer type ``PY_UINT64_T``, SipHash24 is +defined) and an unsigned 64 bit integer type ``PY_UINT64_T``, SipHash24 is used. On strict C89 platforms without a 64 bit data type, or architectures such as SPARC, FNV is selected as fallback. A hash algorithm can be selected with an autoconf option, for example ``./configure --with-hash-algorithm=fnv``. 
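For readers following the ``Py_HASH_CUTOFF`` discussion in this patch and the
earlier one, a rough pure-Python model of the dispatch is shown below. It is
a sketch only: the real code is C in ``Python/pyhash.c``, its DJBX33A path
also mixes the randomized ``_Py_HashSecret.djbx33a.suffix`` and the input
length into the result, and the long-string path is the keyed SipHash24, for
which the interpreter's own ``hash()`` stands in here so the example stays
runnable::

    PY_HASH_CUTOFF = 7          # the PEP suggests a cutoff of about 5 to 7 bytes

    def djbx33a(data):
        """Bernstein's multiply-by-33-and-add hash, truncated to 64 bits."""
        h = 5381
        for byte in data:
            h = (h * 33 + byte) & 0xFFFFFFFFFFFFFFFF
        return h

    def hash_bytes(data):
        if 0 < len(data) <= PY_HASH_CUTOFF:
            return djbx33a(data)    # cheap path for very short strings
        return hash(data)           # stand-in for the keyed SipHash24/FNV

Per the call-length table earlier in this digest, a cutoff of 7 would route
roughly 60% of the string hashes seen during the regression tests through the
cheap path, which is where the quoted 3% to 5% benchmark speedups come from.
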
-- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 14 19:06:32 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 14 Nov 2013 19:06:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Avoid_ResourceW?= =?utf-8?q?arning=2E_Fix_issue_19580_by_Vajrasky_Kok=2E?= Message-ID: <3dL9cS0F0Fz7Lnj@mail.python.org> http://hg.python.org/cpython/rev/df50f73f03ca changeset: 87099:df50f73f03ca user: Guido van Rossum date: Thu Nov 14 10:06:18 2013 -0800 summary: asyncio: Avoid ResourceWarning. Fix issue 19580 by Vajrasky Kok. files: Lib/test/test_asyncio/test_base_events.py | 22 +++++++++-- 1 files changed, 18 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -458,16 +458,26 @@ self.loop.sock_connect.return_value = () self.loop._make_ssl_transport = unittest.mock.Mock() + class _SelectorTransportMock: + _sock = None + + def close(self): + self._sock.close() + def mock_make_ssl_transport(sock, protocol, sslcontext, waiter, **kwds): waiter.set_result(None) + transport = _SelectorTransportMock() + transport._sock = sock + return transport self.loop._make_ssl_transport.side_effect = mock_make_ssl_transport ANY = unittest.mock.ANY # First try the default server_hostname. self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True) - self.loop.run_until_complete(coro) + transport, _ = self.loop.run_until_complete(coro) + transport.close() self.loop._make_ssl_transport.assert_called_with( ANY, ANY, ANY, ANY, server_side=False, @@ -476,7 +486,8 @@ self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, server_hostname='perl.com') - self.loop.run_until_complete(coro) + transport, _ = self.loop.run_until_complete(coro) + transport.close() self.loop._make_ssl_transport.assert_called_with( ANY, ANY, ANY, ANY, server_side=False, @@ -485,7 +496,8 @@ self.loop._make_ssl_transport.reset_mock() coro = self.loop.create_connection(MyProto, 'python.org', 80, ssl=True, server_hostname='') - self.loop.run_until_complete(coro) + transport, _ = self.loop.run_until_complete(coro) + transport.close() self.loop._make_ssl_transport.assert_called_with(ANY, ANY, ANY, ANY, server_side=False, server_hostname='') @@ -505,8 +517,10 @@ self.assertRaises(ValueError, self.loop.run_until_complete, coro) coro = self.loop.create_connection(MyProto, None, 80, ssl=True) self.assertRaises(ValueError, self.loop.run_until_complete, coro) + sock = socket.socket() coro = self.loop.create_connection(MyProto, None, None, - ssl=True, sock=socket.socket()) + ssl=True, sock=sock) + self.addCleanup(sock.close) self.assertRaises(ValueError, self.loop.run_until_complete, coro) def test_create_server_empty_host(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:11:35 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 14 Nov 2013 22:11:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319589=3A_Use_spec?= =?utf-8?q?ific_asserts_in_asyncio_tests=2E?= Message-ID: <3dLFjz1HmXz7Lm0@mail.python.org> http://hg.python.org/cpython/rev/7e00bdada290 changeset: 87100:7e00bdada290 user: Serhiy Storchaka date: Thu Nov 14 23:10:51 2013 +0200 summary: Issue #19589: Use specific asserts in asyncio tests. 
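The same motivation applies to the lib2to3 change later in this digest
(Issue #19592): the specialised assert methods produce failure messages that
show the offending values, whereas a bare ``assertTrue(...)`` only reports
"False is not true". A small hypothetical test case illustrates the
difference (both tests fail on purpose)::

    import unittest

    class Example(unittest.TestCase):
        def test_vague(self):
            tr = object()
            # Failure message: "False is not true"
            self.assertTrue(isinstance(tr, dict))

        def test_specific(self):
            tr = object()
            # Failure message names the object and the expected class
            self.assertIsInstance(tr, dict)

    if __name__ == "__main__":
        unittest.main()
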
files: Lib/test/test_asyncio/test_events.py | 18 ++++++---- Lib/test/test_asyncio/test_tasks.py | 2 +- Lib/test/test_asyncio/test_unix_events.py | 2 +- 3 files changed, 12 insertions(+), 10 deletions(-) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -472,8 +472,8 @@ f = self.loop.create_connection( lambda: MyProto(loop=self.loop), *httpd.address) tr, pr = self.loop.run_until_complete(f) - self.assertTrue(isinstance(tr, transports.Transport)) - self.assertTrue(isinstance(pr, protocols.Protocol)) + self.assertIsInstance(tr, transports.Transport) + self.assertIsInstance(pr, protocols.Protocol) self.loop.run_until_complete(pr.done) self.assertGreater(pr.nbytes, 0) tr.close() @@ -500,8 +500,8 @@ f = self.loop.create_connection( lambda: MyProto(loop=self.loop), sock=sock) tr, pr = self.loop.run_until_complete(f) - self.assertTrue(isinstance(tr, transports.Transport)) - self.assertTrue(isinstance(pr, protocols.Protocol)) + self.assertIsInstance(tr, transports.Transport) + self.assertIsInstance(pr, protocols.Protocol) self.loop.run_until_complete(pr.done) self.assertGreater(pr.nbytes, 0) tr.close() @@ -513,8 +513,8 @@ lambda: MyProto(loop=self.loop), *httpd.address, ssl=test_utils.dummy_ssl_context()) tr, pr = self.loop.run_until_complete(f) - self.assertTrue(isinstance(tr, transports.Transport)) - self.assertTrue(isinstance(pr, protocols.Protocol)) + self.assertIsInstance(tr, transports.Transport) + self.assertIsInstance(pr, protocols.Protocol) self.assertTrue('ssl' in tr.__class__.__name__.lower()) self.assertIsNotNone(tr.get_extra_info('sockname')) self.loop.run_until_complete(pr.done) @@ -926,7 +926,8 @@ r.setblocking(False) f = self.loop.sock_recv(r, 1) ov = getattr(f, 'ov', None) - self.assertTrue(ov is None or ov.pending) + if ov is not None: + self.assertTrue(ov.pending) @tasks.coroutine def main(): @@ -949,7 +950,8 @@ self.assertLess(elapsed, 0.1) self.assertEqual(t.result(), 'cancelled') self.assertRaises(futures.CancelledError, f.result) - self.assertTrue(ov is None or not ov.pending) + if ov is not None: + self.assertFalse(ov.pending) self.loop._stop_serving(r) r.close() diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -779,7 +779,7 @@ self.assertEqual(len(res), 2, res) self.assertEqual(res[0], (1, 'a')) self.assertEqual(res[1][0], 2) - self.assertTrue(isinstance(res[1][1], futures.TimeoutError)) + self.assertIsInstance(res[1][1], futures.TimeoutError) self.assertAlmostEqual(0.12, loop.time()) # move forward to close generator diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -67,7 +67,7 @@ cb = lambda: True self.loop.add_signal_handler(signal.SIGHUP, cb) h = self.loop._signal_handlers.get(signal.SIGHUP) - self.assertTrue(isinstance(h, events.Handle)) + self.assertIsInstance(h, events.Handle) self.assertEqual(h._callback, cb) @unittest.mock.patch('asyncio.unix_events.signal') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:26:00 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 14 Nov 2013 22:26:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Complete_subprocess_and_flow_?= 
=?utf-8?q?control_docs=2E__Misc_other_additions=2E?= Message-ID: <3dLG2c4jRZzRSr@mail.python.org> http://hg.python.org/peps/rev/5c1d209e06a2 changeset: 5271:5c1d209e06a2 user: Guido van Rossum date: Thu Nov 14 13:25:58 2013 -0800 summary: Complete subprocess and flow control docs. Misc other additions. files: pep-3156.txt | 125 +++++++++++++++++++++++++++++++++++--- 1 files changed, 114 insertions(+), 11 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1172,6 +1172,30 @@ is known ahead of time, the best approach in both cases is to use the Content-Length header.) +- ``get_write_buffer_size()``. Return the current size of the + transport's write buffer in bytes. This only knows about the write + buffer managed explicitly by the transport; buffering in other + layers of the network stack or elsewhere of the network is not + reported. + +- ``set_write_buffer_limits(high=None, low=None)``. Set the high- and + low-water limits for flow control. + + These two values control when to call the protocol's + ``pause_writing()`` and ``resume_writing()`` methods. If specified, + the low-water limit must be less than or equal to the high-water + limit. Neither value can be negative. + + The defaults are implementation-specific. If only the high-water + limit is given, the low-water limit defaults to a + implementation-specific value less than or equal to the high-water + limit. Setting high to zero forces low to zero as well, and causes + ``pause_writing()`` to be called whenever the buffer becomes + non-empty. Setting low to zero causes ``resume_writing()`` to be + called only once the buffer is empty. Use of zero for either limit + is generally sub-optimal as it reduces opportunities for doing I/O + and computation concurrently. + - ``pause_reading()``. Suspend delivery of data to the protocol until a subsequent ``resume_reading()`` call. Between ``pause_reading()`` and ``resume_reading()``, the protocol's ``data_received()`` method @@ -1225,7 +1249,7 @@ ``create_datagram_endpoint()`` call that created this transport. If present, and ``remote_addr`` was specified, they must match. The (data, addr) pair may be sent immediately or buffered. The return - value is None. + value is ``None``. - ``abort()``. Immediately close the transport. Buffered data will be discarded. @@ -1248,7 +1272,33 @@ Subprocess Transports ''''''''''''''''''''' -TBD. +Subprocess transports have the following methods: + +- ``get_pid()``. Return the process ID of the subprocess. + +- ``get_returncode()``. Return the process return code, if the + process has exited; otherwise ``None``. + +- ``get_pipe_transport(fd)``. Return the pipe transport (a + unidirectional stream transport) corresponding to the argument, + which should be 0, 1 or 2 representing stdin, stdout or stderr (of + the subprocess). If there is no such pipe transport, return + ``None``. For stdin, this is a writing transport; for stdout and + stderr this is a reading transport. You must use this method to get + a transport you can use to write to the subprocess's stdin. + +- ``send_signal(signal)``. Send a signal to the subprocess. + +- ``terminate()``. Terminate the subprocess. + +- ``kill()``. Kill the subprocess. On Windows this is an alias for + ``terminate()``. + +- ``close()``. This is an alias for ``terminate()``. + +Note that ``send_signal()``, ``terminate()`` and ``kill()`` wrap the +corresponding methods in the standard library ``subprocess`` module. 
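As an aside, the transport/protocol pairing described above is what
``loop.subprocess_exec()`` returns in the Tulip/asyncio reference
implementation. A minimal sketch of its use follows; it is not part of this
patch, assumes a Unix ``echo`` binary on ``PATH``, and omits error handling::

    import asyncio

    class EchoCollector(asyncio.SubprocessProtocol):
        def __init__(self, exit_future):
            self.exit_future = exit_future
            self.output = bytearray()

        def pipe_data_received(self, fd, data):
            if fd == 1:                       # stdout
                self.output.extend(data)

        def process_exited(self):
            self.exit_future.set_result(True)

    @asyncio.coroutine
    def run(loop):
        exit_future = asyncio.Future()
        transport, protocol = yield from loop.subprocess_exec(
            lambda: EchoCollector(exit_future), 'echo', 'hello', stdin=None)
        yield from exit_future                # wait for process_exited()
        transport.close()
        return transport.get_returncode(), bytes(protocol.output)

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(run(loop)))

The pipe file descriptor passed to ``pipe_data_received()`` is the child's
view (1 for stdout, 2 for stderr), matching the subprocess protocol methods
documented further down.
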
+ Protocols --------- @@ -1289,13 +1339,22 @@ - ``eof_received()``. This is called when the other end called ``write_eof()`` (or something equivalent). If this returns a false - value (including None), the transport will close itself. If it + value (including ``None``), the transport will close itself. If it returns a true value, closing the transport is up to the protocol. However, for SSL/TLS connections this is ignored, because the TLS standard requires that no more data is sent and the connection is closed as soon as a "closure alert" is received. - The default implementation returns None. + The default implementation returns ``None``. + +- ``pause_writing()``. Asks that the protocol temporarily stop + writing data to the transport. Heeding the request is optional, but + the transport's buffer may grow without bounds if you keep writing. + The buffer size at which this is called can be controlled through + the transport's ``set_write_buffer_limits()`` method. + +- ``resume_writing()``. Tells the protocol that it is safe to start + writing data to the transport again. Note that this may be cal - ``connection_lost(exc)``. The transport has been closed or aborted, has detected that the other end has closed the connection cleanly, @@ -1303,14 +1362,18 @@ the argument is ``None``; for an unexpected error, the argument is the exception that caused the transport to give up. -Here is a chart indicating the order and multiplicity of calls: +Here is a table indicating the order and multiplicity of the basic +calls: 1. ``connection_made()`` -- exactly once 2. ``data_received()`` -- zero or more times 3. ``eof_received()`` -- at most once 4. ``connection_lost()`` -- exactly once -TBD: Document ``pause_writing()`` and ``resume_writing()``. +Calls to ``pause_writing()`` and ``resume_writing()`` occur in pairs +and only between #1 and #4. These pairs will not be nested. The +final ``resume_writing()`` call may be omitted; i.e. a paused +connection may be lost and never be resumed. Datagram Protocols '''''''''''''''''' @@ -1345,7 +1408,34 @@ Subprocess Protocol ''''''''''''''''''' -TBD. +Subprocess protocols have ``connection_made()``, ``connection_lost()`` +``pause_writing()`` and ``resume_writing()`` methods with the same +signatures as stream protocols. In addition, they have the following +methods: + +- ``pipe_data_received(fd, data)``. Called when the subprocess writes + data to its stdout or stderr. ``fd`` is the file descriptor (1 for + stdout, 2 for stderr). ``data`` is a ``bytes`` object. (TBD: No + ``pipe_eof_received()``?) + +- ``pipe_connection_lost(fd, exc)``. Called when the subprocess + closes its stdin, stdout or stderr. ``fd`` is the file descriptor. + ``exc`` is an exception or ``None``. + +- ``process_exited()``. Called when the subprocess has exited. To + retrieve the exit status, use the transport's ``get_returncode()`` + method. + +Note that depending on the behavior of the subprocess it is possible +that ``process_exited()`` is called either before or after +``pipe_connection_lost()``. For example, if the subprocess creates a +sub-subprocess that shares its stdin/stdout/stderr and then itself +exits, ``process_exited()`` may be called while all the pipes are +still open. On the other hand when the subprocess closes its +stdin/stdout/stderr but does not exit, ``pipe_connection_lost()`` may +be called for all three pipes without ``process_exited()`` being +called. 
If (as is the more common case) the subprocess exits and +thereby implicitly closes all pipes, the calling order is undefined. Callback Style -------------- @@ -1493,6 +1583,11 @@ arguments. Note that coroutine arguments are converted to Futures using ``asyncio.async()``. +- ``asyncio.shield(f)``. Wait for a Future, shielding it from + cancellation. This returns a Future whose result or exception + is exactly the same as the argument; however, if the returned + Future is cancelled, the argument Future is unaffected. + Sleeping -------- @@ -1558,13 +1653,17 @@ TO DO ===== -- Document pause/resume_writing. - -- Document subprocess/pipe protocols/transports. +- Document cancellation in more detail. - Document locks and queues. -- Document SIGCHILD handling API (once it lands). +- Document StreamReader, StreamWriter and open_connection(). + +- Document passing 'loop=...' everywhere. + +- Document logger object. + +- Document SIGCHILD handling API. - Compare all APIs with the source code to be sure there aren't any undocumented or unimplemented features. @@ -1573,6 +1672,10 @@ Wish List ========= +- An open_server() helper. It should take a callback which is called + for each accepted connection with a reader and writer; the callback + may be a coroutine (then it is wrapped in a task). + - Support a "start TLS" operation to upgrade a TCP socket to SSL/TLS. - UNIX domain sockets. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 14 22:51:08 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 14 Nov 2013 22:51:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTky?= =?utf-8?q?=3A_Use_specific_asserts_in_lib2to3_tests=2E?= Message-ID: <3dLGbc1WYwz7Lkp@mail.python.org> http://hg.python.org/cpython/rev/c178d72e2f84 changeset: 87101:c178d72e2f84 branch: 2.7 parent: 87030:c163e5011bd3 user: Serhiy Storchaka date: Thu Nov 14 23:49:14 2013 +0200 summary: Issue #19592: Use specific asserts in lib2to3 tests. 
files: Lib/lib2to3/tests/test_fixers.py | 2 +- Lib/lib2to3/tests/test_main.py | 6 +++--- Lib/lib2to3/tests/test_parser.py | 4 ++-- Lib/lib2to3/tests/test_pytree.py | 16 ++++++++-------- Lib/lib2to3/tests/test_refactor.py | 6 +++--- 5 files changed, 17 insertions(+), 17 deletions(-) diff --git a/Lib/lib2to3/tests/test_fixers.py b/Lib/lib2to3/tests/test_fixers.py --- a/Lib/lib2to3/tests/test_fixers.py +++ b/Lib/lib2to3/tests/test_fixers.py @@ -41,7 +41,7 @@ def warns(self, before, after, message, unchanged=False): tree = self._check(before, after) - self.assertTrue(message in "".join(self.fixer_log)) + self.assertIn(message, "".join(self.fixer_log)) if not unchanged: self.assertTrue(tree.was_changed) diff --git a/Lib/lib2to3/tests/test_main.py b/Lib/lib2to3/tests/test_main.py --- a/Lib/lib2to3/tests/test_main.py +++ b/Lib/lib2to3/tests/test_main.py @@ -59,9 +59,9 @@ ret = self.run_2to3_capture(["-"], input_stream, out_enc, err) self.assertEqual(ret, 0) output = out.getvalue() - self.assertTrue("-print 'nothing'" in output) - self.assertTrue("WARNING: couldn't encode 's diff for " - "your terminal" in err.getvalue()) + self.assertIn("-print 'nothing'", output) + self.assertIn("WARNING: couldn't encode 's diff for " + "your terminal", err.getvalue()) def setup_test_source_trees(self): """Setup a test source tree and output destination tree.""" diff --git a/Lib/lib2to3/tests/test_parser.py b/Lib/lib2to3/tests/test_parser.py --- a/Lib/lib2to3/tests/test_parser.py +++ b/Lib/lib2to3/tests/test_parser.py @@ -165,8 +165,8 @@ for filepath in support.all_project_files(): with open(filepath, "rb") as fp: encoding = tokenize.detect_encoding(fp.readline)[0] - self.assertTrue(encoding is not None, - "can't detect encoding for %s" % filepath) + self.assertIsNotNone(encoding, + "can't detect encoding for %s" % filepath) with open(filepath, "r") as fp: source = fp.read() source = source.decode(encoding) diff --git a/Lib/lib2to3/tests/test_pytree.py b/Lib/lib2to3/tests/test_pytree.py --- a/Lib/lib2to3/tests/test_pytree.py +++ b/Lib/lib2to3/tests/test_pytree.py @@ -160,12 +160,12 @@ l3 = pytree.Leaf(100, "bar") n1 = pytree.Node(1000, [l1, l2, l3]) self.assertEqual(n1.children, [l1, l2, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertFalse(n1.was_changed) l2new = pytree.Leaf(100, "-") l2.replace(l2new) self.assertEqual(n1.children, [l1, l2new, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertTrue(n1.was_changed) def test_replace_with_list(self): @@ -176,7 +176,7 @@ l2.replace([pytree.Leaf(100, "*"), pytree.Leaf(100, "*")]) self.assertEqual(str(n1), "foo**bar") - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) def test_leaves(self): l1 = pytree.Leaf(100, "foo") @@ -347,7 +347,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n1.next_sibling is n2) + self.assertIs(n1.next_sibling, n2) self.assertEqual(n2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -356,7 +356,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l1.next_sibling is l2) + self.assertIs(l1.next_sibling, l2) self.assertEqual(l2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -365,7 +365,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n2.prev_sibling is n1) + self.assertIs(n2.prev_sibling, n1) self.assertEqual(n1.prev_sibling, None) 
self.assertEqual(p1.prev_sibling, None) @@ -374,7 +374,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l2.prev_sibling is l1) + self.assertIs(l2.prev_sibling, l1) self.assertEqual(l1.prev_sibling, None) self.assertEqual(p1.prev_sibling, None) @@ -447,7 +447,7 @@ r = {} self.assertTrue(pw.match_seq([l1, l3], r)) self.assertEqual(r, {"pl": l3, "pw": [l1, l3]}) - self.assertTrue(r["pl"] is l3) + self.assertIs(r["pl"], l3) r = {} def test_generate_matches(self): diff --git a/Lib/lib2to3/tests/test_refactor.py b/Lib/lib2to3/tests/test_refactor.py --- a/Lib/lib2to3/tests/test_refactor.py +++ b/Lib/lib2to3/tests/test_refactor.py @@ -49,9 +49,9 @@ def test_print_function_option(self): rt = self.rt({"print_function" : True}) - self.assertTrue(rt.grammar is pygram.python_grammar_no_print_statement) - self.assertTrue(rt.driver.grammar is - pygram.python_grammar_no_print_statement) + self.assertIs(rt.grammar, pygram.python_grammar_no_print_statement) + self.assertIs(rt.driver.grammar, + pygram.python_grammar_no_print_statement) def test_write_unchanged_files_option(self): rt = self.rt() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:51:09 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 14 Nov 2013 22:51:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTky?= =?utf-8?q?=3A_Use_specific_asserts_in_lib2to3_tests=2E?= Message-ID: <3dLGbd4SHdz7LmM@mail.python.org> http://hg.python.org/cpython/rev/46fc4fb2c8c5 changeset: 87102:46fc4fb2c8c5 branch: 3.3 parent: 87068:3710514aa92a user: Serhiy Storchaka date: Thu Nov 14 23:49:58 2013 +0200 summary: Issue #19592: Use specific asserts in lib2to3 tests. files: Lib/lib2to3/tests/test_fixers.py | 2 +- Lib/lib2to3/tests/test_main.py | 6 +++--- Lib/lib2to3/tests/test_parser.py | 4 ++-- Lib/lib2to3/tests/test_pytree.py | 16 ++++++++-------- Lib/lib2to3/tests/test_refactor.py | 6 +++--- 5 files changed, 17 insertions(+), 17 deletions(-) diff --git a/Lib/lib2to3/tests/test_fixers.py b/Lib/lib2to3/tests/test_fixers.py --- a/Lib/lib2to3/tests/test_fixers.py +++ b/Lib/lib2to3/tests/test_fixers.py @@ -41,7 +41,7 @@ def warns(self, before, after, message, unchanged=False): tree = self._check(before, after) - self.assertTrue(message in "".join(self.fixer_log)) + self.assertIn(message, "".join(self.fixer_log)) if not unchanged: self.assertTrue(tree.was_changed) diff --git a/Lib/lib2to3/tests/test_main.py b/Lib/lib2to3/tests/test_main.py --- a/Lib/lib2to3/tests/test_main.py +++ b/Lib/lib2to3/tests/test_main.py @@ -49,9 +49,9 @@ ret = self.run_2to3_capture(["-"], input_stream, out_enc, err) self.assertEqual(ret, 0) output = out.getvalue().decode("ascii") - self.assertTrue("-print 'nothing'" in output) - self.assertTrue("WARNING: couldn't encode 's diff for " - "your terminal" in err.getvalue()) + self.assertIn("-print 'nothing'", output) + self.assertIn("WARNING: couldn't encode 's diff for " + "your terminal", err.getvalue()) def setup_test_source_trees(self): """Setup a test source tree and output destination tree.""" diff --git a/Lib/lib2to3/tests/test_parser.py b/Lib/lib2to3/tests/test_parser.py --- a/Lib/lib2to3/tests/test_parser.py +++ b/Lib/lib2to3/tests/test_parser.py @@ -168,8 +168,8 @@ for filepath in support.all_project_files(): with open(filepath, "rb") as fp: encoding = tokenize.detect_encoding(fp.readline)[0] - self.assertTrue(encoding is not None, - "can't detect encoding for %s" % filepath) + self.assertIsNotNone(encoding, 
+ "can't detect encoding for %s" % filepath) with open(filepath, "r", encoding=encoding) as fp: source = fp.read() try: diff --git a/Lib/lib2to3/tests/test_pytree.py b/Lib/lib2to3/tests/test_pytree.py --- a/Lib/lib2to3/tests/test_pytree.py +++ b/Lib/lib2to3/tests/test_pytree.py @@ -143,12 +143,12 @@ l3 = pytree.Leaf(100, "bar") n1 = pytree.Node(1000, [l1, l2, l3]) self.assertEqual(n1.children, [l1, l2, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertFalse(n1.was_changed) l2new = pytree.Leaf(100, "-") l2.replace(l2new) self.assertEqual(n1.children, [l1, l2new, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertTrue(n1.was_changed) def test_replace_with_list(self): @@ -159,7 +159,7 @@ l2.replace([pytree.Leaf(100, "*"), pytree.Leaf(100, "*")]) self.assertEqual(str(n1), "foo**bar") - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) def test_leaves(self): l1 = pytree.Leaf(100, "foo") @@ -330,7 +330,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n1.next_sibling is n2) + self.assertIs(n1.next_sibling, n2) self.assertEqual(n2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -339,7 +339,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l1.next_sibling is l2) + self.assertIs(l1.next_sibling, l2) self.assertEqual(l2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -348,7 +348,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n2.prev_sibling is n1) + self.assertIs(n2.prev_sibling, n1) self.assertEqual(n1.prev_sibling, None) self.assertEqual(p1.prev_sibling, None) @@ -357,7 +357,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l2.prev_sibling is l1) + self.assertIs(l2.prev_sibling, l1) self.assertEqual(l1.prev_sibling, None) self.assertEqual(p1.prev_sibling, None) @@ -430,7 +430,7 @@ r = {} self.assertTrue(pw.match_seq([l1, l3], r)) self.assertEqual(r, {"pl": l3, "pw": [l1, l3]}) - self.assertTrue(r["pl"] is l3) + self.assertIs(r["pl"], l3) r = {} def test_generate_matches(self): diff --git a/Lib/lib2to3/tests/test_refactor.py b/Lib/lib2to3/tests/test_refactor.py --- a/Lib/lib2to3/tests/test_refactor.py +++ b/Lib/lib2to3/tests/test_refactor.py @@ -49,9 +49,9 @@ def test_print_function_option(self): rt = self.rt({"print_function" : True}) - self.assertTrue(rt.grammar is pygram.python_grammar_no_print_statement) - self.assertTrue(rt.driver.grammar is - pygram.python_grammar_no_print_statement) + self.assertIs(rt.grammar, pygram.python_grammar_no_print_statement) + self.assertIs(rt.driver.grammar, + pygram.python_grammar_no_print_statement) def test_write_unchanged_files_option(self): rt = self.rt() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:51:11 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 14 Nov 2013 22:51:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319592=3A_Use_specific_asserts_in_lib2to3_tests?= =?utf-8?q?=2E?= Message-ID: <3dLGbg0PvBz7Lmg@mail.python.org> http://hg.python.org/cpython/rev/1b9d8be8b07e changeset: 87103:1b9d8be8b07e parent: 87100:7e00bdada290 parent: 87102:46fc4fb2c8c5 user: Serhiy Storchaka date: Thu Nov 14 23:50:51 2013 +0200 summary: Issue #19592: Use specific asserts in lib2to3 tests. 
files: Lib/lib2to3/tests/test_fixers.py | 2 +- Lib/lib2to3/tests/test_main.py | 6 +++--- Lib/lib2to3/tests/test_parser.py | 4 ++-- Lib/lib2to3/tests/test_pytree.py | 16 ++++++++-------- Lib/lib2to3/tests/test_refactor.py | 6 +++--- 5 files changed, 17 insertions(+), 17 deletions(-) diff --git a/Lib/lib2to3/tests/test_fixers.py b/Lib/lib2to3/tests/test_fixers.py --- a/Lib/lib2to3/tests/test_fixers.py +++ b/Lib/lib2to3/tests/test_fixers.py @@ -41,7 +41,7 @@ def warns(self, before, after, message, unchanged=False): tree = self._check(before, after) - self.assertTrue(message in "".join(self.fixer_log)) + self.assertIn(message, "".join(self.fixer_log)) if not unchanged: self.assertTrue(tree.was_changed) diff --git a/Lib/lib2to3/tests/test_main.py b/Lib/lib2to3/tests/test_main.py --- a/Lib/lib2to3/tests/test_main.py +++ b/Lib/lib2to3/tests/test_main.py @@ -49,9 +49,9 @@ ret = self.run_2to3_capture(["-"], input_stream, out_enc, err) self.assertEqual(ret, 0) output = out.getvalue().decode("ascii") - self.assertTrue("-print 'nothing'" in output) - self.assertTrue("WARNING: couldn't encode 's diff for " - "your terminal" in err.getvalue()) + self.assertIn("-print 'nothing'", output) + self.assertIn("WARNING: couldn't encode 's diff for " + "your terminal", err.getvalue()) def setup_test_source_trees(self): """Setup a test source tree and output destination tree.""" diff --git a/Lib/lib2to3/tests/test_parser.py b/Lib/lib2to3/tests/test_parser.py --- a/Lib/lib2to3/tests/test_parser.py +++ b/Lib/lib2to3/tests/test_parser.py @@ -168,8 +168,8 @@ for filepath in support.all_project_files(): with open(filepath, "rb") as fp: encoding = tokenize.detect_encoding(fp.readline)[0] - self.assertTrue(encoding is not None, - "can't detect encoding for %s" % filepath) + self.assertIsNotNone(encoding, + "can't detect encoding for %s" % filepath) with open(filepath, "r", encoding=encoding) as fp: source = fp.read() try: diff --git a/Lib/lib2to3/tests/test_pytree.py b/Lib/lib2to3/tests/test_pytree.py --- a/Lib/lib2to3/tests/test_pytree.py +++ b/Lib/lib2to3/tests/test_pytree.py @@ -143,12 +143,12 @@ l3 = pytree.Leaf(100, "bar") n1 = pytree.Node(1000, [l1, l2, l3]) self.assertEqual(n1.children, [l1, l2, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertFalse(n1.was_changed) l2new = pytree.Leaf(100, "-") l2.replace(l2new) self.assertEqual(n1.children, [l1, l2new, l3]) - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) self.assertTrue(n1.was_changed) def test_replace_with_list(self): @@ -159,7 +159,7 @@ l2.replace([pytree.Leaf(100, "*"), pytree.Leaf(100, "*")]) self.assertEqual(str(n1), "foo**bar") - self.assertTrue(isinstance(n1.children, list)) + self.assertIsInstance(n1.children, list) def test_leaves(self): l1 = pytree.Leaf(100, "foo") @@ -330,7 +330,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n1.next_sibling is n2) + self.assertIs(n1.next_sibling, n2) self.assertEqual(n2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -339,7 +339,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l1.next_sibling is l2) + self.assertIs(l1.next_sibling, l2) self.assertEqual(l2.next_sibling, None) self.assertEqual(p1.next_sibling, None) @@ -348,7 +348,7 @@ n2 = pytree.Node(1000, []) p1 = pytree.Node(1000, [n1, n2]) - self.assertTrue(n2.prev_sibling is n1) + self.assertIs(n2.prev_sibling, n1) self.assertEqual(n1.prev_sibling, None) 
self.assertEqual(p1.prev_sibling, None) @@ -357,7 +357,7 @@ l2 = pytree.Leaf(100, "b") p1 = pytree.Node(1000, [l1, l2]) - self.assertTrue(l2.prev_sibling is l1) + self.assertIs(l2.prev_sibling, l1) self.assertEqual(l1.prev_sibling, None) self.assertEqual(p1.prev_sibling, None) @@ -430,7 +430,7 @@ r = {} self.assertTrue(pw.match_seq([l1, l3], r)) self.assertEqual(r, {"pl": l3, "pw": [l1, l3]}) - self.assertTrue(r["pl"] is l3) + self.assertIs(r["pl"], l3) r = {} def test_generate_matches(self): diff --git a/Lib/lib2to3/tests/test_refactor.py b/Lib/lib2to3/tests/test_refactor.py --- a/Lib/lib2to3/tests/test_refactor.py +++ b/Lib/lib2to3/tests/test_refactor.py @@ -49,9 +49,9 @@ def test_print_function_option(self): rt = self.rt({"print_function" : True}) - self.assertTrue(rt.grammar is pygram.python_grammar_no_print_statement) - self.assertTrue(rt.driver.grammar is - pygram.python_grammar_no_print_statement) + self.assertIs(rt.grammar, pygram.python_grammar_no_print_statement) + self.assertIs(rt.driver.grammar, + pygram.python_grammar_no_print_statement) def test_write_unchanged_files_option(self): rt = self.rt() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:56:34 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 22:56:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_pars?= =?utf-8?q?e=5Fsave=5Ffield=28=29_of_the_csv_module=2C_handle_PyList=5FApp?= =?utf-8?b?ZW5kKCk=?= Message-ID: <3dLGjt27Ydz7Lnm@mail.python.org> http://hg.python.org/cpython/rev/c9efe992e9b2 changeset: 87104:c9efe992e9b2 user: Victor Stinner date: Thu Nov 14 21:29:34 2013 +0100 summary: Issue #19437: Fix parse_save_field() of the csv module, handle PyList_Append() failure files: Modules/_csv.c | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Modules/_csv.c b/Modules/_csv.c --- a/Modules/_csv.c +++ b/Modules/_csv.c @@ -546,7 +546,10 @@ return -1; field = tmp; } - PyList_Append(self->fields, field); + if (PyList_Append(self->fields, field) < 0) { + Py_DECREF(field); + return -1; + } Py_DECREF(field); return 0; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:56:35 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 22:56:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_pars?= =?utf-8?q?e=5Fenvlist=28=29_of_the_posix/nt_module=2C_don=27t_call?= Message-ID: <3dLGjv4Gl1z7Lp8@mail.python.org> http://hg.python.org/cpython/rev/98ac18544722 changeset: 87105:98ac18544722 user: Victor Stinner date: Thu Nov 14 21:37:05 2013 +0100 summary: Issue #19437: Fix parse_envlist() of the posix/nt module, don't call PyMapping_Values() with an exception set, exit immediatly on error. 
files: Modules/posixmodule.c | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -5083,8 +5083,10 @@ } envc = 0; keys = PyMapping_Keys(env); + if (!keys) + goto error; vals = PyMapping_Values(env); - if (!keys || !vals) + if (!vals) goto error; if (!PyList_Check(keys) || !PyList_Check(vals)) { PyErr_Format(PyExc_TypeError, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:56:36 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 22:56:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319429=2C_=2319437?= =?utf-8?q?=3A_fix_error_handling_in_the_OSError_constructor?= Message-ID: <3dLGjw68Fhz7LnB@mail.python.org> http://hg.python.org/cpython/rev/61a712066770 changeset: 87106:61a712066770 user: Victor Stinner date: Thu Nov 14 22:31:41 2013 +0100 summary: Issue #19429, #19437: fix error handling in the OSError constructor files: Objects/exceptions.c | 10 ++++++---- 1 files changed, 6 insertions(+), 4 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -845,7 +845,7 @@ /* Steals the reference to args */ Py_CLEAR(self->args); self->args = args; - args = NULL; + *p_args = args = NULL; return 0; } @@ -885,11 +885,12 @@ PyObject *winerror = NULL; #endif + Py_INCREF(args); + if (!oserror_use_init(type)) { if (!_PyArg_NoKeywords(type->tp_name, kwds)) - return NULL; - - Py_INCREF(args); + goto error; + if (oserror_parse_args(&args, &myerrno, &strerror, &filename #ifdef MS_WINDOWS , &winerror @@ -932,6 +933,7 @@ goto error; } + Py_XDECREF(args); return (PyObject *) self; error: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 14 22:56:38 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 14 Nov 2013 22:56:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_init?= =?utf-8?q?=5Fbuiltin=28=29=2C_handle_=5FPyImport=5FFindExtensionObject=28?= =?utf-8?q?=29?= Message-ID: <3dLGjy1lFlz7Lnm@mail.python.org> http://hg.python.org/cpython/rev/f7d401eaee0e changeset: 87107:f7d401eaee0e user: Victor Stinner date: Thu Nov 14 22:38:52 2013 +0100 summary: Issue #19437: Fix init_builtin(), handle _PyImport_FindExtensionObject() failure files: Python/import.c | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Python/import.c b/Python/import.c --- a/Python/import.c +++ b/Python/import.c @@ -948,8 +948,12 @@ init_builtin(PyObject *name) { struct _inittab *p; + PyObject *mod; - if (_PyImport_FindExtensionObject(name, name) != NULL) + mod = _PyImport_FindExtensionObject(name, name); + if (PyErr_Occurred()) + return -1; + if (mod != NULL) return 1; for (p = PyImport_Inittab; p->name != NULL; p++) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 01:16:36 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 15 Nov 2013 01:16:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Refactor_waitpi?= =?utf-8?q?d_mocks=2E_Patch_by_Anthony_Baire=2E?= Message-ID: <3dLKqS10HVz7LjP@mail.python.org> http://hg.python.org/cpython/rev/a75b88048339 changeset: 87108:a75b88048339 user: Guido van Rossum date: Thu Nov 14 16:16:29 2013 -0800 summary: asyncio: Refactor waitpid mocks. Patch by Anthony Baire. 
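The refactoring below replaces five stacked ``unittest.mock.patch`` decorators on every test with one helper decorator that patches the ``os`` wait-status functions and passes each test a single namedtuple of mocks. A simplified, POSIX-only sketch of that pattern (the fake_* helpers and the two-field tuple are illustrative, not the exact code from the changeset):

    import collections
    import os
    import unittest
    import unittest.mock

    # Bundle of mocks handed to each test; the real change collects five.
    OsMocks = collections.namedtuple("OsMocks", ("waitpid", "WIFEXITED"))

    def waitpid_mocks(func):
        """Patch the os wait helpers once and pass the mocks as one argument."""
        def wrapped(self):
            def patch(target, wrapper):
                return unittest.mock.patch(target, wraps=wrapper,
                                           new_callable=unittest.mock.Mock)

            with patch('os.waitpid', self.fake_waitpid) as m_waitpid, \
                 patch('os.WIFEXITED', self.fake_wifexited) as m_wifexited:
                func(self, OsMocks(m_waitpid, m_wifexited))
        return wrapped

    class WaitpidMockExample(unittest.TestCase):
        def fake_waitpid(self, pid, flags):
            return (pid, 0)

        def fake_wifexited(self, status):
            return True

        @waitpid_mocks
        def test_bundled_mocks(self, m):
            os.waitpid(42, 0)
            self.assertTrue(m.waitpid.called)
            self.assertFalse(m.WIFEXITED.called)

    if __name__ == "__main__":
        unittest.main()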
files: Lib/test/test_asyncio/test_unix_events.py | 425 ++++----- 1 files changed, 181 insertions(+), 244 deletions(-) diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -1,5 +1,6 @@ """Tests for unix_events.py.""" +import collections import gc import errno import io @@ -705,8 +706,16 @@ NotImplementedError, watcher._do_waitpid, f) +WaitPidMocks = collections.namedtuple("WaitPidMocks", + ("waitpid", + "WIFEXITED", + "WIFSIGNALED", + "WEXITSTATUS", + "WTERMSIG", + )) + + class ChildWatcherTestsMixin: - instance = None ignore_warnings = unittest.mock.patch.object(unix_events.logger, "warning") @@ -715,21 +724,12 @@ self.running = False self.zombies = {} - assert ChildWatcherTestsMixin.instance is None - ChildWatcherTestsMixin.instance = self - with unittest.mock.patch.object( self.loop, "add_signal_handler") as self.m_add_signal_handler: self.watcher = self.create_watcher() self.watcher.attach_loop(self.loop) - def cleanup(): - ChildWatcherTestsMixin.instance = None - - self.addCleanup(cleanup) - - def waitpid(pid, flags): - self = ChildWatcherTestsMixin.instance + def waitpid(self, pid, flags): if isinstance(self.watcher, unix_events.SafeChildWatcher) or pid != -1: self.assertGreater(pid, 0) try: @@ -747,33 +747,43 @@ def add_zombie(self, pid, returncode): self.zombies[pid] = returncode + 32768 - def WIFEXITED(status): + def WIFEXITED(self, status): return status >= 32768 - def WIFSIGNALED(status): + def WIFSIGNALED(self, status): return 32700 < status < 32768 - def WEXITSTATUS(status): - self = ChildWatcherTestsMixin.instance - self.assertTrue(type(self).WIFEXITED(status)) + def WEXITSTATUS(self, status): + self.assertTrue(self.WIFEXITED(status)) return status - 32768 - def WTERMSIG(status): - self = ChildWatcherTestsMixin.instance - self.assertTrue(type(self).WIFSIGNALED(status)) + def WTERMSIG(self, status): + self.assertTrue(self.WIFSIGNALED(status)) return 32768 - status def test_create_watcher(self): self.m_add_signal_handler.assert_called_once_with( signal.SIGCHLD, self.watcher._sig_chld) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): + def waitpid_mocks(func): + def wrapped_func(self): + def patch(target, wrapper): + return unittest.mock.patch(target, wraps=wrapper, + new_callable=unittest.mock.Mock) + + with patch('os.WTERMSIG', self.WTERMSIG) as m_WTERMSIG, \ + patch('os.WEXITSTATUS', self.WEXITSTATUS) as m_WEXITSTATUS, \ + patch('os.WIFSIGNALED', self.WIFSIGNALED) as m_WIFSIGNALED, \ + patch('os.WIFEXITED', self.WIFEXITED) as m_WIFEXITED, \ + patch('os.waitpid', self.waitpid) as m_waitpid: + func(self, WaitPidMocks(m_waitpid, + m_WIFEXITED, m_WIFSIGNALED, + m_WEXITSTATUS, m_WTERMSIG, + )) + return wrapped_func + + @waitpid_mocks + def test_sigchld(self, m): # register a child callback = unittest.mock.Mock() @@ -782,33 +792,33 @@ self.watcher.add_child_handler(42, callback, 9, 10, 14) self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + 
self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child is running self.watcher._sig_chld() self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child terminates (returncode 12) self.running = False self.add_zombie(42, 12) self.watcher._sig_chld() - self.assertTrue(m_WIFEXITED.called) - self.assertTrue(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertTrue(m.WIFEXITED.called) + self.assertTrue(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) callback.assert_called_once_with(42, 12, 9, 10, 14) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WEXITSTATUS.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WEXITSTATUS.reset_mock() callback.reset_mock() # ensure that the child is effectively reaped @@ -817,29 +827,24 @@ self.watcher._sig_chld() self.assertFalse(callback.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WEXITSTATUS.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WEXITSTATUS.reset_mock() # sigchld called again self.zombies.clear() self.watcher._sig_chld() self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_two_children(self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, - m_WEXITSTATUS, m_WTERMSIG): + @waitpid_mocks + def test_sigchld_two_children(self, m): callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() @@ -850,10 +855,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # register child 2 with self.watcher: @@ -861,20 +866,20 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # childen are running self.watcher._sig_chld() self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + 
self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child 1 terminates (signal 3) self.add_zombie(43, -3) @@ -882,13 +887,13 @@ callback1.assert_called_once_with(43, -3, 7, 8) self.assertFalse(callback2.called) - self.assertTrue(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertTrue(m_WTERMSIG.called) + self.assertTrue(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertTrue(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WTERMSIG.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WTERMSIG.reset_mock() callback1.reset_mock() # child 2 still running @@ -896,10 +901,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child 2 terminates (code 108) self.add_zombie(44, 108) @@ -908,13 +913,13 @@ callback2.assert_called_once_with(44, 108, 147, 18) self.assertFalse(callback1.called) - self.assertTrue(m_WIFEXITED.called) - self.assertTrue(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertTrue(m.WIFEXITED.called) + self.assertTrue(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WEXITSTATUS.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WEXITSTATUS.reset_mock() callback2.reset_mock() # ensure that the children are effectively reaped @@ -925,11 +930,11 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WEXITSTATUS.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WEXITSTATUS.reset_mock() # sigchld called again self.zombies.clear() @@ -937,19 +942,13 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_two_children_terminating_together( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): + @waitpid_mocks + def test_sigchld_two_children_terminating_together(self, m): callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() @@ -960,10 +959,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + 
self.assertFalse(m.WTERMSIG.called) # register child 2 with self.watcher: @@ -971,20 +970,20 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # childen are running self.watcher._sig_chld() self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child 1 terminates (code 78) # child 2 terminates (signal 5) @@ -995,15 +994,15 @@ callback1.assert_called_once_with(45, 78, 17, 8) callback2.assert_called_once_with(46, -5, 1147, 18) - self.assertTrue(m_WIFSIGNALED.called) - self.assertTrue(m_WIFEXITED.called) - self.assertTrue(m_WEXITSTATUS.called) - self.assertTrue(m_WTERMSIG.called) + self.assertTrue(m.WIFSIGNALED.called) + self.assertTrue(m.WIFEXITED.called) + self.assertTrue(m.WEXITSTATUS.called) + self.assertTrue(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WTERMSIG.reset_mock() - m_WEXITSTATUS.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WTERMSIG.reset_mock() + m.WEXITSTATUS.reset_mock() callback1.reset_mock() callback2.reset_mock() @@ -1015,16 +1014,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WTERMSIG.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_race_condition( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): + @waitpid_mocks + def test_sigchld_race_condition(self, m): # register a child callback = unittest.mock.Mock() @@ -1045,14 +1038,8 @@ self.assertFalse(callback.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_replace_handler( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): + @waitpid_mocks + def test_sigchld_replace_handler(self, m): callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() @@ -1063,10 +1050,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # register the same child again with self.watcher: @@ -1074,10 +1061,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WIFEXITED.called) - 
self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child terminates (signal 8) self.running = False @@ -1086,13 +1073,13 @@ callback2.assert_called_once_with(51, -8, 21) self.assertFalse(callback1.called) - self.assertTrue(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertTrue(m_WTERMSIG.called) + self.assertTrue(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertTrue(m.WTERMSIG.called) - m_WIFSIGNALED.reset_mock() - m_WIFEXITED.reset_mock() - m_WTERMSIG.reset_mock() + m.WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WTERMSIG.reset_mock() callback2.reset_mock() # ensure that the child is effectively reaped @@ -1102,15 +1089,10 @@ self.assertFalse(callback1.called) self.assertFalse(callback2.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WTERMSIG.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_remove_handler(self, m_waitpid, m_WIFEXITED, - m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + @waitpid_mocks + def test_sigchld_remove_handler(self, m): callback = unittest.mock.Mock() # register a child @@ -1119,19 +1101,19 @@ self.watcher.add_child_handler(52, callback, 1984) self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # unregister the child self.watcher.remove_child_handler(52) self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child terminates (code 99) self.running = False @@ -1141,13 +1123,8 @@ self.assertFalse(callback.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_unknown_status(self, m_waitpid, m_WIFEXITED, - m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + @waitpid_mocks + def test_sigchld_unknown_status(self, m): callback = unittest.mock.Mock() # register a child @@ -1156,10 +1133,10 @@ self.watcher.add_child_handler(53, callback, -19) self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # terminate with unknown status self.zombies[53] = 1178 @@ -1167,14 +1144,14 @@ 
self.watcher._sig_chld() callback.assert_called_once_with(53, 1178, -19) - self.assertTrue(m_WIFEXITED.called) - self.assertTrue(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertTrue(m.WIFEXITED.called) + self.assertTrue(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) callback.reset_mock() - m_WIFEXITED.reset_mock() - m_WIFSIGNALED.reset_mock() + m.WIFEXITED.reset_mock() + m.WIFSIGNALED.reset_mock() # ensure that the child is effectively reaped self.add_zombie(53, 101) @@ -1183,13 +1160,8 @@ self.assertFalse(callback.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_remove_child_handler(self, m_waitpid, m_WIFEXITED, - m_WIFSIGNALED, m_WEXITSTATUS, m_WTERMSIG): + @waitpid_mocks + def test_remove_child_handler(self, m): callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() callback3 = unittest.mock.Mock() @@ -1221,8 +1193,8 @@ self.assertFalse(callback2.called) callback3.assert_called_once_with(56, 2, 3) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_unhandled_exception(self, m_waitpid): + @waitpid_mocks + def test_sigchld_unhandled_exception(self, m): callback = unittest.mock.Mock() # register a child @@ -1231,7 +1203,7 @@ self.watcher.add_child_handler(57, callback) # raise an exception - m_waitpid.side_effect = ValueError + m.waitpid.side_effect = ValueError with unittest.mock.patch.object(unix_events.logger, "exception") as m_exception: @@ -1239,15 +1211,8 @@ self.assertEqual(self.watcher._sig_chld(), None) self.assertTrue(m_exception.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_child_reaped_elsewhere( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): - + @waitpid_mocks + def test_sigchld_child_reaped_elsewhere(self, m): # register a child callback = unittest.mock.Mock() @@ -1256,10 +1221,10 @@ self.watcher.add_child_handler(58, callback) self.assertFalse(callback.called) - self.assertFalse(m_WIFEXITED.called) - self.assertFalse(m_WIFSIGNALED.called) - self.assertFalse(m_WEXITSTATUS.called) - self.assertFalse(m_WTERMSIG.called) + self.assertFalse(m.WIFEXITED.called) + self.assertFalse(m.WIFSIGNALED.called) + self.assertFalse(m.WEXITSTATUS.called) + self.assertFalse(m.WTERMSIG.called) # child terminates self.running = False @@ -1268,13 +1233,13 @@ # waitpid is called elsewhere os.waitpid(58, os.WNOHANG) - m_waitpid.reset_mock() + m.waitpid.reset_mock() # sigchld with self.ignore_warnings: self.watcher._sig_chld() - callback.assert_called(m_waitpid) + callback.assert_called(m.waitpid) if isinstance(self.watcher, unix_events.FastChildWatcher): # here the FastChildWatche enters a deadlock # (there is no way to prevent it) @@ -1282,15 +1247,8 @@ else: callback.assert_called_once_with(58, 255) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', 
wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_sigchld_unknown_pid_during_registration( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): - + @waitpid_mocks + def test_sigchld_unknown_pid_during_registration(self, m): # register two children callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() @@ -1310,15 +1268,8 @@ callback1.assert_called_once_with(591, 7) self.assertFalse(callback2.called) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_set_loop( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): - + @waitpid_mocks + def test_set_loop(self, m): # register a child callback = unittest.mock.Mock() @@ -1351,15 +1302,8 @@ callback.assert_called_once_with(60, 9) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_set_loop_race_condition( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): - + @waitpid_mocks + def test_set_loop_race_condition(self, m): # register 3 children callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() @@ -1418,15 +1362,8 @@ self.assertFalse(callback2.called) callback3.assert_called_once_with(622, 19) - @unittest.mock.patch('os.WTERMSIG', wraps=WTERMSIG) - @unittest.mock.patch('os.WEXITSTATUS', wraps=WEXITSTATUS) - @unittest.mock.patch('os.WIFSIGNALED', wraps=WIFSIGNALED) - @unittest.mock.patch('os.WIFEXITED', wraps=WIFEXITED) - @unittest.mock.patch('os.waitpid', wraps=waitpid) - def test_close( - self, m_waitpid, m_WIFEXITED, m_WIFSIGNALED, m_WEXITSTATUS, - m_WTERMSIG): - + @waitpid_mocks + def test_close(self, m): # register two children callback1 = unittest.mock.Mock() callback2 = unittest.mock.Mock() -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Fri Nov 15 07:34:53 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 15 Nov 2013 07:34:53 +0100 Subject: [Python-checkins] Daily reference leaks (a75b88048339): sum=84 Message-ID: results for a75b88048339 on branch "default" -------------------------------------------- test_codecs leaked [21, 21, 21] references, sum=63 test_codecs leaked [7, 7, 7] memory blocks, sum=21 test_site leaked [0, 2, -2] references, sum=0 test_site leaked [0, 2, -2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog_aMkMv', '-x'] From python-checkins at python.org Fri Nov 15 12:47:51 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 15 Nov 2013 12:47:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_19609=3A_narrow_scop?= =?utf-8?q?e_of_codec_exc_chaining?= Message-ID: <3dLd930Sl3z7Ll1@mail.python.org> http://hg.python.org/cpython/rev/4ea622c085ca changeset: 87109:4ea622c085ca user: Nick Coghlan date: Fri Nov 15 21:47:37 2013 +1000 summary: Close 19609: narrow scope of codec exc chaining files: Lib/test/test_codecs.py | 37 ++++++++++++++++++++-------- Python/codecs.c | 10 ++++--- 2 files changed, 32 insertions(+), 15 deletions(-) diff --git a/Lib/test/test_codecs.py 
b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -2435,22 +2435,22 @@ self.check_wrapped(RuntimeError(msg), msg) @contextlib.contextmanager - def assertNotWrapped(self, operation, exc_type, msg): + def assertNotWrapped(self, operation, exc_type, msg_re, msg=None): + if msg is None: + msg = msg_re with self.assertRaisesRegex(exc_type, msg) as caught: yield caught - actual_msg = str(caught.exception) - self.assertNotIn(operation, actual_msg) - self.assertNotIn(self.codec_name, actual_msg) + self.assertEqual(str(caught.exception), msg) - def check_not_wrapped(self, obj_to_raise, msg): + def check_not_wrapped(self, obj_to_raise, msg_re, msg=None): self.set_codec(obj_to_raise) - with self.assertNotWrapped("encoding", RuntimeError, msg): + with self.assertNotWrapped("encoding", RuntimeError, msg_re, msg): "str input".encode(self.codec_name) - with self.assertNotWrapped("encoding", RuntimeError, msg): + with self.assertNotWrapped("encoding", RuntimeError, msg_re, msg): codecs.encode("str input", self.codec_name) - with self.assertNotWrapped("decoding", RuntimeError, msg): + with self.assertNotWrapped("decoding", RuntimeError, msg_re, msg): b"bytes input".decode(self.codec_name) - with self.assertNotWrapped("decoding", RuntimeError, msg): + with self.assertNotWrapped("decoding", RuntimeError, msg_re, msg): codecs.decode(b"bytes input", self.codec_name) def test_init_override_is_not_wrapped(self): @@ -2475,8 +2475,23 @@ self.check_not_wrapped(RuntimeError(1), "1") def test_multiple_args_is_not_wrapped(self): - msg = "\('a', 'b', 'c'\)" - self.check_not_wrapped(RuntimeError('a', 'b', 'c'), msg) + msg_re = "\('a', 'b', 'c'\)" + msg = "('a', 'b', 'c')" + self.check_not_wrapped(RuntimeError('a', 'b', 'c'), msg_re, msg) + + # http://bugs.python.org/issue19609 + def test_codec_lookup_failure_not_wrapped(self): + msg = "unknown encoding: %s" % self.codec_name + # The initial codec lookup should not be wrapped + with self.assertNotWrapped("encoding", LookupError, msg): + "str input".encode(self.codec_name) + with self.assertNotWrapped("encoding", LookupError, msg): + codecs.encode("str input", self.codec_name) + with self.assertNotWrapped("decoding", LookupError, msg): + b"bytes input".decode(self.codec_name) + with self.assertNotWrapped("decoding", LookupError, msg): + codecs.decode(b"bytes input", self.codec_name) + @unittest.skipUnless(sys.platform == 'win32', diff --git a/Python/codecs.c b/Python/codecs.c --- a/Python/codecs.c +++ b/Python/codecs.c @@ -370,8 +370,10 @@ goto onError; result = PyEval_CallObject(encoder, args); - if (result == NULL) + if (result == NULL) { + wrap_codec_error("encoding", encoding); goto onError; + } if (!PyTuple_Check(result) || PyTuple_GET_SIZE(result) != 2) { @@ -392,7 +394,6 @@ Py_XDECREF(result); Py_XDECREF(args); Py_XDECREF(encoder); - wrap_codec_error("encoding", encoding); return NULL; } @@ -418,8 +419,10 @@ goto onError; result = PyEval_CallObject(decoder,args); - if (result == NULL) + if (result == NULL) { + wrap_codec_error("decoding", encoding); goto onError; + } if (!PyTuple_Check(result) || PyTuple_GET_SIZE(result) != 2) { PyErr_SetString(PyExc_TypeError, @@ -439,7 +442,6 @@ Py_XDECREF(args); Py_XDECREF(decoder); Py_XDECREF(result); - wrap_codec_error("decoding", encoding); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 13:20:27 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 15 Nov 2013 13:20:27 +0100 (CET) Subject: [Python-checkins] 
=?utf-8?q?peps=3A_Add_PEP_458=3A_Surviving_a_co?= =?utf-8?q?mpromise_of_PyPI?= Message-ID: <3dLdtg0r99z7Ljn@mail.python.org> http://hg.python.org/peps/rev/09de1a759c5a changeset: 5272:09de1a759c5a user: Nick Coghlan date: Fri Nov 15 22:20:14 2013 +1000 summary: Add PEP 458: Surviving a compromise of PyPI files: pep-0458.txt | 1083 ++++++++++++++++++++++++++++++++++++++ 1 files changed, 1083 insertions(+), 0 deletions(-) diff --git a/pep-0458.txt b/pep-0458.txt new file mode 100644 --- /dev/null +++ b/pep-0458.txt @@ -0,0 +1,1083 @@ +PEP: 458 +Title: Surviving a Compromise of PyPI +Version: $Revision$ +Last-Modified: $Date$ +Author: Trishank Karthik Kuppusamy , + Donald Stufft , + Justin Cappos +Discussions-To: Distutils SIG +Status: Draft +Type: Standards Track +Content-Type: text/x-rst +Created: 27-Sep-2013 + + +Abstract +======== + +This PEP describes how the Python Package Index (PyPI [1]_) may be integrated +with The Update Framework [2]_ (TUF). TUF was designed to be a plug-and-play +security add-on to a software updater or package manager. TUF provides +end-to-end security like SSL, but for software updates instead of HTTP +connections. The framework integrates best security practices such as +separating responsibilities, adopting the many-man rule for signing packages, +keeping signing keys offline, and revocation of expired or compromised signing +keys. + +The proposed integration will render modern package managers such as pip [3]_ +more secure against various types of security attacks on PyPI and protect users +against them. Even in the worst case where an attacker manages to compromise +PyPI itself, the damage is controlled in scope and limited in duration. + +Specifically, this PEP will describe how PyPI processes should be adapted to +incorporate TUF metadata. It will not prescribe how package managers such as +pip should be adapted to install or update with TUF metadata projects from +PyPI. + + +Rationale +========= + +In January 2013, the Python Software Foundation (PSF) announced [4]_ that the +python.org wikis for Python, Jython, and the PSF were subjected to a security +breach which caused all of the wiki data to be destroyed on January 5 2013. +Fortunately, the PyPI infrastructure was not affected by this security breach. +However, the incident is a reminder that PyPI should take defensive steps to +protect users as much as possible in the event of a compromise. Attacks on +software repositories happen all the time [5]_. We must accept the possibility +of security breaches and prepare PyPI accordingly because it is a valuable +target used by thousands, if not millions, of people. + +Before the wiki attack, PyPI used MD5 hashes to tell package managers such as +pip whether or not a package was corrupted in transit. However, the absence of +SSL made it hard for package managers to verify transport integrity to PyPI. +It was easy to launch a man-in-the-middle attack between pip and PyPI to change +package contents arbitrarily. This can be used to trick users into installing +malicious packages. After the wiki attack, several steps were proposed (some +of which were implemented) to deliver a much higher level of security than was +previously the case: requiring SSL to communicate with PyPI [6]_, restricting +project names [7]_, and migrating from MD5 to SHA-2 hashes [8]_. + +These steps, though necessary, are insufficient because attacks are still +possible through other avenues. 
For example, a public mirror is trusted to +honestly mirror PyPI, but some mirrors may misbehave due to malice or accident. +Package managers such as pip are supposed to use signatures from PyPI to verify +packages downloaded from a public mirror [9]_, but none are known to actually +do so [10]_. Therefore, it is also wise to add more security measures to +detect attacks from public mirrors or content delivery networks [11]_ (CDNs). + +Even though official mirrors are being deprecated on PyPI [12]_, there remain a +wide variety of other attack vectors on package managers [13]_. Among other +things, these attacks can crash client systems, cause obsolete packages to be +installed, or even allow an attacker to execute arbitrary code. In September +2013, we showed how the latest version of pip then was susceptible to these +attacks and how TUF could protect users against them [14]_. + +Finally, PyPI allows for packages to be signed with GPG keys [15]_, although no +package manager is known to verify those signatures, thus negating much of the +benefits of having those signatures at all. Validating integrity through +cryptography is important, but issues such as immediate and secure key +revocation or specifying a required threshold number of signatures still +remain. Furthermore, GPG by itself does not immediately address the attacks +mentioned above. + +In order to protect PyPI against infrastructure compromises, we propose +integrating PyPI with The Update Framework [2]_ (TUF). + + +Definitions +=========== + +The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", +"SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be +interpreted as described in RFC 2119__. + +__ http://www.ietf.org/rfc/rfc2119.txt + +In order to keep this PEP focused solely on the application of TUF on PyPI, the +reader is assumed to already be familiar with the design principles of +TUF [2]_. It is also strongly RECOMMENDED that the reader be familiar with the +TUF specification [16]_. + +* Projects: Projects are software components that are made available for + integration. Projects include Python libraries, frameworks, scripts, plugins, + applications, collections of data or other resources, and various + combinations thereof. Public Python projects are typically registered on the + Python Package Index [17]_. + +* Releases: Releases are uniquely identified snapshots of a project [17]_. + +* Distributions: Distributions are the packaged files which are used to publish + and distribute a release [17]_. + +* Simple index: The HTML page which contains internal links to the + distributions of a project [17]_. + +* Consistent snapshot: A set of TUF metadata and PyPI targets that capture the + complete state of all projects on PyPI as they were at some fixed point in + time. + +* The *consistent-snapshot* (*release*) role: In order to prevent confusion due + to the different meanings of the term "release" as employed by PEP 426 [17]_ + and the TUF specification [16]_, we rename the *release* role as the + *consistent-snapshot* role. + +* Continuous delivery: A set of processes with which PyPI produces consistent + snapshots that can safely coexist and deleted independently [18]_. + +* Developer: Either the owner or maintainer of a project who is allowed to + update the TUF metadata as well as distribution metadata and data for the + project. + +* Online key: A key that MUST be stored on the PyPI server infrastructure. + This is usually to allow automated signing with the key. 
However, this means + that an attacker who compromises PyPI infrastructure will be able to read + these keys. + +* Offline key: A key that MUST be stored off the PyPI infrastructure. This + prevents automated signing with the key. This means that an attacker who + compromises PyPI infrastructure will not be able to immediately read these + keys. + +* Developer key: A private key for which its corresponding public key is + registered with PyPI to say that it is responsible for directly signing for + or delegating the distributions belonging to a project. For the purposes of + this PEP, it is offline in the sense that the private key MUST not be stored + on PyPI. However, the project is free to require certain developer keys to + be online on its own infrastructure. + +* Threshold signature scheme: A role could increase its resilience to key + compromises by requiring that at least t out of n keys are REQUIRED to sign + its metadata. This means that a compromise of t-1 keys is insufficient to + compromise the role itself. We denote this property by saying that the role + requires (t, n) keys. + + +Overview +======== + +.. image:: https://raw.github.com/theupdateframework/pep-on-pypi-with-tuf/master/figure1.png + +Figure 1: A simplified overview of the roles in PyPI with TUF + +Figure 1 shows a simplified overview of the roles that TUF metadata assume on +PyPI. The top-level *root* role signs for the keys of the top-level +*timestamp*, *consistent-snapshot*, *targets* and *root* roles. The +*timestamp* role signs for a new and consistent snapshot. The *consistent- +snapshot* role signs for the *root*, *targets* and all delegated targets +metadata. The *claimed* role signs for all projects that have registered their +own developer keys with PyPI. The *recently-claimed* role signs for all +projects that recently registered their own developer keys with PyPI. Finally, +the *unclaimed* role signs for all projects that have not registered developer +keys with PyPI. The *claimed*, *recently-claimed* and *unclaimed* roles are +numbered 1, 2, 3 respectively because a project will be searched for in each of +those roles in that descending order: first in *claimed*, then in +*recently-claimed* if necessary, and finally in *unclaimed* if necessary. + +Every year, PyPI administrators are going to sign for *root* role keys. After +that, automation will continuously sign for a timestamped, consistent snapshot +of all projects. Every few months, PyPI administrators will move projects with +vetted developer keys from the *recently-claimed* role to the *claimed* role. +As we will soon see, they will sign for *claimed* with projects with offline +keys. + +This PEP does not require project developers to use TUF to secure their +packages from attacks on PyPI. By default, all projects will be signed for by +the *unclaimed* role. If a project wishes stronger security guarantees, then +the project is strongly RECOMMENDED to register developer keys with PyPI so +that it may sign for its own distributions. By doing so, the project must +remain as a *recently-claimed* project until PyPI administrators have had an +opportunity to vet the developer keys of the project, after which the project +will be moved to the *claimed* role. + +This PEP has **not** been designed to be backward-compatible for package +managers that do not use the TUF security protocol to install or update a +project from the PyPI described here. 
Instead, it is RECOMMENDED that PyPI maintain a backward-compatible API of
itself that does NOT offer TUF, so that older package managers that do not
use TUF will be able to install or update projects from PyPI as usual, but
without any of the security offered by TUF. For the rest of this PEP, we will
assume that PyPI will simultaneously maintain a backward-incompatible API of
itself for package managers that MUST use TUF to securely install or update
projects. We think that this approach represents a reasonable trade-off:
older package managers that do not use TUF will still be able to install or
update projects without any TUF security from PyPI, and newer package
managers that do use TUF will be able to securely install or update projects.
At some point in the future, PyPI administrators MAY choose to permanently
deprecate the backward-compatible version of itself that does not offer TUF
metadata.

Unless a mirror, CDN or the PyPI repository has been compromised, the
end-user will not be able to discern whether or not a package manager is
using TUF to install or update a project from PyPI.


Responsibility Separation
=========================

Recall that TUF requires four top-level roles: *root*, *timestamp*,
*consistent-snapshot* and *targets*. The *root* role specifies the keys of
all the top-level roles (including itself). The *timestamp* role specifies
the latest consistent snapshot. The *consistent-snapshot* role specifies the
latest versions of all TUF metadata files (other than *timestamp*). The
*targets* role specifies the available target files (in our case, all files
on PyPI under the /simple and /packages directories). In this PEP, each of
these roles will serve their responsibilities without exception.

Our proposal offers two levels of security to developers. If developers opt
in to secure their projects with their own developer keys, then their
projects will be very secure. Otherwise, TUF will still protect them in many
cases:

1. Minimum security (no action by a developer): protects *unclaimed* and
   *recently-claimed* projects without developer keys from CDNs [19]_ or
   public mirrors, but not from some PyPI compromises. This is because
   continuous delivery requires some keys to be online. This level of
   security protects projects from being accidentally or deliberately
   tampered with by a mirror or a CDN because the mirror or CDN will not have
   any of the PyPI or developer keys required to sign for projects. However,
   it would not protect projects from attackers who have compromised PyPI,
   because they will be able to manipulate the TUF metadata for *unclaimed*
   projects with the appropriate online keys.

2. Maximum security (developer signs their project): protects projects with
   developer keys not only from CDNs or public mirrors, but also from some
   PyPI compromises. This is because many important keys will be offline.
   This level of security protects projects from being accidentally or
   deliberately tampered with by a mirror or a CDN for reasons identical to
   the minimum security level. It will also protect projects (or at least
   mitigate damages) from the most likely attacks on PyPI. For example: given
   access to online keys after a PyPI compromise, attackers will be able to
   freeze the distributions for these projects, but they will not be able to
   serve malicious distributions for these projects (not without compromising
   other offline keys, which would entail more risk, time and energy).
Details for + the exact level of security offered is discussed in the section on key + management. + +In order to complete support for continuous delivery, we propose three +delegated targets roles: + +1. *claimed*: Signs for the delegation of PyPI projects to their respective + developer keys. + +2. *recently-claimed*: This role is almost identical to the *claimed* role and + could technically be performed by the *unclaimed* role, but there are two + important reasons why it exists independently: the first reason is to + improve the performance of looking up projects in the *unclaimed* role (by + moving metadata to the *recently-claimed* role instead), and the second + reason is to make it easier for PyPI administrators to move + *recently-claimed* projects to the *claimed* role. + +3. *unclaimed*: Signs for PyPI projects without developer keys. + +The *targets* role MUST delegate all PyPI projects to the three delegated +targets roles in the order of appearance listed above. This means that when +pip downloads with TUF a distribution from a project on PyPI, it will first +consult the *claimed* role about it. If the *claimed* role has delegated the +project, then pip will trust the project developers (in order of delegation) +about the TUF metadata for the project. Otherwise, pip will consult the +*recently-claimed* role about the project. If the *recently-claimed* role has +delegated the project, then pip will trust the project developers (in order of +delegation) about the TUF metadata for the project. Otherwise, pip will +consult the *unclaimed* role about the TUF metadata for the project. If the +*unclaimed* role has not delegated the project, then the project is considered +to be non-existent on PyPI. + +A PyPI project MAY begin without registering a developer key. Therefore, the +project will be signed for by the *unclaimed* role. After registering +developer keys, the project will be removed from the *unclaimed* role and +delegated to the *recently-claimed* role. After a probation period and a +vetting process to verify the developer keys of the project, the project will +be removed from the *recently-claimed* role and delegated to the *claimed* +role. + +The *claimed* role offers maximum security, whereas the *recently-claimed* and +*unclaimed* role offer minimum security. All three roles support continuous +delivery of PyPI projects. + +The *unclaimed* role offers minimum security because PyPI will sign for +projects without developer keys with an online key in order to permit +continuous delivery. + +The *recently-claimed* role offers minimum security because while the project +developers will sign for their own distributions with offline developer keys, +PyPI will sign with an online key the delegation of the project to those +offline developer keys. The signing of the delegation with an online key +allows PyPI administrators to continuously deliver projects without having to +continuously sign the delegation whenever one of those projects registers +developer keys. + +Finally, the *claimed* role offers maximum security because PyPI will sign with +offline keys the delegation of a project to its offline developer keys. This +means that every now and then, PyPI administrators will vet developer keys and +sign the delegation of a project to those developer keys after being reasonably +sure about the ownership of the developer keys. The process for vetting +developer keys is out of the scope of this PEP. 
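To make the search order described above concrete, the following minimal
sketch shows how a TUF-aware client such as pip could resolve a project name
against the three delegated targets roles. It is purely illustrative and not
part of this PEP: the ``roles`` mapping and its ``get_delegation()`` method
are hypothetical stand-ins for whatever interface a TUF client library
actually provides::

    ROLE_SEARCH_ORDER = ("claimed", "recently-claimed", "unclaimed")

    def find_delegation(roles, project_name):
        """Search the delegated targets roles in descending order.

        Returns a (role_name, delegation) pair, or None if no role
        delegates the project, in which case the project is considered
        to be non-existent on PyPI.
        """
        # `roles` maps role names to hypothetical role objects loaded
        # from the corresponding TUF metadata files.
        for role_name in ROLE_SEARCH_ORDER:
            delegation = roles[role_name].get_delegation(project_name)
            if delegation is not None:
                return role_name, delegation
        return None

If a delegation is found, the client then trusts the project developers (in
the order of delegation) for the TUF metadata of the project, exactly as
described above.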
+ + +Metadata Management +=================== + +In this section, we examine the TUF metadata that PyPI must manage by itself, +and other TUF metadata that must be safely delegated to projects. Examples of +the metadata described here may be seen at our testbed mirror of +`PyPI-with-TUF`__. + +__ http://mirror1.poly.edu/ + +The metadata files that change most frequently will be *timestamp*, +*consistent-snapshot* and delegated targets (*claimed*, *recently-claimed*, +*unclaimed*, project) metadata. The *timestamp* and *consistent-snapshot* +metadata MUST be updated whenever *root*, *targets* or delegated targets +metadata are updated. Observe, though, that *root* and *targets* metadata are +much less likely to be updated as often as delegated targets metadata. +Therefore, *timestamp* and *consistent-snapshot* metadata will most likely be +updated frequently (possibly every minute) due to delegated targets metadata +being updated frequently in order to drive continuous delivery of projects. + +Consequently, the processes with which PyPI updates projects will have to be +updated accordingly, the details of which are explained in the following +subsections. + + +Why Do We Need Consistent Snapshots? +------------------------------------ + +In an ideal world, metadata and data should be immediately updated and +presented whenever a project is updated. In practice, there will be problems +when there are many readers and writers who access the same metadata or data at +the same time. + +An important example at the time of writing is that, mirrors are very likely, +as far as we can tell, to update in an inconsistent manner from PyPI as it is +without TUF. Specifically, a mirror would update itself in such a way that +project A would be from time T, whereas project B would be from time T+5, +project C would be from time T+3, and so on where T is the time that the mirror +first begun updating itself. There is no known way for a mirror to update +itself such that it captures the state of all projects as they were at time T. + +Adding TUF to PyPI will not automatically solve the problem. Consider what we +call the `"inverse replay" or "fast-forward" problem`__. Suppose that PyPI has +timestamped a consistent snapshot at version 1. A mirror is later in the +middle of copying PyPI at this snapshot. While the mirror is copying PyPI at +this snapshot, PyPI timestamps a new snapshot at, say, version 2. Without +accounting for consistency, the mirror would then find itself with a copy of +PyPI in an inconsistent state which is indistinguishable from arbitrary +metadata or target attacks. The problem would also apply when the mirror is +substituted with a pip user. + +__ https://groups.google.com/forum/#!topic/theupdateframework/8mkR9iqivQA + +Therefore, the problem can be summarized as such: there are problems of +consistency on PyPI with or without TUF. TUF requires its metadata to be +consistent with the data, but how would the metadata be kept consistent with +projects that change all the time? + +As a result, we will solve for PyPI the problem of producing a consistent +snapshot that captures the state of all known projects at a given time. Each +consistent snapshot can safely coexist with any other consistent snapshot and +deleted independently without affecting any other consistent snapshot. + +The gist of the solution is that every metadata or data file written to disk +MUST include in its filename the `cryptographic hash`__ of the file. 
How would this help clients which use the TUF protocol to securely and
consistently install or update a project from PyPI?

__ https://en.wikipedia.org/wiki/Cryptographic_hash_function

Recall that the first step in the TUF protocol requires the client to
download the latest *timestamp* metadata. However, the client would not know
in advance the hash of the *timestamp* metadata file from the latest
consistent snapshot. Therefore, PyPI MUST redirect all HTTP GET requests for
*timestamp* metadata to the *timestamp* metadata file from the latest
consistent snapshot. Since the *timestamp* metadata is the root of a tree of
cryptographic hashes pointing to all of the other metadata and target files
that are meant to exist together for consistency, the client is then able to
retrieve any file from this consistent snapshot by deterministically
including, in the request for the file, the hash of the file in the filename.
Assuming infinite disk space and no `hash collisions`__, a client may safely
read from one consistent snapshot while PyPI produces another consistent
snapshot.

__ https://en.wikipedia.org/wiki/Collision_(computer_science)

In this simple but effective manner, we are able to capture a consistent
snapshot of all projects and the associated metadata at a given time. The
next subsection will explain the implementation details of this idea.


Producing Consistent Snapshots
------------------------------

Given a project, PyPI is responsible for updating, depending on the project,
either the *claimed*, *recently-claimed* or *unclaimed* metadata as well as
the associated delegated targets metadata. Every project MUST upload its set
of metadata and targets in a single transaction. We will call this set of
files the project transaction. We will discuss later how PyPI MAY validate
the files in a project transaction. For now, let us focus on how PyPI will
respond to a project transaction. We will call this response the project
transaction process. There will also be a consistent snapshot process that we
will define momentarily; for now, it suffices to know that project
transaction processes and the consistent snapshot process must coordinate
with each other.

Also, every metadata and target file MUST include in its filename the `hex
digest`__ of its `SHA-256`__ hash. For this PEP, it is RECOMMENDED that PyPI
adopt a simple convention of the form filename.digest.ext, where filename is
the original filename without a copy of the hash, digest is the hex digest of
the hash, and ext is the filename extension.

__ http://docs.python.org/2/library/hashlib.html#hashlib.hash.hexdigest
__ https://en.wikipedia.org/wiki/SHA-2

When an *unclaimed* project uploads a new transaction, a project transaction
process MUST add all new targets and relevant delegated *unclaimed* metadata.
(We will see later in this section why the *unclaimed* role will delegate
targets to a number of delegated *unclaimed* roles.) Finally, the project
transaction process MUST inform the consistent snapshot process about new
delegated *unclaimed* metadata.

When a *recently-claimed* project uploads a new transaction, a project
transaction process MUST add all new targets and delegated targets metadata
for the project. If the project is new, then the project transaction process
MUST also add new *recently-claimed* metadata with public keys and threshold
number (which MUST be part of the transaction) for the project.
Finally, the project +transaction process MUST inform the consistent snapshot process about new +*recently-claimed* metadata as well as the current set of delegated targets +metadata for the project. + +The process for a *claimed* project is slightly different. The difference is +that PyPI administrators will choose to move the project from the +*recently-claimed* role to the *claimed* role. A project transaction process +MUST then add new *recently-claimed* and *claimed* metadata to reflect this +migration. As is the case for a *recently-claimed* project, the project +transaction process MUST always add all new targets and delegated targets +metadata for the *claimed* project. Finally, the project transaction process +MUST inform the consistent snapshot process about new *recently-claimed* or +*claimed* metadata as well as the current set of delegated targets metadata for +the project. + +Project transaction processes SHOULD be automated, except when PyPI +administrators move a project from the *recently-claimed* role to the *claimed* +role. Project transaction processes MUST also be applied atomically: either +all metadata and targets, or none of them, are added. The project transaction +processes and consistent snapshot process SHOULD work concurrently. Finally, +project transaction processes SHOULD keep in memory the latest *claimed*, +*recently-claimed* and *unclaimed* metadata so that they will be correctly +updated in new consistent snapshots. + +All project transactions MAY be placed in a single queue and processed +serially. Alternatively, the queue MAY be processed concurrently in order of +appearance provided that the following rules are observed: + +1. No pair of project transaction processes must concurrently work on the same + project. + +2. No pair of project transaction processes must concurrently work on + *unclaimed* projects that belong to the same delegated *unclaimed* targets + role. + +3. No pair of project transaction processes must concurrently work on new + *recently-claimed* projects. + +4. No pair of project transaction processes must concurrently work on new + *claimed* projects. + +5. No project transaction process must work on a new *claimed* project while + another project transaction process is working on a new *recently-claimed* + project and vice versa. + +These rules MUST be observed so that metadata is not read from or written to +inconsistently. + +The consistent snapshot process is fairly simple and SHOULD be automated. The +consistent snapshot process MUST keep in memory the latest working set of +*root*, *targets* and delegated targets metadata. Every minute or so, the +consistent snapshot process will sign for this latest working set. (Recall +that project transaction processes continuously inform the consistent snapshot +process about the latest delegated targets metadata in a concurrency-safe +manner. The consistent snapshot process will actually sign for a copy of the +latest working set while the actual latest working set in memory will be +updated with information continuously communicated by project transaction +processes.) Next, the consistent snapshot process MUST generate and sign new +*timestamp* metadata that will vouch for the *consistent-snapshot* metadata +generated in the previous step. Finally, the consistent snapshot process MUST +add new *timestamp* and *consistent-snapshot* metadata representing the latest +consistent snapshot. + +A few implementation notes are now in order. 
So far, we have seen only that +new metadata and targets are added, but not that old metadata and targets are +removed. Practical constraints are such that eventually PyPI will run out of +disk space to produce a new consistent snapshot. In that case, PyPI MAY then +use something like a "mark-and-sweep" algorithm to delete sufficiently old +consistent snapshots: in order to preserve the latest consistent snapshot, PyPI +would walk objects beginning from the root (*timestamp*) of the latest +consistent snapshot, mark all visited objects, and delete all unmarked +objects. The last few consistent snapshots may be preserved in a similar +fashion. Deleting a consistent snapshot will cause clients to see nothing +thereafter but HTTP 404 responses to any request for a file in that consistent +snapshot. Clients SHOULD then retry their requests with the latest consistent +snapshot. + +We do **not** consider updates to any consistent snapshot because `hash +collisions`__ are out of the scope of this PEP. In case a hash collision is +observed, PyPI MAY wish to check that the file being added is identical to the +file already stored. (Should a hash collision be observed, it is far more +likely the case that the file is identical rather than being a genuine +`collision attack`__.) Otherwise, PyPI MAY either overwrite the existing file +or ignore any write operation to an existing file. + +__ https://en.wikipedia.org/wiki/Collision_(computer_science) +__ https://en.wikipedia.org/wiki/Collision_attack + +All clients, such as pip using the TUF protocol, MUST be modified to download +every metadata and target file (except for *timestamp* metadata) by including, +in the request for the file, the hash of the file in the filename. Following +the filename convention recommended earlier, a request for the file at +filename.ext will be transformed to the equivalent request for the file at +filename.digest.ext. + +Finally, PyPI SHOULD use a `transaction log`__ to record project transaction +processes and queues so that it will be easier to recover from errors after a +server failure. + +__ https://en.wikipedia.org/wiki/Transaction_log + + +Metadata Validation +------------------- + +A *claimed* or *recently-claimed* project will need to upload in its +transaction to PyPI not just targets (a simple index as well as distributions) +but also TUF metadata. The project MAY do so by uploading a ZIP file +containing two directories, /metadata/ (containing delegated targets metadata +files) and /targets/ (containing targets such as the project simple index and +distributions which are signed for by the delegated targets metadata). + +Whenever the project uploads metadata or targets to PyPI, PyPI SHOULD check the +project TUF metadata for at least the following properties: + +* A threshold number of the developers keys registered with PyPI by that + project MUST have signed for the delegated targets metadata file that + represents the "root" of targets for that project (e.g. metadata/targets/ + project.txt). + +* The signatures of delegated targets metadata files MUST be valid. + +* The delegated targets metadata files MUST NOT be expired. + +* The delegated targets metadata MUST be consistent with the targets. + +* A delegator MUST NOT delegate targets that were not delegated to itself by + another delegator. + +* A delegatee MUST NOT sign for targets that were not delegated to itself by a + delegator. 

* Every file MUST contain a unique copy of its hash in its filename following
  the filename.digest.ext convention recommended earlier.

If PyPI chooses to check the project TUF metadata, then PyPI MAY choose to
reject publishing any set of metadata or targets that do not meet these
requirements.

PyPI MUST enforce access control by ensuring that each project can only write
to the TUF metadata for which it is responsible. It MUST do so by ensuring
that project transaction processes write to the correct metadata as well as
the correct locations within those metadata. For example, a project
transaction process for an *unclaimed* project MUST write to the correct
target paths in the correct delegated *unclaimed* metadata for the targets of
the project.

On rare occasions, PyPI MAY wish to extend the TUF metadata format for
projects in a backward-incompatible manner. Note that PyPI will NOT be able
to automatically rewrite existing TUF metadata on behalf of projects in order
to upgrade the metadata to the new backward-incompatible format, because this
would invalidate the signatures of the metadata as signed by developer keys.
Instead, package managers SHOULD be written to recognize and handle multiple
incompatible versions of TUF metadata so that *claimed* and *recently-claimed*
projects could be offered a reasonable time to migrate their metadata to newer
but backward-incompatible formats.

The details of how each project manages its TUF metadata are beyond the scope
of this PEP.


Mirroring Protocol
------------------

The mirroring protocol as described in PEP 381 [9]_ SHOULD change to mirror
PyPI with TUF.

A mirror SHOULD only have to maintain one consistent snapshot for its
clients: the latest consistent snapshot from PyPI known to the mirror. The
mirror would then serve all HTTP requests for metadata or targets by simply
reading directly from this consistent snapshot directory.

The mirroring protocol itself is fairly simple. The mirror would ask PyPI for
*timestamp* metadata from the latest consistent snapshot and proceed to copy
the entire consistent snapshot from the *timestamp* metadata onwards. If the
mirror encounters a failure to copy any metadata or target file while copying
the consistent snapshot, it SHOULD retry resuming the copy of that particular
consistent snapshot. If PyPI has deleted that consistent snapshot, then the
mirror SHOULD delete the failed consistent snapshot and try downloading the
latest consistent snapshot instead.

The mirror SHOULD point users to a previous consistent snapshot directory
while it is copying the latest consistent snapshot from PyPI. Only after the
latest consistent snapshot has been completely copied SHOULD the mirror
switch clients to the latest consistent snapshot. The mirror MAY then delete
the previous consistent snapshot once it finds that no client is reading from
the previous consistent snapshot.

The mirror MAY use extant file transfer software such as rsync__ to mirror
PyPI. In that case, the mirror MUST first obtain the latest known timestamp
metadata from PyPI. The mirror MUST NOT immediately publish the latest known
timestamp metadata from PyPI. Instead, the mirror MUST first iteratively
transfer all new files from PyPI until there are no new files left to
transfer.
+Finally, the mirror MUST publish the latest known timestamp it fetched from +PyPI so that package managers such as pip may be directed to the latest +consistent snapshot known to the mirror. + +__ https://rsync.samba.org/ + + +Backup Process +-------------- + +In order to be able to safely restore from static snapshots later in the event +of a compromise, PyPI SHOULD maintain a small number of its own mirrors to copy +PyPI consistent snapshots according to some schedule. The mirroring protocol +can be used immediately for this purpose. The mirrors must be secured and +isolated such that they are responsible only for mirroring PyPI. The mirrors +can be checked against one another to detect accidental or malicious failures. + + +Metadata Expiry Times +--------------------- + +The *root* and *targets* role metadata SHOULD expire in a year, because these +metadata files are expected to change very rarely. + +The *claimed* role metadata SHOULD expire in three to six months, because this +metadata is expected to be refreshed in that time frame. This time frame was +chosen to induce an easier administration process for PyPI. + +The *timestamp*, *consistent-snapshot*, *recently-claimed* and *unclaimed* role +metadata SHOULD expire in a day because a CDN or mirror SHOULD synchronize +itself with PyPI every day. Furthermore, this generous time frame also takes +into account client clocks that are highly skewed or adrift. + +The expiry times for the delegated targets metadata of a project is beyond the +scope of this PEP. + + +Metadata Scalability +-------------------- + +Due to the growing number of projects and distributions, the TUF metadata will +also grow correspondingly. + +For example, consider the *unclaimed* role. In August 2013, we found that the +size of the *unclaimed* role metadata was about 42MB if the *unclaimed* role +itself signed for about 220K PyPI targets (which are simple indices and +distributions). We will not delve into details in this PEP, but TUF features a +so-called "`lazy bin walk`__" scheme which splits a large targets or delegated +targets metadata file into many small ones. This allows a TUF client updater +to intelligently download only a small number of TUF metadata files in order to +update any project signed for by the *unclaimed* role. For example, applying +this scheme to the previous repository resulted in pip downloading between +1.3KB and 111KB to install or upgrade a PyPI project via TUF. + +__ https://github.com/theupdateframework/tuf/issues/39 + +From our findings as of the time of writing, PyPI SHOULD split all targets in +the *unclaimed* role by delegating it to 1024 delegated targets role, each of +which would sign for PyPI targets whose hashes fall into that "bin" or +delegated targets role. We found that 1024 bins would result in the +*unclaimed* role metadata and each of its binned delegated targets role +metadata to be about the same size (40-50KB) for about 220K PyPI targets +(simple indices and distributions). + +It is possible to make the TUF metadata more compact by representing it in a +binary format as opposed to the JSON text format. Nevertheless, we believe +that a sufficiently large number of project and distributions will induce +scalability challenges at some point, and therefore the *unclaimed* role will +then still need delegations in order to address the problem. Furthermore, the +JSON format is an open and well-known standard for data interchange. 
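The "lazy bin walk" delegation described above hinges on assigning every
target to one of a fixed number of hashed "bins". The exact assignment rule
is an implementation detail rather than a requirement of this PEP; the sketch
below shows only one plausible way to do it (an assumption of this example),
using the leading digits of a SHA-256 digest of the target path::

    import hashlib

    NUMBER_OF_BINS = 1024   # 1024 == 2**10, so 10 bits of the hash pick a bin

    def bin_for_target(target_path):
        """Map a target (a simple index or distribution path) to the
        delegated "bin" role that would sign for it."""
        digest = hashlib.sha256(target_path.encode("utf-8")).hexdigest()
        # Reduce the leading hex digits of the digest modulo the number of
        # bins; every target with the same result lands in the same bin.
        return int(digest[:8], 16) % NUMBER_OF_BINS

    print(bin_for_target("packages/source/f/foo/foo-1.0.tar.gz"))

Because a client only needs the metadata for the few bins that its requested
targets hash into, it downloads a small, roughly constant amount of metadata
regardless of how many projects PyPI hosts.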
+ +Due to the large number of delegated target metadata files, compressed versions +of *consistent-snapshot* metadata SHOULD also be made available. + + +Key Management +============== + +In this section, we examine the kind of keys required to sign for TUF roles on +PyPI. TUF is agnostic with respect to choices of digital signature algorithms. +For the purpose of discussion, we will assume that most digital signatures will +be produced with the well-tested and tried RSA algorithm [20]_. Nevertheless, +we do NOT recommend any particular digital signature algorithm in this PEP +because there are a few important constraints: firstly, cryptography changes +over time; secondly, package managers such as pip may wish to perform signature +verification in Python, without resorting to a compiled C library, in order to +be able to run on as many systems as Python supports; finally, TUF recommends +diversity of keys for certain applications, and we will soon discuss these +exceptions. + + +Number Of Keys +-------------- + +The *timestamp*, *consistent-snapshot*, *recently-claimed* and *unclaimed* +roles will need to support continuous delivery. Even though their respective +keys will then need to be online, we will require that the keys be independent +of each other. This allows for each of the keys to be placed on separate +servers if need be, and prevents side channel attacks that compromise one key +from automatically compromising the rest of the keys. Therefore, each of the +*timestamp*, *consistent-snapshot*, *recently-claimed* and *unclaimed* roles +MUST require (1, 1) keys. + +The *unclaimed* role MAY delegate targets in an automated manner to a number of +roles called "bins", as we discussed in the previous section. Each of the +"bin" roles SHOULD share the same key as the *unclaimed* role, due +simultaneously to space efficiency of metadata and because there is no security +advantage in requiring separate keys. + +The *root* role is critical for security and should very rarely be used. It is +primarily used for key revocation, and it is the root of trust for all of PyPI. +The *root* role signs for the keys that are authorized for each of the +top-level roles (including itself). The keys belonging to the *root* role are +intended to be very well-protected and used with the least frequency of all +keys. We propose that every PSF board member own a (strong) root key. A +majority of them can then constitute the quorum to revoke or endow trust in all +top-level keys. Alternatively, the system administrators of PyPI (instead of +PSF board members) could be responsible for signing for the *root* role. +Therefore, the *root* role SHOULD require (t, n) keys, where n is the number of +either all PyPI administrators or all PSF board members, and t > 1 (so that at +least two members must sign the *root* role). + +The *targets* role will be used only to sign for the static delegation of all +targets to the *claimed*, *recently-claimed* and *unclaimed* roles. Since +these target delegations must be secured against attacks in the event of a +compromise, the keys for the *targets* role MUST be offline and independent +from other keys. For simplicity of key management without sacrificing +security, it is RECOMMENDED that the keys of the *targets* role are permanently +discarded as soon as they have been created and used to sign for the role. +Therefore, the *targets* role SHOULD require (1, 1) keys. 
Again, this is +because the keys are going to be permanently discarded, and more offline keys +will not help against key recovery attacks [21]_ unless diversity of keys is +maintained. + +Similarly, the *claimed* role will be used only to sign for the dynamic +delegation of projects to their respective developer keys. Since these target +delegations must be secured against attacks in the event of a compromise, the +keys for the *claimed* role MUST be offline and independent from other keys. +Therefore, the *claimed* role SHOULD require (t, n) keys, where n is the number +of all PyPI administrators (in order to keep it manageable), and t ? 1 (so that +at least one member MUST sign the *claimed* role). While a stronger threshold +would indeed render the role more robust against a compromise of the *claimed* +keys (which is highly unlikely assuming that the keys are independent and +securely kept offline), we think that this trade-off is acceptable for the +important purpose of keeping the maintenance overhead for PyPI administrators +as little as possible. At the time of writing, we are keeping this point open +for discussion by the distutils-sig community. + +The number of developer keys is project-specific and thus beyond the scope of +this PEP. + + +Online and Offline Keys +----------------------- + +In order to support continuous delivery, the *timestamp*, +*consistent-snapshot*, *recently-claimed* and *unclaimed* role keys MUST be +online. + +As explained in the previous section, the *root*, *targets* and *claimed* role +keys MUST be offline for maximum security. Developers keys will be offline in +the sense that the private keys MUST NOT be stored on PyPI, though some of them +may be online on the private infrastructure of the project. + + +Key Strength +------------ + +At the time of writing, we recommend that all RSA keys (both offline and +online) SHOULD have a minimum key size of 3072 bits for data-protection +lifetimes beyond 2030 [22]_. + + +Diversity Of Keys +----------------- + +Due to the threats of weak key generation and implementation weaknesses [2]_, +the types of keys as well as the libraries used to generate them should vary +within TUF on PyPI. Our current implementation of TUF supports multiple +digital signature algorithms such as RSA (with OpenSSL [23]_ or PyCrypto [24]_) +and ed25519 [25]_. Furthermore, TUF supports the binding of other +cryptographic libraries that it does not immediately support "out of the box", +and so one MAY generate keys using other cryptographic libraries and use them +for TUF on PyPI. + +As such, the root role keys SHOULD be generated by a variety of digital +signature algorithms as implemented by different cryptographic libraries. + + +Key Compromise Analysis +----------------------- + +.. image:: https://raw.github.com/theupdateframework/pep-on-pypi-with-tuf/master/table1.png + +Table 1: Attacks possible by compromising certain combinations of role keys + + +Table 1 summarizes the kinds of attacks rendered possible by compromising a +threshold number of keys belonging to the TUF roles on PyPI. Except for the +*timestamp* and *consistent-snapshot* roles, the pairwise interaction of role +compromises may be found by taking the union of both rows. + +In September 2013, we showed how the latest version of pip then was susceptible +to these attacks and how TUF could protect users against them [14]_. 
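Throughout this section, roles are said to require (t, n) keys. The check
that a client conceptually performs when verifying a role's metadata is
simple, as the following sketch illustrates. The dictionary layout is only
loosely modelled on TUF role metadata; the field names here are illustrative
rather than normative::

    def threshold_met(role_info, verified_keyids):
        """Return True if at least t of the role's n authorized keys have
        produced a valid signature over the role's metadata."""
        authorized = set(role_info["keyids"])   # the n authorized key ids
        required = role_info["threshold"]       # the threshold t
        return len(authorized & set(verified_keyids)) >= required

    # Example: a (2, 3) role for which two of the three keys have signed.
    claimed_role = {"keyids": {"keyid-1", "keyid-2", "keyid-3"},
                    "threshold": 2}
    print(threshold_met(claimed_role, {"keyid-1", "keyid-3"}))   # True

This is what makes a compromise of t-1 keys insufficient to compromise the
role itself.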

An attacker who compromises developer keys for a project and who is able to
somehow upload malicious metadata and targets to PyPI will be able to serve
malicious updates to users of that project (and that project alone). Note
that compromising *targets* or any delegated targets role (except for project
targets metadata) does not immediately endow the attacker with the ability to
serve malicious updates. The attacker must also compromise the *timestamp*
and *consistent-snapshot* roles (which are both online and therefore more
likely to be compromised). This means that in order to launch any attack, one
must not only be able to act as a man-in-the-middle but also compromise the
*timestamp* key (or the *root* keys and sign a new *timestamp* key). To
launch any attack other than a freeze attack, one must also compromise the
*consistent-snapshot* key.

Finally, a compromise of the PyPI infrastructure MAY introduce malicious
updates to *recently-claimed* and *unclaimed* projects because the keys for
those roles are online. However, attackers cannot modify *claimed* projects
in such an event because *targets* and *claimed* metadata have been signed
with offline keys. Therefore, it is RECOMMENDED that high-value projects
register their developer keys with PyPI and sign for their own distributions.


In the Event of a Key Compromise
--------------------------------

By a key compromise, we mean that the key, as well as the PyPI
infrastructure, has been compromised and used to sign new metadata on PyPI.

If a threshold number of developer keys of a project have been compromised,
then the project MUST take the following steps:

1. The project metadata and targets MUST be restored to the last known good
   consistent snapshot where the project was not known to be compromised.
   This can be done by the developers repackaging and resigning all targets
   with the new keys.

2. The project delegated targets metadata MUST have their version numbers
   incremented, expiry times suitably extended and signatures renewed.

Whereas PyPI MUST take the following steps:

1. Revoke the compromised developer keys from the delegation to the project
   by the *recently-claimed* or *claimed* role. This is done by replacing the
   compromised developer keys with newly issued developer keys.

2. A new timestamped consistent snapshot MUST be issued.

If a threshold number of *timestamp*, *consistent-snapshot*,
*recently-claimed* or *unclaimed* keys have been compromised, then PyPI MUST
take the following steps:

1. Revoke the *timestamp*, *consistent-snapshot* and *targets* role keys from
   the *root* role. This is done by replacing the compromised *timestamp*,
   *consistent-snapshot* and *targets* keys with newly issued keys.

2. Revoke the *recently-claimed* and *unclaimed* keys from the *targets* role
   by replacing their keys with newly issued keys. Sign the new *targets*
   role metadata and discard the new keys (because, as we explained earlier,
   this increases the security of *targets* metadata).

3. Clear all targets or delegations in the *recently-claimed* role and delete
   all associated delegated targets metadata. Recently registered projects
   SHOULD register their developer keys again with PyPI.

4. All targets of the *recently-claimed* and *unclaimed* roles SHOULD be
   compared with the last known good consistent snapshot where none of the
   *timestamp*, *consistent-snapshot*, *recently-claimed* or *unclaimed* keys
   were known to have been compromised.
Added, updated or deleted targets in + the compromised consistent snapshot that do not match the last known good + consistent snapshot MAY be restored to their previous versions. After + ensuring the integrity of all *unclaimed* targets, the *unclaimed* metadata + MUST be regenerated. + +5. The *recently-claimed* and *unclaimed* metadata MUST have their version + numbers incremented, expiry times suitably extended and signatures renewed. + +6. A new timestamped consistent snapshot MUST be issued. + +This would preemptively protect all of these roles even though only one of them +may have been compromised. + +If a threshold number of the *targets* or *claimed* keys have been compromised, +then there is little that an attacker could do without the *timestamp* and +*consistent-snapshot* keys. In this case, PyPI MUST simply revoke the +compromised *targets* or *claimed* keys by replacing them with new keys in the +*root* and *targets* roles respectively. + +If a threshold number of the *timestamp*, *consistent-snapshot* and *claimed* +keys have been compromised, then PyPI MUST take the following steps in addition +to the steps taken when either the *timestamp* or *consistent-snapshot* keys +are compromised: + +1. Revoke the *claimed* role keys from the *targets* role and replace them with + newly issued keys. + +2. All project targets of the *claimed* roles SHOULD be compared with the last + known good consistent snapshot where none of the *timestamp*, + *consistent-snapshot* or *claimed* keys were known to have been compromised. + Added, updated or deleted targets in the compromised consistent snapshot + that do not match the last known good consistent snapshot MAY be restored to + their previous versions. After ensuring the integrity of all *claimed* + project targets, the *claimed* metadata MUST be regenerated. + +3. The *claimed* metadata MUST have their version numbers incremented, expiry + times suitably extended and signatures renewed. + +If a threshold number of the *timestamp*, *consistent-snapshot* and *targets* +keys have been compromised, then PyPI MUST take the union of the steps taken +when the *claimed*, *recently-claimed* and *unclaimed* keys have been +compromised. + +If a threshold number of the *root* keys have been compromised, then PyPI MUST +take the steps taken when the *targets* role has been compromised as well as +replace all of the *root* keys. + +It is also RECOMMENDED that PyPI sufficiently document compromises with +security bulletins. These security bulletins will be most informative when +users of pip with TUF are unable to install or update a project because the +keys for the *timestamp*, *consistent-snapshot* or *root* roles are no longer +valid. They could then visit the PyPI web site to consult security bulletins +that would help to explain why they are no longer able to install or update, +and then take action accordingly. When a threshold number of *root* keys have +not been revoked due to a compromise, then new *root* metadata may be safely +updated because a threshold number of existing *root* keys will be used to sign +for the integrity of the new *root* metadata so that TUF clients will be able +to verify the integrity of the new *root* metadata with a threshold number of +previously known *root* keys. This will be the common case. Otherwise, in the +worst case where a threshold number of *root* keys have been revoked due to a +compromise, an end-user may choose to update new *root* metadata with +`out-of-band`__ mechanisms. 
+ +__ https://en.wikipedia.org/wiki/Out-of-band#Authentication + + +Appendix: Rejected Proposals +============================ + + +Alternative Proposals for Producing Consistent Snapshots +-------------------------------------------------------- + +The complete file snapshot (CFS) scheme uses file system directories to store +efficient consistent snapshots over time. In this scheme, every consistent +snapshot will be stored in a separate directory, wherein files that are shared +with previous consistent snapshots will be `hard links`__ instead of copies. + +__ https://en.wikipedia.org/wiki/Hard_link + +The `differential file`__ snapshot (DFS) scheme is a variant of the CFS scheme, +wherein the next consistent snapshot directory will contain only the additions +of new files and updates to existing files of the previous consistent snapshot. +(The first consistent snapshot will contain a complete set of files known +then.) Deleted files will be marked as such in the next consistent snapshot +directory. This means that files will be resolved in this manner: First, set +the current consistent snapshot directory to be the latest consistent snapshot +directory. Then, any requested file will be seeked in the current consistent +snapshot directory. If the file exists in the current consistent snapshot +directory, then that file will be returned. If it has been marked as deleted +in the current consistent snapshot directory, then that file will be reported +as missing. Otherwise, the current consistent snapshot directory will be set +to the preceding consistent snapshot directory and the previous few steps will +be iterated until there is no preceding consistent snapshot to be considered, +at which point the file will be reported as missing. + +__ http://dl.acm.org/citation.cfm?id=320484 + +With the CFS scheme, the trade-off is the I/O costs of producing a consistent +snapshot with the file system. As of October 2013, we found that a fairly +modern computer with a 7200RPM hard disk drive required at least three minutes +to produce a consistent snapshot with the "cp -lr" command on the ext3__ file +system. Perhaps the I/O costs of this scheme may be ameliorated with advanced +tools or file systems such as ZFS__ or btrfs__. + +__ https://en.wikipedia.org/wiki/Ext3 +__ https://en.wikipedia.org/wiki/ZFS +__ https://en.wikipedia.org/wiki/Btrfs + +While the DFS scheme improves upon the CFS scheme in terms of producing faster +consistent snapshots, there are at least two trade-offs. The first is that a +web server will need to be modified to perform the "daisy chain" resolution of +a file. The second is that every now and then, the differential snapshots will +need to be "squashed" or merged together with the first consistent snapshot to +produce a new first consistent snapshot with the latest and complete set of +files. Although the merge cost may be amortized over time, this scheme is not +conceptually si + + + + +References +========== + +.. [1] https://pypi.python.org +.. [2] https://isis.poly.edu/~jcappos/papers/samuel_tuf_ccs_2010.pdf +.. [3] http://www.pip-installer.org +.. [4] https://wiki.python.org/moin/WikiAttack2013 +.. [5] https://github.com/theupdateframework/pip/wiki/Attacks-on-software-repositories +.. [6] https://mail.python.org/pipermail/distutils-sig/2013-April/020596.html +.. [7] https://mail.python.org/pipermail/distutils-sig/2013-May/020701.html +.. [8] https://mail.python.org/pipermail/distutils-sig/2013-July/022008.html +.. 
[9] PEP 381, Mirroring infrastructure for PyPI, Ziad?, L?wis + http://www.python.org/dev/peps/pep-0381/ +.. [10] https://mail.python.org/pipermail/distutils-sig/2013-September/022773.html +.. [11] https://mail.python.org/pipermail/distutils-sig/2013-May/020848.html +.. [12] PEP 449, Removal of the PyPI Mirror Auto Discovery and Naming Scheme, Stufft + http://www.python.org/dev/peps/pep-0449/ +.. [13] https://isis.poly.edu/~jcappos/papers/cappos_mirror_ccs_08.pdf +.. [14] https://mail.python.org/pipermail/distutils-sig/2013-September/022755.html +.. [15] https://pypi.python.org/security +.. [16] https://github.com/theupdateframework/tuf/blob/develop/docs/tuf-spec.txt +.. [17] PEP 426, Metadata for Python Software Packages 2.0, Coghlan, Holth, Stufft + http://www.python.org/dev/peps/pep-0426/ +.. [18] https://en.wikipedia.org/wiki/Continuous_delivery +.. [19] https://mail.python.org/pipermail/distutils-sig/2013-August/022154.html +.. [20] https://en.wikipedia.org/wiki/RSA_%28algorithm%29 +.. [21] https://en.wikipedia.org/wiki/Key-recovery_attack +.. [22] http://csrc.nist.gov/publications/nistpubs/800-57/SP800-57-Part1.pdf +.. [23] https://www.openssl.org/ +.. [24] https://pypi.python.org/pypi/pycrypto +.. [25] http://ed25519.cr.yp.to/ + + +Acknowledgements +================ + +Nick Coghlan, Daniel Holth and the distutils-sig community in general for +helping us to think about how to usably and efficiently integrate TUF with +PyPI. + +Roger Dingledine, Sebastian Hahn, Nick Mathewson, Martin Peck and Justin +Samuel for helping us to design TUF from its predecessor Thandy of the Tor +project. + +Konstantin Andrianov, Geremy Condra, Vladimir Diaz, Zane Fisher, Justin Samuel, +Tian Tian, Santiago Torres, John Ward, and Yuyu Zheng for helping us to develop +TUF. + +Vladimir Diaz, Monzur Muhammad and Sai Teja Peddinti for helping us to review +this PEP. + +Zane Fisher for helping us to review and transcribe this PEP. + + +Copyright +========= + +This document has been placed in the public domain. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 15 15:34:25 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 15 Nov 2013 15:34:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Don=27t_decref_exc_too_soo?= =?utf-8?q?n?= Message-ID: <3dLhsF58pqzMjB@mail.python.org> http://hg.python.org/cpython/rev/ace11097a22e changeset: 87110:ace11097a22e user: Nick Coghlan date: Sat Nov 16 00:34:13 2013 +1000 summary: Don't decref exc too soon files: Objects/exceptions.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2685,7 +2685,6 @@ * state potentially stored on OSError instances. 
*/ - Py_DECREF(exc); Py_XDECREF(tb); #ifdef HAVE_STDARG_PROTOTYPES @@ -2696,12 +2695,14 @@ msg_prefix = PyUnicode_FromFormatV(format, vargs); va_end(vargs); if (msg_prefix == NULL) { + Py_DECREF(exc); Py_DECREF(val); return NULL; } PyErr_Format(exc, "%U (%s: %S)", msg_prefix, Py_TYPE(val)->tp_name, val); + Py_DECREF(exc); Py_DECREF(msg_prefix); PyErr_Fetch(&new_exc, &new_val, &new_tb); PyErr_NormalizeException(&new_exc, &new_val, &new_tb); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 15:35:44 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 15 Nov 2013 15:35:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Avoid_triggering_the_refle?= =?utf-8?q?ak_detector?= Message-ID: <3dLhtm2HXkz7LlV@mail.python.org> http://hg.python.org/cpython/rev/8c4d42c0dc8e changeset: 87111:8c4d42c0dc8e user: Nick Coghlan date: Sat Nov 16 00:35:34 2013 +1000 summary: Avoid triggering the refleak detector files: Lib/test/test_codecs.py | 22 ++++++++++++++-------- 1 files changed, 14 insertions(+), 8 deletions(-) diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -2386,6 +2386,15 @@ # currently *only* want this to happen for relatively stateless # exceptions, where the only significant information they contain is their # type and a single str argument. + +# Use a local codec registry to avoid appearing to leak objects when +# registering multiple seach functions +_TEST_CODECS = {} + +def _get_test_codec(codec_name): + return _TEST_CODECS.get(codec_name) +codecs.register(_get_test_codec) # Returns None, not usable as a decorator + class ExceptionChainingTest(unittest.TestCase): def setUp(self): @@ -2395,19 +2404,16 @@ # The codecs module normalizes codec names, although this doesn't # appear to be formally documented... self.codec_name = repr(self).lower().replace(" ", "-") - self.codec_info = None - codecs.register(self.get_codec) - def get_codec(self, codec_name): - if codec_name != self.codec_name: - return None - return self.codec_info + def tearDown(self): + _TEST_CODECS.pop(self.codec_name, None) def set_codec(self, obj_to_raise): def raise_obj(*args, **kwds): raise obj_to_raise - self.codec_info = codecs.CodecInfo(raise_obj, raise_obj, - name=self.codec_name) + codec_info = codecs.CodecInfo(raise_obj, raise_obj, + name=self.codec_name) + _TEST_CODECS[self.codec_name] = codec_info @contextlib.contextmanager def assertWrapped(self, operation, exc_type, msg): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 16:41:14 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 15 Nov 2013 16:41:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Longer_timeout_?= =?utf-8?q?in_Windows_test=5Fpopen=2E_Fixes_issue_19598=2E?= Message-ID: <3dLkLL57Csz7Lqs@mail.python.org> http://hg.python.org/cpython/rev/d48ec67b3b0e changeset: 87112:d48ec67b3b0e user: Guido van Rossum date: Fri Nov 15 07:41:10 2013 -0800 summary: asyncio: Longer timeout in Windows test_popen. Fixes issue 19598. 
files: Lib/test/test_asyncio/test_windows_utils.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_windows_utils.py b/Lib/test/test_asyncio/test_windows_utils.py --- a/Lib/test/test_asyncio/test_windows_utils.py +++ b/Lib/test/test_asyncio/test_windows_utils.py @@ -119,7 +119,8 @@ overr.ReadFile(p.stderr.handle, 100) events = [ovin.event, ovout.event, overr.event] - res = _winapi.WaitForMultipleObjects(events, True, 2000) + # Super-long timeout for slow buildbots. + res = _winapi.WaitForMultipleObjects(events, True, 10000) self.assertEqual(res, _winapi.WAIT_OBJECT_0) self.assertFalse(ovout.pending) self.assertFalse(overr.pending) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 17:13:20 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 17:13:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogcHl0aG9ucnVuLmM6?= =?utf-8?q?_fix_Py=5FGetPythonHome=28=29=2C_use_Py=5FARRAY=5FLENGTH=28=29_?= =?utf-8?q?to_get_the_size_of?= Message-ID: <3dLl3N4h6lz7Ljd@mail.python.org> http://hg.python.org/cpython/rev/159e51e5fc2c changeset: 87113:159e51e5fc2c branch: 3.3 parent: 87102:46fc4fb2c8c5 user: Victor Stinner date: Fri Nov 15 17:09:24 2013 +0100 summary: pythonrun.c: fix Py_GetPythonHome(), use Py_ARRAY_LENGTH() to get the size of the env_home buffer, not PATH_MAX+1. env_home is declared using MAXPATHLEN+1, and PATH_MAX is not declared on IRIX. files: Python/pythonrun.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -817,8 +817,9 @@ if (home == NULL && !Py_IgnoreEnvironmentFlag) { char* chome = Py_GETENV("PYTHONHOME"); if (chome) { - size_t r = mbstowcs(env_home, chome, PATH_MAX+1); - if (r != (size_t)-1 && r <= PATH_MAX) + size_t size = Py_ARRAY_LENGTH(env_home); + size_t r = mbstowcs(env_home, chome, size); + if (r != (size_t)-1 && r < size) home = env_home; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 17:13:22 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 17:13:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_=28Merge_3=2E3=29_pythonrun=2Ec=3A_fix_Py=5FGetPythonHom?= =?utf-8?b?ZSgpLCB1c2UgUHlfQVJSQVlfTEVOR1RIKCkgdG8gZ2V0?= Message-ID: <3dLl3Q0ShLz7LnG@mail.python.org> http://hg.python.org/cpython/rev/58ffd70d02b6 changeset: 87114:58ffd70d02b6 parent: 87112:d48ec67b3b0e parent: 87113:159e51e5fc2c user: Victor Stinner date: Fri Nov 15 17:12:14 2013 +0100 summary: (Merge 3.3) pythonrun.c: fix Py_GetPythonHome(), use Py_ARRAY_LENGTH() to get the size of the env_home buffer, not PATH_MAX+1. env_home is declared using MAXPATHLEN+1, and PATH_MAX is not declared on IRIX. 
files: Python/pythonrun.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -900,8 +900,9 @@ if (home == NULL && !Py_IgnoreEnvironmentFlag) { char* chome = Py_GETENV("PYTHONHOME"); if (chome) { - size_t r = mbstowcs(env_home, chome, PATH_MAX+1); - if (r != (size_t)-1 && r <= PATH_MAX) + size_t size = Py_ARRAY_LENGTH(env_home); + size_t r = mbstowcs(env_home, chome, size); + if (r != (size_t)-1 && r < size) home = env_home; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 17:36:02 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 17:36:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogc3lzbW9kdWxlLmM6?= =?utf-8?b?IGZpeCBzeXNfdXBkYXRlX3BhdGgoKSwgdXNlIFB5X0FSUkFZX0xFTkdUSCgp?= =?utf-8?q?_to_get_the_size_of?= Message-ID: <3dLlYZ58YQz7Lnx@mail.python.org> http://hg.python.org/cpython/rev/780a0199c7f1 changeset: 87115:780a0199c7f1 branch: 3.3 parent: 87113:159e51e5fc2c user: Victor Stinner date: Fri Nov 15 17:33:43 2013 +0100 summary: sysmodule.c: fix sys_update_path(), use Py_ARRAY_LENGTH() to get the size of the fullpath buffer, not PATH_MAX. fullpath is declared using MAXPATHLEN or MAX_PATH depending on the OS, and PATH_MAX is not declared on IRIX. files: Python/sysmodule.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -1895,7 +1895,7 @@ #else /* All other filename syntaxes */ if (_HAVE_SCRIPT_ARGUMENT(argc, argv)) { #if defined(HAVE_REALPATH) - if (_Py_wrealpath(argv0, fullpath, PATH_MAX)) { + if (_Py_wrealpath(argv0, fullpath, Py_ARRAY_LENGTH(fullpath))) { argv0 = fullpath; } #endif -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 17:36:04 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 17:36:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogKE1lcmdlIDMuMykgc3lzbW9kdWxlLmM6IGZpeCBzeXNfdXBkYXRlX3Bh?= =?utf-8?q?th=28=29=2C_use_Py=5FARRAY=5FLENGTH=28=29_to_get?= Message-ID: <3dLlYc0zFWz7Lkw@mail.python.org> http://hg.python.org/cpython/rev/f39fac22ca9a changeset: 87116:f39fac22ca9a parent: 87114:58ffd70d02b6 parent: 87115:780a0199c7f1 user: Victor Stinner date: Fri Nov 15 17:35:31 2013 +0100 summary: (Merge 3.3) sysmodule.c: fix sys_update_path(), use Py_ARRAY_LENGTH() to get the size of the fullpath buffer, not PATH_MAX. fullpath is declared using MAXPATHLEN or MAX_PATH depending on the OS, and PATH_MAX is not declared on IRIX. 
files: Python/sysmodule.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -1959,7 +1959,7 @@ #else /* All other filename syntaxes */ if (_HAVE_SCRIPT_ARGUMENT(argc, argv)) { #if defined(HAVE_REALPATH) - if (_Py_wrealpath(argv0, fullpath, PATH_MAX)) { + if (_Py_wrealpath(argv0, fullpath, Py_ARRAY_LENGTH(fullpath))) { argv0 = fullpath; } #endif -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 18:14:53 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 18:14:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogZmlsZXV0aWxzLmM6?= =?utf-8?q?_use_MAXPATHLEN_instead_of_PATH=5FMAX?= Message-ID: <3dLmQP6Wd2z7Ljd@mail.python.org> http://hg.python.org/cpython/rev/8bbd215940f1 changeset: 87117:8bbd215940f1 branch: 3.3 parent: 87115:780a0199c7f1 user: Victor Stinner date: Fri Nov 15 18:14:11 2013 +0100 summary: fileutils.c: use MAXPATHLEN instead of PATH_MAX PATH_MAX is not declared on IRIX nor Windows. files: Python/fileutils.c | 12 ++++++------ 1 files changed, 6 insertions(+), 6 deletions(-) diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -623,7 +623,7 @@ _Py_wreadlink(const wchar_t *path, wchar_t *buf, size_t bufsiz) { char *cpath; - char cbuf[PATH_MAX]; + char cbuf[MAXPATHLEN]; wchar_t *wbuf; int res; size_t r1; @@ -633,11 +633,11 @@ errno = EINVAL; return -1; } - res = (int)readlink(cpath, cbuf, PATH_MAX); + res = (int)readlink(cpath, cbuf, Py_ARRAY_LENGTH(cbuf)); PyMem_Free(cpath); if (res == -1) return -1; - if (res == PATH_MAX) { + if (res == Py_ARRAY_LENGTH(cbuf)) { errno = EINVAL; return -1; } @@ -669,7 +669,7 @@ wchar_t *resolved_path, size_t resolved_path_size) { char *cpath; - char cresolved_path[PATH_MAX]; + char cresolved_path[MAXPATHLEN]; wchar_t *wresolved_path; char *res; size_t r; @@ -709,11 +709,11 @@ #ifdef MS_WINDOWS return _wgetcwd(buf, size); #else - char fname[PATH_MAX]; + char fname[MAXPATHLEN]; wchar_t *wname; size_t len; - if (getcwd(fname, PATH_MAX) == NULL) + if (getcwd(fname, Py_ARRAY_LENGTH(fname)) == NULL) return NULL; wname = _Py_char2wchar(fname, &len); if (wname == NULL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 18:14:55 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 18:14:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_=28Merge_3=2E3=29_fileutils=2Ec=3A_use_MAXPATHLEN_instea?= =?utf-8?q?d_of_PATH=5FMAX?= Message-ID: <3dLmQR1BxKz7Ljd@mail.python.org> http://hg.python.org/cpython/rev/37d36a029df0 changeset: 87118:37d36a029df0 parent: 87116:f39fac22ca9a parent: 87117:8bbd215940f1 user: Victor Stinner date: Fri Nov 15 18:14:33 2013 +0100 summary: (Merge 3.3) fileutils.c: use MAXPATHLEN instead of PATH_MAX PATH_MAX is not declared on IRIX nor Windows. 
files: Python/fileutils.c | 12 ++++++------ 1 files changed, 6 insertions(+), 6 deletions(-) diff --git a/Python/fileutils.c b/Python/fileutils.c --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -869,7 +869,7 @@ _Py_wreadlink(const wchar_t *path, wchar_t *buf, size_t bufsiz) { char *cpath; - char cbuf[PATH_MAX]; + char cbuf[MAXPATHLEN]; wchar_t *wbuf; int res; size_t r1; @@ -879,11 +879,11 @@ errno = EINVAL; return -1; } - res = (int)readlink(cpath, cbuf, PATH_MAX); + res = (int)readlink(cpath, cbuf, Py_ARRAY_LENGTH(cbuf)); PyMem_Free(cpath); if (res == -1) return -1; - if (res == PATH_MAX) { + if (res == Py_ARRAY_LENGTH(cbuf)) { errno = EINVAL; return -1; } @@ -915,7 +915,7 @@ wchar_t *resolved_path, size_t resolved_path_size) { char *cpath; - char cresolved_path[PATH_MAX]; + char cresolved_path[MAXPATHLEN]; wchar_t *wresolved_path; char *res; size_t r; @@ -956,11 +956,11 @@ int isize = (int)Py_MIN(size, INT_MAX); return _wgetcwd(buf, isize); #else - char fname[PATH_MAX]; + char fname[MAXPATHLEN]; wchar_t *wname; size_t len; - if (getcwd(fname, PATH_MAX) == NULL) + if (getcwd(fname, Py_ARRAY_LENGTH(fname)) == NULL) return NULL; wname = _Py_char2wchar(fname, &len); if (wname == NULL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 19:07:02 2013 From: python-checkins at python.org (jason.coombs) Date: Fri, 15 Nov 2013 19:07:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319544_and_Issue_?= =?utf-8?q?=236516=3A_Restore_support_for_--user_and_--group_parameters?= Message-ID: <3dLnZZ0Vjnz7LqJ@mail.python.org> http://hg.python.org/cpython/rev/b9c9c4b2effe changeset: 87119:b9c9c4b2effe user: Andrew Kuchling date: Fri Nov 15 13:01:52 2013 -0500 summary: Issue #19544 and Issue #6516: Restore support for --user and --group parameters to sdist command as found in Python 2.7 and originally slated for Python 3.2 but accidentally rolled back as part of the distutils2 rollback. Closes Issue #6516. files: Doc/distutils/sourcedist.rst | 18 ++- Lib/distutils/archive_util.py | 71 +++++++++++- Lib/distutils/cmd.py | 6 +- Lib/distutils/command/bdist.py | 13 ++ Lib/distutils/command/bdist_dumb.py | 11 +- Lib/distutils/command/sdist.py | 9 +- Lib/distutils/tests/test_archive_util.py | 61 ++++++++++- Lib/distutils/tests/test_sdist.py | 53 ++++++++ Misc/NEWS | 3 + 9 files changed, 230 insertions(+), 15 deletions(-) diff --git a/Doc/distutils/sourcedist.rst b/Doc/distutils/sourcedist.rst --- a/Doc/distutils/sourcedist.rst +++ b/Doc/distutils/sourcedist.rst @@ -26,16 +26,16 @@ +===========+=========================+=========+ | ``zip`` | zip file (:file:`.zip`) | (1),(3) | +-----------+-------------------------+---------+ -| ``gztar`` | gzip'ed tar file | (2),(4) | +| ``gztar`` | gzip'ed tar file | \(2) | | | (:file:`.tar.gz`) | | +-----------+-------------------------+---------+ -| ``bztar`` | bzip2'ed tar file | \(4) | +| ``bztar`` | bzip2'ed tar file | | | | (:file:`.tar.bz2`) | | +-----------+-------------------------+---------+ | ``ztar`` | compressed tar file | \(4) | | | (:file:`.tar.Z`) | | +-----------+-------------------------+---------+ -| ``tar`` | tar file (:file:`.tar`) | \(4) | +| ``tar`` | tar file (:file:`.tar`) | | +-----------+-------------------------+---------+ Notes: @@ -51,8 +51,16 @@ of the standard Python library since Python 1.6) (4) - requires external utilities: :program:`tar` and possibly one of :program:`gzip`, - :program:`bzip2`, or :program:`compress` + requires the :program:`compress` program. 
Notice that this format is now + pending for deprecation and will be removed in the future versions of Python. + +When using any ``tar`` format (``gztar``, ``bztar``, ``ztar`` or +``tar``), under Unix you can specify the ``owner`` and ``group`` names +that will be set for each member of the archive. + +For example, if you want all files of the archive to be owned by root:: + + python setup.py sdist --owner=root --group=root .. _manifest: diff --git a/Lib/distutils/archive_util.py b/Lib/distutils/archive_util.py --- a/Lib/distutils/archive_util.py +++ b/Lib/distutils/archive_util.py @@ -18,15 +18,55 @@ from distutils.dir_util import mkpath from distutils import log -def make_tarball(base_name, base_dir, compress="gzip", verbose=0, dry_run=0): +try: + from pwd import getpwnam +except AttributeError: + getpwnam = None + +try: + from grp import getgrnam +except AttributeError: + getgrnam = None + +def _get_gid(name): + """Returns a gid, given a group name.""" + if getgrnam is None or name is None: + return None + try: + result = getgrnam(name) + except KeyError: + result = None + if result is not None: + return result[2] + return None + +def _get_uid(name): + """Returns an uid, given a user name.""" + if getpwnam is None or name is None: + return None + try: + result = getpwnam(name) + except KeyError: + result = None + if result is not None: + return result[2] + return None + +def make_tarball(base_name, base_dir, compress="gzip", verbose=0, dry_run=0, + owner=None, group=None): """Create a (possibly compressed) tar file from all the files under 'base_dir'. 'compress' must be "gzip" (the default), "compress", "bzip2", or None. - Both "tar" and the compression utility named by 'compress' must be on - the default program search path, so this is probably Unix-specific. + (compress will be deprecated in Python 3.2) + + 'owner' and 'group' can be used to define an owner and a group for the + archive that is being built. If not provided, the current owner and group + will be used. + The output tar file will be named 'base_dir' + ".tar", possibly plus the appropriate compression extension (".gz", ".bz2" or ".Z"). + Returns the output filename. """ tar_compression = {'gzip': 'gz', 'bzip2': 'bz2', None: '', 'compress': ''} @@ -48,10 +88,23 @@ import tarfile # late import so Python build itself doesn't break log.info('Creating tar archive') + + uid = _get_uid(owner) + gid = _get_gid(group) + + def _set_uid_gid(tarinfo): + if gid is not None: + tarinfo.gid = gid + tarinfo.gname = group + if uid is not None: + tarinfo.uid = uid + tarinfo.uname = owner + return tarinfo + if not dry_run: tar = tarfile.open(archive_name, 'w|%s' % tar_compression[compress]) try: - tar.add(base_dir) + tar.add(base_dir, filter=_set_uid_gid) finally: tar.close() @@ -140,7 +193,7 @@ return None def make_archive(base_name, format, root_dir=None, base_dir=None, verbose=0, - dry_run=0): + dry_run=0, owner=None, group=None): """Create an archive file (eg. zip or tar). 'base_name' is the name of the file to create, minus any format-specific @@ -153,6 +206,9 @@ ie. 'base_dir' will be the common prefix of all files and directories in the archive. 'root_dir' and 'base_dir' both default to the current directory. Returns the name of the archive file. + + 'owner' and 'group' are used when creating a tar archive. By default, + uses the current owner and group. 
""" save_cwd = os.getcwd() if root_dir is not None: @@ -174,6 +230,11 @@ func = format_info[0] for arg, val in format_info[1]: kwargs[arg] = val + + if format != 'zip': + kwargs['owner'] = owner + kwargs['group'] = group + try: filename = func(base_name, base_dir, **kwargs) finally: diff --git a/Lib/distutils/cmd.py b/Lib/distutils/cmd.py --- a/Lib/distutils/cmd.py +++ b/Lib/distutils/cmd.py @@ -365,9 +365,11 @@ from distutils.spawn import spawn spawn(cmd, search_path, dry_run=self.dry_run) - def make_archive(self, base_name, format, root_dir=None, base_dir=None): + def make_archive(self, base_name, format, root_dir=None, base_dir=None, + owner=None, group=None): return archive_util.make_archive(base_name, format, root_dir, base_dir, - dry_run=self.dry_run) + dry_run=self.dry_run, + owner=owner, group=group) def make_file(self, infiles, outfile, func, args, exec_msg=None, skip_msg=None, level=1): diff --git a/Lib/distutils/command/bdist.py b/Lib/distutils/command/bdist.py --- a/Lib/distutils/command/bdist.py +++ b/Lib/distutils/command/bdist.py @@ -37,6 +37,12 @@ "[default: dist]"), ('skip-build', None, "skip rebuilding everything (for testing/debugging)"), + ('owner=', 'u', + "Owner name used when creating a tar file" + " [default: current user]"), + ('group=', 'g', + "Group name used when creating a tar file" + " [default: current group]"), ] boolean_options = ['skip-build'] @@ -77,6 +83,8 @@ self.formats = None self.dist_dir = None self.skip_build = 0 + self.group = None + self.owner = None def finalize_options(self): # have to finalize 'plat_name' before 'bdist_base' @@ -122,6 +130,11 @@ if cmd_name not in self.no_format_option: sub_cmd.format = self.formats[i] + # passing the owner and group names for tar archiving + if cmd_name == 'bdist_dumb': + sub_cmd.owner = self.owner + sub_cmd.group = self.group + # If we're going to need to run this command again, tell it to # keep its temporary files around so subsequent runs go faster. if cmd_name in commands[i+1:]: diff --git a/Lib/distutils/command/bdist_dumb.py b/Lib/distutils/command/bdist_dumb.py --- a/Lib/distutils/command/bdist_dumb.py +++ b/Lib/distutils/command/bdist_dumb.py @@ -33,6 +33,12 @@ ('relative', None, "build the archive using relative paths" "(default: false)"), + ('owner=', 'u', + "Owner name used when creating a tar file" + " [default: current user]"), + ('group=', 'g', + "Group name used when creating a tar file" + " [default: current group]"), ] boolean_options = ['keep-temp', 'skip-build', 'relative'] @@ -48,6 +54,8 @@ self.dist_dir = None self.skip_build = None self.relative = 0 + self.owner = None + self.group = None def finalize_options(self): if self.bdist_dir is None: @@ -101,7 +109,8 @@ # Make the archive filename = self.make_archive(pseudoinstall_root, - self.format, root_dir=archive_root) + self.format, root_dir=archive_root, + owner=self.owner, group=self.group) if self.distribution.has_ext_modules(): pyversion = get_python_version() else: diff --git a/Lib/distutils/command/sdist.py b/Lib/distutils/command/sdist.py --- a/Lib/distutils/command/sdist.py +++ b/Lib/distutils/command/sdist.py @@ -74,6 +74,10 @@ ('metadata-check', None, "Ensure that all required elements of meta-data " "are supplied. Warn if any missing. 
[default]"), + ('owner=', 'u', + "Owner name used when creating a tar file [default: current user]"), + ('group=', 'g', + "Group name used when creating a tar file [default: current group]"), ] boolean_options = ['use-defaults', 'prune', @@ -113,6 +117,8 @@ self.archive_files = None self.metadata_check = 1 + self.owner = None + self.group = None def finalize_options(self): if self.manifest is None: @@ -444,7 +450,8 @@ self.formats.append(self.formats.pop(self.formats.index('tar'))) for fmt in self.formats: - file = self.make_archive(base_name, fmt, base_dir=base_dir) + file = self.make_archive(base_name, fmt, base_dir=base_dir, + owner=self.owner, group=self.group) archive_files.append(file) self.distribution.dist_files.append(('sdist', '', file)) diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -16,6 +16,13 @@ from test.support import check_warnings, run_unittest, patch try: + import grp + import pwd + UID_GID_SUPPORT = True +except ImportError: + UID_GID_SUPPORT = False + +try: import zipfile ZIP_SUPPORT = True except ImportError: @@ -77,7 +84,7 @@ tmpdir2 = self.mkdtemp() unittest.skipUnless(splitdrive(tmpdir)[0] == splitdrive(tmpdir2)[0], - "Source and target should be on same drive") + "source and target should be on same drive") base_name = os.path.join(tmpdir2, target_name) @@ -275,6 +282,58 @@ finally: del ARCHIVE_FORMATS['xxx'] + def test_make_archive_owner_group(self): + # testing make_archive with owner and group, with various combinations + # this works even if there's not gid/uid support + if UID_GID_SUPPORT: + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + else: + group = owner = 'root' + + base_dir, root_dir, base_name = self._create_files() + base_name = os.path.join(self.mkdtemp() , 'archive') + res = make_archive(base_name, 'zip', root_dir, base_dir, owner=owner, + group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'zip', root_dir, base_dir) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner=owner, group=group) + self.assertTrue(os.path.exists(res)) + + res = make_archive(base_name, 'tar', root_dir, base_dir, + owner='kjhkjhkjg', group='oihohoh') + self.assertTrue(os.path.exists(res)) + + @unittest.skipUnless(zlib, "Requires zlib") + @unittest.skipUnless(UID_GID_SUPPORT, "Requires grp and pwd support") + def test_tarfile_root_owner(self): + tmpdir, tmpdir2, base_name = self._create_files() + old_dir = os.getcwd() + os.chdir(tmpdir) + group = grp.getgrgid(0)[0] + owner = pwd.getpwuid(0)[0] + try: + archive_name = make_tarball(base_name, 'dist', compress=None, + owner=owner, group=group) + finally: + os.chdir(old_dir) + + # check if the compressed tarball was created + self.assertTrue(os.path.exists(archive_name)) + + # now checks the rights + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, 0) + self.assertEquals(member.gid, 0) + finally: + archive.close() + def test_suite(): return unittest.makeSuite(ArchiveUtilTestCase) diff --git a/Lib/distutils/tests/test_sdist.py b/Lib/distutils/tests/test_sdist.py --- a/Lib/distutils/tests/test_sdist.py +++ b/Lib/distutils/tests/test_sdist.py @@ -14,6 +14,12 @@ except ImportError: ZLIB_SUPPORT = False +try: + import grp + import pwd + UID_GID_SUPPORT = True +except ImportError: + UID_GID_SUPPORT = False from 
distutils.command.sdist import sdist, show_formats from distutils.core import Distribution @@ -425,6 +431,53 @@ self.assertEqual(sorted(filenames), ['fake-1.0', 'fake-1.0/PKG-INFO', 'fake-1.0/README.manual']) + @unittest.skipUnless(zlib, "requires zlib") + @unittest.skipUnless(UID_GID_SUPPORT, "Requires grp and pwd support") + def test_make_distribution_owner_group(self): + + # check if tar and gzip are installed + if (find_executable('tar') is None or + find_executable('gzip') is None): + return + + # now building a sdist + dist, cmd = self.get_cmd() + + # creating a gztar and specifying the owner+group + cmd.formats = ['gztar'] + cmd.owner = pwd.getpwuid(0)[0] + cmd.group = grp.getgrgid(0)[0] + cmd.ensure_finalized() + cmd.run() + + # making sure we have the good rights + archive_name = join(self.tmp_dir, 'dist', 'fake-1.0.tar.gz') + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, 0) + self.assertEquals(member.gid, 0) + finally: + archive.close() + + # building a sdist again + dist, cmd = self.get_cmd() + + # creating a gztar + cmd.formats = ['gztar'] + cmd.ensure_finalized() + cmd.run() + + # making sure we have the good rights + archive_name = join(self.tmp_dir, 'dist', 'fake-1.0.tar.gz') + archive = tarfile.open(archive_name) + try: + for member in archive.getmembers(): + self.assertEquals(member.uid, os.getuid()) + self.assertEquals(member.gid, os.getgid()) + finally: + archive.close() + def test_suite(): return unittest.makeSuite(SDistTestCase) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,6 +47,9 @@ Library ------- +- Issue #19544 and #6516: Restore support for --user and --group parameters to + sdist command accidentally rolled back as part of the distutils2 rollback. + - Issue #13674: Prevented time.strftime from crashing on Windows when given a year before 1900 and a format of %y. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:43:02 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:43:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTIz?= =?utf-8?q?=3A_Closed_FileHandler_leak_which_occurred_when_delay_was_set?= =?utf-8?q?=2E?= Message-ID: <3dLs2Z2tvVz7Ljc@mail.python.org> http://hg.python.org/cpython/rev/bbb227b96c45 changeset: 87120:bbb227b96c45 branch: 2.7 parent: 87101:c178d72e2f84 user: Vinay Sajip date: Fri Nov 15 20:39:33 2013 +0000 summary: Issue #19523: Closed FileHandler leak which occurred when delay was set. files: Lib/logging/__init__.py | 4 +++- Misc/NEWS | 2 ++ 2 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -912,8 +912,10 @@ self.flush() if hasattr(self.stream, "close"): self.stream.close() - StreamHandler.close(self) self.stream = None + # Issue #19523: call unconditionally to + # prevent a handler leak when delay is set + StreamHandler.close(self) finally: self.release() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,8 @@ Library ------- +- Issue #19523: Closed FileHandler leak which occurred when delay was set. + - Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. 
- Issue #19480: HTMLParser now accepts all valid start-tag names as defined -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:43:03 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:43:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTIz?= =?utf-8?q?=3A_Closed_FileHandler_leak_which_occurred_when_delay_was_set?= =?utf-8?q?=2E?= Message-ID: <3dLs2b4v0tz7Ljk@mail.python.org> http://hg.python.org/cpython/rev/058810fe1b98 changeset: 87121:058810fe1b98 branch: 3.3 parent: 87117:8bbd215940f1 user: Vinay Sajip date: Fri Nov 15 20:40:27 2013 +0000 summary: Issue #19523: Closed FileHandler leak which occurred when delay was set. files: Lib/logging/__init__.py | 4 +++- Misc/NEWS | 2 ++ 2 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -976,8 +976,10 @@ self.flush() if hasattr(self.stream, "close"): self.stream.close() - StreamHandler.close(self) self.stream = None + # Issue #19523: call unconditionally to + # prevent a handler leak when delay is set + StreamHandler.close(self) finally: self.release() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,8 @@ Library ------- +- Issue #19523: Closed FileHandler leak which occurred when delay was set. + - Issue #13674: Prevented time.strftime from crashing on Windows when given a year before 1900 and a format of %y. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:43:04 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:43:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Closes_=2319523=3A_Merged_fix_from_3=2E3=2E?= Message-ID: <3dLs2c6mp3z7Ln8@mail.python.org> http://hg.python.org/cpython/rev/a3640822c3e6 changeset: 87122:a3640822c3e6 parent: 87119:b9c9c4b2effe parent: 87121:058810fe1b98 user: Vinay Sajip date: Fri Nov 15 20:42:47 2013 +0000 summary: Closes #19523: Merged fix from 3.3. files: Lib/logging/__init__.py | 4 +++- Misc/NEWS | 2 ++ 2 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -990,8 +990,10 @@ self.flush() if hasattr(self.stream, "close"): self.stream.close() - StreamHandler.close(self) self.stream = None + # Issue #19523: call unconditionally to + # prevent a handler leak when delay is set + StreamHandler.close(self) finally: self.release() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,6 +47,8 @@ Library ------- +- Issue #19523: Closed FileHandler leak which occurred when delay was set. + - Issue #19544 and #6516: Restore support for --user and --group parameters to sdist command accidentally rolled back as part of the distutils2 rollback. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:59:02 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:59:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTA0?= =?utf-8?q?=3A_Used_American_spelling_for_=27customize=27=2E?= Message-ID: <3dLsP21rmwz7Ljd@mail.python.org> http://hg.python.org/cpython/rev/e9d9bebb979f changeset: 87123:e9d9bebb979f branch: 2.7 parent: 87120:bbb227b96c45 user: Vinay Sajip date: Fri Nov 15 20:54:15 2013 +0000 summary: Issue #19504: Used American spelling for 'customize'. files: Doc/library/logging.rst | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -727,8 +727,8 @@ Return either the standard :class:`Logger` class, or the last class passed to :func:`setLoggerClass`. This function may be called from within a new class - definition, to ensure that installing a customised :class:`Logger` class will - not undo customisations already applied by other code. For example:: + definition, to ensure that installing a customized :class:`Logger` class will + not undo customizations already applied by other code. For example:: class MyLogger(logging.getLoggerClass()): # ... override behaviour here -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:59:03 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:59:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTA0?= =?utf-8?q?=3A_Used_American_spelling_for_=27customize=27=2E?= Message-ID: <3dLsP34zwmz7Ll0@mail.python.org> http://hg.python.org/cpython/rev/1c714c35c02a changeset: 87124:1c714c35c02a branch: 3.3 parent: 87121:058810fe1b98 user: Vinay Sajip date: Fri Nov 15 20:58:13 2013 +0000 summary: Issue #19504: Used American spelling for 'customize'. files: Doc/howto/logging-cookbook.rst | 10 +++++----- Doc/library/logging.handlers.rst | 2 +- Doc/library/logging.rst | 4 ++-- Lib/venv/__init__.py | 2 +- PC/launcher.c | 4 ++-- 5 files changed, 11 insertions(+), 11 deletions(-) diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -1098,7 +1098,7 @@ .. _custom-logrecord: -Customising ``LogRecord`` +Customizing ``LogRecord`` ------------------------- Every logging event is represented by a :class:`LogRecord` instance. @@ -1315,7 +1315,7 @@ .. _cookbook-rotator-namer: -Using a rotator and namer to customise log rotation processing +Using a rotator and namer to customize log rotation processing -------------------------------------------------------------- An example of how you can define a namer and rotator is given in the following @@ -1696,14 +1696,14 @@ .. currentmodule:: logging.config -Customising handlers with :func:`dictConfig` +Customizing handlers with :func:`dictConfig` -------------------------------------------- -There are times when you want to customise logging handlers in particular ways, +There are times when you want to customize logging handlers in particular ways, and if you use :func:`dictConfig` you may be able to do this without subclassing. As an example, consider that you may want to set the ownership of a log file. On POSIX, this is easily done using :func:`shutil.chown`, but the file -handlers in the stdlib don't offer built-in support. 
You can customise handler +handlers in the stdlib don't offer built-in support. You can customize handler creation using a plain function such as:: def owned_file_handler(filename, mode='a', encoding=None, owner=None): diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -892,7 +892,7 @@ Enqueues the record on the queue using ``put_nowait()``; you may want to override this if you want to use blocking behaviour, or a - timeout, or a customised queue implementation. + timeout, or a customized queue implementation. diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -845,8 +845,8 @@ Return either the standard :class:`Logger` class, or the last class passed to :func:`setLoggerClass`. This function may be called from within a new class - definition, to ensure that installing a customised :class:`Logger` class will - not undo customisations already applied by other code. For example:: + definition, to ensure that installing a customized :class:`Logger` class will + not undo customizations already applied by other code. For example:: class MyLogger(logging.getLoggerClass()): # ... override behaviour here diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -38,7 +38,7 @@ class EnvBuilder: """ This class exists to allow virtual environment creation to be - customised. The constructor parameters determine the builder's + customized. The constructor parameters determine the builder's behaviour when called upon to create a virtual environment. By default, the builder makes the system (global) site-packages dir diff --git a/PC/launcher.c b/PC/launcher.c --- a/PC/launcher.c +++ b/PC/launcher.c @@ -807,10 +807,10 @@ } if (*vpp == NULL) { /* - * Not found in builtins - look in customised commands. + * Not found in builtins - look in customized commands. * * We can't permanently modify the shebang line in case - * it's not a customised command, but we can temporarily + * it's not a customized command, but we can temporarily * stick a NUL after the command while searching for it, * then put back the char we zapped. */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 21:59:05 2013 From: python-checkins at python.org (vinay.sajip) Date: Fri, 15 Nov 2013 21:59:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319504=3A_Used_American_spelling_for_=27customiz?= =?utf-8?b?ZScu?= Message-ID: <3dLsP50xz3z7LlY@mail.python.org> http://hg.python.org/cpython/rev/08c7bc4266e6 changeset: 87125:08c7bc4266e6 parent: 87122:a3640822c3e6 parent: 87124:1c714c35c02a user: Vinay Sajip date: Fri Nov 15 20:58:47 2013 +0000 summary: Issue #19504: Used American spelling for 'customize'. files: Doc/howto/logging-cookbook.rst | 10 +++++----- Doc/library/logging.handlers.rst | 2 +- Doc/library/logging.rst | 4 ++-- Lib/venv/__init__.py | 2 +- PC/launcher.c | 4 ++-- 5 files changed, 11 insertions(+), 11 deletions(-) diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -1096,7 +1096,7 @@ .. _custom-logrecord: -Customising ``LogRecord`` +Customizing ``LogRecord`` ------------------------- Every logging event is represented by a :class:`LogRecord` instance. @@ -1313,7 +1313,7 @@ .. 
_cookbook-rotator-namer: -Using a rotator and namer to customise log rotation processing +Using a rotator and namer to customize log rotation processing -------------------------------------------------------------- An example of how you can define a namer and rotator is given in the following @@ -1694,14 +1694,14 @@ .. currentmodule:: logging.config -Customising handlers with :func:`dictConfig` +Customizing handlers with :func:`dictConfig` -------------------------------------------- -There are times when you want to customise logging handlers in particular ways, +There are times when you want to customize logging handlers in particular ways, and if you use :func:`dictConfig` you may be able to do this without subclassing. As an example, consider that you may want to set the ownership of a log file. On POSIX, this is easily done using :func:`shutil.chown`, but the file -handlers in the stdlib don't offer built-in support. You can customise handler +handlers in the stdlib don't offer built-in support. You can customize handler creation using a plain function such as:: def owned_file_handler(filename, mode='a', encoding=None, owner=None): diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -904,7 +904,7 @@ Enqueues the record on the queue using ``put_nowait()``; you may want to override this if you want to use blocking behaviour, or a - timeout, or a customised queue implementation. + timeout, or a customized queue implementation. diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -845,8 +845,8 @@ Return either the standard :class:`Logger` class, or the last class passed to :func:`setLoggerClass`. This function may be called from within a new class - definition, to ensure that installing a customised :class:`Logger` class will - not undo customisations already applied by other code. For example:: + definition, to ensure that installing a customized :class:`Logger` class will + not undo customizations already applied by other code. For example:: class MyLogger(logging.getLoggerClass()): # ... override behaviour here diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -38,7 +38,7 @@ class EnvBuilder: """ This class exists to allow virtual environment creation to be - customised. The constructor parameters determine the builder's + customized. The constructor parameters determine the builder's behaviour when called upon to create a virtual environment. By default, the builder makes the system (global) site-packages dir diff --git a/PC/launcher.c b/PC/launcher.c --- a/PC/launcher.c +++ b/PC/launcher.c @@ -878,10 +878,10 @@ } if (vpp->shebang == NULL) { /* - * Not found in builtins - look in customised commands. + * Not found in builtins - look in customized commands. * * We can't permanently modify the shebang line in case - * it's not a customised command, but we can temporarily + * it's not a customized command, but we can temporarily * stick a NUL after the command while searching for it, * then put back the char we zapped. 
*/ -- Repository URL: http://hg.python.org/cpython From christian at python.org Fri Nov 15 23:05:55 2013 From: christian at python.org (Christian Heimes) Date: Fri, 15 Nov 2013 23:05:55 +0100 Subject: [Python-checkins] cpython: Issue #19544 and Issue #6516: Restore support for --user and --group parameters In-Reply-To: <3dLnZZ0Vjnz7LqJ@mail.python.org> References: <3dLnZZ0Vjnz7LqJ@mail.python.org> Message-ID: <52869AC3.1040405@python.org> Am 15.11.2013 19:07, schrieb jason.coombs: > http://hg.python.org/cpython/rev/b9c9c4b2effe > changeset: 87119:b9c9c4b2effe > user: Andrew Kuchling > date: Fri Nov 15 13:01:52 2013 -0500 > summary: > Issue #19544 and Issue #6516: Restore support for --user and --group parameters to sdist command as found in Python 2.7 and originally slated for Python 3.2 but accidentally rolled back as part of the distutils2 rollback. Closes Issue #6516. > Your commit has broken the build: ./python -E -S -m sysconfig --generate-posix-vars Could not find platform dependent libraries Consider setting $PYTHONHOME to [:] Traceback (most recent call last): File "./setup.py", line 11, in from distutils.core import Extension, setup File "/home/heimes/dev/python/cpython/Lib/distutils/core.py", line 18, in from distutils.cmd import Command File "/home/heimes/dev/python/cpython/Lib/distutils/cmd.py", line 9, in from distutils import util, dir_util, file_util, archive_util, dep_util File "/home/heimes/dev/python/cpython/Lib/distutils/archive_util.py", line 27, in from grp import getgrnam ImportError: No module named 'grp' The grp module is built later by setup.py. From python-checkins at python.org Fri Nov 15 23:08:31 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 15 Nov 2013 23:08:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319544_and_Issue_?= =?utf-8?q?=236516=3A_quick_workaround_for_failing_builds?= Message-ID: <3dLtxC20Kqz7Lk6@mail.python.org> http://hg.python.org/cpython/rev/b08868fd5994 changeset: 87126:b08868fd5994 user: Christian Heimes date: Fri Nov 15 23:08:21 2013 +0100 summary: Issue #19544 and Issue #6516: quick workaround for failing builds files: Lib/distutils/archive_util.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/distutils/archive_util.py b/Lib/distutils/archive_util.py --- a/Lib/distutils/archive_util.py +++ b/Lib/distutils/archive_util.py @@ -20,12 +20,12 @@ try: from pwd import getpwnam -except AttributeError: +except (ImportError, AttributeError): getpwnam = None try: from grp import getgrnam -except AttributeError: +except (ImportError, AttributeError): getgrnam = None def _get_gid(name): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 15 23:13:29 2013 From: python-checkins at python.org (victor.stinner) Date: Fri, 15 Nov 2013 23:13:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319544=2C_=236516?= =?utf-8?q?=3A_no_need_to_catch_AttributeError_on_import_pwd/grp?= Message-ID: <3dLv2x0c31z7Ljt@mail.python.org> http://hg.python.org/cpython/rev/015463176d2e changeset: 87127:015463176d2e user: Victor Stinner date: Fri Nov 15 23:13:17 2013 +0100 summary: Issue #19544, #6516: no need to catch AttributeError on import pwd/grp files: Lib/distutils/archive_util.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/distutils/archive_util.py b/Lib/distutils/archive_util.py --- a/Lib/distutils/archive_util.py +++ b/Lib/distutils/archive_util.py @@ -20,12 +20,12 @@ try: from pwd import getpwnam -except 
(ImportError, AttributeError): +except ImportError: getpwnam = None try: from grp import getgrnam -except (ImportError, AttributeError): +except ImportError: getgrnam = None def _get_gid(name): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:12 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_on_Wi?= =?utf-8?q?ndows_64-bit=3A_explicit_cast_size=5Ft_to_unsigned_long?= Message-ID: <3dLwr41LW1zQBQ@mail.python.org> http://hg.python.org/cpython/rev/f0a7924fac56 changeset: 87128:f0a7924fac56 user: Victor Stinner date: Fri Nov 15 23:16:15 2013 +0100 summary: Fix compiler warning on Windows 64-bit: explicit cast size_t to unsigned long files: Modules/_randommodule.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Modules/_randommodule.c b/Modules/_randommodule.c --- a/Modules/_randommodule.c +++ b/Modules/_randommodule.c @@ -179,7 +179,7 @@ k = (N>key_length ? N : key_length); for (; k; k--) { mt[i] = (mt[i] ^ ((mt[i-1] ^ (mt[i-1] >> 30)) * 1664525UL)) - + init_key[j] + j; /* non linear */ + + init_key[j] + (unsigned long)j; /* non linear */ mt[i] &= 0xffffffffUL; /* for WORDSIZE > 32 machines */ i++; j++; if (i>=N) { mt[0] = mt[N-1]; i=1; } @@ -187,7 +187,7 @@ } for (k=N-1; k; k--) { mt[i] = (mt[i] ^ ((mt[i-1] ^ (mt[i-1] >> 30)) * 1566083941UL)) - - i; /* non linear */ + - (unsigned long)i; /* non linear */ mt[i] &= 0xffffffffUL; /* for WORDSIZE > 32 machines */ i++; if (i>=N) { mt[0] = mt[N-1]; i=1; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:13 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_=28es?= =?utf-8?q?pecially_on_Windows_64-bit=29=3A_don=27t_truncate_Py=5Fssize=5F?= =?utf-8?q?t?= Message-ID: <3dLwr537NPzQBQ@mail.python.org> http://hg.python.org/cpython/rev/8adbd8a3a626 changeset: 87129:8adbd8a3a626 user: Victor Stinner date: Fri Nov 15 23:21:11 2013 +0100 summary: Fix compiler warning (especially on Windows 64-bit): don't truncate Py_ssize_t to int files: Modules/_sre.c | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Modules/_sre.c b/Modules/_sre.c --- a/Modules/_sre.c +++ b/Modules/_sre.c @@ -269,7 +269,7 @@ /* factories and destructors */ /* see sre.h for object declarations */ -static PyObject*pattern_new_match(PatternObject*, SRE_STATE*, int); +static PyObject*pattern_new_match(PatternObject*, SRE_STATE*, Py_ssize_t); static PyObject*pattern_scanner(PatternObject*, PyObject*, PyObject* kw); static PyObject * @@ -468,7 +468,7 @@ } static void -pattern_error(int status) +pattern_error(Py_ssize_t status) { switch (status) { case SRE_ERROR_RECURSION_LIMIT: @@ -562,7 +562,7 @@ pattern_search(PatternObject* self, PyObject* args, PyObject* kw) { SRE_STATE state; - int status; + Py_ssize_t status; PyObject* string; Py_ssize_t start = 0; @@ -2322,7 +2322,7 @@ }; static PyObject* -pattern_new_match(PatternObject* pattern, SRE_STATE* state, int status) +pattern_new_match(PatternObject* pattern, SRE_STATE* state, Py_ssize_t status) { /* create match object (from state object) */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:14 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:14 +0100 (CET) Subject: 
[Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_in_wi?= =?utf-8?q?n32=5Furandom=28=29=3A_explicit_cast_to_DWORD_in?= Message-ID: <3dLwr64lWVz7LlY@mail.python.org> http://hg.python.org/cpython/rev/baab11a466ab changeset: 87130:baab11a466ab user: Victor Stinner date: Fri Nov 15 23:26:25 2013 +0100 summary: Fix compiler warning in win32_urandom(): explicit cast to DWORD in CryptGenRandom() files: Python/random.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/random.c b/Python/random.c --- a/Python/random.c +++ b/Python/random.c @@ -50,7 +50,7 @@ while (size > 0) { chunk = size > INT_MAX ? INT_MAX : size; - if (!CryptGenRandom(hCryptProv, chunk, buffer)) + if (!CryptGenRandom(hCryptProv, (DWORD)chunk, buffer)) { /* CryptGenRandom() failed */ if (raise) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:15 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_=28on?= =?utf-8?q?_Windows_64-bit=29=3A_explicit_cast_Py=5Fssize=5Ft_to_unsigned?= Message-ID: <3dLwr76RBgz7Lmt@mail.python.org> http://hg.python.org/cpython/rev/e086cb1c6e5a changeset: 87131:e086cb1c6e5a user: Victor Stinner date: Sat Nov 16 00:13:29 2013 +0100 summary: Fix compiler warning (on Windows 64-bit): explicit cast Py_ssize_t to unsigned char, n is in range [0; 255] (a tuple cannot have a negative length) files: Python/marshal.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Python/marshal.c b/Python/marshal.c --- a/Python/marshal.c +++ b/Python/marshal.c @@ -80,7 +80,7 @@ #define w_byte(c, p) if (((p)->fp)) putc((c), (p)->fp); \ else if ((p)->ptr != (p)->end) *(p)->ptr++ = (c); \ - else w_more(c, p) + else w_more((c), p) static void w_more(char c, WFILE *p) @@ -448,7 +448,7 @@ n = PyTuple_Size(v); if (p->version >= 4 && n < 256) { W_TYPE(TYPE_SMALL_TUPLE, p); - w_byte(n, p); + w_byte((unsigned char)n, p); } else { W_TYPE(TYPE_TUPLE, p); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:17 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_on_Wi?= =?utf-8?q?ndows_64-bit=3A_asdl=5Fseq=5FSET=28=29_stores_the_index?= Message-ID: <3dLwr914sJz7Lnj@mail.python.org> http://hg.python.org/cpython/rev/cf399d13a707 changeset: 87132:cf399d13a707 user: Victor Stinner date: Sat Nov 16 00:16:58 2013 +0100 summary: Fix compiler warning on Windows 64-bit: asdl_seq_SET() stores the index parameter into a Py_ssize_t, instead of an int files: Include/asdl.h | 10 ++++++---- 1 files changed, 6 insertions(+), 4 deletions(-) diff --git a/Include/asdl.h b/Include/asdl.h --- a/Include/asdl.h +++ b/Include/asdl.h @@ -31,11 +31,13 @@ #define asdl_seq_GET(S, I) (S)->elements[(I)] #define asdl_seq_LEN(S) ((S) == NULL ? 
0 : (S)->size) #ifdef Py_DEBUG -#define asdl_seq_SET(S, I, V) { \ - int _asdl_i = (I); \ - assert((S) && _asdl_i < (S)->size); \ +#define asdl_seq_SET(S, I, V) \ + do { \ + Py_ssize_t _asdl_i = (I); \ + assert((S) != NULL); \ + assert(_asdl_i < (S)->size); \ (S)->elements[_asdl_i] = (V); \ -} + } while (0) #else #define asdl_seq_SET(S, I, V) (S)->elements[I] = (V) #endif -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:18 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warning_on_Wi?= =?utf-8?q?ndows_64_bit=3A_=5Finit=5Fpos=5Fargs=28=29_result_type_is?= Message-ID: <3dLwrB2tgRz7Lmt@mail.python.org> http://hg.python.org/cpython/rev/309d855ebc3e changeset: 87133:309d855ebc3e user: Victor Stinner date: Sat Nov 16 00:17:22 2013 +0100 summary: Fix compiler warning on Windows 64 bit: _init_pos_args() result type is Py_ssize_t, not int files: Modules/_ctypes/_ctypes.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -4054,8 +4054,8 @@ return -1; } if (PyTuple_GET_SIZE(args)) { - int res = _init_pos_args(self, Py_TYPE(self), - args, kwds, 0); + Py_ssize_t res = _init_pos_args(self, Py_TYPE(self), + args, kwds, 0); if (res == -1) return -1; if (res < PyTuple_GET_SIZE(args)) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:19 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_sock=5Frecvfrom=5Fguts?= =?utf-8?q?=28=29=3A_recvfrom=28=29_size_is_limited_to_an_int_on_Windows?= =?utf-8?q?=2C_not?= Message-ID: <3dLwrC4cfQz7Lm9@mail.python.org> http://hg.python.org/cpython/rev/7cd4c3e9e310 changeset: 87134:7cd4c3e9e310 user: Victor Stinner date: Sat Nov 16 00:18:58 2013 +0100 summary: Fix sock_recvfrom_guts(): recvfrom() size is limited to an int on Windows, not on other OSes! 
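This fix and the surrounding Windows 64-bit warning fixes all reduce to one pattern: a size_t or Py_ssize_t length has to be handed to an API whose length parameter is an int (or DWORD), so the value is clamped or range-checked first and the narrowing cast is written out explicitly. A small self-contained sketch of the clamping variant, with hypothetical names (consume_chunk stands in for a call such as recvfrom() on Windows); it is not the socketmodule.c code.

/* Sketch: feed an arbitrarily large buffer to an int-length API in chunks. */
#include <limits.h>
#include <stddef.h>
#include <stdio.h>

/* Pretend external API whose length parameter is an int. */
static int
consume_chunk(const char *buf, int len)
{
    (void)buf;
    return len;                     /* pretend all 'len' bytes were consumed */
}

static void
consume_all(const char *buf, size_t len)
{
    while (len > 0) {
        /* size_t can be wider than int on 64-bit platforms, so clamp
         * explicitly before the narrowing cast. */
        size_t chunk = len > INT_MAX ? (size_t)INT_MAX : len;
        int n = consume_chunk(buf, (int)chunk);
        buf += n;
        len -= (size_t)n;
    }
}

int
main(void)
{
    static char data[1 << 16];
    consume_all(data, sizeof(data));
    printf("done\n");
    return 0;
}

Clamping to INT_MAX and looping keeps the code correct for inputs larger than 2 GiB instead of silently truncating the length.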
files: Modules/socketmodule.c | 16 ++++++++-------- 1 files changed, 8 insertions(+), 8 deletions(-) diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -840,7 +840,7 @@ return siz; } /* special-case broadcast - inet_addr() below can return INADDR_NONE for - * this */ + * this */ if (strcmp(name, "255.255.255.255") == 0 || strcmp(name, "") == 0) { struct sockaddr_in *sin; @@ -901,7 +901,7 @@ #endif return 4; } - } + } #endif /* HAVE_INET_PTON */ /* perform a name resolution */ @@ -2833,7 +2833,7 @@ memset(&addrbuf, 0, addrlen); timeout = internal_select_ex(s, 0, interval); if (!timeout) { -#ifndef MS_WINDOWS +#ifdef MS_WINDOWS if (len > INT_MAX) len = INT_MAX; n = recvfrom(s->sock_fd, cbuf, (int)len, flags, @@ -4702,7 +4702,7 @@ /* On UNIX, dup can be used to duplicate the file descriptor of a socket */ newfd = _Py_dup(fd); if (newfd == INVALID_SOCKET) - return NULL; + return NULL; #endif newfdobj = PyLong_FromSocket_t(newfd); @@ -5093,7 +5093,7 @@ return NULL; } - size = sizeof(addr); + size = sizeof(addr); ret = WSAStringToAddressA(ip, af, NULL, (LPSOCKADDR)&addr, &size); if (ret) { @@ -5101,10 +5101,10 @@ return NULL; } else if(af == AF_INET) { struct sockaddr_in *addr4 = (struct sockaddr_in*)&addr; - return PyBytes_FromStringAndSize((const char *)&(addr4->sin_addr), + return PyBytes_FromStringAndSize((const char *)&(addr4->sin_addr), sizeof(addr4->sin_addr)); } else if (af == AF_INET6) { - return PyBytes_FromStringAndSize((const char *)&(addr.sin6_addr), + return PyBytes_FromStringAndSize((const char *)&(addr.sin6_addr), sizeof(addr.sin6_addr)); } else { PyErr_SetString(PyExc_OSError, "unknown address family"); @@ -5231,7 +5231,7 @@ } retlen = sizeof(ip); - ret = WSAAddressToStringA((struct sockaddr*)&addr, addrlen, NULL, + ret = WSAAddressToStringA((struct sockaddr*)&addr, addrlen, NULL, ip, &retlen); if (ret) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:34:20 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:34:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warnings_on_W?= =?utf-8?q?indows_64_bit=3A_add_an_explicit_cast_from_Py=5Fssize=5Ft?= Message-ID: <3dLwrD6N3Pz7Lnq@mail.python.org> http://hg.python.org/cpython/rev/9e25367095c4 changeset: 87135:9e25367095c4 user: Victor Stinner date: Sat Nov 16 00:27:16 2013 +0100 summary: Fix compiler warnings on Windows 64 bit: add an explicit cast from Py_ssize_t to int, password.len was checked for being smaller than INT_MAX. 
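The _hashopenssl.c change uses the other common variant of the same idea: validate the Py_ssize_t value against INT_MAX up front, then cast explicitly at the call site so the narrowing is visibly intentional. A minimal sketch with hypothetical names (my_ssize_t stands in for Py_ssize_t; hash_password stands in for an OpenSSL-style callee that takes an int length); it is not the _hashopenssl.c code.

/* Sketch: check-then-cast before calling an int-length API. */
#include <limits.h>
#include <stdio.h>
#include <string.h>

typedef long long my_ssize_t;       /* stand-in for Py_ssize_t */

/* Callee with an int length parameter, as many C library APIs have. */
static int
hash_password(const char *pw, int pw_len)
{
    (void)pw;
    return pw_len >= 0;             /* pretend the hash succeeded */
}

static int
hash_password_checked(const char *pw, my_ssize_t pw_len)
{
    if (pw_len < 0 || pw_len > INT_MAX) {
        fprintf(stderr, "password is too long\n");
        return 0;
    }
    /* Safe: the range check above guarantees the cast cannot truncate. */
    return hash_password(pw, (int)pw_len);
}

int
main(void)
{
    const char *pw = "hunter2";
    return hash_password_checked(pw, (my_ssize_t)strlen(pw)) ? 0 : 1;
}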
files: Modules/_hashopenssl.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -670,7 +670,7 @@ key = PyBytes_AS_STRING(key_obj); Py_BEGIN_ALLOW_THREADS - retval = PKCS5_PBKDF2_HMAC_fast((char*)password.buf, password.len, + retval = PKCS5_PBKDF2_HMAC_fast((char*)password.buf, (int)password.len, (unsigned char *)salt.buf, salt.len, iterations, digest, dklen, (unsigned char *)key); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:46:03 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 00:46:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Don=27t_mix_wide_character?= =?utf-8?q?_strings_and_byte_strings_=28L=22lib/python=22_VERSION=29=3A_us?= =?utf-8?q?e?= Message-ID: <3dLx5l5LHtz7LqR@mail.python.org> http://hg.python.org/cpython/rev/a7a8ccf69708 changeset: 87136:a7a8ccf69708 user: Victor Stinner date: Sat Nov 16 00:45:54 2013 +0100 summary: Don't mix wide character strings and byte strings (L"lib/python" VERSION): use _Py_char2wchar() to decode lib_python instead. Some compilers don't support concatenating literals: L"wide" "bytes". Example: IRIX compiler. files: Modules/getpath.c | 19 +++++++++++++------ 1 files changed, 13 insertions(+), 6 deletions(-) diff --git a/Modules/getpath.c b/Modules/getpath.c --- a/Modules/getpath.c +++ b/Modules/getpath.c @@ -135,7 +135,6 @@ static wchar_t progpath[MAXPATHLEN+1]; static wchar_t *module_search_path = NULL; static int module_search_path_malloced = 0; -static wchar_t *lib_python = L"lib/python" VERSION; static void reduce(wchar_t *dir) @@ -317,7 +316,8 @@ bytes long. */ static int -search_for_prefix(wchar_t *argv0_path, wchar_t *home, wchar_t *_prefix) +search_for_prefix(wchar_t *argv0_path, wchar_t *home, wchar_t *_prefix, + wchar_t *lib_python) { size_t n; wchar_t *vpath; @@ -383,7 +383,8 @@ MAXPATHLEN bytes long. 
*/ static int -search_for_exec_prefix(wchar_t *argv0_path, wchar_t *home, wchar_t *_exec_prefix) +search_for_exec_prefix(wchar_t *argv0_path, wchar_t *home, + wchar_t *_exec_prefix, wchar_t *lib_python) { size_t n; @@ -493,12 +494,14 @@ char execpath[MAXPATHLEN+1]; #endif wchar_t *_pythonpath, *_prefix, *_exec_prefix; + wchar_t *lib_python; _pythonpath = _Py_char2wchar(PYTHONPATH, NULL); _prefix = _Py_char2wchar(PREFIX, NULL); _exec_prefix = _Py_char2wchar(EXEC_PREFIX, NULL); + lib_python = _Py_char2wchar("lib/python" VERSION, NULL); - if (!_pythonpath || !_prefix || !_exec_prefix) { + if (!_pythonpath || !_prefix || !_exec_prefix || !lib_python) { Py_FatalError( "Unable to decode path variables in getpath.c: " "memory error"); @@ -666,7 +669,8 @@ } } - if (!(pfound = search_for_prefix(argv0_path, home, _prefix))) { + pfound = search_for_prefix(argv0_path, home, _prefix, lib_python); + if (!pfound) { if (!Py_FrozenFlag) fprintf(stderr, "Could not find platform independent libraries \n"); @@ -689,7 +693,9 @@ zip_path[bufsz - 6] = VERSION[0]; zip_path[bufsz - 5] = VERSION[2]; - if (!(efound = search_for_exec_prefix(argv0_path, home, _exec_prefix))) { + efound = search_for_exec_prefix(argv0_path, home, + _exec_prefix, lib_python); + if (!efound) { if (!Py_FrozenFlag) fprintf(stderr, "Could not find platform dependent libraries \n"); @@ -818,6 +824,7 @@ PyMem_RawFree(_pythonpath); PyMem_RawFree(_prefix); PyMem_RawFree(_exec_prefix); + PyMem_RawFree(lib_python); PyMem_RawFree(rtpypath); } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 00:57:10 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 00:57:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319544_and_Issue_?= =?utf-8?q?=231180=3A_Restore_global_option_to_ignore?= Message-ID: <3dLxLZ0lJ8z7LpM@mail.python.org> http://hg.python.org/cpython/rev/c35311fcc967 changeset: 87137:c35311fcc967 user: Andrew Kuchling date: Sun Nov 10 18:11:00 2013 -0500 summary: Issue #19544 and Issue #1180: Restore global option to ignore ~/.pydistutils.cfg in Distutils, accidentally removed in backout of distutils2 changes. files: Doc/distutils/builtdist.rst | 3 +- Lib/distutils/core.py | 5 +- Lib/distutils/dist.py | 35 +++++++++++++++++-- Lib/distutils/tests/test_dist.py | 29 ++++++++++++++++ Misc/NEWS | 4 ++ 5 files changed, 68 insertions(+), 8 deletions(-) diff --git a/Doc/distutils/builtdist.rst b/Doc/distutils/builtdist.rst --- a/Doc/distutils/builtdist.rst +++ b/Doc/distutils/builtdist.rst @@ -239,7 +239,8 @@ configuration file, :file:`setup.cfg`\ ---see section :ref:`setup-config`. If you distribute or package many Python module distributions, you might want to put options that apply to all of them in your personal Distutils configuration -file (:file:`~/.pydistutils.cfg`). +file (:file:`~/.pydistutils.cfg`). If you want to temporarily disable +this file, you can pass the :option:`--no-user-cfg` option to :file:`setup.py`. There are three steps to building a binary RPM package, all of which are handled automatically by the Distutils: diff --git a/Lib/distutils/core.py b/Lib/distutils/core.py --- a/Lib/distutils/core.py +++ b/Lib/distutils/core.py @@ -128,8 +128,9 @@ if _setup_stop_after == "config": return dist - # Parse the command line; any command-line errors are the end user's - # fault, so turn them into SystemExit to suppress tracebacks. 
+ # Parse the command line and override config files; any + # command-line errors are the end user's fault, so turn them into + # SystemExit to suppress tracebacks. try: ok = dist.parse_command_line() except DistutilsArgError as msg: diff --git a/Lib/distutils/dist.py b/Lib/distutils/dist.py --- a/Lib/distutils/dist.py +++ b/Lib/distutils/dist.py @@ -52,7 +52,9 @@ ('quiet', 'q', "run quietly (turns verbosity off)"), ('dry-run', 'n', "don't actually do anything"), ('help', 'h', "show detailed help message"), - ] + ('no-user-cfg', None, + 'ignore pydistutils.cfg in your home directory'), + ] # 'common_usage' is a short (2-3 line) string describing the common # usage of the setup script. @@ -259,6 +261,22 @@ else: sys.stderr.write(msg + "\n") + # no-user-cfg is handled before other command line args + # because other args override the config files, and this + # one is needed before we can load the config files. + # If attrs['script_args'] wasn't passed, assume false. + # + # This also make sure we just look at the global options + self.want_user_cfg = True + + if self.script_args is not None: + for arg in self.script_args: + if not arg.startswith('-'): + break + if arg == '--no-user-cfg': + self.want_user_cfg = False + break + self.finalize_options() def get_option_dict(self, command): @@ -310,7 +328,10 @@ Distutils installation directory (ie. where the top-level Distutils __inst__.py file lives), a file in the user's home directory named .pydistutils.cfg on Unix and pydistutils.cfg - on Windows/Mac, and setup.cfg in the current directory. + on Windows/Mac; and setup.cfg in the current directory. + + The file in the user's home directory can be disabled with the + --no-user-cfg option. """ files = [] check_environ() @@ -330,15 +351,19 @@ user_filename = "pydistutils.cfg" # And look for the user config file - user_file = os.path.join(os.path.expanduser('~'), user_filename) - if os.path.isfile(user_file): - files.append(user_file) + if self.want_user_cfg: + user_file = os.path.join(os.path.expanduser('~'), user_filename) + if os.path.isfile(user_file): + files.append(user_file) # All platforms support local setup.cfg local_file = "setup.cfg" if os.path.isfile(local_file): files.append(local_file) + if DEBUG: + self.announce("using config files: %s" % ', '.join(files)) + return files def parse_config_files(self, filenames=None): diff --git a/Lib/distutils/tests/test_dist.py b/Lib/distutils/tests/test_dist.py --- a/Lib/distutils/tests/test_dist.py +++ b/Lib/distutils/tests/test_dist.py @@ -39,6 +39,7 @@ class DistributionTestCase(support.LoggingSilencer, + support.TempdirManager, support.EnvironGuard, unittest.TestCase): @@ -213,6 +214,34 @@ self.assertRaises(ValueError, dist.announce, args, kwargs) + def test_find_config_files_disable(self): + # Ticket #1180: Allow user to disable their home config file. 
+ temp_home = self.mkdtemp() + if os.name == 'posix': + user_filename = os.path.join(temp_home, ".pydistutils.cfg") + else: + user_filename = os.path.join(temp_home, "pydistutils.cfg") + + with open(user_filename, 'w') as f: + f.write('[distutils]\n') + + def _expander(path): + return temp_home + + old_expander = os.path.expanduser + os.path.expanduser = _expander + try: + d = Distribution() + all_files = d.find_config_files() + + d = Distribution(attrs={'script_args': ['--no-user-cfg']}) + files = d.find_config_files() + finally: + os.path.expanduser = old_expander + + # make sure --no-user-cfg disables the user cfg file + self.assertEquals(len(all_files)-1, len(files)) + class MetadataTestCase(support.TempdirManager, support.EnvironGuard, unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,6 +47,10 @@ Library ------- +- Issue #19544 and Issue #1180: Restore global option to ignore + ~/.pydistutils.cfg in Distutils, accidentally removed in backout of + distutils2 changes. + - Issue #19523: Closed FileHandler leak which occurred when delay was set. - Issue #19544 and #6516: Restore support for --user and --group parameters to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:24:15 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 16 Nov 2013 01:24:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_calculate=5Fpath=28=29_now?= =?utf-8?q?_fails_with_a_fatal_error_when_it_fails_to_allocate_memory?= Message-ID: <3dLxxq6vP8z7Llq@mail.python.org> http://hg.python.org/cpython/rev/255253286aa9 changeset: 87138:255253286aa9 user: Victor Stinner date: Sat Nov 16 01:22:04 2013 +0100 summary: calculate_path() now fails with a fatal error when it fails to allocate memory for module_search_path. It was already the case on _Py_char2wchar() failure. files: Modules/getpath.c | 97 +++++++++++++++------------------- 1 files changed, 44 insertions(+), 53 deletions(-) diff --git a/Modules/getpath.c b/Modules/getpath.c --- a/Modules/getpath.c +++ b/Modules/getpath.c @@ -134,7 +134,6 @@ static wchar_t exec_prefix[MAXPATHLEN+1]; static wchar_t progpath[MAXPATHLEN+1]; static wchar_t *module_search_path = NULL; -static int module_search_path_malloced = 0; static void reduce(wchar_t *dir) @@ -740,60 +739,55 @@ bufsz += wcslen(zip_path) + 1; bufsz += wcslen(exec_prefix) + 1; - buf = (wchar_t *)PyMem_Malloc(bufsz*sizeof(wchar_t)); + buf = (wchar_t *)PyMem_Malloc(bufsz * sizeof(wchar_t)); + if (buf == NULL) { + Py_FatalError( + "Not enough memory for dynamic PYTHONPATH"); + } - if (buf == NULL) { - /* We can't exit, so print a warning and limp along */ - fprintf(stderr, "Not enough memory for dynamic PYTHONPATH.\n"); - fprintf(stderr, "Using default static PYTHONPATH.\n"); - module_search_path = L"" PYTHONPATH; + /* Run-time value of $PYTHONPATH goes first */ + if (rtpypath) { + wcscpy(buf, rtpypath); + wcscat(buf, delimiter); } - else { - /* Run-time value of $PYTHONPATH goes first */ - if (rtpypath) { - wcscpy(buf, rtpypath); - wcscat(buf, delimiter); + else + buf[0] = '\0'; + + /* Next is the default zip path */ + wcscat(buf, zip_path); + wcscat(buf, delimiter); + + /* Next goes merge of compile-time $PYTHONPATH with + * dynamically located prefix. 
+ */ + defpath = _pythonpath; + while (1) { + wchar_t *delim = wcschr(defpath, DELIM); + + if (defpath[0] != SEP) { + wcscat(buf, prefix); + wcscat(buf, separator); } - else - buf[0] = '\0'; - /* Next is the default zip path */ - wcscat(buf, zip_path); - wcscat(buf, delimiter); + if (delim) { + size_t len = delim - defpath + 1; + size_t end = wcslen(buf) + len; + wcsncat(buf, defpath, len); + *(buf + end) = '\0'; + } + else { + wcscat(buf, defpath); + break; + } + defpath = delim + 1; + } + wcscat(buf, delimiter); - /* Next goes merge of compile-time $PYTHONPATH with - * dynamically located prefix. - */ - defpath = _pythonpath; - while (1) { - wchar_t *delim = wcschr(defpath, DELIM); + /* Finally, on goes the directory for dynamic-load modules */ + wcscat(buf, exec_prefix); - if (defpath[0] != SEP) { - wcscat(buf, prefix); - wcscat(buf, separator); - } - - if (delim) { - size_t len = delim - defpath + 1; - size_t end = wcslen(buf) + len; - wcsncat(buf, defpath, len); - *(buf + end) = '\0'; - } - else { - wcscat(buf, defpath); - break; - } - defpath = delim + 1; - } - wcscat(buf, delimiter); - - /* Finally, on goes the directory for dynamic-load modules */ - wcscat(buf, exec_prefix); - - /* And publish the results */ - module_search_path = buf; - module_search_path_malloced = 1; - } + /* And publish the results */ + module_search_path = buf; /* Reduce prefix and exec_prefix to their essence, * e.g. /usr/local/lib/python1.5 is reduced to /usr/local. @@ -834,10 +828,8 @@ Py_SetPath(const wchar_t *path) { if (module_search_path != NULL) { - if (module_search_path_malloced) - PyMem_RawFree(module_search_path); + PyMem_RawFree(module_search_path); module_search_path = NULL; - module_search_path_malloced = 0; } if (path != NULL) { extern wchar_t *Py_GetProgramName(void); @@ -845,7 +837,6 @@ wcsncpy(progpath, prog, MAXPATHLEN); exec_prefix[0] = prefix[0] = L'\0'; module_search_path = PyMem_RawMalloc((wcslen(path) + 1) * sizeof(wchar_t)); - module_search_path_malloced = 1; if (module_search_path != NULL) wcscpy(module_search_path, path); } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:29:23 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 01:29:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=237408=3A_Forward_p?= =?utf-8?q?ort_limited_test_from_Python_2=2E7=2C_fixing_failing_buildbot?= Message-ID: <3dLy3l3571z7LpM@mail.python.org> http://hg.python.org/cpython/rev/f4b364617abc changeset: 87139:f4b364617abc user: Jason R. Coombs date: Fri Nov 15 19:24:07 2013 -0500 summary: Issue #7408: Forward port limited test from Python 2.7, fixing failing buildbot tests on BSD-based platforms. 
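Stepping back to the calculate_path() rework in the getpath.c patch a few messages above: the new code measures every component first, allocates the search-path buffer once, aborts with a fatal error if that single allocation fails, and only then concatenates. Below is a standalone sketch of that measure-allocate-concatenate pattern with invented inputs; the real code joins $PYTHONPATH, the zip path, prefix and exec_prefix, and calls Py_FatalError() rather than exit().

/* Sketch: build a delimiter-joined wide string into one up-front allocation. */
#include <stdio.h>
#include <stdlib.h>
#include <wchar.h>

static wchar_t *
join_paths(const wchar_t **parts, size_t nparts, wchar_t delim)
{
    size_t i, bufsz = 1;                 /* room for the trailing L'\0' */
    wchar_t *buf;
    wchar_t sep[2];

    sep[0] = delim;
    sep[1] = L'\0';
    for (i = 0; i < nparts; i++)
        bufsz += wcslen(parts[i]) + 1;   /* each part plus a delimiter */

    buf = malloc(bufsz * sizeof(wchar_t));
    if (buf == NULL) {
        fputs("not enough memory for the module search path\n", stderr);
        exit(1);                         /* the patch uses Py_FatalError() */
    }
    buf[0] = L'\0';
    for (i = 0; i < nparts; i++) {
        if (i > 0)
            wcscat(buf, sep);
        wcscat(buf, parts[i]);
    }
    return buf;                          /* caller frees */
}

int
main(void)
{
    const wchar_t *parts[] = { L"/usr/local/lib/python34.zip",
                               L"/usr/local/lib/python3.4" };
    wchar_t *path = join_paths(parts, sizeof(parts) / sizeof(parts[0]), L':');
    wprintf(L"%ls\n", path);
    free(path);
    return 0;
}

Sizing the buffer up front keeps the later wcscat() calls trivially in bounds, which is why the patch can drop the old "limp along with the static default path" fallback.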
files: Lib/distutils/tests/test_sdist.py | 5 ++++- 1 files changed, 4 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/tests/test_sdist.py b/Lib/distutils/tests/test_sdist.py --- a/Lib/distutils/tests/test_sdist.py +++ b/Lib/distutils/tests/test_sdist.py @@ -471,10 +471,13 @@ # making sure we have the good rights archive_name = join(self.tmp_dir, 'dist', 'fake-1.0.tar.gz') archive = tarfile.open(archive_name) + + # note that we are not testing the group ownership here + # because, depending on the platforms and the container + # rights (see #7408) try: for member in archive.getmembers(): self.assertEquals(member.uid, os.getuid()) - self.assertEquals(member.gid, os.getgid()) finally: archive.close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:35:29 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 01:35:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Use_preferred_assertEqual_?= =?utf-8?q?form=2E_Correct_indentation=2E?= Message-ID: <3dLyBn04DHz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/a20b6867d60f changeset: 87140:a20b6867d60f user: Jason R. Coombs date: Fri Nov 15 19:35:05 2013 -0500 summary: Use preferred assertEqual form. Correct indentation. files: Lib/distutils/tests/test_sdist.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/distutils/tests/test_sdist.py b/Lib/distutils/tests/test_sdist.py --- a/Lib/distutils/tests/test_sdist.py +++ b/Lib/distutils/tests/test_sdist.py @@ -437,7 +437,7 @@ # check if tar and gzip are installed if (find_executable('tar') is None or - find_executable('gzip') is None): + find_executable('gzip') is None): return # now building a sdist @@ -455,8 +455,8 @@ archive = tarfile.open(archive_name) try: for member in archive.getmembers(): - self.assertEquals(member.uid, 0) - self.assertEquals(member.gid, 0) + self.assertEqual(member.uid, 0) + self.assertEqual(member.gid, 0) finally: archive.close() @@ -477,7 +477,7 @@ # rights (see #7408) try: for member in archive.getmembers(): - self.assertEquals(member.uid, os.getuid()) + self.assertEqual(member.uid, os.getuid()) finally: archive.close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:40:26 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 01:40:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Use_preferred_assertEqual?= Message-ID: <3dLyJV20QNz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/6bb96bfd897e changeset: 87141:6bb96bfd897e user: Jason R. 
Coombs date: Fri Nov 15 19:38:51 2013 -0500 summary: Use preferred assertEqual files: Lib/distutils/tests/test_dist.py | 18 +++++++++--------- 1 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Lib/distutils/tests/test_dist.py b/Lib/distutils/tests/test_dist.py --- a/Lib/distutils/tests/test_dist.py +++ b/Lib/distutils/tests/test_dist.py @@ -240,7 +240,7 @@ os.path.expanduser = old_expander # make sure --no-user-cfg disables the user cfg file - self.assertEquals(len(all_files)-1, len(files)) + self.assertEqual(len(all_files)-1, len(files)) class MetadataTestCase(support.TempdirManager, support.EnvironGuard, unittest.TestCase): @@ -435,14 +435,14 @@ PKG_INFO.seek(0) metadata.read_pkg_file(PKG_INFO) - self.assertEquals(metadata.name, "package") - self.assertEquals(metadata.version, "1.0") - self.assertEquals(metadata.description, "xxx") - self.assertEquals(metadata.download_url, 'http://example.com') - self.assertEquals(metadata.keywords, ['one', 'two']) - self.assertEquals(metadata.platforms, ['UNKNOWN']) - self.assertEquals(metadata.obsoletes, None) - self.assertEquals(metadata.requires, ['foo']) + self.assertEqual(metadata.name, "package") + self.assertEqual(metadata.version, "1.0") + self.assertEqual(metadata.description, "xxx") + self.assertEqual(metadata.download_url, 'http://example.com') + self.assertEqual(metadata.keywords, ['one', 'two']) + self.assertEqual(metadata.platforms, ['UNKNOWN']) + self.assertEqual(metadata.obsoletes, None) + self.assertEqual(metadata.requires, ['foo']) def test_suite(): suite = unittest.TestSuite() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:42:20 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 01:42:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_more_usage_of_asser?= =?utf-8?q?tEqual?= Message-ID: <3dLyLh0dRMz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/039d6ace3d28 changeset: 87142:039d6ace3d28 user: Jason R. 
Coombs date: Fri Nov 15 19:41:57 2013 -0500 summary: Update more usage of assertEqual files: Lib/distutils/tests/test_archive_util.py | 4 ++-- Lib/distutils/tests/test_upload.py | 6 +++--- Lib/test/test_strftime.py | 6 +++--- 3 files changed, 8 insertions(+), 8 deletions(-) diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -329,8 +329,8 @@ archive = tarfile.open(archive_name) try: for member in archive.getmembers(): - self.assertEquals(member.uid, 0) - self.assertEquals(member.gid, 0) + self.assertEqual(member.uid, 0) + self.assertEqual(member.gid, 0) finally: archive.close() diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -115,9 +115,9 @@ headers = dict(self.last_open.req.headers) self.assertEqual(headers['Content-length'], '2087') self.assert_(headers['Content-type'].startswith('multipart/form-data')) - self.assertEquals(self.last_open.req.get_method(), 'POST') - self.assertEquals(self.last_open.req.get_full_url(), - 'https://pypi.python.org/pypi') + self.assertEqual(self.last_open.req.get_method(), 'POST') + expected_url = 'https://pypi.python.org/pypi' + self.assertEqual(self.last_open.req.get_full_url(), expected_url) self.assert_(b'xxx' in self.last_open.req.data) def test_suite(): diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -189,15 +189,15 @@ @unittest.skipIf(sys.platform == "win32", "Doesn't apply on Windows") def test_y_before_1900_nonwin(self): - self.assertEquals( + self.assertEqual( time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)), "99") def test_y_1900(self): - self.assertEquals( + self.assertEqual( time.strftime("%y", (1900, 1, 1, 0, 0, 0, 0, 0, 0)), "00") def test_y_after_1900(self): - self.assertEquals( + self.assertEqual( time.strftime("%y", (2013, 1, 1, 0, 0, 0, 0, 0, 0)), "13") if __name__ == '__main__': -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 01:47:47 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 16 Nov 2013 01:47:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Replace_connection=5Frefused?= =?utf-8?b?KCkgd2l0aCBlcnJvcl9yZWNlaXZlZCgpLg==?= Message-ID: <3dLySz18Cxz7Ljd@mail.python.org> http://hg.python.org/peps/rev/d5db4dc4c4e8 changeset: 5273:d5db4dc4c4e8 user: Guido van Rossum date: Fri Nov 15 16:47:42 2013 -0800 summary: Replace connection_refused() with error_received(). files: pep-3156.txt | 25 +++++++++---------------- 1 files changed, 9 insertions(+), 16 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1259,15 +1259,11 @@ Datagram transports call the following methods on the associated protocol object: ``connection_made()``, ``connection_lost()``, -``connection_refused()`` and ``datagram_received()``. ("Connection" +``error_received()`` and ``datagram_received()``. ("Connection" in these method names is a slight misnomer, but the concepts still exist: ``connection_made()`` means the transport representing the endpoint has been created, and ``connection_lost()`` means the -transport is closed. The ``connection_refused()`` method is called -before ``connection_lost()`` when ``remote_addr`` was given and an -explicit negative acknowledgement was received (this is a UDP -feature). 
(TBD: Should fix `connection_refused()`` to not close the -transport.) +transport is closed.) Subprocess Transports ''''''''''''''''''''' @@ -1390,20 +1386,17 @@ ``data`` (a bytes objects) was received from remote address ``addr`` (an IPv4 2-tuple or an IPv6 4-tuple). -- ``connection_refused(exc)``. Indicates that a send or receive - operation raised a ``ConnectionRefused`` exception. This typically - indicates that a negative acknowledgment was received for a - previously sent datagram (not for the datagram that was being sent, - if the exception was raised by a send operation). Immediately after - this the socket will be closed and ``connection_lost()`` will be - called with the same exception argument. +- ``error_received(exc)``. Indicates that a send or receive operation + raised an ``OSError`` exception. Since datagram errors may be + transient, it is up to the protocol to call the transport's + ``close()`` method if it wants to close the endpoint. Here is a chart indicating the order and multiplicity of calls: 1. ``connection_made()`` -- exactly once - 2. ``datagram_received()`` -- zero or more times - 3. ``connection_refused()`` -- at most once - 4. ``connection_lost()`` -- exactly once + 2. ``datagram_received()``, ``error_received()`` -- zero or more times + 3. ``connection_lost()`` -- exactly once + Subprocess Protocol ''''''''''''''''''' -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 01:51:54 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 16 Nov 2013 01:51:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Replace_connect?= =?utf-8?b?aW9uX3JlZnVzZWQoKSB3aXRoIGVycm9yX3JlY2VpdmVkKCku?= Message-ID: <3dLyYk4z4xz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/2248b9fbab39 changeset: 87143:2248b9fbab39 user: Guido van Rossum date: Fri Nov 15 16:51:48 2013 -0800 summary: asyncio: Replace connection_refused() with error_received(). files: Lib/asyncio/protocols.py | 12 ++- Lib/asyncio/selector_events.py | 17 +--- Lib/test/test_asyncio/test_base_events.py | 2 +- Lib/test/test_asyncio/test_events.py | 4 +- Lib/test/test_asyncio/test_selector_events.py | 33 ++++++--- 5 files changed, 39 insertions(+), 29 deletions(-) diff --git a/Lib/asyncio/protocols.py b/Lib/asyncio/protocols.py --- a/Lib/asyncio/protocols.py +++ b/Lib/asyncio/protocols.py @@ -100,15 +100,18 @@ def datagram_received(self, data, addr): """Called when some datagram is received.""" - def connection_refused(self, exc): - """Connection is refused.""" + def error_received(self, exc): + """Called when a send or receive operation raises an OSError. + + (Other than BlockingIOError or InterruptedError.) + """ class SubprocessProtocol(BaseProtocol): """ABC representing a protocol for subprocess calls.""" def pipe_data_received(self, fd, data): - """Called when subprocess write a data into stdout/stderr pipes. + """Called when the subprocess writes data into stdout/stderr pipe. fd is int file dascriptor. data is bytes object. @@ -122,5 +125,4 @@ """ def process_exited(self): - """Called when subprocess has exited. 
- """ + """Called when subprocess has exited.""" diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -771,6 +771,8 @@ data, addr = self._sock.recvfrom(self.max_size) except (BlockingIOError, InterruptedError): pass + except OSError as exc: + self._protocol.error_received(exc) except Exception as exc: self._fatal_error(exc) else: @@ -800,9 +802,8 @@ return except (BlockingIOError, InterruptedError): self._loop.add_writer(self._sock_fd, self._sendto_ready) - except ConnectionRefusedError as exc: - if self._address: - self._fatal_error(exc) + except OSError as exc: + self._protocol.error_received(exc) return except Exception as exc: self._fatal_error(exc) @@ -822,9 +823,8 @@ except (BlockingIOError, InterruptedError): self._buffer.appendleft((data, addr)) # Try again later. break - except ConnectionRefusedError as exc: - if self._address: - self._fatal_error(exc) + except OSError as exc: + self._protocol.error_received(exc) return except Exception as exc: self._fatal_error(exc) @@ -835,8 +835,3 @@ self._loop.remove_writer(self._sock_fd) if self._closing: self._call_connection_lost(None) - - def _force_close(self, exc): - if self._address and isinstance(exc, ConnectionRefusedError): - self._protocol.connection_refused(exc) - super()._force_close(exc) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -284,7 +284,7 @@ assert self.state == 'INITIALIZED', self.state self.nbytes += len(data) - def connection_refused(self, exc): + def error_received(self, exc): assert self.state == 'INITIALIZED', self.state def connection_lost(self, exc): diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -78,7 +78,7 @@ assert self.state == 'INITIALIZED', self.state self.nbytes += len(data) - def connection_refused(self, exc): + def error_received(self, exc): assert self.state == 'INITIALIZED', self.state def connection_lost(self, exc): @@ -1557,7 +1557,7 @@ dp = protocols.DatagramProtocol() self.assertIsNone(dp.connection_made(f)) self.assertIsNone(dp.connection_lost(f)) - self.assertIsNone(dp.connection_refused(f)) + self.assertIsNone(dp.error_received(f)) self.assertIsNone(dp.datagram_received(f, f)) sp = protocols.SubprocessProtocol() diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -1329,11 +1329,22 @@ transport = _SelectorDatagramTransport( self.loop, self.sock, self.protocol) + err = self.sock.recvfrom.side_effect = RuntimeError() + transport._fatal_error = unittest.mock.Mock() + transport._read_ready() + + transport._fatal_error.assert_called_with(err) + + def test_read_ready_oserr(self): + transport = _SelectorDatagramTransport( + self.loop, self.sock, self.protocol) + err = self.sock.recvfrom.side_effect = OSError() transport._fatal_error = unittest.mock.Mock() transport._read_ready() - transport._fatal_error.assert_called_with(err) + self.assertFalse(transport._fatal_error.called) + self.protocol.error_received.assert_called_with(err) def test_sendto(self): data = b'data' @@ -1380,7 +1391,7 @@ @unittest.mock.patch('asyncio.selector_events.logger') def 
test_sendto_exception(self, m_log): data = b'data' - err = self.sock.sendto.side_effect = OSError() + err = self.sock.sendto.side_effect = RuntimeError() transport = _SelectorDatagramTransport( self.loop, self.sock, self.protocol) @@ -1399,7 +1410,7 @@ transport.sendto(data) m_log.warning.assert_called_with('socket.send() raised exception.') - def test_sendto_connection_refused(self): + def test_sendto_error_received(self): data = b'data' self.sock.sendto.side_effect = ConnectionRefusedError @@ -1412,7 +1423,7 @@ self.assertEqual(transport._conn_lost, 0) self.assertFalse(transport._fatal_error.called) - def test_sendto_connection_refused_connected(self): + def test_sendto_error_received_connected(self): data = b'data' self.sock.send.side_effect = ConnectionRefusedError @@ -1422,7 +1433,8 @@ transport._fatal_error = unittest.mock.Mock() transport.sendto(data) - self.assertTrue(transport._fatal_error.called) + self.assertFalse(transport._fatal_error.called) + self.assertTrue(self.protocol.error_received.called) def test_sendto_str(self): transport = _SelectorDatagramTransport( @@ -1495,7 +1507,7 @@ list(transport._buffer)) def test_sendto_ready_exception(self): - err = self.sock.sendto.side_effect = OSError() + err = self.sock.sendto.side_effect = RuntimeError() transport = _SelectorDatagramTransport( self.loop, self.sock, self.protocol) @@ -1505,7 +1517,7 @@ transport._fatal_error.assert_called_with(err) - def test_sendto_ready_connection_refused(self): + def test_sendto_ready_error_received(self): self.sock.sendto.side_effect = ConnectionRefusedError transport = _SelectorDatagramTransport( @@ -1516,7 +1528,7 @@ self.assertFalse(transport._fatal_error.called) - def test_sendto_ready_connection_refused_connection(self): + def test_sendto_ready_error_received_connection(self): self.sock.send.side_effect = ConnectionRefusedError transport = _SelectorDatagramTransport( @@ -1525,7 +1537,8 @@ transport._buffer.append((b'data', ())) transport._sendto_ready() - self.assertTrue(transport._fatal_error.called) + self.assertFalse(transport._fatal_error.called) + self.assertTrue(self.protocol.error_received.called) @unittest.mock.patch('asyncio.log.logger.exception') def test_fatal_error_connected(self, m_exc): @@ -1533,7 +1546,7 @@ self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) err = ConnectionRefusedError() transport._fatal_error(err) - self.protocol.connection_refused.assert_called_with(err) + self.assertFalse(self.protocol.error_received.called) m_exc.assert_called_with('Fatal error for %s', transport) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 02:09:19 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 02:09:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzEyODUz?= =?utf-8?q?=3A_Correct_NameError_in_distutils_upload_command=2E?= Message-ID: <3dLyxq2w7jz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/aa3a7d5e0478 changeset: 87144:aa3a7d5e0478 branch: 2.7 parent: 87123:e9d9bebb979f user: Jason R. Coombs date: Fri Nov 15 20:08:22 2013 -0500 summary: Issue #12853: Correct NameError in distutils upload command. 
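The patch below is a one-name fix: the show_response branch of the upload command read from an undefined variable r instead of the result object returned by urlopen, raising NameError whenever the server response was to be shown. A simplified sketch of the corrected pattern (the helper name and parameter are illustrative, not distutils API):

    def format_server_response(result):
        # 'result' is the response object that actually exists in scope;
        # the broken code called r.read() on a name that was never bound.
        return '\n'.join(('-' * 75, result.read(), '-' * 75))
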
files: Lib/distutils/command/upload.py | 2 +- Misc/NEWS | 2 ++ 2 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/command/upload.py b/Lib/distutils/command/upload.py --- a/Lib/distutils/command/upload.py +++ b/Lib/distutils/command/upload.py @@ -177,7 +177,7 @@ status = result.getcode() reason = result.msg if self.show_response: - msg = '\n'.join(('-' * 75, r.read(), '-' * 75)) + msg = '\n'.join(('-' * 75, result.read(), '-' * 75)) self.announce(msg, log.INFO) except socket.error, e: self.announce(str(e), log.ERROR) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,8 @@ Library ------- +- Issue #12853: Fix NameError in distutils.command.upload. + - Issue #19523: Closed FileHandler leak which occurred when delay was set. - Issue #1575020: Fixed support of 24-bit wave files on big-endian platforms. -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sat Nov 16 07:39:18 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 16 Nov 2013 07:39:18 +0100 Subject: [Python-checkins] Daily reference leaks (2248b9fbab39): sum=4 Message-ID: results for 2248b9fbab39 on branch "default" -------------------------------------------- test_site leaked [0, 0, 2] references, sum=2 test_site leaked [0, 0, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogUfJrsp', '-x'] From python-checkins at python.org Sat Nov 16 11:57:05 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 11:57:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTkw?= =?utf-8?q?=3A_Use_specific_asserts_in_email_tests=2E?= Message-ID: <3dMD016SK5zNk8@mail.python.org> http://hg.python.org/cpython/rev/27567e954cbe changeset: 87145:27567e954cbe branch: 2.7 user: Serhiy Storchaka date: Sat Nov 16 12:56:05 2013 +0200 summary: Issue #19590: Use specific asserts in email tests. 
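Before the large diff below, a short illustration of the pattern this changeset applies throughout the email tests; the test case is a made-up example, not part of the patch. The specific unittest asserts report the compared values on failure, whereas a bare assertTrue(...) only reports that False is not true:

    import unittest

    class SpecificAssertExample(unittest.TestCase):
        def test_specific_asserts(self):
            headers = {'from': 'Me', 'to': 'You'}
            payload = 'hello there'
            # assertIn/assertIsInstance/assertEqual include the offending
            # values in their failure messages, which assertTrue cannot do.
            self.assertIn('from', headers)
            self.assertIsInstance(payload, str)
            self.assertNotIsInstance(payload, tuple)
            self.assertIs(headers.get('missing'), None)
            self.assertEqual(len(payload), 11)
            self.assertLessEqual(len(payload), 76)

    if __name__ == '__main__':
        unittest.main()
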
files: Lib/email/test/test_email.py | 152 ++++++-------- Lib/email/test/test_email_renamed.py | 140 ++++++------- 2 files changed, 135 insertions(+), 157 deletions(-) diff --git a/Lib/email/test/test_email.py b/Lib/email/test/test_email.py --- a/Lib/email/test/test_email.py +++ b/Lib/email/test/test_email.py @@ -267,12 +267,12 @@ msg['From'] = 'Me' msg['to'] = 'You' # Check for case insensitivity - self.assertTrue('from' in msg) - self.assertTrue('From' in msg) - self.assertTrue('FROM' in msg) - self.assertTrue('to' in msg) - self.assertTrue('To' in msg) - self.assertTrue('TO' in msg) + self.assertIn('from', msg) + self.assertIn('From', msg) + self.assertIn('FROM', msg) + self.assertIn('to', msg) + self.assertIn('To', msg) + self.assertIn('TO', msg) def test_as_string(self): eq = self.assertEqual @@ -1002,7 +1002,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._au.add_header('Content-Disposition', 'attachment', filename='audiotest.au') eq(self._au['content-disposition'], @@ -1013,12 +1012,12 @@ 'audiotest.au') missing = [] eq(self._au.get_param('attachment', header='content-disposition'), '') - unless(self._au.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._au.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._au.get_param('foobar', missing) is missing) - unless(self._au.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._au.get_param('foobar', missing), missing) + self.assertIs(self._au.get_param('attachment', missing, + header='foobar'), missing) @@ -1045,7 +1044,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._im.add_header('Content-Disposition', 'attachment', filename='dingusfish.gif') eq(self._im['content-disposition'], @@ -1056,12 +1054,12 @@ 'dingusfish.gif') missing = [] eq(self._im.get_param('attachment', header='content-disposition'), '') - unless(self._im.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._im.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._im.get_param('foobar', missing) is missing) - unless(self._im.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._im.get_param('foobar', missing), missing) + self.assertIs(self._im.get_param('attachment', missing, + header='foobar'), missing) @@ -1072,17 +1070,16 @@ def test_types(self): eq = self.assertEqual - unless = self.assertTrue eq(self._msg.get_content_type(), 'text/plain') eq(self._msg.get_param('charset'), 'us-ascii') missing = [] - unless(self._msg.get_param('foobar', missing) is missing) - unless(self._msg.get_param('charset', missing, header='foobar') - is missing) + self.assertIs(self._msg.get_param('foobar', missing), missing) + self.assertIs(self._msg.get_param('charset', missing, header='foobar'), + missing) def test_payload(self): self.assertEqual(self._msg.get_payload(), 'hello there') - self.assertTrue(not self._msg.is_multipart()) + self.assertFalse(self._msg.is_multipart()) def test_charset(self): eq = self.assertEqual @@ -1101,7 +1098,7 @@ msg = MIMEText(u'hello there') eq(msg.get_charset(), 'us-ascii') eq(msg['content-type'], 'text/plain; charset="us-ascii"') - self.assertTrue('hello there' in msg.as_string()) + self.assertIn('hello there', msg.as_string()) def test_8bit_unicode_input(self): teststr = 
u'\u043a\u0438\u0440\u0438\u043b\u0438\u0446\u0430' @@ -1162,21 +1159,20 @@ def test_hierarchy(self): # convenience eq = self.assertEqual - unless = self.assertTrue raises = self.assertRaises # tests m = self._msg - unless(m.is_multipart()) + self.assertTrue(m.is_multipart()) eq(m.get_content_type(), 'multipart/mixed') eq(len(m.get_payload()), 2) raises(IndexError, m.get_payload, 2) m0 = m.get_payload(0) m1 = m.get_payload(1) - unless(m0 is self._txt) - unless(m1 is self._im) + self.assertIs(m0, self._txt) + self.assertIs(m1, self._im) eq(m.get_payload(), [m0, m1]) - unless(not m0.is_multipart()) - unless(not m1.is_multipart()) + self.assertFalse(m0.is_multipart()) + self.assertFalse(m1.is_multipart()) def test_empty_multipart_idempotent(self): text = """\ @@ -1506,23 +1502,22 @@ eq(msg.get_content_subtype(), 'plain') def test_same_boundary_inner_outer(self): - unless = self.assertTrue msg = self._msgobj('msg_15.txt') # XXX We can probably eventually do better inner = msg.get_payload(0) - unless(hasattr(inner, 'defects')) + self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(inner.defects), 1) - unless(isinstance(inner.defects[0], - Errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(inner.defects[0], + Errors.StartBoundaryNotFoundDefect) def test_multipart_no_boundary(self): - unless = self.assertTrue msg = self._msgobj('msg_25.txt') - unless(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], Errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - Errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + Errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + Errors.MultipartInvariantViolationDefect) def test_invalid_content_type(self): eq = self.assertEqual @@ -1574,13 +1569,13 @@ """) def test_lying_multipart(self): - unless = self.assertTrue msg = self._msgobj('msg_41.txt') - unless(hasattr(msg, 'defects')) + self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], Errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - Errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + Errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + Errors.MultipartInvariantViolationDefect) def test_missing_start_boundary(self): outer = self._msgobj('msg_42.txt') @@ -1594,8 +1589,8 @@ # [*] This message is missing its start boundary bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(bad.defects), 1) - self.assertTrue(isinstance(bad.defects[0], - Errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(bad.defects[0], + Errors.StartBoundaryNotFoundDefect) def test_first_line_is_continuation_header(self): eq = self.assertEqual @@ -1604,8 +1599,8 @@ eq(msg.keys(), []) eq(msg.get_payload(), 'Line 2\nLine 3') eq(len(msg.defects), 1) - self.assertTrue(isinstance(msg.defects[0], - Errors.FirstHeaderLineIsContinuationDefect)) + self.assertIsInstance(msg.defects[0], + Errors.FirstHeaderLineIsContinuationDefect) eq(msg.defects[0].line, ' Line 1\n') @@ -1687,17 +1682,16 @@ def test_valid_argument(self): eq = self.assertEqual - unless = self.assertTrue subject = 'A sub-message' m = Message() m['Subject'] = subject r = MIMEMessage(m) eq(r.get_content_type(), 'message/rfc822') payload = r.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) 
eq(len(payload), 1) subpart = payload[0] - unless(subpart is m) + self.assertIs(subpart, m) eq(subpart['subject'], subject) def test_bad_multipart(self): @@ -1731,24 +1725,22 @@ def test_parse_message_rfc822(self): eq = self.assertEqual - unless = self.assertTrue msg = self._msgobj('msg_11.txt') eq(msg.get_content_type(), 'message/rfc822') payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) submsg = payload[0] - self.assertTrue(isinstance(submsg, Message)) + self.assertIsInstance(submsg, Message) eq(submsg['subject'], 'An enclosed message') eq(submsg.get_payload(), 'Here is the body of the message.\n') def test_dsn(self): eq = self.assertEqual - unless = self.assertTrue # msg 16 is a Delivery Status Notification, see RFC 1894 msg = self._msgobj('msg_16.txt') eq(msg.get_content_type(), 'multipart/report') - unless(msg.is_multipart()) + self.assertTrue(msg.is_multipart()) eq(len(msg.get_payload()), 3) # Subpart 1 is a text/plain, human readable section subpart = msg.get_payload(0) @@ -1777,13 +1769,13 @@ # message/delivery-status should treat each block as a bunch of # headers, i.e. a bunch of Message objects. dsn1 = subpart.get_payload(0) - unless(isinstance(dsn1, Message)) + self.assertIsInstance(dsn1, Message) eq(dsn1['original-envelope-id'], '0GK500B4HD0888 at cougar.noc.ucla.edu') eq(dsn1.get_param('dns', header='reporting-mta'), '') # Try a missing one eq(dsn1.get_param('nsd', header='reporting-mta'), None) dsn2 = subpart.get_payload(1) - unless(isinstance(dsn2, Message)) + self.assertIsInstance(dsn2, Message) eq(dsn2['action'], 'failed') eq(dsn2.get_params(header='original-recipient'), [('rfc822', ''), ('jangel1 at cougar.noc.ucla.edu', '')]) @@ -1792,10 +1784,10 @@ subpart = msg.get_payload(2) eq(subpart.get_content_type(), 'message/rfc822') payload = subpart.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subsubpart = payload[0] - unless(isinstance(subsubpart, Message)) + self.assertIsInstance(subsubpart, Message) eq(subsubpart.get_content_type(), 'text/plain') eq(subsubpart['message-id'], '<002001c144a6$8752e060$56104586 at oxy.edu>') @@ -2094,7 +2086,6 @@ def test_content_type(self): eq = self.assertEqual - unless = self.assertTrue # Get a message object and reset the seek pointer for other tests msg, text = self._msgobj('msg_05.txt') eq(msg.get_content_type(), 'multipart/report') @@ -2116,29 +2107,28 @@ eq(msg2.get_payload(), 'Yadda yadda yadda\n') msg3 = msg.get_payload(2) eq(msg3.get_content_type(), 'message/rfc822') - self.assertTrue(isinstance(msg3, Message)) + self.assertIsInstance(msg3, Message) payload = msg3.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg4 = payload[0] - unless(isinstance(msg4, Message)) + self.assertIsInstance(msg4, Message) eq(msg4.get_payload(), 'Yadda yadda yadda\n') def test_parser(self): eq = self.assertEqual - unless = self.assertTrue msg, text = self._msgobj('msg_06.txt') # Check some of the outer headers eq(msg.get_content_type(), 'message/rfc822') # Make sure the payload is a list of exactly one sub-Message, and that # that submessage has a type of text/plain payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg1 = payload[0] - self.assertTrue(isinstance(msg1, Message)) + self.assertIsInstance(msg1, Message) eq(msg1.get_content_type(), 'text/plain') - 
self.assertTrue(isinstance(msg1.get_payload(), str)) + self.assertIsInstance(msg1.get_payload(), str) eq(msg1.get_payload(), '\n') @@ -2175,7 +2165,6 @@ fp.close() def test_message_from_string_with_class(self): - unless = self.assertTrue fp = openfile('msg_01.txt') try: text = fp.read() @@ -2186,7 +2175,7 @@ pass msg = email.message_from_string(text, MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated fp = openfile('msg_02.txt') try: @@ -2195,10 +2184,9 @@ fp.close() msg = email.message_from_string(text, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test_message_from_file_with_class(self): - unless = self.assertTrue # Create a subclass class MyMessage(Message): pass @@ -2208,7 +2196,7 @@ msg = email.message_from_file(fp, MyMessage) finally: fp.close() - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated fp = openfile('msg_02.txt') try: @@ -2216,7 +2204,7 @@ finally: fp.close() for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test__all__(self): module = __import__('email') @@ -2591,9 +2579,9 @@ break om.append(ol) n1 += 1 - self.assertTrue(n == n1) - self.assertTrue(len(om) == nt) - self.assertTrue(''.join([il for il, n in imt]) == ''.join(om)) + self.assertEqual(n, n1) + self.assertEqual(len(om), nt) + self.assertEqual(''.join([il for il, n in imt]), ''.join(om)) @@ -2610,7 +2598,7 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) def test_whitespace_continuation(self): eq = self.assertEqual @@ -3027,7 +3015,7 @@ h = Header("I am the very model of a modern Major-General; I've information vegetable, animal, and mineral; I know the kings of England, and I quote the fights historical from Marathon to Waterloo, in order categorical; I'm very well acquainted, too, with matters mathematical; I understand equations, both the simple and quadratical; about binomial theorem I'm teeming with a lot o' news, with many cheerful facts about the square of the hypotenuse.", maxlinelen=76) for l in h.encode(splitchars=' ').split('\n '): - self.assertTrue(len(l) <= 76) + self.assertLessEqual(len(l), 76) def test_multilingual(self): eq = self.ndiffAssertEqual @@ -3279,7 +3267,7 @@ ''' msg = email.message_from_string(m) param = msg.get_param('NAME') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual( param, 'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm') @@ -3432,7 +3420,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "Frank's Document") def test_rfc2231_tick_attack_extended(self): @@ -3456,7 +3444,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "us-ascii'en-us'Frank's Document") def test_rfc2231_no_extended_values(self): diff --git a/Lib/email/test/test_email_renamed.py b/Lib/email/test/test_email_renamed.py --- a/Lib/email/test/test_email_renamed.py +++ b/Lib/email/test/test_email_renamed.py @@ -231,12 +231,12 @@ 
msg['From'] = 'Me' msg['to'] = 'You' # Check for case insensitivity - self.assertTrue('from' in msg) - self.assertTrue('From' in msg) - self.assertTrue('FROM' in msg) - self.assertTrue('to' in msg) - self.assertTrue('To' in msg) - self.assertTrue('TO' in msg) + self.assertIn('from', msg) + self.assertIn('From', msg) + self.assertIn('FROM', msg) + self.assertIn('to', msg) + self.assertIn('To', msg) + self.assertIn('TO', msg) def test_as_string(self): eq = self.assertEqual @@ -916,7 +916,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._au.add_header('Content-Disposition', 'attachment', filename='audiotest.au') eq(self._au['content-disposition'], @@ -927,12 +926,13 @@ 'audiotest.au') missing = [] eq(self._au.get_param('attachment', header='content-disposition'), '') - unless(self._au.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._au.get_param('foo', failobj=missing, + header='content-disposition'), + missing) # Try some missing stuff - unless(self._au.get_param('foobar', missing) is missing) - unless(self._au.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._au.get_param('foobar', missing), missing) + self.assertIs(self._au.get_param('attachment', missing, + header='foobar'), missing) @@ -959,7 +959,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._im.add_header('Content-Disposition', 'attachment', filename='dingusfish.gif') eq(self._im['content-disposition'], @@ -970,12 +969,13 @@ 'dingusfish.gif') missing = [] eq(self._im.get_param('attachment', header='content-disposition'), '') - unless(self._im.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._im.get_param('foo', failobj=missing, + header='content-disposition'), + missing) # Try some missing stuff - unless(self._im.get_param('foobar', missing) is missing) - unless(self._im.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._im.get_param('foobar', missing), missing) + self.assertIs(self._im.get_param('attachment', missing, + header='foobar'), missing) @@ -1035,17 +1035,16 @@ def test_types(self): eq = self.assertEqual - unless = self.assertTrue eq(self._msg.get_content_type(), 'text/plain') eq(self._msg.get_param('charset'), 'us-ascii') missing = [] - unless(self._msg.get_param('foobar', missing) is missing) - unless(self._msg.get_param('charset', missing, header='foobar') - is missing) + self.assertIs(self._msg.get_param('foobar', missing), missing) + self.assertIs(self._msg.get_param('charset', missing, header='foobar'), + missing) def test_payload(self): self.assertEqual(self._msg.get_payload(), 'hello there') - self.assertTrue(not self._msg.is_multipart()) + self.assertFalse(self._msg.is_multipart()) def test_charset(self): eq = self.assertEqual @@ -1100,21 +1099,20 @@ def test_hierarchy(self): # convenience eq = self.assertEqual - unless = self.assertTrue raises = self.assertRaises # tests m = self._msg - unless(m.is_multipart()) + self.assertTrue(m.is_multipart()) eq(m.get_content_type(), 'multipart/mixed') eq(len(m.get_payload()), 2) raises(IndexError, m.get_payload, 2) m0 = m.get_payload(0) m1 = m.get_payload(1) - unless(m0 is self._txt) - unless(m1 is self._im) + self.assertIs(m0, self._txt) + self.assertIs(m1, self._im) eq(m.get_payload(), [m0, m1]) - unless(not m0.is_multipart()) - unless(not m1.is_multipart()) + self.assertFalse(m0.is_multipart()) + 
self.assertFalse(m1.is_multipart()) def test_empty_multipart_idempotent(self): text = """\ @@ -1444,23 +1442,22 @@ eq(msg.get_content_subtype(), 'plain') def test_same_boundary_inner_outer(self): - unless = self.assertTrue msg = self._msgobj('msg_15.txt') # XXX We can probably eventually do better inner = msg.get_payload(0) - unless(hasattr(inner, 'defects')) + self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(inner.defects), 1) - unless(isinstance(inner.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(inner.defects[0], + errors.StartBoundaryNotFoundDefect) def test_multipart_no_boundary(self): - unless = self.assertTrue msg = self._msgobj('msg_25.txt') - unless(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) def test_invalid_content_type(self): eq = self.assertEqual @@ -1512,13 +1509,13 @@ """) def test_lying_multipart(self): - unless = self.assertTrue msg = self._msgobj('msg_41.txt') - unless(hasattr(msg, 'defects')) + self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) def test_missing_start_boundary(self): outer = self._msgobj('msg_42.txt') @@ -1532,8 +1529,8 @@ # [*] This message is missing its start boundary bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(bad.defects), 1) - self.assertTrue(isinstance(bad.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(bad.defects[0], + errors.StartBoundaryNotFoundDefect) def test_first_line_is_continuation_header(self): eq = self.assertEqual @@ -1542,8 +1539,8 @@ eq(msg.keys(), []) eq(msg.get_payload(), 'Line 2\nLine 3') eq(len(msg.defects), 1) - self.assertTrue(isinstance(msg.defects[0], - errors.FirstHeaderLineIsContinuationDefect)) + self.assertIsInstance(msg.defects[0], + errors.FirstHeaderLineIsContinuationDefect) eq(msg.defects[0].line, ' Line 1\n') @@ -1609,17 +1606,16 @@ def test_valid_argument(self): eq = self.assertEqual - unless = self.assertTrue subject = 'A sub-message' m = Message() m['Subject'] = subject r = MIMEMessage(m) eq(r.get_content_type(), 'message/rfc822') payload = r.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subpart = payload[0] - unless(subpart is m) + self.assertIs(subpart, m) eq(subpart['subject'], subject) def test_bad_multipart(self): @@ -1653,24 +1649,22 @@ def test_parse_message_rfc822(self): eq = self.assertEqual - unless = self.assertTrue msg = self._msgobj('msg_11.txt') eq(msg.get_content_type(), 'message/rfc822') payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) submsg = payload[0] - self.assertTrue(isinstance(submsg, Message)) + self.assertIsInstance(submsg, Message) eq(submsg['subject'], 'An enclosed message') eq(submsg.get_payload(), 'Here is the body of the message.\n') def 
test_dsn(self): eq = self.assertEqual - unless = self.assertTrue # msg 16 is a Delivery Status Notification, see RFC 1894 msg = self._msgobj('msg_16.txt') eq(msg.get_content_type(), 'multipart/report') - unless(msg.is_multipart()) + self.assertTrue(msg.is_multipart()) eq(len(msg.get_payload()), 3) # Subpart 1 is a text/plain, human readable section subpart = msg.get_payload(0) @@ -1699,13 +1693,13 @@ # message/delivery-status should treat each block as a bunch of # headers, i.e. a bunch of Message objects. dsn1 = subpart.get_payload(0) - unless(isinstance(dsn1, Message)) + self.assertIsInstance(dsn1, Message) eq(dsn1['original-envelope-id'], '0GK500B4HD0888 at cougar.noc.ucla.edu') eq(dsn1.get_param('dns', header='reporting-mta'), '') # Try a missing one eq(dsn1.get_param('nsd', header='reporting-mta'), None) dsn2 = subpart.get_payload(1) - unless(isinstance(dsn2, Message)) + self.assertIsInstance(dsn2, Message) eq(dsn2['action'], 'failed') eq(dsn2.get_params(header='original-recipient'), [('rfc822', ''), ('jangel1 at cougar.noc.ucla.edu', '')]) @@ -1714,10 +1708,10 @@ subpart = msg.get_payload(2) eq(subpart.get_content_type(), 'message/rfc822') payload = subpart.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subsubpart = payload[0] - unless(isinstance(subsubpart, Message)) + self.assertIsInstance(subsubpart, Message) eq(subsubpart.get_content_type(), 'text/plain') eq(subsubpart['message-id'], '<002001c144a6$8752e060$56104586 at oxy.edu>') @@ -2013,7 +2007,6 @@ def test_content_type(self): eq = self.assertEqual - unless = self.assertTrue # Get a message object and reset the seek pointer for other tests msg, text = self._msgobj('msg_05.txt') eq(msg.get_content_type(), 'multipart/report') @@ -2035,29 +2028,28 @@ eq(msg2.get_payload(), 'Yadda yadda yadda\n') msg3 = msg.get_payload(2) eq(msg3.get_content_type(), 'message/rfc822') - self.assertTrue(isinstance(msg3, Message)) + self.assertIsInstance(msg3, Message) payload = msg3.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg4 = payload[0] - unless(isinstance(msg4, Message)) + self.assertIsInstance(msg4, Message) eq(msg4.get_payload(), 'Yadda yadda yadda\n') def test_parser(self): eq = self.assertEqual - unless = self.assertTrue msg, text = self._msgobj('msg_06.txt') # Check some of the outer headers eq(msg.get_content_type(), 'message/rfc822') # Make sure the payload is a list of exactly one sub-Message, and that # that submessage has a type of text/plain payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg1 = payload[0] - self.assertTrue(isinstance(msg1, Message)) + self.assertIsInstance(msg1, Message) eq(msg1.get_content_type(), 'text/plain') - self.assertTrue(isinstance(msg1.get_payload(), str)) + self.assertIsInstance(msg1.get_payload(), str) eq(msg1.get_payload(), '\n') @@ -2094,7 +2086,6 @@ fp.close() def test_message_from_string_with_class(self): - unless = self.assertTrue fp = openfile('msg_01.txt') try: text = fp.read() @@ -2105,7 +2096,7 @@ pass msg = email.message_from_string(text, MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated fp = openfile('msg_02.txt') try: @@ -2114,10 +2105,9 @@ fp.close() msg = email.message_from_string(text, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def 
test_message_from_file_with_class(self): - unless = self.assertTrue # Create a subclass class MyMessage(Message): pass @@ -2127,7 +2117,7 @@ msg = email.message_from_file(fp, MyMessage) finally: fp.close() - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated fp = openfile('msg_02.txt') try: @@ -2135,7 +2125,7 @@ finally: fp.close() for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test__all__(self): module = __import__('email') @@ -2460,7 +2450,7 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) def test_whitespace_continuation(self): eq = self.assertEqual @@ -2856,7 +2846,7 @@ h = Header("I am the very model of a modern Major-General; I've information vegetable, animal, and mineral; I know the kings of England, and I quote the fights historical from Marathon to Waterloo, in order categorical; I'm very well acquainted, too, with matters mathematical; I understand equations, both the simple and quadratical; about binomial theorem I'm teeming with a lot o' news, with many cheerful facts about the square of the hypotenuse.", maxlinelen=76) for l in h.encode(splitchars=' ').split('\n '): - self.assertTrue(len(l) <= 76) + self.assertLessEqual(len(l), 76) def test_multilingual(self): eq = self.ndiffAssertEqual -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 11:57:07 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 11:57:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTkw?= =?utf-8?q?=3A_Use_specific_asserts_in_email_tests=2E?= Message-ID: <3dMD033WsZz7Ljr@mail.python.org> http://hg.python.org/cpython/rev/db6ea9abd317 changeset: 87146:db6ea9abd317 branch: 3.3 parent: 87124:1c714c35c02a user: Serhiy Storchaka date: Sat Nov 16 12:56:23 2013 +0200 summary: Issue #19590: Use specific asserts in email tests. 
files: Lib/test/test_email/test_defect_handling.py | 26 +- Lib/test/test_email/test_email.py | 165 ++++----- Lib/test/test_email/test_parser.py | 4 +- Lib/test/test_email/test_utils.py | 4 +- 4 files changed, 92 insertions(+), 107 deletions(-) diff --git a/Lib/test/test_email/test_defect_handling.py b/Lib/test/test_email/test_defect_handling.py --- a/Lib/test/test_email/test_defect_handling.py +++ b/Lib/test/test_email/test_defect_handling.py @@ -59,8 +59,8 @@ inner = msg.get_payload(0) self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(self.get_defects(inner)), 1) - self.assertTrue(isinstance(self.get_defects(inner)[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(self.get_defects(inner)[0], + errors.StartBoundaryNotFoundDefect) def test_multipart_no_boundary(self): source = textwrap.dedent("""\ @@ -84,12 +84,12 @@ with self._raise_point(errors.NoBoundaryInMultipartDefect): msg = self._str_msg(source) if self.raise_expected: return - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(self.get_defects(msg)), 2) - self.assertTrue(isinstance(self.get_defects(msg)[0], - errors.NoBoundaryInMultipartDefect)) - self.assertTrue(isinstance(self.get_defects(msg)[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(self.get_defects(msg)[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(self.get_defects(msg)[1], + errors.MultipartInvariantViolationDefect) multipart_msg = textwrap.dedent("""\ Date: Wed, 14 Nov 2007 12:56:23 GMT @@ -153,10 +153,10 @@ if self.raise_expected: return self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(self.get_defects(msg)), 2) - self.assertTrue(isinstance(self.get_defects(msg)[0], - errors.NoBoundaryInMultipartDefect)) - self.assertTrue(isinstance(self.get_defects(msg)[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(self.get_defects(msg)[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(self.get_defects(msg)[1], + errors.MultipartInvariantViolationDefect) def test_missing_start_boundary(self): source = textwrap.dedent("""\ @@ -193,8 +193,8 @@ if self.raise_expected: return bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(self.get_defects(bad)), 1) - self.assertTrue(isinstance(self.get_defects(bad)[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(self.get_defects(bad)[0], + errors.StartBoundaryNotFoundDefect) def test_first_line_is_continuation_header(self): with self._raise_point(errors.FirstHeaderLineIsContinuationDefect): diff --git a/Lib/test/test_email/test_email.py b/Lib/test/test_email/test_email.py --- a/Lib/test/test_email/test_email.py +++ b/Lib/test/test_email/test_email.py @@ -241,12 +241,12 @@ msg['From'] = 'Me' msg['to'] = 'You' # Check for case insensitivity - self.assertTrue('from' in msg) - self.assertTrue('From' in msg) - self.assertTrue('FROM' in msg) - self.assertTrue('to' in msg) - self.assertTrue('To' in msg) - self.assertTrue('TO' in msg) + self.assertIn('from', msg) + self.assertIn('From', msg) + self.assertIn('FROM', msg) + self.assertIn('to', msg) + self.assertIn('To', msg) + self.assertIn('TO', msg) def test_as_string(self): eq = self.ndiffAssertEqual @@ -339,12 +339,11 @@ self.assertEqual(msg.get_param('bar'), 'baz"foobar"baz') def test_field_containment(self): - unless = self.assertTrue msg = email.message_from_string('Header: exists') - unless('header' in msg) - unless('Header' in msg) - unless('HEADER' in msg) - 
self.assertFalse('headerx' in msg) + self.assertIn('header', msg) + self.assertIn('Header', msg) + self.assertIn('HEADER', msg) + self.assertNotIn('headerx', msg) def test_set_param(self): eq = self.assertEqual @@ -1400,7 +1399,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._au.add_header('Content-Disposition', 'attachment', filename='audiotest.au') eq(self._au['content-disposition'], @@ -1411,12 +1409,12 @@ 'audiotest.au') missing = [] eq(self._au.get_param('attachment', header='content-disposition'), '') - unless(self._au.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._au.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._au.get_param('foobar', missing) is missing) - unless(self._au.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._au.get_param('foobar', missing), missing) + self.assertIs(self._au.get_param('attachment', missing, + header='foobar'), missing) @@ -1441,7 +1439,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._im.add_header('Content-Disposition', 'attachment', filename='dingusfish.gif') eq(self._im['content-disposition'], @@ -1452,12 +1449,12 @@ 'dingusfish.gif') missing = [] eq(self._im.get_param('attachment', header='content-disposition'), '') - unless(self._im.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._im.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._im.get_param('foobar', missing) is missing) - unless(self._im.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._im.get_param('foobar', missing), missing) + self.assertIs(self._im.get_param('attachment', missing, + header='foobar'), missing) @@ -1548,17 +1545,16 @@ def test_types(self): eq = self.assertEqual - unless = self.assertTrue eq(self._msg.get_content_type(), 'text/plain') eq(self._msg.get_param('charset'), 'us-ascii') missing = [] - unless(self._msg.get_param('foobar', missing) is missing) - unless(self._msg.get_param('charset', missing, header='foobar') - is missing) + self.assertIs(self._msg.get_param('foobar', missing), missing) + self.assertIs(self._msg.get_param('charset', missing, header='foobar'), + missing) def test_payload(self): self.assertEqual(self._msg.get_payload(), 'hello there') - self.assertTrue(not self._msg.is_multipart()) + self.assertFalse(self._msg.is_multipart()) def test_charset(self): eq = self.assertEqual @@ -1577,7 +1573,7 @@ msg = MIMEText('hello there') eq(msg.get_charset(), 'us-ascii') eq(msg['content-type'], 'text/plain; charset="us-ascii"') - self.assertTrue('hello there' in msg.as_string()) + self.assertIn('hello there', msg.as_string()) def test_utf8_input(self): teststr = '\u043a\u0438\u0440\u0438\u043b\u0438\u0446\u0430' @@ -1636,21 +1632,20 @@ def test_hierarchy(self): # convenience eq = self.assertEqual - unless = self.assertTrue raises = self.assertRaises # tests m = self._msg - unless(m.is_multipart()) + self.assertTrue(m.is_multipart()) eq(m.get_content_type(), 'multipart/mixed') eq(len(m.get_payload()), 2) raises(IndexError, m.get_payload, 2) m0 = m.get_payload(0) m1 = m.get_payload(1) - unless(m0 is self._txt) - unless(m1 is self._im) + self.assertIs(m0, self._txt) + self.assertIs(m1, self._im) eq(m.get_payload(), [m0, m1]) - unless(not m0.is_multipart()) - unless(not m1.is_multipart()) + 
self.assertFalse(m0.is_multipart()) + self.assertFalse(m1.is_multipart()) def test_empty_multipart_idempotent(self): text = """\ @@ -1982,25 +1977,23 @@ # test_defect_handling def test_same_boundary_inner_outer(self): - unless = self.assertTrue msg = self._msgobj('msg_15.txt') # XXX We can probably eventually do better inner = msg.get_payload(0) - unless(hasattr(inner, 'defects')) + self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(inner.defects), 1) - unless(isinstance(inner.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(inner.defects[0], + errors.StartBoundaryNotFoundDefect) # test_defect_handling def test_multipart_no_boundary(self): - unless = self.assertTrue msg = self._msgobj('msg_25.txt') - unless(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], - errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) multipart_msg = textwrap.dedent("""\ Date: Wed, 14 Nov 2007 12:56:23 GMT @@ -2098,14 +2091,13 @@ # test_defect_handling def test_lying_multipart(self): - unless = self.assertTrue msg = self._msgobj('msg_41.txt') - unless(hasattr(msg, 'defects')) + self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], - errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) # test_defect_handling def test_missing_start_boundary(self): @@ -2120,8 +2112,8 @@ # [*] This message is missing its start boundary bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(bad.defects), 1) - self.assertTrue(isinstance(bad.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(bad.defects[0], + errors.StartBoundaryNotFoundDefect) # test_defect_handling def test_first_line_is_continuation_header(self): @@ -2288,17 +2280,16 @@ def test_valid_argument(self): eq = self.assertEqual - unless = self.assertTrue subject = 'A sub-message' m = Message() m['Subject'] = subject r = MIMEMessage(m) eq(r.get_content_type(), 'message/rfc822') payload = r.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subpart = payload[0] - unless(subpart is m) + self.assertIs(subpart, m) eq(subpart['subject'], subject) def test_bad_multipart(self): @@ -2331,24 +2322,22 @@ def test_parse_message_rfc822(self): eq = self.assertEqual - unless = self.assertTrue msg = self._msgobj('msg_11.txt') eq(msg.get_content_type(), 'message/rfc822') payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) submsg = payload[0] - self.assertTrue(isinstance(submsg, Message)) + self.assertIsInstance(submsg, Message) eq(submsg['subject'], 'An enclosed message') eq(submsg.get_payload(), 'Here is the body of the message.\n') def test_dsn(self): eq = self.assertEqual - unless = self.assertTrue # msg 16 is a Delivery Status Notification, see RFC 1894 msg = self._msgobj('msg_16.txt') eq(msg.get_content_type(), 'multipart/report') - unless(msg.is_multipart()) + 
self.assertTrue(msg.is_multipart()) eq(len(msg.get_payload()), 3) # Subpart 1 is a text/plain, human readable section subpart = msg.get_payload(0) @@ -2377,13 +2366,13 @@ # message/delivery-status should treat each block as a bunch of # headers, i.e. a bunch of Message objects. dsn1 = subpart.get_payload(0) - unless(isinstance(dsn1, Message)) + self.assertIsInstance(dsn1, Message) eq(dsn1['original-envelope-id'], '0GK500B4HD0888 at cougar.noc.ucla.edu') eq(dsn1.get_param('dns', header='reporting-mta'), '') # Try a missing one eq(dsn1.get_param('nsd', header='reporting-mta'), None) dsn2 = subpart.get_payload(1) - unless(isinstance(dsn2, Message)) + self.assertIsInstance(dsn2, Message) eq(dsn2['action'], 'failed') eq(dsn2.get_params(header='original-recipient'), [('rfc822', ''), ('jangel1 at cougar.noc.ucla.edu', '')]) @@ -2392,10 +2381,10 @@ subpart = msg.get_payload(2) eq(subpart.get_content_type(), 'message/rfc822') payload = subpart.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subsubpart = payload[0] - unless(isinstance(subsubpart, Message)) + self.assertIsInstance(subsubpart, Message) eq(subsubpart.get_content_type(), 'text/plain') eq(subsubpart['message-id'], '<002001c144a6$8752e060$56104586 at oxy.edu>') @@ -2693,7 +2682,6 @@ def test_content_type(self): eq = self.assertEqual - unless = self.assertTrue # Get a message object and reset the seek pointer for other tests msg, text = self._msgobj('msg_05.txt') eq(msg.get_content_type(), 'multipart/report') @@ -2715,29 +2703,28 @@ eq(msg2.get_payload(), 'Yadda yadda yadda' + self.linesep) msg3 = msg.get_payload(2) eq(msg3.get_content_type(), 'message/rfc822') - self.assertTrue(isinstance(msg3, Message)) + self.assertIsInstance(msg3, Message) payload = msg3.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg4 = payload[0] - unless(isinstance(msg4, Message)) + self.assertIsInstance(msg4, Message) eq(msg4.get_payload(), 'Yadda yadda yadda' + self.linesep) def test_parser(self): eq = self.assertEqual - unless = self.assertTrue msg, text = self._msgobj('msg_06.txt') # Check some of the outer headers eq(msg.get_content_type(), 'message/rfc822') # Make sure the payload is a list of exactly one sub-Message, and that # that submessage has a type of text/plain payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg1 = payload[0] - self.assertTrue(isinstance(msg1, Message)) + self.assertIsInstance(msg1, Message) eq(msg1.get_content_type(), 'text/plain') - self.assertTrue(isinstance(msg1.get_payload(), str)) + self.assertIsInstance(msg1.get_payload(), str) eq(msg1.get_payload(), self.linesep) @@ -2768,7 +2755,6 @@ self.assertEqual(text, s.getvalue()) def test_message_from_string_with_class(self): - unless = self.assertTrue with openfile('msg_01.txt') as fp: text = fp.read() @@ -2777,35 +2763,34 @@ pass msg = email.message_from_string(text, MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated with openfile('msg_02.txt') as fp: text = fp.read() msg = email.message_from_string(text, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test_message_from_file_with_class(self): - unless = self.assertTrue # Create a subclass class MyMessage(Message): pass with openfile('msg_01.txt') as fp: msg = email.message_from_file(fp, 
MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated with openfile('msg_02.txt') as fp: msg = email.message_from_file(fp, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test_custom_message_does_not_require_arguments(self): class MyMessage(Message): def __init__(self): super().__init__() msg = self._str_msg("Subject: test\n\ntest", MyMessage) - self.assertTrue(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) def test__all__(self): module = __import__('email') @@ -3295,9 +3280,9 @@ break om.append(ol) n1 += 1 - self.assertTrue(n == n1) - self.assertTrue(len(om) == nt) - self.assertTrue(''.join([il for il, n in imt]) == ''.join(om)) + self.assertEqual(n, n1) + self.assertEqual(len(om), nt) + self.assertEqual(''.join([il for il, n in imt]), ''.join(om)) @@ -3312,7 +3297,7 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) def test_bytes_header_parser(self): eq = self.assertEqual @@ -3323,8 +3308,8 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) - self.assertTrue(isinstance(msg.get_payload(decode=True), bytes)) + self.assertIsInstance(msg.get_payload(), str) + self.assertIsInstance(msg.get_payload(decode=True), bytes) def test_whitespace_continuation(self): eq = self.assertEqual @@ -4365,7 +4350,7 @@ h = Header("I am the very model of a modern Major-General; I've information vegetable, animal, and mineral; I know the kings of England, and I quote the fights historical from Marathon to Waterloo, in order categorical; I'm very well acquainted, too, with matters mathematical; I understand equations, both the simple and quadratical; about binomial theorem I'm teeming with a lot o' news, with many cheerful facts about the square of the hypotenuse.", maxlinelen=76) for l in h.encode(splitchars=' ').split('\n '): - self.assertTrue(len(l) <= 76) + self.assertLessEqual(len(l), 76) def test_multilingual(self): eq = self.ndiffAssertEqual @@ -4834,7 +4819,7 @@ ''' msg = email.message_from_string(m) param = msg.get_param('NAME') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual( param, 'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm') @@ -4993,7 +4978,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "Frank's Document") # test_headerregistry.TestContentTypeHeader.rfc2231_single_quote_in_value_with_charset_and_lang @@ -5019,7 +5004,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "us-ascii'en-us'Frank's Document") # test_headerregistry.TestContentTypeHeader.rfc2231_single_quotes_inside_quotes diff --git a/Lib/test/test_email/test_parser.py b/Lib/test/test_email/test_parser.py --- a/Lib/test/test_email/test_parser.py +++ b/Lib/test/test_email/test_parser.py @@ -18,7 +18,7 @@ msg = email.message_from_string("Subject: bogus\n\nmsg\n", self.MyMessage, policy=self.MyPolicy) - self.assertTrue(isinstance(msg, 
self.MyMessage)) + self.assertIsInstance(msg, self.MyMessage) self.assertIs(msg.check_policy, self.MyPolicy) def test_custom_message_gets_policy_if_possible_from_file(self): @@ -26,7 +26,7 @@ msg = email.message_from_file(source_file, self.MyMessage, policy=self.MyPolicy) - self.assertTrue(isinstance(msg, self.MyMessage)) + self.assertIsInstance(msg, self.MyMessage) self.assertIs(msg.check_policy, self.MyPolicy) # XXX add tests for other functions that take Message arg. diff --git a/Lib/test/test_email/test_utils.py b/Lib/test/test_email/test_utils.py --- a/Lib/test/test_email/test_utils.py +++ b/Lib/test/test_email/test_utils.py @@ -54,12 +54,12 @@ def test_localtime_is_tz_aware_daylight_true(self): test.support.patch(self, time, 'daylight', True) t = utils.localtime() - self.assertIsNot(t.tzinfo, None) + self.assertIsNotNone(t.tzinfo) def test_localtime_is_tz_aware_daylight_false(self): test.support.patch(self, time, 'daylight', False) t = utils.localtime() - self.assertIsNot(t.tzinfo, None) + self.assertIsNotNone(t.tzinfo) def test_localtime_daylight_true_dst_false(self): test.support.patch(self, time, 'daylight', True) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 11:57:09 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 11:57:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319590=3A_Use_specific_asserts_in_email_tests=2E?= Message-ID: <3dMD050MXpz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/cc8e56886807 changeset: 87147:cc8e56886807 parent: 87143:2248b9fbab39 parent: 87146:db6ea9abd317 user: Serhiy Storchaka date: Sat Nov 16 12:56:54 2013 +0200 summary: Issue #19590: Use specific asserts in email tests. 
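As a quick illustration of the "specific asserts" this changeset switches to, here is a minimal editorial sketch; it is not taken from the patch itself, and the test class name and values are made up:

    import unittest

    class SpecificAssertsExample(unittest.TestCase):
        def test_preferred_forms(self):
            payload = ['spam']
            # Specific asserts give clearer failure messages than the
            # generic assertTrue(...) forms removed in the diff below.
            self.assertIsInstance(payload, list)   # instead of assertTrue(isinstance(payload, list))
            self.assertIn('spam', payload)         # instead of assertTrue('spam' in payload)
            self.assertIsNotNone(payload)          # instead of assertIsNot(payload, None)

    if __name__ == '__main__':
        unittest.main()
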
files: Lib/test/test_email/test_defect_handling.py | 26 +- Lib/test/test_email/test_email.py | 165 ++++----- Lib/test/test_email/test_parser.py | 4 +- Lib/test/test_email/test_utils.py | 4 +- 4 files changed, 92 insertions(+), 107 deletions(-) diff --git a/Lib/test/test_email/test_defect_handling.py b/Lib/test/test_email/test_defect_handling.py --- a/Lib/test/test_email/test_defect_handling.py +++ b/Lib/test/test_email/test_defect_handling.py @@ -59,8 +59,8 @@ inner = msg.get_payload(0) self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(self.get_defects(inner)), 1) - self.assertTrue(isinstance(self.get_defects(inner)[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(self.get_defects(inner)[0], + errors.StartBoundaryNotFoundDefect) def test_multipart_no_boundary(self): source = textwrap.dedent("""\ @@ -84,12 +84,12 @@ with self._raise_point(errors.NoBoundaryInMultipartDefect): msg = self._str_msg(source) if self.raise_expected: return - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(self.get_defects(msg)), 2) - self.assertTrue(isinstance(self.get_defects(msg)[0], - errors.NoBoundaryInMultipartDefect)) - self.assertTrue(isinstance(self.get_defects(msg)[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(self.get_defects(msg)[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(self.get_defects(msg)[1], + errors.MultipartInvariantViolationDefect) multipart_msg = textwrap.dedent("""\ Date: Wed, 14 Nov 2007 12:56:23 GMT @@ -153,10 +153,10 @@ if self.raise_expected: return self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(self.get_defects(msg)), 2) - self.assertTrue(isinstance(self.get_defects(msg)[0], - errors.NoBoundaryInMultipartDefect)) - self.assertTrue(isinstance(self.get_defects(msg)[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(self.get_defects(msg)[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(self.get_defects(msg)[1], + errors.MultipartInvariantViolationDefect) def test_missing_start_boundary(self): source = textwrap.dedent("""\ @@ -193,8 +193,8 @@ if self.raise_expected: return bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(self.get_defects(bad)), 1) - self.assertTrue(isinstance(self.get_defects(bad)[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(self.get_defects(bad)[0], + errors.StartBoundaryNotFoundDefect) def test_first_line_is_continuation_header(self): with self._raise_point(errors.FirstHeaderLineIsContinuationDefect): diff --git a/Lib/test/test_email/test_email.py b/Lib/test/test_email/test_email.py --- a/Lib/test/test_email/test_email.py +++ b/Lib/test/test_email/test_email.py @@ -241,12 +241,12 @@ msg['From'] = 'Me' msg['to'] = 'You' # Check for case insensitivity - self.assertTrue('from' in msg) - self.assertTrue('From' in msg) - self.assertTrue('FROM' in msg) - self.assertTrue('to' in msg) - self.assertTrue('To' in msg) - self.assertTrue('TO' in msg) + self.assertIn('from', msg) + self.assertIn('From', msg) + self.assertIn('FROM', msg) + self.assertIn('to', msg) + self.assertIn('To', msg) + self.assertIn('TO', msg) def test_as_string(self): msg = self._msgobj('msg_01.txt') @@ -366,12 +366,11 @@ self.assertEqual(msg.get_param('bar'), 'baz"foobar"baz') def test_field_containment(self): - unless = self.assertTrue msg = email.message_from_string('Header: exists') - unless('header' in msg) - unless('Header' in msg) - unless('HEADER' in msg) - 
self.assertFalse('headerx' in msg) + self.assertIn('header', msg) + self.assertIn('Header', msg) + self.assertIn('HEADER', msg) + self.assertNotIn('headerx', msg) def test_set_param(self): eq = self.assertEqual @@ -1427,7 +1426,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._au.add_header('Content-Disposition', 'attachment', filename='audiotest.au') eq(self._au['content-disposition'], @@ -1438,12 +1436,12 @@ 'audiotest.au') missing = [] eq(self._au.get_param('attachment', header='content-disposition'), '') - unless(self._au.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._au.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._au.get_param('foobar', missing) is missing) - unless(self._au.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._au.get_param('foobar', missing), missing) + self.assertIs(self._au.get_param('attachment', missing, + header='foobar'), missing) @@ -1468,7 +1466,6 @@ def test_add_header(self): eq = self.assertEqual - unless = self.assertTrue self._im.add_header('Content-Disposition', 'attachment', filename='dingusfish.gif') eq(self._im['content-disposition'], @@ -1479,12 +1476,12 @@ 'dingusfish.gif') missing = [] eq(self._im.get_param('attachment', header='content-disposition'), '') - unless(self._im.get_param('foo', failobj=missing, - header='content-disposition') is missing) + self.assertIs(self._im.get_param('foo', failobj=missing, + header='content-disposition'), missing) # Try some missing stuff - unless(self._im.get_param('foobar', missing) is missing) - unless(self._im.get_param('attachment', missing, - header='foobar') is missing) + self.assertIs(self._im.get_param('foobar', missing), missing) + self.assertIs(self._im.get_param('attachment', missing, + header='foobar'), missing) @@ -1575,17 +1572,16 @@ def test_types(self): eq = self.assertEqual - unless = self.assertTrue eq(self._msg.get_content_type(), 'text/plain') eq(self._msg.get_param('charset'), 'us-ascii') missing = [] - unless(self._msg.get_param('foobar', missing) is missing) - unless(self._msg.get_param('charset', missing, header='foobar') - is missing) + self.assertIs(self._msg.get_param('foobar', missing), missing) + self.assertIs(self._msg.get_param('charset', missing, header='foobar'), + missing) def test_payload(self): self.assertEqual(self._msg.get_payload(), 'hello there') - self.assertTrue(not self._msg.is_multipart()) + self.assertFalse(self._msg.is_multipart()) def test_charset(self): eq = self.assertEqual @@ -1604,7 +1600,7 @@ msg = MIMEText('hello there') eq(msg.get_charset(), 'us-ascii') eq(msg['content-type'], 'text/plain; charset="us-ascii"') - self.assertTrue('hello there' in msg.as_string()) + self.assertIn('hello there', msg.as_string()) def test_utf8_input(self): teststr = '\u043a\u0438\u0440\u0438\u043b\u0438\u0446\u0430' @@ -1663,21 +1659,20 @@ def test_hierarchy(self): # convenience eq = self.assertEqual - unless = self.assertTrue raises = self.assertRaises # tests m = self._msg - unless(m.is_multipart()) + self.assertTrue(m.is_multipart()) eq(m.get_content_type(), 'multipart/mixed') eq(len(m.get_payload()), 2) raises(IndexError, m.get_payload, 2) m0 = m.get_payload(0) m1 = m.get_payload(1) - unless(m0 is self._txt) - unless(m1 is self._im) + self.assertIs(m0, self._txt) + self.assertIs(m1, self._im) eq(m.get_payload(), [m0, m1]) - unless(not m0.is_multipart()) - unless(not m1.is_multipart()) + 
self.assertFalse(m0.is_multipart()) + self.assertFalse(m1.is_multipart()) def test_empty_multipart_idempotent(self): text = """\ @@ -2009,25 +2004,23 @@ # test_defect_handling def test_same_boundary_inner_outer(self): - unless = self.assertTrue msg = self._msgobj('msg_15.txt') # XXX We can probably eventually do better inner = msg.get_payload(0) - unless(hasattr(inner, 'defects')) + self.assertTrue(hasattr(inner, 'defects')) self.assertEqual(len(inner.defects), 1) - unless(isinstance(inner.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(inner.defects[0], + errors.StartBoundaryNotFoundDefect) # test_defect_handling def test_multipart_no_boundary(self): - unless = self.assertTrue msg = self._msgobj('msg_25.txt') - unless(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], - errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) multipart_msg = textwrap.dedent("""\ Date: Wed, 14 Nov 2007 12:56:23 GMT @@ -2125,14 +2118,13 @@ # test_defect_handling def test_lying_multipart(self): - unless = self.assertTrue msg = self._msgobj('msg_41.txt') - unless(hasattr(msg, 'defects')) + self.assertTrue(hasattr(msg, 'defects')) self.assertEqual(len(msg.defects), 2) - unless(isinstance(msg.defects[0], - errors.NoBoundaryInMultipartDefect)) - unless(isinstance(msg.defects[1], - errors.MultipartInvariantViolationDefect)) + self.assertIsInstance(msg.defects[0], + errors.NoBoundaryInMultipartDefect) + self.assertIsInstance(msg.defects[1], + errors.MultipartInvariantViolationDefect) # test_defect_handling def test_missing_start_boundary(self): @@ -2147,8 +2139,8 @@ # [*] This message is missing its start boundary bad = outer.get_payload(1).get_payload(0) self.assertEqual(len(bad.defects), 1) - self.assertTrue(isinstance(bad.defects[0], - errors.StartBoundaryNotFoundDefect)) + self.assertIsInstance(bad.defects[0], + errors.StartBoundaryNotFoundDefect) # test_defect_handling def test_first_line_is_continuation_header(self): @@ -2315,17 +2307,16 @@ def test_valid_argument(self): eq = self.assertEqual - unless = self.assertTrue subject = 'A sub-message' m = Message() m['Subject'] = subject r = MIMEMessage(m) eq(r.get_content_type(), 'message/rfc822') payload = r.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subpart = payload[0] - unless(subpart is m) + self.assertIs(subpart, m) eq(subpart['subject'], subject) def test_bad_multipart(self): @@ -2358,24 +2349,22 @@ def test_parse_message_rfc822(self): eq = self.assertEqual - unless = self.assertTrue msg = self._msgobj('msg_11.txt') eq(msg.get_content_type(), 'message/rfc822') payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) submsg = payload[0] - self.assertTrue(isinstance(submsg, Message)) + self.assertIsInstance(submsg, Message) eq(submsg['subject'], 'An enclosed message') eq(submsg.get_payload(), 'Here is the body of the message.\n') def test_dsn(self): eq = self.assertEqual - unless = self.assertTrue # msg 16 is a Delivery Status Notification, see RFC 1894 msg = self._msgobj('msg_16.txt') eq(msg.get_content_type(), 'multipart/report') - unless(msg.is_multipart()) + 
self.assertTrue(msg.is_multipart()) eq(len(msg.get_payload()), 3) # Subpart 1 is a text/plain, human readable section subpart = msg.get_payload(0) @@ -2404,13 +2393,13 @@ # message/delivery-status should treat each block as a bunch of # headers, i.e. a bunch of Message objects. dsn1 = subpart.get_payload(0) - unless(isinstance(dsn1, Message)) + self.assertIsInstance(dsn1, Message) eq(dsn1['original-envelope-id'], '0GK500B4HD0888 at cougar.noc.ucla.edu') eq(dsn1.get_param('dns', header='reporting-mta'), '') # Try a missing one eq(dsn1.get_param('nsd', header='reporting-mta'), None) dsn2 = subpart.get_payload(1) - unless(isinstance(dsn2, Message)) + self.assertIsInstance(dsn2, Message) eq(dsn2['action'], 'failed') eq(dsn2.get_params(header='original-recipient'), [('rfc822', ''), ('jangel1 at cougar.noc.ucla.edu', '')]) @@ -2419,10 +2408,10 @@ subpart = msg.get_payload(2) eq(subpart.get_content_type(), 'message/rfc822') payload = subpart.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) subsubpart = payload[0] - unless(isinstance(subsubpart, Message)) + self.assertIsInstance(subsubpart, Message) eq(subsubpart.get_content_type(), 'text/plain') eq(subsubpart['message-id'], '<002001c144a6$8752e060$56104586 at oxy.edu>') @@ -2720,7 +2709,6 @@ def test_content_type(self): eq = self.assertEqual - unless = self.assertTrue # Get a message object and reset the seek pointer for other tests msg, text = self._msgobj('msg_05.txt') eq(msg.get_content_type(), 'multipart/report') @@ -2742,29 +2730,28 @@ eq(msg2.get_payload(), 'Yadda yadda yadda' + self.linesep) msg3 = msg.get_payload(2) eq(msg3.get_content_type(), 'message/rfc822') - self.assertTrue(isinstance(msg3, Message)) + self.assertIsInstance(msg3, Message) payload = msg3.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg4 = payload[0] - unless(isinstance(msg4, Message)) + self.assertIsInstance(msg4, Message) eq(msg4.get_payload(), 'Yadda yadda yadda' + self.linesep) def test_parser(self): eq = self.assertEqual - unless = self.assertTrue msg, text = self._msgobj('msg_06.txt') # Check some of the outer headers eq(msg.get_content_type(), 'message/rfc822') # Make sure the payload is a list of exactly one sub-Message, and that # that submessage has a type of text/plain payload = msg.get_payload() - unless(isinstance(payload, list)) + self.assertIsInstance(payload, list) eq(len(payload), 1) msg1 = payload[0] - self.assertTrue(isinstance(msg1, Message)) + self.assertIsInstance(msg1, Message) eq(msg1.get_content_type(), 'text/plain') - self.assertTrue(isinstance(msg1.get_payload(), str)) + self.assertIsInstance(msg1.get_payload(), str) eq(msg1.get_payload(), self.linesep) @@ -2795,7 +2782,6 @@ self.assertEqual(text, s.getvalue()) def test_message_from_string_with_class(self): - unless = self.assertTrue with openfile('msg_01.txt') as fp: text = fp.read() @@ -2804,35 +2790,34 @@ pass msg = email.message_from_string(text, MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated with openfile('msg_02.txt') as fp: text = fp.read() msg = email.message_from_string(text, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test_message_from_file_with_class(self): - unless = self.assertTrue # Create a subclass class MyMessage(Message): pass with openfile('msg_01.txt') as fp: msg = email.message_from_file(fp, 
MyMessage) - unless(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) # Try something more complicated with openfile('msg_02.txt') as fp: msg = email.message_from_file(fp, MyMessage) for subpart in msg.walk(): - unless(isinstance(subpart, MyMessage)) + self.assertIsInstance(subpart, MyMessage) def test_custom_message_does_not_require_arguments(self): class MyMessage(Message): def __init__(self): super().__init__() msg = self._str_msg("Subject: test\n\ntest", MyMessage) - self.assertTrue(isinstance(msg, MyMessage)) + self.assertIsInstance(msg, MyMessage) def test__all__(self): module = __import__('email') @@ -3322,9 +3307,9 @@ break om.append(ol) n1 += 1 - self.assertTrue(n == n1) - self.assertTrue(len(om) == nt) - self.assertTrue(''.join([il for il, n in imt]) == ''.join(om)) + self.assertEqual(n, n1) + self.assertEqual(len(om), nt) + self.assertEqual(''.join([il for il, n in imt]), ''.join(om)) @@ -3339,7 +3324,7 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) + self.assertIsInstance(msg.get_payload(), str) def test_bytes_header_parser(self): eq = self.assertEqual @@ -3350,8 +3335,8 @@ eq(msg['to'], 'ppp at zzz.org') eq(msg.get_content_type(), 'multipart/mixed') self.assertFalse(msg.is_multipart()) - self.assertTrue(isinstance(msg.get_payload(), str)) - self.assertTrue(isinstance(msg.get_payload(decode=True), bytes)) + self.assertIsInstance(msg.get_payload(), str) + self.assertIsInstance(msg.get_payload(decode=True), bytes) def test_whitespace_continuation(self): eq = self.assertEqual @@ -4392,7 +4377,7 @@ h = Header("I am the very model of a modern Major-General; I've information vegetable, animal, and mineral; I know the kings of England, and I quote the fights historical from Marathon to Waterloo, in order categorical; I'm very well acquainted, too, with matters mathematical; I understand equations, both the simple and quadratical; about binomial theorem I'm teeming with a lot o' news, with many cheerful facts about the square of the hypotenuse.", maxlinelen=76) for l in h.encode(splitchars=' ').split('\n '): - self.assertTrue(len(l) <= 76) + self.assertLessEqual(len(l), 76) def test_multilingual(self): eq = self.ndiffAssertEqual @@ -4861,7 +4846,7 @@ ''' msg = email.message_from_string(m) param = msg.get_param('NAME') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual( param, 'file____C__DOCUMENTS_20AND_20SETTINGS_FABIEN_LOCAL_20SETTINGS_TEMP_nsmail.htm') @@ -5020,7 +5005,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "Frank's Document") # test_headerregistry.TestContentTypeHeader.rfc2231_single_quote_in_value_with_charset_and_lang @@ -5046,7 +5031,7 @@ """ msg = email.message_from_string(m) param = msg.get_param('name') - self.assertFalse(isinstance(param, tuple)) + self.assertNotIsInstance(param, tuple) self.assertEqual(param, "us-ascii'en-us'Frank's Document") # test_headerregistry.TestContentTypeHeader.rfc2231_single_quotes_inside_quotes diff --git a/Lib/test/test_email/test_parser.py b/Lib/test/test_email/test_parser.py --- a/Lib/test/test_email/test_parser.py +++ b/Lib/test/test_email/test_parser.py @@ -18,7 +18,7 @@ msg = email.message_from_string("Subject: bogus\n\nmsg\n", self.MyMessage, policy=self.MyPolicy) - self.assertTrue(isinstance(msg, 
self.MyMessage)) + self.assertIsInstance(msg, self.MyMessage) self.assertIs(msg.check_policy, self.MyPolicy) def test_custom_message_gets_policy_if_possible_from_file(self): @@ -26,7 +26,7 @@ msg = email.message_from_file(source_file, self.MyMessage, policy=self.MyPolicy) - self.assertTrue(isinstance(msg, self.MyMessage)) + self.assertIsInstance(msg, self.MyMessage) self.assertIs(msg.check_policy, self.MyPolicy) # XXX add tests for other functions that take Message arg. diff --git a/Lib/test/test_email/test_utils.py b/Lib/test/test_email/test_utils.py --- a/Lib/test/test_email/test_utils.py +++ b/Lib/test/test_email/test_utils.py @@ -54,12 +54,12 @@ def test_localtime_is_tz_aware_daylight_true(self): test.support.patch(self, time, 'daylight', True) t = utils.localtime() - self.assertIsNot(t.tzinfo, None) + self.assertIsNotNone(t.tzinfo) def test_localtime_is_tz_aware_daylight_false(self): test.support.patch(self, time, 'daylight', False) t = utils.localtime() - self.assertIsNot(t.tzinfo, None) + self.assertIsNotNone(t.tzinfo) def test_localtime_daylight_true_dst_false(self): test.support.patch(self, time, 'daylight', True) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 12:04:27 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 12:04:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=235202=3A_Added_sup?= =?utf-8?q?port_for_unseekable_files_in_the_wave_module=2E?= Message-ID: <3dMD8W6vvHz7LpT@mail.python.org> http://hg.python.org/cpython/rev/6a599249e8b7 changeset: 87148:6a599249e8b7 user: Serhiy Storchaka date: Sat Nov 16 13:04:00 2013 +0200 summary: Issue #5202: Added support for unseekable files in the wave module. files: Doc/library/wave.rst | 10 +++- Lib/test/audiotests.py | 60 ++++++++++++++++++++++++++++++ Lib/test/test_aifc.py | 44 +++++++-------------- Lib/wave.py | 8 +++- Misc/NEWS | 2 + 5 files changed, 90 insertions(+), 34 deletions(-) diff --git a/Doc/library/wave.rst b/Doc/library/wave.rst --- a/Doc/library/wave.rst +++ b/Doc/library/wave.rst @@ -19,7 +19,7 @@ .. function:: open(file, mode=None) If *file* is a string, open the file by that name, otherwise treat it as a - seekable file-like object. *mode* can be: + file-like object. *mode* can be: ``'rb'`` Read only mode. @@ -43,6 +43,8 @@ ` or :meth:`Wave_write.close() ` method is called. + .. versionchanged:: 3.4 + Added support for unseekable files. .. function:: openfp(file, mode) @@ -154,7 +156,8 @@ .. method:: Wave_write.close() Make sure *nframes* is correct, and close the file if it was opened by - :mod:`wave`. This method is called upon object collection. + :mod:`wave`. This method is called upon object collection. Can raise an + exception if *nframes* is not correct and a file is not seekable. .. method:: Wave_write.setnchannels(n) @@ -208,7 +211,8 @@ .. method:: Wave_write.writeframes(data) - Write audio frames and make sure *nframes* is correct. + Write audio frames and make sure *nframes* is correct. Can raise an + exception if a file is not seekable. 
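A small editorial sketch of the new unseekable-file behaviour, assuming the patched 3.4 wave module; the destination stream (stdout, typically a pipe) and the parameter values are arbitrary choices for illustration:

    import sys
    import wave

    # With an unseekable file the header cannot be rewritten afterwards,
    # so the frame count must be declared correctly before writing.
    nframes = 4
    w = wave.open(sys.stdout.buffer, 'wb')  # pipes/stdout are usually unseekable
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(8000)
    w.setnframes(nframes)
    w.writeframes(b'\x00\x00' * nframes)
    w.close()
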
Note that it is invalid to set any parameters after calling :meth:`writeframes` diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -21,6 +21,13 @@ a.byteswap() return a.tobytes() +class UnseekableIO(io.FileIO): + def tell(self): + raise io.UnsupportedOperation + + def seek(self, *args, **kwargs): + raise io.UnsupportedOperation + class AudioTests: close_fd = False @@ -177,6 +184,59 @@ self.assertEqual(testfile.read(13), b'ababagalamaga') self.check_file(testfile, self.nframes, self.frames) + def test_unseekable_read(self): + with self.create_file(TESTFN) as f: + f.setnframes(self.nframes) + f.writeframes(self.frames) + + with UnseekableIO(TESTFN, 'rb') as testfile: + self.check_file(testfile, self.nframes, self.frames) + + def test_unseekable_write(self): + with UnseekableIO(TESTFN, 'wb') as testfile: + with self.create_file(testfile) as f: + f.setnframes(self.nframes) + f.writeframes(self.frames) + + self.check_file(TESTFN, self.nframes, self.frames) + + def test_unseekable_incompleted_write(self): + with UnseekableIO(TESTFN, 'wb') as testfile: + testfile.write(b'ababagalamaga') + f = self.create_file(testfile) + f.setnframes(self.nframes + 1) + try: + f.writeframes(self.frames) + except OSError: + pass + try: + f.close() + except OSError: + pass + + with open(TESTFN, 'rb') as testfile: + self.assertEqual(testfile.read(13), b'ababagalamaga') + self.check_file(testfile, self.nframes + 1, self.frames) + + def test_unseekable_overflowed_write(self): + with UnseekableIO(TESTFN, 'wb') as testfile: + testfile.write(b'ababagalamaga') + f = self.create_file(testfile) + f.setnframes(self.nframes - 1) + try: + f.writeframes(self.frames) + except OSError: + pass + try: + f.close() + except OSError: + pass + + with open(TESTFN, 'rb') as testfile: + self.assertEqual(testfile.read(13), b'ababagalamaga') + framesize = self.nchannels * self.sampwidth + self.check_file(testfile, self.nframes - 1, self.frames[:-framesize]) + class AudioTestsWithSourceFile(AudioTests): diff --git a/Lib/test/test_aifc.py b/Lib/test/test_aifc.py --- a/Lib/test/test_aifc.py +++ b/Lib/test/test_aifc.py @@ -8,10 +8,17 @@ import aifc -class AifcPCM8Test(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): +class AifcTest(audiotests.AudioWriteTests, + audiotests.AudioTestsWithSourceFile): module = aifc + close_fd = True + test_unseekable_read = None + test_unseekable_write = None + test_unseekable_incompleted_write = None + test_unseekable_overflowed_write = None + + +class AifcPCM8Test(AifcTest, unittest.TestCase): sndfilename = 'pluck-pcm8.aiff' sndfilenframes = 3307 nchannels = 2 @@ -26,13 +33,9 @@ 11FA 3EFB BCFC 66FF CF04 4309 C10E 5112 EE17 8216 7F14 8012 \ 490E 520D EF0F CE0F E40C 630A 080A 2B0B 510E 8B11 B60E 440A \ """) - close_fd = True -class AifcPCM16Test(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): - module = aifc +class AifcPCM16Test(AifcTest, unittest.TestCase): sndfilename = 'pluck-pcm16.aiff' sndfilenframes = 3307 nchannels = 2 @@ -49,13 +52,9 @@ EEE21753 82071665 7FFF1443 8004128F 49A20EAF 52BB0DBA EFB40F60 CE3C0FBF \ E4B30CEC 63430A5C 08C80A20 2BBB0B08 514A0E43 8BCF1139 B6F60EEB 44120A5E \ """) - close_fd = True -class AifcPCM24Test(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): - module = aifc +class AifcPCM24Test(AifcTest, unittest.TestCase): sndfilename = 'pluck-pcm24.aiff' sndfilenframes = 3307 nchannels = 2 @@ -78,13 
+77,9 @@ E4B49C0CEA2D 6344A80A5A7C 08C8FE0A1FFE 2BB9860B0A0E \ 51486F0E44E1 8BCC64113B05 B6F4EC0EEB36 4413170A5B48 \ """) - close_fd = True -class AifcPCM32Test(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): - module = aifc +class AifcPCM32Test(AifcTest, unittest.TestCase): sndfilename = 'pluck-pcm32.aiff' sndfilenframes = 3307 nchannels = 2 @@ -107,13 +102,9 @@ E4B49CC00CEA2D90 6344A8800A5A7CA0 08C8FE800A1FFEE0 2BB986C00B0A0E00 \ 51486F800E44E190 8BCC6480113B0580 B6F4EC000EEB3630 441317800A5B48A0 \ """) - close_fd = True -class AifcULAWTest(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): - module = aifc +class AifcULAWTest(AifcTest, unittest.TestCase): sndfilename = 'pluck-ulaw.aifc' sndfilenframes = 3307 nchannels = 2 @@ -132,13 +123,9 @@ """) if sys.byteorder != 'big': frames = audiotests.byteswap2(frames) - close_fd = True -class AifcALAWTest(audiotests.AudioWriteTests, - audiotests.AudioTestsWithSourceFile, - unittest.TestCase): - module = aifc +class AifcALAWTest(AifcTest, unittest.TestCase): sndfilename = 'pluck-alaw.aifc' sndfilenframes = 3307 nchannels = 2 @@ -157,7 +144,6 @@ """) if sys.byteorder != 'big': frames = audiotests.byteswap2(frames) - close_fd = True class AifcMiscTest(audiotests.AudioTests, unittest.TestCase): diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -491,14 +491,18 @@ if not self._nframes: self._nframes = initlength // (self._nchannels * self._sampwidth) self._datalength = self._nframes * self._nchannels * self._sampwidth - self._form_length_pos = self._file.tell() + try: + self._form_length_pos = self._file.tell() + except (AttributeError, OSError): + self._form_length_pos = None self._file.write(struct.pack(' http://hg.python.org/cpython/rev/b96f4ee1b08b changeset: 87149:b96f4ee1b08b user: Serhiy Storchaka date: Sat Nov 16 14:01:31 2013 +0200 summary: Issue #16685: Added support for writing any bytes-like objects in the aifc, sunau, and wave modules. files: Doc/library/aifc.rst | 6 ++++++ Doc/library/sunau.rst | 6 ++++++ Doc/library/wave.rst | 6 ++++++ Lib/aifc.py | 2 ++ Lib/sunau.py | 2 ++ Lib/test/audiotests.py | 24 ++++++++++++++++++++++++ Lib/wave.py | 2 ++ Misc/NEWS | 3 +++ 8 files changed, 51 insertions(+), 0 deletions(-) diff --git a/Doc/library/aifc.rst b/Doc/library/aifc.rst --- a/Doc/library/aifc.rst +++ b/Doc/library/aifc.rst @@ -225,12 +225,18 @@ Write data to the output file. This method can only be called after the audio file parameters have been set. + .. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + .. method:: aifc.writeframesraw(data) Like :meth:`writeframes`, except that the header of the audio file is not updated. + .. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + .. method:: aifc.close() diff --git a/Doc/library/sunau.rst b/Doc/library/sunau.rst --- a/Doc/library/sunau.rst +++ b/Doc/library/sunau.rst @@ -250,11 +250,17 @@ Write audio frames, without correcting *nframes*. + .. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + .. method:: AU_write.writeframes(data) Write audio frames and make sure *nframes* is correct. + .. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + .. method:: AU_write.close() diff --git a/Doc/library/wave.rst b/Doc/library/wave.rst --- a/Doc/library/wave.rst +++ b/Doc/library/wave.rst @@ -208,12 +208,18 @@ Write audio frames, without correcting *nframes*. + .. 
versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + .. method:: Wave_write.writeframes(data) Write audio frames and make sure *nframes* is correct. Can raise an exception if a file is not seekable. + .. versionchanged:: 3.4 + Any :term:`bytes-like object`\ s are now accepted. + Note that it is invalid to set any parameters after calling :meth:`writeframes` or :meth:`writeframesraw`, and any attempt to do so will raise diff --git a/Lib/aifc.py b/Lib/aifc.py --- a/Lib/aifc.py +++ b/Lib/aifc.py @@ -692,6 +692,8 @@ return self._nframeswritten def writeframesraw(self, data): + if not isinstance(data, (bytes, bytearray)): + data = memoryview(data).cast('B') self._ensure_header_written(len(data)) nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: diff --git a/Lib/sunau.py b/Lib/sunau.py --- a/Lib/sunau.py +++ b/Lib/sunau.py @@ -415,6 +415,8 @@ return self._nframeswritten def writeframesraw(self, data): + if not isinstance(data, (bytes, bytearray)): + data = memoryview(data).cast('B') self._ensure_header_written() if self._comptype == 'ULAW': import audioop diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -146,6 +146,30 @@ self.check_file(TESTFN, self.nframes, self.frames) + def test_write_bytearray(self): + f = self.create_file(TESTFN) + f.setnframes(self.nframes) + f.writeframes(bytearray(self.frames)) + f.close() + + self.check_file(TESTFN, self.nframes, self.frames) + + def test_write_array(self): + f = self.create_file(TESTFN) + f.setnframes(self.nframes) + f.writeframes(array.array('h', self.frames)) + f.close() + + self.check_file(TESTFN, self.nframes, self.frames) + + def test_write_memoryview(self): + f = self.create_file(TESTFN) + f.setnframes(self.nframes) + f.writeframes(memoryview(self.frames)) + f.close() + + self.check_file(TESTFN, self.nframes, self.frames) + def test_incompleted_write(self): with open(TESTFN, 'wb') as testfile: testfile.write(b'ababagalamaga') diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -435,6 +435,8 @@ return self._nframeswritten def writeframesraw(self, data): + if not isinstance(data, (bytes, bytearray)): + data = memoryview(data).cast('B') self._ensure_header_written(len(data)) nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,6 +47,9 @@ Library ------- +- Issue #16685: Added support for writing any bytes-like objects in the aifc, + sunau, and wave modules. + - Issue #5202: Added support for unseekable files in the wave module. - Issue #19544 and Issue #1180: Restore global option to ignore -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 13:09:23 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 13:09:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fixed_issue_number_for_iss?= =?utf-8?b?dWUgIzgzMTEu?= Message-ID: <3dMFbR2PTqz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/932db179585d changeset: 87150:932db179585d user: Serhiy Storchaka date: Sat Nov 16 14:08:46 2013 +0200 summary: Fixed issue number for issue #8311. 
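A short editorial sketch of what the bytes-like writeframes() support tracked by issue #8311 allows; the output file name is made up and the sample values are arbitrary:

    import array
    import wave

    # writeframes()/writeframesraw() now accept any bytes-like object
    # (array.array, bytearray, memoryview, ...), not only bytes.
    frames = array.array('h', [0, 100, -100, 0])
    w = wave.open('example-out.wav', 'wb')
    w.setnchannels(1)
    w.setsampwidth(2)       # matches the 16-bit 'h' array items
    w.setframerate(8000)
    w.writeframes(frames)   # converted internally via memoryview(...).cast('B')
    w.close()
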
files: Misc/NEWS | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -47,7 +47,7 @@ Library ------- -- Issue #16685: Added support for writing any bytes-like objects in the aifc, +- Issue #8311: Added support for writing any bytes-like objects in the aifc, sunau, and wave modules. - Issue #5202: Added support for unseekable files in the wave module. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 16:36:07 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 16:36:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319586=3A_Update_r?= =?utf-8?q?emaining_deprecated_assertions_to_their_preferred_usage=2E?= Message-ID: <3dML9z5TLvz7LsQ@mail.python.org> http://hg.python.org/cpython/rev/ff471d8526a8 changeset: 87151:ff471d8526a8 user: Jason R. Coombs date: Sat Nov 16 10:35:46 2013 -0500 summary: Issue #19586: Update remaining deprecated assertions to their preferred usage. files: Lib/distutils/tests/test_upload.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -114,11 +114,11 @@ # what did we send ? headers = dict(self.last_open.req.headers) self.assertEqual(headers['Content-length'], '2087') - self.assert_(headers['Content-type'].startswith('multipart/form-data')) + self.assertTrue(headers['Content-type'].startswith('multipart/form-data')) self.assertEqual(self.last_open.req.get_method(), 'POST') expected_url = 'https://pypi.python.org/pypi' self.assertEqual(self.last_open.req.get_full_url(), expected_url) - self.assert_(b'xxx' in self.last_open.req.data) + self.assertTrue(b'xxx' in self.last_open.req.data) def test_suite(): return unittest.makeSuite(uploadTestCase) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 16:47:23 2013 From: python-checkins at python.org (jason.coombs) Date: Sat, 16 Nov 2013 16:47:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Correct_long_line?= Message-ID: <3dMLQz55mGzQSv@mail.python.org> http://hg.python.org/cpython/rev/8e7a738c3840 changeset: 87152:8e7a738c3840 user: Jason R. Coombs date: Sat Nov 16 10:47:17 2013 -0500 summary: Correct long line files: Lib/distutils/tests/test_upload.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -114,7 +114,8 @@ # what did we send ? 
headers = dict(self.last_open.req.headers) self.assertEqual(headers['Content-length'], '2087') - self.assertTrue(headers['Content-type'].startswith('multipart/form-data')) + content_type = headers['Content-type'] + self.assertTrue(content_type.startswith('multipart/form-data')) self.assertEqual(self.last_open.req.get_method(), 'POST') expected_url = 'https://pypi.python.org/pypi' self.assertEqual(self.last_open.req.get_full_url(), expected_url) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 18:11:12 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 16 Nov 2013 18:11:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2317806=3A_Added_keyword-?= =?utf-8?q?argument_support_for_=22tabsize=22_to_str/bytes=2Eexpandtabs=28?= =?utf-8?b?KS4=?= Message-ID: <3dMNHh3WQfz7Lrd@mail.python.org> http://hg.python.org/cpython/rev/82b58807f481 changeset: 87153:82b58807f481 user: Ezio Melotti date: Sat Nov 16 19:10:57 2013 +0200 summary: #17806: Added keyword-argument support for "tabsize" to str/bytes.expandtabs(). files: Doc/library/stdtypes.rst | 2 +- Lib/test/buffer_tests.py | 16 ++++++++--- Lib/test/string_tests.py | 27 ++++++++++++++----- Misc/NEWS | 3 ++ Objects/bytearrayobject.c | 2 +- Objects/bytesobject.c | 2 +- Objects/stringlib/transmogrify.h | 8 +++-- Objects/unicodeobject.c | 14 ++++++--- 8 files changed, 51 insertions(+), 23 deletions(-) diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1523,7 +1523,7 @@ at that position. -.. method:: str.expandtabs([tabsize]) +.. method:: str.expandtabs(tabsize=8) Return a copy of the string where all tab characters are replaced by one or more spaces, depending on the current column and the given tab size. 
Tab diff --git a/Lib/test/buffer_tests.py b/Lib/test/buffer_tests.py --- a/Lib/test/buffer_tests.py +++ b/Lib/test/buffer_tests.py @@ -160,14 +160,20 @@ self.marshal(b'abc\rab\tdef\ng\thi').expandtabs(8)) self.assertEqual(b'abc\rab def\ng hi', self.marshal(b'abc\rab\tdef\ng\thi').expandtabs(4)) + self.assertEqual(b'abc\r\nab def\ng hi', + self.marshal(b'abc\r\nab\tdef\ng\thi').expandtabs()) + self.assertEqual(b'abc\r\nab def\ng hi', + self.marshal(b'abc\r\nab\tdef\ng\thi').expandtabs(8)) self.assertEqual(b'abc\r\nab def\ng hi', self.marshal(b'abc\r\nab\tdef\ng\thi').expandtabs(4)) + self.assertEqual(b'abc\r\nab\r\ndef\ng\r\nhi', + self.marshal(b'abc\r\nab\r\ndef\ng\r\nhi').expandtabs(4)) + # check keyword args self.assertEqual(b'abc\rab def\ng hi', - self.marshal(b'abc\rab\tdef\ng\thi').expandtabs()) - self.assertEqual(b'abc\rab def\ng hi', - self.marshal(b'abc\rab\tdef\ng\thi').expandtabs(8)) - self.assertEqual(b'abc\r\nab\r\ndef\ng\r\nhi', - self.marshal(b'abc\r\nab\r\ndef\ng\r\nhi').expandtabs(4)) + self.marshal(b'abc\rab\tdef\ng\thi').expandtabs(tabsize=8)) + self.assertEqual(b'abc\rab def\ng hi', + self.marshal(b'abc\rab\tdef\ng\thi').expandtabs(tabsize=4)) + self.assertEqual(b' a\n b', self.marshal(b' \ta\n\tb').expandtabs(1)) self.assertRaises(TypeError, self.marshal(b'hello').expandtabs, 42, 42) diff --git a/Lib/test/string_tests.py b/Lib/test/string_tests.py --- a/Lib/test/string_tests.py +++ b/Lib/test/string_tests.py @@ -328,13 +328,26 @@ self.checkraises(TypeError, 'hello', 'upper', 42) def test_expandtabs(self): - self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', 'expandtabs') - self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', 'expandtabs', 8) - self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', 'expandtabs', 4) - self.checkequal('abc\r\nab def\ng hi', 'abc\r\nab\tdef\ng\thi', 'expandtabs', 4) - self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', 'expandtabs') - self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', 'expandtabs', 8) - self.checkequal('abc\r\nab\r\ndef\ng\r\nhi', 'abc\r\nab\r\ndef\ng\r\nhi', 'expandtabs', 4) + self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', + 'expandtabs') + self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', + 'expandtabs', 8) + self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', + 'expandtabs', 4) + self.checkequal('abc\r\nab def\ng hi', 'abc\r\nab\tdef\ng\thi', + 'expandtabs') + self.checkequal('abc\r\nab def\ng hi', 'abc\r\nab\tdef\ng\thi', + 'expandtabs', 8) + self.checkequal('abc\r\nab def\ng hi', 'abc\r\nab\tdef\ng\thi', + 'expandtabs', 4) + self.checkequal('abc\r\nab\r\ndef\ng\r\nhi', 'abc\r\nab\r\ndef\ng\r\nhi', + 'expandtabs', 4) + # check keyword args + self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', + 'expandtabs', tabsize=8) + self.checkequal('abc\rab def\ng hi', 'abc\rab\tdef\ng\thi', + 'expandtabs', tabsize=4) + self.checkequal(' a\n b', ' \ta\n\tb', 'expandtabs', 1) self.checkraises(TypeError, 'hello', 'expandtabs', 42, 42) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #17806: Added keyword-argument support for "tabsize" to + str/bytes.expandtabs(). + - Issue #17828: Output type errors in str.encode(), bytes.decode() and bytearray.decode() now direct users to codecs.encode() or codecs.decode() as appropriate. 
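A minimal editorial sketch of the new keyword form added by #17806 (not part of the patch):

    s = '01\t012\t0123\t01234'
    print(s.expandtabs())                        # default tab size of 8
    print(s.expandtabs(tabsize=4))               # tabsize can now be passed by keyword
    print(b'a\tbc\tdef'.expandtabs(tabsize=4))   # bytes and bytearray accept it too
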
diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -2805,7 +2805,7 @@ {"count", (PyCFunction)bytearray_count, METH_VARARGS, count__doc__}, {"decode", (PyCFunction)bytearray_decode, METH_VARARGS | METH_KEYWORDS, decode_doc}, {"endswith", (PyCFunction)bytearray_endswith, METH_VARARGS, endswith__doc__}, - {"expandtabs", (PyCFunction)stringlib_expandtabs, METH_VARARGS, + {"expandtabs", (PyCFunction)stringlib_expandtabs, METH_VARARGS | METH_KEYWORDS, expandtabs__doc__}, {"extend", (PyCFunction)bytearray_extend, METH_O, extend__doc__}, {"find", (PyCFunction)bytearray_find, METH_VARARGS, find__doc__}, diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -2389,7 +2389,7 @@ {"decode", (PyCFunction)bytes_decode, METH_VARARGS | METH_KEYWORDS, decode__doc__}, {"endswith", (PyCFunction)bytes_endswith, METH_VARARGS, endswith__doc__}, - {"expandtabs", (PyCFunction)stringlib_expandtabs, METH_VARARGS, + {"expandtabs", (PyCFunction)stringlib_expandtabs, METH_VARARGS | METH_KEYWORDS, expandtabs__doc__}, {"find", (PyCFunction)bytes_find, METH_VARARGS, find__doc__}, {"fromhex", (PyCFunction)bytes_fromhex, METH_VARARGS|METH_CLASS, diff --git a/Objects/stringlib/transmogrify.h b/Objects/stringlib/transmogrify.h --- a/Objects/stringlib/transmogrify.h +++ b/Objects/stringlib/transmogrify.h @@ -5,21 +5,23 @@ shared code in bytes_methods.c to cut down on duplicate code bloat. */ PyDoc_STRVAR(expandtabs__doc__, -"B.expandtabs([tabsize]) -> copy of B\n\ +"B.expandtabs(tabsize=8) -> copy of B\n\ \n\ Return a copy of B where all tab characters are expanded using spaces.\n\ If tabsize is not given, a tab size of 8 characters is assumed."); static PyObject* -stringlib_expandtabs(PyObject *self, PyObject *args) +stringlib_expandtabs(PyObject *self, PyObject *args, PyObject *kwds) { const char *e, *p; char *q; size_t i, j; PyObject *u; + static char *kwlist[] = {"tabsize", 0}; int tabsize = 8; - if (!PyArg_ParseTuple(args, "|i:expandtabs", &tabsize)) + if (!PyArg_ParseTupleAndKeywords(args, kwds, "|i:expandtabs", + kwlist, &tabsize)) return NULL; /* First pass: determine size of output string */ diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -11010,23 +11010,25 @@ } PyDoc_STRVAR(expandtabs__doc__, - "S.expandtabs([tabsize]) -> str\n\ + "S.expandtabs(tabsize=8) -> str\n\ \n\ Return a copy of S where all tab characters are expanded using spaces.\n\ If tabsize is not given, a tab size of 8 characters is assumed."); static PyObject* -unicode_expandtabs(PyObject *self, PyObject *args) +unicode_expandtabs(PyObject *self, PyObject *args, PyObject *kwds) { Py_ssize_t i, j, line_pos, src_len, incr; Py_UCS4 ch; PyObject *u; void *src_data, *dest_data; + static char *kwlist[] = {"tabsize", 0}; int tabsize = 8; int kind; int found; - if (!PyArg_ParseTuple(args, "|i:expandtabs", &tabsize)) + if (!PyArg_ParseTupleAndKeywords(args, kwds, "|i:expandtabs", + kwlist, &tabsize)) return NULL; if (PyUnicode_READY(self) == -1) @@ -13394,7 +13396,8 @@ {"title", (PyCFunction) unicode_title, METH_NOARGS, title__doc__}, {"center", (PyCFunction) unicode_center, METH_VARARGS, center__doc__}, {"count", (PyCFunction) unicode_count, METH_VARARGS, count__doc__}, - {"expandtabs", (PyCFunction) unicode_expandtabs, METH_VARARGS, expandtabs__doc__}, + {"expandtabs", (PyCFunction) unicode_expandtabs, + METH_VARARGS | METH_KEYWORDS, 
expandtabs__doc__}, {"find", (PyCFunction) unicode_find, METH_VARARGS, find__doc__}, {"partition", (PyCFunction) unicode_partition, METH_O, partition__doc__}, {"index", (PyCFunction) unicode_index, METH_VARARGS, index__doc__}, @@ -13406,7 +13409,8 @@ {"rjust", (PyCFunction) unicode_rjust, METH_VARARGS, rjust__doc__}, {"rstrip", (PyCFunction) unicode_rstrip, METH_VARARGS, rstrip__doc__}, {"rpartition", (PyCFunction) unicode_rpartition, METH_O, rpartition__doc__}, - {"splitlines", (PyCFunction) unicode_splitlines, METH_VARARGS | METH_KEYWORDS, splitlines__doc__}, + {"splitlines", (PyCFunction) unicode_splitlines, + METH_VARARGS | METH_KEYWORDS, splitlines__doc__}, {"strip", (PyCFunction) unicode_strip, METH_VARARGS, strip__doc__}, {"swapcase", (PyCFunction) unicode_swapcase, METH_NOARGS, swapcase__doc__}, {"translate", (PyCFunction) unicode_translate, METH_O, translate__doc__}, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 19:08:08 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:08:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_current_opcodes=2C_as_imp?= =?utf-8?q?lemented_by_Alexandre?= Message-ID: <3dMPYN06tMz7Lty@mail.python.org> http://hg.python.org/peps/rev/a6d05d3fe0cd changeset: 5274:a6d05d3fe0cd user: Antoine Pitrou date: Sat Nov 16 19:08:03 2013 +0100 summary: Add current opcodes, as implemented by Alexandre files: pep-3154.txt | 33 +++++++++++++++++++++++++++++++++ 1 files changed, 33 insertions(+), 0 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -143,6 +143,39 @@ would make many pickles smaller. +Summary of new opcodes +====================== + +* ``SHORT_BINUNICODE``: push a utf8-encoded str object with a one-byte + size prefix (therefore less than 256 bytes long). + +* ``BINUNICODE8``: push a utf8-encoded str object with a eight-byte + size prefix (for strings longer than 2**32 bytes, which therefore cannot + be serialized using ``BINUNICODE``). + +* ``BINBYTES8``: push a bytes object with a eight-byte size prefix + (for bytes objects longer than 2**32 bytes, which therefore cannot be + serialized using ``BINBYTES``). + +* ``EMPTY_SET``: push a new empty set object on the stack. + +* ``ADDITEMS``: add the topmost stack items to the set (to be used with + ``EMPTY_SET``). + +* ``EMPTY_FROZENSET``: push a new empty frozenset object on the stack. + +* ``FROZENSET``: create a frozenset object from the topmost stack items, + and push it on the stack. + +* ``NEWOBJ_EX``: take the three topmost stack items ``cls``, ``args`` + and ``kwargs``, and push the result of calling + ``cls.__new__(*args, **kwargs)``. + +* ``STACK_GLOBAL``: take the two topmost stack items ``module_name`` and + ``qualname``, and push the result of looking up the dotted ``qualname`` + in the module named ``module_name``. 
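A rough editorial sketch of the lookup that STACK_GLOBAL is meant to perform, resolving a dotted ``qualname`` inside a named module; the helper name is made up and this is not the pickle implementation itself:

    import functools
    import importlib

    def stack_global_lookup(module_name, qualname):
        # Import the module, then walk the dotted qualname with getattr,
        # which is what allows nested classes and methods to be found by name.
        module = importlib.import_module(module_name)
        return functools.reduce(getattr, qualname.split('.'), module)

    # e.g. stack_global_lookup('collections', 'OrderedDict') returns the
    # collections.OrderedDict class.
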
+ + Alternative ideas ================= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 19:10:14 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:10:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Wording_tweaks?= Message-ID: <3dMPbp1sYYz7Lm1@mail.python.org> http://hg.python.org/peps/rev/b49b9185b139 changeset: 5275:b49b9185b139 user: Antoine Pitrou date: Sat Nov 16 19:10:10 2013 +0100 summary: Wording tweaks files: pep-3154.txt | 9 ++++++--- 1 files changed, 6 insertions(+), 3 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -106,8 +106,8 @@ The ``__qualname__`` attribute from :pep:`3155` makes it possible to lookup many more objects by name. Making the GLOBAL_STACK opcode accept -dot-separated names, or adding a special GETATTR opcode, would allow the -standard pickle implementation to support all those kinds of objects. +dot-separated names would allow the standard pickle implementation to +support all those kinds of objects. 64-bit opcodes for large objects -------------------------------- @@ -132,7 +132,7 @@ Currently, classes whose __new__ mandates the use of keyword-only arguments can not be pickled (or, rather, unpickled) [3]_. Both a new -special method (``__getnewargs_ex__`` ?) and a new opcode (NEWOBJEX ?) +special method (``__getnewargs_ex__`` ?) and a new opcode (NEWOBJ_EX ?) are needed. Better string encoding @@ -146,6 +146,9 @@ Summary of new opcodes ====================== +These reflect the state of the proposed implementation (thanks mostly +to Alexandre Vassalotti's work): + * ``SHORT_BINUNICODE``: push a utf8-encoded str object with a one-byte size prefix (therefore less than 256 bytes long). -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 19:32:43 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:32:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Fix_an_example?= Message-ID: <3dMQ5l5ltVzQlC@mail.python.org> http://hg.python.org/peps/rev/e7f328e9a005 changeset: 5276:e7f328e9a005 user: Antoine Pitrou date: Sat Nov 16 19:32:39 2013 +0100 summary: Fix an example files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -217,7 +217,7 @@ '/' >>> p.parts - >>> p.relative('/home/antoine') + >>> p.relative_to('/home/antoine') PosixPath('pathlib/setup.py') >>> p.exists() True -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 19:50:56 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:50:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?cGVwczogS2lsbCBhc19ieXRlcygp?= Message-ID: <3dMQVm5Sjzz7Ljh@mail.python.org> http://hg.python.org/peps/rev/091f768e34e6 changeset: 5277:091f768e34e6 user: Antoine Pitrou date: Sat Nov 16 19:50:52 2013 +0100 summary: Kill as_bytes() files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -343,7 +343,7 @@ 'c:/windows' To get the bytes representation (which might be useful under Unix systems), -call ``bytes()`` on it, or use the ``as_bytes()`` method:: +call ``bytes()`` on it:: >>> bytes(p) b'/home/antoine/pathlib/setup.py' -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 19:55:34 2013 From: python-checkins at 
python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:55:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?cGVwczogS2lsbCByYXdfb3Blbigp?= Message-ID: <3dMQc64RKzz7Ljh@mail.python.org> http://hg.python.org/peps/rev/586d40e32260 changeset: 5278:586d40e32260 user: Antoine Pitrou date: Sat Nov 16 19:55:30 2013 +0100 summary: Kill raw_open() files: pep-0428.txt | 6 ------ 1 files changed, 0 insertions(+), 6 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -630,12 +630,6 @@ ... '#!/usr/bin/env python3\n' -The ``raw_open()`` method, on the other hand, is similar to ``os.open``:: - - >>> fd = p.raw_open(os.O_RDONLY) - >>> os.read(fd, 15) - b'#!/usr/bin/env ' - Filesystem modification ----------------------- -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 19:57:48 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 19:57:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Point_to_readthedocs_for_deta?= =?utf-8?q?iled_API_spec?= Message-ID: <3dMQfh6G33z7Ljh@mail.python.org> http://hg.python.org/peps/rev/dcfef95bfc74 changeset: 5279:dcfef95bfc74 user: Antoine Pitrou date: Sat Nov 16 19:57:44 2013 +0100 summary: Point to readthedocs for detailed API spec files: pep-0428.txt | 5 +++++ 1 files changed, 5 insertions(+), 0 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -639,6 +639,11 @@ ``chmod()``, ``lchmod()``, ``symlink_to()``. More operations could be provided, for example some of the functionality of the shutil module. +Detailed documentation of the proposed API can be found at the `pathlib +docs`_. + +.. _pathlib docs: https://pathlib.readthedocs.org/en/pep428/ + Discussion ========== -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 20:08:13 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 20:08:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Implicit_iteration_is_replace?= =?utf-8?q?d_by_the_iterdir=28=29_method?= Message-ID: <3dMQtj4p2zz7Lk2@mail.python.org> http://hg.python.org/peps/rev/d7a201b5c9fc changeset: 5280:d7a201b5c9fc user: Antoine Pitrou date: Sat Nov 16 20:08:09 2013 +0100 summary: Implicit iteration is replaced by the iterdir() method files: pep-0428.txt | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -589,10 +589,11 @@ Directory walking ----------------- -Simple (non-recursive) directory access is done by iteration:: +Simple (non-recursive) directory access is done by calling the iterdir() +method, which returns an iterator over the child paths:: >>> p = Path('docs') - >>> for child in p: child + >>> for child in p.iterdir(): child ... 
PosixPath('docs/conf.py') PosixPath('docs/_templates') @@ -605,7 +606,7 @@ This allows simple filtering through list comprehensions:: >>> p = Path('.') - >>> [child for child in p if child.is_dir()] + >>> [child for child in p.iterdir() if child.is_dir()] [PosixPath('.hg'), PosixPath('docs'), PosixPath('dist'), PosixPath('__pycache__'), PosixPath('build')] Simple and recursive globbing is also provided:: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 22:31:53 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 16 Nov 2013 22:31:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_NTPath_becomes_WindowsPath?= Message-ID: <3dMV4T5Fxrz7LlY@mail.python.org> http://hg.python.org/peps/rev/c381244b5e63 changeset: 5281:c381244b5e63 user: Antoine Pitrou date: Sat Nov 16 22:31:45 2013 +0100 summary: NTPath becomes WindowsPath files: pep-0428.txt | 100 +++++++++++++++++++------------------- 1 files changed, 50 insertions(+), 50 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -98,11 +98,11 @@ | | | | | | v | v - +---------------+ | +------------+ - | | | | | - | PurePosixPath | | | PureNTPath | - | | | | | - +---------------+ | +------------+ + +---------------+ | +-----------------+ + | | | | | + | PurePosixPath | | | PureWindowsPath | + | | | | | + +---------------+ | +-----------------+ | v | | +------+ | | | | | @@ -112,11 +112,11 @@ | | | | | | | | v v v v - +-----------+ +--------+ - | | | | - | PosixPath | | NTPath | - | | | | - +-----------+ +--------+ + +-----------+ +-------------+ + | | | | + | PosixPath | | WindowsPath | + | | | | + +-----------+ +-------------+ This hierarchy divides path classes along two dimensions: @@ -132,15 +132,15 @@ other systems (``os.name``'s terminology is re-used here). Any pure class can be instantiated on any system: for example, you can -manipulate ``PurePosixPath`` objects under Windows, ``PureNTPath`` objects -under Unix, and so on. However, concrete classes can only be instantiated -on a matching system: indeed, it would be error-prone to start doing I/O -with ``NTPath`` objects under Unix, or vice-versa. +manipulate ``PurePosixPath`` objects under Windows, ``PureWindowsPath`` +objects under Unix, and so on. However, concrete classes can only be +instantiated on a matching system: indeed, it would be error-prone to start +doing I/O with ``WindowsPath`` objects under Unix, or vice-versa. Furthermore, there are two base classes which also act as system-dependent factories: ``PurePath`` will instantiate either a ``PurePosixPath`` or a -``PureNTPath`` depending on the operating system. Similarly, ``Path`` -will instantiate either a ``PosixPath`` or a ``NTPath``. +``PureWindowsPath`` depending on the operating system. Similarly, ``Path`` +will instantiate either a ``PosixPath`` or a ``WindowsPath``. It is expected that, in most uses, using the ``Path`` class is adequate, which is why it has the shortest name of all. 
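The pure/concrete split described above can be tried without touching the filesystem. A short sketch, assuming a ``pathlib`` module that implements the PEP (the standalone package, or the module that later landed in CPython 3.4); pure paths never do I/O, so the Windows flavour works on any platform::

    from pathlib import PurePosixPath, PureWindowsPath, Path

    w = PureWindowsPath('c:/Downloads/pathlib.tar.gz')
    print(w.drive, w.name, w.suffix)       # c: pathlib.tar.gz .gz
    print(w.with_suffix('.bz2'))           # c:\Downloads\pathlib.tar.bz2

    p = PurePosixPath('/home/antoine/pathlib/setup.py')
    print(p.relative_to('/home/antoine'))  # pathlib/setup.py

    # Concrete paths add I/O; iterdir() lists children explicitly.
    here = Path('.')
    print([child for child in here.iterdir() if child.is_dir()])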
@@ -189,17 +189,17 @@ Comparing and ordering Windows path objects is case-insensitive:: - >>> PureNTPath('a') == PureNTPath('A') + >>> PureWindowsPath('a') == PureWindowsPath('A') True Paths of different flavours always compare unequal, and cannot be ordered:: - >>> PurePosixPath('a') == PureNTPath('a') + >>> PurePosixPath('a') == PureWindowsPath('a') False - >>> PurePosixPath('a') < PureNTPath('a') + >>> PurePosixPath('a') < PureWindowsPath('a') Traceback (most recent call last): File "", line 1, in - TypeError: unorderable types: PurePosixPath() < PureNTPath() + TypeError: unorderable types: PurePosixPath() < PureWindowsPath() Useful notations @@ -280,14 +280,14 @@ However, with Windows paths, the drive is retained as necessary:: - >>> PureNTPath('c:/foo', '/Windows') - PureNTPath('c:\\Windows') - >>> PureNTPath('c:/foo', 'd:') - PureNTPath('d:') + >>> PureWindowsPath('c:/foo', '/Windows') + PureWindowsPath('c:\\Windows') + >>> PureWindowsPath('c:/foo', 'd:') + PureWindowsPath('d:') Also, path separators are normalized to the platform default:: - >>> PureNTPath('a/b') == PureNTPath('a\\b') + >>> PureWindowsPath('a/b') == PureWindowsPath('a\\b') True Extraneous path separators and ``"."`` components are eliminated, but not @@ -302,8 +302,8 @@ flavour. They are always retained on Windows paths (because of the UNC notation):: - >>> PureNTPath('//some/path') - PureNTPath('\\\\some\\path\\') + >>> PureWindowsPath('//some/path') + PureWindowsPath('\\\\some\\path\\') On POSIX, they are collapsed except if there are exactly two leading slashes, which is a special case in the POSIX specification on `pathname resolution`_ @@ -332,7 +332,7 @@ >>> p = PurePath('/home/antoine/pathlib/setup.py') >>> str(p) '/home/antoine/pathlib/setup.py' - >>> p = PureNTPath('c:/windows') + >>> p = PureWindowsPath('c:/windows') >>> str(p) 'c:\\windows' @@ -353,7 +353,7 @@ >>> p = PurePosixPath('/etc/passwd') >>> p.as_uri() 'file:///etc/passwd' - >>> p = PureNTPath('c:/Windows') + >>> p = PureWindowsPath('c:/Windows') >>> p.as_uri() 'file:///c:/Windows' @@ -363,7 +363,7 @@ Seven simple properties are provided on every path (each can be empty):: - >>> p = PureNTPath('c:/Downloads/pathlib.tar.gz') + >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.drive 'c:' >>> p.root @@ -414,31 +414,31 @@ The ``with_name()`` method returns a new path, with the name changed:: - >>> p = PureNTPath('c:/Downloads/pathlib.tar.gz') + >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.with_name('setup.py') - PureNTPath('c:\\Downloads\\setup.py') + PureWindowsPath('c:\\Downloads\\setup.py') It fails with a ``ValueError`` if the path doesn't have an actual name:: - >>> p = PureNTPath('c:/') + >>> p = PureWindowsPath('c:/') >>> p.with_name('setup.py') Traceback (most recent call last): File "", line 1, in File "pathlib.py", line 875, in with_name raise ValueError("%r has an empty name" % (self,)) - ValueError: PureNTPath('c:\\') has an empty name + ValueError: PureWindowsPath('c:\\') has an empty name >>> p.name '' The ``with_suffix()`` method returns a new path with the suffix changed. 
However, if the path has no suffix, the new suffix is added:: - >>> p = PureNTPath('c:/Downloads/pathlib.tar.gz') + >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.with_suffix('.bz2') - PureNTPath('c:\\Downloads\\pathlib.tar.bz2') - >>> p = PureNTPath('README') + PureWindowsPath('c:\\Downloads\\pathlib.tar.bz2') + >>> p = PureWindowsPath('README') >>> p.with_suffix('.bz2') - PureNTPath('README.bz2') + PureWindowsPath('README.bz2') Making the path relative ^^^^^^^^^^^^^^^^^^^^^^^^ @@ -478,9 +478,9 @@ Windows paths handle the drive and the root as a single path component:: - >>> p = PureNTPath('c:/setup.py') + >>> p = PureWindowsPath('c:/setup.py') >>> p.parts - + >>> p.root '\\' >>> p.parts[0] @@ -491,21 +491,21 @@ The ``parent()`` method returns an ancestor of the path:: >>> p.parent() - PureNTPath('c:\\python33\\bin') + PureWindowsPath('c:\\python33\\bin') >>> p.parent(2) - PureNTPath('c:\\python33') + PureWindowsPath('c:\\python33') >>> p.parent(3) - PureNTPath('c:\\') + PureWindowsPath('c:\\') The ``parents()`` method automates repeated invocations of ``parent()``, until the anchor is reached:: - >>> p = PureNTPath('c:/python33/bin/python.exe') + >>> p = PureWindowsPath('c:/python33/bin/python.exe') >>> for parent in p.parents(): parent ... - PureNTPath('c:\\python33\\bin') - PureNTPath('c:\\python33') - PureNTPath('c:\\') + PureWindowsPath('c:\\python33\\bin') + PureWindowsPath('c:\\python33') + PureWindowsPath('c:\\') Querying @@ -519,15 +519,15 @@ ``match()`` matches the path against a glob pattern:: - >>> PureNTPath('c:/PATHLIB/setup.py').match('c:*lib/*.PY') + >>> PureWindowsPath('c:/PATHLIB/setup.py').match('c:*lib/*.PY') True ``normcase()`` returns a case-folded version of the path for NT paths:: >>> PurePosixPath('CAPS').normcase() PurePosixPath('CAPS') - >>> PureNTPath('CAPS').normcase() - PureNTPath('caps') + >>> PureWindowsPath('CAPS').normcase() + PureWindowsPath('caps') Concrete paths API -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 16 22:54:15 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 22:54:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTkx?= =?utf-8?q?=3A_Use_specific_asserts_in_ctype_tests=2E?= Message-ID: <3dMVZH1WWzz7LlY@mail.python.org> http://hg.python.org/cpython/rev/98adbaccaae3 changeset: 87154:98adbaccaae3 branch: 3.3 parent: 87146:db6ea9abd317 user: Serhiy Storchaka date: Sat Nov 16 23:51:26 2013 +0200 summary: Issue #19591: Use specific asserts in ctype tests. 
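The conversions in this patch are mechanical, but the payoff is in the failure messages: the specific assert methods report the actual operands instead of a bare ``False is not true``. A minimal illustration, not part of the patch, using only ``unittest`` methods available in both 2.7 and 3.x::

    import unittest

    class SpecificAsserts(unittest.TestCase):
        def test_messages(self):
            a, b = object(), object()
            # assertIs/assertIsNot name both objects on failure,
            # unlike assertTrue(x is y).
            self.assertIs(a, a)
            self.assertIsNot(a, b)
            # assertIn reports the container, assertGreater both numbers.
            self.assertIn("append", dir(list))
            self.assertGreater(3, 2)

    if __name__ == "__main__":
        unittest.main()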
files: Lib/ctypes/test/test_arrays.py | 6 +- Lib/ctypes/test/test_as_parameter.py | 2 +- Lib/ctypes/test/test_buffers.py | 10 +- Lib/ctypes/test/test_byteswap.py | 52 ++++++++-------- Lib/ctypes/test/test_cast.py | 6 +- Lib/ctypes/test/test_frombuffer.py | 2 +- Lib/ctypes/test/test_funcptr.py | 2 +- Lib/ctypes/test/test_functions.py | 2 +- Lib/ctypes/test/test_loading.py | 2 +- Lib/ctypes/test/test_numbers.py | 6 +- Lib/ctypes/test/test_parameters.py | 4 +- Lib/ctypes/test/test_pointers.py | 8 +- Lib/ctypes/test/test_python_api.py | 2 +- Lib/ctypes/test/test_refcounts.py | 14 ++-- Lib/ctypes/test/test_strings.py | 38 +++++----- Lib/ctypes/test/test_structures.py | 13 +-- 16 files changed, 84 insertions(+), 85 deletions(-) diff --git a/Lib/ctypes/test/test_arrays.py b/Lib/ctypes/test/test_arrays.py --- a/Lib/ctypes/test/test_arrays.py +++ b/Lib/ctypes/test/test_arrays.py @@ -84,8 +84,8 @@ self.assertEqual(values, [1, 2, 3, 4, 5]) def test_classcache(self): - self.assertTrue(not ARRAY(c_int, 3) is ARRAY(c_int, 4)) - self.assertTrue(ARRAY(c_int, 3) is ARRAY(c_int, 3)) + self.assertIsNot(ARRAY(c_int, 3), ARRAY(c_int, 4)) + self.assertIs(ARRAY(c_int, 3), ARRAY(c_int, 3)) def test_from_address(self): # Failed with 0.9.8, reported by JUrner @@ -125,7 +125,7 @@ # Create a new array type based on it: t1 = my_int * 1 t2 = my_int * 1 - self.assertTrue(t1 is t2) + self.assertIs(t1, t2) def test_subclass(self): class T(Array): diff --git a/Lib/ctypes/test/test_as_parameter.py b/Lib/ctypes/test/test_as_parameter.py --- a/Lib/ctypes/test/test_as_parameter.py +++ b/Lib/ctypes/test/test_as_parameter.py @@ -134,7 +134,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, int)) + self.assertIsInstance(value, int) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_buffers.py b/Lib/ctypes/test/test_buffers.py --- a/Lib/ctypes/test/test_buffers.py +++ b/Lib/ctypes/test/test_buffers.py @@ -7,12 +7,12 @@ b = create_string_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_char)) - self.assertTrue(type(b[0]) is bytes) + self.assertIs(type(b[0]), bytes) b = create_string_buffer(b"abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_char)) - self.assertTrue(type(b[0]) is bytes) + self.assertIs(type(b[0]), bytes) self.assertEqual(b[0], b"a") self.assertEqual(b[:], b"abc\0") self.assertEqual(b[::], b"abc\0") @@ -33,12 +33,12 @@ b = create_unicode_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) b = create_unicode_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) self.assertEqual(b[0], "a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") @@ -50,7 +50,7 @@ b = create_unicode_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) self.assertEqual(b[0], "a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") diff --git a/Lib/ctypes/test/test_byteswap.py b/Lib/ctypes/test/test_byteswap.py --- a/Lib/ctypes/test/test_byteswap.py +++ b/Lib/ctypes/test/test_byteswap.py @@ -23,11 +23,11 @@ def test_endian_short(self): if sys.byteorder == "little": - self.assertTrue(c_short.__ctype_le__ 
is c_short) - self.assertTrue(c_short.__ctype_be__.__ctype_le__ is c_short) + self.assertIs(c_short.__ctype_le__, c_short) + self.assertIs(c_short.__ctype_be__.__ctype_le__, c_short) else: - self.assertTrue(c_short.__ctype_be__ is c_short) - self.assertTrue(c_short.__ctype_le__.__ctype_be__ is c_short) + self.assertIs(c_short.__ctype_be__, c_short) + self.assertIs(c_short.__ctype_le__.__ctype_be__, c_short) s = c_short.__ctype_be__(0x1234) self.assertEqual(bin(struct.pack(">h", 0x1234)), "1234") self.assertEqual(bin(s), "1234") @@ -50,11 +50,11 @@ def test_endian_int(self): if sys.byteorder == "little": - self.assertTrue(c_int.__ctype_le__ is c_int) - self.assertTrue(c_int.__ctype_be__.__ctype_le__ is c_int) + self.assertIs(c_int.__ctype_le__, c_int) + self.assertIs(c_int.__ctype_be__.__ctype_le__, c_int) else: - self.assertTrue(c_int.__ctype_be__ is c_int) - self.assertTrue(c_int.__ctype_le__.__ctype_be__ is c_int) + self.assertIs(c_int.__ctype_be__, c_int) + self.assertIs(c_int.__ctype_le__.__ctype_be__, c_int) s = c_int.__ctype_be__(0x12345678) self.assertEqual(bin(struct.pack(">i", 0x12345678)), "12345678") @@ -78,11 +78,11 @@ def test_endian_longlong(self): if sys.byteorder == "little": - self.assertTrue(c_longlong.__ctype_le__ is c_longlong) - self.assertTrue(c_longlong.__ctype_be__.__ctype_le__ is c_longlong) + self.assertIs(c_longlong.__ctype_le__, c_longlong) + self.assertIs(c_longlong.__ctype_be__.__ctype_le__, c_longlong) else: - self.assertTrue(c_longlong.__ctype_be__ is c_longlong) - self.assertTrue(c_longlong.__ctype_le__.__ctype_be__ is c_longlong) + self.assertIs(c_longlong.__ctype_be__, c_longlong) + self.assertIs(c_longlong.__ctype_le__.__ctype_be__, c_longlong) s = c_longlong.__ctype_be__(0x1234567890ABCDEF) self.assertEqual(bin(struct.pack(">q", 0x1234567890ABCDEF)), "1234567890ABCDEF") @@ -106,11 +106,11 @@ def test_endian_float(self): if sys.byteorder == "little": - self.assertTrue(c_float.__ctype_le__ is c_float) - self.assertTrue(c_float.__ctype_be__.__ctype_le__ is c_float) + self.assertIs(c_float.__ctype_le__, c_float) + self.assertIs(c_float.__ctype_be__.__ctype_le__, c_float) else: - self.assertTrue(c_float.__ctype_be__ is c_float) - self.assertTrue(c_float.__ctype_le__.__ctype_be__ is c_float) + self.assertIs(c_float.__ctype_be__, c_float) + self.assertIs(c_float.__ctype_le__.__ctype_be__, c_float) s = c_float(math.pi) self.assertEqual(bin(struct.pack("f", math.pi)), bin(s)) # Hm, what's the precision of a float compared to a double? 
@@ -124,11 +124,11 @@ def test_endian_double(self): if sys.byteorder == "little": - self.assertTrue(c_double.__ctype_le__ is c_double) - self.assertTrue(c_double.__ctype_be__.__ctype_le__ is c_double) + self.assertIs(c_double.__ctype_le__, c_double) + self.assertIs(c_double.__ctype_be__.__ctype_le__, c_double) else: - self.assertTrue(c_double.__ctype_be__ is c_double) - self.assertTrue(c_double.__ctype_le__.__ctype_be__ is c_double) + self.assertIs(c_double.__ctype_be__, c_double) + self.assertIs(c_double.__ctype_le__.__ctype_be__, c_double) s = c_double(math.pi) self.assertEqual(s.value, math.pi) self.assertEqual(bin(struct.pack("d", math.pi)), bin(s)) @@ -140,14 +140,14 @@ self.assertEqual(bin(struct.pack(">d", math.pi)), bin(s)) def test_endian_other(self): - self.assertTrue(c_byte.__ctype_le__ is c_byte) - self.assertTrue(c_byte.__ctype_be__ is c_byte) + self.assertIs(c_byte.__ctype_le__, c_byte) + self.assertIs(c_byte.__ctype_be__, c_byte) - self.assertTrue(c_ubyte.__ctype_le__ is c_ubyte) - self.assertTrue(c_ubyte.__ctype_be__ is c_ubyte) + self.assertIs(c_ubyte.__ctype_le__, c_ubyte) + self.assertIs(c_ubyte.__ctype_be__, c_ubyte) - self.assertTrue(c_char.__ctype_le__ is c_char) - self.assertTrue(c_char.__ctype_be__ is c_char) + self.assertIs(c_char.__ctype_le__, c_char) + self.assertIs(c_char.__ctype_be__, c_char) def test_struct_fields_1(self): if sys.byteorder == "little": diff --git a/Lib/ctypes/test/test_cast.py b/Lib/ctypes/test/test_cast.py --- a/Lib/ctypes/test/test_cast.py +++ b/Lib/ctypes/test/test_cast.py @@ -38,14 +38,14 @@ p = cast(array, POINTER(c_char_p)) # array and p share a common _objects attribute - self.assertTrue(p._objects is array._objects) + self.assertIs(p._objects, array._objects) self.assertEqual(array._objects, {'0': b"foo bar", id(array): array}) p[0] = b"spam spam" self.assertEqual(p._objects, {'0': b"spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) p[1] = b"foo bar" self.assertEqual(p._objects, {'1': b'foo bar', '0': b"spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) def test_other(self): p = cast((c_int * 4)(1, 2, 3, 4), POINTER(c_int)) diff --git a/Lib/ctypes/test/test_frombuffer.py b/Lib/ctypes/test/test_frombuffer.py --- a/Lib/ctypes/test/test_frombuffer.py +++ b/Lib/ctypes/test/test_frombuffer.py @@ -23,7 +23,7 @@ a[0], a[-1] = 200, -200 self.assertEqual(x[:], a.tolist()) - self.assertTrue(a in x._objects.values()) + self.assertIn(a, x._objects.values()) self.assertRaises(ValueError, c_int.from_buffer, a, -1) diff --git a/Lib/ctypes/test/test_funcptr.py b/Lib/ctypes/test/test_funcptr.py --- a/Lib/ctypes/test/test_funcptr.py +++ b/Lib/ctypes/test/test_funcptr.py @@ -75,7 +75,7 @@ ## "lpfnWndProc", WNDPROC_2(wndproc)) # instead: - self.assertTrue(WNDPROC is WNDPROC_2) + self.assertIs(WNDPROC, WNDPROC_2) # 'wndclass.lpfnWndProc' leaks 94 references. Why? 
self.assertEqual(wndclass.lpfnWndProc(1, 2, 3, 4), 10) diff --git a/Lib/ctypes/test/test_functions.py b/Lib/ctypes/test/test_functions.py --- a/Lib/ctypes/test/test_functions.py +++ b/Lib/ctypes/test/test_functions.py @@ -306,7 +306,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, int)) + self.assertIsInstance(value, int) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_loading.py b/Lib/ctypes/test/test_loading.py --- a/Lib/ctypes/test/test_loading.py +++ b/Lib/ctypes/test/test_loading.py @@ -43,7 +43,7 @@ if os.name in ("nt", "ce"): def test_load_library(self): - self.assertFalse(libc_name is None) + self.assertIsNotNone(libc_name) if is_resource_enabled("printing"): print(find_library("kernel32")) print(find_library("user32")) diff --git a/Lib/ctypes/test/test_numbers.py b/Lib/ctypes/test/test_numbers.py --- a/Lib/ctypes/test/test_numbers.py +++ b/Lib/ctypes/test/test_numbers.py @@ -181,10 +181,10 @@ a = array(t._type_, [3.14]) v = t.from_address(a.buffer_info()[0]) self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) a[0] = 2.3456e17 self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) def test_char_from_address(self): from ctypes import c_char @@ -194,7 +194,7 @@ a[0] = ord('x') v = c_char.from_address(a.buffer_info()[0]) self.assertEqual(v.value, b'x') - self.assertTrue(type(v) is c_char) + self.assertIs(type(v), c_char) a[0] = ord('?') self.assertEqual(v.value, b'?') diff --git a/Lib/ctypes/test/test_parameters.py b/Lib/ctypes/test/test_parameters.py --- a/Lib/ctypes/test/test_parameters.py +++ b/Lib/ctypes/test/test_parameters.py @@ -54,7 +54,7 @@ # c_char_p.from_param on a Python String packs the string # into a cparam object s = b"123" - self.assertTrue(c_char_p.from_param(s)._obj is s) + self.assertIs(c_char_p.from_param(s)._obj, s) # new in 0.9.1: convert (encode) unicode to ascii self.assertEqual(c_char_p.from_param(b"123")._obj, b"123") @@ -64,7 +64,7 @@ # calling c_char_p.from_param with a c_char_p instance # returns the argument itself: a = c_char_p(b"123") - self.assertTrue(c_char_p.from_param(a) is a) + self.assertIs(c_char_p.from_param(a), a) def test_cw_strings(self): from ctypes import byref diff --git a/Lib/ctypes/test/test_pointers.py b/Lib/ctypes/test/test_pointers.py --- a/Lib/ctypes/test/test_pointers.py +++ b/Lib/ctypes/test/test_pointers.py @@ -78,7 +78,7 @@ ## i = c_int(42) ## callback(byref(i)) -## self.assertTrue(i.value == 84) +## self.assertEqual(i.value, 84) doit(callback) ## print self.result @@ -91,11 +91,11 @@ i = ct(42) p = pointer(i) ## print type(p.contents), ct - self.assertTrue(type(p.contents) is ct) + self.assertIs(type(p.contents), ct) # p.contents is the same as p[0] ## print p.contents -## self.assertTrue(p.contents == 42) -## self.assertTrue(p[0] == 42) +## self.assertEqual(p.contents, 42) +## self.assertEqual(p[0], 42) self.assertRaises(TypeError, delitem, p, 0) diff --git a/Lib/ctypes/test/test_python_api.py b/Lib/ctypes/test/test_python_api.py --- a/Lib/ctypes/test/test_python_api.py +++ b/Lib/ctypes/test/test_python_api.py @@ -64,7 +64,7 @@ ref = grc(s) # id(python-object) is the address pyobj = PyObj_FromPtr(id(s)) - self.assertTrue(s is pyobj) + self.assertIs(s, pyobj) self.assertEqual(grc(s), ref + 1) del pyobj diff --git a/Lib/ctypes/test/test_refcounts.py b/Lib/ctypes/test/test_refcounts.py --- a/Lib/ctypes/test/test_refcounts.py +++ b/Lib/ctypes/test/test_refcounts.py @@ 
-26,7 +26,7 @@ self.assertEqual(grc(callback), 2) cb = MyCallback(callback) - self.assertTrue(grc(callback) > 2) + self.assertGreater(grc(callback), 2) result = f(-10, cb) self.assertEqual(result, -18) cb = None @@ -46,15 +46,15 @@ # the CFuncPtr instance holds at least one refcount on func: f = OtherCallback(func) - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del f - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # but now it must be gone gc.collect() - self.assertTrue(grc(func) == 2) + self.assertEqual(grc(func), 2) class X(ctypes.Structure): _fields_ = [("a", OtherCallback)] @@ -62,11 +62,11 @@ x.a = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del x - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # and now it must be gone again gc.collect() @@ -75,7 +75,7 @@ f = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # create a cycle f.cycle = f diff --git a/Lib/ctypes/test/test_strings.py b/Lib/ctypes/test/test_strings.py --- a/Lib/ctypes/test/test_strings.py +++ b/Lib/ctypes/test/test_strings.py @@ -115,24 +115,24 @@ # New in releases later than 0.4.0: # c_string(number) returns an empty string of size number - self.assertTrue(len(c_string(32).raw) == 32) + self.assertEqual(len(c_string(32).raw), 32) self.assertRaises(ValueError, c_string, -1) self.assertRaises(ValueError, c_string, 0) # These tests fail, because it is no longer initialized -## self.assertTrue(c_string(2).value == "") -## self.assertTrue(c_string(2).raw == "\000\000") - self.assertTrue(c_string(2).raw[-1] == "\000") - self.assertTrue(len(c_string(2).raw) == 2) +## self.assertEqual(c_string(2).value, "") +## self.assertEqual(c_string(2).raw, "\000\000") + self.assertEqual(c_string(2).raw[-1], "\000") + self.assertEqual(len(c_string(2).raw), 2) def XX_test_initialized_strings(self): - self.assertTrue(c_string("ab", 4).raw[:2] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:-1] == "ba") - self.assertTrue(c_string("ab", 4).raw[:2:2] == "a") - self.assertTrue(c_string("ab", 4).raw[-1] == "\000") - self.assertTrue(c_string("ab", 2).raw == "a\000") + self.assertEqual(c_string("ab", 4).raw[:2], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:-1], "ba") + self.assertEqual(c_string("ab", 4).raw[:2:2], "a") + self.assertEqual(c_string("ab", 4).raw[-1], "\000") + self.assertEqual(c_string("ab", 2).raw, "a\000") def XX_test_toolong(self): cs = c_string("abcdef") @@ -163,22 +163,22 @@ # XXX This behaviour is about to change: # len returns the size of the internal buffer in bytes. # This includes the terminating NUL character. - self.assertTrue(sizeof(cs) == 14) + self.assertEqual(sizeof(cs), 14) # The value property is the string up to the first terminating NUL. 
- self.assertTrue(cs.value == "abcdef") - self.assertTrue(c_wstring("abc\000def").value == "abc") + self.assertEqual(cs.value, "abcdef") + self.assertEqual(c_wstring("abc\000def").value, "abc") - self.assertTrue(c_wstring("abc\000def").value == "abc") + self.assertEqual(c_wstring("abc\000def").value, "abc") # The raw property is the total buffer contents: - self.assertTrue(cs.raw == "abcdef\000") - self.assertTrue(c_wstring("abc\000def").raw == "abc\000def\000") + self.assertEqual(cs.raw, "abcdef\000") + self.assertEqual(c_wstring("abc\000def").raw, "abc\000def\000") # We can change the value: cs.value = "ab" - self.assertTrue(cs.value == "ab") - self.assertTrue(cs.raw == "ab\000\000\000\000\000") + self.assertEqual(cs.value, "ab") + self.assertEqual(cs.raw, "ab\000\000\000\000\000") self.assertRaises(TypeError, c_wstring, "123") self.assertRaises(ValueError, c_wstring, 0) diff --git a/Lib/ctypes/test/test_structures.py b/Lib/ctypes/test/test_structures.py --- a/Lib/ctypes/test/test_structures.py +++ b/Lib/ctypes/test/test_structures.py @@ -374,9 +374,9 @@ ## class X(Structure): ## _fields_ = [] - self.assertTrue("in_dll" in dir(type(Structure))) - self.assertTrue("from_address" in dir(type(Structure))) - self.assertTrue("in_dll" in dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) + self.assertIn("from_address", dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) def test_positional_args(self): # see also http://bugs.python.org/issue5042 @@ -446,8 +446,8 @@ try: Recursive._fields_ = [("next", Recursive)] except AttributeError as details: - self.assertTrue("Structure or union cannot contain itself" in - str(details)) + self.assertIn("Structure or union cannot contain itself", + str(details)) else: self.fail("Structure or union cannot contain itself") @@ -463,8 +463,7 @@ try: Second._fields_ = [("first", First)] except AttributeError as details: - self.assertTrue("_fields_ is final" in - str(details)) + self.assertIn("_fields_ is final", str(details)) else: self.fail("AttributeError not raised") -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 22:54:16 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 22:54:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319591=3A_Use_specific_asserts_in_ctype_tests=2E?= Message-ID: <3dMVZJ5jbRz7LvF@mail.python.org> http://hg.python.org/cpython/rev/5fa9293b6cde changeset: 87155:5fa9293b6cde parent: 87153:82b58807f481 parent: 87154:98adbaccaae3 user: Serhiy Storchaka date: Sat Nov 16 23:53:39 2013 +0200 summary: Issue #19591: Use specific asserts in ctype tests. 
files: Lib/ctypes/test/test_arrays.py | 6 +- Lib/ctypes/test/test_as_parameter.py | 2 +- Lib/ctypes/test/test_buffers.py | 10 +- Lib/ctypes/test/test_byteswap.py | 52 ++++++++-------- Lib/ctypes/test/test_cast.py | 6 +- Lib/ctypes/test/test_frombuffer.py | 2 +- Lib/ctypes/test/test_funcptr.py | 2 +- Lib/ctypes/test/test_functions.py | 2 +- Lib/ctypes/test/test_loading.py | 2 +- Lib/ctypes/test/test_numbers.py | 6 +- Lib/ctypes/test/test_parameters.py | 4 +- Lib/ctypes/test/test_pointers.py | 8 +- Lib/ctypes/test/test_python_api.py | 2 +- Lib/ctypes/test/test_refcounts.py | 14 ++-- Lib/ctypes/test/test_strings.py | 38 +++++----- Lib/ctypes/test/test_structures.py | 13 +-- 16 files changed, 84 insertions(+), 85 deletions(-) diff --git a/Lib/ctypes/test/test_arrays.py b/Lib/ctypes/test/test_arrays.py --- a/Lib/ctypes/test/test_arrays.py +++ b/Lib/ctypes/test/test_arrays.py @@ -84,8 +84,8 @@ self.assertEqual(values, [1, 2, 3, 4, 5]) def test_classcache(self): - self.assertTrue(not ARRAY(c_int, 3) is ARRAY(c_int, 4)) - self.assertTrue(ARRAY(c_int, 3) is ARRAY(c_int, 3)) + self.assertIsNot(ARRAY(c_int, 3), ARRAY(c_int, 4)) + self.assertIs(ARRAY(c_int, 3), ARRAY(c_int, 3)) def test_from_address(self): # Failed with 0.9.8, reported by JUrner @@ -125,7 +125,7 @@ # Create a new array type based on it: t1 = my_int * 1 t2 = my_int * 1 - self.assertTrue(t1 is t2) + self.assertIs(t1, t2) def test_subclass(self): class T(Array): diff --git a/Lib/ctypes/test/test_as_parameter.py b/Lib/ctypes/test/test_as_parameter.py --- a/Lib/ctypes/test/test_as_parameter.py +++ b/Lib/ctypes/test/test_as_parameter.py @@ -134,7 +134,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, int)) + self.assertIsInstance(value, int) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_buffers.py b/Lib/ctypes/test/test_buffers.py --- a/Lib/ctypes/test/test_buffers.py +++ b/Lib/ctypes/test/test_buffers.py @@ -7,12 +7,12 @@ b = create_string_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_char)) - self.assertTrue(type(b[0]) is bytes) + self.assertIs(type(b[0]), bytes) b = create_string_buffer(b"abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_char)) - self.assertTrue(type(b[0]) is bytes) + self.assertIs(type(b[0]), bytes) self.assertEqual(b[0], b"a") self.assertEqual(b[:], b"abc\0") self.assertEqual(b[::], b"abc\0") @@ -33,12 +33,12 @@ b = create_unicode_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) b = create_unicode_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) self.assertEqual(b[0], "a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") @@ -50,7 +50,7 @@ b = create_unicode_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) self.assertEqual(b[0], "a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") diff --git a/Lib/ctypes/test/test_byteswap.py b/Lib/ctypes/test/test_byteswap.py --- a/Lib/ctypes/test/test_byteswap.py +++ b/Lib/ctypes/test/test_byteswap.py @@ -23,11 +23,11 @@ def test_endian_short(self): if sys.byteorder == "little": - self.assertTrue(c_short.__ctype_le__ 
is c_short) - self.assertTrue(c_short.__ctype_be__.__ctype_le__ is c_short) + self.assertIs(c_short.__ctype_le__, c_short) + self.assertIs(c_short.__ctype_be__.__ctype_le__, c_short) else: - self.assertTrue(c_short.__ctype_be__ is c_short) - self.assertTrue(c_short.__ctype_le__.__ctype_be__ is c_short) + self.assertIs(c_short.__ctype_be__, c_short) + self.assertIs(c_short.__ctype_le__.__ctype_be__, c_short) s = c_short.__ctype_be__(0x1234) self.assertEqual(bin(struct.pack(">h", 0x1234)), "1234") self.assertEqual(bin(s), "1234") @@ -50,11 +50,11 @@ def test_endian_int(self): if sys.byteorder == "little": - self.assertTrue(c_int.__ctype_le__ is c_int) - self.assertTrue(c_int.__ctype_be__.__ctype_le__ is c_int) + self.assertIs(c_int.__ctype_le__, c_int) + self.assertIs(c_int.__ctype_be__.__ctype_le__, c_int) else: - self.assertTrue(c_int.__ctype_be__ is c_int) - self.assertTrue(c_int.__ctype_le__.__ctype_be__ is c_int) + self.assertIs(c_int.__ctype_be__, c_int) + self.assertIs(c_int.__ctype_le__.__ctype_be__, c_int) s = c_int.__ctype_be__(0x12345678) self.assertEqual(bin(struct.pack(">i", 0x12345678)), "12345678") @@ -78,11 +78,11 @@ def test_endian_longlong(self): if sys.byteorder == "little": - self.assertTrue(c_longlong.__ctype_le__ is c_longlong) - self.assertTrue(c_longlong.__ctype_be__.__ctype_le__ is c_longlong) + self.assertIs(c_longlong.__ctype_le__, c_longlong) + self.assertIs(c_longlong.__ctype_be__.__ctype_le__, c_longlong) else: - self.assertTrue(c_longlong.__ctype_be__ is c_longlong) - self.assertTrue(c_longlong.__ctype_le__.__ctype_be__ is c_longlong) + self.assertIs(c_longlong.__ctype_be__, c_longlong) + self.assertIs(c_longlong.__ctype_le__.__ctype_be__, c_longlong) s = c_longlong.__ctype_be__(0x1234567890ABCDEF) self.assertEqual(bin(struct.pack(">q", 0x1234567890ABCDEF)), "1234567890ABCDEF") @@ -106,11 +106,11 @@ def test_endian_float(self): if sys.byteorder == "little": - self.assertTrue(c_float.__ctype_le__ is c_float) - self.assertTrue(c_float.__ctype_be__.__ctype_le__ is c_float) + self.assertIs(c_float.__ctype_le__, c_float) + self.assertIs(c_float.__ctype_be__.__ctype_le__, c_float) else: - self.assertTrue(c_float.__ctype_be__ is c_float) - self.assertTrue(c_float.__ctype_le__.__ctype_be__ is c_float) + self.assertIs(c_float.__ctype_be__, c_float) + self.assertIs(c_float.__ctype_le__.__ctype_be__, c_float) s = c_float(math.pi) self.assertEqual(bin(struct.pack("f", math.pi)), bin(s)) # Hm, what's the precision of a float compared to a double? 
@@ -124,11 +124,11 @@ def test_endian_double(self): if sys.byteorder == "little": - self.assertTrue(c_double.__ctype_le__ is c_double) - self.assertTrue(c_double.__ctype_be__.__ctype_le__ is c_double) + self.assertIs(c_double.__ctype_le__, c_double) + self.assertIs(c_double.__ctype_be__.__ctype_le__, c_double) else: - self.assertTrue(c_double.__ctype_be__ is c_double) - self.assertTrue(c_double.__ctype_le__.__ctype_be__ is c_double) + self.assertIs(c_double.__ctype_be__, c_double) + self.assertIs(c_double.__ctype_le__.__ctype_be__, c_double) s = c_double(math.pi) self.assertEqual(s.value, math.pi) self.assertEqual(bin(struct.pack("d", math.pi)), bin(s)) @@ -140,14 +140,14 @@ self.assertEqual(bin(struct.pack(">d", math.pi)), bin(s)) def test_endian_other(self): - self.assertTrue(c_byte.__ctype_le__ is c_byte) - self.assertTrue(c_byte.__ctype_be__ is c_byte) + self.assertIs(c_byte.__ctype_le__, c_byte) + self.assertIs(c_byte.__ctype_be__, c_byte) - self.assertTrue(c_ubyte.__ctype_le__ is c_ubyte) - self.assertTrue(c_ubyte.__ctype_be__ is c_ubyte) + self.assertIs(c_ubyte.__ctype_le__, c_ubyte) + self.assertIs(c_ubyte.__ctype_be__, c_ubyte) - self.assertTrue(c_char.__ctype_le__ is c_char) - self.assertTrue(c_char.__ctype_be__ is c_char) + self.assertIs(c_char.__ctype_le__, c_char) + self.assertIs(c_char.__ctype_be__, c_char) def test_struct_fields_1(self): if sys.byteorder == "little": diff --git a/Lib/ctypes/test/test_cast.py b/Lib/ctypes/test/test_cast.py --- a/Lib/ctypes/test/test_cast.py +++ b/Lib/ctypes/test/test_cast.py @@ -38,14 +38,14 @@ p = cast(array, POINTER(c_char_p)) # array and p share a common _objects attribute - self.assertTrue(p._objects is array._objects) + self.assertIs(p._objects, array._objects) self.assertEqual(array._objects, {'0': b"foo bar", id(array): array}) p[0] = b"spam spam" self.assertEqual(p._objects, {'0': b"spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) p[1] = b"foo bar" self.assertEqual(p._objects, {'1': b'foo bar', '0': b"spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) def test_other(self): p = cast((c_int * 4)(1, 2, 3, 4), POINTER(c_int)) diff --git a/Lib/ctypes/test/test_frombuffer.py b/Lib/ctypes/test/test_frombuffer.py --- a/Lib/ctypes/test/test_frombuffer.py +++ b/Lib/ctypes/test/test_frombuffer.py @@ -23,7 +23,7 @@ a[0], a[-1] = 200, -200 self.assertEqual(x[:], a.tolist()) - self.assertTrue(a in x._objects.values()) + self.assertIn(a, x._objects.values()) self.assertRaises(ValueError, c_int.from_buffer, a, -1) diff --git a/Lib/ctypes/test/test_funcptr.py b/Lib/ctypes/test/test_funcptr.py --- a/Lib/ctypes/test/test_funcptr.py +++ b/Lib/ctypes/test/test_funcptr.py @@ -75,7 +75,7 @@ ## "lpfnWndProc", WNDPROC_2(wndproc)) # instead: - self.assertTrue(WNDPROC is WNDPROC_2) + self.assertIs(WNDPROC, WNDPROC_2) # 'wndclass.lpfnWndProc' leaks 94 references. Why? 
self.assertEqual(wndclass.lpfnWndProc(1, 2, 3, 4), 10) diff --git a/Lib/ctypes/test/test_functions.py b/Lib/ctypes/test/test_functions.py --- a/Lib/ctypes/test/test_functions.py +++ b/Lib/ctypes/test/test_functions.py @@ -306,7 +306,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, int)) + self.assertIsInstance(value, int) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_loading.py b/Lib/ctypes/test/test_loading.py --- a/Lib/ctypes/test/test_loading.py +++ b/Lib/ctypes/test/test_loading.py @@ -43,7 +43,7 @@ if os.name in ("nt", "ce"): def test_load_library(self): - self.assertFalse(libc_name is None) + self.assertIsNotNone(libc_name) if is_resource_enabled("printing"): print(find_library("kernel32")) print(find_library("user32")) diff --git a/Lib/ctypes/test/test_numbers.py b/Lib/ctypes/test/test_numbers.py --- a/Lib/ctypes/test/test_numbers.py +++ b/Lib/ctypes/test/test_numbers.py @@ -181,10 +181,10 @@ a = array(t._type_, [3.14]) v = t.from_address(a.buffer_info()[0]) self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) a[0] = 2.3456e17 self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) def test_char_from_address(self): from ctypes import c_char @@ -194,7 +194,7 @@ a[0] = ord('x') v = c_char.from_address(a.buffer_info()[0]) self.assertEqual(v.value, b'x') - self.assertTrue(type(v) is c_char) + self.assertIs(type(v), c_char) a[0] = ord('?') self.assertEqual(v.value, b'?') diff --git a/Lib/ctypes/test/test_parameters.py b/Lib/ctypes/test/test_parameters.py --- a/Lib/ctypes/test/test_parameters.py +++ b/Lib/ctypes/test/test_parameters.py @@ -54,7 +54,7 @@ # c_char_p.from_param on a Python String packs the string # into a cparam object s = b"123" - self.assertTrue(c_char_p.from_param(s)._obj is s) + self.assertIs(c_char_p.from_param(s)._obj, s) # new in 0.9.1: convert (encode) unicode to ascii self.assertEqual(c_char_p.from_param(b"123")._obj, b"123") @@ -64,7 +64,7 @@ # calling c_char_p.from_param with a c_char_p instance # returns the argument itself: a = c_char_p(b"123") - self.assertTrue(c_char_p.from_param(a) is a) + self.assertIs(c_char_p.from_param(a), a) def test_cw_strings(self): from ctypes import byref diff --git a/Lib/ctypes/test/test_pointers.py b/Lib/ctypes/test/test_pointers.py --- a/Lib/ctypes/test/test_pointers.py +++ b/Lib/ctypes/test/test_pointers.py @@ -78,7 +78,7 @@ ## i = c_int(42) ## callback(byref(i)) -## self.assertTrue(i.value == 84) +## self.assertEqual(i.value, 84) doit(callback) ## print self.result @@ -91,11 +91,11 @@ i = ct(42) p = pointer(i) ## print type(p.contents), ct - self.assertTrue(type(p.contents) is ct) + self.assertIs(type(p.contents), ct) # p.contents is the same as p[0] ## print p.contents -## self.assertTrue(p.contents == 42) -## self.assertTrue(p[0] == 42) +## self.assertEqual(p.contents, 42) +## self.assertEqual(p[0], 42) self.assertRaises(TypeError, delitem, p, 0) diff --git a/Lib/ctypes/test/test_python_api.py b/Lib/ctypes/test/test_python_api.py --- a/Lib/ctypes/test/test_python_api.py +++ b/Lib/ctypes/test/test_python_api.py @@ -64,7 +64,7 @@ ref = grc(s) # id(python-object) is the address pyobj = PyObj_FromPtr(id(s)) - self.assertTrue(s is pyobj) + self.assertIs(s, pyobj) self.assertEqual(grc(s), ref + 1) del pyobj diff --git a/Lib/ctypes/test/test_refcounts.py b/Lib/ctypes/test/test_refcounts.py --- a/Lib/ctypes/test/test_refcounts.py +++ b/Lib/ctypes/test/test_refcounts.py @@ 
-26,7 +26,7 @@ self.assertEqual(grc(callback), 2) cb = MyCallback(callback) - self.assertTrue(grc(callback) > 2) + self.assertGreater(grc(callback), 2) result = f(-10, cb) self.assertEqual(result, -18) cb = None @@ -46,15 +46,15 @@ # the CFuncPtr instance holds at least one refcount on func: f = OtherCallback(func) - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del f - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # but now it must be gone gc.collect() - self.assertTrue(grc(func) == 2) + self.assertEqual(grc(func), 2) class X(ctypes.Structure): _fields_ = [("a", OtherCallback)] @@ -62,11 +62,11 @@ x.a = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del x - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # and now it must be gone again gc.collect() @@ -75,7 +75,7 @@ f = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # create a cycle f.cycle = f diff --git a/Lib/ctypes/test/test_strings.py b/Lib/ctypes/test/test_strings.py --- a/Lib/ctypes/test/test_strings.py +++ b/Lib/ctypes/test/test_strings.py @@ -115,24 +115,24 @@ # New in releases later than 0.4.0: # c_string(number) returns an empty string of size number - self.assertTrue(len(c_string(32).raw) == 32) + self.assertEqual(len(c_string(32).raw), 32) self.assertRaises(ValueError, c_string, -1) self.assertRaises(ValueError, c_string, 0) # These tests fail, because it is no longer initialized -## self.assertTrue(c_string(2).value == "") -## self.assertTrue(c_string(2).raw == "\000\000") - self.assertTrue(c_string(2).raw[-1] == "\000") - self.assertTrue(len(c_string(2).raw) == 2) +## self.assertEqual(c_string(2).value, "") +## self.assertEqual(c_string(2).raw, "\000\000") + self.assertEqual(c_string(2).raw[-1], "\000") + self.assertEqual(len(c_string(2).raw), 2) def XX_test_initialized_strings(self): - self.assertTrue(c_string("ab", 4).raw[:2] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:-1] == "ba") - self.assertTrue(c_string("ab", 4).raw[:2:2] == "a") - self.assertTrue(c_string("ab", 4).raw[-1] == "\000") - self.assertTrue(c_string("ab", 2).raw == "a\000") + self.assertEqual(c_string("ab", 4).raw[:2], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:-1], "ba") + self.assertEqual(c_string("ab", 4).raw[:2:2], "a") + self.assertEqual(c_string("ab", 4).raw[-1], "\000") + self.assertEqual(c_string("ab", 2).raw, "a\000") def XX_test_toolong(self): cs = c_string("abcdef") @@ -163,22 +163,22 @@ # XXX This behaviour is about to change: # len returns the size of the internal buffer in bytes. # This includes the terminating NUL character. - self.assertTrue(sizeof(cs) == 14) + self.assertEqual(sizeof(cs), 14) # The value property is the string up to the first terminating NUL. 
- self.assertTrue(cs.value == "abcdef") - self.assertTrue(c_wstring("abc\000def").value == "abc") + self.assertEqual(cs.value, "abcdef") + self.assertEqual(c_wstring("abc\000def").value, "abc") - self.assertTrue(c_wstring("abc\000def").value == "abc") + self.assertEqual(c_wstring("abc\000def").value, "abc") # The raw property is the total buffer contents: - self.assertTrue(cs.raw == "abcdef\000") - self.assertTrue(c_wstring("abc\000def").raw == "abc\000def\000") + self.assertEqual(cs.raw, "abcdef\000") + self.assertEqual(c_wstring("abc\000def").raw, "abc\000def\000") # We can change the value: cs.value = "ab" - self.assertTrue(cs.value == "ab") - self.assertTrue(cs.raw == "ab\000\000\000\000\000") + self.assertEqual(cs.value, "ab") + self.assertEqual(cs.raw, "ab\000\000\000\000\000") self.assertRaises(TypeError, c_wstring, "123") self.assertRaises(ValueError, c_wstring, 0) diff --git a/Lib/ctypes/test/test_structures.py b/Lib/ctypes/test/test_structures.py --- a/Lib/ctypes/test/test_structures.py +++ b/Lib/ctypes/test/test_structures.py @@ -374,9 +374,9 @@ ## class X(Structure): ## _fields_ = [] - self.assertTrue("in_dll" in dir(type(Structure))) - self.assertTrue("from_address" in dir(type(Structure))) - self.assertTrue("in_dll" in dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) + self.assertIn("from_address", dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) def test_positional_args(self): # see also http://bugs.python.org/issue5042 @@ -446,8 +446,8 @@ try: Recursive._fields_ = [("next", Recursive)] except AttributeError as details: - self.assertTrue("Structure or union cannot contain itself" in - str(details)) + self.assertIn("Structure or union cannot contain itself", + str(details)) else: self.fail("Structure or union cannot contain itself") @@ -463,8 +463,7 @@ try: Second._fields_ = [("first", First)] except AttributeError as details: - self.assertTrue("_fields_ is final" in - str(details)) + self.assertIn("_fields_ is final", str(details)) else: self.fail("AttributeError not raised") -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:06:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:06:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTkx?= =?utf-8?q?=3A_Use_specific_asserts_in_ctype_tests=2E?= Message-ID: <3dMVrT2l02z7Lnv@mail.python.org> http://hg.python.org/cpython/rev/8dbb78be2496 changeset: 87156:8dbb78be2496 branch: 2.7 parent: 87145:27567e954cbe user: Serhiy Storchaka date: Sun Nov 17 00:06:02 2013 +0200 summary: Issue #19591: Use specific asserts in ctype tests. 
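The byte-order assertions touched by this patch (``test_byteswap``) exercise ctypes' swapped-endian support; for context, a free-standing sketch of the same idea using the public ``BigEndianStructure`` class (illustrative only, not part of the patch)::

    import struct
    from ctypes import (BigEndianStructure, addressof, c_uint32,
                        sizeof, string_at)

    class Header(BigEndianStructure):
        _fields_ = [("magic", c_uint32)]

    h = Header(0x12345678)
    raw = string_at(addressof(h), sizeof(h))
    # The field is laid out big-endian regardless of the host byte order.
    assert raw == struct.pack(">I", 0x12345678)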
files: Lib/ctypes/test/test_arrays.py | 6 +- Lib/ctypes/test/test_as_parameter.py | 2 +- Lib/ctypes/test/test_buffers.py | 10 +- Lib/ctypes/test/test_byteswap.py | 52 ++++++++-------- Lib/ctypes/test/test_cast.py | 6 +- Lib/ctypes/test/test_frombuffer.py | 2 +- Lib/ctypes/test/test_funcptr.py | 2 +- Lib/ctypes/test/test_functions.py | 2 +- Lib/ctypes/test/test_loading.py | 2 +- Lib/ctypes/test/test_numbers.py | 6 +- Lib/ctypes/test/test_parameters.py | 4 +- Lib/ctypes/test/test_pointers.py | 8 +- Lib/ctypes/test/test_python_api.py | 2 +- Lib/ctypes/test/test_refcounts.py | 14 ++-- Lib/ctypes/test/test_strings.py | 38 +++++----- Lib/ctypes/test/test_structures.py | 13 +-- 16 files changed, 84 insertions(+), 85 deletions(-) diff --git a/Lib/ctypes/test/test_arrays.py b/Lib/ctypes/test/test_arrays.py --- a/Lib/ctypes/test/test_arrays.py +++ b/Lib/ctypes/test/test_arrays.py @@ -87,8 +87,8 @@ self.assertEqual(values, [1, 2, 3, 4, 5]) def test_classcache(self): - self.assertTrue(not ARRAY(c_int, 3) is ARRAY(c_int, 4)) - self.assertTrue(ARRAY(c_int, 3) is ARRAY(c_int, 3)) + self.assertIsNot(ARRAY(c_int, 3), ARRAY(c_int, 4)) + self.assertIs(ARRAY(c_int, 3), ARRAY(c_int, 3)) def test_from_address(self): # Failed with 0.9.8, reported by JUrner @@ -128,7 +128,7 @@ # Create a new array type based on it: t1 = my_int * 1 t2 = my_int * 1 - self.assertTrue(t1 is t2) + self.assertIs(t1, t2) if __name__ == '__main__': unittest.main() diff --git a/Lib/ctypes/test/test_as_parameter.py b/Lib/ctypes/test/test_as_parameter.py --- a/Lib/ctypes/test/test_as_parameter.py +++ b/Lib/ctypes/test/test_as_parameter.py @@ -134,7 +134,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, (int, long))) + self.assertIsInstance(value, (int, long)) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_buffers.py b/Lib/ctypes/test/test_buffers.py --- a/Lib/ctypes/test/test_buffers.py +++ b/Lib/ctypes/test/test_buffers.py @@ -7,12 +7,12 @@ b = create_string_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_char)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) b = create_string_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_char)) - self.assertTrue(type(b[0]) is str) + self.assertIs(type(b[0]), str) self.assertEqual(b[0], "a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") @@ -45,12 +45,12 @@ b = create_unicode_buffer(32) self.assertEqual(len(b), 32) self.assertEqual(sizeof(b), 32 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is unicode) + self.assertIs(type(b[0]), unicode) b = create_unicode_buffer(u"abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is unicode) + self.assertIs(type(b[0]), unicode) self.assertEqual(b[0], u"a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") @@ -62,7 +62,7 @@ b = create_unicode_buffer("abc") self.assertEqual(len(b), 4) # trailing nul char self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) - self.assertTrue(type(b[0]) is unicode) + self.assertIs(type(b[0]), unicode) self.assertEqual(b[0], u"a") self.assertEqual(b[:], "abc\0") self.assertEqual(b[::], "abc\0") diff --git a/Lib/ctypes/test/test_byteswap.py b/Lib/ctypes/test/test_byteswap.py --- a/Lib/ctypes/test/test_byteswap.py +++ b/Lib/ctypes/test/test_byteswap.py @@ -23,11 +23,11 @@ def test_endian_short(self): if sys.byteorder == "little": - 
self.assertTrue(c_short.__ctype_le__ is c_short) - self.assertTrue(c_short.__ctype_be__.__ctype_le__ is c_short) + self.assertIs(c_short.__ctype_le__, c_short) + self.assertIs(c_short.__ctype_be__.__ctype_le__, c_short) else: - self.assertTrue(c_short.__ctype_be__ is c_short) - self.assertTrue(c_short.__ctype_le__.__ctype_be__ is c_short) + self.assertIs(c_short.__ctype_be__, c_short) + self.assertIs(c_short.__ctype_le__.__ctype_be__, c_short) s = c_short.__ctype_be__(0x1234) self.assertEqual(bin(struct.pack(">h", 0x1234)), "1234") self.assertEqual(bin(s), "1234") @@ -50,11 +50,11 @@ def test_endian_int(self): if sys.byteorder == "little": - self.assertTrue(c_int.__ctype_le__ is c_int) - self.assertTrue(c_int.__ctype_be__.__ctype_le__ is c_int) + self.assertIs(c_int.__ctype_le__, c_int) + self.assertIs(c_int.__ctype_be__.__ctype_le__, c_int) else: - self.assertTrue(c_int.__ctype_be__ is c_int) - self.assertTrue(c_int.__ctype_le__.__ctype_be__ is c_int) + self.assertIs(c_int.__ctype_be__, c_int) + self.assertIs(c_int.__ctype_le__.__ctype_be__, c_int) s = c_int.__ctype_be__(0x12345678) self.assertEqual(bin(struct.pack(">i", 0x12345678)), "12345678") @@ -78,11 +78,11 @@ def test_endian_longlong(self): if sys.byteorder == "little": - self.assertTrue(c_longlong.__ctype_le__ is c_longlong) - self.assertTrue(c_longlong.__ctype_be__.__ctype_le__ is c_longlong) + self.assertIs(c_longlong.__ctype_le__, c_longlong) + self.assertIs(c_longlong.__ctype_be__.__ctype_le__, c_longlong) else: - self.assertTrue(c_longlong.__ctype_be__ is c_longlong) - self.assertTrue(c_longlong.__ctype_le__.__ctype_be__ is c_longlong) + self.assertIs(c_longlong.__ctype_be__, c_longlong) + self.assertIs(c_longlong.__ctype_le__.__ctype_be__, c_longlong) s = c_longlong.__ctype_be__(0x1234567890ABCDEF) self.assertEqual(bin(struct.pack(">q", 0x1234567890ABCDEF)), "1234567890ABCDEF") @@ -106,11 +106,11 @@ def test_endian_float(self): if sys.byteorder == "little": - self.assertTrue(c_float.__ctype_le__ is c_float) - self.assertTrue(c_float.__ctype_be__.__ctype_le__ is c_float) + self.assertIs(c_float.__ctype_le__, c_float) + self.assertIs(c_float.__ctype_be__.__ctype_le__, c_float) else: - self.assertTrue(c_float.__ctype_be__ is c_float) - self.assertTrue(c_float.__ctype_le__.__ctype_be__ is c_float) + self.assertIs(c_float.__ctype_be__, c_float) + self.assertIs(c_float.__ctype_le__.__ctype_be__, c_float) s = c_float(math.pi) self.assertEqual(bin(struct.pack("f", math.pi)), bin(s)) # Hm, what's the precision of a float compared to a double? 
@@ -124,11 +124,11 @@ def test_endian_double(self): if sys.byteorder == "little": - self.assertTrue(c_double.__ctype_le__ is c_double) - self.assertTrue(c_double.__ctype_be__.__ctype_le__ is c_double) + self.assertIs(c_double.__ctype_le__, c_double) + self.assertIs(c_double.__ctype_be__.__ctype_le__, c_double) else: - self.assertTrue(c_double.__ctype_be__ is c_double) - self.assertTrue(c_double.__ctype_le__.__ctype_be__ is c_double) + self.assertIs(c_double.__ctype_be__, c_double) + self.assertIs(c_double.__ctype_le__.__ctype_be__, c_double) s = c_double(math.pi) self.assertEqual(s.value, math.pi) self.assertEqual(bin(struct.pack("d", math.pi)), bin(s)) @@ -140,14 +140,14 @@ self.assertEqual(bin(struct.pack(">d", math.pi)), bin(s)) def test_endian_other(self): - self.assertTrue(c_byte.__ctype_le__ is c_byte) - self.assertTrue(c_byte.__ctype_be__ is c_byte) + self.assertIs(c_byte.__ctype_le__, c_byte) + self.assertIs(c_byte.__ctype_be__, c_byte) - self.assertTrue(c_ubyte.__ctype_le__ is c_ubyte) - self.assertTrue(c_ubyte.__ctype_be__ is c_ubyte) + self.assertIs(c_ubyte.__ctype_le__, c_ubyte) + self.assertIs(c_ubyte.__ctype_be__, c_ubyte) - self.assertTrue(c_char.__ctype_le__ is c_char) - self.assertTrue(c_char.__ctype_be__ is c_char) + self.assertIs(c_char.__ctype_le__, c_char) + self.assertIs(c_char.__ctype_be__, c_char) def test_struct_fields_1(self): if sys.byteorder == "little": diff --git a/Lib/ctypes/test/test_cast.py b/Lib/ctypes/test/test_cast.py --- a/Lib/ctypes/test/test_cast.py +++ b/Lib/ctypes/test/test_cast.py @@ -38,14 +38,14 @@ p = cast(array, POINTER(c_char_p)) # array and p share a common _objects attribute - self.assertTrue(p._objects is array._objects) + self.assertIs(p._objects, array._objects) self.assertEqual(array._objects, {'0': "foo bar", id(array): array}) p[0] = "spam spam" self.assertEqual(p._objects, {'0': "spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) p[1] = "foo bar" self.assertEqual(p._objects, {'1': 'foo bar', '0': "spam spam", id(array): array}) - self.assertTrue(array._objects is p._objects) + self.assertIs(array._objects, p._objects) def test_other(self): p = cast((c_int * 4)(1, 2, 3, 4), POINTER(c_int)) diff --git a/Lib/ctypes/test/test_frombuffer.py b/Lib/ctypes/test/test_frombuffer.py --- a/Lib/ctypes/test/test_frombuffer.py +++ b/Lib/ctypes/test/test_frombuffer.py @@ -23,7 +23,7 @@ a[0], a[-1] = 200, -200 self.assertEqual(x[:], a.tolist()) - self.assertTrue(a in x._objects.values()) + self.assertIn(a, x._objects.values()) self.assertRaises(ValueError, c_int.from_buffer, a, -1) diff --git a/Lib/ctypes/test/test_funcptr.py b/Lib/ctypes/test/test_funcptr.py --- a/Lib/ctypes/test/test_funcptr.py +++ b/Lib/ctypes/test/test_funcptr.py @@ -75,7 +75,7 @@ ## "lpfnWndProc", WNDPROC_2(wndproc)) # instead: - self.assertTrue(WNDPROC is WNDPROC_2) + self.assertIs(WNDPROC, WNDPROC_2) # 'wndclass.lpfnWndProc' leaks 94 references. Why? 
self.assertEqual(wndclass.lpfnWndProc(1, 2, 3, 4), 10) diff --git a/Lib/ctypes/test/test_functions.py b/Lib/ctypes/test/test_functions.py --- a/Lib/ctypes/test/test_functions.py +++ b/Lib/ctypes/test/test_functions.py @@ -306,7 +306,7 @@ f.argtypes = [c_longlong, MyCallback] def callback(value): - self.assertTrue(isinstance(value, (int, long))) + self.assertIsInstance(value, (int, long)) return value & 0x7FFFFFFF cb = MyCallback(callback) diff --git a/Lib/ctypes/test/test_loading.py b/Lib/ctypes/test/test_loading.py --- a/Lib/ctypes/test/test_loading.py +++ b/Lib/ctypes/test/test_loading.py @@ -43,7 +43,7 @@ if os.name in ("nt", "ce"): def test_load_library(self): - self.assertFalse(libc_name is None) + self.assertIsNotNone(libc_name) if is_resource_enabled("printing"): print find_library("kernel32") print find_library("user32") diff --git a/Lib/ctypes/test/test_numbers.py b/Lib/ctypes/test/test_numbers.py --- a/Lib/ctypes/test/test_numbers.py +++ b/Lib/ctypes/test/test_numbers.py @@ -181,10 +181,10 @@ a = array(t._type_, [3.14]) v = t.from_address(a.buffer_info()[0]) self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) a[0] = 2.3456e17 self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is t) + self.assertIs(type(v), t) def test_char_from_address(self): from ctypes import c_char @@ -193,7 +193,7 @@ a = array('c', 'x') v = c_char.from_address(a.buffer_info()[0]) self.assertEqual(v.value, a[0]) - self.assertTrue(type(v) is c_char) + self.assertIs(type(v), c_char) a[0] = '?' self.assertEqual(v.value, a[0]) diff --git a/Lib/ctypes/test/test_parameters.py b/Lib/ctypes/test/test_parameters.py --- a/Lib/ctypes/test/test_parameters.py +++ b/Lib/ctypes/test/test_parameters.py @@ -55,7 +55,7 @@ # c_char_p.from_param on a Python String packs the string # into a cparam object s = "123" - self.assertTrue(c_char_p.from_param(s)._obj is s) + self.assertIs(c_char_p.from_param(s)._obj, s) # new in 0.9.1: convert (encode) unicode to ascii self.assertEqual(c_char_p.from_param(u"123")._obj, "123") @@ -66,7 +66,7 @@ # calling c_char_p.from_param with a c_char_p instance # returns the argument itself: a = c_char_p("123") - self.assertTrue(c_char_p.from_param(a) is a) + self.assertIs(c_char_p.from_param(a), a) def test_cw_strings(self): from ctypes import byref diff --git a/Lib/ctypes/test/test_pointers.py b/Lib/ctypes/test/test_pointers.py --- a/Lib/ctypes/test/test_pointers.py +++ b/Lib/ctypes/test/test_pointers.py @@ -78,7 +78,7 @@ ## i = c_int(42) ## callback(byref(i)) -## self.assertTrue(i.value == 84) +## self.assertEqual(i.value, 84) doit(callback) ## print self.result @@ -91,11 +91,11 @@ i = ct(42) p = pointer(i) ## print type(p.contents), ct - self.assertTrue(type(p.contents) is ct) + self.assertIs(type(p.contents), ct) # p.contents is the same as p[0] ## print p.contents -## self.assertTrue(p.contents == 42) -## self.assertTrue(p[0] == 42) +## self.assertEqual(p.contents, 42) +## self.assertEqual(p[0], 42) self.assertRaises(TypeError, delitem, p, 0) diff --git a/Lib/ctypes/test/test_python_api.py b/Lib/ctypes/test/test_python_api.py --- a/Lib/ctypes/test/test_python_api.py +++ b/Lib/ctypes/test/test_python_api.py @@ -61,7 +61,7 @@ ref = grc(s) # id(python-object) is the address pyobj = PyObj_FromPtr(id(s)) - self.assertTrue(s is pyobj) + self.assertIs(s, pyobj) self.assertEqual(grc(s), ref + 1) del pyobj diff --git a/Lib/ctypes/test/test_refcounts.py b/Lib/ctypes/test/test_refcounts.py --- a/Lib/ctypes/test/test_refcounts.py +++ 
b/Lib/ctypes/test/test_refcounts.py @@ -24,7 +24,7 @@ self.assertEqual(grc(callback), 2) cb = MyCallback(callback) - self.assertTrue(grc(callback) > 2) + self.assertGreater(grc(callback), 2) result = f(-10, cb) self.assertEqual(result, -18) cb = None @@ -43,15 +43,15 @@ # the CFuncPtr instance holds at least one refcount on func: f = OtherCallback(func) - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del f - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # but now it must be gone gc.collect() - self.assertTrue(grc(func) == 2) + self.assertEqual(grc(func), 2) class X(ctypes.Structure): _fields_ = [("a", OtherCallback)] @@ -59,11 +59,11 @@ x.a = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # and may release it again del x - self.assertTrue(grc(func) >= 2) + self.assertGreaterEqual(grc(func), 2) # and now it must be gone again gc.collect() @@ -72,7 +72,7 @@ f = OtherCallback(func) # the CFuncPtr instance holds at least one refcount on func: - self.assertTrue(grc(func) > 2) + self.assertGreater(grc(func), 2) # create a cycle f.cycle = f diff --git a/Lib/ctypes/test/test_strings.py b/Lib/ctypes/test/test_strings.py --- a/Lib/ctypes/test/test_strings.py +++ b/Lib/ctypes/test/test_strings.py @@ -115,24 +115,24 @@ # New in releases later than 0.4.0: # c_string(number) returns an empty string of size number - self.assertTrue(len(c_string(32).raw) == 32) + self.assertEqual(len(c_string(32).raw), 32) self.assertRaises(ValueError, c_string, -1) self.assertRaises(ValueError, c_string, 0) # These tests fail, because it is no longer initialized -## self.assertTrue(c_string(2).value == "") -## self.assertTrue(c_string(2).raw == "\000\000") - self.assertTrue(c_string(2).raw[-1] == "\000") - self.assertTrue(len(c_string(2).raw) == 2) +## self.assertEqual(c_string(2).value, "") +## self.assertEqual(c_string(2).raw, "\000\000") + self.assertEqual(c_string(2).raw[-1], "\000") + self.assertEqual(len(c_string(2).raw), 2) def XX_test_initialized_strings(self): - self.assertTrue(c_string("ab", 4).raw[:2] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:] == "ab") - self.assertTrue(c_string("ab", 4).raw[:2:-1] == "ba") - self.assertTrue(c_string("ab", 4).raw[:2:2] == "a") - self.assertTrue(c_string("ab", 4).raw[-1] == "\000") - self.assertTrue(c_string("ab", 2).raw == "a\000") + self.assertEqual(c_string("ab", 4).raw[:2], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:-1], "ba") + self.assertEqual(c_string("ab", 4).raw[:2:2], "a") + self.assertEqual(c_string("ab", 4).raw[-1], "\000") + self.assertEqual(c_string("ab", 2).raw, "a\000") def XX_test_toolong(self): cs = c_string("abcdef") @@ -163,22 +163,22 @@ # XXX This behaviour is about to change: # len returns the size of the internal buffer in bytes. # This includes the terminating NUL character. - self.assertTrue(sizeof(cs) == 14) + self.assertEqual(sizeof(cs), 14) # The value property is the string up to the first terminating NUL. 
- self.assertTrue(cs.value == u"abcdef") - self.assertTrue(c_wstring(u"abc\000def").value == u"abc") + self.assertEqual(cs.value, u"abcdef") + self.assertEqual(c_wstring(u"abc\000def").value, u"abc") - self.assertTrue(c_wstring(u"abc\000def").value == u"abc") + self.assertEqual(c_wstring(u"abc\000def").value, u"abc") # The raw property is the total buffer contents: - self.assertTrue(cs.raw == u"abcdef\000") - self.assertTrue(c_wstring(u"abc\000def").raw == u"abc\000def\000") + self.assertEqual(cs.raw, u"abcdef\000") + self.assertEqual(c_wstring(u"abc\000def").raw, u"abc\000def\000") # We can change the value: cs.value = u"ab" - self.assertTrue(cs.value == u"ab") - self.assertTrue(cs.raw == u"ab\000\000\000\000\000") + self.assertEqual(cs.value, u"ab") + self.assertEqual(cs.raw, u"ab\000\000\000\000\000") self.assertRaises(TypeError, c_wstring, "123") self.assertRaises(ValueError, c_wstring, 0) diff --git a/Lib/ctypes/test/test_structures.py b/Lib/ctypes/test/test_structures.py --- a/Lib/ctypes/test/test_structures.py +++ b/Lib/ctypes/test/test_structures.py @@ -380,9 +380,9 @@ ## class X(Structure): ## _fields_ = [] - self.assertTrue("in_dll" in dir(type(Structure))) - self.assertTrue("from_address" in dir(type(Structure))) - self.assertTrue("in_dll" in dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) + self.assertIn("from_address", dir(type(Structure))) + self.assertIn("in_dll", dir(type(Structure))) def test_positional_args(self): # see also http://bugs.python.org/issue5042 @@ -452,8 +452,8 @@ try: Recursive._fields_ = [("next", Recursive)] except AttributeError, details: - self.assertTrue("Structure or union cannot contain itself" in - str(details)) + self.assertIn("Structure or union cannot contain itself", + str(details)) else: self.fail("Structure or union cannot contain itself") @@ -469,8 +469,7 @@ try: Second._fields_ = [("first", First)] except AttributeError, details: - self.assertTrue("_fields_ is final" in - str(details)) + self.assertIn("_fields_ is final", str(details)) else: self.fail("AttributeError not raised") -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:15:37 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:15:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTk0?= =?utf-8?q?=3A_Use_specific_asserts_in_unittest_tests=2E?= Message-ID: <3dMW2x5C6Yz7Lq0@mail.python.org> http://hg.python.org/cpython/rev/f32353baecc3 changeset: 87157:f32353baecc3 branch: 3.3 parent: 87154:98adbaccaae3 user: Serhiy Storchaka date: Sun Nov 17 00:12:21 2013 +0200 summary: Issue #19594: Use specific asserts in unittest tests. files: Lib/unittest/test/test_case.py | 12 +++--- Lib/unittest/test/test_loader.py | 2 +- Lib/unittest/test/test_result.py | 4 +- Lib/unittest/test/testmock/testhelpers.py | 18 +++++----- Lib/unittest/test/testmock/testmagicmethods.py | 8 ++-- Lib/unittest/test/testmock/testmock.py | 10 ++-- Lib/unittest/test/testmock/testsentinel.py | 2 +- 7 files changed, 28 insertions(+), 28 deletions(-) diff --git a/Lib/unittest/test/test_case.py b/Lib/unittest/test/test_case.py --- a/Lib/unittest/test/test_case.py +++ b/Lib/unittest/test/test_case.py @@ -307,7 +307,7 @@ def test(self): pass - self.assertTrue(Foo('test').failureException is AssertionError) + self.assertIs(Foo('test').failureException, AssertionError) # "This class attribute gives the exception raised by the test() method. 
# If a test framework needs to use a specialized exception, possibly to @@ -325,7 +325,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -348,7 +348,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -660,7 +660,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) < len(diff)) + self.assertLess(len(msg), len(diff)) self.assertIn(omitted, msg) self.maxDiff = len(diff) * 2 @@ -670,7 +670,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) self.maxDiff = None @@ -680,7 +680,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) def testTruncateMessage(self): diff --git a/Lib/unittest/test/test_loader.py b/Lib/unittest/test/test_loader.py --- a/Lib/unittest/test/test_loader.py +++ b/Lib/unittest/test/test_loader.py @@ -1305,4 +1305,4 @@ # "The default value is the TestSuite class" def test_suiteClass__default_value(self): loader = unittest.TestLoader() - self.assertTrue(loader.suiteClass is unittest.TestSuite) + self.assertIs(loader.suiteClass, unittest.TestSuite) diff --git a/Lib/unittest/test/test_result.py b/Lib/unittest/test/test_result.py --- a/Lib/unittest/test/test_result.py +++ b/Lib/unittest/test/test_result.py @@ -176,7 +176,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.failures[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) # "addError(test, err)" @@ -224,7 +224,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.errors[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) def testGetDescriptionWithoutDocstring(self): diff --git a/Lib/unittest/test/testmock/testhelpers.py b/Lib/unittest/test/testmock/testhelpers.py --- a/Lib/unittest/test/testmock/testhelpers.py +++ b/Lib/unittest/test/testmock/testhelpers.py @@ -177,7 +177,7 @@ args = _Call(((1, 2, 3), {})) self.assertEqual(args, call(1, 2, 3)) self.assertEqual(call(1, 2, 3), args) - self.assertTrue(call(1, 2, 3) in [args]) + self.assertIn(call(1, 2, 3), [args]) def test_call_ne(self): @@ -793,7 +793,7 @@ mock_property = foo.foo # no spec on properties - self.assertTrue(isinstance(mock_property, MagicMock)) + self.assertIsInstance(mock_property, MagicMock) mock_property(1, 2, 3) mock_property.abc(4, 5, 6) mock_property.assert_called_once_with(1, 2, 3) @@ -826,19 +826,19 @@ mock(b=6) for kall in call(1, 2), call(a=3), call(3, 4), call(b=6): - self.assertTrue(kall in mock.call_args_list) + self.assertIn(kall, mock.call_args_list) calls = [call(a=3), call(3, 4)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(1, 2), call(a=3)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(3, 4), call(b=6)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(3, 4)] - self.assertTrue(calls in mock.call_args_list) + 
self.assertIn(calls, mock.call_args_list) - self.assertFalse(call('fish') in mock.call_args_list) - self.assertFalse([call('fish')] in mock.call_args_list) + self.assertNotIn(call('fish'), mock.call_args_list) + self.assertNotIn([call('fish')], mock.call_args_list) def test_call_list_str(self): diff --git a/Lib/unittest/test/testmock/testmagicmethods.py b/Lib/unittest/test/testmock/testmagicmethods.py --- a/Lib/unittest/test/testmock/testmagicmethods.py +++ b/Lib/unittest/test/testmock/testmagicmethods.py @@ -37,12 +37,12 @@ return self, 'fish' mock.__getitem__ = f - self.assertFalse(mock.__getitem__ is f) + self.assertIsNot(mock.__getitem__, f) self.assertEqual(mock['foo'], (mock, 'fish')) self.assertEqual(mock.__getitem__('foo'), (mock, 'fish')) mock.__getitem__ = mock - self.assertTrue(mock.__getitem__ is mock) + self.assertIs(mock.__getitem__, mock) def test_magic_methods_isolated_between_mocks(self): @@ -212,8 +212,8 @@ self.assertEqual(len(mock), 6) mock.__contains__ = lambda s, o: o == 3 - self.assertTrue(3 in mock) - self.assertFalse(6 in mock) + self.assertIn(3, mock) + self.assertNotIn(6, mock) mock.__iter__ = lambda s: iter('foobarbaz') self.assertEqual(list(mock), list('foobarbaz')) diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -52,7 +52,7 @@ "method_calls not initialised correctly") # Can't use hasattr for this test as it always returns True on a mock - self.assertFalse('_items' in mock.__dict__, + self.assertNotIn('_items', mock.__dict__, "default mock should not have '_items' attribute") self.assertIsNone(mock._mock_parent, @@ -493,19 +493,19 @@ pass mock = Mock(spec=X) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) mock = Mock(spec=X()) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) self.assertIs(mock.__class__, X) self.assertEqual(Mock().__class__.__name__, 'Mock') mock = Mock(spec_set=X) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) mock = Mock(spec_set=X()) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) def test_setting_attribute_with_spec_set(self): diff --git a/Lib/unittest/test/testmock/testsentinel.py b/Lib/unittest/test/testmock/testsentinel.py --- a/Lib/unittest/test/testmock/testsentinel.py +++ b/Lib/unittest/test/testmock/testsentinel.py @@ -17,7 +17,7 @@ def testDEFAULT(self): - self.assertTrue(DEFAULT is sentinel.DEFAULT) + self.assertIs(DEFAULT, sentinel.DEFAULT) def testBases(self): # If this doesn't raise an AttributeError then help(mock) is broken -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:15:39 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:15:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319594=3A_Use_specific_asserts_in_unittest_tests?= =?utf-8?q?=2E?= Message-ID: <3dMW2z200Hz7LvC@mail.python.org> http://hg.python.org/cpython/rev/aacb0bd969e5 changeset: 87158:aacb0bd969e5 parent: 87155:5fa9293b6cde parent: 87157:f32353baecc3 user: Serhiy Storchaka date: Sun Nov 17 00:14:35 2013 +0200 summary: Issue #19594: Use specific asserts in unittest tests. 
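[Editorial aside, a hedged illustration of the rewrite this changeset performs rather than code from the patch itself: the specific assert methods report the operands when they fail, whereas the plain assertTrue(...) spellings they replace only report that an expression was false. The test class below is hypothetical and exists only to show the before/after forms.]

import unittest

class SpecificAssertExample(unittest.TestCase):
    # Hypothetical test case illustrating the before/after assert spellings.
    def test_rewritten_forms(self):
        # before: assertTrue(Foo('test').failureException is AssertionError)
        self.assertIs(unittest.TestCase.failureException, AssertionError)
        # before: assertTrue(kall in mock.call_args_list)
        self.assertIn('x', ['x', 'y'])
        # before: assertTrue(isinstance(mock, X))
        self.assertIsInstance({}, dict)
        # before: assertTrue(len(msg) > len(diff))
        self.assertGreater(len('abcd'), 3)

if __name__ == '__main__':
    unittest.main()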
files: Lib/unittest/test/test_case.py | 12 +++--- Lib/unittest/test/test_loader.py | 2 +- Lib/unittest/test/test_result.py | 4 +- Lib/unittest/test/testmock/testhelpers.py | 18 +++++----- Lib/unittest/test/testmock/testmagicmethods.py | 8 ++-- Lib/unittest/test/testmock/testmock.py | 10 ++-- Lib/unittest/test/testmock/testsentinel.py | 2 +- 7 files changed, 28 insertions(+), 28 deletions(-) diff --git a/Lib/unittest/test/test_case.py b/Lib/unittest/test/test_case.py --- a/Lib/unittest/test/test_case.py +++ b/Lib/unittest/test/test_case.py @@ -407,7 +407,7 @@ def test(self): pass - self.assertTrue(Foo('test').failureException is AssertionError) + self.assertIs(Foo('test').failureException, AssertionError) # "This class attribute gives the exception raised by the test() method. # If a test framework needs to use a specialized exception, possibly to @@ -425,7 +425,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -448,7 +448,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -760,7 +760,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) < len(diff)) + self.assertLess(len(msg), len(diff)) self.assertIn(omitted, msg) self.maxDiff = len(diff) * 2 @@ -770,7 +770,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) self.maxDiff = None @@ -780,7 +780,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) def testTruncateMessage(self): diff --git a/Lib/unittest/test/test_loader.py b/Lib/unittest/test/test_loader.py --- a/Lib/unittest/test/test_loader.py +++ b/Lib/unittest/test/test_loader.py @@ -1305,7 +1305,7 @@ # "The default value is the TestSuite class" def test_suiteClass__default_value(self): loader = unittest.TestLoader() - self.assertTrue(loader.suiteClass is unittest.TestSuite) + self.assertIs(loader.suiteClass, unittest.TestSuite) if __name__ == "__main__": diff --git a/Lib/unittest/test/test_result.py b/Lib/unittest/test/test_result.py --- a/Lib/unittest/test/test_result.py +++ b/Lib/unittest/test/test_result.py @@ -176,7 +176,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.failures[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) # "addError(test, err)" @@ -224,7 +224,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.errors[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) def test_addSubTest(self): diff --git a/Lib/unittest/test/testmock/testhelpers.py b/Lib/unittest/test/testmock/testhelpers.py --- a/Lib/unittest/test/testmock/testhelpers.py +++ b/Lib/unittest/test/testmock/testhelpers.py @@ -177,7 +177,7 @@ args = _Call(((1, 2, 3), {})) self.assertEqual(args, call(1, 2, 3)) self.assertEqual(call(1, 2, 3), args) - self.assertTrue(call(1, 2, 3) in [args]) + self.assertIn(call(1, 2, 3), [args]) def test_call_ne(self): @@ -812,7 +812,7 @@ mock_property = foo.foo # no spec on properties - 
self.assertTrue(isinstance(mock_property, MagicMock)) + self.assertIsInstance(mock_property, MagicMock) mock_property(1, 2, 3) mock_property.abc(4, 5, 6) mock_property.assert_called_once_with(1, 2, 3) @@ -845,19 +845,19 @@ mock(b=6) for kall in call(1, 2), call(a=3), call(3, 4), call(b=6): - self.assertTrue(kall in mock.call_args_list) + self.assertIn(kall, mock.call_args_list) calls = [call(a=3), call(3, 4)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(1, 2), call(a=3)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(3, 4), call(b=6)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) calls = [call(3, 4)] - self.assertTrue(calls in mock.call_args_list) + self.assertIn(calls, mock.call_args_list) - self.assertFalse(call('fish') in mock.call_args_list) - self.assertFalse([call('fish')] in mock.call_args_list) + self.assertNotIn(call('fish'), mock.call_args_list) + self.assertNotIn([call('fish')], mock.call_args_list) def test_call_list_str(self): diff --git a/Lib/unittest/test/testmock/testmagicmethods.py b/Lib/unittest/test/testmock/testmagicmethods.py --- a/Lib/unittest/test/testmock/testmagicmethods.py +++ b/Lib/unittest/test/testmock/testmagicmethods.py @@ -37,12 +37,12 @@ return self, 'fish' mock.__getitem__ = f - self.assertFalse(mock.__getitem__ is f) + self.assertIsNot(mock.__getitem__, f) self.assertEqual(mock['foo'], (mock, 'fish')) self.assertEqual(mock.__getitem__('foo'), (mock, 'fish')) mock.__getitem__ = mock - self.assertTrue(mock.__getitem__ is mock) + self.assertIs(mock.__getitem__, mock) def test_magic_methods_isolated_between_mocks(self): @@ -212,8 +212,8 @@ self.assertEqual(len(mock), 6) mock.__contains__ = lambda s, o: o == 3 - self.assertTrue(3 in mock) - self.assertFalse(6 in mock) + self.assertIn(3, mock) + self.assertNotIn(6, mock) mock.__iter__ = lambda s: iter('foobarbaz') self.assertEqual(list(mock), list('foobarbaz')) diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -64,7 +64,7 @@ "method_calls not initialised correctly") # Can't use hasattr for this test as it always returns True on a mock - self.assertFalse('_items' in mock.__dict__, + self.assertNotIn('_items', mock.__dict__, "default mock should not have '_items' attribute") self.assertIsNone(mock._mock_parent, @@ -565,19 +565,19 @@ pass mock = Mock(spec=X) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) mock = Mock(spec=X()) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) self.assertIs(mock.__class__, X) self.assertEqual(Mock().__class__.__name__, 'Mock') mock = Mock(spec_set=X) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) mock = Mock(spec_set=X()) - self.assertTrue(isinstance(mock, X)) + self.assertIsInstance(mock, X) def test_setting_attribute_with_spec_set(self): diff --git a/Lib/unittest/test/testmock/testsentinel.py b/Lib/unittest/test/testmock/testsentinel.py --- a/Lib/unittest/test/testmock/testsentinel.py +++ b/Lib/unittest/test/testmock/testsentinel.py @@ -17,7 +17,7 @@ def testDEFAULT(self): - self.assertTrue(DEFAULT is sentinel.DEFAULT) + self.assertIs(DEFAULT, sentinel.DEFAULT) def testBases(self): # If this doesn't raise an AttributeError then help(mock) is broken -- Repository URL: http://hg.python.org/cpython From 
python-checkins at python.org Sat Nov 16 23:15:40 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:15:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTk0?= =?utf-8?q?=3A_Use_specific_asserts_in_unittest_tests=2E?= Message-ID: <3dMW304JZwz7Lnv@mail.python.org> http://hg.python.org/cpython/rev/1d1bad272733 changeset: 87159:1d1bad272733 branch: 2.7 parent: 87156:8dbb78be2496 user: Serhiy Storchaka date: Sun Nov 17 00:15:09 2013 +0200 summary: Issue #19594: Use specific asserts in unittest tests. files: Lib/unittest/test/test_case.py | 12 ++++++------ Lib/unittest/test/test_loader.py | 2 +- Lib/unittest/test/test_result.py | 4 ++-- 3 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Lib/unittest/test/test_case.py b/Lib/unittest/test/test_case.py --- a/Lib/unittest/test/test_case.py +++ b/Lib/unittest/test/test_case.py @@ -293,7 +293,7 @@ def test(self): pass - self.assertTrue(Foo('test').failureException is AssertionError) + self.assertIs(Foo('test').failureException, AssertionError) # "This class attribute gives the exception raised by the test() method. # If a test framework needs to use a specialized exception, possibly to @@ -311,7 +311,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -334,7 +334,7 @@ failureException = RuntimeError - self.assertTrue(Foo('test').failureException is RuntimeError) + self.assertIs(Foo('test').failureException, RuntimeError) Foo('test').run(result) @@ -607,7 +607,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) < len(diff)) + self.assertLess(len(msg), len(diff)) self.assertIn(omitted, msg) self.maxDiff = len(diff) * 2 @@ -617,7 +617,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) self.maxDiff = None @@ -627,7 +627,7 @@ msg = e.args[0] else: self.fail('assertSequenceEqual did not fail.') - self.assertTrue(len(msg) > len(diff)) + self.assertGreater(len(msg), len(diff)) self.assertNotIn(omitted, msg) def testTruncateMessage(self): diff --git a/Lib/unittest/test/test_loader.py b/Lib/unittest/test/test_loader.py --- a/Lib/unittest/test/test_loader.py +++ b/Lib/unittest/test/test_loader.py @@ -1279,7 +1279,7 @@ # "The default value is the TestSuite class" def test_suiteClass__default_value(self): loader = unittest.TestLoader() - self.assertTrue(loader.suiteClass is unittest.TestSuite) + self.assertIs(loader.suiteClass, unittest.TestSuite) # Make sure the dotted name resolution works even if the actual # function doesn't have the same name as is used to find it. 
diff --git a/Lib/unittest/test/test_result.py b/Lib/unittest/test/test_result.py --- a/Lib/unittest/test/test_result.py +++ b/Lib/unittest/test/test_result.py @@ -176,7 +176,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.failures[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) # "addError(test, err)" @@ -224,7 +224,7 @@ self.assertEqual(result.shouldStop, False) test_case, formatted_exc = result.errors[0] - self.assertTrue(test_case is test) + self.assertIs(test_case, test) self.assertIsInstance(formatted_exc, str) def testGetDescriptionWithoutDocstring(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:30:00 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:30:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjAw?= =?utf-8?q?=3A_Use_specific_asserts_in_distutils_tests=2E?= Message-ID: <3dMWMX4V2cz7LlF@mail.python.org> http://hg.python.org/cpython/rev/5a0236f3f972 changeset: 87160:5a0236f3f972 branch: 3.3 parent: 87157:f32353baecc3 user: Serhiy Storchaka date: Sun Nov 17 00:17:46 2013 +0200 summary: Issue #19600: Use specific asserts in distutils tests. files: Lib/distutils/tests/test_archive_util.py | 2 +- Lib/distutils/tests/test_bdist_rpm.py | 4 +- Lib/distutils/tests/test_bdist_wininst.py | 2 +- Lib/distutils/tests/test_build_clib.py | 2 +- Lib/distutils/tests/test_build_ext.py | 16 +++++----- Lib/distutils/tests/test_build_scripts.py | 8 ++-- Lib/distutils/tests/test_clean.py | 2 +- Lib/distutils/tests/test_config.py | 2 +- Lib/distutils/tests/test_config_cmd.py | 2 +- Lib/distutils/tests/test_install.py | 2 +- Lib/distutils/tests/test_install_lib.py | 2 +- Lib/distutils/tests/test_install_scripts.py | 10 +++--- Lib/distutils/tests/test_msvc9compiler.py | 6 +- Lib/distutils/tests/test_register.py | 8 ++-- Lib/distutils/tests/test_sysconfig.py | 2 +- Lib/distutils/tests/test_util.py | 2 +- 16 files changed, 36 insertions(+), 36 deletions(-) diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -210,7 +210,7 @@ dry_run=True) finally: os.chdir(old_dir) - self.assertTrue(not os.path.exists(tarball)) + self.assertFalse(os.path.exists(tarball)) self.assertEqual(len(w.warnings), 1) @unittest.skipUnless(ZIP_SUPPORT and ZLIB_SUPPORT, diff --git a/Lib/distutils/tests/test_bdist_rpm.py b/Lib/distutils/tests/test_bdist_rpm.py --- a/Lib/distutils/tests/test_bdist_rpm.py +++ b/Lib/distutils/tests/test_bdist_rpm.py @@ -81,7 +81,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) @@ -125,7 +125,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) diff --git a/Lib/distutils/tests/test_bdist_wininst.py b/Lib/distutils/tests/test_bdist_wininst.py --- a/Lib/distutils/tests/test_bdist_wininst.py +++ b/Lib/distutils/tests/test_bdist_wininst.py @@ 
-22,7 +22,7 @@ # and make sure it finds it and returns its content # no matter what platform we have exe_file = cmd.get_exe_bytes() - self.assertTrue(len(exe_file) > 10) + self.assertGreater(len(exe_file), 10) def test_suite(): return unittest.makeSuite(BuildWinInstTestCase) diff --git a/Lib/distutils/tests/test_build_clib.py b/Lib/distutils/tests/test_build_clib.py --- a/Lib/distutils/tests/test_build_clib.py +++ b/Lib/distutils/tests/test_build_clib.py @@ -137,7 +137,7 @@ cmd.run() # let's check the result - self.assertTrue('libfoo.a' in os.listdir(build_temp)) + self.assertIn('libfoo.a', os.listdir(build_temp)) def test_suite(): return unittest.makeSuite(BuildCLibTestCase) diff --git a/Lib/distutils/tests/test_build_ext.py b/Lib/distutils/tests/test_build_ext.py --- a/Lib/distutils/tests/test_build_ext.py +++ b/Lib/distutils/tests/test_build_ext.py @@ -76,8 +76,8 @@ if support.HAVE_DOCSTRINGS: doc = 'This is a template module just for instruction.' self.assertEqual(xx.__doc__, doc) - self.assertTrue(isinstance(xx.Null(), xx.Null)) - self.assertTrue(isinstance(xx.Str(), xx.Str)) + self.assertIsInstance(xx.Null(), xx.Null) + self.assertIsInstance(xx.Str(), xx.Str) def tearDown(self): # Get everything back to normal @@ -110,7 +110,7 @@ _config_vars['Py_ENABLE_SHARED'] = old_var # make sure we get some library dirs under solaris - self.assertTrue(len(cmd.library_dirs) > 0) + self.assertGreater(len(cmd.library_dirs), 0) def test_user_site(self): # site.USER_SITE was introduced in 2.6 @@ -124,7 +124,7 @@ # making sure the user option is there options = [name for name, short, lable in cmd.user_options] - self.assertTrue('user' in options) + self.assertIn('user', options) # setting a value cmd.user = 1 @@ -171,10 +171,10 @@ from distutils import sysconfig py_include = sysconfig.get_python_inc() - self.assertTrue(py_include in cmd.include_dirs) + self.assertIn(py_include, cmd.include_dirs) plat_py_include = sysconfig.get_python_inc(plat_specific=1) - self.assertTrue(plat_py_include in cmd.include_dirs) + self.assertIn(plat_py_include, cmd.include_dirs) # make sure cmd.libraries is turned into a list # if it's a string @@ -255,13 +255,13 @@ 'some': 'bar'})] cmd.check_extensions_list(exts) ext = exts[0] - self.assertTrue(isinstance(ext, Extension)) + self.assertIsInstance(ext, Extension) # check_extensions_list adds in ext the values passed # when they are in ('include_dirs', 'library_dirs', 'libraries' # 'extra_objects', 'extra_compile_args', 'extra_link_args') self.assertEqual(ext.libraries, 'foo') - self.assertTrue(not hasattr(ext, 'some')) + self.assertFalse(hasattr(ext, 'some')) # 'macros' element of build info dict must be 1- or 2-tuple exts = [('foo.bar', {'sources': [''], 'libraries': 'foo', diff --git a/Lib/distutils/tests/test_build_scripts.py b/Lib/distutils/tests/test_build_scripts.py --- a/Lib/distutils/tests/test_build_scripts.py +++ b/Lib/distutils/tests/test_build_scripts.py @@ -17,8 +17,8 @@ def test_default_settings(self): cmd = self.get_build_scripts_cmd("/foo/bar", []) - self.assertTrue(not cmd.force) - self.assertTrue(cmd.build_dir is None) + self.assertFalse(cmd.force) + self.assertIsNone(cmd.build_dir) cmd.finalize_options() @@ -38,7 +38,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def get_build_scripts_cmd(self, target, scripts): import sys @@ -103,7 +103,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def test_suite(): return 
unittest.makeSuite(BuildScriptsTestCase) diff --git a/Lib/distutils/tests/test_clean.py b/Lib/distutils/tests/test_clean.py --- a/Lib/distutils/tests/test_clean.py +++ b/Lib/distutils/tests/test_clean.py @@ -36,7 +36,7 @@ # make sure the files where removed for name, path in dirs: - self.assertTrue(not os.path.exists(path), + self.assertFalse(os.path.exists(path), '%s was not removed' % path) # let's run the command again (should spit warnings but succeed) diff --git a/Lib/distutils/tests/test_config.py b/Lib/distutils/tests/test_config.py --- a/Lib/distutils/tests/test_config.py +++ b/Lib/distutils/tests/test_config.py @@ -103,7 +103,7 @@ def test_server_empty_registration(self): cmd = self._cmd(self.dist) rc = cmd._get_rc_file() - self.assertTrue(not os.path.exists(rc)) + self.assertFalse(os.path.exists(rc)) cmd._store_pypirc('tarek', 'xxx') self.assertTrue(os.path.exists(rc)) f = open(rc) diff --git a/Lib/distutils/tests/test_config_cmd.py b/Lib/distutils/tests/test_config_cmd.py --- a/Lib/distutils/tests/test_config_cmd.py +++ b/Lib/distutils/tests/test_config_cmd.py @@ -81,7 +81,7 @@ cmd._clean(f1, f2) for f in (f1, f2): - self.assertTrue(not os.path.exists(f)) + self.assertFalse(os.path.exists(f)) def test_suite(): return unittest.makeSuite(ConfigTestCase) diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py --- a/Lib/distutils/tests/test_install.py +++ b/Lib/distutils/tests/test_install.py @@ -236,7 +236,7 @@ self.test_record() finally: install_module.DEBUG = False - self.assertTrue(len(self.logs) > old_logs_len) + self.assertGreater(len(self.logs), old_logs_len) def test_suite(): diff --git a/Lib/distutils/tests/test_install_lib.py b/Lib/distutils/tests/test_install_lib.py --- a/Lib/distutils/tests/test_install_lib.py +++ b/Lib/distutils/tests/test_install_lib.py @@ -103,7 +103,7 @@ finally: sys.dont_write_bytecode = old_dont_write_bytecode - self.assertTrue('byte-compiling is disabled' in self.logs[0][1]) + self.assertIn('byte-compiling is disabled', self.logs[0][1]) def test_suite(): diff --git a/Lib/distutils/tests/test_install_scripts.py b/Lib/distutils/tests/test_install_scripts.py --- a/Lib/distutils/tests/test_install_scripts.py +++ b/Lib/distutils/tests/test_install_scripts.py @@ -24,10 +24,10 @@ skip_build=1, ) cmd = install_scripts(dist) - self.assertTrue(not cmd.force) - self.assertTrue(not cmd.skip_build) - self.assertTrue(cmd.build_dir is None) - self.assertTrue(cmd.install_dir is None) + self.assertFalse(cmd.force) + self.assertFalse(cmd.skip_build) + self.assertIsNone(cmd.build_dir) + self.assertIsNone(cmd.install_dir) cmd.finalize_options() @@ -72,7 +72,7 @@ installed = os.listdir(target) for name in expected: - self.assertTrue(name in installed) + self.assertIn(name, installed) def test_suite(): diff --git a/Lib/distutils/tests/test_msvc9compiler.py b/Lib/distutils/tests/test_msvc9compiler.py --- a/Lib/distutils/tests/test_msvc9compiler.py +++ b/Lib/distutils/tests/test_msvc9compiler.py @@ -128,7 +128,7 @@ # windows registeries versions. 
path = r'Control Panel\Desktop' v = Reg.get_value(path, 'dragfullwindows') - self.assertTrue(v in ('0', '1', '2')) + self.assertIn(v, ('0', '1', '2')) import winreg HKCU = winreg.HKEY_CURRENT_USER @@ -136,7 +136,7 @@ self.assertEqual(keys, None) keys = Reg.read_keys(HKCU, r'Control Panel') - self.assertTrue('Desktop' in keys) + self.assertIn('Desktop', keys) def test_remove_visual_c_ref(self): from distutils.msvc9compiler import MSVCCompiler @@ -174,7 +174,7 @@ compiler = MSVCCompiler() got = compiler._remove_visual_c_ref(manifest) - self.assertIs(got, None) + self.assertIsNone(got) def test_suite(): diff --git a/Lib/distutils/tests/test_register.py b/Lib/distutils/tests/test_register.py --- a/Lib/distutils/tests/test_register.py +++ b/Lib/distutils/tests/test_register.py @@ -98,7 +98,7 @@ cmd = self._get_cmd() # we shouldn't have a .pypirc file yet - self.assertTrue(not os.path.exists(self.rc)) + self.assertFalse(os.path.exists(self.rc)) # patching input and getpass.getpass # so register gets happy @@ -145,7 +145,7 @@ self.assertEqual(req1['Content-length'], '1374') self.assertEqual(req2['Content-length'], '1374') - self.assertTrue((b'xxx') in self.conn.reqs[1].data) + self.assertIn(b'xxx', self.conn.reqs[1].data) def test_password_not_in_file(self): @@ -175,7 +175,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '608') - self.assertTrue((b'tarek') in req.data) + self.assertIn(b'tarek', req.data) def test_password_reset(self): # this test runs choice 3 @@ -193,7 +193,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '290') - self.assertTrue((b'tarek') in req.data) + self.assertIn(b'tarek', req.data) @unittest.skipUnless(docutils is not None, 'needs docutils') def test_strict(self): diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -50,7 +50,7 @@ def test_get_config_vars(self): cvars = sysconfig.get_config_vars() - self.assertTrue(isinstance(cvars, dict)) + self.assertIsInstance(cvars, dict) self.assertTrue(cvars) def test_srcdir(self): diff --git a/Lib/distutils/tests/test_util.py b/Lib/distutils/tests/test_util.py --- a/Lib/distutils/tests/test_util.py +++ b/Lib/distutils/tests/test_util.py @@ -266,7 +266,7 @@ self.assertTrue(strtobool(y)) for n in no: - self.assertTrue(not strtobool(n)) + self.assertFalse(strtobool(n)) def test_rfc822_escape(self): header = 'I am a\npoor\nlonesome\nheader\n' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:30:02 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:30:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319600=3A_Use_specific_asserts_in_distutils_test?= =?utf-8?q?s=2E?= Message-ID: <3dMWMZ1fcTz7Lq0@mail.python.org> http://hg.python.org/cpython/rev/6a3501aab559 changeset: 87161:6a3501aab559 parent: 87158:aacb0bd969e5 parent: 87160:5a0236f3f972 user: Serhiy Storchaka date: Sun Nov 17 00:20:12 2013 +0200 summary: Issue #19600: Use specific asserts in distutils tests. 
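[Editorial aside, not part of the patch: the distutils-test changes in this changeset follow the same pattern as the unittest ones above, and they keep the optional msg argument where the old assertTrue call carried one. A minimal hypothetical sketch of the rewritten forms, with placeholder paths and names:]

import os
import tempfile
import unittest

class DistutilsStyleAsserts(unittest.TestCase):
    # Hypothetical test case; the paths and expected names are placeholders.
    def test_rewritten_forms(self):
        tmpdir = tempfile.mkdtemp()
        self.addCleanup(os.rmdir, tmpdir)
        missing = os.path.join(tmpdir, 'removed.txt')
        # before: assertTrue(not os.path.exists(path), '%s was not removed' % path)
        self.assertFalse(os.path.exists(missing), '%s was not removed' % missing)
        # before: assertTrue(cmd.build_dir is None)
        self.assertIsNone(None)
        # before: assertTrue('foo-0.1-1.noarch.rpm' in dist_created)
        self.assertIn('setup.py', ['setup.py', 'README'])

if __name__ == '__main__':
    unittest.main()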
files: Lib/distutils/tests/test_archive_util.py | 2 +- Lib/distutils/tests/test_bdist_rpm.py | 4 +- Lib/distutils/tests/test_bdist_wininst.py | 2 +- Lib/distutils/tests/test_build_clib.py | 2 +- Lib/distutils/tests/test_build_ext.py | 16 +++++----- Lib/distutils/tests/test_build_scripts.py | 8 ++-- Lib/distutils/tests/test_clean.py | 2 +- Lib/distutils/tests/test_config.py | 2 +- Lib/distutils/tests/test_config_cmd.py | 2 +- Lib/distutils/tests/test_install.py | 2 +- Lib/distutils/tests/test_install_lib.py | 2 +- Lib/distutils/tests/test_install_scripts.py | 10 +++--- Lib/distutils/tests/test_msvc9compiler.py | 6 +- Lib/distutils/tests/test_register.py | 8 ++-- Lib/distutils/tests/test_sysconfig.py | 2 +- Lib/distutils/tests/test_util.py | 2 +- 16 files changed, 36 insertions(+), 36 deletions(-) diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -217,7 +217,7 @@ dry_run=True) finally: os.chdir(old_dir) - self.assertTrue(not os.path.exists(tarball)) + self.assertFalse(os.path.exists(tarball)) self.assertEqual(len(w.warnings), 1) @unittest.skipUnless(ZIP_SUPPORT and ZLIB_SUPPORT, diff --git a/Lib/distutils/tests/test_bdist_rpm.py b/Lib/distutils/tests/test_bdist_rpm.py --- a/Lib/distutils/tests/test_bdist_rpm.py +++ b/Lib/distutils/tests/test_bdist_rpm.py @@ -81,7 +81,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) @@ -125,7 +125,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) diff --git a/Lib/distutils/tests/test_bdist_wininst.py b/Lib/distutils/tests/test_bdist_wininst.py --- a/Lib/distutils/tests/test_bdist_wininst.py +++ b/Lib/distutils/tests/test_bdist_wininst.py @@ -22,7 +22,7 @@ # and make sure it finds it and returns its content # no matter what platform we have exe_file = cmd.get_exe_bytes() - self.assertTrue(len(exe_file) > 10) + self.assertGreater(len(exe_file), 10) def test_suite(): return unittest.makeSuite(BuildWinInstTestCase) diff --git a/Lib/distutils/tests/test_build_clib.py b/Lib/distutils/tests/test_build_clib.py --- a/Lib/distutils/tests/test_build_clib.py +++ b/Lib/distutils/tests/test_build_clib.py @@ -137,7 +137,7 @@ cmd.run() # let's check the result - self.assertTrue('libfoo.a' in os.listdir(build_temp)) + self.assertIn('libfoo.a', os.listdir(build_temp)) def test_suite(): return unittest.makeSuite(BuildCLibTestCase) diff --git a/Lib/distutils/tests/test_build_ext.py b/Lib/distutils/tests/test_build_ext.py --- a/Lib/distutils/tests/test_build_ext.py +++ b/Lib/distutils/tests/test_build_ext.py @@ -76,8 +76,8 @@ if support.HAVE_DOCSTRINGS: doc = 'This is a template module just for instruction.' 
self.assertEqual(xx.__doc__, doc) - self.assertTrue(isinstance(xx.Null(), xx.Null)) - self.assertTrue(isinstance(xx.Str(), xx.Str)) + self.assertIsInstance(xx.Null(), xx.Null) + self.assertIsInstance(xx.Str(), xx.Str) def tearDown(self): # Get everything back to normal @@ -110,7 +110,7 @@ _config_vars['Py_ENABLE_SHARED'] = old_var # make sure we get some library dirs under solaris - self.assertTrue(len(cmd.library_dirs) > 0) + self.assertGreater(len(cmd.library_dirs), 0) def test_user_site(self): # site.USER_SITE was introduced in 2.6 @@ -124,7 +124,7 @@ # making sure the user option is there options = [name for name, short, lable in cmd.user_options] - self.assertTrue('user' in options) + self.assertIn('user', options) # setting a value cmd.user = 1 @@ -171,10 +171,10 @@ from distutils import sysconfig py_include = sysconfig.get_python_inc() - self.assertTrue(py_include in cmd.include_dirs) + self.assertIn(py_include, cmd.include_dirs) plat_py_include = sysconfig.get_python_inc(plat_specific=1) - self.assertTrue(plat_py_include in cmd.include_dirs) + self.assertIn(plat_py_include, cmd.include_dirs) # make sure cmd.libraries is turned into a list # if it's a string @@ -255,13 +255,13 @@ 'some': 'bar'})] cmd.check_extensions_list(exts) ext = exts[0] - self.assertTrue(isinstance(ext, Extension)) + self.assertIsInstance(ext, Extension) # check_extensions_list adds in ext the values passed # when they are in ('include_dirs', 'library_dirs', 'libraries' # 'extra_objects', 'extra_compile_args', 'extra_link_args') self.assertEqual(ext.libraries, 'foo') - self.assertTrue(not hasattr(ext, 'some')) + self.assertFalse(hasattr(ext, 'some')) # 'macros' element of build info dict must be 1- or 2-tuple exts = [('foo.bar', {'sources': [''], 'libraries': 'foo', diff --git a/Lib/distutils/tests/test_build_scripts.py b/Lib/distutils/tests/test_build_scripts.py --- a/Lib/distutils/tests/test_build_scripts.py +++ b/Lib/distutils/tests/test_build_scripts.py @@ -17,8 +17,8 @@ def test_default_settings(self): cmd = self.get_build_scripts_cmd("/foo/bar", []) - self.assertTrue(not cmd.force) - self.assertTrue(cmd.build_dir is None) + self.assertFalse(cmd.force) + self.assertIsNone(cmd.build_dir) cmd.finalize_options() @@ -38,7 +38,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def get_build_scripts_cmd(self, target, scripts): import sys @@ -103,7 +103,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def test_suite(): return unittest.makeSuite(BuildScriptsTestCase) diff --git a/Lib/distutils/tests/test_clean.py b/Lib/distutils/tests/test_clean.py --- a/Lib/distutils/tests/test_clean.py +++ b/Lib/distutils/tests/test_clean.py @@ -36,7 +36,7 @@ # make sure the files where removed for name, path in dirs: - self.assertTrue(not os.path.exists(path), + self.assertFalse(os.path.exists(path), '%s was not removed' % path) # let's run the command again (should spit warnings but succeed) diff --git a/Lib/distutils/tests/test_config.py b/Lib/distutils/tests/test_config.py --- a/Lib/distutils/tests/test_config.py +++ b/Lib/distutils/tests/test_config.py @@ -103,7 +103,7 @@ def test_server_empty_registration(self): cmd = self._cmd(self.dist) rc = cmd._get_rc_file() - self.assertTrue(not os.path.exists(rc)) + self.assertFalse(os.path.exists(rc)) cmd._store_pypirc('tarek', 'xxx') self.assertTrue(os.path.exists(rc)) f = open(rc) diff --git a/Lib/distutils/tests/test_config_cmd.py 
b/Lib/distutils/tests/test_config_cmd.py --- a/Lib/distutils/tests/test_config_cmd.py +++ b/Lib/distutils/tests/test_config_cmd.py @@ -81,7 +81,7 @@ cmd._clean(f1, f2) for f in (f1, f2): - self.assertTrue(not os.path.exists(f)) + self.assertFalse(os.path.exists(f)) def test_suite(): return unittest.makeSuite(ConfigTestCase) diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py --- a/Lib/distutils/tests/test_install.py +++ b/Lib/distutils/tests/test_install.py @@ -236,7 +236,7 @@ self.test_record() finally: install_module.DEBUG = False - self.assertTrue(len(self.logs) > old_logs_len) + self.assertGreater(len(self.logs), old_logs_len) def test_suite(): diff --git a/Lib/distutils/tests/test_install_lib.py b/Lib/distutils/tests/test_install_lib.py --- a/Lib/distutils/tests/test_install_lib.py +++ b/Lib/distutils/tests/test_install_lib.py @@ -105,7 +105,7 @@ finally: sys.dont_write_bytecode = old_dont_write_bytecode - self.assertTrue('byte-compiling is disabled' in self.logs[0][1]) + self.assertIn('byte-compiling is disabled', self.logs[0][1]) def test_suite(): diff --git a/Lib/distutils/tests/test_install_scripts.py b/Lib/distutils/tests/test_install_scripts.py --- a/Lib/distutils/tests/test_install_scripts.py +++ b/Lib/distutils/tests/test_install_scripts.py @@ -24,10 +24,10 @@ skip_build=1, ) cmd = install_scripts(dist) - self.assertTrue(not cmd.force) - self.assertTrue(not cmd.skip_build) - self.assertTrue(cmd.build_dir is None) - self.assertTrue(cmd.install_dir is None) + self.assertFalse(cmd.force) + self.assertFalse(cmd.skip_build) + self.assertIsNone(cmd.build_dir) + self.assertIsNone(cmd.install_dir) cmd.finalize_options() @@ -72,7 +72,7 @@ installed = os.listdir(target) for name in expected: - self.assertTrue(name in installed) + self.assertIn(name, installed) def test_suite(): diff --git a/Lib/distutils/tests/test_msvc9compiler.py b/Lib/distutils/tests/test_msvc9compiler.py --- a/Lib/distutils/tests/test_msvc9compiler.py +++ b/Lib/distutils/tests/test_msvc9compiler.py @@ -128,7 +128,7 @@ # windows registeries versions. 
path = r'Control Panel\Desktop' v = Reg.get_value(path, 'dragfullwindows') - self.assertTrue(v in ('0', '1', '2')) + self.assertIn(v, ('0', '1', '2')) import winreg HKCU = winreg.HKEY_CURRENT_USER @@ -136,7 +136,7 @@ self.assertEqual(keys, None) keys = Reg.read_keys(HKCU, r'Control Panel') - self.assertTrue('Desktop' in keys) + self.assertIn('Desktop', keys) def test_remove_visual_c_ref(self): from distutils.msvc9compiler import MSVCCompiler @@ -174,7 +174,7 @@ compiler = MSVCCompiler() got = compiler._remove_visual_c_ref(manifest) - self.assertIs(got, None) + self.assertIsNone(got) def test_suite(): diff --git a/Lib/distutils/tests/test_register.py b/Lib/distutils/tests/test_register.py --- a/Lib/distutils/tests/test_register.py +++ b/Lib/distutils/tests/test_register.py @@ -98,7 +98,7 @@ cmd = self._get_cmd() # we shouldn't have a .pypirc file yet - self.assertTrue(not os.path.exists(self.rc)) + self.assertFalse(os.path.exists(self.rc)) # patching input and getpass.getpass # so register gets happy @@ -145,7 +145,7 @@ self.assertEqual(req1['Content-length'], '1374') self.assertEqual(req2['Content-length'], '1374') - self.assertTrue((b'xxx') in self.conn.reqs[1].data) + self.assertIn(b'xxx', self.conn.reqs[1].data) def test_password_not_in_file(self): @@ -175,7 +175,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '608') - self.assertTrue((b'tarek') in req.data) + self.assertIn(b'tarek', req.data) def test_password_reset(self): # this test runs choice 3 @@ -193,7 +193,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '290') - self.assertTrue((b'tarek') in req.data) + self.assertIn(b'tarek', req.data) @unittest.skipUnless(docutils is not None, 'needs docutils') def test_strict(self): diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -50,7 +50,7 @@ def test_get_config_vars(self): cvars = sysconfig.get_config_vars() - self.assertTrue(isinstance(cvars, dict)) + self.assertIsInstance(cvars, dict) self.assertTrue(cvars) def test_srcdir(self): diff --git a/Lib/distutils/tests/test_util.py b/Lib/distutils/tests/test_util.py --- a/Lib/distutils/tests/test_util.py +++ b/Lib/distutils/tests/test_util.py @@ -266,7 +266,7 @@ self.assertTrue(strtobool(y)) for n in no: - self.assertTrue(not strtobool(n)) + self.assertFalse(strtobool(n)) def test_rfc822_escape(self): header = 'I am a\npoor\nlonesome\nheader\n' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 16 23:30:03 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 16 Nov 2013 23:30:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjAw?= =?utf-8?q?=3A_Use_specific_asserts_in_distutils_tests=2E?= Message-ID: <3dMWMb4hwrz7LrH@mail.python.org> http://hg.python.org/cpython/rev/097389413dd3 changeset: 87162:097389413dd3 branch: 2.7 parent: 87159:1d1bad272733 user: Serhiy Storchaka date: Sun Nov 17 00:29:27 2013 +0200 summary: Issue #19600: Use specific asserts in distutils tests. 
files: Lib/distutils/tests/test_archive_util.py | 2 +- Lib/distutils/tests/test_bdist_rpm.py | 4 +- Lib/distutils/tests/test_bdist_wininst.py | 2 +- Lib/distutils/tests/test_build_clib.py | 2 +- Lib/distutils/tests/test_build_ext.py | 14 +++++----- Lib/distutils/tests/test_build_scripts.py | 8 ++-- Lib/distutils/tests/test_clean.py | 2 +- Lib/distutils/tests/test_config.py | 2 +- Lib/distutils/tests/test_config_cmd.py | 2 +- Lib/distutils/tests/test_install.py | 2 +- Lib/distutils/tests/test_install_lib.py | 4 +- Lib/distutils/tests/test_install_scripts.py | 10 +++--- Lib/distutils/tests/test_msvc9compiler.py | 6 ++-- Lib/distutils/tests/test_register.py | 8 ++-- Lib/distutils/tests/test_upload.py | 4 +- 15 files changed, 36 insertions(+), 36 deletions(-) diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -199,7 +199,7 @@ dry_run=True) finally: os.chdir(old_dir) - self.assertTrue(not os.path.exists(tarball)) + self.assertFalse(os.path.exists(tarball)) self.assertEqual(len(w.warnings), 1) @unittest.skipUnless(zlib, "Requires zlib") diff --git a/Lib/distutils/tests/test_bdist_rpm.py b/Lib/distutils/tests/test_bdist_rpm.py --- a/Lib/distutils/tests/test_bdist_rpm.py +++ b/Lib/distutils/tests/test_bdist_rpm.py @@ -77,7 +77,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) @@ -121,7 +121,7 @@ cmd.run() dist_created = os.listdir(os.path.join(pkg_dir, 'dist')) - self.assertTrue('foo-0.1-1.noarch.rpm' in dist_created) + self.assertIn('foo-0.1-1.noarch.rpm', dist_created) # bug #2945: upload ignores bdist_rpm files self.assertIn(('bdist_rpm', 'any', 'dist/foo-0.1-1.src.rpm'), dist.dist_files) diff --git a/Lib/distutils/tests/test_bdist_wininst.py b/Lib/distutils/tests/test_bdist_wininst.py --- a/Lib/distutils/tests/test_bdist_wininst.py +++ b/Lib/distutils/tests/test_bdist_wininst.py @@ -23,7 +23,7 @@ # and make sure it finds it and returns its content # no matter what platform we have exe_file = cmd.get_exe_bytes() - self.assertTrue(len(exe_file) > 10) + self.assertGreater(len(exe_file), 10) def test_suite(): return unittest.makeSuite(BuildWinInstTestCase) diff --git a/Lib/distutils/tests/test_build_clib.py b/Lib/distutils/tests/test_build_clib.py --- a/Lib/distutils/tests/test_build_clib.py +++ b/Lib/distutils/tests/test_build_clib.py @@ -137,7 +137,7 @@ cmd.run() # let's check the result - self.assertTrue('libfoo.a' in os.listdir(build_temp)) + self.assertIn('libfoo.a', os.listdir(build_temp)) def test_suite(): return unittest.makeSuite(BuildCLibTestCase) diff --git a/Lib/distutils/tests/test_build_ext.py b/Lib/distutils/tests/test_build_ext.py --- a/Lib/distutils/tests/test_build_ext.py +++ b/Lib/distutils/tests/test_build_ext.py @@ -80,8 +80,8 @@ if test_support.HAVE_DOCSTRINGS: doc = 'This is a template module just for instruction.' 
self.assertEqual(xx.__doc__, doc) - self.assertTrue(isinstance(xx.Null(), xx.Null)) - self.assertTrue(isinstance(xx.Str(), xx.Str)) + self.assertIsInstance(xx.Null(), xx.Null) + self.assertIsInstance(xx.Str(), xx.Str) def test_solaris_enable_shared(self): dist = Distribution({'name': 'xx'}) @@ -102,7 +102,7 @@ _config_vars['Py_ENABLE_SHARED'] = old_var # make sure we get some library dirs under solaris - self.assertTrue(len(cmd.library_dirs) > 0) + self.assertGreater(len(cmd.library_dirs), 0) def test_user_site(self): # site.USER_SITE was introduced in 2.6 @@ -143,10 +143,10 @@ cmd.finalize_options() py_include = sysconfig.get_python_inc() - self.assertTrue(py_include in cmd.include_dirs) + self.assertIn(py_include, cmd.include_dirs) plat_py_include = sysconfig.get_python_inc(plat_specific=1) - self.assertTrue(plat_py_include in cmd.include_dirs) + self.assertIn(plat_py_include, cmd.include_dirs) # make sure cmd.libraries is turned into a list # if it's a string @@ -226,13 +226,13 @@ 'some': 'bar'})] cmd.check_extensions_list(exts) ext = exts[0] - self.assertTrue(isinstance(ext, Extension)) + self.assertIsInstance(ext, Extension) # check_extensions_list adds in ext the values passed # when they are in ('include_dirs', 'library_dirs', 'libraries' # 'extra_objects', 'extra_compile_args', 'extra_link_args') self.assertEqual(ext.libraries, 'foo') - self.assertTrue(not hasattr(ext, 'some')) + self.assertFalse(hasattr(ext, 'some')) # 'macros' element of build info dict must be 1- or 2-tuple exts = [('foo.bar', {'sources': [''], 'libraries': 'foo', diff --git a/Lib/distutils/tests/test_build_scripts.py b/Lib/distutils/tests/test_build_scripts.py --- a/Lib/distutils/tests/test_build_scripts.py +++ b/Lib/distutils/tests/test_build_scripts.py @@ -17,8 +17,8 @@ def test_default_settings(self): cmd = self.get_build_scripts_cmd("/foo/bar", []) - self.assertTrue(not cmd.force) - self.assertTrue(cmd.build_dir is None) + self.assertFalse(cmd.force) + self.assertIsNone(cmd.build_dir) cmd.finalize_options() @@ -38,7 +38,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def get_build_scripts_cmd(self, target, scripts): import sys @@ -103,7 +103,7 @@ built = os.listdir(target) for name in expected: - self.assertTrue(name in built) + self.assertIn(name, built) def test_suite(): return unittest.makeSuite(BuildScriptsTestCase) diff --git a/Lib/distutils/tests/test_clean.py b/Lib/distutils/tests/test_clean.py --- a/Lib/distutils/tests/test_clean.py +++ b/Lib/distutils/tests/test_clean.py @@ -36,7 +36,7 @@ # make sure the files where removed for name, path in dirs: - self.assertTrue(not os.path.exists(path), + self.assertFalse(os.path.exists(path), '%s was not removed' % path) # let's run the command again (should spit warnings but succeed) diff --git a/Lib/distutils/tests/test_config.py b/Lib/distutils/tests/test_config.py --- a/Lib/distutils/tests/test_config.py +++ b/Lib/distutils/tests/test_config.py @@ -106,7 +106,7 @@ def test_server_empty_registration(self): cmd = self._cmd(self.dist) rc = cmd._get_rc_file() - self.assertTrue(not os.path.exists(rc)) + self.assertFalse(os.path.exists(rc)) cmd._store_pypirc('tarek', 'xxx') self.assertTrue(os.path.exists(rc)) f = open(rc) diff --git a/Lib/distutils/tests/test_config_cmd.py b/Lib/distutils/tests/test_config_cmd.py --- a/Lib/distutils/tests/test_config_cmd.py +++ b/Lib/distutils/tests/test_config_cmd.py @@ -81,7 +81,7 @@ cmd._clean(f1, f2) for f in (f1, f2): - self.assertTrue(not 
os.path.exists(f)) + self.assertFalse(os.path.exists(f)) def test_suite(): return unittest.makeSuite(ConfigTestCase) diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py --- a/Lib/distutils/tests/test_install.py +++ b/Lib/distutils/tests/test_install.py @@ -237,7 +237,7 @@ self.test_record() finally: install_module.DEBUG = False - self.assertTrue(len(self.logs) > old_logs_len) + self.assertGreater(len(self.logs), old_logs_len) def test_suite(): diff --git a/Lib/distutils/tests/test_install_lib.py b/Lib/distutils/tests/test_install_lib.py --- a/Lib/distutils/tests/test_install_lib.py +++ b/Lib/distutils/tests/test_install_lib.py @@ -65,7 +65,7 @@ cmd.distribution.script_name = 'setup.py' # get_output should return 4 elements - self.assertTrue(len(cmd.get_outputs()) >= 2) + self.assertGreaterEqual(len(cmd.get_outputs()), 2) def test_get_inputs(self): pkg_dir, dist = self.create_dist() @@ -98,7 +98,7 @@ finally: sys.dont_write_bytecode = old_dont_write_bytecode - self.assertTrue('byte-compiling is disabled' in self.logs[0][1]) + self.assertIn('byte-compiling is disabled', self.logs[0][1]) def test_suite(): return unittest.makeSuite(InstallLibTestCase) diff --git a/Lib/distutils/tests/test_install_scripts.py b/Lib/distutils/tests/test_install_scripts.py --- a/Lib/distutils/tests/test_install_scripts.py +++ b/Lib/distutils/tests/test_install_scripts.py @@ -24,10 +24,10 @@ skip_build=1, ) cmd = install_scripts(dist) - self.assertTrue(not cmd.force) - self.assertTrue(not cmd.skip_build) - self.assertTrue(cmd.build_dir is None) - self.assertTrue(cmd.install_dir is None) + self.assertFalse(cmd.force) + self.assertFalse(cmd.skip_build) + self.assertIsNone(cmd.build_dir) + self.assertIsNone(cmd.install_dir) cmd.finalize_options() @@ -72,7 +72,7 @@ installed = os.listdir(target) for name in expected: - self.assertTrue(name in installed) + self.assertIn(name, installed) def test_suite(): diff --git a/Lib/distutils/tests/test_msvc9compiler.py b/Lib/distutils/tests/test_msvc9compiler.py --- a/Lib/distutils/tests/test_msvc9compiler.py +++ b/Lib/distutils/tests/test_msvc9compiler.py @@ -128,7 +128,7 @@ # windows registeries versions. 
path = r'Control Panel\Desktop' v = Reg.get_value(path, u'dragfullwindows') - self.assertTrue(v in (u'0', u'1', u'2')) + self.assertIn(v, (u'0', u'1', u'2')) import _winreg HKCU = _winreg.HKEY_CURRENT_USER @@ -136,7 +136,7 @@ self.assertEqual(keys, None) keys = Reg.read_keys(HKCU, r'Control Panel') - self.assertTrue('Desktop' in keys) + self.assertIn('Desktop', keys) def test_remove_visual_c_ref(self): from distutils.msvc9compiler import MSVCCompiler @@ -174,7 +174,7 @@ compiler = MSVCCompiler() got = compiler._remove_visual_c_ref(manifest) - self.assertIs(got, None) + self.assertIsNone(got) def test_suite(): diff --git a/Lib/distutils/tests/test_register.py b/Lib/distutils/tests/test_register.py --- a/Lib/distutils/tests/test_register.py +++ b/Lib/distutils/tests/test_register.py @@ -99,7 +99,7 @@ cmd = self._get_cmd() # we shouldn't have a .pypirc file yet - self.assertTrue(not os.path.exists(self.rc)) + self.assertFalse(os.path.exists(self.rc)) # patching raw_input and getpass.getpass # so register gets happy @@ -144,7 +144,7 @@ req1 = dict(self.conn.reqs[0].headers) req2 = dict(self.conn.reqs[1].headers) self.assertEqual(req2['Content-length'], req1['Content-length']) - self.assertTrue('xxx' in self.conn.reqs[1].data) + self.assertIn('xxx', self.conn.reqs[1].data) def test_password_not_in_file(self): @@ -174,7 +174,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '608') - self.assertTrue('tarek' in req.data) + self.assertIn('tarek', req.data) def test_password_reset(self): # this test runs choice 3 @@ -192,7 +192,7 @@ req = self.conn.reqs[0] headers = dict(req.headers) self.assertEqual(headers['Content-length'], '290') - self.assertTrue('tarek' in req.data) + self.assertIn('tarek', req.data) @unittest.skipUnless(docutils is not None, 'needs docutils') def test_strict(self): diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -120,9 +120,9 @@ self.assertEqual(self.last_open.req.get_method(), 'POST') self.assertEqual(self.last_open.req.get_full_url(), 'http://pypi.python.org/pypi') - self.assertTrue('xxx' in self.last_open.req.data) + self.assertIn('xxx', self.last_open.req.data) auth = self.last_open.req.headers['Authorization'] - self.assertFalse('\n' in auth) + self.assertNotIn('\n', auth) def test_suite(): return unittest.makeSuite(uploadTestCase) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 01:47:52 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 01:47:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogIzE5MjM4OiBmaXgg?= =?utf-8?q?typo_in_documentation=2E?= Message-ID: <3dMZQc3yGDz7Lqr@mail.python.org> http://hg.python.org/cpython/rev/2b0690c9a026 changeset: 87163:2b0690c9a026 branch: 2.7 user: Ezio Melotti date: Sun Nov 17 02:47:12 2013 +0200 summary: #19238: fix typo in documentation. 
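As context for the sentence corrected below: the format-spec mini-language allows an optional *fill* character to precede the *align* flag, and when no fill is given it defaults to a space. A minimal interactive sketch of that behaviour (illustrative only, not part of the patch):

    >>> format("abc", "*>10")    # fill '*', align '>' (right) in a field of width 10
    '*******abc'
    >>> format(-3.14, "0=8.2f")  # fill '0' with '=' alignment pads between sign and digits
    '-0003.14'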
files: Doc/library/string.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -330,7 +330,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 01:47:53 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 01:47:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5MjM4OiBmaXgg?= =?utf-8?q?typo_in_documentation=2E?= Message-ID: <3dMZQd5n6yz7Lvc@mail.python.org> http://hg.python.org/cpython/rev/7e3f8026ed30 changeset: 87164:7e3f8026ed30 branch: 3.3 parent: 87160:5a0236f3f972 user: Ezio Melotti date: Sun Nov 17 02:47:12 2013 +0200 summary: #19238: fix typo in documentation. files: Doc/library/string.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -300,7 +300,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 01:47:55 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 01:47:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogIzE5MjM4OiBtZXJnZSB3aXRoIDMuMy4=?= Message-ID: <3dMZQg0T51z7Lvc@mail.python.org> http://hg.python.org/cpython/rev/829b95824867 changeset: 87165:829b95824867 parent: 87161:6a3501aab559 parent: 87164:7e3f8026ed30 user: Ezio Melotti date: Sun Nov 17 02:47:38 2013 +0200 summary: #19238: merge with 3.3. files: Doc/library/string.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -300,7 +300,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. 
Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 07:00:21 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 17 Nov 2013 07:00:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319282=3A_Native_c?= =?utf-8?q?ontext_management_in_dbm?= Message-ID: <3dMjM94D6zz7Ljp@mail.python.org> http://hg.python.org/cpython/rev/c2f1bb56760d changeset: 87166:c2f1bb56760d user: Nick Coghlan date: Sun Nov 17 15:59:51 2013 +1000 summary: Close #19282: Native context management in dbm files: Doc/library/dbm.rst | 38 +++++++++++++++----------- Lib/dbm/dumb.py | 6 ++++ Lib/test/test_dbm_dumb.py | 13 +++++++++ Lib/test/test_dbm_gnu.py | 11 +++++++ Lib/test/test_dbm_ndbm.py | 13 +++++++++ Misc/NEWS | 3 ++ Modules/_dbmmodule.c | 17 ++++++++++++ Modules/_gdbmmodule.c | 16 +++++++++++ 8 files changed, 101 insertions(+), 16 deletions(-) diff --git a/Doc/library/dbm.rst b/Doc/library/dbm.rst --- a/Doc/library/dbm.rst +++ b/Doc/library/dbm.rst @@ -73,33 +73,39 @@ strings are used they are implicitly converted to the default encoding before being stored. +These objects also support being used in a :keyword:`with` statement, which +will automatically close them when done. + +.. versionchanged:: 3.4 + Added native support for the context management protocol to the objects + returned by :func:`.open`. + The following example records some hostnames and a corresponding title, and then prints out the contents of the database:: import dbm # Open database, creating it if necessary. - db = dbm.open('cache', 'c') + with dbm.open('cache', 'c') as db: - # Record some values - db[b'hello'] = b'there' - db['www.python.org'] = 'Python Website' - db['www.cnn.com'] = 'Cable News Network' + # Record some values + db[b'hello'] = b'there' + db['www.python.org'] = 'Python Website' + db['www.cnn.com'] = 'Cable News Network' - # Note that the keys are considered bytes now. - assert db[b'www.python.org'] == b'Python Website' - # Notice how the value is now in bytes. - assert db['www.cnn.com'] == b'Cable News Network' + # Note that the keys are considered bytes now. + assert db[b'www.python.org'] == b'Python Website' + # Notice how the value is now in bytes. + assert db['www.cnn.com'] == b'Cable News Network' - # Often-used methods of the dict interface work too. - print(db.get('python.org', b'not present')) + # Often-used methods of the dict interface work too. + print(db.get('python.org', b'not present')) - # Storing a non-string key or value will raise an exception (most - # likely a TypeError). - db['www.yahoo.com'] = 4 + # Storing a non-string key or value will raise an exception (most + # likely a TypeError). + db['www.yahoo.com'] = 4 - # Close when done. - db.close() + # db is automatically closed when leaving the with statement. .. seealso:: diff --git a/Lib/dbm/dumb.py b/Lib/dbm/dumb.py --- a/Lib/dbm/dumb.py +++ b/Lib/dbm/dumb.py @@ -236,6 +236,12 @@ if hasattr(self._os, 'chmod'): self._os.chmod(file, self._mode) + def __enter__(self): + return self + + def __exit__(self, *args): + self.close() + def open(file, flag=None, mode=0o666): """Open the database file, filename, and return corresponding object. 
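The intended usage after this change, as a minimal sketch (the file name is arbitrary and the printed value assumes the key was not stored before):

    import dbm

    # New in 3.4: the object returned by dbm.open() is a context manager,
    # so close() is called automatically when the block exits.
    with dbm.open('example', 'c') as db:
        db['hello'] = 'there'        # str keys/values are encoded to bytes
        print(db.get(b'hello'))      # prints b'there'
    # the database is closed here; further access raises an error

The dumb backend implements __enter__/__exit__ in pure Python, while the _dbm and _gdbm C modules (further down in this diff) add equivalent method slots that simply return self and call close().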
diff --git a/Lib/test/test_dbm_dumb.py b/Lib/test/test_dbm_dumb.py --- a/Lib/test/test_dbm_dumb.py +++ b/Lib/test/test_dbm_dumb.py @@ -184,6 +184,19 @@ self.assertEqual(expected, got) f.close() + def test_context_manager(self): + with dumbdbm.open(_fname, 'c') as db: + db["dumbdbm context manager"] = "context manager" + + with dumbdbm.open(_fname, 'r') as db: + self.assertEqual(list(db.keys()), [b"dumbdbm context manager"]) + + # This currently just raises AttributeError rather than a specific + # exception like the GNU or NDBM based implementations. See + # http://bugs.python.org/issue19385 for details. + with self.assertRaises(Exception): + db.keys() + def tearDown(self): _delete_files() diff --git a/Lib/test/test_dbm_gnu.py b/Lib/test/test_dbm_gnu.py --- a/Lib/test/test_dbm_gnu.py +++ b/Lib/test/test_dbm_gnu.py @@ -81,6 +81,17 @@ size2 = os.path.getsize(filename) self.assertTrue(size1 > size2 >= size0) + def test_context_manager(self): + with gdbm.open(filename, 'c') as db: + db["gdbm context manager"] = "context manager" + + with gdbm.open(filename, 'r') as db: + self.assertEqual(list(db.keys()), [b"gdbm context manager"]) + + with self.assertRaises(gdbm.error) as cm: + db.keys() + self.assertEqual(str(cm.exception), + "GDBM object has already been closed") if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_dbm_ndbm.py b/Lib/test/test_dbm_ndbm.py --- a/Lib/test/test_dbm_ndbm.py +++ b/Lib/test/test_dbm_ndbm.py @@ -37,5 +37,18 @@ except error: self.fail() + def test_context_manager(self): + with dbm.ndbm.open(self.filename, 'c') as db: + db["ndbm context manager"] = "context manager" + + with dbm.ndbm.open(self.filename, 'r') as db: + self.assertEqual(list(db.keys()), [b"ndbm context manager"]) + + with self.assertRaises(dbm.ndbm.error) as cm: + db.keys() + self.assertEqual(str(cm.exception), + "DBM object has already been closed") + + if __name__ == '__main__': unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,9 @@ Library ------- +- Issue #19282: dbm.open now supports the context manager protocol. (Inital + patch by Claudiu Popa) + - Issue #8311: Added support for writing any bytes-like objects in the aifc, sunau, and wave modules. diff --git a/Modules/_dbmmodule.c b/Modules/_dbmmodule.c --- a/Modules/_dbmmodule.c +++ b/Modules/_dbmmodule.c @@ -313,6 +313,21 @@ return defvalue; } +static PyObject * +dbm__enter__(PyObject *self, PyObject *args) +{ + Py_INCREF(self); + return self; +} + +static PyObject * +dbm__exit__(PyObject *self, PyObject *args) +{ + _Py_IDENTIFIER(close); + return _PyObject_CallMethodId(self, &PyId_close, NULL); +} + + static PyMethodDef dbm_methods[] = { {"close", (PyCFunction)dbm__close, METH_NOARGS, "close()\nClose the database."}, @@ -325,6 +340,8 @@ "setdefault(key[, default]) -> value\n" "Return the value for key if present, otherwise default. 
If key\n" "is not in the database, it is inserted with default as the value."}, + {"__enter__", dbm__enter__, METH_NOARGS, NULL}, + {"__exit__", dbm__exit__, METH_VARARGS, NULL}, {NULL, NULL} /* sentinel */ }; diff --git a/Modules/_gdbmmodule.c b/Modules/_gdbmmodule.c --- a/Modules/_gdbmmodule.c +++ b/Modules/_gdbmmodule.c @@ -425,6 +425,20 @@ return Py_None; } +static PyObject * +dbm__enter__(PyObject *self, PyObject *args) +{ + Py_INCREF(self); + return self; +} + +static PyObject * +dbm__exit__(PyObject *self, PyObject *args) +{ + _Py_IDENTIFIER(close); + return _PyObject_CallMethodId(self, &PyId_close, NULL); +} + static PyMethodDef dbm_methods[] = { {"close", (PyCFunction)dbm_close, METH_NOARGS, dbm_close__doc__}, {"keys", (PyCFunction)dbm_keys, METH_NOARGS, dbm_keys__doc__}, @@ -434,6 +448,8 @@ {"sync", (PyCFunction)dbm_sync, METH_NOARGS, dbm_sync__doc__}, {"get", (PyCFunction)dbm_get, METH_VARARGS, dbm_get__doc__}, {"setdefault",(PyCFunction)dbm_setdefault,METH_VARARGS, dbm_setdefault__doc__}, + {"__enter__", dbm__enter__, METH_NOARGS, NULL}, + {"__exit__", dbm__exit__, METH_VARARGS, NULL}, {NULL, NULL} /* sentinel */ }; -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sun Nov 17 07:36:46 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 17 Nov 2013 07:36:46 +0100 Subject: [Python-checkins] Daily reference leaks (829b95824867): sum=4 Message-ID: results for 829b95824867 on branch "default" -------------------------------------------- test_site leaked [0, 2, 0] references, sum=2 test_site leaked [0, 2, 0] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog0yr6pM', '-x'] From python-checkins at python.org Sun Nov 17 09:19:35 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Add_news_entry?= =?utf-8?q?_for_3=2E3=2E3_final=2E?= Message-ID: <3dMmRq16d2z7LtH@mail.python.org> http://hg.python.org/cpython/rev/ef44d2c58c7f changeset: 87167:ef44d2c58c7f branch: 3.3 parent: 86697:8609f6df9974 user: Georg Brandl date: Mon Oct 28 08:06:50 2013 +0100 summary: Add news entry for 3.3.3 final. files: Misc/NEWS | 13 +++++++++++-- 1 files changed, 11 insertions(+), 2 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,11 +2,20 @@ Python News +++++++++++ +What's New in Python 3.3.3? +=========================== + +*Release date: XX-Nov-2013* + +Tests +----- + +- Issue #18964: Fix test_tcl when run with Tcl/Tk versions < 8.5. + + What's New in Python 3.3.3 release candidate 1? =============================================== -.. *Not yet released, see sections below for changes released in 3.3.2* - *Release date: 27-Oct-2013* Core and Builtins -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:36 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE1NjYz?= =?utf-8?q?=3A_Revert_OS_X_installer_built-in_Tcl/Tk_support_for_3=2E3=2E3?= =?utf-8?q?=2E?= Message-ID: <3dMmRr413Mz7LwR@mail.python.org> http://hg.python.org/cpython/rev/f9927dcc85cf changeset: 87168:f9927dcc85cf branch: 3.3 user: Ned Deily date: Tue Nov 05 02:44:17 2013 -0800 summary: Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. 
Some third-party projects, such as matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in /Library/Frameworks. They were unable to build with the built-in Tcl/Tk and/or execute correctly. files: Mac/BuildScript/README.txt | 29 +-------------- Mac/BuildScript/build-installer.py | 10 ++-- Mac/BuildScript/resources/ReadMe.txt | 23 +++-------- Mac/BuildScript/resources/Welcome.rtf | 10 +++- Misc/NEWS | 8 ++++ 5 files changed, 28 insertions(+), 52 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,40 +57,13 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.7.13 - * Tcl 8.5.15 - * Tk 8.5.15 * XZ 5.0.3 - uses system-supplied versions of third-party libraries * readline module links with Apple BSD editline (libedit) - - requires ActiveState Tcl/Tk 8.5.14 (or later) to be installed for building - - * Beginning with Python 3.3.3, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. If it is - necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. - To enable (for all users of this Python 3.3):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.3 - cd ./lib/python3.3 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.3 - cd ./lib/python3.3 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit + - requires ActiveState Tcl/Tk 8.5.9 (or later) to be installed for building - recommended build environment: diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,7 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): result.extend([ dict( name="Tcl 8.5.15", @@ -571,7 +571,7 @@ # - the traditional version (renamed to _tkinter_library.so) linked # with /Library/Frameworks/{Tcl,Tk}.framework # - the default version linked with our builtin copies of Tcl and Tk - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ EXPECTED_SHARED_LIBS['_tkinter.so'] EXPECTED_SHARED_LIBS['_tkinter.so'] = [ @@ -971,7 +971,7 @@ # out-of-date and has critical bugs. Save the _tkinter.so that was # linked with /Library/Frameworks/{Tck,Tk}.framework and build # another _tkinter.so linked with our builtin Tcl and Tk libs. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): runCommand("find build -name '_tkinter.so' " " -execdir mv '{}' _tkinter_library.so \;") print("Running make to build builtin _tkinter") @@ -1012,7 +1012,7 @@ # users to select which to import by manipulating sys.path # directly or with PYTHONPATH. 
- if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): TKINTERS = ['builtin', 'library'] tkinter_moves = [('_tkinter_' + tkn + '.so', os.path.join(path_to_lib, 'lib-tkinter', tkn)) @@ -1059,7 +1059,7 @@ # The files are moved after the entire tree has been walked # since the shared library checking depends on the files # having unique names. - if DEPTARGET > '10.5': + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): for tkm in tkinter_moves: if fn == tkm[0]: moves_list.append( diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,27 +17,18 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. - **** IMPORTANT changes if you use IDLE and Tkinter **** + **** IMPORTANT **** -Installing a third-party version of Tcl/Tk is no longer required -================================================================ +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== -As of Python 3.3.3, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.5+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. - -Visit http://www.python.org/download/mac/tcltk/ +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ for current information about supported and recommended versions of Tcl/Tk for this version of Python and of Mac OS X. + Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -25,7 +25,11 @@ \b0 at any time to make $FULL_VERSION the default Python 3 version. This version can co-exist with other installed versions of Python 3 and Python 2.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 As of Python 3.3.3, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. 
Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,14 @@ - Issue #18964: Fix test_tcl when run with Tcl/Tk versions < 8.5. +Build +----- + +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. + Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + What's New in Python 3.3.3 release candidate 1? =============================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:37 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Transplant_of_?= =?utf-8?q?rev_544b654d000c=3A_directory_traversal_attack_in?= Message-ID: <3dMmRs6TQnz7LwR@mail.python.org> http://hg.python.org/cpython/rev/691b3df5036b changeset: 87169:691b3df5036b branch: 3.3 user: Georg Brandl date: Mon Nov 11 06:10:23 2013 +0100 summary: Transplant of rev 544b654d000c: directory traversal attack in CGIHttpRequestHandler. files: Lib/http/server.py | 9 ++++----- Lib/test/test_httpservers.py | 12 ++++++++++++ Misc/NEWS | 5 +++++ 3 files changed, 21 insertions(+), 5 deletions(-) diff --git a/Lib/http/server.py b/Lib/http/server.py --- a/Lib/http/server.py +++ b/Lib/http/server.py @@ -987,18 +987,17 @@ def run_cgi(self): """Execute a CGI script.""" - path = self.path dir, rest = self.cgi_info - i = path.find('/', len(dir) + 1) + i = rest.find('/') while i >= 0: - nextdir = path[:i] - nextrest = path[i+1:] + nextdir = rest[:i] + nextrest = rest[i+1:] scriptdir = self.translate_path(nextdir) if os.path.isdir(scriptdir): dir, rest = nextdir, nextrest - i = path.find('/', len(dir) + 1) + i = rest.find('/') else: break diff --git a/Lib/test/test_httpservers.py b/Lib/test/test_httpservers.py --- a/Lib/test/test_httpservers.py +++ b/Lib/test/test_httpservers.py @@ -325,6 +325,7 @@ self.parent_dir = tempfile.mkdtemp() self.cgi_dir = os.path.join(self.parent_dir, 'cgi-bin') os.mkdir(self.cgi_dir) + self.nocgi_path = None self.file1_path = None self.file2_path = None @@ -345,6 +346,11 @@ self.tearDown() self.skipTest("Python executable path is not encodable to utf-8") + self.nocgi_path = os.path.join(self.parent_dir, 'nocgi.py') + with open(self.nocgi_path, 'w') as fp: + fp.write(cgi_file1 % self.pythonexe) + os.chmod(self.nocgi_path, 0o777) + self.file1_path = os.path.join(self.cgi_dir, 'file1.py') with open(self.file1_path, 'w', encoding='utf-8') as file1: file1.write(cgi_file1 % self.pythonexe) @@ -362,6 +368,8 @@ os.chdir(self.cwd) if self.pythonexe != sys.executable: os.remove(self.pythonexe) + if self.nocgi_path: + os.remove(self.nocgi_path) if self.file1_path: os.remove(self.file1_path) if self.file2_path: @@ -418,6 +426,10 @@ self.assertEqual((b'Hello World' + self.linesep, 'text/html', 200), (res.read(), res.getheader('Content-type'), res.status)) + def test_issue19435(self): + res = self.request('///////////nocgi.py/../cgi-bin/nothere.sh') + self.assertEqual(res.status, 404) + def test_post(self): params = urllib.parse.urlencode( {'spam' : 1, 'eggs' : 'python', 'bacon' : 123456}) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS 
@@ -7,6 +7,11 @@ *Release date: XX-Nov-2013* +Library +------- + +- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. + Tests ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:39 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MjI3?= =?utf-8?q?_/_Issue_=2318747=3A_Remove_pthread=5Fatfork=28=29_handler_to_r?= =?utf-8?q?emove_OpenSSL?= Message-ID: <3dMmRv15m2z7Lx2@mail.python.org> http://hg.python.org/cpython/rev/4e6aa98bb11c changeset: 87170:4e6aa98bb11c branch: 3.3 user: Christian Heimes date: Tue Oct 29 20:50:01 2013 +0100 summary: Issue #19227 / Issue #18747: Remove pthread_atfork() handler to remove OpenSSL re-seeding It is causing trouble like e.g. hanging processes. files: Modules/_ssl.c | 67 -------------------------------------- 1 files changed, 0 insertions(+), 67 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -19,9 +19,6 @@ #ifdef WITH_THREAD #include "pythread.h" -#ifdef HAVE_PTHREAD_ATFORK -# include -#endif #define PySSL_BEGIN_ALLOW_THREADS_S(save) \ do { if (_ssl_locks_count>0) { (save) = PyEval_SaveThread(); } } while (0) @@ -2584,65 +2581,6 @@ Returns number of bytes read. Raises SSLError if connection to EGD\n\ fails or if it does not provide enough data to seed PRNG."); -/* Seed OpenSSL's PRNG at fork(), http://bugs.python.org/issue18747 - * - * The prepare handler seeds the PRNG from pseudo-random data like pid, the - * current time (miliseconds or seconds) and an uninitialized array. - * The array contains stack variables that are impossible to predict - * on most systems, e.g. function return address (subject to ASLR), the - * stack protection canary and automatic variables. - * The code is inspired by Apache's ssl_rand_seed() function. - * - * Note: - * The code uses pthread_atfork() until Python has a proper atfork API. The - * handlers are not removed from the child process. A prepare handler is used - * instead of a child handler because fork() is supposed to be async-signal - * safe but the handler calls unsafe functions. A parent handler has caused - * other problems, see issue #19227. - */ - -#if defined(HAVE_PTHREAD_ATFORK) && defined(WITH_THREAD) -#define PYSSL_RAND_ATFORK 1 - -static void -PySSL_RAND_atfork_prepare(void) -{ - struct { - char stack[128]; /* uninitialized (!) stack data, 128 is an - arbitrary number. 
*/ - pid_t pid; /* current pid */ - _PyTime_timeval tp; /* current time */ - } seed; - -#ifdef WITH_VALGRIND - VALGRIND_MAKE_MEM_DEFINED(seed.stack, sizeof(seed.stack)); -#endif - seed.pid = getpid(); - _PyTime_gettimeofday(&(seed.tp)); - RAND_add((unsigned char *)&seed, sizeof(seed), 0.0); -} - -static int -PySSL_RAND_atfork(void) -{ - static int registered = 0; - int retval; - - if (registered) - return 0; - - retval = pthread_atfork(PySSL_RAND_atfork_prepare, /* prepare */ - NULL, /* parent */ - NULL); /* child */ - if (retval != 0) { - PyErr_SetFromErrno(PyExc_OSError); - return -1; - } - registered = 1; - return 0; -} -#endif /* HAVE_PTHREAD_ATFORK */ - #endif /* HAVE_OPENSSL_RAND */ @@ -3022,10 +2960,5 @@ if (r == NULL || PyModule_AddObject(m, "_OPENSSL_API_VERSION", r)) return NULL; -#ifdef PYSSL_RAND_ATFORK - if (PySSL_RAND_atfork() == -1) - return NULL; -#endif - return m; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:40 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogQnVtcCB0byAzLjMu?= =?utf-8?q?3rc2=2E?= Message-ID: <3dMmRw2vwnz7Lx2@mail.python.org> http://hg.python.org/cpython/rev/d32442c0e60d changeset: 87171:d32442c0e60d branch: 3.3 tag: v3.3.3rc2 user: Georg Brandl date: Mon Nov 11 06:13:54 2013 +0100 summary: Bump to 3.3.3rc2. files: Include/patchlevel.h | 4 ++-- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 11 +++++++---- Misc/RPM/python-3.3.spec | 2 +- README | 2 +- 6 files changed, 13 insertions(+), 10 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -20,10 +20,10 @@ #define PY_MINOR_VERSION 3 #define PY_MICRO_VERSION 3 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 1 +#define PY_RELEASE_SERIAL 2 /* Version as a string */ -#define PY_VERSION "3.3.3rc1" +#define PY_VERSION "3.3.3rc2" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -13,5 +13,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.3.3rc1" +__version__ = "3.3.3rc2" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "3.3.3rc1" +IDLE_VERSION = "3.3.3rc2" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,14 +2,17 @@ Python News +++++++++++ -What's New in Python 3.3.3? -=========================== - -*Release date: XX-Nov-2013* +What's New in Python 3.3.3 release candidate 2? +=============================================== + +*Release date: 11-Nov-2013* Library ------- +- Issue #19227: Any re-seeding of the OpenSSL RNG on fork has been removed; + this should be handled by OpenSSL itself or by the application. + - Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. 
Tests diff --git a/Misc/RPM/python-3.3.spec b/Misc/RPM/python-3.3.spec --- a/Misc/RPM/python-3.3.spec +++ b/Misc/RPM/python-3.3.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.3.3rc1 +%define version 3.3.3rc2 %define libvers 3.3 #--end constants-- %define release 1pydotorg diff --git a/README b/README --- a/README +++ b/README @@ -1,4 +1,4 @@ -This is Python version 3.3.3 release candidate 1 +This is Python version 3.3.3 release candidate 2 ================================================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:41 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Added_tag_v3?= =?utf-8?q?=2E3=2E3rc2_for_changeset_d32442c0e60d?= Message-ID: <3dMmRx50CGz7LtH@mail.python.org> http://hg.python.org/cpython/rev/6e81c3a16d1c changeset: 87172:6e81c3a16d1c branch: 3.3 user: Georg Brandl date: Mon Nov 11 06:16:15 2013 +0100 summary: Added tag v3.3.3rc2 for changeset d32442c0e60d files: .hgtags | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -116,3 +116,4 @@ d9893d13c6289aa03d33559ec67f97dcbf5c9e3c v3.3.1 d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 +d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:43 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogQnVtcCB0byAzLjMu?= =?utf-8?q?3_final=2E?= Message-ID: <3dMmRz1Bq0zN4q@mail.python.org> http://hg.python.org/cpython/rev/c3896275c0f6 changeset: 87173:c3896275c0f6 branch: 3.3 tag: v3.3.3 user: Georg Brandl date: Sun Nov 17 07:58:22 2013 +0100 summary: Bump to 3.3.3 final. files: Include/patchlevel.h | 6 +++--- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 8 ++++++++ Misc/RPM/python-3.3.spec | 2 +- README | 4 ++-- 6 files changed, 16 insertions(+), 8 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -19,11 +19,11 @@ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 3 #define PY_MICRO_VERSION 3 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 2 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.3.3rc2" +#define PY_VERSION "3.3.3" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -13,5 +13,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.3.3rc2" +__version__ = "3.3.3" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "3.3.3rc2" +IDLE_VERSION = "3.3.3" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,6 +2,14 @@ Python News +++++++++++ +What's New in Python 3.3.3? +=========================== + +*Release date: 17-Nov-2013* + +No changes from release candidate 2. 
+ + What's New in Python 3.3.3 release candidate 2? =============================================== diff --git a/Misc/RPM/python-3.3.spec b/Misc/RPM/python-3.3.spec --- a/Misc/RPM/python-3.3.spec +++ b/Misc/RPM/python-3.3.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.3.3rc2 +%define version 3.3.3 %define libvers 3.3 #--end constants-- %define release 1pydotorg diff --git a/README b/README --- a/README +++ b/README @@ -1,5 +1,5 @@ -This is Python version 3.3.3 release candidate 2 -================================================ +This is Python version 3.3.3 +============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013 Python Software Foundation. All rights reserved. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:44 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Added_tag_v3?= =?utf-8?q?=2E3=2E3_for_changeset_c3896275c0f6?= Message-ID: <3dMmS02qRGz7LtH@mail.python.org> http://hg.python.org/cpython/rev/b34363761e4c changeset: 87174:b34363761e4c branch: 3.3 user: Georg Brandl date: Sun Nov 17 07:59:06 2013 +0100 summary: Added tag v3.3.3 for changeset c3896275c0f6 files: .hgtags | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -117,3 +117,4 @@ d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 +c3896275c0f61b2510a6c7e6c458a750359a91b8 v3.3.3 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:45 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_merge_with_3=2E3=2E3_release_clone?= Message-ID: <3dMmS15vNxz7LjN@mail.python.org> http://hg.python.org/cpython/rev/7b697f764018 changeset: 87175:7b697f764018 branch: 3.3 parent: 87164:7e3f8026ed30 parent: 87174:b34363761e4c user: Georg Brandl date: Sun Nov 17 09:17:18 2013 +0100 summary: merge with 3.3.3 release clone files: .hgtags | 2 + Include/patchlevel.h | 6 ++-- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 40 ++++++++++++++++++++++---- Misc/RPM/python-3.3.spec | 2 +- README | 4 +- 7 files changed, 44 insertions(+), 14 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -116,3 +116,5 @@ d9893d13c6289aa03d33559ec67f97dcbf5c9e3c v3.3.1 d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 +d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 +c3896275c0f61b2510a6c7e6c458a750359a91b8 v3.3.3 diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -19,11 +19,11 @@ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 3 #define PY_MICRO_VERSION 3 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 1 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.3.3rc1" +#define PY_VERSION "3.3.3" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. 
diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -13,5 +13,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.3.3rc1" +__version__ = "3.3.3" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "3.3.3rc1" +IDLE_VERSION = "3.3.3" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,10 +2,10 @@ Python News +++++++++++ -What's New in Python 3.3.4? -=========================== - -*Not yet released, see sections below for changes released in 3.3.2* +What's New in Python 3.3.4 release candidate 1? +=============================================== + +*Not yet released, see sections below for changes released in 3.3.3* Core and Builtins ----------------- @@ -42,8 +42,6 @@ - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. - Tests ----- @@ -58,6 +56,36 @@ Build ----- + +What's New in Python 3.3.3? +=========================== + +*Release date: 17-Nov-2013* + +No changes from release candidate 2. + + +What's New in Python 3.3.3 release candidate 2? +=============================================== + +*Release date: 11-Nov-2013* + +Library +------- + +- Issue #19227: Any re-seeding of the OpenSSL RNG on fork has been removed; + this should be handled by OpenSSL itself or by the application. + +- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. + +Tests +----- + +- Issue #18964: Fix test_tcl when run with Tcl/Tk versions < 8.5. + +Build +----- + - Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. Some third-party projects, such as Matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in diff --git a/Misc/RPM/python-3.3.spec b/Misc/RPM/python-3.3.spec --- a/Misc/RPM/python-3.3.spec +++ b/Misc/RPM/python-3.3.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.3.3rc1 +%define version 3.3.3 %define libvers 3.3 #--end constants-- %define release 1pydotorg diff --git a/README b/README --- a/README +++ b/README @@ -1,5 +1,5 @@ -This is Python version 3.3.3 release candidate 1 -================================================ +This is Python version 3.3.3 +============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013 Python Software Foundation. All rights reserved. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:47 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Post-release_b?= =?utf-8?q?ump=2E?= Message-ID: <3dMmS30Pw4z7LwR@mail.python.org> http://hg.python.org/cpython/rev/895c80ce4e10 changeset: 87176:895c80ce4e10 branch: 3.3 user: Georg Brandl date: Sun Nov 17 09:17:40 2013 +0100 summary: Post-release bump. 
files: Include/patchlevel.h | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -23,7 +23,7 @@ #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.3.3" +#define PY_VERSION "3.3.3+" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 09:19:48 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 17 Nov 2013 09:19:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dMmS42Ph7z7Lx2@mail.python.org> http://hg.python.org/cpython/rev/d6ad412dee40 changeset: 87177:d6ad412dee40 parent: 87166:c2f1bb56760d parent: 87176:895c80ce4e10 user: Georg Brandl date: Sun Nov 17 09:19:11 2013 +0100 summary: merge with 3.3 files: .hgtags | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -116,6 +116,8 @@ d9893d13c6289aa03d33559ec67f97dcbf5c9e3c v3.3.1 d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 +d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 +c3896275c0f61b2510a6c7e6c458a750359a91b8 v3.3.3 46535f65e7f3bcdcf176f36d34bc1fed719ffd2b v3.4.0a1 9265a2168e2cb2a84785d8717792acc661e6b692 v3.4.0a2 dd9cdf90a5073510877e9dd5112f8e6cf20d5e89 v3.4.0a3 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:21 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjAx?= =?utf-8?q?=3A_Use_specific_asserts_in_sqlite3_tests=2E?= Message-ID: <3dMqz15nmfz7Ltv@mail.python.org> http://hg.python.org/cpython/rev/981d161a52be changeset: 87178:981d161a52be branch: 3.3 parent: 87160:5a0236f3f972 user: Serhiy Storchaka date: Sun Nov 17 00:39:12 2013 +0200 summary: Issue #19601: Use specific asserts in sqlite3 tests. 
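These sqlite3 changes, like the distutils and tkinter ones elsewhere in this digest, follow the same pattern: generic assertTrue checks are replaced with the specific unittest methods (assertIn, assertIsInstance, assertGreater, assertIsNone, and friends). A small self-contained sketch of why that is worthwhile (hypothetical test case, not taken from the patch):

    import unittest

    class Demo(unittest.TestCase):
        def test_generic(self):
            # On failure this only reports "False is not true".
            self.assertTrue('baz' in ['foo', 'bar'])

        def test_specific(self):
            # On failure this reports "'baz' not found in ['foo', 'bar']",
            # which makes the broken test far easier to diagnose.
            self.assertIn('baz', ['foo', 'bar'])

    if __name__ == '__main__':
        unittest.main()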
files: Lib/sqlite3/test/factory.py | 48 +++++++++--------------- Lib/sqlite3/test/hooks.py | 2 +- 2 files changed, 20 insertions(+), 30 deletions(-) diff --git a/Lib/sqlite3/test/factory.py b/Lib/sqlite3/test/factory.py --- a/Lib/sqlite3/test/factory.py +++ b/Lib/sqlite3/test/factory.py @@ -47,9 +47,7 @@ self.con.close() def CheckIsInstance(self): - self.assertTrue(isinstance(self.con, - MyConnection), - "connection is not instance of MyConnection") + self.assertIsInstance(self.con, MyConnection) class CursorFactoryTests(unittest.TestCase): def setUp(self): @@ -60,9 +58,7 @@ def CheckIsInstance(self): cur = self.con.cursor(factory=MyCursor) - self.assertTrue(isinstance(cur, - MyCursor), - "cursor is not instance of MyCursor") + self.assertIsInstance(cur, MyCursor) class RowFactoryTestsBackwardsCompat(unittest.TestCase): def setUp(self): @@ -72,9 +68,7 @@ cur = self.con.cursor(factory=MyCursor) cur.execute("select 4+5 as foo") row = cur.fetchone() - self.assertTrue(isinstance(row, - dict), - "row is not instance of dict") + self.assertIsInstance(row, dict) cur.close() def tearDown(self): @@ -87,28 +81,24 @@ def CheckCustomFactory(self): self.con.row_factory = lambda cur, row: list(row) row = self.con.execute("select 1, 2").fetchone() - self.assertTrue(isinstance(row, - list), - "row is not instance of list") + self.assertIsInstance(row, list) def CheckSqliteRowIndex(self): self.con.row_factory = sqlite.Row row = self.con.execute("select 1 as a, 2 as b").fetchone() - self.assertTrue(isinstance(row, - sqlite.Row), - "row is not instance of sqlite.Row") + self.assertIsInstance(row, sqlite.Row) col1, col2 = row["a"], row["b"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'a'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'a'") + self.assertEqual(col1, 1, "by name: wrong result for column 'a'") + self.assertEqual(col2, 2, "by name: wrong result for column 'a'") col1, col2 = row["A"], row["B"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'A'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'B'") + self.assertEqual(col1, 1, "by name: wrong result for column 'A'") + self.assertEqual(col2, 2, "by name: wrong result for column 'B'") col1, col2 = row[0], row[1] - self.assertTrue(col1 == 1, "by index: wrong result for column 0") - self.assertTrue(col2 == 2, "by index: wrong result for column 1") + self.assertEqual(col1, 1, "by index: wrong result for column 0") + self.assertEqual(col2, 2, "by index: wrong result for column 1") def CheckSqliteRowIter(self): """Checks if the row object is iterable""" @@ -138,8 +128,8 @@ row_2 = self.con.execute("select 1 as a, 2 as b").fetchone() row_3 = self.con.execute("select 1 as a, 3 as b").fetchone() - self.assertTrue(row_1 == row_1) - self.assertTrue(row_1 == row_2) + self.assertEqual(row_1, row_1) + self.assertEqual(row_1, row_2) self.assertTrue(row_2 != row_3) self.assertFalse(row_1 != row_1) @@ -161,20 +151,20 @@ def CheckUnicode(self): austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == str, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), str, "type of row[0] must be unicode") def CheckString(self): self.con.text_factory = bytes austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == bytes, "type of row[0] must be bytes") - self.assertTrue(row[0] == austria.encode("utf-8"), "column must equal original data in UTF-8") + self.assertEqual(type(row[0]), 
bytes, "type of row[0] must be bytes") + self.assertEqual(row[0], austria.encode("utf-8"), "column must equal original data in UTF-8") def CheckCustom(self): self.con.text_factory = lambda x: str(x, "utf-8", "ignore") austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == str, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), str, "type of row[0] must be unicode") self.assertTrue(row[0].endswith("reich"), "column must contain original data") def CheckOptimizedUnicode(self): @@ -185,8 +175,8 @@ germany = "Deutchland" a_row = self.con.execute("select ?", (austria,)).fetchone() d_row = self.con.execute("select ?", (germany,)).fetchone() - self.assertTrue(type(a_row[0]) == str, "type of non-ASCII row must be str") - self.assertTrue(type(d_row[0]) == str, "type of ASCII-only row must be str") + self.assertEqual(type(a_row[0]), str, "type of non-ASCII row must be str") + self.assertEqual(type(d_row[0]), str, "type of ASCII-only row must be str") def tearDown(self): self.con.close() diff --git a/Lib/sqlite3/test/hooks.py b/Lib/sqlite3/test/hooks.py --- a/Lib/sqlite3/test/hooks.py +++ b/Lib/sqlite3/test/hooks.py @@ -162,7 +162,7 @@ create table bar (a, b) """) second_count = len(progress_calls) - self.assertTrue(first_count > second_count) + self.assertGreater(first_count, second_count) def CheckCancelOperation(self): """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:23 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjAx?= =?utf-8?q?=3A_Use_specific_asserts_in_sqlite3_tests=2E?= Message-ID: <3dMqz31g76z7Lx5@mail.python.org> http://hg.python.org/cpython/rev/d004ccdd6a57 changeset: 87179:d004ccdd6a57 branch: 2.7 parent: 87162:097389413dd3 user: Serhiy Storchaka date: Sun Nov 17 00:39:43 2013 +0200 summary: Issue #19601: Use specific asserts in sqlite3 tests. 
files: Lib/sqlite3/test/factory.py | 48 +++++++++--------------- Lib/sqlite3/test/hooks.py | 2 +- 2 files changed, 20 insertions(+), 30 deletions(-) diff --git a/Lib/sqlite3/test/factory.py b/Lib/sqlite3/test/factory.py --- a/Lib/sqlite3/test/factory.py +++ b/Lib/sqlite3/test/factory.py @@ -47,9 +47,7 @@ self.con.close() def CheckIsInstance(self): - self.assertTrue(isinstance(self.con, - MyConnection), - "connection is not instance of MyConnection") + self.assertIsInstance(self.con, MyConnection) class CursorFactoryTests(unittest.TestCase): def setUp(self): @@ -60,9 +58,7 @@ def CheckIsInstance(self): cur = self.con.cursor(factory=MyCursor) - self.assertTrue(isinstance(cur, - MyCursor), - "cursor is not instance of MyCursor") + self.assertIsInstance(cur, MyCursor) class RowFactoryTestsBackwardsCompat(unittest.TestCase): def setUp(self): @@ -72,9 +68,7 @@ cur = self.con.cursor(factory=MyCursor) cur.execute("select 4+5 as foo") row = cur.fetchone() - self.assertTrue(isinstance(row, - dict), - "row is not instance of dict") + self.assertIsInstance(row, dict) cur.close() def tearDown(self): @@ -87,28 +81,24 @@ def CheckCustomFactory(self): self.con.row_factory = lambda cur, row: list(row) row = self.con.execute("select 1, 2").fetchone() - self.assertTrue(isinstance(row, - list), - "row is not instance of list") + self.assertIsInstance(row, list) def CheckSqliteRowIndex(self): self.con.row_factory = sqlite.Row row = self.con.execute("select 1 as a, 2 as b").fetchone() - self.assertTrue(isinstance(row, - sqlite.Row), - "row is not instance of sqlite.Row") + self.assertIsInstance(row, sqlite.Row) col1, col2 = row["a"], row["b"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'a'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'a'") + self.assertEqual(col1, 1, "by name: wrong result for column 'a'") + self.assertEqual(col2, 2, "by name: wrong result for column 'a'") col1, col2 = row["A"], row["B"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'A'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'B'") + self.assertEqual(col1, 1, "by name: wrong result for column 'A'") + self.assertEqual(col2, 2, "by name: wrong result for column 'B'") col1, col2 = row[0], row[1] - self.assertTrue(col1 == 1, "by index: wrong result for column 0") - self.assertTrue(col2 == 2, "by index: wrong result for column 1") + self.assertEqual(col1, 1, "by index: wrong result for column 0") + self.assertEqual(col2, 2, "by index: wrong result for column 1") def CheckSqliteRowIter(self): """Checks if the row object is iterable""" @@ -138,8 +128,8 @@ row_2 = self.con.execute("select 1 as a, 2 as b").fetchone() row_3 = self.con.execute("select 1 as a, 3 as b").fetchone() - self.assertTrue(row_1 == row_1) - self.assertTrue(row_1 == row_2) + self.assertEqual(row_1, row_1) + self.assertEqual(row_1, row_2) self.assertTrue(row_2 != row_3) self.assertFalse(row_1 != row_1) @@ -161,20 +151,20 @@ def CheckUnicode(self): austria = unicode("?sterreich", "latin1") row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == unicode, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), unicode, "type of row[0] must be unicode") def CheckString(self): self.con.text_factory = str austria = unicode("?sterreich", "latin1") row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == str, "type of row[0] must be str") - self.assertTrue(row[0] == austria.encode("utf-8"), "column must equal original data in 
UTF-8") + self.assertEqual(type(row[0]), str, "type of row[0] must be str") + self.assertEqual(row[0], austria.encode("utf-8"), "column must equal original data in UTF-8") def CheckCustom(self): self.con.text_factory = lambda x: unicode(x, "utf-8", "ignore") austria = unicode("?sterreich", "latin1") row = self.con.execute("select ?", (austria.encode("latin1"),)).fetchone() - self.assertTrue(type(row[0]) == unicode, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), unicode, "type of row[0] must be unicode") self.assertTrue(row[0].endswith(u"reich"), "column must contain original data") def CheckOptimizedUnicode(self): @@ -183,8 +173,8 @@ germany = unicode("Deutchland") a_row = self.con.execute("select ?", (austria,)).fetchone() d_row = self.con.execute("select ?", (germany,)).fetchone() - self.assertTrue(type(a_row[0]) == unicode, "type of non-ASCII row must be unicode") - self.assertTrue(type(d_row[0]) == str, "type of ASCII-only row must be str") + self.assertEqual(type(a_row[0]), unicode, "type of non-ASCII row must be unicode") + self.assertEqual(type(d_row[0]), str, "type of ASCII-only row must be str") def tearDown(self): self.con.close() diff --git a/Lib/sqlite3/test/hooks.py b/Lib/sqlite3/test/hooks.py --- a/Lib/sqlite3/test/hooks.py +++ b/Lib/sqlite3/test/hooks.py @@ -162,7 +162,7 @@ create table bar (a, b) """) second_count = len(progress_calls) - self.assertTrue(first_count > second_count) + self.assertGreater(first_count, second_count) def CheckCancelOperation(self): """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:24 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319601=3A_Use_specific_asserts_in_sqlite3_tests?= =?utf-8?q?=2E?= Message-ID: <3dMqz44sDXz7Lx5@mail.python.org> http://hg.python.org/cpython/rev/6a6ecc7149f4 changeset: 87180:6a6ecc7149f4 parent: 87161:6a3501aab559 parent: 87178:981d161a52be user: Serhiy Storchaka date: Sun Nov 17 00:40:01 2013 +0200 summary: Issue #19601: Use specific asserts in sqlite3 tests. 
files: Lib/sqlite3/test/factory.py | 48 +++++++++--------------- Lib/sqlite3/test/hooks.py | 2 +- 2 files changed, 20 insertions(+), 30 deletions(-) diff --git a/Lib/sqlite3/test/factory.py b/Lib/sqlite3/test/factory.py --- a/Lib/sqlite3/test/factory.py +++ b/Lib/sqlite3/test/factory.py @@ -47,9 +47,7 @@ self.con.close() def CheckIsInstance(self): - self.assertTrue(isinstance(self.con, - MyConnection), - "connection is not instance of MyConnection") + self.assertIsInstance(self.con, MyConnection) class CursorFactoryTests(unittest.TestCase): def setUp(self): @@ -60,9 +58,7 @@ def CheckIsInstance(self): cur = self.con.cursor(factory=MyCursor) - self.assertTrue(isinstance(cur, - MyCursor), - "cursor is not instance of MyCursor") + self.assertIsInstance(cur, MyCursor) class RowFactoryTestsBackwardsCompat(unittest.TestCase): def setUp(self): @@ -72,9 +68,7 @@ cur = self.con.cursor(factory=MyCursor) cur.execute("select 4+5 as foo") row = cur.fetchone() - self.assertTrue(isinstance(row, - dict), - "row is not instance of dict") + self.assertIsInstance(row, dict) cur.close() def tearDown(self): @@ -87,28 +81,24 @@ def CheckCustomFactory(self): self.con.row_factory = lambda cur, row: list(row) row = self.con.execute("select 1, 2").fetchone() - self.assertTrue(isinstance(row, - list), - "row is not instance of list") + self.assertIsInstance(row, list) def CheckSqliteRowIndex(self): self.con.row_factory = sqlite.Row row = self.con.execute("select 1 as a, 2 as b").fetchone() - self.assertTrue(isinstance(row, - sqlite.Row), - "row is not instance of sqlite.Row") + self.assertIsInstance(row, sqlite.Row) col1, col2 = row["a"], row["b"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'a'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'a'") + self.assertEqual(col1, 1, "by name: wrong result for column 'a'") + self.assertEqual(col2, 2, "by name: wrong result for column 'a'") col1, col2 = row["A"], row["B"] - self.assertTrue(col1 == 1, "by name: wrong result for column 'A'") - self.assertTrue(col2 == 2, "by name: wrong result for column 'B'") + self.assertEqual(col1, 1, "by name: wrong result for column 'A'") + self.assertEqual(col2, 2, "by name: wrong result for column 'B'") col1, col2 = row[0], row[1] - self.assertTrue(col1 == 1, "by index: wrong result for column 0") - self.assertTrue(col2 == 2, "by index: wrong result for column 1") + self.assertEqual(col1, 1, "by index: wrong result for column 0") + self.assertEqual(col2, 2, "by index: wrong result for column 1") def CheckSqliteRowIter(self): """Checks if the row object is iterable""" @@ -138,8 +128,8 @@ row_2 = self.con.execute("select 1 as a, 2 as b").fetchone() row_3 = self.con.execute("select 1 as a, 3 as b").fetchone() - self.assertTrue(row_1 == row_1) - self.assertTrue(row_1 == row_2) + self.assertEqual(row_1, row_1) + self.assertEqual(row_1, row_2) self.assertTrue(row_2 != row_3) self.assertFalse(row_1 != row_1) @@ -161,20 +151,20 @@ def CheckUnicode(self): austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == str, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), str, "type of row[0] must be unicode") def CheckString(self): self.con.text_factory = bytes austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == bytes, "type of row[0] must be bytes") - self.assertTrue(row[0] == austria.encode("utf-8"), "column must equal original data in UTF-8") + self.assertEqual(type(row[0]), 
bytes, "type of row[0] must be bytes") + self.assertEqual(row[0], austria.encode("utf-8"), "column must equal original data in UTF-8") def CheckCustom(self): self.con.text_factory = lambda x: str(x, "utf-8", "ignore") austria = "?sterreich" row = self.con.execute("select ?", (austria,)).fetchone() - self.assertTrue(type(row[0]) == str, "type of row[0] must be unicode") + self.assertEqual(type(row[0]), str, "type of row[0] must be unicode") self.assertTrue(row[0].endswith("reich"), "column must contain original data") def CheckOptimizedUnicode(self): @@ -185,8 +175,8 @@ germany = "Deutchland" a_row = self.con.execute("select ?", (austria,)).fetchone() d_row = self.con.execute("select ?", (germany,)).fetchone() - self.assertTrue(type(a_row[0]) == str, "type of non-ASCII row must be str") - self.assertTrue(type(d_row[0]) == str, "type of ASCII-only row must be str") + self.assertEqual(type(a_row[0]), str, "type of non-ASCII row must be str") + self.assertEqual(type(d_row[0]), str, "type of ASCII-only row must be str") def tearDown(self): self.con.close() diff --git a/Lib/sqlite3/test/hooks.py b/Lib/sqlite3/test/hooks.py --- a/Lib/sqlite3/test/hooks.py +++ b/Lib/sqlite3/test/hooks.py @@ -162,7 +162,7 @@ create table bar (a, b) """) second_count = len(progress_calls) - self.assertTrue(first_count > second_count) + self.assertGreater(first_count, second_count) def CheckCancelOperation(self): """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:26 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjAy?= =?utf-8?q?=3A_Use_specific_asserts_in_tkinter_tests=2E?= Message-ID: <3dMqz60qWjz7Lkc@mail.python.org> http://hg.python.org/cpython/rev/9d1309e31491 changeset: 87181:9d1309e31491 branch: 3.3 parent: 87178:981d161a52be user: Serhiy Storchaka date: Sun Nov 17 00:42:25 2013 +0200 summary: Issue #19602: Use specific asserts in tkinter tests. files: Lib/tkinter/test/test_ttk/test_extensions.py | 10 +- Lib/tkinter/test/test_ttk/test_style.py | 6 +- Lib/tkinter/test/test_ttk/test_widgets.py | 36 +++++----- 3 files changed, 26 insertions(+), 26 deletions(-) diff --git a/Lib/tkinter/test/test_ttk/test_extensions.py b/Lib/tkinter/test/test_ttk/test_extensions.py --- a/Lib/tkinter/test/test_ttk/test_extensions.py +++ b/Lib/tkinter/test/test_ttk/test_extensions.py @@ -45,7 +45,7 @@ # it tries calling instance attributes not yet defined. 
ttk.LabeledScale(variable=myvar) if hasattr(sys, 'last_type'): - self.assertFalse(sys.last_type == tkinter.TclError) + self.assertNotEqual(sys.last_type, tkinter.TclError) def test_initialization(self): @@ -120,14 +120,14 @@ # at the same time this shouldn't affect test outcome lscale.update() curr_xcoord = lscale.scale.coords()[0] - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) # the label widget should have been repositioned too linfo_2 = lscale.label.place_info() self.assertEqual(lscale.label['text'], 0) self.assertEqual(curr_xcoord, int(linfo_2['x'])) # change the range back lscale.scale.configure(from_=0, to=10) - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) self.assertEqual(prev_xcoord, int(linfo_1['x'])) lscale.destroy() @@ -146,7 +146,7 @@ # at the same time this shouldn't affect test outcome x.update() self.assertEqual(x.label['text'], newval) - self.assertTrue(x.scale.coords()[0] > curr_xcoord) + self.assertGreater(x.scale.coords()[0], curr_xcoord) self.assertEqual(x.scale.coords()[0], int(x.label.place_info()['x'])) @@ -238,7 +238,7 @@ if last == curr: # no more menu entries break - self.assertFalse(curr == default) + self.assertNotEqual(curr, default) i += 1 self.assertEqual(i, len(items)) diff --git a/Lib/tkinter/test/test_ttk/test_style.py b/Lib/tkinter/test/test_ttk/test_style.py --- a/Lib/tkinter/test/test_ttk/test_style.py +++ b/Lib/tkinter/test/test_ttk/test_style.py @@ -18,7 +18,7 @@ style.configure('TButton', background='yellow') self.assertEqual(style.configure('TButton', 'background'), 'yellow') - self.assertTrue(isinstance(style.configure('TButton'), dict)) + self.assertIsInstance(style.configure('TButton'), dict) def test_map(self): @@ -26,7 +26,7 @@ style.map('TButton', background=[('active', 'background', 'blue')]) self.assertEqual(style.map('TButton', 'background'), [('active', 'background', 'blue')]) - self.assertTrue(isinstance(style.map('TButton'), dict)) + self.assertIsInstance(style.map('TButton'), dict) def test_lookup(self): @@ -57,7 +57,7 @@ self.assertEqual(style.layout('Treeview'), tv_style) # should return a list - self.assertTrue(isinstance(style.layout('TButton'), list)) + self.assertIsInstance(style.layout('TButton'), list) # correct layout, but "option" doesn't exist as option self.assertRaises(tkinter.TclError, style.layout, 'Treeview', diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -273,7 +273,7 @@ cbtn['command'] = '' res = cbtn.invoke() self.assertFalse(str(res)) - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) @@ -454,7 +454,7 @@ def test_bbox(self): self.assertEqual(len(self.entry.bbox(0)), 4) for item in self.entry.bbox(0): - self.assertTrue(isinstance(item, int)) + self.assertIsInstance(item, int) self.assertRaises(tkinter.TclError, self.entry.bbox, 'noindex') self.assertRaises(tkinter.TclError, self.entry.bbox, None) @@ -652,7 +652,7 @@ child = ttk.Label() self.paned.add(child) - self.assertTrue(isinstance(self.paned.pane(0), dict)) + self.assertIsInstance(self.paned.pane(0), dict) self.assertEqual(self.paned.pane(0, weight=None), 0) # newer form for querying a single option self.assertEqual(self.paned.pane(0, 'weight'), 0) @@ -679,8 +679,8 @@ curr_pos = self.paned.sashpos(0) self.paned.sashpos(0, 
1000) - self.assertTrue(curr_pos != self.paned.sashpos(0)) - self.assertTrue(isinstance(self.paned.sashpos(0), int)) + self.assertNotEqual(curr_pos, self.paned.sashpos(0)) + self.assertIsInstance(self.paned.sashpos(0), int) @add_standard_options(StandardTtkOptionsTests) @@ -720,7 +720,7 @@ cbtn2['command'] = '' res = cbtn2.invoke() self.assertEqual(str(res), '') - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn2['value'], myvar.get()) self.assertEqual(myvar.get(), cbtn.tk.globalgetvar(cbtn['variable'])) @@ -982,7 +982,7 @@ self.nb.add(self.child2) self.assertEqual(self.nb.tabs(), tabs) self.assertEqual(self.nb.index(self.child2), child2_index) - self.assertTrue(str(self.child2) == self.nb.tabs()[child2_index]) + self.assertEqual(str(self.child2), self.nb.tabs()[child2_index]) # but the tab next to it (not hidden) is the one selected now self.assertEqual(self.nb.index('current'), curr + 1) @@ -995,19 +995,19 @@ tabs = self.nb.tabs() child1_index = self.nb.index(self.child1) self.nb.forget(self.child1) - self.assertFalse(str(self.child1) in self.nb.tabs()) + self.assertNotIn(str(self.child1), self.nb.tabs()) self.assertEqual(len(tabs) - 1, len(self.nb.tabs())) self.nb.add(self.child1) self.assertEqual(self.nb.index(self.child1), 1) - self.assertFalse(child1_index == self.nb.index(self.child1)) + self.assertNotEqual(child1_index, self.nb.index(self.child1)) def test_index(self): self.assertRaises(tkinter.TclError, self.nb.index, -1) self.assertRaises(tkinter.TclError, self.nb.index, None) - self.assertTrue(isinstance(self.nb.index('end'), int)) + self.assertIsInstance(self.nb.index('end'), int) self.assertEqual(self.nb.index(self.child1), 0) self.assertEqual(self.nb.index(self.child2), 1) self.assertEqual(self.nb.index('end'), 2) @@ -1071,7 +1071,7 @@ self.assertRaises(tkinter.TclError, self.nb.tab, 'notab') self.assertRaises(tkinter.TclError, self.nb.tab, None) - self.assertTrue(isinstance(self.nb.tab(self.child1), dict)) + self.assertIsInstance(self.nb.tab(self.child1), dict) self.assertEqual(self.nb.tab(self.child1, text=None), 'a') # newer form for querying a single option self.assertEqual(self.nb.tab(self.child1, 'text'), 'a') @@ -1192,7 +1192,7 @@ bbox = self.tv.bbox(children[0]) self.assertEqual(len(bbox), 4) - self.assertTrue(isinstance(bbox, tuple)) + self.assertIsInstance(bbox, tuple) for item in bbox: if not isinstance(item, int): self.fail("Invalid bounding box: %s" % bbox) @@ -1215,7 +1215,7 @@ self.assertEqual(self.tv.get_children(), ()) item_id = self.tv.insert('', 'end') - self.assertTrue(isinstance(self.tv.get_children(), tuple)) + self.assertIsInstance(self.tv.get_children(), tuple) self.assertEqual(self.tv.get_children()[0], item_id) # add item_id and child3 as children of child2 @@ -1240,9 +1240,9 @@ def test_column(self): # return a dict with all options/values - self.assertTrue(isinstance(self.tv.column('#0'), dict)) + self.assertIsInstance(self.tv.column('#0'), dict) # return a single value of the given option - self.assertTrue(isinstance(self.tv.column('#0', width=None), int)) + self.assertIsInstance(self.tv.column('#0', width=None), int) # set a new value for an option self.tv.column('#0', width=10) # testing new way to get option value @@ -1355,7 +1355,7 @@ def test_heading(self): # check a dict is returned - self.assertTrue(isinstance(self.tv.heading('#0'), dict)) + self.assertIsInstance(self.tv.heading('#0'), dict) # check a value is returned self.tv.heading('#0', text='hi') @@ -1466,7 +1466,7 @@ self.tv.item(item, 
values=list(self.tv.item(item, values=None))) self.assertEqual(self.tv.item(item, values=None), (value, )) - self.assertTrue(isinstance(self.tv.item(item), dict)) + self.assertIsInstance(self.tv.item(item), dict) # erase item values self.tv.item(item, values='') @@ -1567,7 +1567,7 @@ 'blue') self.assertEqual(str(self.tv.tag_configure('test', foreground=None)), 'blue') - self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + self.assertIsInstance(self.tv.tag_configure('test'), dict) @add_standard_options(StandardTtkOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:27 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319602=3A_Use_specific_asserts_in_tkinter_tests?= =?utf-8?q?=2E?= Message-ID: <3dMqz745Rxz7Llx@mail.python.org> http://hg.python.org/cpython/rev/a3ed49cf7c70 changeset: 87182:a3ed49cf7c70 parent: 87180:6a6ecc7149f4 parent: 87181:9d1309e31491 user: Serhiy Storchaka date: Sun Nov 17 00:42:52 2013 +0200 summary: Issue #19602: Use specific asserts in tkinter tests. files: Lib/tkinter/test/test_ttk/test_extensions.py | 10 +- Lib/tkinter/test/test_ttk/test_style.py | 6 +- Lib/tkinter/test/test_ttk/test_widgets.py | 36 +++++----- 3 files changed, 26 insertions(+), 26 deletions(-) diff --git a/Lib/tkinter/test/test_ttk/test_extensions.py b/Lib/tkinter/test/test_ttk/test_extensions.py --- a/Lib/tkinter/test/test_ttk/test_extensions.py +++ b/Lib/tkinter/test/test_ttk/test_extensions.py @@ -45,7 +45,7 @@ # it tries calling instance attributes not yet defined. ttk.LabeledScale(variable=myvar) if hasattr(sys, 'last_type'): - self.assertFalse(sys.last_type == tkinter.TclError) + self.assertNotEqual(sys.last_type, tkinter.TclError) def test_initialization(self): @@ -120,14 +120,14 @@ # at the same time this shouldn't affect test outcome lscale.update() curr_xcoord = lscale.scale.coords()[0] - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) # the label widget should have been repositioned too linfo_2 = lscale.label.place_info() self.assertEqual(lscale.label['text'], 0) self.assertEqual(curr_xcoord, int(linfo_2['x'])) # change the range back lscale.scale.configure(from_=0, to=10) - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) self.assertEqual(prev_xcoord, int(linfo_1['x'])) lscale.destroy() @@ -146,7 +146,7 @@ # at the same time this shouldn't affect test outcome x.update() self.assertEqual(x.label['text'], newval) - self.assertTrue(x.scale.coords()[0] > curr_xcoord) + self.assertGreater(x.scale.coords()[0], curr_xcoord) self.assertEqual(x.scale.coords()[0], int(x.label.place_info()['x'])) @@ -238,7 +238,7 @@ if last == curr: # no more menu entries break - self.assertFalse(curr == default) + self.assertNotEqual(curr, default) i += 1 self.assertEqual(i, len(items)) diff --git a/Lib/tkinter/test/test_ttk/test_style.py b/Lib/tkinter/test/test_ttk/test_style.py --- a/Lib/tkinter/test/test_ttk/test_style.py +++ b/Lib/tkinter/test/test_ttk/test_style.py @@ -18,7 +18,7 @@ style.configure('TButton', background='yellow') self.assertEqual(style.configure('TButton', 'background'), 'yellow') - self.assertTrue(isinstance(style.configure('TButton'), dict)) + self.assertIsInstance(style.configure('TButton'), dict) def test_map(self): @@ -26,7 +26,7 @@ style.map('TButton', background=[('active', 
'background', 'blue')]) self.assertEqual(style.map('TButton', 'background'), [('active', 'background', 'blue')]) - self.assertTrue(isinstance(style.map('TButton'), dict)) + self.assertIsInstance(style.map('TButton'), dict) def test_lookup(self): @@ -57,7 +57,7 @@ self.assertEqual(style.layout('Treeview'), tv_style) # should return a list - self.assertTrue(isinstance(style.layout('TButton'), list)) + self.assertIsInstance(style.layout('TButton'), list) # correct layout, but "option" doesn't exist as option self.assertRaises(tkinter.TclError, style.layout, 'Treeview', diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -273,7 +273,7 @@ cbtn['command'] = '' res = cbtn.invoke() self.assertFalse(str(res)) - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) @@ -454,7 +454,7 @@ def test_bbox(self): self.assertEqual(len(self.entry.bbox(0)), 4) for item in self.entry.bbox(0): - self.assertTrue(isinstance(item, int)) + self.assertIsInstance(item, int) self.assertRaises(tkinter.TclError, self.entry.bbox, 'noindex') self.assertRaises(tkinter.TclError, self.entry.bbox, None) @@ -652,7 +652,7 @@ child = ttk.Label() self.paned.add(child) - self.assertTrue(isinstance(self.paned.pane(0), dict)) + self.assertIsInstance(self.paned.pane(0), dict) self.assertEqual(self.paned.pane(0, weight=None), 0) # newer form for querying a single option self.assertEqual(self.paned.pane(0, 'weight'), 0) @@ -679,8 +679,8 @@ curr_pos = self.paned.sashpos(0) self.paned.sashpos(0, 1000) - self.assertTrue(curr_pos != self.paned.sashpos(0)) - self.assertTrue(isinstance(self.paned.sashpos(0), int)) + self.assertNotEqual(curr_pos, self.paned.sashpos(0)) + self.assertIsInstance(self.paned.sashpos(0), int) @add_standard_options(StandardTtkOptionsTests) @@ -720,7 +720,7 @@ cbtn2['command'] = '' res = cbtn2.invoke() self.assertEqual(str(res), '') - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn2['value'], myvar.get()) self.assertEqual(myvar.get(), cbtn.tk.globalgetvar(cbtn['variable'])) @@ -982,7 +982,7 @@ self.nb.add(self.child2) self.assertEqual(self.nb.tabs(), tabs) self.assertEqual(self.nb.index(self.child2), child2_index) - self.assertTrue(str(self.child2) == self.nb.tabs()[child2_index]) + self.assertEqual(str(self.child2), self.nb.tabs()[child2_index]) # but the tab next to it (not hidden) is the one selected now self.assertEqual(self.nb.index('current'), curr + 1) @@ -995,19 +995,19 @@ tabs = self.nb.tabs() child1_index = self.nb.index(self.child1) self.nb.forget(self.child1) - self.assertFalse(str(self.child1) in self.nb.tabs()) + self.assertNotIn(str(self.child1), self.nb.tabs()) self.assertEqual(len(tabs) - 1, len(self.nb.tabs())) self.nb.add(self.child1) self.assertEqual(self.nb.index(self.child1), 1) - self.assertFalse(child1_index == self.nb.index(self.child1)) + self.assertNotEqual(child1_index, self.nb.index(self.child1)) def test_index(self): self.assertRaises(tkinter.TclError, self.nb.index, -1) self.assertRaises(tkinter.TclError, self.nb.index, None) - self.assertTrue(isinstance(self.nb.index('end'), int)) + self.assertIsInstance(self.nb.index('end'), int) self.assertEqual(self.nb.index(self.child1), 0) self.assertEqual(self.nb.index(self.child2), 1) self.assertEqual(self.nb.index('end'), 2) @@ -1071,7 +1071,7 @@ 
self.assertRaises(tkinter.TclError, self.nb.tab, 'notab') self.assertRaises(tkinter.TclError, self.nb.tab, None) - self.assertTrue(isinstance(self.nb.tab(self.child1), dict)) + self.assertIsInstance(self.nb.tab(self.child1), dict) self.assertEqual(self.nb.tab(self.child1, text=None), 'a') # newer form for querying a single option self.assertEqual(self.nb.tab(self.child1, 'text'), 'a') @@ -1192,7 +1192,7 @@ bbox = self.tv.bbox(children[0]) self.assertEqual(len(bbox), 4) - self.assertTrue(isinstance(bbox, tuple)) + self.assertIsInstance(bbox, tuple) for item in bbox: if not isinstance(item, int): self.fail("Invalid bounding box: %s" % bbox) @@ -1215,7 +1215,7 @@ self.assertEqual(self.tv.get_children(), ()) item_id = self.tv.insert('', 'end') - self.assertTrue(isinstance(self.tv.get_children(), tuple)) + self.assertIsInstance(self.tv.get_children(), tuple) self.assertEqual(self.tv.get_children()[0], item_id) # add item_id and child3 as children of child2 @@ -1240,9 +1240,9 @@ def test_column(self): # return a dict with all options/values - self.assertTrue(isinstance(self.tv.column('#0'), dict)) + self.assertIsInstance(self.tv.column('#0'), dict) # return a single value of the given option - self.assertTrue(isinstance(self.tv.column('#0', width=None), int)) + self.assertIsInstance(self.tv.column('#0', width=None), int) # set a new value for an option self.tv.column('#0', width=10) # testing new way to get option value @@ -1355,7 +1355,7 @@ def test_heading(self): # check a dict is returned - self.assertTrue(isinstance(self.tv.heading('#0'), dict)) + self.assertIsInstance(self.tv.heading('#0'), dict) # check a value is returned self.tv.heading('#0', text='hi') @@ -1466,7 +1466,7 @@ self.tv.item(item, values=list(self.tv.item(item, values=None))) self.assertEqual(self.tv.item(item, values=None), (value, )) - self.assertTrue(isinstance(self.tv.item(item), dict)) + self.assertIsInstance(self.tv.item(item), dict) # erase item values self.tv.item(item, values='') @@ -1567,7 +1567,7 @@ 'blue') self.assertEqual(str(self.tv.tag_configure('test', foreground=None)), 'blue') - self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + self.assertIsInstance(self.tv.tag_configure('test'), dict) @add_standard_options(StandardTtkOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:29 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjAy?= =?utf-8?q?=3A_Use_specific_asserts_in_tkinter_tests=2E?= Message-ID: <3dMqz9074fz7Ltv@mail.python.org> http://hg.python.org/cpython/rev/0aed137d0d49 changeset: 87183:0aed137d0d49 branch: 2.7 parent: 87179:d004ccdd6a57 user: Serhiy Storchaka date: Sun Nov 17 00:43:03 2013 +0200 summary: Issue #19602: Use specific asserts in tkinter tests. files: Lib/lib-tk/test/test_ttk/test_extensions.py | 12 ++- Lib/lib-tk/test/test_ttk/test_style.py | 6 +- Lib/lib-tk/test/test_ttk/test_widgets.py | 36 +++++----- 3 files changed, 29 insertions(+), 25 deletions(-) diff --git a/Lib/lib-tk/test/test_ttk/test_extensions.py b/Lib/lib-tk/test/test_ttk/test_extensions.py --- a/Lib/lib-tk/test/test_ttk/test_extensions.py +++ b/Lib/lib-tk/test/test_ttk/test_extensions.py @@ -45,7 +45,11 @@ # it tries calling instance attributes not yet defined. 
ttk.LabeledScale(variable=myvar) if hasattr(sys, 'last_type'): +<<<<<<< self.assertFalse(sys.last_type == Tkinter.TclError) +======= + self.assertNotEqual(sys.last_type, tkinter.TclError) +>>>>>>> def test_initialization(self): @@ -120,14 +124,14 @@ # at the same time this shouldn't affect test outcome lscale.update() curr_xcoord = lscale.scale.coords()[0] - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) # the label widget should have been repositioned too linfo_2 = lscale.label.place_info() self.assertEqual(lscale.label['text'], 0) self.assertEqual(curr_xcoord, int(linfo_2['x'])) # change the range back lscale.scale.configure(from_=0, to=10) - self.assertTrue(prev_xcoord != curr_xcoord) + self.assertNotEqual(prev_xcoord, curr_xcoord) self.assertEqual(prev_xcoord, int(linfo_1['x'])) lscale.destroy() @@ -146,7 +150,7 @@ # at the same time this shouldn't affect test outcome x.update() self.assertEqual(x.label['text'], newval) - self.assertTrue(x.scale.coords()[0] > curr_xcoord) + self.assertGreater(x.scale.coords()[0], curr_xcoord) self.assertEqual(x.scale.coords()[0], int(x.label.place_info()['x'])) @@ -238,7 +242,7 @@ if last == curr: # no more menu entries break - self.assertFalse(curr == default) + self.assertNotEqual(curr, default) i += 1 self.assertEqual(i, len(items)) diff --git a/Lib/lib-tk/test/test_ttk/test_style.py b/Lib/lib-tk/test/test_ttk/test_style.py --- a/Lib/lib-tk/test/test_ttk/test_style.py +++ b/Lib/lib-tk/test/test_ttk/test_style.py @@ -18,7 +18,7 @@ style.configure('TButton', background='yellow') self.assertEqual(style.configure('TButton', 'background'), 'yellow') - self.assertTrue(isinstance(style.configure('TButton'), dict)) + self.assertIsInstance(style.configure('TButton'), dict) def test_map(self): @@ -26,7 +26,7 @@ style.map('TButton', background=[('active', 'background', 'blue')]) self.assertEqual(style.map('TButton', 'background'), [('active', 'background', 'blue')]) - self.assertTrue(isinstance(style.map('TButton'), dict)) + self.assertIsInstance(style.map('TButton'), dict) def test_lookup(self): @@ -57,7 +57,7 @@ self.assertEqual(style.layout('Treeview'), tv_style) # should return a list - self.assertTrue(isinstance(style.layout('TButton'), list)) + self.assertIsInstance(style.layout('TButton'), list) # correct layout, but "option" doesn't exist as option self.assertRaises(Tkinter.TclError, style.layout, 'Treeview', diff --git a/Lib/lib-tk/test/test_ttk/test_widgets.py b/Lib/lib-tk/test/test_ttk/test_widgets.py --- a/Lib/lib-tk/test/test_ttk/test_widgets.py +++ b/Lib/lib-tk/test/test_ttk/test_widgets.py @@ -274,7 +274,7 @@ cbtn['command'] = '' res = cbtn.invoke() self.assertFalse(str(res)) - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn['offvalue'], cbtn.tk.globalgetvar(cbtn['variable'])) @@ -455,7 +455,7 @@ def test_bbox(self): self.assertEqual(len(self.entry.bbox(0)), 4) for item in self.entry.bbox(0): - self.assertTrue(isinstance(item, int)) + self.assertIsInstance(item, int) self.assertRaises(Tkinter.TclError, self.entry.bbox, 'noindex') self.assertRaises(Tkinter.TclError, self.entry.bbox, None) @@ -653,7 +653,7 @@ child = ttk.Label() self.paned.add(child) - self.assertTrue(isinstance(self.paned.pane(0), dict)) + self.assertIsInstance(self.paned.pane(0), dict) self.assertEqual(self.paned.pane(0, weight=None), 0) # newer form for querying a single option self.assertEqual(self.paned.pane(0, 'weight'), 0) @@ -680,8 +680,8 @@ curr_pos = self.paned.sashpos(0) 
self.paned.sashpos(0, 1000) - self.assertTrue(curr_pos != self.paned.sashpos(0)) - self.assertTrue(isinstance(self.paned.sashpos(0), int)) + self.assertNotEqual(curr_pos, self.paned.sashpos(0)) + self.assertIsInstance(self.paned.sashpos(0), int) @add_standard_options(StandardTtkOptionsTests) @@ -721,7 +721,7 @@ cbtn2['command'] = '' res = cbtn2.invoke() self.assertEqual(str(res), '') - self.assertFalse(len(success) > 1) + self.assertLessEqual(len(success), 1) self.assertEqual(cbtn2['value'], myvar.get()) self.assertEqual(myvar.get(), cbtn.tk.globalgetvar(cbtn['variable'])) @@ -983,7 +983,7 @@ self.nb.add(self.child2) self.assertEqual(self.nb.tabs(), tabs) self.assertEqual(self.nb.index(self.child2), child2_index) - self.assertTrue(str(self.child2) == self.nb.tabs()[child2_index]) + self.assertEqual(str(self.child2), self.nb.tabs()[child2_index]) # but the tab next to it (not hidden) is the one selected now self.assertEqual(self.nb.index('current'), curr + 1) @@ -996,19 +996,19 @@ tabs = self.nb.tabs() child1_index = self.nb.index(self.child1) self.nb.forget(self.child1) - self.assertFalse(str(self.child1) in self.nb.tabs()) + self.assertNotIn(str(self.child1), self.nb.tabs()) self.assertEqual(len(tabs) - 1, len(self.nb.tabs())) self.nb.add(self.child1) self.assertEqual(self.nb.index(self.child1), 1) - self.assertFalse(child1_index == self.nb.index(self.child1)) + self.assertNotEqual(child1_index, self.nb.index(self.child1)) def test_index(self): self.assertRaises(Tkinter.TclError, self.nb.index, -1) self.assertRaises(Tkinter.TclError, self.nb.index, None) - self.assertTrue(isinstance(self.nb.index('end'), int)) + self.assertIsInstance(self.nb.index('end'), int) self.assertEqual(self.nb.index(self.child1), 0) self.assertEqual(self.nb.index(self.child2), 1) self.assertEqual(self.nb.index('end'), 2) @@ -1072,7 +1072,7 @@ self.assertRaises(Tkinter.TclError, self.nb.tab, 'notab') self.assertRaises(Tkinter.TclError, self.nb.tab, None) - self.assertTrue(isinstance(self.nb.tab(self.child1), dict)) + self.assertIsInstance(self.nb.tab(self.child1), dict) self.assertEqual(self.nb.tab(self.child1, text=None), 'a') # newer form for querying a single option self.assertEqual(self.nb.tab(self.child1, 'text'), 'a') @@ -1193,7 +1193,7 @@ bbox = self.tv.bbox(children[0]) self.assertEqual(len(bbox), 4) - self.assertTrue(isinstance(bbox, tuple)) + self.assertIsInstance(bbox, tuple) for item in bbox: if not isinstance(item, int): self.fail("Invalid bounding box: %s" % bbox) @@ -1216,7 +1216,7 @@ self.assertEqual(self.tv.get_children(), ()) item_id = self.tv.insert('', 'end') - self.assertTrue(isinstance(self.tv.get_children(), tuple)) + self.assertIsInstance(self.tv.get_children(), tuple) self.assertEqual(self.tv.get_children()[0], item_id) # add item_id and child3 as children of child2 @@ -1241,9 +1241,9 @@ def test_column(self): # return a dict with all options/values - self.assertTrue(isinstance(self.tv.column('#0'), dict)) + self.assertIsInstance(self.tv.column('#0'), dict) # return a single value of the given option - self.assertTrue(isinstance(self.tv.column('#0', width=None), int)) + self.assertIsInstance(self.tv.column('#0', width=None), int) # set a new value for an option self.tv.column('#0', width=10) # testing new way to get option value @@ -1356,7 +1356,7 @@ def test_heading(self): # check a dict is returned - self.assertTrue(isinstance(self.tv.heading('#0'), dict)) + self.assertIsInstance(self.tv.heading('#0'), dict) # check a value is returned self.tv.heading('#0', text='hi') @@ -1467,7 +1467,7 
@@ self.tv.item(item, values=list(self.tv.item(item, values=None))) self.assertEqual(self.tv.item(item, values=None), (value, )) - self.assertTrue(isinstance(self.tv.item(item), dict)) + self.assertIsInstance(self.tv.item(item), dict) # erase item values self.tv.item(item, values='') @@ -1568,7 +1568,7 @@ 'blue') self.assertEqual(str(self.tv.tag_configure('test', foreground=None)), 'blue') - self.assertTrue(isinstance(self.tv.tag_configure('test'), dict)) + self.assertIsInstance(self.tv.tag_configure('test'), dict) @add_standard_options(StandardTtkOptionsTests) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:30 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjA0?= =?utf-8?q?=3A_Use_specific_asserts_in_array_tests=2E?= Message-ID: <3dMqzB32nyz7Lx2@mail.python.org> http://hg.python.org/cpython/rev/f7d8c7a176fb changeset: 87184:f7d8c7a176fb branch: 2.7 user: Serhiy Storchaka date: Sun Nov 17 00:44:57 2013 +0200 summary: Issue #19604: Use specific asserts in array tests. files: Lib/test/test_array.py | 64 +++++++++++++++--------------- 1 files changed, 32 insertions(+), 32 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -50,7 +50,7 @@ def test_constructor(self): a = array.array(self.typecode) self.assertEqual(a.typecode, self.typecode) - self.assertTrue(a.itemsize>=self.minitemsize) + self.assertGreaterEqual(a.itemsize, self.minitemsize) self.assertRaises(TypeError, array.array, self.typecode, None) def test_len(self): @@ -254,39 +254,39 @@ def test_cmp(self): a = array.array(self.typecode, self.example) - self.assertTrue((a == 42) is False) - self.assertTrue((a != 42) is True) + self.assertIs(a == 42, False) + self.assertIs(a != 42, True) - self.assertTrue((a == a) is True) - self.assertTrue((a != a) is False) - self.assertTrue((a < a) is False) - self.assertTrue((a <= a) is True) - self.assertTrue((a > a) is False) - self.assertTrue((a >= a) is True) + self.assertIs(a == a, True) + self.assertIs(a != a, False) + self.assertIs(a < a, False) + self.assertIs(a <= a, True) + self.assertIs(a > a, False) + self.assertIs(a >= a, True) al = array.array(self.typecode, self.smallerexample) ab = array.array(self.typecode, self.biggerexample) - self.assertTrue((a == 2*a) is False) - self.assertTrue((a != 2*a) is True) - self.assertTrue((a < 2*a) is True) - self.assertTrue((a <= 2*a) is True) - self.assertTrue((a > 2*a) is False) - self.assertTrue((a >= 2*a) is False) + self.assertIs(a == 2*a, False) + self.assertIs(a != 2*a, True) + self.assertIs(a < 2*a, True) + self.assertIs(a <= 2*a, True) + self.assertIs(a > 2*a, False) + self.assertIs(a >= 2*a, False) - self.assertTrue((a == al) is False) - self.assertTrue((a != al) is True) - self.assertTrue((a < al) is False) - self.assertTrue((a <= al) is False) - self.assertTrue((a > al) is True) - self.assertTrue((a >= al) is True) + self.assertIs(a == al, False) + self.assertIs(a != al, True) + self.assertIs(a < al, False) + self.assertIs(a <= al, False) + self.assertIs(a > al, True) + self.assertIs(a >= al, True) - self.assertTrue((a == ab) is False) - self.assertTrue((a != ab) is True) - self.assertTrue((a < ab) is True) - self.assertTrue((a <= ab) is True) - self.assertTrue((a > ab) is False) - self.assertTrue((a >= ab) is False) + self.assertIs(a == ab, False) + self.assertIs(a != ab, True) + 
self.assertIs(a < ab, True) + self.assertIs(a <= ab, True) + self.assertIs(a > ab, False) + self.assertIs(a >= ab, False) def test_add(self): a = array.array(self.typecode, self.example) \ @@ -305,7 +305,7 @@ a = array.array(self.typecode, self.example[::-1]) b = a a += array.array(self.typecode, 2*self.example) - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, self.example[::-1]+2*self.example) @@ -354,22 +354,22 @@ b = a a *= 5 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, 5*self.example) ) a *= 0 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= 1000 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= -1 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a = array.array(self.typecode, self.example) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:31 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjA0?= =?utf-8?q?=3A_Use_specific_asserts_in_array_tests=2E?= Message-ID: <3dMqzC63GMz7Lx0@mail.python.org> http://hg.python.org/cpython/rev/3650790ca33a changeset: 87185:3650790ca33a branch: 3.3 parent: 87181:9d1309e31491 user: Serhiy Storchaka date: Sun Nov 17 00:45:17 2013 +0200 summary: Issue #19604: Use specific asserts in array tests. files: Lib/test/test_array.py | 64 +++++++++++++++--------------- 1 files changed, 32 insertions(+), 32 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -195,7 +195,7 @@ def test_constructor(self): a = array.array(self.typecode) self.assertEqual(a.typecode, self.typecode) - self.assertTrue(a.itemsize>=self.minitemsize) + self.assertGreaterEqual(a.itemsize, self.minitemsize) self.assertRaises(TypeError, array.array, self.typecode, None) def test_len(self): @@ -442,39 +442,39 @@ def test_cmp(self): a = array.array(self.typecode, self.example) - self.assertTrue((a == 42) is False) - self.assertTrue((a != 42) is True) + self.assertIs(a == 42, False) + self.assertIs(a != 42, True) - self.assertTrue((a == a) is True) - self.assertTrue((a != a) is False) - self.assertTrue((a < a) is False) - self.assertTrue((a <= a) is True) - self.assertTrue((a > a) is False) - self.assertTrue((a >= a) is True) + self.assertIs(a == a, True) + self.assertIs(a != a, False) + self.assertIs(a < a, False) + self.assertIs(a <= a, True) + self.assertIs(a > a, False) + self.assertIs(a >= a, True) al = array.array(self.typecode, self.smallerexample) ab = array.array(self.typecode, self.biggerexample) - self.assertTrue((a == 2*a) is False) - self.assertTrue((a != 2*a) is True) - self.assertTrue((a < 2*a) is True) - self.assertTrue((a <= 2*a) is True) - self.assertTrue((a > 2*a) is False) - self.assertTrue((a >= 2*a) is False) + self.assertIs(a == 2*a, False) + self.assertIs(a != 2*a, True) + self.assertIs(a < 2*a, True) + self.assertIs(a <= 2*a, True) + self.assertIs(a > 2*a, False) + self.assertIs(a >= 2*a, False) - self.assertTrue((a == al) is False) - self.assertTrue((a != al) is True) - self.assertTrue((a < al) is False) - self.assertTrue((a <= al) is False) - self.assertTrue((a > al) is True) - self.assertTrue((a >= al) is True) + self.assertIs(a == al, False) + self.assertIs(a != al, True) + 
self.assertIs(a < al, False) + self.assertIs(a <= al, False) + self.assertIs(a > al, True) + self.assertIs(a >= al, True) - self.assertTrue((a == ab) is False) - self.assertTrue((a != ab) is True) - self.assertTrue((a < ab) is True) - self.assertTrue((a <= ab) is True) - self.assertTrue((a > ab) is False) - self.assertTrue((a >= ab) is False) + self.assertIs(a == ab, False) + self.assertIs(a != ab, True) + self.assertIs(a < ab, True) + self.assertIs(a <= ab, True) + self.assertIs(a > ab, False) + self.assertIs(a >= ab, False) def test_add(self): a = array.array(self.typecode, self.example) \ @@ -493,7 +493,7 @@ a = array.array(self.typecode, self.example[::-1]) b = a a += array.array(self.typecode, 2*self.example) - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, self.example[::-1]+2*self.example) @@ -548,22 +548,22 @@ b = a a *= 5 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, 5*self.example) ) a *= 0 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= 1000 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= -1 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a = array.array(self.typecode, self.example) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319604=3A_Use_specific_asserts_in_array_tests=2E?= Message-ID: <3dMqzF1s7nz7LxR@mail.python.org> http://hg.python.org/cpython/rev/e8276bb6a156 changeset: 87186:e8276bb6a156 parent: 87182:a3ed49cf7c70 parent: 87185:3650790ca33a user: Serhiy Storchaka date: Sun Nov 17 00:45:39 2013 +0200 summary: Issue #19604: Use specific asserts in array tests. 
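A note on the comparison asserts in the array changesets above: assertTrue((a == al) is False) becomes assertIs(a == al, False) rather than assertFalse(a == al), because these tests appear to check that the rich comparison returns the bool singletons True/False themselves, not merely a truthy or falsy object; assertIs preserves that identity check while giving a clearer failure message. A small self-contained illustration of the idiom; the ArrayComparisonExample class and its values are hypothetical and not taken from Lib/test/test_array.py::

    import unittest
    from array import array

    class ArrayComparisonExample(unittest.TestCase):
        # Hypothetical test case, not part of the changeset.

        def test_comparisons_return_bool_singletons(self):
            a = array('i', [1, 2, 3])
            b = array('i', [1, 2, 3])
            c = array('i', [1, 2, 4])
            # assertIs checks identity, so it verifies the comparison
            # returned True/False themselves, not just any truthy or
            # falsy value.
            self.assertIs(a == b, True)
            self.assertIs(a == c, False)
            self.assertIs(a < c, True)

    if __name__ == '__main__':
        unittest.main()
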
files: Lib/test/test_array.py | 64 +++++++++++++++--------------- 1 files changed, 32 insertions(+), 32 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -195,7 +195,7 @@ def test_constructor(self): a = array.array(self.typecode) self.assertEqual(a.typecode, self.typecode) - self.assertTrue(a.itemsize>=self.minitemsize) + self.assertGreaterEqual(a.itemsize, self.minitemsize) self.assertRaises(TypeError, array.array, self.typecode, None) def test_len(self): @@ -442,39 +442,39 @@ def test_cmp(self): a = array.array(self.typecode, self.example) - self.assertTrue((a == 42) is False) - self.assertTrue((a != 42) is True) + self.assertIs(a == 42, False) + self.assertIs(a != 42, True) - self.assertTrue((a == a) is True) - self.assertTrue((a != a) is False) - self.assertTrue((a < a) is False) - self.assertTrue((a <= a) is True) - self.assertTrue((a > a) is False) - self.assertTrue((a >= a) is True) + self.assertIs(a == a, True) + self.assertIs(a != a, False) + self.assertIs(a < a, False) + self.assertIs(a <= a, True) + self.assertIs(a > a, False) + self.assertIs(a >= a, True) al = array.array(self.typecode, self.smallerexample) ab = array.array(self.typecode, self.biggerexample) - self.assertTrue((a == 2*a) is False) - self.assertTrue((a != 2*a) is True) - self.assertTrue((a < 2*a) is True) - self.assertTrue((a <= 2*a) is True) - self.assertTrue((a > 2*a) is False) - self.assertTrue((a >= 2*a) is False) + self.assertIs(a == 2*a, False) + self.assertIs(a != 2*a, True) + self.assertIs(a < 2*a, True) + self.assertIs(a <= 2*a, True) + self.assertIs(a > 2*a, False) + self.assertIs(a >= 2*a, False) - self.assertTrue((a == al) is False) - self.assertTrue((a != al) is True) - self.assertTrue((a < al) is False) - self.assertTrue((a <= al) is False) - self.assertTrue((a > al) is True) - self.assertTrue((a >= al) is True) + self.assertIs(a == al, False) + self.assertIs(a != al, True) + self.assertIs(a < al, False) + self.assertIs(a <= al, False) + self.assertIs(a > al, True) + self.assertIs(a >= al, True) - self.assertTrue((a == ab) is False) - self.assertTrue((a != ab) is True) - self.assertTrue((a < ab) is True) - self.assertTrue((a <= ab) is True) - self.assertTrue((a > ab) is False) - self.assertTrue((a >= ab) is False) + self.assertIs(a == ab, False) + self.assertIs(a != ab, True) + self.assertIs(a < ab, True) + self.assertIs(a <= ab, True) + self.assertIs(a > ab, False) + self.assertIs(a >= ab, False) def test_add(self): a = array.array(self.typecode, self.example) \ @@ -493,7 +493,7 @@ a = array.array(self.typecode, self.example[::-1]) b = a a += array.array(self.typecode, 2*self.example) - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, self.example[::-1]+2*self.example) @@ -548,22 +548,22 @@ b = a a *= 5 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual( a, array.array(self.typecode, 5*self.example) ) a *= 0 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= 1000 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a *= -1 - self.assertTrue(a is b) + self.assertIs(a, b) self.assertEqual(a, array.array(self.typecode)) a = array.array(self.typecode, self.example) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:34 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:34 +0100 (CET) 
Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dMqzG4wmQz7LxD@mail.python.org> http://hg.python.org/cpython/rev/dcc12d083aaf changeset: 87187:dcc12d083aaf branch: 3.3 parent: 87185:3650790ca33a parent: 87176:895c80ce4e10 user: Serhiy Storchaka date: Sun Nov 17 12:30:29 2013 +0200 summary: Merge heads files: .hgtags | 2 + Doc/library/string.rst | 2 +- Include/patchlevel.h | 6 ++-- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 40 ++++++++++++++++++++++---- Misc/RPM/python-3.3.spec | 2 +- README | 4 +- 8 files changed, 45 insertions(+), 15 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -116,3 +116,5 @@ d9893d13c6289aa03d33559ec67f97dcbf5c9e3c v3.3.1 d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 +d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 +c3896275c0f61b2510a6c7e6c458a750359a91b8 v3.3.3 diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -300,7 +300,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -19,11 +19,11 @@ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 3 #define PY_MICRO_VERSION 3 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_GAMMA -#define PY_RELEASE_SERIAL 1 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL +#define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.3.3rc1" +#define PY_VERSION "3.3.3+" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -13,5 +13,5 @@ # Updated automatically by the Python release process. # #--start constants-- -__version__ = "3.3.3rc1" +__version__ = "3.3.3" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "3.3.3rc1" +IDLE_VERSION = "3.3.3" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,10 +2,10 @@ Python News +++++++++++ -What's New in Python 3.3.4? -=========================== - -*Not yet released, see sections below for changes released in 3.3.2* +What's New in Python 3.3.4 release candidate 1? +=============================================== + +*Not yet released, see sections below for changes released in 3.3.3* Core and Builtins ----------------- @@ -42,8 +42,6 @@ - Issue #19286: Directories in ``package_data`` are no longer added to the filelist, preventing failure outlined in the ticket. -- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. - Tests ----- @@ -58,6 +56,36 @@ Build ----- + +What's New in Python 3.3.3? +=========================== + +*Release date: 17-Nov-2013* + +No changes from release candidate 2. + + +What's New in Python 3.3.3 release candidate 2? 
+=============================================== + +*Release date: 11-Nov-2013* + +Library +------- + +- Issue #19227: Any re-seeding of the OpenSSL RNG on fork has been removed; + this should be handled by OpenSSL itself or by the application. + +- Issue #19435: Fix directory traversal attack on CGIHttpRequestHandler. + +Tests +----- + +- Issue #18964: Fix test_tcl when run with Tcl/Tk versions < 8.5. + +Build +----- + - Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.3.3. Some third-party projects, such as Matplotlib and PIL/Pillow, depended on being able to build with Tcl and Tk frameworks in diff --git a/Misc/RPM/python-3.3.spec b/Misc/RPM/python-3.3.spec --- a/Misc/RPM/python-3.3.spec +++ b/Misc/RPM/python-3.3.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.3.3rc1 +%define version 3.3.3 %define libvers 3.3 #--end constants-- %define release 1pydotorg diff --git a/README b/README --- a/README +++ b/README @@ -1,5 +1,5 @@ -This is Python version 3.3.3 release candidate 1 -================================================ +This is Python version 3.3.3 +============================ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013 Python Software Foundation. All rights reserved. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:36 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dMqzJ0yCMz7LxG@mail.python.org> http://hg.python.org/cpython/rev/9c2da409dfaa changeset: 87188:9c2da409dfaa parent: 87186:e8276bb6a156 parent: 87177:d6ad412dee40 user: Serhiy Storchaka date: Sun Nov 17 12:30:50 2013 +0200 summary: Merge heads files: .hgtags | 2 + Doc/library/dbm.rst | 38 +++++++++++++++----------- Doc/library/string.rst | 2 +- Lib/dbm/dumb.py | 6 ++++ Lib/test/test_dbm_dumb.py | 13 +++++++++ Lib/test/test_dbm_gnu.py | 11 +++++++ Lib/test/test_dbm_ndbm.py | 13 +++++++++ Misc/NEWS | 3 ++ Modules/_dbmmodule.c | 17 ++++++++++++ Modules/_gdbmmodule.c | 16 +++++++++++ 10 files changed, 104 insertions(+), 17 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -116,6 +116,8 @@ d9893d13c6289aa03d33559ec67f97dcbf5c9e3c v3.3.1 d047928ae3f6314a13b6137051315453d0ae89b6 v3.3.2 fd53c500f8b80f54f3ecedec9da2e8c7e52a6888 v3.3.3rc1 +d32442c0e60dfbd71234e807d3d1dedd227495a9 v3.3.3rc2 +c3896275c0f61b2510a6c7e6c458a750359a91b8 v3.3.3 46535f65e7f3bcdcf176f36d34bc1fed719ffd2b v3.4.0a1 9265a2168e2cb2a84785d8717792acc661e6b692 v3.4.0a2 dd9cdf90a5073510877e9dd5112f8e6cf20d5e89 v3.4.0a3 diff --git a/Doc/library/dbm.rst b/Doc/library/dbm.rst --- a/Doc/library/dbm.rst +++ b/Doc/library/dbm.rst @@ -73,33 +73,39 @@ strings are used they are implicitly converted to the default encoding before being stored. +These objects also support being used in a :keyword:`with` statement, which +will automatically close them when done. + +.. versionchanged:: 3.4 + Added native support for the context management protocol to the objects + returned by :func:`.open`. + The following example records some hostnames and a corresponding title, and then prints out the contents of the database:: import dbm # Open database, creating it if necessary. 
- db = dbm.open('cache', 'c') + with dbm.open('cache', 'c') as db: - # Record some values - db[b'hello'] = b'there' - db['www.python.org'] = 'Python Website' - db['www.cnn.com'] = 'Cable News Network' + # Record some values + db[b'hello'] = b'there' + db['www.python.org'] = 'Python Website' + db['www.cnn.com'] = 'Cable News Network' - # Note that the keys are considered bytes now. - assert db[b'www.python.org'] == b'Python Website' - # Notice how the value is now in bytes. - assert db['www.cnn.com'] == b'Cable News Network' + # Note that the keys are considered bytes now. + assert db[b'www.python.org'] == b'Python Website' + # Notice how the value is now in bytes. + assert db['www.cnn.com'] == b'Cable News Network' - # Often-used methods of the dict interface work too. - print(db.get('python.org', b'not present')) + # Often-used methods of the dict interface work too. + print(db.get('python.org', b'not present')) - # Storing a non-string key or value will raise an exception (most - # likely a TypeError). - db['www.yahoo.com'] = 4 + # Storing a non-string key or value will raise an exception (most + # likely a TypeError). + db['www.yahoo.com'] = 4 - # Close when done. - db.close() + # db is automatically closed when leaving the with statement. .. seealso:: diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -300,7 +300,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't diff --git a/Lib/dbm/dumb.py b/Lib/dbm/dumb.py --- a/Lib/dbm/dumb.py +++ b/Lib/dbm/dumb.py @@ -236,6 +236,12 @@ if hasattr(self._os, 'chmod'): self._os.chmod(file, self._mode) + def __enter__(self): + return self + + def __exit__(self, *args): + self.close() + def open(file, flag=None, mode=0o666): """Open the database file, filename, and return corresponding object. diff --git a/Lib/test/test_dbm_dumb.py b/Lib/test/test_dbm_dumb.py --- a/Lib/test/test_dbm_dumb.py +++ b/Lib/test/test_dbm_dumb.py @@ -184,6 +184,19 @@ self.assertEqual(expected, got) f.close() + def test_context_manager(self): + with dumbdbm.open(_fname, 'c') as db: + db["dumbdbm context manager"] = "context manager" + + with dumbdbm.open(_fname, 'r') as db: + self.assertEqual(list(db.keys()), [b"dumbdbm context manager"]) + + # This currently just raises AttributeError rather than a specific + # exception like the GNU or NDBM based implementations. See + # http://bugs.python.org/issue19385 for details. 
+ with self.assertRaises(Exception): + db.keys() + def tearDown(self): _delete_files() diff --git a/Lib/test/test_dbm_gnu.py b/Lib/test/test_dbm_gnu.py --- a/Lib/test/test_dbm_gnu.py +++ b/Lib/test/test_dbm_gnu.py @@ -81,6 +81,17 @@ size2 = os.path.getsize(filename) self.assertTrue(size1 > size2 >= size0) + def test_context_manager(self): + with gdbm.open(filename, 'c') as db: + db["gdbm context manager"] = "context manager" + + with gdbm.open(filename, 'r') as db: + self.assertEqual(list(db.keys()), [b"gdbm context manager"]) + + with self.assertRaises(gdbm.error) as cm: + db.keys() + self.assertEqual(str(cm.exception), + "GDBM object has already been closed") if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_dbm_ndbm.py b/Lib/test/test_dbm_ndbm.py --- a/Lib/test/test_dbm_ndbm.py +++ b/Lib/test/test_dbm_ndbm.py @@ -37,5 +37,18 @@ except error: self.fail() + def test_context_manager(self): + with dbm.ndbm.open(self.filename, 'c') as db: + db["ndbm context manager"] = "context manager" + + with dbm.ndbm.open(self.filename, 'r') as db: + self.assertEqual(list(db.keys()), [b"ndbm context manager"]) + + with self.assertRaises(dbm.ndbm.error) as cm: + db.keys() + self.assertEqual(str(cm.exception), + "DBM object has already been closed") + + if __name__ == '__main__': unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,9 @@ Library ------- +- Issue #19282: dbm.open now supports the context manager protocol. (Inital + patch by Claudiu Popa) + - Issue #8311: Added support for writing any bytes-like objects in the aifc, sunau, and wave modules. diff --git a/Modules/_dbmmodule.c b/Modules/_dbmmodule.c --- a/Modules/_dbmmodule.c +++ b/Modules/_dbmmodule.c @@ -313,6 +313,21 @@ return defvalue; } +static PyObject * +dbm__enter__(PyObject *self, PyObject *args) +{ + Py_INCREF(self); + return self; +} + +static PyObject * +dbm__exit__(PyObject *self, PyObject *args) +{ + _Py_IDENTIFIER(close); + return _PyObject_CallMethodId(self, &PyId_close, NULL); +} + + static PyMethodDef dbm_methods[] = { {"close", (PyCFunction)dbm__close, METH_NOARGS, "close()\nClose the database."}, @@ -325,6 +340,8 @@ "setdefault(key[, default]) -> value\n" "Return the value for key if present, otherwise default. 
If key\n" "is not in the database, it is inserted with default as the value."}, + {"__enter__", dbm__enter__, METH_NOARGS, NULL}, + {"__exit__", dbm__exit__, METH_VARARGS, NULL}, {NULL, NULL} /* sentinel */ }; diff --git a/Modules/_gdbmmodule.c b/Modules/_gdbmmodule.c --- a/Modules/_gdbmmodule.c +++ b/Modules/_gdbmmodule.c @@ -425,6 +425,20 @@ return Py_None; } +static PyObject * +dbm__enter__(PyObject *self, PyObject *args) +{ + Py_INCREF(self); + return self; +} + +static PyObject * +dbm__exit__(PyObject *self, PyObject *args) +{ + _Py_IDENTIFIER(close); + return _PyObject_CallMethodId(self, &PyId_close, NULL); +} + static PyMethodDef dbm_methods[] = { {"close", (PyCFunction)dbm_close, METH_NOARGS, dbm_close__doc__}, {"keys", (PyCFunction)dbm_keys, METH_NOARGS, dbm_keys__doc__}, @@ -434,6 +448,8 @@ {"sync", (PyCFunction)dbm_sync, METH_NOARGS, dbm_sync__doc__}, {"get", (PyCFunction)dbm_get, METH_VARARGS, dbm_get__doc__}, {"setdefault",(PyCFunction)dbm_setdefault,METH_VARARGS, dbm_setdefault__doc__}, + {"__enter__", dbm__enter__, METH_NOARGS, NULL}, + {"__exit__", dbm__exit__, METH_VARARGS, NULL}, {NULL, NULL} /* sentinel */ }; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:37 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dMqzK2pHTz7Lx2@mail.python.org> http://hg.python.org/cpython/rev/436f7e55be77 changeset: 87189:436f7e55be77 branch: 2.7 parent: 87184:f7d8c7a176fb parent: 87163:2b0690c9a026 user: Serhiy Storchaka date: Sun Nov 17 12:31:29 2013 +0200 summary: Merge heads files: Doc/library/string.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/string.rst b/Doc/library/string.rst --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -330,7 +330,7 @@ precision: `integer` type: "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%" -If a valid *align* value is specified, it can be preceeded by a *fill* +If a valid *align* value is specified, it can be preceded by a *fill* character that can be any character and defaults to a space if omitted. 
Note that it is not possible to use ``{`` and ``}`` as *fill* char while using the :meth:`str.format` method; this limitation however doesn't -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:38 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dMqzL4Yvsz7Lx2@mail.python.org> http://hg.python.org/cpython/rev/b36974a52f51 changeset: 87190:b36974a52f51 parent: 87188:9c2da409dfaa parent: 87187:dcc12d083aaf user: Serhiy Storchaka date: Sun Nov 17 12:31:56 2013 +0200 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 11:58:39 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 11:58:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_merge_erro?= =?utf-8?q?r_in_issue_=2319602=2E?= Message-ID: <3dMqzM6Ch2z7LkH@mail.python.org> http://hg.python.org/cpython/rev/6bd5e4325506 changeset: 87191:6bd5e4325506 branch: 2.7 parent: 87189:436f7e55be77 user: Serhiy Storchaka date: Sun Nov 17 12:49:28 2013 +0200 summary: Fix merge error in issue #19602. files: Lib/lib-tk/test/test_ttk/test_extensions.py | 4 ---- 1 files changed, 0 insertions(+), 4 deletions(-) diff --git a/Lib/lib-tk/test/test_ttk/test_extensions.py b/Lib/lib-tk/test/test_ttk/test_extensions.py --- a/Lib/lib-tk/test/test_ttk/test_extensions.py +++ b/Lib/lib-tk/test/test_ttk/test_extensions.py @@ -45,11 +45,7 @@ # it tries calling instance attributes not yet defined. ttk.LabeledScale(variable=myvar) if hasattr(sys, 'last_type'): -<<<<<<< - self.assertFalse(sys.last_type == Tkinter.TclError) -======= self.assertNotEqual(sys.last_type, tkinter.TclError) ->>>>>>> def test_initialization(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:09 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjA1?= =?utf-8?q?=3A_Use_specific_asserts_in_datetime_tests?= Message-ID: <3dMrYx3PNFz7Llx@mail.python.org> http://hg.python.org/cpython/rev/bc67e8d39164 changeset: 87192:bc67e8d39164 branch: 3.3 parent: 87187:dcc12d083aaf user: Serhiy Storchaka date: Sun Nov 17 12:52:33 2013 +0200 summary: Issue #19605: Use specific asserts in datetime tests files: Lib/test/datetimetester.py | 159 ++++++++++++------------ 1 files changed, 80 insertions(+), 79 deletions(-) diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -115,11 +115,11 @@ # carry no data), but they need to be picklable anyway else # concrete subclasses can't be pickled. orig = tzinfo.__new__(tzinfo) - self.assertTrue(type(orig) is tzinfo) + self.assertIs(type(orig), tzinfo) for pickler, unpickler, proto in pickle_choices: green = pickler.dumps(orig, proto) derived = unpickler.loads(green) - self.assertTrue(type(derived) is tzinfo) + self.assertIs(type(derived), tzinfo) def test_pickling_subclass(self): # Make sure we can pickle/unpickle an instance of a subclass. 
@@ -479,9 +479,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for args in (3, 3, 3), (2, 4, 4), (2, 3, 5): t2 = timedelta(*args) # this is larger than t1 @@ -491,12 +491,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -629,7 +629,7 @@ self.assertTrue(timedelta(0, 1)) self.assertTrue(timedelta(0, 0, 1)) self.assertTrue(timedelta(microseconds=1)) - self.assertTrue(not timedelta(0)) + self.assertFalse(timedelta(0)) def test_subclass_timedelta(self): @@ -645,17 +645,17 @@ return round(sum) t1 = T(days=1) - self.assertTrue(type(t1) is T) + self.assertIs(type(t1), T) self.assertEqual(t1.as_hours(), 24) t2 = T(days=-1, seconds=-3600) - self.assertTrue(type(t2) is T) + self.assertIs(type(t2), T) self.assertEqual(t2.as_hours(), -25) t3 = t1 + t2 - self.assertTrue(type(t3) is timedelta) + self.assertIs(type(t3), timedelta) t4 = T.from_td(t3) - self.assertTrue(type(t4) is T) + self.assertIs(type(t4), T) self.assertEqual(t3.days, t4.days) self.assertEqual(t3.seconds, t4.seconds) self.assertEqual(t3.microseconds, t4.microseconds) @@ -1007,8 +1007,9 @@ # It worked or it didn't. If it didn't, assume it's reason #2, and # let the test pass if they're within half a second of each other. 
- self.assertTrue(today == todayagain or - abs(todayagain - today) < timedelta(seconds=0.5)) + if today != todayagain: + self.assertAlmostEqual(todayagain, today, + delta=timedelta(seconds=0.5)) def test_weekday(self): for i in range(7): @@ -1202,9 +1203,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for args in (3, 3, 3), (2, 4, 4), (2, 3, 5): t2 = self.theclass(*args) # this is larger than t1 @@ -1214,12 +1215,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -1690,9 +1691,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for i in range(len(args)): newargs = args[:] @@ -1704,12 +1705,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) # A helper for timestamp constructor tests. @@ -1846,7 +1847,7 @@ if abs(from_timestamp - from_now) <= tolerance: break # Else try again a few times. 
- self.assertTrue(abs(from_timestamp - from_now) <= tolerance) + self.assertLessEqual(abs(from_timestamp - from_now), tolerance) def test_strptime(self): string = '2004-12-01 13:02:47.197' @@ -2072,9 +2073,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for i in range(len(args)): newargs = args[:] @@ -2086,12 +2087,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -2269,8 +2270,8 @@ self.assertTrue(cls(0, 1)) self.assertTrue(cls(0, 0, 1)) self.assertTrue(cls(0, 0, 0, 1)) - self.assertTrue(not cls(0)) - self.assertTrue(not cls()) + self.assertFalse(cls(0)) + self.assertFalse(cls()) def test_replace(self): cls = self.theclass @@ -2367,7 +2368,7 @@ def utcoffset(self, dt): pass b = BetterTry() t = cls(1, 1, 1, tzinfo=b) - self.assertTrue(t.tzinfo is b) + self.assertIs(t.tzinfo, b) def test_utc_offset_out_of_bounds(self): class Edgy(tzinfo): @@ -2406,9 +2407,9 @@ for t in (cls(1, 1, 1), cls(1, 1, 1, tzinfo=None), cls(1, 1, 1, tzinfo=C1())): - self.assertTrue(t.utcoffset() is None) - self.assertTrue(t.dst() is None) - self.assertTrue(t.tzname() is None) + self.assertIsNone(t.utcoffset()) + self.assertIsNone(t.dst()) + self.assertIsNone(t.tzname()) class C3(tzinfo): def utcoffset(self, dt): return timedelta(minutes=-1439) @@ -2503,7 +2504,7 @@ self.assertEqual(t.minute, 0) self.assertEqual(t.second, 0) self.assertEqual(t.microsecond, 0) - self.assertTrue(t.tzinfo is None) + self.assertIsNone(t.tzinfo) def test_zones(self): est = FixedOffset(-300, "EST", 1) @@ -2518,25 +2519,25 @@ self.assertEqual(t1.tzinfo, est) self.assertEqual(t2.tzinfo, utc) self.assertEqual(t3.tzinfo, met) - self.assertTrue(t4.tzinfo is None) + self.assertIsNone(t4.tzinfo) self.assertEqual(t5.tzinfo, utc) self.assertEqual(t1.utcoffset(), timedelta(minutes=-300)) self.assertEqual(t2.utcoffset(), timedelta(minutes=0)) self.assertEqual(t3.utcoffset(), timedelta(minutes=60)) - self.assertTrue(t4.utcoffset() is None) + self.assertIsNone(t4.utcoffset()) self.assertRaises(TypeError, t1.utcoffset, "no args") self.assertEqual(t1.tzname(), "EST") self.assertEqual(t2.tzname(), "UTC") self.assertEqual(t3.tzname(), "MET") - self.assertTrue(t4.tzname() is None) + self.assertIsNone(t4.tzname()) self.assertRaises(TypeError, t1.tzname, "no args") self.assertEqual(t1.dst(), timedelta(minutes=1)) self.assertEqual(t2.dst(), timedelta(minutes=-2)) self.assertEqual(t3.dst(), timedelta(minutes=3)) - self.assertTrue(t4.dst() is None) + self.assertIsNone(t4.dst()) self.assertRaises(TypeError, t1.dst, "no args") self.assertEqual(hash(t1), hash(t2)) @@ -2633,10 +2634,10 @@ self.assertTrue(t) t = cls(5, tzinfo=FixedOffset(300, "")) - self.assertTrue(not t) + self.assertFalse(t) t = cls(23, 59, tzinfo=FixedOffset(23*60 + 59, "")) - self.assertTrue(not t) + self.assertFalse(t) # Mostly ensuring this doesn't overflow internally. 
t = cls(0, tzinfo=FixedOffset(23*60 + 59, "")) @@ -2674,13 +2675,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. base = cls(1) @@ -2915,7 +2916,7 @@ tz55 = FixedOffset(-330, "west 5:30") timeaware = now.time().replace(tzinfo=tz55) nowaware = self.theclass.combine(now.date(), timeaware) - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) self.assertEqual(nowaware.timetz(), timeaware) # Can't mix aware and non-aware. @@ -2934,15 +2935,15 @@ # Adding a delta should preserve tzinfo. delta = timedelta(weeks=1, minutes=12, microseconds=5678) nowawareplus = nowaware + delta - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) nowawareplus2 = delta + nowaware - self.assertTrue(nowawareplus2.tzinfo is tz55) + self.assertIs(nowawareplus2.tzinfo, tz55) self.assertEqual(nowawareplus, nowawareplus2) # that - delta should be what we started with, and that - what we # started with should be delta. diff = nowawareplus - delta - self.assertTrue(diff.tzinfo is tz55) + self.assertIs(diff.tzinfo, tz55) self.assertEqual(nowaware, diff) self.assertRaises(TypeError, lambda: delta - nowawareplus) self.assertEqual(nowawareplus - nowaware, delta) @@ -2951,7 +2952,7 @@ tzr = FixedOffset(random.randrange(-1439, 1440), "randomtimezone") # Attach it to nowawareplus. nowawareplus = nowawareplus.replace(tzinfo=tzr) - self.assertTrue(nowawareplus.tzinfo is tzr) + self.assertIs(nowawareplus.tzinfo, tzr) # Make sure the difference takes the timezone adjustments into account. got = nowaware - nowawareplus # Expected: (nowaware base - nowaware offset) - @@ -2983,7 +2984,7 @@ off42 = FixedOffset(42, "42") another = meth(off42) again = meth(tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, 16) @@ -3001,7 +3002,7 @@ timezone(timedelta(hours=15, minutes=58), "weirdtz"),]: for dummy in range(3): now = datetime.now(weirdtz) - self.assertTrue(now.tzinfo is weirdtz) + self.assertIs(now.tzinfo, weirdtz) utcnow = datetime.utcnow().replace(tzinfo=utc) now2 = utcnow.astimezone(weirdtz) if abs(now - now2) < timedelta(seconds=30): @@ -3022,7 +3023,7 @@ off42 = FixedOffset(42, "42") another = meth(ts, off42) again = meth(ts, tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, ts, 16) @@ -3233,13 +3234,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. 
base = cls(2000, 2, 29) @@ -3252,18 +3253,18 @@ fm5h = FixedOffset(-timedelta(hours=5), "m300") dt = self.theclass.now(tz=f44m) - self.assertTrue(dt.tzinfo is f44m) + self.assertIs(dt.tzinfo, f44m) # Replacing with degenerate tzinfo raises an exception. self.assertRaises(ValueError, dt.astimezone, fnone) # Replacing with same tzinfo makes no change. x = dt.astimezone(dt.tzinfo) - self.assertTrue(x.tzinfo is f44m) + self.assertIs(x.tzinfo, f44m) self.assertEqual(x.date(), dt.date()) self.assertEqual(x.time(), dt.time()) # Replacing with different tzinfo does adjust. got = dt.astimezone(fm5h) - self.assertTrue(got.tzinfo is fm5h) + self.assertIs(got.tzinfo, fm5h) self.assertEqual(got.utcoffset(), timedelta(hours=-5)) expected = dt - dt.utcoffset() # in effect, convert to UTC expected += fm5h.utcoffset(dt) # and from there to local time @@ -3271,7 +3272,7 @@ self.assertEqual(got.date(), expected.date()) self.assertEqual(got.time(), expected.time()) self.assertEqual(got.timetz(), expected.timetz()) - self.assertTrue(got.tzinfo is expected.tzinfo) + self.assertIs(got.tzinfo, expected.tzinfo) self.assertEqual(got, expected) @support.run_with_tz('UTC') @@ -3732,8 +3733,8 @@ as_datetime = datetime.combine(as_date, time()) self.assertTrue(as_date != as_datetime) self.assertTrue(as_datetime != as_date) - self.assertTrue(not as_date == as_datetime) - self.assertTrue(not as_datetime == as_date) + self.assertFalse(as_date == as_datetime) + self.assertFalse(as_datetime == as_date) self.assertRaises(TypeError, lambda: as_date < as_datetime) self.assertRaises(TypeError, lambda: as_datetime < as_date) self.assertRaises(TypeError, lambda: as_date <= as_datetime) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:11 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319605=3A_Use_specific_asserts_in_datetime_tests?= Message-ID: <3dMrYz0Hqpz7Lnv@mail.python.org> http://hg.python.org/cpython/rev/0b7a519cb58f changeset: 87193:0b7a519cb58f parent: 87190:b36974a52f51 parent: 87192:bc67e8d39164 user: Serhiy Storchaka date: Sun Nov 17 13:03:07 2013 +0200 summary: Issue #19605: Use specific asserts in datetime tests files: Lib/test/datetimetester.py | 159 ++++++++++++------------ 1 files changed, 80 insertions(+), 79 deletions(-) diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -115,11 +115,11 @@ # carry no data), but they need to be picklable anyway else # concrete subclasses can't be pickled. orig = tzinfo.__new__(tzinfo) - self.assertTrue(type(orig) is tzinfo) + self.assertIs(type(orig), tzinfo) for pickler, unpickler, proto in pickle_choices: green = pickler.dumps(orig, proto) derived = unpickler.loads(green) - self.assertTrue(type(derived) is tzinfo) + self.assertIs(type(derived), tzinfo) def test_pickling_subclass(self): # Make sure we can pickle/unpickle an instance of a subclass. 
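One conversion in the #19605 diffs above and below is less mechanical than the rest: the hand-rolled half-second tolerance check becomes assertAlmostEqual with a timedelta passed as delta. A minimal standalone sketch of that form, with made-up values rather than anything from the test suite:

import unittest
from datetime import datetime, timedelta

class DeltaSketch(unittest.TestCase):
    # Illustrative only: assertAlmostEqual(first, second, delta=...) passes
    # when abs(first - second) <= delta, so it works for datetimes too.
    def test_within_half_a_second(self):
        a = datetime(2013, 11, 17, 12, 0, 0)
        b = a + timedelta(microseconds=250000)
        self.assertAlmostEqual(a, b, delta=timedelta(seconds=0.5))

if __name__ == "__main__":
    unittest.main()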
@@ -479,9 +479,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for args in (3, 3, 3), (2, 4, 4), (2, 3, 5): t2 = timedelta(*args) # this is larger than t1 @@ -491,12 +491,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -633,7 +633,7 @@ self.assertTrue(timedelta(0, 1)) self.assertTrue(timedelta(0, 0, 1)) self.assertTrue(timedelta(microseconds=1)) - self.assertTrue(not timedelta(0)) + self.assertFalse(timedelta(0)) def test_subclass_timedelta(self): @@ -649,17 +649,17 @@ return round(sum) t1 = T(days=1) - self.assertTrue(type(t1) is T) + self.assertIs(type(t1), T) self.assertEqual(t1.as_hours(), 24) t2 = T(days=-1, seconds=-3600) - self.assertTrue(type(t2) is T) + self.assertIs(type(t2), T) self.assertEqual(t2.as_hours(), -25) t3 = t1 + t2 - self.assertTrue(type(t3) is timedelta) + self.assertIs(type(t3), timedelta) t4 = T.from_td(t3) - self.assertTrue(type(t4) is T) + self.assertIs(type(t4), T) self.assertEqual(t3.days, t4.days) self.assertEqual(t3.seconds, t4.seconds) self.assertEqual(t3.microseconds, t4.microseconds) @@ -1011,8 +1011,9 @@ # It worked or it didn't. If it didn't, assume it's reason #2, and # let the test pass if they're within half a second of each other. 
- self.assertTrue(today == todayagain or - abs(todayagain - today) < timedelta(seconds=0.5)) + if today != todayagain: + self.assertAlmostEqual(todayagain, today, + delta=timedelta(seconds=0.5)) def test_weekday(self): for i in range(7): @@ -1206,9 +1207,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for args in (3, 3, 3), (2, 4, 4), (2, 3, 5): t2 = self.theclass(*args) # this is larger than t1 @@ -1218,12 +1219,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -1694,9 +1695,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for i in range(len(args)): newargs = args[:] @@ -1708,12 +1709,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) # A helper for timestamp constructor tests. @@ -1850,7 +1851,7 @@ if abs(from_timestamp - from_now) <= tolerance: break # Else try again a few times. 
- self.assertTrue(abs(from_timestamp - from_now) <= tolerance) + self.assertLessEqual(abs(from_timestamp - from_now), tolerance) def test_strptime(self): string = '2004-12-01 13:02:47.197' @@ -2076,9 +2077,9 @@ self.assertEqual(t1, t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) for i in range(len(args)): newargs = args[:] @@ -2090,12 +2091,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) for badarg in OTHERSTUFF: self.assertEqual(t1 == badarg, False) @@ -2273,8 +2274,8 @@ self.assertTrue(cls(0, 1)) self.assertTrue(cls(0, 0, 1)) self.assertTrue(cls(0, 0, 0, 1)) - self.assertTrue(not cls(0)) - self.assertTrue(not cls()) + self.assertFalse(cls(0)) + self.assertFalse(cls()) def test_replace(self): cls = self.theclass @@ -2371,7 +2372,7 @@ def utcoffset(self, dt): pass b = BetterTry() t = cls(1, 1, 1, tzinfo=b) - self.assertTrue(t.tzinfo is b) + self.assertIs(t.tzinfo, b) def test_utc_offset_out_of_bounds(self): class Edgy(tzinfo): @@ -2410,9 +2411,9 @@ for t in (cls(1, 1, 1), cls(1, 1, 1, tzinfo=None), cls(1, 1, 1, tzinfo=C1())): - self.assertTrue(t.utcoffset() is None) - self.assertTrue(t.dst() is None) - self.assertTrue(t.tzname() is None) + self.assertIsNone(t.utcoffset()) + self.assertIsNone(t.dst()) + self.assertIsNone(t.tzname()) class C3(tzinfo): def utcoffset(self, dt): return timedelta(minutes=-1439) @@ -2507,7 +2508,7 @@ self.assertEqual(t.minute, 0) self.assertEqual(t.second, 0) self.assertEqual(t.microsecond, 0) - self.assertTrue(t.tzinfo is None) + self.assertIsNone(t.tzinfo) def test_zones(self): est = FixedOffset(-300, "EST", 1) @@ -2522,25 +2523,25 @@ self.assertEqual(t1.tzinfo, est) self.assertEqual(t2.tzinfo, utc) self.assertEqual(t3.tzinfo, met) - self.assertTrue(t4.tzinfo is None) + self.assertIsNone(t4.tzinfo) self.assertEqual(t5.tzinfo, utc) self.assertEqual(t1.utcoffset(), timedelta(minutes=-300)) self.assertEqual(t2.utcoffset(), timedelta(minutes=0)) self.assertEqual(t3.utcoffset(), timedelta(minutes=60)) - self.assertTrue(t4.utcoffset() is None) + self.assertIsNone(t4.utcoffset()) self.assertRaises(TypeError, t1.utcoffset, "no args") self.assertEqual(t1.tzname(), "EST") self.assertEqual(t2.tzname(), "UTC") self.assertEqual(t3.tzname(), "MET") - self.assertTrue(t4.tzname() is None) + self.assertIsNone(t4.tzname()) self.assertRaises(TypeError, t1.tzname, "no args") self.assertEqual(t1.dst(), timedelta(minutes=1)) self.assertEqual(t2.dst(), timedelta(minutes=-2)) self.assertEqual(t3.dst(), timedelta(minutes=3)) - self.assertTrue(t4.dst() is None) + self.assertIsNone(t4.dst()) self.assertRaises(TypeError, t1.dst, "no args") self.assertEqual(hash(t1), hash(t2)) @@ -2637,10 +2638,10 @@ self.assertTrue(t) t = cls(5, tzinfo=FixedOffset(300, "")) - self.assertTrue(not t) + self.assertFalse(t) t = cls(23, 59, tzinfo=FixedOffset(23*60 + 59, "")) - self.assertTrue(not t) + self.assertFalse(t) # Mostly ensuring this doesn't overflow internally. 
t = cls(0, tzinfo=FixedOffset(23*60 + 59, "")) @@ -2678,13 +2679,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. base = cls(1) @@ -2919,7 +2920,7 @@ tz55 = FixedOffset(-330, "west 5:30") timeaware = now.time().replace(tzinfo=tz55) nowaware = self.theclass.combine(now.date(), timeaware) - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) self.assertEqual(nowaware.timetz(), timeaware) # Can't mix aware and non-aware. @@ -2938,15 +2939,15 @@ # Adding a delta should preserve tzinfo. delta = timedelta(weeks=1, minutes=12, microseconds=5678) nowawareplus = nowaware + delta - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) nowawareplus2 = delta + nowaware - self.assertTrue(nowawareplus2.tzinfo is tz55) + self.assertIs(nowawareplus2.tzinfo, tz55) self.assertEqual(nowawareplus, nowawareplus2) # that - delta should be what we started with, and that - what we # started with should be delta. diff = nowawareplus - delta - self.assertTrue(diff.tzinfo is tz55) + self.assertIs(diff.tzinfo, tz55) self.assertEqual(nowaware, diff) self.assertRaises(TypeError, lambda: delta - nowawareplus) self.assertEqual(nowawareplus - nowaware, delta) @@ -2955,7 +2956,7 @@ tzr = FixedOffset(random.randrange(-1439, 1440), "randomtimezone") # Attach it to nowawareplus. nowawareplus = nowawareplus.replace(tzinfo=tzr) - self.assertTrue(nowawareplus.tzinfo is tzr) + self.assertIs(nowawareplus.tzinfo, tzr) # Make sure the difference takes the timezone adjustments into account. got = nowaware - nowawareplus # Expected: (nowaware base - nowaware offset) - @@ -2987,7 +2988,7 @@ off42 = FixedOffset(42, "42") another = meth(off42) again = meth(tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, 16) @@ -3005,7 +3006,7 @@ timezone(timedelta(hours=15, minutes=58), "weirdtz"),]: for dummy in range(3): now = datetime.now(weirdtz) - self.assertTrue(now.tzinfo is weirdtz) + self.assertIs(now.tzinfo, weirdtz) utcnow = datetime.utcnow().replace(tzinfo=utc) now2 = utcnow.astimezone(weirdtz) if abs(now - now2) < timedelta(seconds=30): @@ -3026,7 +3027,7 @@ off42 = FixedOffset(42, "42") another = meth(ts, off42) again = meth(ts, tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, ts, 16) @@ -3237,13 +3238,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. 
base = cls(2000, 2, 29) @@ -3256,18 +3257,18 @@ fm5h = FixedOffset(-timedelta(hours=5), "m300") dt = self.theclass.now(tz=f44m) - self.assertTrue(dt.tzinfo is f44m) + self.assertIs(dt.tzinfo, f44m) # Replacing with degenerate tzinfo raises an exception. self.assertRaises(ValueError, dt.astimezone, fnone) # Replacing with same tzinfo makes no change. x = dt.astimezone(dt.tzinfo) - self.assertTrue(x.tzinfo is f44m) + self.assertIs(x.tzinfo, f44m) self.assertEqual(x.date(), dt.date()) self.assertEqual(x.time(), dt.time()) # Replacing with different tzinfo does adjust. got = dt.astimezone(fm5h) - self.assertTrue(got.tzinfo is fm5h) + self.assertIs(got.tzinfo, fm5h) self.assertEqual(got.utcoffset(), timedelta(hours=-5)) expected = dt - dt.utcoffset() # in effect, convert to UTC expected += fm5h.utcoffset(dt) # and from there to local time @@ -3275,7 +3276,7 @@ self.assertEqual(got.date(), expected.date()) self.assertEqual(got.time(), expected.time()) self.assertEqual(got.timetz(), expected.timetz()) - self.assertTrue(got.tzinfo is expected.tzinfo) + self.assertIs(got.tzinfo, expected.tzinfo) self.assertEqual(got, expected) @support.run_with_tz('UTC') @@ -3736,8 +3737,8 @@ as_datetime = datetime.combine(as_date, time()) self.assertTrue(as_date != as_datetime) self.assertTrue(as_datetime != as_date) - self.assertTrue(not as_date == as_datetime) - self.assertTrue(not as_datetime == as_date) + self.assertFalse(as_date == as_datetime) + self.assertFalse(as_datetime == as_date) self.assertRaises(TypeError, lambda: as_date < as_datetime) self.assertRaises(TypeError, lambda: as_datetime < as_date) self.assertRaises(TypeError, lambda: as_date <= as_datetime) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:12 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjA1?= =?utf-8?q?=3A_Use_specific_asserts_in_datetime_tests?= Message-ID: <3dMrZ04LwSz7Lkn@mail.python.org> http://hg.python.org/cpython/rev/648be7ea8b1d changeset: 87194:648be7ea8b1d branch: 2.7 parent: 87191:6bd5e4325506 user: Serhiy Storchaka date: Sun Nov 17 13:03:21 2013 +0200 summary: Issue #19605: Use specific asserts in datetime tests files: Lib/test/test_datetime.py | 162 +++++++++++++------------- 1 files changed, 81 insertions(+), 81 deletions(-) diff --git a/Lib/test/test_datetime.py b/Lib/test/test_datetime.py --- a/Lib/test/test_datetime.py +++ b/Lib/test/test_datetime.py @@ -101,11 +101,11 @@ # carry no data), but they need to be picklable anyway else # concrete subclasses can't be pickled. orig = tzinfo.__new__(tzinfo) - self.assertTrue(type(orig) is tzinfo) + self.assertIs(type(orig), tzinfo) for pickler, unpickler, proto in pickle_choices: green = pickler.dumps(orig, proto) derived = unpickler.loads(green) - self.assertTrue(type(derived) is tzinfo) + self.assertIs(type(derived), tzinfo) def test_pickling_subclass(self): # Make sure we can pickle/unpickle an instance of a subclass. 
@@ -328,9 +328,9 @@ self.assertTrue(t1 == t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) self.assertEqual(cmp(t1, t2), 0) self.assertEqual(cmp(t2, t1), 0) @@ -342,12 +342,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) self.assertEqual(cmp(t1, t2), -1) self.assertEqual(cmp(t2, t1), 1) @@ -459,7 +459,7 @@ self.assertTrue(timedelta(0, 1)) self.assertTrue(timedelta(0, 0, 1)) self.assertTrue(timedelta(microseconds=1)) - self.assertTrue(not timedelta(0)) + self.assertFalse(timedelta(0)) def test_subclass_timedelta(self): @@ -475,17 +475,17 @@ return round(sum) t1 = T(days=1) - self.assertTrue(type(t1) is T) + self.assertIs(type(t1), T) self.assertEqual(t1.as_hours(), 24) t2 = T(days=-1, seconds=-3600) - self.assertTrue(type(t2) is T) + self.assertIs(type(t2), T) self.assertEqual(t2.as_hours(), -25) t3 = t1 + t2 - self.assertTrue(type(t3) is timedelta) + self.assertIs(type(t3), timedelta) t4 = T.from_td(t3) - self.assertTrue(type(t4) is T) + self.assertIs(type(t4), T) self.assertEqual(t3.days, t4.days) self.assertEqual(t3.seconds, t4.seconds) self.assertEqual(t3.microseconds, t4.microseconds) @@ -783,8 +783,9 @@ # It worked or it didn't. If it didn't, assume it's reason #2, and # let the test pass if they're within half a second of each other. 
- self.assertTrue(today == todayagain or - abs(todayagain - today) < timedelta(seconds=0.5)) + if today != todayagain: + self.assertAlmostEqual(todayagain, today, + delta=timedelta(seconds=0.5)) def test_weekday(self): for i in range(7): @@ -974,9 +975,9 @@ self.assertTrue(t1 == t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) self.assertEqual(cmp(t1, t2), 0) self.assertEqual(cmp(t2, t1), 0) @@ -988,12 +989,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) self.assertEqual(cmp(t1, t2), -1) self.assertEqual(cmp(t2, t1), 1) @@ -1444,9 +1445,9 @@ self.assertTrue(t1 == t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) self.assertEqual(cmp(t1, t2), 0) self.assertEqual(cmp(t2, t1), 0) @@ -1460,12 +1461,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) self.assertEqual(cmp(t1, t2), -1) self.assertEqual(cmp(t2, t1), 1) @@ -1541,7 +1542,7 @@ if abs(from_timestamp - from_now) <= tolerance: break # Else try again a few times. 
- self.assertTrue(abs(from_timestamp - from_now) <= tolerance) + self.assertLessEqual(abs(from_timestamp - from_now), tolerance) def test_strptime(self): import _strptime @@ -1727,9 +1728,9 @@ self.assertTrue(t1 == t2) self.assertTrue(t1 <= t2) self.assertTrue(t1 >= t2) - self.assertTrue(not t1 != t2) - self.assertTrue(not t1 < t2) - self.assertTrue(not t1 > t2) + self.assertFalse(t1 != t2) + self.assertFalse(t1 < t2) + self.assertFalse(t1 > t2) self.assertEqual(cmp(t1, t2), 0) self.assertEqual(cmp(t2, t1), 0) @@ -1743,12 +1744,12 @@ self.assertTrue(t2 >= t1) self.assertTrue(t1 != t2) self.assertTrue(t2 != t1) - self.assertTrue(not t1 == t2) - self.assertTrue(not t2 == t1) - self.assertTrue(not t1 > t2) - self.assertTrue(not t2 < t1) - self.assertTrue(not t1 >= t2) - self.assertTrue(not t2 <= t1) + self.assertFalse(t1 == t2) + self.assertFalse(t2 == t1) + self.assertFalse(t1 > t2) + self.assertFalse(t2 < t1) + self.assertFalse(t1 >= t2) + self.assertFalse(t2 <= t1) self.assertEqual(cmp(t1, t2), -1) self.assertEqual(cmp(t2, t1), 1) @@ -1928,8 +1929,8 @@ self.assertTrue(cls(0, 1)) self.assertTrue(cls(0, 0, 1)) self.assertTrue(cls(0, 0, 0, 1)) - self.assertTrue(not cls(0)) - self.assertTrue(not cls()) + self.assertFalse(cls(0)) + self.assertFalse(cls()) def test_replace(self): cls = self.theclass @@ -2026,7 +2027,7 @@ def utcoffset(self, dt): pass b = BetterTry() t = cls(1, 1, 1, tzinfo=b) - self.assertTrue(t.tzinfo is b) + self.assertIs(t.tzinfo, b) def test_utc_offset_out_of_bounds(self): class Edgy(tzinfo): @@ -2065,9 +2066,9 @@ for t in (cls(1, 1, 1), cls(1, 1, 1, tzinfo=None), cls(1, 1, 1, tzinfo=C1())): - self.assertTrue(t.utcoffset() is None) - self.assertTrue(t.dst() is None) - self.assertTrue(t.tzname() is None) + self.assertIsNone(t.utcoffset()) + self.assertIsNone(t.dst()) + self.assertIsNone(t.tzname()) class C3(tzinfo): def utcoffset(self, dt): return timedelta(minutes=-1439) @@ -2161,7 +2162,7 @@ self.assertEqual(t.minute, 0) self.assertEqual(t.second, 0) self.assertEqual(t.microsecond, 0) - self.assertTrue(t.tzinfo is None) + self.assertIsNone(t.tzinfo) def test_zones(self): est = FixedOffset(-300, "EST", 1) @@ -2176,25 +2177,25 @@ self.assertEqual(t1.tzinfo, est) self.assertEqual(t2.tzinfo, utc) self.assertEqual(t3.tzinfo, met) - self.assertTrue(t4.tzinfo is None) + self.assertIsNone(t4.tzinfo) self.assertEqual(t5.tzinfo, utc) self.assertEqual(t1.utcoffset(), timedelta(minutes=-300)) self.assertEqual(t2.utcoffset(), timedelta(minutes=0)) self.assertEqual(t3.utcoffset(), timedelta(minutes=60)) - self.assertTrue(t4.utcoffset() is None) + self.assertIsNone(t4.utcoffset()) self.assertRaises(TypeError, t1.utcoffset, "no args") self.assertEqual(t1.tzname(), "EST") self.assertEqual(t2.tzname(), "UTC") self.assertEqual(t3.tzname(), "MET") - self.assertTrue(t4.tzname() is None) + self.assertIsNone(t4.tzname()) self.assertRaises(TypeError, t1.tzname, "no args") self.assertEqual(t1.dst(), timedelta(minutes=1)) self.assertEqual(t2.dst(), timedelta(minutes=-2)) self.assertEqual(t3.dst(), timedelta(minutes=3)) - self.assertTrue(t4.dst() is None) + self.assertIsNone(t4.dst()) self.assertRaises(TypeError, t1.dst, "no args") self.assertEqual(hash(t1), hash(t2)) @@ -2285,10 +2286,10 @@ self.assertTrue(t) t = cls(5, tzinfo=FixedOffset(300, "")) - self.assertTrue(not t) + self.assertFalse(t) t = cls(23, 59, tzinfo=FixedOffset(23*60 + 59, "")) - self.assertTrue(not t) + self.assertFalse(t) # Mostly ensuring this doesn't overflow internally. 
t = cls(0, tzinfo=FixedOffset(23*60 + 59, "")) @@ -2326,13 +2327,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. base = cls(1) @@ -2567,7 +2568,7 @@ tz55 = FixedOffset(-330, "west 5:30") timeaware = now.time().replace(tzinfo=tz55) nowaware = self.theclass.combine(now.date(), timeaware) - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) self.assertEqual(nowaware.timetz(), timeaware) # Can't mix aware and non-aware. @@ -2586,15 +2587,15 @@ # Adding a delta should preserve tzinfo. delta = timedelta(weeks=1, minutes=12, microseconds=5678) nowawareplus = nowaware + delta - self.assertTrue(nowaware.tzinfo is tz55) + self.assertIs(nowaware.tzinfo, tz55) nowawareplus2 = delta + nowaware - self.assertTrue(nowawareplus2.tzinfo is tz55) + self.assertIs(nowawareplus2.tzinfo, tz55) self.assertEqual(nowawareplus, nowawareplus2) # that - delta should be what we started with, and that - what we # started with should be delta. diff = nowawareplus - delta - self.assertTrue(diff.tzinfo is tz55) + self.assertIs(diff.tzinfo, tz55) self.assertEqual(nowaware, diff) self.assertRaises(TypeError, lambda: delta - nowawareplus) self.assertEqual(nowawareplus - nowaware, delta) @@ -2603,7 +2604,7 @@ tzr = FixedOffset(random.randrange(-1439, 1440), "randomtimezone") # Attach it to nowawareplus. nowawareplus = nowawareplus.replace(tzinfo=tzr) - self.assertTrue(nowawareplus.tzinfo is tzr) + self.assertIs(nowawareplus.tzinfo, tzr) # Make sure the difference takes the timezone adjustments into account. got = nowaware - nowawareplus # Expected: (nowaware base - nowaware offset) - @@ -2630,7 +2631,7 @@ off42 = FixedOffset(42, "42") another = meth(off42) again = meth(tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, 16) @@ -2647,7 +2648,7 @@ utc = FixedOffset(0, "utc", 0) for dummy in range(3): now = datetime.now(weirdtz) - self.assertTrue(now.tzinfo is weirdtz) + self.assertIs(now.tzinfo, weirdtz) utcnow = datetime.utcnow().replace(tzinfo=utc) now2 = utcnow.astimezone(weirdtz) if abs(now - now2) < timedelta(seconds=30): @@ -2668,7 +2669,7 @@ off42 = FixedOffset(42, "42") another = meth(ts, off42) again = meth(ts, tz=off42) - self.assertTrue(another.tzinfo is again.tzinfo) + self.assertIs(another.tzinfo, again.tzinfo) self.assertEqual(another.utcoffset(), timedelta(minutes=42)) # Bad argument with and w/o naming the keyword. self.assertRaises(TypeError, meth, ts, 16) @@ -2862,13 +2863,13 @@ # Ensure we can get rid of a tzinfo. self.assertEqual(base.tzname(), "+100") base2 = base.replace(tzinfo=None) - self.assertTrue(base2.tzinfo is None) - self.assertTrue(base2.tzname() is None) + self.assertIsNone(base2.tzinfo) + self.assertIsNone(base2.tzname()) # Ensure we can add one. base3 = base2.replace(tzinfo=z100) self.assertEqual(base, base3) - self.assertTrue(base.tzinfo is base3.tzinfo) + self.assertIs(base.tzinfo, base3.tzinfo) # Out of bounds. 
base = cls(2000, 2, 29) @@ -2881,20 +2882,20 @@ fm5h = FixedOffset(-timedelta(hours=5), "m300") dt = self.theclass.now(tz=f44m) - self.assertTrue(dt.tzinfo is f44m) + self.assertIs(dt.tzinfo, f44m) # Replacing with degenerate tzinfo raises an exception. self.assertRaises(ValueError, dt.astimezone, fnone) # Ditto with None tz. self.assertRaises(TypeError, dt.astimezone, None) # Replacing with same tzinfo makes no change. x = dt.astimezone(dt.tzinfo) - self.assertTrue(x.tzinfo is f44m) + self.assertIs(x.tzinfo, f44m) self.assertEqual(x.date(), dt.date()) self.assertEqual(x.time(), dt.time()) # Replacing with different tzinfo does adjust. got = dt.astimezone(fm5h) - self.assertTrue(got.tzinfo is fm5h) + self.assertIs(got.tzinfo, fm5h) self.assertEqual(got.utcoffset(), timedelta(hours=-5)) expected = dt - dt.utcoffset() # in effect, convert to UTC expected += fm5h.utcoffset(dt) # and from there to local time @@ -2902,7 +2903,7 @@ self.assertEqual(got.date(), expected.date()) self.assertEqual(got.time(), expected.time()) self.assertEqual(got.timetz(), expected.timetz()) - self.assertTrue(got.tzinfo is expected.tzinfo) + self.assertIs(got.tzinfo, expected.tzinfo) self.assertEqual(got, expected) def test_aware_subtract(self): @@ -3330,8 +3331,8 @@ as_datetime = datetime.combine(as_date, time()) self.assertTrue(as_date != as_datetime) self.assertTrue(as_datetime != as_date) - self.assertTrue(not as_date == as_datetime) - self.assertTrue(not as_datetime == as_date) + self.assertFalse(as_date == as_datetime) + self.assertFalse(as_datetime == as_date) self.assertRaises(TypeError, lambda: as_date < as_datetime) self.assertRaises(TypeError, lambda: as_datetime < as_date) self.assertRaises(TypeError, lambda: as_date <= as_datetime) @@ -3345,8 +3346,7 @@ # projection if use of a date method is forced. self.assertTrue(as_date.__eq__(as_datetime)) different_day = (as_date.day + 1) % 20 + 1 - self.assertTrue(not as_date.__eq__(as_datetime.replace(day= - different_day))) + self.assertFalse(as_date.__eq__(as_datetime.replace(day=different_day))) # And date should compare with other subclasses of date. If a # subclass wants to stop this, it's up to the subclass to do so. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:14 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjA3?= =?utf-8?q?=3A_Use_specific_asserts_in_weakref_tests=2E?= Message-ID: <3dMrZ20DGGz7LrH@mail.python.org> http://hg.python.org/cpython/rev/d187eae08b8a changeset: 87195:d187eae08b8a branch: 3.3 parent: 87192:bc67e8d39164 user: Serhiy Storchaka date: Sun Nov 17 13:20:09 2013 +0200 summary: Issue #19607: Use specific asserts in weakref tests. 
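As in the #19605 datetime changesets above, the rewrite here follows a single pattern: truth-value asserts wrapping an identity or equality expression become the dedicated unittest methods, which report both operands on failure instead of the bare "False is not true". A minimal before/after sketch (a hypothetical test case, not taken from these diffs):

import unittest

class SpecificAssertsSketch(unittest.TestCase):
    # Illustrative only; both styles pass, but the specific asserts give
    # far more useful failure messages.
    def test_rewrite_pattern(self):
        a = object()
        b = a
        self.assertTrue(a is b)           # old style
        self.assertIs(a, b)               # new style
        self.assertTrue(not (a is None))  # old style
        self.assertIsNotNone(a)           # new style
        self.assertTrue(1 + 1 == 2)       # old style
        self.assertEqual(1 + 1, 2)        # new style

if __name__ == "__main__":
    unittest.main()

All of the specific methods used in these diffs (assertIs, assertIsNone, assertFalse, assertEqual, assertLessEqual) already exist in the 2.7 and 3.x unittest, so the rewrites are behaviour-neutral apart from clearer failure output.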
files: Lib/test/test_weakref.py | 127 +++++++++++++------------- 1 files changed, 62 insertions(+), 65 deletions(-) diff --git a/Lib/test/test_weakref.py b/Lib/test/test_weakref.py --- a/Lib/test/test_weakref.py +++ b/Lib/test/test_weakref.py @@ -88,11 +88,9 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del o - self.assertTrue(ref1() is None, - "expected reference to be invalidated") - self.assertTrue(ref2() is None, - "expected reference to be invalidated") - self.assertTrue(self.cbcalled == 2, + self.assertIsNone(ref1(), "expected reference to be invalidated") + self.assertIsNone(ref2(), "expected reference to be invalidated") + self.assertEqual(self.cbcalled, 2, "callback not called the right number of times") def test_multiple_selfref_callbacks(self): @@ -131,10 +129,10 @@ def check_basic_ref(self, factory): o = factory() ref = weakref.ref(o) - self.assertTrue(ref() is not None, + self.assertIsNotNone(ref(), "weak reference to live object should be live") o2 = ref() - self.assertTrue(o is o2, + self.assertIs(o, o2, "() should return original object if live") def check_basic_callback(self, factory): @@ -142,9 +140,9 @@ o = factory() ref = weakref.ref(o, self.callback) del o - self.assertTrue(self.cbcalled == 1, + self.assertEqual(self.cbcalled, 1, "callback did not properly set 'cbcalled'") - self.assertTrue(ref() is None, + self.assertIsNone(ref(), "ref2 should be dead after deleting object reference") def test_ref_reuse(self): @@ -154,19 +152,19 @@ # between these two; it should make no difference proxy = weakref.proxy(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") o = C() proxy = weakref.proxy(o) ref1 = weakref.ref(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "wrong weak ref count for object") del proxy - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong weak ref count for object after deleting proxy") def test_proxy_reuse(self): @@ -174,7 +172,7 @@ proxy1 = weakref.proxy(o) ref = weakref.ref(o) proxy2 = weakref.proxy(o) - self.assertTrue(proxy1 is proxy2, + self.assertIs(proxy1, proxy2, "proxy object w/out callback should have been re-used") def test_basic_proxy(self): @@ -254,19 +252,19 @@ o = Object(1) p1 = makeref(o, None) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "both callbacks were None in the C API") + self.assertIs(p1, p2, "both callbacks were None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "callbacks were NULL, None in the C API") + self.assertIs(p1, p2, "callbacks were NULL, None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o) - self.assertTrue(p1 is p2, "both callbacks were NULL in the C API") + self.assertIs(p1, p2, "both callbacks were NULL in the C API") del p1, p2 p1 = makeref(o, None) p2 = makeref(o) - self.assertTrue(p1 is p2, "callbacks were None, NULL in the C API") + self.assertIs(p1, p2, "callbacks were None, NULL in the C API") def test_callable_proxy(self): o = Callable() @@ -274,13 +272,13 @@ self.check_proxy(o, ref1) - self.assertTrue(type(ref1) is weakref.CallableProxyType, + self.assertIs(type(ref1), weakref.CallableProxyType, "proxy is not of callable type") ref1('twinkies!') - self.assertTrue(o.bar == 'twinkies!', + 
self.assertEqual(o.bar, 'twinkies!', "call through proxy not passed through to original") ref1(x='Splat.') - self.assertTrue(o.bar == 'Splat.', + self.assertEqual(o.bar, 'Splat.', "call through proxy not passed through to original") # expect due to too few args @@ -291,24 +289,23 @@ def check_proxy(self, o, proxy): o.foo = 1 - self.assertTrue(proxy.foo == 1, + self.assertEqual(proxy.foo, 1, "proxy does not reflect attribute addition") o.foo = 2 - self.assertTrue(proxy.foo == 2, + self.assertEqual(proxy.foo, 2, "proxy does not reflect attribute modification") del o.foo - self.assertTrue(not hasattr(proxy, 'foo'), + self.assertFalse(hasattr(proxy, 'foo'), "proxy does not reflect attribute removal") proxy.foo = 1 - self.assertTrue(o.foo == 1, + self.assertEqual(o.foo, 1, "object does not reflect attribute addition via proxy") proxy.foo = 2 - self.assertTrue( - o.foo == 2, + self.assertEqual(o.foo, 2, "object does not reflect attribute modification via proxy") del proxy.foo - self.assertTrue(not hasattr(o, 'foo'), + self.assertFalse(hasattr(o, 'foo'), "object does not reflect attribute removal via proxy") def test_proxy_deletion(self): @@ -332,21 +329,21 @@ o = C() ref1 = weakref.ref(o) ref2 = weakref.ref(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "got wrong number of weak reference objects") proxy1 = weakref.proxy(o) proxy2 = weakref.proxy(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 4, + self.assertEqual(weakref.getweakrefcount(o), 4, "got wrong number of weak reference objects") del ref1, ref2, proxy1, proxy2 - self.assertTrue(weakref.getweakrefcount(o) == 0, + self.assertEqual(weakref.getweakrefcount(o), 0, "weak reference objects not unlinked from" " referent when discarded.") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefcount(1) == 0, + self.assertEqual(weakref.getweakrefcount(1), 0, "got wrong number of weak reference objects for int") def test_getweakrefs(self): @@ -354,22 +351,22 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref1 - self.assertTrue(weakref.getweakrefs(o) == [ref2], + self.assertEqual(weakref.getweakrefs(o), [ref2], "list of refs does not match") o = C() ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref2 - self.assertTrue(weakref.getweakrefs(o) == [ref1], + self.assertEqual(weakref.getweakrefs(o), [ref1], "list of refs does not match") del ref1 - self.assertTrue(weakref.getweakrefs(o) == [], + self.assertEqual(weakref.getweakrefs(o), [], "list of refs not cleared") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefs(1) == [], + self.assertEqual(weakref.getweakrefs(1), [], "list of refs does not match for int") def test_newstyle_number_ops(self): @@ -377,8 +374,8 @@ pass f = F(2.0) p = weakref.proxy(f) - self.assertTrue(p + 1.0 == 3.0) - self.assertTrue(1.0 + p == 3.0) # this used to SEGV + self.assertEqual(p + 1.0, 3.0) + self.assertEqual(1.0 + p, 3.0) # this used to SEGV def test_callbacks_protected(self): # Callbacks protected from already-set exceptions? @@ -631,7 +628,7 @@ c.wr = weakref.ref(d, callback) # this won't trigger d.wr = weakref.ref(callback, d.cb) # ditto external_wr = weakref.ref(callback, safe_callback) # but this will - self.assertTrue(external_wr() is callback) + self.assertIs(external_wr(), callback) # The weakrefs attached to c and d should get cleared, so that # C.cb is never called. 
But external_wr isn't part of the cyclic @@ -810,11 +807,11 @@ return super().__call__() o = Object("foo") mr = MyRef(o, value=24) - self.assertTrue(mr() is o) + self.assertIs(mr(), o) self.assertTrue(mr.called) self.assertEqual(mr.value, 24) del o - self.assertTrue(mr() is None) + self.assertIsNone(mr()) self.assertTrue(mr.called) def test_subclass_refs_dont_replace_standard_refs(self): @@ -823,14 +820,14 @@ o = Object(42) r1 = MyRef(o) r2 = weakref.ref(o) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) self.assertEqual(weakref.getweakrefs(o), [r2, r1]) self.assertEqual(weakref.getweakrefcount(o), 2) r3 = MyRef(o) self.assertEqual(weakref.getweakrefcount(o), 3) refs = weakref.getweakrefs(o) self.assertEqual(len(refs), 3) - self.assertTrue(r2 is refs[0]) + self.assertIs(r2, refs[0]) self.assertIn(r1, refs[1:]) self.assertIn(r3, refs[1:]) @@ -840,7 +837,7 @@ o = Object(42) r1 = MyRef(o, id) r2 = MyRef(o, str) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) refs = weakref.getweakrefs(o) self.assertIn(r1, refs) self.assertIn(r2, refs) @@ -968,7 +965,7 @@ dict, objects = self.make_weak_valued_dict() for o in objects: self.assertEqual(weakref.getweakrefcount(o), 1) - self.assertTrue(o is dict[o.arg], + self.assertIs(o, dict[o.arg], "wrong object returned by weak dict!") items1 = list(dict.items()) items2 = list(dict.copy().items()) @@ -997,9 +994,9 @@ # dict, objects = self.make_weak_keyed_dict() for o in objects: - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong number of weak references to %r!" % o) - self.assertTrue(o.arg is dict[o], + self.assertIs(o.arg, dict[o], "wrong object returned by weak dict!") items1 = dict.items() items2 = dict.copy().items() @@ -1008,10 +1005,10 @@ del items1, items2 self.assertEqual(len(dict), self.COUNT) del objects[0] - self.assertTrue(len(dict) == (self.COUNT - 1), + self.assertEqual(len(dict), (self.COUNT - 1), "deleting object did not cause dictionary update") del objects, o - self.assertTrue(len(dict) == 0, + self.assertEqual(len(dict), 0, "deleting the keys did not clear the dictionary") o = Object(42) dict[o] = "What is the meaning of the universe?" 
@@ -1221,15 +1218,15 @@ k, v = weakdict.popitem() self.assertEqual(len(weakdict), 1) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) k, v = weakdict.popitem() self.assertEqual(len(weakdict), 0) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) def test_weak_valued_dict_popitem(self): self.check_popitem(weakref.WeakValueDictionary, @@ -1240,21 +1237,21 @@ C(), "value 1", C(), "value 2") def check_setdefault(self, klass, key, value1, value2): - self.assertTrue(value1 is not value2, + self.assertIsNot(value1, value2, "invalid test" " -- value parameters must be distinct objects") weakdict = klass() o = weakdict.setdefault(key, value1) - self.assertTrue(o is value1) + self.assertIs(o, value1) self.assertIn(key, weakdict) - self.assertTrue(weakdict.get(key) is value1) - self.assertTrue(weakdict[key] is value1) + self.assertIs(weakdict.get(key), value1) + self.assertIs(weakdict[key], value1) o = weakdict.setdefault(key, value2) - self.assertTrue(o is value1) + self.assertIs(o, value1) self.assertIn(key, weakdict) - self.assertTrue(weakdict.get(key) is value1) - self.assertTrue(weakdict[key] is value1) + self.assertIs(weakdict.get(key), value1) + self.assertIs(weakdict[key], value1) def test_weak_valued_dict_setdefault(self): self.check_setdefault(weakref.WeakValueDictionary, @@ -1275,13 +1272,13 @@ for k in weakdict.keys(): self.assertIn(k, dict, "mysterious new key appeared in weak dict") v = dict.get(k) - self.assertTrue(v is weakdict[k]) - self.assertTrue(v is weakdict.get(k)) + self.assertIs(v, weakdict[k]) + self.assertIs(v, weakdict.get(k)) for k in dict.keys(): self.assertIn(k, weakdict, "original key disappeared in weak dict") v = dict[k] - self.assertTrue(v is weakdict[k]) - self.assertTrue(v is weakdict.get(k)) + self.assertIs(v, weakdict[k]) + self.assertIs(v, weakdict.get(k)) def test_weak_valued_dict_update(self): self.check_update(weakref.WeakValueDictionary, @@ -1311,7 +1308,7 @@ self.assertEqual(len(d), 2) del d['something'] self.assertEqual(len(d), 1) - self.assertTrue(list(d.items()) == [('something else', o2)]) + self.assertEqual(list(d.items()), [('something else', o2)]) def test_weak_keyed_bad_delitem(self): d = weakref.WeakKeyDictionary() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:15 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319607=3A_Use_specific_asserts_in_weakref_tests?= =?utf-8?q?=2E?= Message-ID: <3dMrZ338ynz7LlK@mail.python.org> http://hg.python.org/cpython/rev/8deab371850b changeset: 87196:8deab371850b parent: 87193:0b7a519cb58f parent: 87195:d187eae08b8a user: Serhiy Storchaka date: Sun Nov 17 13:20:39 2013 +0200 summary: Issue #19607: Use specific asserts in weakref tests. 
files: Lib/test/test_weakref.py | 127 +++++++++++++------------- 1 files changed, 62 insertions(+), 65 deletions(-) diff --git a/Lib/test/test_weakref.py b/Lib/test/test_weakref.py --- a/Lib/test/test_weakref.py +++ b/Lib/test/test_weakref.py @@ -97,11 +97,9 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del o - self.assertTrue(ref1() is None, - "expected reference to be invalidated") - self.assertTrue(ref2() is None, - "expected reference to be invalidated") - self.assertTrue(self.cbcalled == 2, + self.assertIsNone(ref1(), "expected reference to be invalidated") + self.assertIsNone(ref2(), "expected reference to be invalidated") + self.assertEqual(self.cbcalled, 2, "callback not called the right number of times") def test_multiple_selfref_callbacks(self): @@ -140,10 +138,10 @@ def check_basic_ref(self, factory): o = factory() ref = weakref.ref(o) - self.assertTrue(ref() is not None, + self.assertIsNotNone(ref(), "weak reference to live object should be live") o2 = ref() - self.assertTrue(o is o2, + self.assertIs(o, o2, "() should return original object if live") def check_basic_callback(self, factory): @@ -151,9 +149,9 @@ o = factory() ref = weakref.ref(o, self.callback) del o - self.assertTrue(self.cbcalled == 1, + self.assertEqual(self.cbcalled, 1, "callback did not properly set 'cbcalled'") - self.assertTrue(ref() is None, + self.assertIsNone(ref(), "ref2 should be dead after deleting object reference") def test_ref_reuse(self): @@ -163,19 +161,19 @@ # between these two; it should make no difference proxy = weakref.proxy(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") o = C() proxy = weakref.proxy(o) ref1 = weakref.ref(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "wrong weak ref count for object") del proxy - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong weak ref count for object after deleting proxy") def test_proxy_reuse(self): @@ -183,7 +181,7 @@ proxy1 = weakref.proxy(o) ref = weakref.ref(o) proxy2 = weakref.proxy(o) - self.assertTrue(proxy1 is proxy2, + self.assertIs(proxy1, proxy2, "proxy object w/out callback should have been re-used") def test_basic_proxy(self): @@ -263,19 +261,19 @@ o = Object(1) p1 = makeref(o, None) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "both callbacks were None in the C API") + self.assertIs(p1, p2, "both callbacks were None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "callbacks were NULL, None in the C API") + self.assertIs(p1, p2, "callbacks were NULL, None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o) - self.assertTrue(p1 is p2, "both callbacks were NULL in the C API") + self.assertIs(p1, p2, "both callbacks were NULL in the C API") del p1, p2 p1 = makeref(o, None) p2 = makeref(o) - self.assertTrue(p1 is p2, "callbacks were None, NULL in the C API") + self.assertIs(p1, p2, "callbacks were None, NULL in the C API") def test_callable_proxy(self): o = Callable() @@ -283,13 +281,13 @@ self.check_proxy(o, ref1) - self.assertTrue(type(ref1) is weakref.CallableProxyType, + self.assertIs(type(ref1), weakref.CallableProxyType, "proxy is not of callable type") ref1('twinkies!') - self.assertTrue(o.bar == 'twinkies!', + 
self.assertEqual(o.bar, 'twinkies!', "call through proxy not passed through to original") ref1(x='Splat.') - self.assertTrue(o.bar == 'Splat.', + self.assertEqual(o.bar, 'Splat.', "call through proxy not passed through to original") # expect due to too few args @@ -300,24 +298,23 @@ def check_proxy(self, o, proxy): o.foo = 1 - self.assertTrue(proxy.foo == 1, + self.assertEqual(proxy.foo, 1, "proxy does not reflect attribute addition") o.foo = 2 - self.assertTrue(proxy.foo == 2, + self.assertEqual(proxy.foo, 2, "proxy does not reflect attribute modification") del o.foo - self.assertTrue(not hasattr(proxy, 'foo'), + self.assertFalse(hasattr(proxy, 'foo'), "proxy does not reflect attribute removal") proxy.foo = 1 - self.assertTrue(o.foo == 1, + self.assertEqual(o.foo, 1, "object does not reflect attribute addition via proxy") proxy.foo = 2 - self.assertTrue( - o.foo == 2, + self.assertEqual(o.foo, 2, "object does not reflect attribute modification via proxy") del proxy.foo - self.assertTrue(not hasattr(o, 'foo'), + self.assertFalse(hasattr(o, 'foo'), "object does not reflect attribute removal via proxy") def test_proxy_deletion(self): @@ -341,21 +338,21 @@ o = C() ref1 = weakref.ref(o) ref2 = weakref.ref(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "got wrong number of weak reference objects") proxy1 = weakref.proxy(o) proxy2 = weakref.proxy(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 4, + self.assertEqual(weakref.getweakrefcount(o), 4, "got wrong number of weak reference objects") del ref1, ref2, proxy1, proxy2 - self.assertTrue(weakref.getweakrefcount(o) == 0, + self.assertEqual(weakref.getweakrefcount(o), 0, "weak reference objects not unlinked from" " referent when discarded.") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefcount(1) == 0, + self.assertEqual(weakref.getweakrefcount(1), 0, "got wrong number of weak reference objects for int") def test_getweakrefs(self): @@ -363,22 +360,22 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref1 - self.assertTrue(weakref.getweakrefs(o) == [ref2], + self.assertEqual(weakref.getweakrefs(o), [ref2], "list of refs does not match") o = C() ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref2 - self.assertTrue(weakref.getweakrefs(o) == [ref1], + self.assertEqual(weakref.getweakrefs(o), [ref1], "list of refs does not match") del ref1 - self.assertTrue(weakref.getweakrefs(o) == [], + self.assertEqual(weakref.getweakrefs(o), [], "list of refs not cleared") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefs(1) == [], + self.assertEqual(weakref.getweakrefs(1), [], "list of refs does not match for int") def test_newstyle_number_ops(self): @@ -386,8 +383,8 @@ pass f = F(2.0) p = weakref.proxy(f) - self.assertTrue(p + 1.0 == 3.0) - self.assertTrue(1.0 + p == 3.0) # this used to SEGV + self.assertEqual(p + 1.0, 3.0) + self.assertEqual(1.0 + p, 3.0) # this used to SEGV def test_callbacks_protected(self): # Callbacks protected from already-set exceptions? @@ -640,7 +637,7 @@ c.wr = weakref.ref(d, callback) # this won't trigger d.wr = weakref.ref(callback, d.cb) # ditto external_wr = weakref.ref(callback, safe_callback) # but this will - self.assertTrue(external_wr() is callback) + self.assertIs(external_wr(), callback) # The weakrefs attached to c and d should get cleared, so that # C.cb is never called. 
But external_wr isn't part of the cyclic @@ -843,11 +840,11 @@ return super().__call__() o = Object("foo") mr = MyRef(o, value=24) - self.assertTrue(mr() is o) + self.assertIs(mr(), o) self.assertTrue(mr.called) self.assertEqual(mr.value, 24) del o - self.assertTrue(mr() is None) + self.assertIsNone(mr()) self.assertTrue(mr.called) def test_subclass_refs_dont_replace_standard_refs(self): @@ -856,14 +853,14 @@ o = Object(42) r1 = MyRef(o) r2 = weakref.ref(o) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) self.assertEqual(weakref.getweakrefs(o), [r2, r1]) self.assertEqual(weakref.getweakrefcount(o), 2) r3 = MyRef(o) self.assertEqual(weakref.getweakrefcount(o), 3) refs = weakref.getweakrefs(o) self.assertEqual(len(refs), 3) - self.assertTrue(r2 is refs[0]) + self.assertIs(r2, refs[0]) self.assertIn(r1, refs[1:]) self.assertIn(r3, refs[1:]) @@ -873,7 +870,7 @@ o = Object(42) r1 = MyRef(o, id) r2 = MyRef(o, str) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) refs = weakref.getweakrefs(o) self.assertIn(r1, refs) self.assertIn(r2, refs) @@ -1135,7 +1132,7 @@ dict, objects = self.make_weak_valued_dict() for o in objects: self.assertEqual(weakref.getweakrefcount(o), 1) - self.assertTrue(o is dict[o.arg], + self.assertIs(o, dict[o.arg], "wrong object returned by weak dict!") items1 = list(dict.items()) items2 = list(dict.copy().items()) @@ -1164,9 +1161,9 @@ # dict, objects = self.make_weak_keyed_dict() for o in objects: - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong number of weak references to %r!" % o) - self.assertTrue(o.arg is dict[o], + self.assertIs(o.arg, dict[o], "wrong object returned by weak dict!") items1 = dict.items() items2 = dict.copy().items() @@ -1175,10 +1172,10 @@ del items1, items2 self.assertEqual(len(dict), self.COUNT) del objects[0] - self.assertTrue(len(dict) == (self.COUNT - 1), + self.assertEqual(len(dict), (self.COUNT - 1), "deleting object did not cause dictionary update") del objects, o - self.assertTrue(len(dict) == 0, + self.assertEqual(len(dict), 0, "deleting the keys did not clear the dictionary") o = Object(42) dict[o] = "What is the meaning of the universe?" 
@@ -1388,15 +1385,15 @@ k, v = weakdict.popitem() self.assertEqual(len(weakdict), 1) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) k, v = weakdict.popitem() self.assertEqual(len(weakdict), 0) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) def test_weak_valued_dict_popitem(self): self.check_popitem(weakref.WeakValueDictionary, @@ -1407,21 +1404,21 @@ C(), "value 1", C(), "value 2") def check_setdefault(self, klass, key, value1, value2): - self.assertTrue(value1 is not value2, + self.assertIsNot(value1, value2, "invalid test" " -- value parameters must be distinct objects") weakdict = klass() o = weakdict.setdefault(key, value1) - self.assertTrue(o is value1) + self.assertIs(o, value1) self.assertIn(key, weakdict) - self.assertTrue(weakdict.get(key) is value1) - self.assertTrue(weakdict[key] is value1) + self.assertIs(weakdict.get(key), value1) + self.assertIs(weakdict[key], value1) o = weakdict.setdefault(key, value2) - self.assertTrue(o is value1) + self.assertIs(o, value1) self.assertIn(key, weakdict) - self.assertTrue(weakdict.get(key) is value1) - self.assertTrue(weakdict[key] is value1) + self.assertIs(weakdict.get(key), value1) + self.assertIs(weakdict[key], value1) def test_weak_valued_dict_setdefault(self): self.check_setdefault(weakref.WeakValueDictionary, @@ -1442,13 +1439,13 @@ for k in weakdict.keys(): self.assertIn(k, dict, "mysterious new key appeared in weak dict") v = dict.get(k) - self.assertTrue(v is weakdict[k]) - self.assertTrue(v is weakdict.get(k)) + self.assertIs(v, weakdict[k]) + self.assertIs(v, weakdict.get(k)) for k in dict.keys(): self.assertIn(k, weakdict, "original key disappeared in weak dict") v = dict[k] - self.assertTrue(v is weakdict[k]) - self.assertTrue(v is weakdict.get(k)) + self.assertIs(v, weakdict[k]) + self.assertIs(v, weakdict.get(k)) def test_weak_valued_dict_update(self): self.check_update(weakref.WeakValueDictionary, @@ -1478,7 +1475,7 @@ self.assertEqual(len(d), 2) del d['something'] self.assertEqual(len(d), 1) - self.assertTrue(list(d.items()) == [('something else', o2)]) + self.assertEqual(list(d.items()), [('something else', o2)]) def test_weak_keyed_bad_delitem(self): d = weakref.WeakKeyDictionary() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:25:16 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:25:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjA3?= =?utf-8?q?=3A_Use_specific_asserts_in_weakref_tests=2E?= Message-ID: <3dMrZ471xxz7Lny@mail.python.org> http://hg.python.org/cpython/rev/3d2cfeea8950 changeset: 87197:3d2cfeea8950 branch: 2.7 parent: 87194:648be7ea8b1d user: Serhiy Storchaka date: Sun Nov 17 13:20:50 2013 +0200 summary: Issue #19607: Use specific asserts in weakref tests. 
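The 2.7 backport below makes the same substitutions. The practical gain is in the failure output: assertTrue only knows it received a false value, while the specific asserts format both operands into the message. A small hypothetical demonstration (not part of the patch), runnable on either 2.7 or 3.x:

    import unittest

    class _Demo(unittest.TestCase):
        def runTest(self):
            pass

    demo = _Demo()

    try:
        demo.assertTrue(2 + 2 == 5)
    except AssertionError as exc:
        print("assertTrue : %s" % exc)   # roughly: False is not true

    try:
        demo.assertEqual(2 + 2, 5)
    except AssertionError as exc:
        print("assertEqual: %s" % exc)   # roughly: 4 != 5
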
files: Lib/test/test_weakref.py | 151 +++++++++++++------------- 1 files changed, 74 insertions(+), 77 deletions(-) diff --git a/Lib/test/test_weakref.py b/Lib/test/test_weakref.py --- a/Lib/test/test_weakref.py +++ b/Lib/test/test_weakref.py @@ -91,11 +91,9 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del o - self.assertTrue(ref1() is None, - "expected reference to be invalidated") - self.assertTrue(ref2() is None, - "expected reference to be invalidated") - self.assertTrue(self.cbcalled == 2, + self.assertIsNone(ref1(), "expected reference to be invalidated") + self.assertIsNone(ref2(), "expected reference to be invalidated") + self.assertEqual(self.cbcalled, 2, "callback not called the right number of times") def test_multiple_selfref_callbacks(self): @@ -129,15 +127,15 @@ self.assertRaises(weakref.ReferenceError, check, ref1) self.assertRaises(weakref.ReferenceError, check, ref2) self.assertRaises(weakref.ReferenceError, bool, weakref.proxy(C())) - self.assertTrue(self.cbcalled == 2) + self.assertEqual(self.cbcalled, 2) def check_basic_ref(self, factory): o = factory() ref = weakref.ref(o) - self.assertTrue(ref() is not None, + self.assertIsNotNone(ref(), "weak reference to live object should be live") o2 = ref() - self.assertTrue(o is o2, + self.assertIs(o, o2, "() should return original object if live") def check_basic_callback(self, factory): @@ -145,9 +143,9 @@ o = factory() ref = weakref.ref(o, self.callback) del o - self.assertTrue(self.cbcalled == 1, + self.assertEqual(self.cbcalled, 1, "callback did not properly set 'cbcalled'") - self.assertTrue(ref() is None, + self.assertIsNone(ref(), "ref2 should be dead after deleting object reference") def test_ref_reuse(self): @@ -157,19 +155,19 @@ # between these two; it should make no difference proxy = weakref.proxy(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") o = C() proxy = weakref.proxy(o) ref1 = weakref.ref(o) ref2 = weakref.ref(o) - self.assertTrue(ref1 is ref2, + self.assertIs(ref1, ref2, "reference object w/out callback should be re-used") - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "wrong weak ref count for object") del proxy - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong weak ref count for object after deleting proxy") def test_proxy_reuse(self): @@ -177,7 +175,7 @@ proxy1 = weakref.proxy(o) ref = weakref.ref(o) proxy2 = weakref.proxy(o) - self.assertTrue(proxy1 is proxy2, + self.assertIs(proxy1, proxy2, "proxy object w/out callback should have been re-used") def test_basic_proxy(self): @@ -259,19 +257,19 @@ o = Object(1) p1 = makeref(o, None) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "both callbacks were None in the C API") + self.assertIs(p1, p2, "both callbacks were None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o, None) - self.assertTrue(p1 is p2, "callbacks were NULL, None in the C API") + self.assertIs(p1, p2, "callbacks were NULL, None in the C API") del p1, p2 p1 = makeref(o) p2 = makeref(o) - self.assertTrue(p1 is p2, "both callbacks were NULL in the C API") + self.assertIs(p1, p2, "both callbacks were NULL in the C API") del p1, p2 p1 = makeref(o, None) p2 = makeref(o) - self.assertTrue(p1 is p2, "callbacks were None, NULL in the C API") + self.assertIs(p1, p2, "callbacks were None, NULL in the C API") def test_callable_proxy(self): o = Callable() @@ 
-279,13 +277,13 @@ self.check_proxy(o, ref1) - self.assertTrue(type(ref1) is weakref.CallableProxyType, + self.assertIs(type(ref1), weakref.CallableProxyType, "proxy is not of callable type") ref1('twinkies!') - self.assertTrue(o.bar == 'twinkies!', + self.assertEqual(o.bar, 'twinkies!', "call through proxy not passed through to original") ref1(x='Splat.') - self.assertTrue(o.bar == 'Splat.', + self.assertEqual(o.bar, 'Splat.', "call through proxy not passed through to original") # expect due to too few args @@ -296,24 +294,23 @@ def check_proxy(self, o, proxy): o.foo = 1 - self.assertTrue(proxy.foo == 1, + self.assertEqual(proxy.foo, 1, "proxy does not reflect attribute addition") o.foo = 2 - self.assertTrue(proxy.foo == 2, + self.assertEqual(proxy.foo, 2, "proxy does not reflect attribute modification") del o.foo - self.assertTrue(not hasattr(proxy, 'foo'), + self.assertFalse(hasattr(proxy, 'foo'), "proxy does not reflect attribute removal") proxy.foo = 1 - self.assertTrue(o.foo == 1, + self.assertEqual(o.foo, 1, "object does not reflect attribute addition via proxy") proxy.foo = 2 - self.assertTrue( - o.foo == 2, + self.assertEqual(o.foo, 2, "object does not reflect attribute modification via proxy") del proxy.foo - self.assertTrue(not hasattr(o, 'foo'), + self.assertFalse(hasattr(o, 'foo'), "object does not reflect attribute removal via proxy") def test_proxy_deletion(self): @@ -337,21 +334,21 @@ o = C() ref1 = weakref.ref(o) ref2 = weakref.ref(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 2, + self.assertEqual(weakref.getweakrefcount(o), 2, "got wrong number of weak reference objects") proxy1 = weakref.proxy(o) proxy2 = weakref.proxy(o, self.callback) - self.assertTrue(weakref.getweakrefcount(o) == 4, + self.assertEqual(weakref.getweakrefcount(o), 4, "got wrong number of weak reference objects") del ref1, ref2, proxy1, proxy2 - self.assertTrue(weakref.getweakrefcount(o) == 0, + self.assertEqual(weakref.getweakrefcount(o), 0, "weak reference objects not unlinked from" " referent when discarded.") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefcount(1) == 0, + self.assertEqual(weakref.getweakrefcount(1), 0, "got wrong number of weak reference objects for int") def test_getweakrefs(self): @@ -359,22 +356,22 @@ ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref1 - self.assertTrue(weakref.getweakrefs(o) == [ref2], + self.assertEqual(weakref.getweakrefs(o), [ref2], "list of refs does not match") o = C() ref1 = weakref.ref(o, self.callback) ref2 = weakref.ref(o, self.callback) del ref2 - self.assertTrue(weakref.getweakrefs(o) == [ref1], + self.assertEqual(weakref.getweakrefs(o), [ref1], "list of refs does not match") del ref1 - self.assertTrue(weakref.getweakrefs(o) == [], + self.assertEqual(weakref.getweakrefs(o), [], "list of refs not cleared") # assumes ints do not support weakrefs - self.assertTrue(weakref.getweakrefs(1) == [], + self.assertEqual(weakref.getweakrefs(1), [], "list of refs does not match for int") def test_newstyle_number_ops(self): @@ -382,8 +379,8 @@ pass f = F(2.0) p = weakref.proxy(f) - self.assertTrue(p + 1.0 == 3.0) - self.assertTrue(1.0 + p == 3.0) # this used to SEGV + self.assertEqual(p + 1.0, 3.0) + self.assertEqual(1.0 + p, 3.0) # this used to SEGV def test_callbacks_protected(self): # Callbacks protected from already-set exceptions? 
@@ -636,7 +633,7 @@ c.wr = weakref.ref(d, callback) # this won't trigger d.wr = weakref.ref(callback, d.cb) # ditto external_wr = weakref.ref(callback, safe_callback) # but this will - self.assertTrue(external_wr() is callback) + self.assertIs(external_wr(), callback) # The weakrefs attached to c and d should get cleared, so that # C.cb is never called. But external_wr isn't part of the cyclic @@ -808,11 +805,11 @@ return super(MyRef, self).__call__() o = Object("foo") mr = MyRef(o, value=24) - self.assertTrue(mr() is o) + self.assertIs(mr(), o) self.assertTrue(mr.called) self.assertEqual(mr.value, 24) del o - self.assertTrue(mr() is None) + self.assertIsNone(mr()) self.assertTrue(mr.called) def test_subclass_refs_dont_replace_standard_refs(self): @@ -821,14 +818,14 @@ o = Object(42) r1 = MyRef(o) r2 = weakref.ref(o) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) self.assertEqual(weakref.getweakrefs(o), [r2, r1]) self.assertEqual(weakref.getweakrefcount(o), 2) r3 = MyRef(o) self.assertEqual(weakref.getweakrefcount(o), 3) refs = weakref.getweakrefs(o) self.assertEqual(len(refs), 3) - self.assertTrue(r2 is refs[0]) + self.assertIs(r2, refs[0]) self.assertIn(r1, refs[1:]) self.assertIn(r3, refs[1:]) @@ -838,7 +835,7 @@ o = Object(42) r1 = MyRef(o, id) r2 = MyRef(o, str) - self.assertTrue(r1 is not r2) + self.assertIsNot(r1, r2) refs = weakref.getweakrefs(o) self.assertIn(r1, refs) self.assertIn(r2, refs) @@ -965,23 +962,23 @@ # dict, objects = self.make_weak_valued_dict() for o in objects: - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong number of weak references to %r!" % o) - self.assertTrue(o is dict[o.arg], + self.assertIs(o, dict[o.arg], "wrong object returned by weak dict!") items1 = dict.items() items2 = dict.copy().items() items1.sort() items2.sort() - self.assertTrue(items1 == items2, + self.assertEqual(items1, items2, "cloning of weak-valued dictionary did not work!") del items1, items2 - self.assertTrue(len(dict) == self.COUNT) + self.assertEqual(len(dict), self.COUNT) del objects[0] - self.assertTrue(len(dict) == (self.COUNT - 1), + self.assertEqual(len(dict), (self.COUNT - 1), "deleting object did not cause dictionary update") del objects, o - self.assertTrue(len(dict) == 0, + self.assertEqual(len(dict), 0, "deleting the values did not clear the dictionary") # regression on SF bug #447152: dict = weakref.WeakValueDictionary() @@ -996,21 +993,21 @@ # dict, objects = self.make_weak_keyed_dict() for o in objects: - self.assertTrue(weakref.getweakrefcount(o) == 1, + self.assertEqual(weakref.getweakrefcount(o), 1, "wrong number of weak references to %r!" % o) - self.assertTrue(o.arg is dict[o], + self.assertIs(o.arg, dict[o], "wrong object returned by weak dict!") items1 = dict.items() items2 = dict.copy().items() - self.assertTrue(set(items1) == set(items2), + self.assertEqual(set(items1), set(items2), "cloning of weak-keyed dictionary did not work!") del items1, items2 - self.assertTrue(len(dict) == self.COUNT) + self.assertEqual(len(dict), self.COUNT) del objects[0] - self.assertTrue(len(dict) == (self.COUNT - 1), + self.assertEqual(len(dict), (self.COUNT - 1), "deleting object did not cause dictionary update") del objects, o - self.assertTrue(len(dict) == 0, + self.assertEqual(len(dict), 0, "deleting the keys did not clear the dictionary") o = Object(42) dict[o] = "What is the meaning of the universe?" 
@@ -1072,37 +1069,37 @@ items = dict.items() for item in dict.iteritems(): items.remove(item) - self.assertTrue(len(items) == 0, "iteritems() did not touch all items") + self.assertEqual(len(items), 0, "iteritems() did not touch all items") # key iterator, via __iter__(): keys = dict.keys() for k in dict: keys.remove(k) - self.assertTrue(len(keys) == 0, "__iter__() did not touch all keys") + self.assertEqual(len(keys), 0, "__iter__() did not touch all keys") # key iterator, via iterkeys(): keys = dict.keys() for k in dict.iterkeys(): keys.remove(k) - self.assertTrue(len(keys) == 0, "iterkeys() did not touch all keys") + self.assertEqual(len(keys), 0, "iterkeys() did not touch all keys") # value iterator: values = dict.values() for v in dict.itervalues(): values.remove(v) - self.assertTrue(len(values) == 0, + self.assertEqual(len(values), 0, "itervalues() did not touch all values") def test_make_weak_keyed_dict_from_dict(self): o = Object(3) dict = weakref.WeakKeyDictionary({o:364}) - self.assertTrue(dict[o] == 364) + self.assertEqual(dict[o], 364) def test_make_weak_keyed_dict_from_weak_keyed_dict(self): o = Object(3) dict = weakref.WeakKeyDictionary({o:364}) dict2 = weakref.WeakKeyDictionary(dict) - self.assertTrue(dict[o] == 364) + self.assertEqual(dict[o], 364) def make_weak_keyed_dict(self): dict = weakref.WeakKeyDictionary() @@ -1122,19 +1119,19 @@ weakdict = klass() weakdict[key1] = value1 weakdict[key2] = value2 - self.assertTrue(len(weakdict) == 2) + self.assertEqual(len(weakdict), 2) k, v = weakdict.popitem() - self.assertTrue(len(weakdict) == 1) + self.assertEqual(len(weakdict), 1) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) k, v = weakdict.popitem() - self.assertTrue(len(weakdict) == 0) + self.assertEqual(len(weakdict), 0) if k is key1: - self.assertTrue(v is value1) + self.assertIs(v, value1) else: - self.assertTrue(v is value2) + self.assertIs(v, value2) def test_weak_valued_dict_popitem(self): self.check_popitem(weakref.WeakValueDictionary, @@ -1145,7 +1142,7 @@ C(), "value 1", C(), "value 2") def check_setdefault(self, klass, key, value1, value2): - self.assertTrue(value1 is not value2, + self.assertIsNot(value1, value2, "invalid test" " -- value parameters must be distinct objects") weakdict = klass() @@ -1204,10 +1201,10 @@ o2 = Object('2') d[o1] = 'something' d[o2] = 'something' - self.assertTrue(len(d) == 2) + self.assertEqual(len(d), 2) del d[o1] - self.assertTrue(len(d) == 1) - self.assertTrue(d.keys() == [o2]) + self.assertEqual(len(d), 1) + self.assertEqual(d.keys(), [o2]) def test_weak_valued_delitem(self): d = weakref.WeakValueDictionary() @@ -1215,10 +1212,10 @@ o2 = Object('2') d['something'] = o1 d['something else'] = o2 - self.assertTrue(len(d) == 2) + self.assertEqual(len(d), 2) del d['something'] - self.assertTrue(len(d) == 1) - self.assertTrue(d.items() == [('something else', o2)]) + self.assertEqual(len(d), 1) + self.assertEqual(d.items(), [('something else', o2)]) def test_weak_keyed_bad_delitem(self): d = weakref.WeakKeyDictionary() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:50:15 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:50:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjA2?= =?utf-8?q?=3A_Use_specific_asserts_in_http=2Ecookiejar_tests=2E?= Message-ID: <3dMs6v2cb6z7LkY@mail.python.org> http://hg.python.org/cpython/rev/9444ee6864b3 
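The http.cookiejar patch that follows extends the same cleanup to other assertion forms: re.search() checks become assertRegex, membership checks become assertIn/assertNotIn, and isinstance checks become assertIsInstance (the 2.7 backport further down spells the regex variant assertRegexpMatches). A minimal sketch of those forms with made-up values, not code from the patch:

    import unittest

    class RegexAssertSketch(unittest.TestCase):
        # Hypothetical example; the values are illustrative only.
        def test_specific_forms(self):
            text = "2013-11-17 12:25:15Z"
            # Old: self.assertTrue(re.search(r"^\d{4}-...", text))
            self.assertRegex(text, r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$")
            header = "Cookie: foo=bar"
            # Old: self.assertTrue("foo=bar" in header)
            self.assertIn("foo=bar", header)
            self.assertNotIn("Comment", header)
            # Old: self.assertTrue(isinstance(header, str))
            self.assertIsInstance(header, str)

    if __name__ == "__main__":
        unittest.main()
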
changeset: 87198:9444ee6864b3 branch: 3.3 parent: 87195:d187eae08b8a user: Serhiy Storchaka date: Sun Nov 17 13:45:02 2013 +0200 summary: Issue #19606: Use specific asserts in http.cookiejar tests. files: Lib/test/test_http_cookiejar.py | 157 +++++++++---------- 1 files changed, 73 insertions(+), 84 deletions(-) diff --git a/Lib/test/test_http_cookiejar.py b/Lib/test/test_http_cookiejar.py --- a/Lib/test/test_http_cookiejar.py +++ b/Lib/test/test_http_cookiejar.py @@ -28,8 +28,8 @@ az = time2isoz() bz = time2isoz(500000) for text in (az, bz): - self.assertTrue(re.search(r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", text), - "bad time2isoz format: %s %s" % (az, bz)) + self.assertRegex(text, r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", + "bad time2isoz format: %s %s" % (az, bz)) def test_http2time(self): def parse_date(text): @@ -75,12 +75,9 @@ "%s => '%s' (%s)" % (test_t, result, expected)) for s in tests: - t = http2time(s) - t2 = http2time(s.lower()) - t3 = http2time(s.upper()) - - self.assertTrue(t == t2 == t3 == test_t, - "'%s' => %s, %s, %s (%s)" % (s, t, t2, t3, test_t)) + self.assertEqual(http2time(s), test_t, s) + self.assertEqual(http2time(s.lower()), test_t, s.lower()) + self.assertEqual(http2time(s.upper()), test_t, s.upper()) def test_http2time_garbage(self): for test in [ @@ -134,12 +131,9 @@ test_t = 760233600 # assume broken POSIX counting of seconds for s in tests: - t = iso2time(s) - t2 = iso2time(s.lower()) - t3 = iso2time(s.upper()) - - self.assertTrue(t == t2 == t3 == test_t, - "'%s' => %s, %s, %s (%s)" % (s, t, t2, t3, test_t)) + self.assertEqual(iso2time(s), test_t, s) + self.assertEqual(iso2time(s.lower()), test_t, s.lower()) + self.assertEqual(iso2time(s.upper()), test_t, s.upper()) def test_iso2time_garbage(self): for test in [ @@ -411,7 +405,7 @@ request = urllib.request.Request(url) r = pol.domain_return_ok(domain, request) if ok: self.assertTrue(r) - else: self.assertTrue(not r) + else: self.assertFalse(r) def test_missing_value(self): # missing = sign in Cookie: header is regarded by Mozilla as a missing @@ -421,10 +415,10 @@ interact_netscape(c, "http://www.acme.com/", 'eggs') interact_netscape(c, "http://www.acme.com/", '"spam"; path=/foo/') cookie = c._cookies["www.acme.com"]["/"]["eggs"] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, "eggs") cookie = c._cookies["www.acme.com"]['/foo/']['"spam"'] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, '"spam"') self.assertEqual(lwp_cookie_str(cookie), ( r'"spam"; path="/foo/"; domain="www.acme.com"; ' @@ -466,7 +460,7 @@ try: cookie = c._cookies["www.example.com"]["/"]["ni"] except KeyError: - self.assertTrue(version is None) # didn't expect a stored cookie + self.assertIsNone(version) # didn't expect a stored cookie else: self.assertEqual(cookie.version, version) # 2965 cookies are unaffected @@ -490,26 +484,26 @@ self.assertEqual(cookie.domain, ".acme.com") self.assertTrue(cookie.domain_specified) self.assertEqual(cookie.port, DEFAULT_HTTP_PORT) - self.assertTrue(not cookie.port_specified) + self.assertFalse(cookie.port_specified) # case is preserved - self.assertTrue(cookie.has_nonstandard_attr("blArgh") and - not cookie.has_nonstandard_attr("blargh")) + self.assertTrue(cookie.has_nonstandard_attr("blArgh")) + self.assertFalse(cookie.has_nonstandard_attr("blargh")) cookie = c._cookies["www.acme.com"]["/"]["ni"] self.assertEqual(cookie.domain, "www.acme.com") - self.assertTrue(not cookie.domain_specified) + 
self.assertFalse(cookie.domain_specified) self.assertEqual(cookie.port, "80,8080") self.assertTrue(cookie.port_specified) cookie = c._cookies["www.acme.com"]["/"]["nini"] - self.assertTrue(cookie.port is None) - self.assertTrue(not cookie.port_specified) + self.assertIsNone(cookie.port) + self.assertFalse(cookie.port_specified) # invalid expires should not cause cookie to be dropped foo = c._cookies["www.acme.com"]["/"]["foo"] spam = c._cookies["www.acme.com"]["/"]["foo"] - self.assertTrue(foo.expires is None) - self.assertTrue(spam.expires is None) + self.assertIsNone(foo.expires) + self.assertIsNone(spam.expires) def test_ns_parser_special_names(self): # names such as 'expires' are not special in first name=value pair @@ -679,12 +673,12 @@ def test_is_HDN(self): self.assertTrue(is_HDN("foo.bar.com")) self.assertTrue(is_HDN("1foo2.3bar4.5com")) - self.assertTrue(not is_HDN("192.168.1.1")) - self.assertTrue(not is_HDN("")) - self.assertTrue(not is_HDN(".")) - self.assertTrue(not is_HDN(".foo.bar.com")) - self.assertTrue(not is_HDN("..foo")) - self.assertTrue(not is_HDN("foo.")) + self.assertFalse(is_HDN("192.168.1.1")) + self.assertFalse(is_HDN("")) + self.assertFalse(is_HDN(".")) + self.assertFalse(is_HDN(".foo.bar.com")) + self.assertFalse(is_HDN("..foo")) + self.assertFalse(is_HDN("foo.")) def test_reach(self): self.assertEqual(reach("www.acme.com"), ".acme.com") @@ -698,39 +692,39 @@ def test_domain_match(self): self.assertTrue(domain_match("192.168.1.1", "192.168.1.1")) - self.assertTrue(not domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(domain_match("192.168.1.1", ".168.1.1")) self.assertTrue(domain_match("x.y.com", "x.Y.com")) self.assertTrue(domain_match("x.y.com", ".Y.com")) - self.assertTrue(not domain_match("x.y.com", "Y.com")) + self.assertFalse(domain_match("x.y.com", "Y.com")) self.assertTrue(domain_match("a.b.c.com", ".c.com")) - self.assertTrue(not domain_match(".c.com", "a.b.c.com")) + self.assertFalse(domain_match(".c.com", "a.b.c.com")) self.assertTrue(domain_match("example.local", ".local")) - self.assertTrue(not domain_match("blah.blah", "")) - self.assertTrue(not domain_match("", ".rhubarb.rhubarb")) + self.assertFalse(domain_match("blah.blah", "")) + self.assertFalse(domain_match("", ".rhubarb.rhubarb")) self.assertTrue(domain_match("", "")) self.assertTrue(user_domain_match("acme.com", "acme.com")) - self.assertTrue(not user_domain_match("acme.com", ".acme.com")) + self.assertFalse(user_domain_match("acme.com", ".acme.com")) self.assertTrue(user_domain_match("rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("www.rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("x.y.com", "x.Y.com")) self.assertTrue(user_domain_match("x.y.com", ".Y.com")) - self.assertTrue(not user_domain_match("x.y.com", "Y.com")) + self.assertFalse(user_domain_match("x.y.com", "Y.com")) self.assertTrue(user_domain_match("y.com", "Y.com")) - self.assertTrue(not user_domain_match(".y.com", "Y.com")) + self.assertFalse(user_domain_match(".y.com", "Y.com")) self.assertTrue(user_domain_match(".y.com", ".Y.com")) self.assertTrue(user_domain_match("x.y.com", ".com")) - self.assertTrue(not user_domain_match("x.y.com", "com")) - self.assertTrue(not user_domain_match("x.y.com", "m")) - self.assertTrue(not user_domain_match("x.y.com", ".m")) - self.assertTrue(not user_domain_match("x.y.com", "")) - self.assertTrue(not user_domain_match("x.y.com", ".")) + self.assertFalse(user_domain_match("x.y.com", "com")) + self.assertFalse(user_domain_match("x.y.com", "m")) 
+ self.assertFalse(user_domain_match("x.y.com", ".m")) + self.assertFalse(user_domain_match("x.y.com", "")) + self.assertFalse(user_domain_match("x.y.com", ".")) self.assertTrue(user_domain_match("192.168.1.1", "192.168.1.1")) # not both HDNs, so must string-compare equal to match - self.assertTrue(not user_domain_match("192.168.1.1", ".168.1.1")) - self.assertTrue(not user_domain_match("192.168.1.1", ".")) + self.assertFalse(user_domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(user_domain_match("192.168.1.1", ".")) # empty string is a special case - self.assertTrue(not user_domain_match("192.168.1.1", "")) + self.assertFalse(user_domain_match("192.168.1.1", "")) def test_wrong_domain(self): # Cookies whose effective request-host name does not domain-match the @@ -877,7 +871,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_domain_block(self): pol = DefaultCookiePolicy( @@ -901,8 +895,8 @@ self.assertEqual(len(c), 1) req = urllib.request.Request("http://www.roadrunner.net/") c.add_cookie_header(req) - self.assertTrue((req.has_header("Cookie") and - req.has_header("Cookie2"))) + self.assertTrue(req.has_header("Cookie")) + self.assertTrue(req.has_header("Cookie2")) c.clear() pol.set_blocked_domains([".acme.com"]) @@ -917,7 +911,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_secure(self): for ns in True, False: @@ -935,8 +929,8 @@ url = "http://www.acme.com/" int(c, url, "foo1=bar%s%s" % (vs, whitespace)) int(c, url, "foo2=bar%s; secure%s" % (vs, whitespace)) - self.assertTrue( - not c._cookies["www.acme.com"]["/"]["foo1"].secure, + self.assertFalse( + c._cookies["www.acme.com"]["/"]["foo1"].secure, "non-secure cookie registered secure") self.assertTrue( c._cookies["www.acme.com"]["/"]["foo2"].secure, @@ -1009,8 +1003,8 @@ url = "http://foo.bar.com/" interact_2965(c, url, "spam=eggs; Version=1; Port") h = interact_2965(c, url) - self.assertTrue(re.search("\$Port([^=]|$)", h), - "port with no value not returned with no value") + self.assertRegex(h, "\$Port([^=]|$)", + "port with no value not returned with no value") c = CookieJar(pol) url = "http://foo.bar.com/" @@ -1034,8 +1028,7 @@ 'Comment="does anybody read these?"; ' 'CommentURL="http://foo.bar.net/comment.html"') h = interact_2965(c, url) - self.assertTrue( - "Comment" not in h, + self.assertNotIn("Comment", h, "Comment or CommentURL cookie-attributes returned to server") def test_Cookie_iterator(self): @@ -1063,7 +1056,7 @@ for i in range(4): i = 0 for c in cs: - self.assertTrue(isinstance(c, Cookie)) + self.assertIsInstance(c, Cookie) self.assertEqual(c.version, versions[i]) self.assertEqual(c.name, names[i]) self.assertEqual(c.domain, domains[i]) @@ -1118,7 +1111,7 @@ headers = ["Set-Cookie: c=foo; expires=Foo Bar 12 33:22:11 2000"] c = cookiejar_from_cookie_headers(headers) cookie = c._cookies["www.example.com"]["/"]["c"] - self.assertTrue(cookie.expires is None) + self.assertIsNone(cookie.expires) class LWPCookieTests(unittest.TestCase): @@ -1262,9 +1255,9 @@ req = urllib.request.Request("http://www.acme.com/ammo") c.add_cookie_header(req) - self.assertTrue(re.search(r"PART_NUMBER=RIDING_ROCKET_0023;\s*" - "PART_NUMBER=ROCKET_LAUNCHER_0001", - req.get_header("Cookie"))) + self.assertRegex(req.get_header("Cookie"), + 
r"PART_NUMBER=RIDING_ROCKET_0023;\s*" + "PART_NUMBER=ROCKET_LAUNCHER_0001") def test_ietf_example_1(self): #------------------------------------------------------------------- @@ -1297,7 +1290,7 @@ cookie = interact_2965( c, 'http://www.acme.com/acme/login', 'Customer="WILE_E_COYOTE"; Version="1"; Path="/acme"') - self.assertTrue(not cookie) + self.assertFalse(cookie) # # 3. User Agent -> Server @@ -1319,9 +1312,8 @@ cookie = interact_2965(c, 'http://www.acme.com/acme/pickitem', 'Part_Number="Rocket_Launcher_0001"; ' 'Version="1"; Path="/acme"'); - self.assertTrue(re.search( - r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$', - cookie)) + self.assertRegex(cookie, + r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$') # # 5. User Agent -> Server @@ -1344,11 +1336,11 @@ cookie = interact_2965(c, "http://www.acme.com/acme/shipping", 'Shipping="FedEx"; Version="1"; Path="/acme"') - self.assertTrue(re.search(r'^\$Version="?1"?;', cookie)) - self.assertTrue(re.search(r'Part_Number="?Rocket_Launcher_0001"?;' - '\s*\$Path="\/acme"', cookie)) - self.assertTrue(re.search(r'Customer="?WILE_E_COYOTE"?;\s*\$Path="\/acme"', - cookie)) + self.assertRegex(cookie, r'^\$Version="?1"?;') + self.assertRegex(cookie, r'Part_Number="?Rocket_Launcher_0001"?;' + '\s*\$Path="\/acme"') + self.assertRegex(cookie, r'Customer="?WILE_E_COYOTE"?;' + '\s*\$Path="\/acme"') # # 7. User Agent -> Server @@ -1369,9 +1361,8 @@ # Transaction is complete. cookie = interact_2965(c, "http://www.acme.com/acme/process") - self.assertTrue( - re.search(r'Shipping="?FedEx"?;\s*\$Path="\/acme"', cookie) and - "WILE_E_COYOTE" in cookie) + self.assertRegex(cookie, r'Shipping="?FedEx"?;\s*\$Path="\/acme"') + self.assertIn("WILE_E_COYOTE", cookie) # # The user agent makes a series of requests on the origin server, after @@ -1418,8 +1409,7 @@ # than once. 
cookie = interact_2965(c, "http://www.acme.com/acme/ammo/...") - self.assertTrue( - re.search(r"Riding_Rocket_0023.*Rocket_Launcher_0001", cookie)) + self.assertRegex(cookie, r"Riding_Rocket_0023.*Rocket_Launcher_0001") # A subsequent request by the user agent to the (same) server for a URL of # the form /acme/parts/ would include the following request header: @@ -1445,7 +1435,7 @@ # illegal domain (no embedded dots) cookie = interact_2965(c, "http://www.acme.com", 'foo=bar; domain=".com"; version=1') - self.assertTrue(not c) + self.assertFalse(c) # legal domain cookie = interact_2965(c, "http://www.acme.com", @@ -1538,11 +1528,11 @@ 'bar=baz; path="/foo/"; version=1'); version_re = re.compile(r'^\$version=\"?1\"?', re.I) self.assertIn("foo=bar", cookie) - self.assertTrue(version_re.search(cookie)) + self.assertRegex(cookie, version_re) cookie = interact_2965( c, "http://www.acme.com/foo/%25/<<%0anew\345/\346\370\345") - self.assertTrue(not cookie) + self.assertFalse(cookie) # unicode URL doesn't raise exception cookie = interact_2965(c, "http://www.acme.com/\xfc") @@ -1703,13 +1693,12 @@ key = "%s_after" % cookie.value counter[key] = counter[key] + 1 - self.assertTrue(not ( # a permanent cookie got lost accidently - counter["perm_after"] != counter["perm_before"] or + self.assertEqual(counter["perm_after"], counter["perm_before"]) # a session cookie hasn't been cleared - counter["session_after"] != 0 or + self.assertEqual(counter["session_after"], 0) # we didn't have session cookies in the first place - counter["session_before"] == 0)) + self.assertNotEqual(counter["session_before"], 0) def test_main(verbose=None): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:50:16 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:50:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319606=3A_Use_specific_asserts_in_http=2Ecookiej?= =?utf-8?q?ar_tests=2E?= Message-ID: <3dMs6w6dKlz7LrH@mail.python.org> http://hg.python.org/cpython/rev/c1f2b3fc965d changeset: 87199:c1f2b3fc965d parent: 87196:8deab371850b parent: 87198:9444ee6864b3 user: Serhiy Storchaka date: Sun Nov 17 13:46:42 2013 +0200 summary: Issue #19606: Use specific asserts in http.cookiejar tests. 
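The changeset below merges the same http.cookiejar changes into default. One detail visible in the diff is that assertRegex accepts either a pattern string or a precompiled regular expression object, which is why the existing version_re can be passed to it directly. A hypothetical illustration:

    import re
    import unittest

    class CompiledPatternSketch(unittest.TestCase):
        # Hypothetical example; the pattern mirrors the one used in the test.
        def test_compiled_pattern(self):
            version_re = re.compile(r'^\$version="?1"?', re.I)
            cookie = '$Version="1"; foo=bar'
            self.assertRegex(cookie, version_re)   # compiled pattern object
            self.assertRegex(cookie, r'foo=bar')   # plain string pattern

    if __name__ == "__main__":
        unittest.main()
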
files: Lib/test/test_http_cookiejar.py | 157 +++++++++---------- 1 files changed, 73 insertions(+), 84 deletions(-) diff --git a/Lib/test/test_http_cookiejar.py b/Lib/test/test_http_cookiejar.py --- a/Lib/test/test_http_cookiejar.py +++ b/Lib/test/test_http_cookiejar.py @@ -28,8 +28,8 @@ az = time2isoz() bz = time2isoz(500000) for text in (az, bz): - self.assertTrue(re.search(r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", text), - "bad time2isoz format: %s %s" % (az, bz)) + self.assertRegex(text, r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", + "bad time2isoz format: %s %s" % (az, bz)) def test_http2time(self): def parse_date(text): @@ -75,12 +75,9 @@ "%s => '%s' (%s)" % (test_t, result, expected)) for s in tests: - t = http2time(s) - t2 = http2time(s.lower()) - t3 = http2time(s.upper()) - - self.assertTrue(t == t2 == t3 == test_t, - "'%s' => %s, %s, %s (%s)" % (s, t, t2, t3, test_t)) + self.assertEqual(http2time(s), test_t, s) + self.assertEqual(http2time(s.lower()), test_t, s.lower()) + self.assertEqual(http2time(s.upper()), test_t, s.upper()) def test_http2time_garbage(self): for test in [ @@ -134,12 +131,9 @@ test_t = 760233600 # assume broken POSIX counting of seconds for s in tests: - t = iso2time(s) - t2 = iso2time(s.lower()) - t3 = iso2time(s.upper()) - - self.assertTrue(t == t2 == t3 == test_t, - "'%s' => %s, %s, %s (%s)" % (s, t, t2, t3, test_t)) + self.assertEqual(iso2time(s), test_t, s) + self.assertEqual(iso2time(s.lower()), test_t, s.lower()) + self.assertEqual(iso2time(s.upper()), test_t, s.upper()) def test_iso2time_garbage(self): for test in [ @@ -411,7 +405,7 @@ request = urllib.request.Request(url) r = pol.domain_return_ok(domain, request) if ok: self.assertTrue(r) - else: self.assertTrue(not r) + else: self.assertFalse(r) def test_missing_value(self): # missing = sign in Cookie: header is regarded by Mozilla as a missing @@ -421,10 +415,10 @@ interact_netscape(c, "http://www.acme.com/", 'eggs') interact_netscape(c, "http://www.acme.com/", '"spam"; path=/foo/') cookie = c._cookies["www.acme.com"]["/"]["eggs"] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, "eggs") cookie = c._cookies["www.acme.com"]['/foo/']['"spam"'] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, '"spam"') self.assertEqual(lwp_cookie_str(cookie), ( r'"spam"; path="/foo/"; domain="www.acme.com"; ' @@ -466,7 +460,7 @@ try: cookie = c._cookies["www.example.com"]["/"]["ni"] except KeyError: - self.assertTrue(version is None) # didn't expect a stored cookie + self.assertIsNone(version) # didn't expect a stored cookie else: self.assertEqual(cookie.version, version) # 2965 cookies are unaffected @@ -490,26 +484,26 @@ self.assertEqual(cookie.domain, ".acme.com") self.assertTrue(cookie.domain_specified) self.assertEqual(cookie.port, DEFAULT_HTTP_PORT) - self.assertTrue(not cookie.port_specified) + self.assertFalse(cookie.port_specified) # case is preserved - self.assertTrue(cookie.has_nonstandard_attr("blArgh") and - not cookie.has_nonstandard_attr("blargh")) + self.assertTrue(cookie.has_nonstandard_attr("blArgh")) + self.assertFalse(cookie.has_nonstandard_attr("blargh")) cookie = c._cookies["www.acme.com"]["/"]["ni"] self.assertEqual(cookie.domain, "www.acme.com") - self.assertTrue(not cookie.domain_specified) + self.assertFalse(cookie.domain_specified) self.assertEqual(cookie.port, "80,8080") self.assertTrue(cookie.port_specified) cookie = c._cookies["www.acme.com"]["/"]["nini"] - self.assertTrue(cookie.port is None) 
- self.assertTrue(not cookie.port_specified) + self.assertIsNone(cookie.port) + self.assertFalse(cookie.port_specified) # invalid expires should not cause cookie to be dropped foo = c._cookies["www.acme.com"]["/"]["foo"] spam = c._cookies["www.acme.com"]["/"]["foo"] - self.assertTrue(foo.expires is None) - self.assertTrue(spam.expires is None) + self.assertIsNone(foo.expires) + self.assertIsNone(spam.expires) def test_ns_parser_special_names(self): # names such as 'expires' are not special in first name=value pair @@ -679,12 +673,12 @@ def test_is_HDN(self): self.assertTrue(is_HDN("foo.bar.com")) self.assertTrue(is_HDN("1foo2.3bar4.5com")) - self.assertTrue(not is_HDN("192.168.1.1")) - self.assertTrue(not is_HDN("")) - self.assertTrue(not is_HDN(".")) - self.assertTrue(not is_HDN(".foo.bar.com")) - self.assertTrue(not is_HDN("..foo")) - self.assertTrue(not is_HDN("foo.")) + self.assertFalse(is_HDN("192.168.1.1")) + self.assertFalse(is_HDN("")) + self.assertFalse(is_HDN(".")) + self.assertFalse(is_HDN(".foo.bar.com")) + self.assertFalse(is_HDN("..foo")) + self.assertFalse(is_HDN("foo.")) def test_reach(self): self.assertEqual(reach("www.acme.com"), ".acme.com") @@ -698,39 +692,39 @@ def test_domain_match(self): self.assertTrue(domain_match("192.168.1.1", "192.168.1.1")) - self.assertTrue(not domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(domain_match("192.168.1.1", ".168.1.1")) self.assertTrue(domain_match("x.y.com", "x.Y.com")) self.assertTrue(domain_match("x.y.com", ".Y.com")) - self.assertTrue(not domain_match("x.y.com", "Y.com")) + self.assertFalse(domain_match("x.y.com", "Y.com")) self.assertTrue(domain_match("a.b.c.com", ".c.com")) - self.assertTrue(not domain_match(".c.com", "a.b.c.com")) + self.assertFalse(domain_match(".c.com", "a.b.c.com")) self.assertTrue(domain_match("example.local", ".local")) - self.assertTrue(not domain_match("blah.blah", "")) - self.assertTrue(not domain_match("", ".rhubarb.rhubarb")) + self.assertFalse(domain_match("blah.blah", "")) + self.assertFalse(domain_match("", ".rhubarb.rhubarb")) self.assertTrue(domain_match("", "")) self.assertTrue(user_domain_match("acme.com", "acme.com")) - self.assertTrue(not user_domain_match("acme.com", ".acme.com")) + self.assertFalse(user_domain_match("acme.com", ".acme.com")) self.assertTrue(user_domain_match("rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("www.rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("x.y.com", "x.Y.com")) self.assertTrue(user_domain_match("x.y.com", ".Y.com")) - self.assertTrue(not user_domain_match("x.y.com", "Y.com")) + self.assertFalse(user_domain_match("x.y.com", "Y.com")) self.assertTrue(user_domain_match("y.com", "Y.com")) - self.assertTrue(not user_domain_match(".y.com", "Y.com")) + self.assertFalse(user_domain_match(".y.com", "Y.com")) self.assertTrue(user_domain_match(".y.com", ".Y.com")) self.assertTrue(user_domain_match("x.y.com", ".com")) - self.assertTrue(not user_domain_match("x.y.com", "com")) - self.assertTrue(not user_domain_match("x.y.com", "m")) - self.assertTrue(not user_domain_match("x.y.com", ".m")) - self.assertTrue(not user_domain_match("x.y.com", "")) - self.assertTrue(not user_domain_match("x.y.com", ".")) + self.assertFalse(user_domain_match("x.y.com", "com")) + self.assertFalse(user_domain_match("x.y.com", "m")) + self.assertFalse(user_domain_match("x.y.com", ".m")) + self.assertFalse(user_domain_match("x.y.com", "")) + self.assertFalse(user_domain_match("x.y.com", ".")) self.assertTrue(user_domain_match("192.168.1.1", 
"192.168.1.1")) # not both HDNs, so must string-compare equal to match - self.assertTrue(not user_domain_match("192.168.1.1", ".168.1.1")) - self.assertTrue(not user_domain_match("192.168.1.1", ".")) + self.assertFalse(user_domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(user_domain_match("192.168.1.1", ".")) # empty string is a special case - self.assertTrue(not user_domain_match("192.168.1.1", "")) + self.assertFalse(user_domain_match("192.168.1.1", "")) def test_wrong_domain(self): # Cookies whose effective request-host name does not domain-match the @@ -877,7 +871,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_domain_block(self): pol = DefaultCookiePolicy( @@ -901,8 +895,8 @@ self.assertEqual(len(c), 1) req = urllib.request.Request("http://www.roadrunner.net/") c.add_cookie_header(req) - self.assertTrue((req.has_header("Cookie") and - req.has_header("Cookie2"))) + self.assertTrue(req.has_header("Cookie")) + self.assertTrue(req.has_header("Cookie2")) c.clear() pol.set_blocked_domains([".acme.com"]) @@ -917,7 +911,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_secure(self): for ns in True, False: @@ -935,8 +929,8 @@ url = "http://www.acme.com/" int(c, url, "foo1=bar%s%s" % (vs, whitespace)) int(c, url, "foo2=bar%s; secure%s" % (vs, whitespace)) - self.assertTrue( - not c._cookies["www.acme.com"]["/"]["foo1"].secure, + self.assertFalse( + c._cookies["www.acme.com"]["/"]["foo1"].secure, "non-secure cookie registered secure") self.assertTrue( c._cookies["www.acme.com"]["/"]["foo2"].secure, @@ -1009,8 +1003,8 @@ url = "http://foo.bar.com/" interact_2965(c, url, "spam=eggs; Version=1; Port") h = interact_2965(c, url) - self.assertTrue(re.search("\$Port([^=]|$)", h), - "port with no value not returned with no value") + self.assertRegex(h, "\$Port([^=]|$)", + "port with no value not returned with no value") c = CookieJar(pol) url = "http://foo.bar.com/" @@ -1034,8 +1028,7 @@ 'Comment="does anybody read these?"; ' 'CommentURL="http://foo.bar.net/comment.html"') h = interact_2965(c, url) - self.assertTrue( - "Comment" not in h, + self.assertNotIn("Comment", h, "Comment or CommentURL cookie-attributes returned to server") def test_Cookie_iterator(self): @@ -1063,7 +1056,7 @@ for i in range(4): i = 0 for c in cs: - self.assertTrue(isinstance(c, Cookie)) + self.assertIsInstance(c, Cookie) self.assertEqual(c.version, versions[i]) self.assertEqual(c.name, names[i]) self.assertEqual(c.domain, domains[i]) @@ -1118,7 +1111,7 @@ headers = ["Set-Cookie: c=foo; expires=Foo Bar 12 33:22:11 2000"] c = cookiejar_from_cookie_headers(headers) cookie = c._cookies["www.example.com"]["/"]["c"] - self.assertTrue(cookie.expires is None) + self.assertIsNone(cookie.expires) class LWPCookieTests(unittest.TestCase): @@ -1262,9 +1255,9 @@ req = urllib.request.Request("http://www.acme.com/ammo") c.add_cookie_header(req) - self.assertTrue(re.search(r"PART_NUMBER=RIDING_ROCKET_0023;\s*" - "PART_NUMBER=ROCKET_LAUNCHER_0001", - req.get_header("Cookie"))) + self.assertRegex(req.get_header("Cookie"), + r"PART_NUMBER=RIDING_ROCKET_0023;\s*" + "PART_NUMBER=ROCKET_LAUNCHER_0001") def test_ietf_example_1(self): #------------------------------------------------------------------- @@ -1297,7 +1290,7 @@ cookie = 
interact_2965( c, 'http://www.acme.com/acme/login', 'Customer="WILE_E_COYOTE"; Version="1"; Path="/acme"') - self.assertTrue(not cookie) + self.assertFalse(cookie) # # 3. User Agent -> Server @@ -1319,9 +1312,8 @@ cookie = interact_2965(c, 'http://www.acme.com/acme/pickitem', 'Part_Number="Rocket_Launcher_0001"; ' 'Version="1"; Path="/acme"'); - self.assertTrue(re.search( - r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$', - cookie)) + self.assertRegex(cookie, + r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$') # # 5. User Agent -> Server @@ -1344,11 +1336,11 @@ cookie = interact_2965(c, "http://www.acme.com/acme/shipping", 'Shipping="FedEx"; Version="1"; Path="/acme"') - self.assertTrue(re.search(r'^\$Version="?1"?;', cookie)) - self.assertTrue(re.search(r'Part_Number="?Rocket_Launcher_0001"?;' - '\s*\$Path="\/acme"', cookie)) - self.assertTrue(re.search(r'Customer="?WILE_E_COYOTE"?;\s*\$Path="\/acme"', - cookie)) + self.assertRegex(cookie, r'^\$Version="?1"?;') + self.assertRegex(cookie, r'Part_Number="?Rocket_Launcher_0001"?;' + '\s*\$Path="\/acme"') + self.assertRegex(cookie, r'Customer="?WILE_E_COYOTE"?;' + '\s*\$Path="\/acme"') # # 7. User Agent -> Server @@ -1369,9 +1361,8 @@ # Transaction is complete. cookie = interact_2965(c, "http://www.acme.com/acme/process") - self.assertTrue( - re.search(r'Shipping="?FedEx"?;\s*\$Path="\/acme"', cookie) and - "WILE_E_COYOTE" in cookie) + self.assertRegex(cookie, r'Shipping="?FedEx"?;\s*\$Path="\/acme"') + self.assertIn("WILE_E_COYOTE", cookie) # # The user agent makes a series of requests on the origin server, after @@ -1418,8 +1409,7 @@ # than once. cookie = interact_2965(c, "http://www.acme.com/acme/ammo/...") - self.assertTrue( - re.search(r"Riding_Rocket_0023.*Rocket_Launcher_0001", cookie)) + self.assertRegex(cookie, r"Riding_Rocket_0023.*Rocket_Launcher_0001") # A subsequent request by the user agent to the (same) server for a URL of # the form /acme/parts/ would include the following request header: @@ -1445,7 +1435,7 @@ # illegal domain (no embedded dots) cookie = interact_2965(c, "http://www.acme.com", 'foo=bar; domain=".com"; version=1') - self.assertTrue(not c) + self.assertFalse(c) # legal domain cookie = interact_2965(c, "http://www.acme.com", @@ -1538,11 +1528,11 @@ 'bar=baz; path="/foo/"; version=1'); version_re = re.compile(r'^\$version=\"?1\"?', re.I) self.assertIn("foo=bar", cookie) - self.assertTrue(version_re.search(cookie)) + self.assertRegex(cookie, version_re) cookie = interact_2965( c, "http://www.acme.com/foo/%25/<<%0anew\345/\346\370\345") - self.assertTrue(not cookie) + self.assertFalse(cookie) # unicode URL doesn't raise exception cookie = interact_2965(c, "http://www.acme.com/\xfc") @@ -1703,13 +1693,12 @@ key = "%s_after" % cookie.value counter[key] = counter[key] + 1 - self.assertTrue(not ( # a permanent cookie got lost accidently - counter["perm_after"] != counter["perm_before"] or + self.assertEqual(counter["perm_after"], counter["perm_before"]) # a session cookie hasn't been cleared - counter["session_after"] != 0 or + self.assertEqual(counter["session_after"], 0) # we didn't have session cookies in the first place - counter["session_before"] == 0)) + self.assertNotEqual(counter["session_before"], 0) def test_main(verbose=None): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 12:50:18 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 12:50:18 +0100 (CET) Subject: [Python-checkins] 
=?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjA2?= =?utf-8?q?=3A_Use_specific_asserts_in_cookielib_tests=2E?= Message-ID: <3dMs6y3X50z7Lny@mail.python.org> http://hg.python.org/cpython/rev/1479ba6bc511 changeset: 87200:1479ba6bc511 branch: 2.7 parent: 87197:3d2cfeea8950 user: Serhiy Storchaka date: Sun Nov 17 13:47:02 2013 +0200 summary: Issue #19606: Use specific asserts in cookielib tests. files: Lib/test/test_cookielib.py | 152 ++++++++++++------------ 1 files changed, 74 insertions(+), 78 deletions(-) diff --git a/Lib/test/test_cookielib.py b/Lib/test/test_cookielib.py --- a/Lib/test/test_cookielib.py +++ b/Lib/test/test_cookielib.py @@ -26,8 +26,9 @@ az = time2isoz() bz = time2isoz(500000) for text in (az, bz): - self.assertTrue(re.search(r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", text), - "bad time2isoz format: %s %s" % (az, bz)) + self.assertRegexpMatches(text, + r"^\d{4}-\d\d-\d\d \d\d:\d\d:\d\dZ$", + "bad time2isoz format: %s %s" % (az, bz)) def test_http2time(self): from cookielib import http2time @@ -75,12 +76,9 @@ "%s => '%s' (%s)" % (test_t, result, expected)) for s in tests: - t = http2time(s) - t2 = http2time(s.lower()) - t3 = http2time(s.upper()) - - self.assertTrue(t == t2 == t3 == test_t, - "'%s' => %s, %s, %s (%s)" % (s, t, t2, t3, test_t)) + self.assertEqual(http2time(s), test_t, s) + self.assertEqual(http2time(s.lower()), test_t, s.lower()) + self.assertEqual(http2time(s.upper()), test_t, s.upper()) def test_http2time_garbage(self): from cookielib import http2time @@ -367,7 +365,7 @@ request = urllib2.Request(url) r = pol.domain_return_ok(domain, request) if ok: self.assertTrue(r) - else: self.assertTrue(not r) + else: self.assertFalse(r) def test_missing_value(self): from cookielib import MozillaCookieJar, lwp_cookie_str @@ -379,10 +377,10 @@ interact_netscape(c, "http://www.acme.com/", 'eggs') interact_netscape(c, "http://www.acme.com/", '"spam"; path=/foo/') cookie = c._cookies["www.acme.com"]["/"]["eggs"] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, "eggs") cookie = c._cookies["www.acme.com"]['/foo/']['"spam"'] - self.assertTrue(cookie.value is None) + self.assertIsNone(cookie.value) self.assertEqual(cookie.name, '"spam"') self.assertEqual(lwp_cookie_str(cookie), ( r'"spam"; path="/foo/"; domain="www.acme.com"; ' @@ -426,7 +424,7 @@ try: cookie = c._cookies["www.example.com"]["/"]["ni"] except KeyError: - self.assertTrue(version is None) # didn't expect a stored cookie + self.assertIsNone(version) # didn't expect a stored cookie else: self.assertEqual(cookie.version, version) # 2965 cookies are unaffected @@ -452,26 +450,26 @@ self.assertEqual(cookie.domain, ".acme.com") self.assertTrue(cookie.domain_specified) self.assertEqual(cookie.port, DEFAULT_HTTP_PORT) - self.assertTrue(not cookie.port_specified) + self.assertFalse(cookie.port_specified) # case is preserved - self.assertTrue(cookie.has_nonstandard_attr("blArgh") and - not cookie.has_nonstandard_attr("blargh")) + self.assertTrue(cookie.has_nonstandard_attr("blArgh")) + self.assertFalse(cookie.has_nonstandard_attr("blargh")) cookie = c._cookies["www.acme.com"]["/"]["ni"] self.assertEqual(cookie.domain, "www.acme.com") - self.assertTrue(not cookie.domain_specified) + self.assertFalse(cookie.domain_specified) self.assertEqual(cookie.port, "80,8080") self.assertTrue(cookie.port_specified) cookie = c._cookies["www.acme.com"]["/"]["nini"] - self.assertTrue(cookie.port is None) - self.assertTrue(not cookie.port_specified) + self.assertIsNone(cookie.port) + 
self.assertFalse(cookie.port_specified) # invalid expires should not cause cookie to be dropped foo = c._cookies["www.acme.com"]["/"]["foo"] spam = c._cookies["www.acme.com"]["/"]["foo"] - self.assertTrue(foo.expires is None) - self.assertTrue(spam.expires is None) + self.assertIsNone(foo.expires) + self.assertIsNone(spam.expires) def test_ns_parser_special_names(self): # names such as 'expires' are not special in first name=value pair @@ -655,12 +653,12 @@ from cookielib import is_HDN self.assertTrue(is_HDN("foo.bar.com")) self.assertTrue(is_HDN("1foo2.3bar4.5com")) - self.assertTrue(not is_HDN("192.168.1.1")) - self.assertTrue(not is_HDN("")) - self.assertTrue(not is_HDN(".")) - self.assertTrue(not is_HDN(".foo.bar.com")) - self.assertTrue(not is_HDN("..foo")) - self.assertTrue(not is_HDN("foo.")) + self.assertFalse(is_HDN("192.168.1.1")) + self.assertFalse(is_HDN("")) + self.assertFalse(is_HDN(".")) + self.assertFalse(is_HDN(".foo.bar.com")) + self.assertFalse(is_HDN("..foo")) + self.assertFalse(is_HDN("foo.")) def test_reach(self): from cookielib import reach @@ -676,39 +674,39 @@ def test_domain_match(self): from cookielib import domain_match, user_domain_match self.assertTrue(domain_match("192.168.1.1", "192.168.1.1")) - self.assertTrue(not domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(domain_match("192.168.1.1", ".168.1.1")) self.assertTrue(domain_match("x.y.com", "x.Y.com")) self.assertTrue(domain_match("x.y.com", ".Y.com")) - self.assertTrue(not domain_match("x.y.com", "Y.com")) + self.assertFalse(domain_match("x.y.com", "Y.com")) self.assertTrue(domain_match("a.b.c.com", ".c.com")) - self.assertTrue(not domain_match(".c.com", "a.b.c.com")) + self.assertFalse(domain_match(".c.com", "a.b.c.com")) self.assertTrue(domain_match("example.local", ".local")) - self.assertTrue(not domain_match("blah.blah", "")) - self.assertTrue(not domain_match("", ".rhubarb.rhubarb")) + self.assertFalse(domain_match("blah.blah", "")) + self.assertFalse(domain_match("", ".rhubarb.rhubarb")) self.assertTrue(domain_match("", "")) self.assertTrue(user_domain_match("acme.com", "acme.com")) - self.assertTrue(not user_domain_match("acme.com", ".acme.com")) + self.assertFalse(user_domain_match("acme.com", ".acme.com")) self.assertTrue(user_domain_match("rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("www.rhubarb.acme.com", ".acme.com")) self.assertTrue(user_domain_match("x.y.com", "x.Y.com")) self.assertTrue(user_domain_match("x.y.com", ".Y.com")) - self.assertTrue(not user_domain_match("x.y.com", "Y.com")) + self.assertFalse(user_domain_match("x.y.com", "Y.com")) self.assertTrue(user_domain_match("y.com", "Y.com")) - self.assertTrue(not user_domain_match(".y.com", "Y.com")) + self.assertFalse(user_domain_match(".y.com", "Y.com")) self.assertTrue(user_domain_match(".y.com", ".Y.com")) self.assertTrue(user_domain_match("x.y.com", ".com")) - self.assertTrue(not user_domain_match("x.y.com", "com")) - self.assertTrue(not user_domain_match("x.y.com", "m")) - self.assertTrue(not user_domain_match("x.y.com", ".m")) - self.assertTrue(not user_domain_match("x.y.com", "")) - self.assertTrue(not user_domain_match("x.y.com", ".")) + self.assertFalse(user_domain_match("x.y.com", "com")) + self.assertFalse(user_domain_match("x.y.com", "m")) + self.assertFalse(user_domain_match("x.y.com", ".m")) + self.assertFalse(user_domain_match("x.y.com", "")) + self.assertFalse(user_domain_match("x.y.com", ".")) self.assertTrue(user_domain_match("192.168.1.1", "192.168.1.1")) # not both HDNs, so must 
string-compare equal to match - self.assertTrue(not user_domain_match("192.168.1.1", ".168.1.1")) - self.assertTrue(not user_domain_match("192.168.1.1", ".")) + self.assertFalse(user_domain_match("192.168.1.1", ".168.1.1")) + self.assertFalse(user_domain_match("192.168.1.1", ".")) # empty string is a special case - self.assertTrue(not user_domain_match("192.168.1.1", "")) + self.assertFalse(user_domain_match("192.168.1.1", "")) def test_wrong_domain(self): # Cookies whose effective request-host name does not domain-match the @@ -865,7 +863,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_domain_block(self): from cookielib import CookieJar, DefaultCookiePolicy @@ -892,8 +890,8 @@ self.assertEqual(len(c), 1) req = Request("http://www.roadrunner.net/") c.add_cookie_header(req) - self.assertTrue((req.has_header("Cookie") and - req.has_header("Cookie2"))) + self.assertTrue(req.has_header("Cookie")) + self.assertTrue(req.has_header("Cookie2")) c.clear() pol.set_blocked_domains([".acme.com"]) @@ -908,7 +906,7 @@ self.assertEqual(len(c), 2) # ... and check is doesn't get returned c.add_cookie_header(req) - self.assertTrue(not req.has_header("Cookie")) + self.assertFalse(req.has_header("Cookie")) def test_secure(self): from cookielib import CookieJar, DefaultCookiePolicy @@ -928,8 +926,8 @@ url = "http://www.acme.com/" int(c, url, "foo1=bar%s%s" % (vs, whitespace)) int(c, url, "foo2=bar%s; secure%s" % (vs, whitespace)) - self.assertTrue( - not c._cookies["www.acme.com"]["/"]["foo1"].secure, + self.assertFalse( + c._cookies["www.acme.com"]["/"]["foo1"].secure, "non-secure cookie registered secure") self.assertTrue( c._cookies["www.acme.com"]["/"]["foo2"].secure, @@ -1011,8 +1009,8 @@ url = "http://foo.bar.com/" interact_2965(c, url, "spam=eggs; Version=1; Port") h = interact_2965(c, url) - self.assertTrue(re.search("\$Port([^=]|$)", h), - "port with no value not returned with no value") + self.assertRegexpMatches(h, "\$Port([^=]|$)", + "port with no value not returned with no value") c = CookieJar(pol) url = "http://foo.bar.com/" @@ -1038,8 +1036,7 @@ 'Comment="does anybody read these?"; ' 'CommentURL="http://foo.bar.net/comment.html"') h = interact_2965(c, url) - self.assertTrue( - "Comment" not in h, + self.assertNotIn("Comment", h, "Comment or CommentURL cookie-attributes returned to server") def test_Cookie_iterator(self): @@ -1128,7 +1125,7 @@ headers = ["Set-Cookie: c=foo; expires=Foo Bar 12 33:22:11 2000"] c = cookiejar_from_cookie_headers(headers) cookie = c._cookies["www.example.com"]["/"]["c"] - self.assertTrue(cookie.expires is None) + self.assertIsNone(cookie.expires) class LWPCookieTests(TestCase): @@ -1278,9 +1275,9 @@ req = Request("http://www.acme.com/ammo") c.add_cookie_header(req) - self.assertTrue(re.search(r"PART_NUMBER=RIDING_ROCKET_0023;\s*" - "PART_NUMBER=ROCKET_LAUNCHER_0001", - req.get_header("Cookie"))) + self.assertRegexpMatches(req.get_header("Cookie"), + r"PART_NUMBER=RIDING_ROCKET_0023;\s*" + "PART_NUMBER=ROCKET_LAUNCHER_0001") def test_ietf_example_1(self): from cookielib import CookieJar, DefaultCookiePolicy @@ -1314,7 +1311,7 @@ cookie = interact_2965( c, 'http://www.acme.com/acme/login', 'Customer="WILE_E_COYOTE"; Version="1"; Path="/acme"') - self.assertTrue(not cookie) + self.assertFalse(cookie) # # 3. 
User Agent -> Server @@ -1336,9 +1333,8 @@ cookie = interact_2965(c, 'http://www.acme.com/acme/pickitem', 'Part_Number="Rocket_Launcher_0001"; ' 'Version="1"; Path="/acme"'); - self.assertTrue(re.search( - r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$', - cookie)) + self.assertRegexpMatches(cookie, + r'^\$Version="?1"?; Customer="?WILE_E_COYOTE"?; \$Path="/acme"$') # # 5. User Agent -> Server @@ -1361,11 +1357,11 @@ cookie = interact_2965(c, "http://www.acme.com/acme/shipping", 'Shipping="FedEx"; Version="1"; Path="/acme"') - self.assertTrue(re.search(r'^\$Version="?1"?;', cookie)) - self.assertTrue(re.search(r'Part_Number="?Rocket_Launcher_0001"?;' - '\s*\$Path="\/acme"', cookie)) - self.assertTrue(re.search(r'Customer="?WILE_E_COYOTE"?;\s*\$Path="\/acme"', - cookie)) + self.assertRegexpMatches(cookie, r'^\$Version="?1"?;') + self.assertRegexpMatches(cookie, + r'Part_Number="?Rocket_Launcher_0001"?;\s*\$Path="\/acme"') + self.assertRegexpMatches(cookie, + r'Customer="?WILE_E_COYOTE"?;\s*\$Path="\/acme"') # # 7. User Agent -> Server @@ -1386,9 +1382,9 @@ # Transaction is complete. cookie = interact_2965(c, "http://www.acme.com/acme/process") - self.assertTrue( - re.search(r'Shipping="?FedEx"?;\s*\$Path="\/acme"', cookie) and - "WILE_E_COYOTE" in cookie) + self.assertRegexpMatches(cookie, + r'Shipping="?FedEx"?;\s*\$Path="\/acme"') + self.assertIn("WILE_E_COYOTE", cookie) # # The user agent makes a series of requests on the origin server, after @@ -1437,8 +1433,8 @@ # than once. cookie = interact_2965(c, "http://www.acme.com/acme/ammo/...") - self.assertTrue( - re.search(r"Riding_Rocket_0023.*Rocket_Launcher_0001", cookie)) + self.assertRegexpMatches(cookie, + r"Riding_Rocket_0023.*Rocket_Launcher_0001") # A subsequent request by the user agent to the (same) server for a URL of # the form /acme/parts/ would include the following request header: @@ -1466,7 +1462,7 @@ # illegal domain (no embedded dots) cookie = interact_2965(c, "http://www.acme.com", 'foo=bar; domain=".com"; version=1') - self.assertTrue(not c) + self.assertFalse(c) # legal domain cookie = interact_2965(c, "http://www.acme.com", @@ -1559,11 +1555,12 @@ c, "http://www.acme.com/foo%2f%25/<<%0anew?/???", 'bar=baz; path="/foo/"; version=1'); version_re = re.compile(r'^\$version=\"?1\"?', re.I) - self.assertTrue("foo=bar" in cookie and version_re.search(cookie)) + self.assertIn("foo=bar", cookie) + self.assertRegexpMatches(cookie, version_re) cookie = interact_2965( c, "http://www.acme.com/foo/%25/<<%0anew?/???") - self.assertTrue(not cookie) + self.assertFalse(cookie) # unicode URL doesn't raise exception cookie = interact_2965(c, u"http://www.acme.com/\xfc") @@ -1740,13 +1737,12 @@ key = "%s_after" % cookie.value counter[key] = counter[key] + 1 - self.assertTrue(not ( - # a permanent cookie got lost accidentally - counter["perm_after"] != counter["perm_before"] or + # a permanent cookie got lost accidently + self.assertEqual(counter["perm_after"], counter["perm_before"]) # a session cookie hasn't been cleared - counter["session_after"] != 0 or + self.assertEqual(counter["session_after"], 0) # we didn't have session cookies in the first place - counter["session_before"] == 0)) + self.assertNotEqual(counter["session_before"], 0) def test_main(verbose=None): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 14:24:31 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 14:24:31 +0100 (CET) Subject: [Python-checkins] 
=?utf-8?q?cpython=3A_Issue_=2319565=3A_Prevent_?= =?utf-8?q?warnings_at_shutdown_about_pending_overlapped_ops=2E?= Message-ID: <3dMvCg1Xk3zR28@mail.python.org> http://hg.python.org/cpython/rev/da10196b94f4 changeset: 87201:da10196b94f4 parent: 87199:c1f2b3fc965d user: Richard Oudkerk date: Sun Nov 17 13:15:51 2013 +0000 summary: Issue #19565: Prevent warnings at shutdown about pending overlapped ops. files: Modules/_winapi.c | 36 +++++++++++++++++++++++++++------- 1 files changed, 28 insertions(+), 8 deletions(-) diff --git a/Modules/_winapi.c b/Modules/_winapi.c --- a/Modules/_winapi.c +++ b/Modules/_winapi.c @@ -107,17 +107,37 @@ { DWORD bytes; int err = GetLastError(); + if (self->pending) { - /* make it a programming error to deallocate while operation - is pending, even if we can safely cancel it */ if (check_CancelIoEx() && - Py_CancelIoEx(self->handle, &self->overlapped)) - GetOverlappedResult(self->handle, &self->overlapped, &bytes, TRUE); - PyErr_SetString(PyExc_RuntimeError, - "I/O operations still in flight while destroying " - "Overlapped object, the process may crash"); - PyErr_WriteUnraisable(NULL); + Py_CancelIoEx(self->handle, &self->overlapped) && + GetOverlappedResult(self->handle, &self->overlapped, &bytes, TRUE)) + { + /* The operation is no longer pending -- nothing to do. */ + } + else if (_Py_Finalizing == NULL) + { + /* The operation is still pending -- give a warning. This + will probably only happen on Windows XP. */ + PyErr_SetString(PyExc_RuntimeError, + "I/O operations still in flight while destroying " + "Overlapped object, the process may crash"); + PyErr_WriteUnraisable(NULL); + } + else + { + /* The operation is still pending, but the process is + probably about to exit, so we need not worry too much + about memory leaks. Leaking self prevents a potential + crash. This can happen when a daemon thread is cleaned + up at exit -- see #19565. We only expect to get here + on Windows XP. */ + CloseHandle(self->overlapped.hEvent); + SetLastError(err); + return; + } } + CloseHandle(self->overlapped.hEvent); SetLastError(err); if (self->write_buffer.obj) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 15:36:13 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 15:36:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTA4?= =?utf-8?q?=3A_direct_the_user_to_read_the_security_considerations_for_the?= =?utf-8?q?_ssl?= Message-ID: <3dMwpP5Kmgz7Lny@mail.python.org> http://hg.python.org/cpython/rev/f86fdaf529ea changeset: 87202:f86fdaf529ea branch: 3.3 parent: 87198:9444ee6864b3 user: Antoine Pitrou date: Sun Nov 17 15:35:33 2013 +0100 summary: Issue #19508: direct the user to read the security considerations for the ssl module files: Doc/library/ssl.rst | 19 ++++++++++++++----- 1 files changed, 14 insertions(+), 5 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -29,12 +29,10 @@ cause variations in behavior. .. warning:: + Don't use this module without reading the :ref:`ssl-security`. Doing so + may lead to a false sense of security, as the default settings of the + ssl module are not necessarily appropriate for your application. - OpenSSL's internal random number generator does not properly handle fork. - Applications must change the PRNG state of the parent process if they use - any SSL feature with :func:`os.fork`. 
Any successful call of - :func:`~ssl.RAND_add`, :func:`~ssl.RAND_bytes` or - :func:`~ssl.RAND_pseudo_bytes` is sufficient. This section documents the objects and functions in the ``ssl`` module; for more general information about TLS, SSL, and certificates, the reader is referred to @@ -1314,6 +1312,17 @@ If you want to check which ciphers are enabled by a given cipher list, use the ``openssl ciphers`` command on your system. +Multi-processing +^^^^^^^^^^^^^^^^ + +If using this module as part of a multi-processed application (using, +for example the :mod:`multiprocessing` or :mod:`concurrent.futures` modules), +be aware that OpenSSL's internal random number generator does not properly +handle forked processes. Applications must change the PRNG state of the +parent process if they use any SSL feature with :func:`os.fork`. Any +successful call of :func:`~ssl.RAND_add`, :func:`~ssl.RAND_bytes` or +:func:`~ssl.RAND_pseudo_bytes` is sufficient. + .. seealso:: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 15:36:14 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 15:36:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319508=3A_direct_the_user_to_read_the_security_c?= =?utf-8?q?onsiderations_for_the_ssl?= Message-ID: <3dMwpQ74N2z7LvZ@mail.python.org> http://hg.python.org/cpython/rev/18d95780100e changeset: 87203:18d95780100e parent: 87201:da10196b94f4 parent: 87202:f86fdaf529ea user: Antoine Pitrou date: Sun Nov 17 15:36:03 2013 +0100 summary: Issue #19508: direct the user to read the security considerations for the ssl module files: Doc/library/ssl.rst | 19 ++++++++++++++----- 1 files changed, 14 insertions(+), 5 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -30,12 +30,10 @@ openssl version 1.0.1. .. warning:: + Don't use this module without reading the :ref:`ssl-security`. Doing so + may lead to a false sense of security, as the default settings of the + ssl module are not necessarily appropriate for your application. - OpenSSL's internal random number generator does not properly handle fork. - Applications must change the PRNG state of the parent process if they use - any SSL feature with :func:`os.fork`. Any successful call of - :func:`~ssl.RAND_add`, :func:`~ssl.RAND_bytes` or - :func:`~ssl.RAND_pseudo_bytes` is sufficient. This section documents the objects and functions in the ``ssl`` module; for more general information about TLS, SSL, and certificates, the reader is referred to @@ -1480,6 +1478,17 @@ If you want to check which ciphers are enabled by a given cipher list, use the ``openssl ciphers`` command on your system. +Multi-processing +^^^^^^^^^^^^^^^^ + +If using this module as part of a multi-processed application (using, +for example the :mod:`multiprocessing` or :mod:`concurrent.futures` modules), +be aware that OpenSSL's internal random number generator does not properly +handle forked processes. Applications must change the PRNG state of the +parent process if they use any SSL feature with :func:`os.fork`. Any +successful call of :func:`~ssl.RAND_add`, :func:`~ssl.RAND_bytes` or +:func:`~ssl.RAND_pseudo_bytes` is sufficient. + .. 
seealso:: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 15:43:05 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 15:43:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTA4?= =?utf-8?q?=3A_warn_that_ssl_doesn=27t_validate_certificates_by_default?= Message-ID: <3dMwyK0sPfz7Ljs@mail.python.org> http://hg.python.org/cpython/rev/a197b3c3b2c9 changeset: 87204:a197b3c3b2c9 branch: 2.7 parent: 87200:1479ba6bc511 user: Antoine Pitrou date: Sun Nov 17 15:42:58 2013 +0100 summary: Issue #19508: warn that ssl doesn't validate certificates by default files: Doc/library/ssl.rst | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -31,6 +31,10 @@ cause variations in behavior. .. warning:: + The ssl module won't validate certificates by default. When used in + client mode, this means you are vulnerable to man-in-the-middle attacks. + +.. warning:: OpenSSL's internal random number generator does not properly handle fork. Applications must change the PRNG state of the parent process if they use -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 16:33:10 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 16:33:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_The_repr=28=29_of_a_Windows_p?= =?utf-8?q?ath_now_uses_forward_slashes_=28Guido=29=2E?= Message-ID: <3dMy464WTgz7LjT@mail.python.org> http://hg.python.org/peps/rev/82a9a30d8e57 changeset: 5282:82a9a30d8e57 user: Antoine Pitrou date: Sun Nov 17 16:33:06 2013 +0100 summary: The repr() of a Windows path now uses forward slashes (Guido). Note that str() and bytes() still use backward slashes, which is the canonical path syntax. files: pep-0428.txt | 37 ++++++++++++++++++++++--------------- 1 files changed, 22 insertions(+), 15 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -243,8 +243,8 @@ * A POSIX path is absolute if it has a root. A Windows path is absolute if it has both a drive *and* a root. A Windows UNC path (e.g. - ``\\some\share\myfile.txt``) always has a drive and a root - (here, ``\\some\share`` and ``\``, respectively). + ``\\host\share\myfile.txt``) always has a drive and a root + (here, ``\\host\share`` and ``\``, respectively). * A path which has either a drive *or* a root is said to be anchored. Its anchor is the concatenation of the drive and root. 
Under POSIX, @@ -281,7 +281,7 @@ However, with Windows paths, the drive is retained as necessary:: >>> PureWindowsPath('c:/foo', '/Windows') - PureWindowsPath('c:\\Windows') + PureWindowsPath('c:/Windows') >>> PureWindowsPath('c:/foo', 'd:') PureWindowsPath('d:') @@ -303,7 +303,7 @@ notation):: >>> PureWindowsPath('//some/path') - PureWindowsPath('\\\\some\\path\\') + PureWindowsPath('//some/path/') On POSIX, they are collapsed except if there are exactly two leading slashes, which is a special case in the POSIX specification on `pathname resolution`_ @@ -343,12 +343,12 @@ 'c:/windows' To get the bytes representation (which might be useful under Unix systems), -call ``bytes()`` on it:: +call ``bytes()`` on it, which internally uses ``os.fsencode()``:: >>> bytes(p) b'/home/antoine/pathlib/setup.py' -To represent the path as a ``file`` URI, call the ``as_uri()`` method:: +To represent the path as a ``file:`` URI, call the ``as_uri()`` method:: >>> p = PurePosixPath('/etc/passwd') >>> p.as_uri() @@ -357,6 +357,13 @@ >>> p.as_uri() 'file:///c:/Windows' +The repr() of a path always uses forward slashes, even under Windows, for +readability and to remind users that forward slashes are ok:: + + >>> p = PureWindowsPath('c:/Windows') + >>> p + PureWindowsPath('c:/Windows') + Properties ---------- @@ -416,7 +423,7 @@ >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.with_name('setup.py') - PureWindowsPath('c:\\Downloads\\setup.py') + PureWindowsPath('c:/Downloads/setup.py') It fails with a ``ValueError`` if the path doesn't have an actual name:: @@ -426,7 +433,7 @@ File "", line 1, in File "pathlib.py", line 875, in with_name raise ValueError("%r has an empty name" % (self,)) - ValueError: PureWindowsPath('c:\\') has an empty name + ValueError: PureWindowsPath('c:/') has an empty name >>> p.name '' @@ -435,7 +442,7 @@ >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.with_suffix('.bz2') - PureWindowsPath('c:\\Downloads\\pathlib.tar.bz2') + PureWindowsPath('c:/Downloads/pathlib.tar.bz2') >>> p = PureWindowsPath('README') >>> p.with_suffix('.bz2') PureWindowsPath('README.bz2') @@ -491,11 +498,11 @@ The ``parent()`` method returns an ancestor of the path:: >>> p.parent() - PureWindowsPath('c:\\python33\\bin') + PureWindowsPath('c:/python33/bin') >>> p.parent(2) - PureWindowsPath('c:\\python33') + PureWindowsPath('c:/python33') >>> p.parent(3) - PureWindowsPath('c:\\') + PureWindowsPath('c:/') The ``parents()`` method automates repeated invocations of ``parent()``, until the anchor is reached:: @@ -503,9 +510,9 @@ >>> p = PureWindowsPath('c:/python33/bin/python.exe') >>> for parent in p.parents(): parent ... 
- PureWindowsPath('c:\\python33\\bin') - PureWindowsPath('c:\\python33') - PureWindowsPath('c:\\') + PureWindowsPath('c:/python33/bin') + PureWindowsPath('c:/python33') + PureWindowsPath('c:/') Querying -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 17 16:47:05 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 16:47:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Explain_match=28=29_behaviour?= Message-ID: <3dMyN9475Qz7LwN@mail.python.org> http://hg.python.org/peps/rev/adf0827a781d changeset: 5283:adf0827a781d user: Antoine Pitrou date: Sun Nov 17 16:47:01 2013 +0100 summary: Explain match() behaviour files: pep-0428.txt | 20 ++++++++++++++++++-- 1 files changed, 18 insertions(+), 2 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -524,10 +524,26 @@ ``is_reserved()`` returns True if a Windows path is a reserved path such as ``CON`` or ``NUL``. It always returns False for POSIX paths. -``match()`` matches the path against a glob pattern:: +``match()`` matches the path against a glob pattern. It operates on +individual parts and matches from the right: - >>> PureWindowsPath('c:/PATHLIB/setup.py').match('c:*lib/*.PY') + >>> p = PurePosixPath('/usr/bin') + >>> p.match('/usr/b*') True + >>> p.match('usr/b*') + True + >>> p.match('b*') + True + >>> p.match('/u*') + False + +This behaviour respects the following expectations: + +- A simple pattern such as "\*.py" matches arbitrarily long paths as long + as the last part matches, e.g. "/usr/foo/bar.py". + +- Longer patterns can be used as well for more complex matching, e.g. + "/usr/foo/\*.py" matches "/usr/foo/bar.py". ``normcase()`` returns a case-folded version of the path for NT paths:: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 17 16:49:13 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 16:49:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Point_out_that_resolve=28=29_?= =?utf-8?q?is_similar_to_realpath=28=29?= Message-ID: <3dMyQd06wpz7Lwm@mail.python.org> http://hg.python.org/peps/rev/889c6f8a498a changeset: 5284:889c6f8a498a user: Antoine Pitrou date: Sun Nov 17 16:49:07 2013 +0100 summary: Point out that resolve() is similar to realpath() files: pep-0428.txt | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -606,7 +606,8 @@ --------------- The ``resolve()`` method makes a path absolute, resolving any symlink on -the way. It is the only operation which will remove "``..``" path components. +the way (like the POSIX realpath() call). It is the only operation which +will remove "``..``" path components. Directory walking -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 17 18:04:47 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:04:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgMTY5OTg6?= =?utf-8?q?_Clarify_that_+=3D_on_a_shared_value_is_not_atomic=2E?= Message-ID: <3dN05q2RBqz7LkJ@mail.python.org> http://hg.python.org/cpython/rev/7aabbe919f55 changeset: 87205:7aabbe919f55 branch: 2.7 user: Richard Oudkerk date: Sun Nov 17 17:00:38 2013 +0000 summary: Issue 16998: Clarify that += on a shared value is not atomic. 
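To make the non-atomicity concrete, here is a minimal, self-contained sketch of the pattern the updated documentation describes (the worker function names and iteration counts are illustrative, not part of the changeset):

    import multiprocessing

    def unsafe_increment(counter, n):
        # Read-modify-write without the lock: interleaved processes can
        # overwrite each other's updates, so the final total may be short.
        for _ in range(n):
            counter.value += 1

    def safe_increment(counter, n):
        # get_lock() returns the recursive lock created for the Value
        # (lock=True is the default), making the increment atomic.
        for _ in range(n):
            with counter.get_lock():
                counter.value += 1

    if __name__ == '__main__':
        for target in (unsafe_increment, safe_increment):
            counter = multiprocessing.Value('i', 0)
            procs = [multiprocessing.Process(target=target, args=(counter, 10000))
                     for _ in range(4)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()
            print('%s -> %d' % (target.__name__, counter.value))

With four processes the safe variant reliably ends at 40000, while the unsafe variant may come up short.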
files: Doc/library/multiprocessing.rst | 24 +++++++++++++++----- 1 files changed, 18 insertions(+), 6 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -960,12 +960,24 @@ ctypes type or a one character typecode of the kind used by the :mod:`array` module. *\*args* is passed on to the constructor for the type. - If *lock* is ``True`` (the default) then a new lock object is created to - synchronize access to the value. If *lock* is a :class:`Lock` or - :class:`RLock` object then that will be used to synchronize access to the - value. If *lock* is ``False`` then access to the returned object will not be - automatically protected by a lock, so it will not necessarily be - "process-safe". + If *lock* is ``True`` (the default) then a new recursive lock + object is created to synchronize access to the value. If *lock* is + a :class:`Lock` or :class:`RLock` object then that will be used to + synchronize access to the value. If *lock* is ``False`` then + access to the returned object will not be automatically protected + by a lock, so it will not necessarily be "process-safe". + + Operations like ``+=`` which involve a read and write are not + atomic. So if, for instance, you want to atomically increment a + shared value it is insufficient to just do :: + + counter.value += 1 + + Assuming the associated lock is recursive (which it is by default) + you can instead do :: + + with counter.get_lock(): + counter.value += 1 Note that *lock* is a keyword-only argument. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:04:48 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:04:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgMTY5OTg6?= =?utf-8?q?_Clarify_that_+=3D_on_a_shared_value_is_not_atomic=2E?= Message-ID: <3dN05r4MX6z7Ll7@mail.python.org> http://hg.python.org/cpython/rev/11cafbe6519f changeset: 87206:11cafbe6519f branch: 3.3 parent: 87202:f86fdaf529ea user: Richard Oudkerk date: Sun Nov 17 17:00:38 2013 +0000 summary: Issue 16998: Clarify that += on a shared value is not atomic. files: Doc/library/multiprocessing.rst | 24 +++++++++++++++----- 1 files changed, 18 insertions(+), 6 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -1006,12 +1006,24 @@ ctypes type or a one character typecode of the kind used by the :mod:`array` module. *\*args* is passed on to the constructor for the type. - If *lock* is ``True`` (the default) then a new lock object is created to - synchronize access to the value. If *lock* is a :class:`Lock` or - :class:`RLock` object then that will be used to synchronize access to the - value. If *lock* is ``False`` then access to the returned object will not be - automatically protected by a lock, so it will not necessarily be - "process-safe". + If *lock* is ``True`` (the default) then a new recursive lock + object is created to synchronize access to the value. If *lock* is + a :class:`Lock` or :class:`RLock` object then that will be used to + synchronize access to the value. If *lock* is ``False`` then + access to the returned object will not be automatically protected + by a lock, so it will not necessarily be "process-safe". + + Operations like ``+=`` which involve a read and write are not + atomic. 
So if, for instance, you want to atomically increment a + shared value it is insufficient to just do :: + + counter.value += 1 + + Assuming the associated lock is recursive (which it is by default) + you can instead do :: + + with counter.get_lock(): + counter.value += 1 Note that *lock* is a keyword-only argument. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:04:49 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:04:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dN05s656Nz7LlT@mail.python.org> http://hg.python.org/cpython/rev/8a8556947dd0 changeset: 87207:8a8556947dd0 parent: 87203:18d95780100e parent: 87206:11cafbe6519f user: Richard Oudkerk date: Sun Nov 17 17:03:19 2013 +0000 summary: Merge. files: Doc/library/multiprocessing.rst | 24 +++++++++++++++----- 1 files changed, 18 insertions(+), 6 deletions(-) diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -1172,12 +1172,24 @@ ctypes type or a one character typecode of the kind used by the :mod:`array` module. *\*args* is passed on to the constructor for the type. - If *lock* is ``True`` (the default) then a new lock object is created to - synchronize access to the value. If *lock* is a :class:`Lock` or - :class:`RLock` object then that will be used to synchronize access to the - value. If *lock* is ``False`` then access to the returned object will not be - automatically protected by a lock, so it will not necessarily be - "process-safe". + If *lock* is ``True`` (the default) then a new recursive lock + object is created to synchronize access to the value. If *lock* is + a :class:`Lock` or :class:`RLock` object then that will be used to + synchronize access to the value. If *lock* is ``False`` then + access to the returned object will not be automatically protected + by a lock, so it will not necessarily be "process-safe". + + Operations like ``+=`` which involve a read and write are not + atomic. So if, for instance, you want to atomically increment a + shared value it is insufficient to just do :: + + counter.value += 1 + + Assuming the associated lock is recursive (which it is by default) + you can instead do :: + + with counter.get_lock(): + counter.value += 1 Note that *lock* is a keyword-only argument. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:37:26 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:37:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_handling_o?= =?utf-8?q?f_SystemExit_and_exit_code=2E__Patch_by_Brodie_Rao=2E?= Message-ID: <3dN0qV1V4mz7LwD@mail.python.org> http://hg.python.org/cpython/rev/0fa82a06194c changeset: 87208:0fa82a06194c branch: 2.7 parent: 87205:7aabbe919f55 user: Richard Oudkerk date: Sun Nov 17 17:24:11 2013 +0000 summary: Fix handling of SystemExit and exit code. Patch by Brodie Rao. 
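A hedged sketch of the behaviour being fixed, mirroring the updated test (the child function below is illustrative): with this patch, sys.exit() with a non-integer argument writes that argument to stderr and the child's exit code is 1; an integer argument is still passed through unchanged.

    import sys
    import multiprocessing

    def child(arg):
        sys.exit(arg)

    if __name__ == '__main__':
        for arg, expected in ((3, 3), ([1, 2, 3], 1), ('ignore this', 1)):
            p = multiprocessing.Process(target=child, args=(arg,))
            p.start()
            p.join()
            # Before the fix a str argument produced exit code 0.
            assert p.exitcode == expected, (arg, p.exitcode)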
files: Lib/multiprocessing/process.py | 2 +- Lib/test/test_multiprocessing.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/multiprocessing/process.py b/Lib/multiprocessing/process.py --- a/Lib/multiprocessing/process.py +++ b/Lib/multiprocessing/process.py @@ -267,7 +267,7 @@ else: sys.stderr.write(str(e.args[0]) + '\n') sys.stderr.flush() - exitcode = 0 if isinstance(e.args[0], str) else 1 + exitcode = 1 except: exitcode = 1 import traceback diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -339,7 +339,7 @@ testfn = test_support.TESTFN self.addCleanup(test_support.unlink, testfn) - for reason, code in (([1, 2, 3], 1), ('ignore this', 0)): + for reason, code in (([1, 2, 3], 1), ('ignore this', 1)): p = self.Process(target=self._test_sys_exit, args=(reason, testfn)) p.daemon = True p.start() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:37:27 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:37:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_handling_o?= =?utf-8?q?f_SystemExit_and_exit_code=2E__Patch_by_Brodie_Rao=2E?= Message-ID: <3dN0qW47pTz7Ljs@mail.python.org> http://hg.python.org/cpython/rev/44b5ec2f0f5d changeset: 87209:44b5ec2f0f5d branch: 3.3 parent: 87206:11cafbe6519f user: Richard Oudkerk date: Sun Nov 17 17:24:11 2013 +0000 summary: Fix handling of SystemExit and exit code. Patch by Brodie Rao. files: Lib/multiprocessing/process.py | 2 +- Lib/test/test_multiprocessing.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/multiprocessing/process.py b/Lib/multiprocessing/process.py --- a/Lib/multiprocessing/process.py +++ b/Lib/multiprocessing/process.py @@ -266,7 +266,7 @@ exitcode = e.args[0] else: sys.stderr.write(str(e.args[0]) + '\n') - exitcode = 0 if isinstance(e.args[0], str) else 1 + exitcode = 1 except: exitcode = 1 import traceback diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -467,7 +467,7 @@ testfn = test.support.TESTFN self.addCleanup(test.support.unlink, testfn) - for reason, code in (([1, 2, 3], 1), ('ignore this', 0)): + for reason, code in (([1, 2, 3], 1), ('ignore this', 1)): p = self.Process(target=self._test_sys_exit, args=(reason, testfn)) p.daemon = True p.start() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:37:28 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:37:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dN0qX6yHrz7LxM@mail.python.org> http://hg.python.org/cpython/rev/9779267262d2 changeset: 87210:9779267262d2 parent: 87207:8a8556947dd0 parent: 87209:44b5ec2f0f5d user: Richard Oudkerk date: Sun Nov 17 17:30:54 2013 +0000 summary: Merge. 
files: Lib/multiprocessing/process.py | 2 +- Lib/test/_test_multiprocessing.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/multiprocessing/process.py b/Lib/multiprocessing/process.py --- a/Lib/multiprocessing/process.py +++ b/Lib/multiprocessing/process.py @@ -262,7 +262,7 @@ exitcode = e.args[0] else: sys.stderr.write(str(e.args[0]) + '\n') - exitcode = 0 if isinstance(e.args[0], str) else 1 + exitcode = 1 except: exitcode = 1 import traceback diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -474,7 +474,7 @@ testfn = test.support.TESTFN self.addCleanup(test.support.unlink, testfn) - for reason, code in (([1, 2, 3], 1), ('ignore this', 0)): + for reason, code in (([1, 2, 3], 1), ('ignore this', 1)): p = self.Process(target=self._test_sys_exit, args=(reason, testfn)) p.daemon = True p.start() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:48:41 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:48:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTk5?= =?utf-8?q?=3A_Increase_sleep_period=2E?= Message-ID: <3dN14T6LpMz7Lwd@mail.python.org> http://hg.python.org/cpython/rev/ba7d53b5f3de changeset: 87211:ba7d53b5f3de branch: 2.7 parent: 87208:0fa82a06194c user: Richard Oudkerk date: Sun Nov 17 17:45:16 2013 +0000 summary: Issue #19599: Increase sleep period. files: Lib/test/test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -1140,7 +1140,7 @@ self.assertTimingAlmostEqual(get.elapsed, TIMEOUT1) def test_async_timeout(self): - res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 0.2)) + res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 1.0)) get = TimingWrapper(res.get) self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:48:43 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:48:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTk5?= =?utf-8?q?=3A_Increase_sleep_period=2E?= Message-ID: <3dN14W16d0z7Ltg@mail.python.org> http://hg.python.org/cpython/rev/7d6a9e7060c7 changeset: 87212:7d6a9e7060c7 branch: 3.3 parent: 87209:44b5ec2f0f5d user: Richard Oudkerk date: Sun Nov 17 17:45:16 2013 +0000 summary: Issue #19599: Increase sleep period. 
files: Lib/test/test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -1714,7 +1714,7 @@ self.assertTimingAlmostEqual(get.elapsed, TIMEOUT1) def test_async_timeout(self): - res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 0.2)) + res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 1.0)) get = TimingWrapper(res.get) self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 18:48:44 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 17 Nov 2013 18:48:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dN14X2qqSz7Lx0@mail.python.org> http://hg.python.org/cpython/rev/2ead6c9744c0 changeset: 87213:2ead6c9744c0 parent: 87210:9779267262d2 parent: 87212:7d6a9e7060c7 user: Richard Oudkerk date: Sun Nov 17 17:47:00 2013 +0000 summary: Merge. files: Lib/test/_test_multiprocessing.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -1721,7 +1721,7 @@ self.assertTimingAlmostEqual(get.elapsed, TIMEOUT1) def test_async_timeout(self): - res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 0.2)) + res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 1.0)) get = TimingWrapper(res.get) self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 20:04:55 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 17 Nov 2013 20:04:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319448=3A_Add_priv?= =?utf-8?q?ate_API_to_SSL_module_to_lookup_ASN=2E1_objects_by_OID=2C?= Message-ID: <3dN2mR2sRnz7LkY@mail.python.org> http://hg.python.org/cpython/rev/f43f65038e2a changeset: 87214:f43f65038e2a user: Christian Heimes date: Sun Nov 17 19:59:14 2013 +0100 summary: Issue #19448: Add private API to SSL module to lookup ASN.1 objects by OID, NID, short name and long name. 
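Based on the accompanying test code, a short usage sketch of the lookup this change adds (the helpers are private, so treat this as illustrative rather than a supported API; it assumes an interpreter built with this changeset):

    import ssl

    # Look up an ASN.1 object by dotted OID (the default constructor),
    # by OpenSSL numeric ID, or by short/long name.
    obj = ssl._ASN1Object('1.3.6.1.5.5.7.3.1')
    print(obj.nid, obj.shortname, obj.longname, obj.oid)
    # -> 129 serverAuth TLS Web Server Authentication 1.3.6.1.5.5.7.3.1

    assert ssl._ASN1Object.fromnid(129) == obj
    assert ssl._ASN1Object.fromname('serverAuth') == obj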
files: Lib/ssl.py | 26 ++++++++- Lib/test/test_ssl.py | 38 +++++++++++++ Misc/NEWS | 3 + Modules/_ssl.c | 91 ++++++++++++++++++++++++++++++++ 4 files changed, 156 insertions(+), 2 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -91,7 +91,7 @@ import re import sys import os -import collections +from collections import namedtuple import _ssl # if we can't import it, let the error propagate @@ -102,6 +102,7 @@ SSLSyscallError, SSLEOFError, ) from _ssl import CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED +from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj from _ssl import RAND_status, RAND_egd, RAND_add, RAND_bytes, RAND_pseudo_bytes def _import_symbols(prefix): @@ -256,7 +257,7 @@ "subjectAltName fields were found") -DefaultVerifyPaths = collections.namedtuple("DefaultVerifyPaths", +DefaultVerifyPaths = namedtuple("DefaultVerifyPaths", "cafile capath openssl_cafile_env openssl_cafile openssl_capath_env " "openssl_capath") @@ -274,6 +275,27 @@ *parts) +class _ASN1Object(namedtuple("_ASN1Object", "nid shortname longname oid")): + """ASN.1 object identifier lookup + """ + __slots__ = () + + def __new__(cls, oid): + return super().__new__(cls, *_txt2obj(oid, name=False)) + + @classmethod + def fromnid(cls, nid): + """Create _ASN1Object from OpenSSL numeric ID + """ + return super().__new__(cls, *_nid2obj(nid)) + + @classmethod + def fromname(cls, name): + """Create _ASN1Object from short name, long name or OID + """ + return super().__new__(cls, *_txt2obj(name, name=True)) + + class SSLContext(_SSLContext): """An SSLContext holds various SSL-related configuration options and data, such as certificates and possibly a private key.""" diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -539,6 +539,44 @@ self.assertIsInstance(ca[0][0], bytes) self.assertIsInstance(ca[0][1], int) + def test_asn1object(self): + expected = (129, 'serverAuth', 'TLS Web Server Authentication', + '1.3.6.1.5.5.7.3.1') + + val = ssl._ASN1Object('1.3.6.1.5.5.7.3.1') + self.assertEqual(val, expected) + self.assertEqual(val.nid, 129) + self.assertEqual(val.shortname, 'serverAuth') + self.assertEqual(val.longname, 'TLS Web Server Authentication') + self.assertEqual(val.oid, '1.3.6.1.5.5.7.3.1') + self.assertIsInstance(val, ssl._ASN1Object) + self.assertRaises(ValueError, ssl._ASN1Object, 'serverAuth') + + val = ssl._ASN1Object.fromnid(129) + self.assertEqual(val, expected) + self.assertIsInstance(val, ssl._ASN1Object) + self.assertRaises(ValueError, ssl._ASN1Object.fromnid, -1) + self.assertRaises(ValueError, ssl._ASN1Object.fromnid, 100000) + for i in range(1000): + try: + obj = ssl._ASN1Object.fromnid(i) + except ValueError: + pass + else: + self.assertIsInstance(obj.nid, int) + self.assertIsInstance(obj.shortname, str) + self.assertIsInstance(obj.longname, str) + self.assertIsInstance(obj.oid, (str, type(None))) + + val = ssl._ASN1Object.fromname('TLS Web Server Authentication') + self.assertEqual(val, expected) + self.assertIsInstance(val, ssl._ASN1Object) + self.assertEqual(ssl._ASN1Object.fromname('serverAuth'), expected) + self.assertEqual(ssl._ASN1Object.fromname('1.3.6.1.5.5.7.3.1'), + expected) + self.assertRaises(ValueError, ssl._ASN1Object.fromname, 'serverauth') + + class ContextTests(unittest.TestCase): @skip_if_broken_ubuntu_ssl diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,9 @@ Library ------- +- Issue #19448: Add private API to SSL module to lookup ASN.1 objects by OID, + NID, 
short name and long name. + - Issue #19282: dbm.open now supports the context manager protocol. (Inital patch by Claudiu Popa) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -2998,6 +2998,93 @@ return NULL; } +static PyObject* +asn1obj2py(ASN1_OBJECT *obj) +{ + int nid; + const char *ln, *sn; + char buf[100]; + int buflen; + + nid = OBJ_obj2nid(obj); + if (nid == NID_undef) { + PyErr_Format(PyExc_ValueError, "Unknown object"); + return NULL; + } + sn = OBJ_nid2sn(nid); + ln = OBJ_nid2ln(nid); + buflen = OBJ_obj2txt(buf, sizeof(buf), obj, 1); + if (buflen < 0) { + _setSSLError(NULL, 0, __FILE__, __LINE__); + return NULL; + } + if (buflen) { + return Py_BuildValue("isss#", nid, sn, ln, buf, buflen); + } else { + return Py_BuildValue("issO", nid, sn, ln, Py_None); + } +} + +PyDoc_STRVAR(PySSL_txt2obj_doc, +"txt2obj(txt, name=False) -> (nid, shortname, longname, oid)\n\ +\n\ +Lookup NID, short name, long name and OID of an ASN1_OBJECT. By default\n\ +objects are looked up by OID. With name=True short and long name are also\n\ +matched."); + +static PyObject* +PySSL_txt2obj(PyObject *self, PyObject *args, PyObject *kwds) +{ + char *kwlist[] = {"txt", "name", NULL}; + PyObject *result = NULL; + char *txt; + int name = 0; + ASN1_OBJECT *obj; + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|p:txt2obj", + kwlist, &txt, &name)) { + return NULL; + } + obj = OBJ_txt2obj(txt, name ? 0 : 1); + if (obj == NULL) { + PyErr_Format(PyExc_ValueError, "Unknown object"); + return NULL; + } + result = asn1obj2py(obj); + ASN1_OBJECT_free(obj); + return result; +} + +PyDoc_STRVAR(PySSL_nid2obj_doc, +"nid2obj(nid) -> (nid, shortname, longname, oid)\n\ +\n\ +Lookup NID, short name, long name and OID of an ASN1_OBJECT by NID."); + +static PyObject* +PySSL_nid2obj(PyObject *self, PyObject *args) +{ + PyObject *result = NULL; + int nid; + ASN1_OBJECT *obj; + + if (!PyArg_ParseTuple(args, "i:nid2obj", &nid)) { + return NULL; + } + if (nid < NID_undef) { + PyErr_Format(PyExc_ValueError, "NID must be positive."); + return NULL; + } + obj = OBJ_nid2obj(nid); + if (obj == NULL) { + PyErr_Format(PyExc_ValueError, "Unknown NID"); + return NULL; + } + result = asn1obj2py(obj); + ASN1_OBJECT_free(obj); + return result; +} + + #ifdef _MSC_VER PyDoc_STRVAR(PySSL_enum_cert_store_doc, "enum_cert_store(store_name, cert_type='certificate') -> []\n\ @@ -3145,6 +3232,10 @@ {"enum_cert_store", (PyCFunction)PySSL_enum_cert_store, METH_VARARGS | METH_KEYWORDS, PySSL_enum_cert_store_doc}, #endif + {"txt2obj", (PyCFunction)PySSL_txt2obj, + METH_VARARGS | METH_KEYWORDS, PySSL_txt2obj_doc}, + {"nid2obj", (PyCFunction)PySSL_nid2obj, + METH_VARARGS, PySSL_nid2obj_doc}, {NULL, NULL} /* Sentinel */ }; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 20:19:14 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 20:19:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Remove_normcase=28=29=2C_it?= =?utf-8?q?=27s_not_terribly_useful_and_its_presence_can_be_misleading?= Message-ID: <3dN34y21sNz7LmY@mail.python.org> http://hg.python.org/peps/rev/9c60d5d8020b changeset: 5285:9c60d5d8020b user: Antoine Pitrou date: Sun Nov 17 20:19:10 2013 +0100 summary: Remove normcase(), it's not terribly useful and its presence can be misleading files: pep-0428.txt | 7 ------- 1 files changed, 0 insertions(+), 7 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -545,13 +545,6 @@ - Longer patterns 
can be used as well for more complex matching, e.g. "/usr/foo/\*.py" matches "/usr/foo/bar.py". -``normcase()`` returns a case-folded version of the path for NT paths:: - - >>> PurePosixPath('CAPS').normcase() - PurePosixPath('CAPS') - >>> PureWindowsPath('CAPS').normcase() - PureWindowsPath('caps') - Concrete paths API ================== -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 17 21:09:38 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 21:09:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_indentatio?= =?utf-8?q?n_in_doc_example=2E?= Message-ID: <3dN4C65JHFz7LjS@mail.python.org> http://hg.python.org/cpython/rev/cc27c0aba18b changeset: 87215:cc27c0aba18b branch: 2.7 parent: 87211:ba7d53b5f3de user: Ezio Melotti date: Sun Nov 17 22:07:48 2013 +0200 summary: Fix indentation in doc example. files: Doc/tutorial/controlflow.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/tutorial/controlflow.rst b/Doc/tutorial/controlflow.rst --- a/Doc/tutorial/controlflow.rst +++ b/Doc/tutorial/controlflow.rst @@ -19,14 +19,14 @@ >>> x = int(raw_input("Please enter an integer: ")) Please enter an integer: 42 >>> if x < 0: - ... x = 0 - ... print 'Negative changed to zero' + ... x = 0 + ... print 'Negative changed to zero' ... elif x == 0: - ... print 'Zero' + ... print 'Zero' ... elif x == 1: - ... print 'Single' + ... print 'Single' ... else: - ... print 'More' + ... print 'More' ... More -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 21:09:39 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 21:09:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_indentatio?= =?utf-8?q?n_in_doc_example=2E?= Message-ID: <3dN4C773YJz7LjS@mail.python.org> http://hg.python.org/cpython/rev/17e7aedc9a5d changeset: 87216:17e7aedc9a5d branch: 3.3 parent: 87212:7d6a9e7060c7 user: Ezio Melotti date: Sun Nov 17 22:07:48 2013 +0200 summary: Fix indentation in doc example. files: Doc/tutorial/controlflow.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/tutorial/controlflow.rst b/Doc/tutorial/controlflow.rst --- a/Doc/tutorial/controlflow.rst +++ b/Doc/tutorial/controlflow.rst @@ -19,14 +19,14 @@ >>> x = int(input("Please enter an integer: ")) Please enter an integer: 42 >>> if x < 0: - ... x = 0 - ... print('Negative changed to zero') + ... x = 0 + ... print('Negative changed to zero') ... elif x == 0: - ... print('Zero') + ... print('Zero') ... elif x == 1: - ... print('Single') + ... print('Single') ... else: - ... print('More') + ... print('More') ... More -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 21:09:41 2013 From: python-checkins at python.org (ezio.melotti) Date: Sun, 17 Nov 2013 21:09:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_indentation_fix_in_doc_example_from_3=2E3=2E?= Message-ID: <3dN4C91kq2z7Lk4@mail.python.org> http://hg.python.org/cpython/rev/9b79eed3c16b changeset: 87217:9b79eed3c16b parent: 87214:f43f65038e2a parent: 87216:17e7aedc9a5d user: Ezio Melotti date: Sun Nov 17 22:09:24 2013 +0200 summary: Merge indentation fix in doc example from 3.3. 
files: Doc/tutorial/controlflow.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/tutorial/controlflow.rst b/Doc/tutorial/controlflow.rst --- a/Doc/tutorial/controlflow.rst +++ b/Doc/tutorial/controlflow.rst @@ -19,14 +19,14 @@ >>> x = int(input("Please enter an integer: ")) Please enter an integer: 42 >>> if x < 0: - ... x = 0 - ... print('Negative changed to zero') + ... x = 0 + ... print('Negative changed to zero') ... elif x == 0: - ... print('Zero') + ... print('Zero') ... elif x == 1: - ... print('Single') + ... print('Single') ... else: - ... print('More') + ... print('More') ... More -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 21:27:27 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 21:27:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_buildbot_f?= =?utf-8?q?ailure?= Message-ID: <3dN4bg0jhLz7LjS@mail.python.org> http://hg.python.org/cpython/rev/c44643df7d2a changeset: 87218:c44643df7d2a branch: 2.7 parent: 87215:cc27c0aba18b user: Antoine Pitrou date: Sun Nov 17 21:27:20 2013 +0100 summary: Fix buildbot failure files: Lib/lib-tk/test/test_ttk/test_extensions.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/lib-tk/test/test_ttk/test_extensions.py b/Lib/lib-tk/test/test_ttk/test_extensions.py --- a/Lib/lib-tk/test/test_ttk/test_extensions.py +++ b/Lib/lib-tk/test/test_ttk/test_extensions.py @@ -45,7 +45,7 @@ # it tries calling instance attributes not yet defined. ttk.LabeledScale(variable=myvar) if hasattr(sys, 'last_type'): - self.assertNotEqual(sys.last_type, tkinter.TclError) + self.assertNotEqual(sys.last_type, Tkinter.TclError) def test_initialization(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 22:39:40 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 22:39:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjAz?= =?utf-8?q?=3A_Use_specific_asserts_in_test=5Fdecr=2E?= Message-ID: <3dN6C06Rymz7LjS@mail.python.org> http://hg.python.org/cpython/rev/2407ecebcf7d changeset: 87219:2407ecebcf7d branch: 3.3 parent: 87216:17e7aedc9a5d user: Serhiy Storchaka date: Sun Nov 17 23:38:50 2013 +0200 summary: Issue #19603: Use specific asserts in test_decr. files: Lib/test/test_descr.py | 325 ++++++++++++++-------------- 1 files changed, 168 insertions(+), 157 deletions(-) diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -388,13 +388,21 @@ class ClassPropertiesAndMethods(unittest.TestCase): + def assertHasAttr(self, obj, name): + self.assertTrue(hasattr(obj, name), + '%r has no attribute %r' % (obj, name)) + + def assertNotHasAttr(self, obj, name): + self.assertFalse(hasattr(obj, name), + '%r has unexpected attribute %r' % (obj, name)) + def test_python_dicts(self): # Testing Python subclass of dict... 
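For context on why the specific assert methods are preferred, a tiny illustrative example (not taken from the patch): the specific forms report the offending value on failure, while assertTrue only reports that a false value was seen.

    import unittest

    class Example(unittest.TestCase):
        def test_generic(self):
            x = 2
            # On failure this merely reports "False is not true".
            self.assertTrue(x is None)

        def test_specific(self):
            x = 2
            # On failure this reports "2 is not None", which pinpoints the value.
            self.assertIsNone(x)

    if __name__ == '__main__':
        unittest.main()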
self.assertTrue(issubclass(dict, dict)) self.assertIsInstance({}, dict) d = dict() self.assertEqual(d, {}) - self.assertTrue(d.__class__ is dict) + self.assertIs(d.__class__, dict) self.assertIsInstance(d, dict) class C(dict): state = -1 @@ -572,7 +580,7 @@ def _set_x(self, x): self.__x = -x a = A() - self.assertTrue(not hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 12 self.assertEqual(a.x, 12) self.assertEqual(a._A__x, -12) @@ -998,14 +1006,14 @@ self.assertEqual(type(a), object) b = object() self.assertNotEqual(a, b) - self.assertFalse(hasattr(a, "foo")) + self.assertNotHasAttr(a, "foo") try: a.foo = 12 except (AttributeError, TypeError): pass else: self.fail("object() should not allow setting a foo attribute") - self.assertFalse(hasattr(object(), "__dict__")) + self.assertNotHasAttr(object(), "__dict__") class Cdict(object): pass @@ -1020,28 +1028,28 @@ class C0(object): __slots__ = [] x = C0() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "foo")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "foo") class C1(object): __slots__ = ['a'] x = C1() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "a") x.a = 1 self.assertEqual(x.a, 1) x.a = None self.assertEqual(x.a, None) del x.a - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "a") class C3(object): __slots__ = ['a', 'b', 'c'] x = C3() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, 'a')) - self.assertFalse(hasattr(x, 'b')) - self.assertFalse(hasattr(x, 'c')) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, 'a') + self.assertNotHasAttr(x, 'b') + self.assertNotHasAttr(x, 'c') x.a = 1 x.b = 2 x.c = 3 @@ -1057,8 +1065,8 @@ def get(self): return self.__a x = C4(5) - self.assertFalse(hasattr(x, '__dict__')) - self.assertFalse(hasattr(x, '__a')) + self.assertNotHasAttr(x, '__dict__') + self.assertNotHasAttr(x, '__a') self.assertEqual(x.get(), 5) try: x.__a = 6 @@ -1130,7 +1138,7 @@ x = C() x.foo = 5 self.assertEqual(x.foo, 5) - self.assertTrue(type(slots[0]) is str) + self.assertIs(type(slots[0]), str) # this used to leak references try: class C(object): @@ -1222,16 +1230,16 @@ class D(object): __slots__ = ["__dict__"] a = D() - self.assertTrue(hasattr(a, "__dict__")) - self.assertFalse(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertNotHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) class W(object): __slots__ = ["__weakref__"] a = W() - self.assertTrue(hasattr(a, "__weakref__")) - self.assertFalse(hasattr(a, "__dict__")) + self.assertHasAttr(a, "__weakref__") + self.assertNotHasAttr(a, "__dict__") try: a.foo = 42 except AttributeError: @@ -1242,16 +1250,16 @@ class C1(W, D): __slots__ = [] a = C1() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) class C2(D, W): __slots__ = [] a = C2() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) @@ -1289,7 +1297,7 @@ class C(object): pass a = C() - self.assertFalse(hasattr(a, "foobar")) + self.assertNotHasAttr(a, "foobar") C.foobar = 2 self.assertEqual(a.foobar, 2) C.method = lambda self: 42 @@ -1299,7 +1307,7 @@ 
C.__int__ = lambda self: 100 self.assertEqual(int(a), 100) self.assertEqual(a.foobar, 2) - self.assertFalse(hasattr(a, "spam")) + self.assertNotHasAttr(a, "spam") def mygetattr(self, name): if name == "spam": return "spam" @@ -1450,7 +1458,7 @@ self.assertEqual(cm.x, 42) self.assertEqual(cm.__dict__, {"x" : 42}) del cm.x - self.assertFalse(hasattr(cm, "x")) + self.assertNotHasAttr(cm, "x") @support.impl_detail("the module 'xxsubtype' is internal") def test_classmethods_in_c(self): @@ -1505,7 +1513,7 @@ self.assertEqual(sm.x, 42) self.assertEqual(sm.__dict__, {"x" : 42}) del sm.x - self.assertFalse(hasattr(sm, "x")) + self.assertNotHasAttr(sm, "x") @support.impl_detail("the module 'xxsubtype' is internal") def test_staticmethods_in_c(self): @@ -1575,7 +1583,7 @@ self.assertEqual(a.x, 10) self.assertEqual(a.x, 11) del a.x - self.assertEqual(hasattr(a, 'x'), 0) + self.assertNotHasAttr(a, 'x') def test_newslots(self): # Testing __new__ slot override... @@ -1845,17 +1853,17 @@ raise IndexError c1 = C() c2 = C() - self.assertTrue(not not c1) # What? + self.assertFalse(not c1) self.assertNotEqual(id(c1), id(c2)) hash(c1) hash(c2) self.assertEqual(c1, c1) self.assertTrue(c1 != c2) - self.assertTrue(not c1 != c1) - self.assertTrue(not c1 == c2) + self.assertFalse(c1 != c1) + self.assertFalse(c1 == c2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. - self.assertTrue(str(c1).find('C object at ') >= 0) + self.assertGreaterEqual(str(c1).find('C object at '), 0) self.assertEqual(str(c1), repr(c1)) self.assertNotIn(-1, c1) for i in range(10): @@ -1868,17 +1876,17 @@ raise IndexError d1 = D() d2 = D() - self.assertTrue(not not d1) + self.assertFalse(not d1) self.assertNotEqual(id(d1), id(d2)) hash(d1) hash(d2) self.assertEqual(d1, d1) self.assertNotEqual(d1, d2) - self.assertTrue(not d1 != d1) - self.assertTrue(not d1 == d2) + self.assertFalse(d1 != d1) + self.assertFalse(d1 == d2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. 
- self.assertTrue(str(d1).find('D object at ') >= 0) + self.assertGreaterEqual(str(d1).find('D object at '), 0) self.assertEqual(str(d1), repr(d1)) self.assertNotIn(-1, d1) for i in range(10): @@ -1914,11 +1922,11 @@ p1 = Proxy(1) p_1 = Proxy(-1) self.assertFalse(p0) - self.assertTrue(not not p1) + self.assertFalse(not p1) self.assertEqual(hash(p0), hash(0)) self.assertEqual(p0, p0) self.assertNotEqual(p0, p1) - self.assertTrue(not p0 != p0) + self.assertFalse(p0 != p0) self.assertEqual(not p0, p1) self.assertTrue(p0 < p1) self.assertTrue(p0 <= p1) @@ -1950,7 +1958,7 @@ try: weakref.ref(no) except TypeError as msg: - self.assertTrue(str(msg).find("weak reference") >= 0) + self.assertIn("weak reference", str(msg)) else: self.fail("weakref.ref(no) should be illegal") class Weak(object): @@ -1974,17 +1982,17 @@ del self.__x x = property(getx, setx, delx, doc="I'm the x property.") a = C() - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 42 self.assertEqual(a._C__x, 42) self.assertEqual(a.x, 42) del a.x - self.assertFalse(hasattr(a, "x")) - self.assertFalse(hasattr(a, "_C__x")) + self.assertNotHasAttr(a, "x") + self.assertNotHasAttr(a, "_C__x") C.x.__set__(a, 100) self.assertEqual(C.x.__get__(a), 100) C.x.__delete__(a) - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") raw = C.__dict__['x'] self.assertIsInstance(raw, property) @@ -1996,9 +2004,9 @@ self.assertIn("fdel", attrs) self.assertEqual(raw.__doc__, "I'm the x property.") - self.assertTrue(raw.fget is C.__dict__['getx']) - self.assertTrue(raw.fset is C.__dict__['setx']) - self.assertTrue(raw.fdel is C.__dict__['delx']) + self.assertIs(raw.fget, C.__dict__['getx']) + self.assertIs(raw.fset, C.__dict__['setx']) + self.assertIs(raw.fdel, C.__dict__['delx']) for attr in "__doc__", "fget", "fset", "fdel": try: @@ -2062,14 +2070,14 @@ del self._foo c = C() self.assertEqual(C.foo.__doc__, "hello") - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, "foo") c.foo = -42 - self.assertTrue(hasattr(c, '_foo')) + self.assertHasAttr(c, '_foo') self.assertEqual(c._foo, 42) self.assertEqual(c.foo, 42) del c.foo - self.assertFalse(hasattr(c, '_foo')) - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, '_foo') + self.assertNotHasAttr(c, "foo") class D(C): @C.foo.deleter @@ -2421,13 +2429,13 @@ a = hexint(12345) self.assertEqual(a, 12345) self.assertEqual(int(a), 12345) - self.assertTrue(int(a).__class__ is int) + self.assertIs(int(a).__class__, int) self.assertEqual(hash(a), hash(12345)) - self.assertTrue((+a).__class__ is int) - self.assertTrue((a >> 0).__class__ is int) - self.assertTrue((a << 0).__class__ is int) - self.assertTrue((hexint(0) << 12).__class__ is int) - self.assertTrue((hexint(0) >> 12).__class__ is int) + self.assertIs((+a).__class__, int) + self.assertIs((a >> 0).__class__, int) + self.assertIs((a << 0).__class__, int) + self.assertIs((hexint(0) << 12).__class__, int) + self.assertIs((hexint(0) >> 12).__class__, int) class octlong(int): __slots__ = [] @@ -2444,31 +2452,31 @@ self.assertEqual(a, 12345) self.assertEqual(int(a), 12345) self.assertEqual(hash(a), hash(12345)) - self.assertTrue(int(a).__class__ is int) - self.assertTrue((+a).__class__ is int) - self.assertTrue((-a).__class__ is int) - self.assertTrue((-octlong(0)).__class__ is int) - self.assertTrue((a >> 0).__class__ is int) - self.assertTrue((a << 0).__class__ is int) - self.assertTrue((a - 0).__class__ is int) - self.assertTrue((a * 1).__class__ is int) - self.assertTrue((a ** 1).__class__ is int) - 
self.assertTrue((a // 1).__class__ is int) - self.assertTrue((1 * a).__class__ is int) - self.assertTrue((a | 0).__class__ is int) - self.assertTrue((a ^ 0).__class__ is int) - self.assertTrue((a & -1).__class__ is int) - self.assertTrue((octlong(0) << 12).__class__ is int) - self.assertTrue((octlong(0) >> 12).__class__ is int) - self.assertTrue(abs(octlong(0)).__class__ is int) + self.assertIs(int(a).__class__, int) + self.assertIs((+a).__class__, int) + self.assertIs((-a).__class__, int) + self.assertIs((-octlong(0)).__class__, int) + self.assertIs((a >> 0).__class__, int) + self.assertIs((a << 0).__class__, int) + self.assertIs((a - 0).__class__, int) + self.assertIs((a * 1).__class__, int) + self.assertIs((a ** 1).__class__, int) + self.assertIs((a // 1).__class__, int) + self.assertIs((1 * a).__class__, int) + self.assertIs((a | 0).__class__, int) + self.assertIs((a ^ 0).__class__, int) + self.assertIs((a & -1).__class__, int) + self.assertIs((octlong(0) << 12).__class__, int) + self.assertIs((octlong(0) >> 12).__class__, int) + self.assertIs(abs(octlong(0)).__class__, int) # Because octlong overrides __add__, we can't check the absence of +0 # optimizations using octlong. class longclone(int): pass a = longclone(1) - self.assertTrue((a + 0).__class__ is int) - self.assertTrue((0 + a).__class__ is int) + self.assertIs((a + 0).__class__, int) + self.assertIs((0 + a).__class__, int) # Check that negative clones don't segfault a = longclone(-1) @@ -2485,9 +2493,9 @@ a = precfloat(12345) self.assertEqual(a, 12345.0) self.assertEqual(float(a), 12345.0) - self.assertTrue(float(a).__class__ is float) + self.assertIs(float(a).__class__, float) self.assertEqual(hash(a), hash(12345.0)) - self.assertTrue((+a).__class__ is float) + self.assertIs((+a).__class__, float) class madcomplex(complex): def __repr__(self): @@ -2535,20 +2543,20 @@ self.assertEqual(v, t) a = madtuple((1,2,3,4,5)) self.assertEqual(tuple(a), (1,2,3,4,5)) - self.assertTrue(tuple(a).__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) self.assertEqual(hash(a), hash((1,2,3,4,5))) - self.assertTrue(a[:].__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a + ()).__class__ is tuple) + self.assertIs(a[:].__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a + ()).__class__, tuple) a = madtuple(()) self.assertEqual(tuple(a), ()) - self.assertTrue(tuple(a).__class__ is tuple) - self.assertTrue((a + a).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 2).__class__ is tuple) - self.assertTrue(a[:].__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) + self.assertIs((a + a).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 2).__class__, tuple) + self.assertIs(a[:].__class__, tuple) class madstring(str): _rev = None @@ -2570,48 +2578,48 @@ self.assertEqual(u, s) s = madstring("12345") self.assertEqual(str(s), "12345") - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) base = "\x00" * 5 s = madstring(base) self.assertEqual(s, base) self.assertEqual(str(s), base) - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) self.assertEqual(hash(s), hash(base)) self.assertEqual({s: 1}[base], 1) self.assertEqual({base: 1}[s], 1) - self.assertTrue((s + "").__class__ is str) 
+ self.assertIs((s + "").__class__, str) self.assertEqual(s + "", base) - self.assertTrue(("" + s).__class__ is str) + self.assertIs(("" + s).__class__, str) self.assertEqual("" + s, base) - self.assertTrue((s * 0).__class__ is str) + self.assertIs((s * 0).__class__, str) self.assertEqual(s * 0, "") - self.assertTrue((s * 1).__class__ is str) + self.assertIs((s * 1).__class__, str) self.assertEqual(s * 1, base) - self.assertTrue((s * 2).__class__ is str) + self.assertIs((s * 2).__class__, str) self.assertEqual(s * 2, base + base) - self.assertTrue(s[:].__class__ is str) + self.assertIs(s[:].__class__, str) self.assertEqual(s[:], base) - self.assertTrue(s[0:0].__class__ is str) + self.assertIs(s[0:0].__class__, str) self.assertEqual(s[0:0], "") - self.assertTrue(s.strip().__class__ is str) + self.assertIs(s.strip().__class__, str) self.assertEqual(s.strip(), base) - self.assertTrue(s.lstrip().__class__ is str) + self.assertIs(s.lstrip().__class__, str) self.assertEqual(s.lstrip(), base) - self.assertTrue(s.rstrip().__class__ is str) + self.assertIs(s.rstrip().__class__, str) self.assertEqual(s.rstrip(), base) identitytab = {} - self.assertTrue(s.translate(identitytab).__class__ is str) + self.assertIs(s.translate(identitytab).__class__, str) self.assertEqual(s.translate(identitytab), base) - self.assertTrue(s.replace("x", "x").__class__ is str) + self.assertIs(s.replace("x", "x").__class__, str) self.assertEqual(s.replace("x", "x"), base) - self.assertTrue(s.ljust(len(s)).__class__ is str) + self.assertIs(s.ljust(len(s)).__class__, str) self.assertEqual(s.ljust(len(s)), base) - self.assertTrue(s.rjust(len(s)).__class__ is str) + self.assertIs(s.rjust(len(s)).__class__, str) self.assertEqual(s.rjust(len(s)), base) - self.assertTrue(s.center(len(s)).__class__ is str) + self.assertIs(s.center(len(s)).__class__, str) self.assertEqual(s.center(len(s)), base) - self.assertTrue(s.lower().__class__ is str) + self.assertIs(s.lower().__class__, str) self.assertEqual(s.lower(), base) class madunicode(str): @@ -2630,47 +2638,47 @@ base = "12345" u = madunicode(base) self.assertEqual(str(u), base) - self.assertTrue(str(u).__class__ is str) + self.assertIs(str(u).__class__, str) self.assertEqual(hash(u), hash(base)) self.assertEqual({u: 1}[base], 1) self.assertEqual({base: 1}[u], 1) - self.assertTrue(u.strip().__class__ is str) + self.assertIs(u.strip().__class__, str) self.assertEqual(u.strip(), base) - self.assertTrue(u.lstrip().__class__ is str) + self.assertIs(u.lstrip().__class__, str) self.assertEqual(u.lstrip(), base) - self.assertTrue(u.rstrip().__class__ is str) + self.assertIs(u.rstrip().__class__, str) self.assertEqual(u.rstrip(), base) - self.assertTrue(u.replace("x", "x").__class__ is str) + self.assertIs(u.replace("x", "x").__class__, str) self.assertEqual(u.replace("x", "x"), base) - self.assertTrue(u.replace("xy", "xy").__class__ is str) + self.assertIs(u.replace("xy", "xy").__class__, str) self.assertEqual(u.replace("xy", "xy"), base) - self.assertTrue(u.center(len(u)).__class__ is str) + self.assertIs(u.center(len(u)).__class__, str) self.assertEqual(u.center(len(u)), base) - self.assertTrue(u.ljust(len(u)).__class__ is str) + self.assertIs(u.ljust(len(u)).__class__, str) self.assertEqual(u.ljust(len(u)), base) - self.assertTrue(u.rjust(len(u)).__class__ is str) + self.assertIs(u.rjust(len(u)).__class__, str) self.assertEqual(u.rjust(len(u)), base) - self.assertTrue(u.lower().__class__ is str) + self.assertIs(u.lower().__class__, str) self.assertEqual(u.lower(), base) - 
self.assertTrue(u.upper().__class__ is str) + self.assertIs(u.upper().__class__, str) self.assertEqual(u.upper(), base) - self.assertTrue(u.capitalize().__class__ is str) + self.assertIs(u.capitalize().__class__, str) self.assertEqual(u.capitalize(), base) - self.assertTrue(u.title().__class__ is str) + self.assertIs(u.title().__class__, str) self.assertEqual(u.title(), base) - self.assertTrue((u + "").__class__ is str) + self.assertIs((u + "").__class__, str) self.assertEqual(u + "", base) - self.assertTrue(("" + u).__class__ is str) + self.assertIs(("" + u).__class__, str) self.assertEqual("" + u, base) - self.assertTrue((u * 0).__class__ is str) + self.assertIs((u * 0).__class__, str) self.assertEqual(u * 0, "") - self.assertTrue((u * 1).__class__ is str) + self.assertIs((u * 1).__class__, str) self.assertEqual(u * 1, base) - self.assertTrue((u * 2).__class__ is str) + self.assertIs((u * 2).__class__, str) self.assertEqual(u * 2, base + base) - self.assertTrue(u[:].__class__ is str) + self.assertIs(u[:].__class__, str) self.assertEqual(u[:], base) - self.assertTrue(u[0:0].__class__ is str) + self.assertIs(u[0:0].__class__, str) self.assertEqual(u[0:0], "") class sublist(list): @@ -2846,13 +2854,13 @@ for x in 1, 2, 3: for y in 1, 2, 3: for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == + self.assertEqual(eval("c[x] %s c[y]" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("c[x] %s y" % op) == + self.assertEqual(eval("c[x] %s y" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("x %s c[y]" % op) == + self.assertEqual(eval("x %s c[y]" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) @@ -2925,12 +2933,15 @@ for x in 1, 2, 3: for y in 1, 2, 3: for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("c[x] %s y" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("x %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s y" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("x %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) def test_descrdoc(self): # Testing descriptor doc strings... 
@@ -2969,9 +2980,9 @@ for cls2 in C, D, E, F: x = cls() x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2) + self.assertIs(x.__class__, cls2) x.__class__ = cls - self.assertTrue(x.__class__ is cls) + self.assertIs(x.__class__, cls) def cant(x, C): try: x.__class__ = C @@ -3027,11 +3038,11 @@ x = cls() x.a = 1 x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2, + self.assertIs(x.__class__, cls2, "assigning %r as __class__ for %r silently failed" % (cls2, x)) self.assertEqual(x.a, 1) x.__class__ = cls - self.assertTrue(x.__class__ is cls, + self.assertIs(x.__class__, cls, "assigning %r as __class__ for %r silently failed" % (cls, x)) self.assertEqual(x.a, 1) for cls in G, J, K, L, M, N, P, R, list, Int: @@ -3200,7 +3211,7 @@ for cls in C, C1, C2: s = pickle.dumps(cls, bin) cls2 = pickle.loads(s) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3230,7 +3241,7 @@ import copy for cls in C, C1, C2: cls2 = copy.deepcopy(cls) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3290,7 +3301,7 @@ # Now it should work x = C() y = pickle.loads(pickle.dumps(x)) - self.assertEqual(hasattr(y, 'a'), 0) + self.assertNotHasAttr(y, 'a') x.a = 42 y = pickle.loads(pickle.dumps(x)) self.assertEqual(y.a, 42) @@ -3590,9 +3601,9 @@ from types import ModuleType as M m = M.__new__(M) str(m) - self.assertEqual(hasattr(m, "__name__"), 0) - self.assertEqual(hasattr(m, "__file__"), 0) - self.assertEqual(hasattr(m, "foo"), 0) + self.assertNotHasAttr(m, "__name__") + self.assertNotHasAttr(m, "__file__") + self.assertNotHasAttr(m, "foo") self.assertFalse(m.__dict__) # None or {} are both reasonable answers m.foo = 1 self.assertEqual(m.__dict__, {"foo": 1}) @@ -3772,8 +3783,8 @@ __slots__=() if support.check_impl_detail(): self.assertEqual(C.__basicsize__, B.__basicsize__) - self.assertTrue(hasattr(C, '__dict__')) - self.assertTrue(hasattr(C, '__weakref__')) + self.assertHasAttr(C, '__dict__') + self.assertHasAttr(C, '__weakref__') C().x = 2 def test_rmul(self): @@ -4251,7 +4262,7 @@ self.assertEqual(c.attr, 1) # this makes a crash more likely: support.gc_collect() - self.assertEqual(hasattr(c, 'attr'), False) + self.assertNotHasAttr(c, 'attr') def test_init(self): # SF 1155938 @@ -4274,17 +4285,17 @@ l = [] self.assertEqual(l.__add__, l.__add__) self.assertEqual(l.__add__, [].__add__) - self.assertTrue(l.__add__ != [5].__add__) - self.assertTrue(l.__add__ != l.__mul__) - self.assertTrue(l.__add__.__name__ == '__add__') + self.assertNotEqual(l.__add__, [5].__add__) + self.assertNotEqual(l.__add__, l.__mul__) + self.assertEqual(l.__add__.__name__, '__add__') if hasattr(l.__add__, '__self__'): # CPython - self.assertTrue(l.__add__.__self__ is l) - self.assertTrue(l.__add__.__objclass__ is list) + self.assertIs(l.__add__.__self__, l) + self.assertIs(l.__add__.__objclass__, list) else: # Python implementations where [].__add__ is a normal bound method - self.assertTrue(l.__add__.im_self is l) - self.assertTrue(l.__add__.im_class is list) + self.assertIs(l.__add__.im_self, l) + self.assertIs(l.__add__.im_class, list) self.assertEqual(l.__add__.__doc__, list.__add__.__doc__) try: hash(l.__add__) @@ -4451,7 +4462,7 @@ fake_str = FakeStr() # isinstance() reads __class__ - self.assertTrue(isinstance(fake_str, str)) + self.assertIsInstance(fake_str, str) # call a method descriptor with self.assertRaises(TypeError): -- Repository URL: 
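The conversion applied by the Issue #19603 changesets above and below is mechanical: assertTrue(x is y) becomes assertIs(x, y), assertTrue(s.find(t) >= 0) becomes assertIn(t, s) or assertGreaterEqual(s.find(t), 0), and hasattr() checks go through two new assertHasAttr/assertNotHasAttr helper methods. A minimal, self-contained sketch of the pattern follows; the TestCase name and sample values are illustrative only, while the two helper methods are quoted from the patch.

import unittest

class SpecificAssertsSketch(unittest.TestCase):

    # The two helpers the patch adds to ClassPropertiesAndMethods, so that
    # hasattr() checks fail with a readable message instead of
    # "False is not true".
    def assertHasAttr(self, obj, name):
        self.assertTrue(hasattr(obj, name),
                        '%r has no attribute %r' % (obj, name))

    def assertNotHasAttr(self, obj, name):
        self.assertFalse(hasattr(obj, name),
                         '%r has unexpected attribute %r' % (obj, name))

    def test_rewrite_pattern(self):
        d = dict()
        # was: assertTrue(d.__class__ is dict) -- identity assert names both operands
        self.assertIs(d.__class__, dict)
        # was: assertTrue(str(msg).find("weak reference") >= 0) -- substring assert
        msg = TypeError("cannot create weak reference to 'object' object")
        self.assertIn("weak reference", str(msg))
        # was: assertEqual(hasattr(d, 'foo'), 0) / assertFalse(hasattr(...)) -- helper
        self.assertNotHasAttr(d, 'foo')
        self.assertHasAttr(d, 'items')
        # was: assertTrue(str(d).find('{') >= 0) -- comparison assert
        self.assertGreaterEqual(str(d).find('{'), 0)

if __name__ == '__main__':
    unittest.main()

The point of the change is the failure output: assertIs(a, b) and assertIn(a, b) report both operands, and the helpers name the missing or unexpected attribute, whereas a failing assertTrue() can only report "False is not true".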
http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 22:39:42 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 22:39:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319603=3A_Use_specific_asserts_in_test=5Fdecr=2E?= Message-ID: <3dN6C24pcJz7LjS@mail.python.org> http://hg.python.org/cpython/rev/6049c954a703 changeset: 87220:6049c954a703 parent: 87217:9b79eed3c16b parent: 87219:2407ecebcf7d user: Serhiy Storchaka date: Sun Nov 17 23:39:13 2013 +0200 summary: Issue #19603: Use specific asserts in test_decr. files: Lib/test/test_descr.py | 325 ++++++++++++++-------------- 1 files changed, 168 insertions(+), 157 deletions(-) diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -388,13 +388,21 @@ class ClassPropertiesAndMethods(unittest.TestCase): + def assertHasAttr(self, obj, name): + self.assertTrue(hasattr(obj, name), + '%r has no attribute %r' % (obj, name)) + + def assertNotHasAttr(self, obj, name): + self.assertFalse(hasattr(obj, name), + '%r has unexpected attribute %r' % (obj, name)) + def test_python_dicts(self): # Testing Python subclass of dict... self.assertTrue(issubclass(dict, dict)) self.assertIsInstance({}, dict) d = dict() self.assertEqual(d, {}) - self.assertTrue(d.__class__ is dict) + self.assertIs(d.__class__, dict) self.assertIsInstance(d, dict) class C(dict): state = -1 @@ -572,7 +580,7 @@ def _set_x(self, x): self.__x = -x a = A() - self.assertTrue(not hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 12 self.assertEqual(a.x, 12) self.assertEqual(a._A__x, -12) @@ -998,14 +1006,14 @@ self.assertEqual(type(a), object) b = object() self.assertNotEqual(a, b) - self.assertFalse(hasattr(a, "foo")) + self.assertNotHasAttr(a, "foo") try: a.foo = 12 except (AttributeError, TypeError): pass else: self.fail("object() should not allow setting a foo attribute") - self.assertFalse(hasattr(object(), "__dict__")) + self.assertNotHasAttr(object(), "__dict__") class Cdict(object): pass @@ -1020,28 +1028,28 @@ class C0(object): __slots__ = [] x = C0() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "foo")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "foo") class C1(object): __slots__ = ['a'] x = C1() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "a") x.a = 1 self.assertEqual(x.a, 1) x.a = None self.assertEqual(x.a, None) del x.a - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "a") class C3(object): __slots__ = ['a', 'b', 'c'] x = C3() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, 'a')) - self.assertFalse(hasattr(x, 'b')) - self.assertFalse(hasattr(x, 'c')) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, 'a') + self.assertNotHasAttr(x, 'b') + self.assertNotHasAttr(x, 'c') x.a = 1 x.b = 2 x.c = 3 @@ -1057,8 +1065,8 @@ def get(self): return self.__a x = C4(5) - self.assertFalse(hasattr(x, '__dict__')) - self.assertFalse(hasattr(x, '__a')) + self.assertNotHasAttr(x, '__dict__') + self.assertNotHasAttr(x, '__a') self.assertEqual(x.get(), 5) try: x.__a = 6 @@ -1130,7 +1138,7 @@ x = C() x.foo = 5 self.assertEqual(x.foo, 5) - self.assertTrue(type(slots[0]) is str) + self.assertIs(type(slots[0]), str) # this used to leak references try: class C(object): @@ -1222,16 +1230,16 @@ class D(object): __slots__ 
= ["__dict__"] a = D() - self.assertTrue(hasattr(a, "__dict__")) - self.assertFalse(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertNotHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) class W(object): __slots__ = ["__weakref__"] a = W() - self.assertTrue(hasattr(a, "__weakref__")) - self.assertFalse(hasattr(a, "__dict__")) + self.assertHasAttr(a, "__weakref__") + self.assertNotHasAttr(a, "__dict__") try: a.foo = 42 except AttributeError: @@ -1242,16 +1250,16 @@ class C1(W, D): __slots__ = [] a = C1() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) class C2(D, W): __slots__ = [] a = C2() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) @@ -1289,7 +1297,7 @@ class C(object): pass a = C() - self.assertFalse(hasattr(a, "foobar")) + self.assertNotHasAttr(a, "foobar") C.foobar = 2 self.assertEqual(a.foobar, 2) C.method = lambda self: 42 @@ -1299,7 +1307,7 @@ C.__int__ = lambda self: 100 self.assertEqual(int(a), 100) self.assertEqual(a.foobar, 2) - self.assertFalse(hasattr(a, "spam")) + self.assertNotHasAttr(a, "spam") def mygetattr(self, name): if name == "spam": return "spam" @@ -1450,7 +1458,7 @@ self.assertEqual(cm.x, 42) self.assertEqual(cm.__dict__, {"x" : 42}) del cm.x - self.assertFalse(hasattr(cm, "x")) + self.assertNotHasAttr(cm, "x") @support.impl_detail("the module 'xxsubtype' is internal") def test_classmethods_in_c(self): @@ -1505,7 +1513,7 @@ self.assertEqual(sm.x, 42) self.assertEqual(sm.__dict__, {"x" : 42}) del sm.x - self.assertFalse(hasattr(sm, "x")) + self.assertNotHasAttr(sm, "x") @support.impl_detail("the module 'xxsubtype' is internal") def test_staticmethods_in_c(self): @@ -1575,7 +1583,7 @@ self.assertEqual(a.x, 10) self.assertEqual(a.x, 11) del a.x - self.assertEqual(hasattr(a, 'x'), 0) + self.assertNotHasAttr(a, 'x') def test_newslots(self): # Testing __new__ slot override... @@ -1846,17 +1854,17 @@ raise IndexError c1 = C() c2 = C() - self.assertTrue(not not c1) # What? + self.assertFalse(not c1) self.assertNotEqual(id(c1), id(c2)) hash(c1) hash(c2) self.assertEqual(c1, c1) self.assertTrue(c1 != c2) - self.assertTrue(not c1 != c1) - self.assertTrue(not c1 == c2) + self.assertFalse(c1 != c1) + self.assertFalse(c1 == c2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. - self.assertTrue(str(c1).find('C object at ') >= 0) + self.assertGreaterEqual(str(c1).find('C object at '), 0) self.assertEqual(str(c1), repr(c1)) self.assertNotIn(-1, c1) for i in range(10): @@ -1869,17 +1877,17 @@ raise IndexError d1 = D() d2 = D() - self.assertTrue(not not d1) + self.assertFalse(not d1) self.assertNotEqual(id(d1), id(d2)) hash(d1) hash(d2) self.assertEqual(d1, d1) self.assertNotEqual(d1, d2) - self.assertTrue(not d1 != d1) - self.assertTrue(not d1 == d2) + self.assertFalse(d1 != d1) + self.assertFalse(d1 == d2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. 
- self.assertTrue(str(d1).find('D object at ') >= 0) + self.assertGreaterEqual(str(d1).find('D object at '), 0) self.assertEqual(str(d1), repr(d1)) self.assertNotIn(-1, d1) for i in range(10): @@ -1915,11 +1923,11 @@ p1 = Proxy(1) p_1 = Proxy(-1) self.assertFalse(p0) - self.assertTrue(not not p1) + self.assertFalse(not p1) self.assertEqual(hash(p0), hash(0)) self.assertEqual(p0, p0) self.assertNotEqual(p0, p1) - self.assertTrue(not p0 != p0) + self.assertFalse(p0 != p0) self.assertEqual(not p0, p1) self.assertTrue(p0 < p1) self.assertTrue(p0 <= p1) @@ -1951,7 +1959,7 @@ try: weakref.ref(no) except TypeError as msg: - self.assertTrue(str(msg).find("weak reference") >= 0) + self.assertIn("weak reference", str(msg)) else: self.fail("weakref.ref(no) should be illegal") class Weak(object): @@ -1975,17 +1983,17 @@ del self.__x x = property(getx, setx, delx, doc="I'm the x property.") a = C() - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 42 self.assertEqual(a._C__x, 42) self.assertEqual(a.x, 42) del a.x - self.assertFalse(hasattr(a, "x")) - self.assertFalse(hasattr(a, "_C__x")) + self.assertNotHasAttr(a, "x") + self.assertNotHasAttr(a, "_C__x") C.x.__set__(a, 100) self.assertEqual(C.x.__get__(a), 100) C.x.__delete__(a) - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") raw = C.__dict__['x'] self.assertIsInstance(raw, property) @@ -1997,9 +2005,9 @@ self.assertIn("fdel", attrs) self.assertEqual(raw.__doc__, "I'm the x property.") - self.assertTrue(raw.fget is C.__dict__['getx']) - self.assertTrue(raw.fset is C.__dict__['setx']) - self.assertTrue(raw.fdel is C.__dict__['delx']) + self.assertIs(raw.fget, C.__dict__['getx']) + self.assertIs(raw.fset, C.__dict__['setx']) + self.assertIs(raw.fdel, C.__dict__['delx']) for attr in "__doc__", "fget", "fset", "fdel": try: @@ -2063,14 +2071,14 @@ del self._foo c = C() self.assertEqual(C.foo.__doc__, "hello") - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, "foo") c.foo = -42 - self.assertTrue(hasattr(c, '_foo')) + self.assertHasAttr(c, '_foo') self.assertEqual(c._foo, 42) self.assertEqual(c.foo, 42) del c.foo - self.assertFalse(hasattr(c, '_foo')) - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, '_foo') + self.assertNotHasAttr(c, "foo") class D(C): @C.foo.deleter @@ -2424,13 +2432,13 @@ a = hexint(12345) self.assertEqual(a, 12345) self.assertEqual(int(a), 12345) - self.assertTrue(int(a).__class__ is int) + self.assertIs(int(a).__class__, int) self.assertEqual(hash(a), hash(12345)) - self.assertTrue((+a).__class__ is int) - self.assertTrue((a >> 0).__class__ is int) - self.assertTrue((a << 0).__class__ is int) - self.assertTrue((hexint(0) << 12).__class__ is int) - self.assertTrue((hexint(0) >> 12).__class__ is int) + self.assertIs((+a).__class__, int) + self.assertIs((a >> 0).__class__, int) + self.assertIs((a << 0).__class__, int) + self.assertIs((hexint(0) << 12).__class__, int) + self.assertIs((hexint(0) >> 12).__class__, int) class octlong(int): __slots__ = [] @@ -2447,31 +2455,31 @@ self.assertEqual(a, 12345) self.assertEqual(int(a), 12345) self.assertEqual(hash(a), hash(12345)) - self.assertTrue(int(a).__class__ is int) - self.assertTrue((+a).__class__ is int) - self.assertTrue((-a).__class__ is int) - self.assertTrue((-octlong(0)).__class__ is int) - self.assertTrue((a >> 0).__class__ is int) - self.assertTrue((a << 0).__class__ is int) - self.assertTrue((a - 0).__class__ is int) - self.assertTrue((a * 1).__class__ is int) - self.assertTrue((a ** 1).__class__ is int) - 
self.assertTrue((a // 1).__class__ is int) - self.assertTrue((1 * a).__class__ is int) - self.assertTrue((a | 0).__class__ is int) - self.assertTrue((a ^ 0).__class__ is int) - self.assertTrue((a & -1).__class__ is int) - self.assertTrue((octlong(0) << 12).__class__ is int) - self.assertTrue((octlong(0) >> 12).__class__ is int) - self.assertTrue(abs(octlong(0)).__class__ is int) + self.assertIs(int(a).__class__, int) + self.assertIs((+a).__class__, int) + self.assertIs((-a).__class__, int) + self.assertIs((-octlong(0)).__class__, int) + self.assertIs((a >> 0).__class__, int) + self.assertIs((a << 0).__class__, int) + self.assertIs((a - 0).__class__, int) + self.assertIs((a * 1).__class__, int) + self.assertIs((a ** 1).__class__, int) + self.assertIs((a // 1).__class__, int) + self.assertIs((1 * a).__class__, int) + self.assertIs((a | 0).__class__, int) + self.assertIs((a ^ 0).__class__, int) + self.assertIs((a & -1).__class__, int) + self.assertIs((octlong(0) << 12).__class__, int) + self.assertIs((octlong(0) >> 12).__class__, int) + self.assertIs(abs(octlong(0)).__class__, int) # Because octlong overrides __add__, we can't check the absence of +0 # optimizations using octlong. class longclone(int): pass a = longclone(1) - self.assertTrue((a + 0).__class__ is int) - self.assertTrue((0 + a).__class__ is int) + self.assertIs((a + 0).__class__, int) + self.assertIs((0 + a).__class__, int) # Check that negative clones don't segfault a = longclone(-1) @@ -2488,9 +2496,9 @@ a = precfloat(12345) self.assertEqual(a, 12345.0) self.assertEqual(float(a), 12345.0) - self.assertTrue(float(a).__class__ is float) + self.assertIs(float(a).__class__, float) self.assertEqual(hash(a), hash(12345.0)) - self.assertTrue((+a).__class__ is float) + self.assertIs((+a).__class__, float) class madcomplex(complex): def __repr__(self): @@ -2538,20 +2546,20 @@ self.assertEqual(v, t) a = madtuple((1,2,3,4,5)) self.assertEqual(tuple(a), (1,2,3,4,5)) - self.assertTrue(tuple(a).__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) self.assertEqual(hash(a), hash((1,2,3,4,5))) - self.assertTrue(a[:].__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a + ()).__class__ is tuple) + self.assertIs(a[:].__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a + ()).__class__, tuple) a = madtuple(()) self.assertEqual(tuple(a), ()) - self.assertTrue(tuple(a).__class__ is tuple) - self.assertTrue((a + a).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 2).__class__ is tuple) - self.assertTrue(a[:].__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) + self.assertIs((a + a).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 2).__class__, tuple) + self.assertIs(a[:].__class__, tuple) class madstring(str): _rev = None @@ -2573,48 +2581,48 @@ self.assertEqual(u, s) s = madstring("12345") self.assertEqual(str(s), "12345") - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) base = "\x00" * 5 s = madstring(base) self.assertEqual(s, base) self.assertEqual(str(s), base) - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) self.assertEqual(hash(s), hash(base)) self.assertEqual({s: 1}[base], 1) self.assertEqual({base: 1}[s], 1) - self.assertTrue((s + "").__class__ is str) 
+ self.assertIs((s + "").__class__, str) self.assertEqual(s + "", base) - self.assertTrue(("" + s).__class__ is str) + self.assertIs(("" + s).__class__, str) self.assertEqual("" + s, base) - self.assertTrue((s * 0).__class__ is str) + self.assertIs((s * 0).__class__, str) self.assertEqual(s * 0, "") - self.assertTrue((s * 1).__class__ is str) + self.assertIs((s * 1).__class__, str) self.assertEqual(s * 1, base) - self.assertTrue((s * 2).__class__ is str) + self.assertIs((s * 2).__class__, str) self.assertEqual(s * 2, base + base) - self.assertTrue(s[:].__class__ is str) + self.assertIs(s[:].__class__, str) self.assertEqual(s[:], base) - self.assertTrue(s[0:0].__class__ is str) + self.assertIs(s[0:0].__class__, str) self.assertEqual(s[0:0], "") - self.assertTrue(s.strip().__class__ is str) + self.assertIs(s.strip().__class__, str) self.assertEqual(s.strip(), base) - self.assertTrue(s.lstrip().__class__ is str) + self.assertIs(s.lstrip().__class__, str) self.assertEqual(s.lstrip(), base) - self.assertTrue(s.rstrip().__class__ is str) + self.assertIs(s.rstrip().__class__, str) self.assertEqual(s.rstrip(), base) identitytab = {} - self.assertTrue(s.translate(identitytab).__class__ is str) + self.assertIs(s.translate(identitytab).__class__, str) self.assertEqual(s.translate(identitytab), base) - self.assertTrue(s.replace("x", "x").__class__ is str) + self.assertIs(s.replace("x", "x").__class__, str) self.assertEqual(s.replace("x", "x"), base) - self.assertTrue(s.ljust(len(s)).__class__ is str) + self.assertIs(s.ljust(len(s)).__class__, str) self.assertEqual(s.ljust(len(s)), base) - self.assertTrue(s.rjust(len(s)).__class__ is str) + self.assertIs(s.rjust(len(s)).__class__, str) self.assertEqual(s.rjust(len(s)), base) - self.assertTrue(s.center(len(s)).__class__ is str) + self.assertIs(s.center(len(s)).__class__, str) self.assertEqual(s.center(len(s)), base) - self.assertTrue(s.lower().__class__ is str) + self.assertIs(s.lower().__class__, str) self.assertEqual(s.lower(), base) class madunicode(str): @@ -2633,47 +2641,47 @@ base = "12345" u = madunicode(base) self.assertEqual(str(u), base) - self.assertTrue(str(u).__class__ is str) + self.assertIs(str(u).__class__, str) self.assertEqual(hash(u), hash(base)) self.assertEqual({u: 1}[base], 1) self.assertEqual({base: 1}[u], 1) - self.assertTrue(u.strip().__class__ is str) + self.assertIs(u.strip().__class__, str) self.assertEqual(u.strip(), base) - self.assertTrue(u.lstrip().__class__ is str) + self.assertIs(u.lstrip().__class__, str) self.assertEqual(u.lstrip(), base) - self.assertTrue(u.rstrip().__class__ is str) + self.assertIs(u.rstrip().__class__, str) self.assertEqual(u.rstrip(), base) - self.assertTrue(u.replace("x", "x").__class__ is str) + self.assertIs(u.replace("x", "x").__class__, str) self.assertEqual(u.replace("x", "x"), base) - self.assertTrue(u.replace("xy", "xy").__class__ is str) + self.assertIs(u.replace("xy", "xy").__class__, str) self.assertEqual(u.replace("xy", "xy"), base) - self.assertTrue(u.center(len(u)).__class__ is str) + self.assertIs(u.center(len(u)).__class__, str) self.assertEqual(u.center(len(u)), base) - self.assertTrue(u.ljust(len(u)).__class__ is str) + self.assertIs(u.ljust(len(u)).__class__, str) self.assertEqual(u.ljust(len(u)), base) - self.assertTrue(u.rjust(len(u)).__class__ is str) + self.assertIs(u.rjust(len(u)).__class__, str) self.assertEqual(u.rjust(len(u)), base) - self.assertTrue(u.lower().__class__ is str) + self.assertIs(u.lower().__class__, str) self.assertEqual(u.lower(), base) - 
self.assertTrue(u.upper().__class__ is str) + self.assertIs(u.upper().__class__, str) self.assertEqual(u.upper(), base) - self.assertTrue(u.capitalize().__class__ is str) + self.assertIs(u.capitalize().__class__, str) self.assertEqual(u.capitalize(), base) - self.assertTrue(u.title().__class__ is str) + self.assertIs(u.title().__class__, str) self.assertEqual(u.title(), base) - self.assertTrue((u + "").__class__ is str) + self.assertIs((u + "").__class__, str) self.assertEqual(u + "", base) - self.assertTrue(("" + u).__class__ is str) + self.assertIs(("" + u).__class__, str) self.assertEqual("" + u, base) - self.assertTrue((u * 0).__class__ is str) + self.assertIs((u * 0).__class__, str) self.assertEqual(u * 0, "") - self.assertTrue((u * 1).__class__ is str) + self.assertIs((u * 1).__class__, str) self.assertEqual(u * 1, base) - self.assertTrue((u * 2).__class__ is str) + self.assertIs((u * 2).__class__, str) self.assertEqual(u * 2, base + base) - self.assertTrue(u[:].__class__ is str) + self.assertIs(u[:].__class__, str) self.assertEqual(u[:], base) - self.assertTrue(u[0:0].__class__ is str) + self.assertIs(u[0:0].__class__, str) self.assertEqual(u[0:0], "") class sublist(list): @@ -2849,13 +2857,13 @@ for x in 1, 2, 3: for y in 1, 2, 3: for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == + self.assertEqual(eval("c[x] %s c[y]" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("c[x] %s y" % op) == + self.assertEqual(eval("c[x] %s y" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("x %s c[y]" % op) == + self.assertEqual(eval("x %s c[y]" % op), eval("x %s y" % op), "x=%d, y=%d" % (x, y)) @@ -2928,12 +2936,15 @@ for x in 1, 2, 3: for y in 1, 2, 3: for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("c[x] %s y" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("x %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s y" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("x %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) def test_descrdoc(self): # Testing descriptor doc strings... 
@@ -2972,9 +2983,9 @@ for cls2 in C, D, E, F: x = cls() x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2) + self.assertIs(x.__class__, cls2) x.__class__ = cls - self.assertTrue(x.__class__ is cls) + self.assertIs(x.__class__, cls) def cant(x, C): try: x.__class__ = C @@ -3030,11 +3041,11 @@ x = cls() x.a = 1 x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2, + self.assertIs(x.__class__, cls2, "assigning %r as __class__ for %r silently failed" % (cls2, x)) self.assertEqual(x.a, 1) x.__class__ = cls - self.assertTrue(x.__class__ is cls, + self.assertIs(x.__class__, cls, "assigning %r as __class__ for %r silently failed" % (cls, x)) self.assertEqual(x.a, 1) for cls in G, J, K, L, M, N, P, R, list, Int: @@ -3203,7 +3214,7 @@ for cls in C, C1, C2: s = pickle.dumps(cls, bin) cls2 = pickle.loads(s) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3233,7 +3244,7 @@ import copy for cls in C, C1, C2: cls2 = copy.deepcopy(cls) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3293,7 +3304,7 @@ # Now it should work x = C() y = pickle.loads(pickle.dumps(x)) - self.assertEqual(hasattr(y, 'a'), 0) + self.assertNotHasAttr(y, 'a') x.a = 42 y = pickle.loads(pickle.dumps(x)) self.assertEqual(y.a, 42) @@ -3593,9 +3604,9 @@ from types import ModuleType as M m = M.__new__(M) str(m) - self.assertEqual(hasattr(m, "__name__"), 0) - self.assertEqual(hasattr(m, "__file__"), 0) - self.assertEqual(hasattr(m, "foo"), 0) + self.assertNotHasAttr(m, "__name__") + self.assertNotHasAttr(m, "__file__") + self.assertNotHasAttr(m, "foo") self.assertFalse(m.__dict__) # None or {} are both reasonable answers m.foo = 1 self.assertEqual(m.__dict__, {"foo": 1}) @@ -3765,8 +3776,8 @@ __slots__=() if support.check_impl_detail(): self.assertEqual(C.__basicsize__, B.__basicsize__) - self.assertTrue(hasattr(C, '__dict__')) - self.assertTrue(hasattr(C, '__weakref__')) + self.assertHasAttr(C, '__dict__') + self.assertHasAttr(C, '__weakref__') C().x = 2 def test_rmul(self): @@ -4244,7 +4255,7 @@ self.assertEqual(c.attr, 1) # this makes a crash more likely: support.gc_collect() - self.assertEqual(hasattr(c, 'attr'), False) + self.assertNotHasAttr(c, 'attr') def test_init(self): # SF 1155938 @@ -4267,17 +4278,17 @@ l = [] self.assertEqual(l.__add__, l.__add__) self.assertEqual(l.__add__, [].__add__) - self.assertTrue(l.__add__ != [5].__add__) - self.assertTrue(l.__add__ != l.__mul__) - self.assertTrue(l.__add__.__name__ == '__add__') + self.assertNotEqual(l.__add__, [5].__add__) + self.assertNotEqual(l.__add__, l.__mul__) + self.assertEqual(l.__add__.__name__, '__add__') if hasattr(l.__add__, '__self__'): # CPython - self.assertTrue(l.__add__.__self__ is l) - self.assertTrue(l.__add__.__objclass__ is list) + self.assertIs(l.__add__.__self__, l) + self.assertIs(l.__add__.__objclass__, list) else: # Python implementations where [].__add__ is a normal bound method - self.assertTrue(l.__add__.im_self is l) - self.assertTrue(l.__add__.im_class is list) + self.assertIs(l.__add__.im_self, l) + self.assertIs(l.__add__.im_class, list) self.assertEqual(l.__add__.__doc__, list.__add__.__doc__) try: hash(l.__add__) @@ -4444,7 +4455,7 @@ fake_str = FakeStr() # isinstance() reads __class__ - self.assertTrue(isinstance(fake_str, str)) + self.assertIsInstance(fake_str, str) # call a method descriptor with self.assertRaises(TypeError): -- Repository URL: 
http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 22:39:44 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 17 Nov 2013 22:39:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjAz?= =?utf-8?q?=3A_Use_specific_asserts_in_test=5Fdecr=2E?= Message-ID: <3dN6C43CZyz7Lm2@mail.python.org> http://hg.python.org/cpython/rev/38fcb9a63398 changeset: 87221:38fcb9a63398 branch: 2.7 parent: 87218:c44643df7d2a user: Serhiy Storchaka date: Sun Nov 17 23:39:24 2013 +0200 summary: Issue #19603: Use specific asserts in test_decr. files: Lib/test/test_descr.py | 333 +++++++++++++++------------- 1 files changed, 174 insertions(+), 159 deletions(-) diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -398,13 +398,21 @@ class ClassPropertiesAndMethods(unittest.TestCase): + def assertHasAttr(self, obj, name): + self.assertTrue(hasattr(obj, name), + '%r has no attribute %r' % (obj, name)) + + def assertNotHasAttr(self, obj, name): + self.assertFalse(hasattr(obj, name), + '%r has unexpected attribute %r' % (obj, name)) + def test_python_dicts(self): # Testing Python subclass of dict... self.assertTrue(issubclass(dict, dict)) self.assertIsInstance({}, dict) d = dict() self.assertEqual(d, {}) - self.assertTrue(d.__class__ is dict) + self.assertIs(d.__class__, dict) self.assertIsInstance(d, dict) class C(dict): state = -1 @@ -585,7 +593,7 @@ def _set_x(self, x): self.__x = -x a = A() - self.assertTrue(not hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 12 self.assertEqual(a.x, 12) self.assertEqual(a._A__x, -12) @@ -934,14 +942,14 @@ self.assertEqual(type(a), object) b = object() self.assertNotEqual(a, b) - self.assertFalse(hasattr(a, "foo")) + self.assertNotHasAttr(a, "foo") try: a.foo = 12 except (AttributeError, TypeError): pass else: self.fail("object() should not allow setting a foo attribute") - self.assertFalse(hasattr(object(), "__dict__")) + self.assertNotHasAttr(object(), "__dict__") class Cdict(object): pass @@ -956,28 +964,28 @@ class C0(object): __slots__ = [] x = C0() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "foo")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "foo") class C1(object): __slots__ = ['a'] x = C1() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, "a") x.a = 1 self.assertEqual(x.a, 1) x.a = None self.assertEqual(x.a, None) del x.a - self.assertFalse(hasattr(x, "a")) + self.assertNotHasAttr(x, "a") class C3(object): __slots__ = ['a', 'b', 'c'] x = C3() - self.assertFalse(hasattr(x, "__dict__")) - self.assertFalse(hasattr(x, 'a')) - self.assertFalse(hasattr(x, 'b')) - self.assertFalse(hasattr(x, 'c')) + self.assertNotHasAttr(x, "__dict__") + self.assertNotHasAttr(x, 'a') + self.assertNotHasAttr(x, 'b') + self.assertNotHasAttr(x, 'c') x.a = 1 x.b = 2 x.c = 3 @@ -993,8 +1001,8 @@ def get(self): return self.__a x = C4(5) - self.assertFalse(hasattr(x, '__dict__')) - self.assertFalse(hasattr(x, '__a')) + self.assertNotHasAttr(x, '__dict__') + self.assertNotHasAttr(x, '__a') self.assertEqual(x.get(), 5) try: x.__a = 6 @@ -1164,16 +1172,16 @@ class D(object): __slots__ = ["__dict__"] a = D() - self.assertTrue(hasattr(a, "__dict__")) - self.assertFalse(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertNotHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 
42}) class W(object): __slots__ = ["__weakref__"] a = W() - self.assertTrue(hasattr(a, "__weakref__")) - self.assertFalse(hasattr(a, "__dict__")) + self.assertHasAttr(a, "__weakref__") + self.assertNotHasAttr(a, "__dict__") try: a.foo = 42 except AttributeError: @@ -1184,16 +1192,16 @@ class C1(W, D): __slots__ = [] a = C1() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) class C2(D, W): __slots__ = [] a = C2() - self.assertTrue(hasattr(a, "__dict__")) - self.assertTrue(hasattr(a, "__weakref__")) + self.assertHasAttr(a, "__dict__") + self.assertHasAttr(a, "__weakref__") a.foo = 42 self.assertEqual(a.__dict__, {"foo": 42}) @@ -1241,7 +1249,7 @@ class C(object): pass a = C() - self.assertFalse(hasattr(a, "foobar")) + self.assertNotHasAttr(a, "foobar") C.foobar = 2 self.assertEqual(a.foobar, 2) C.method = lambda self: 42 @@ -1251,7 +1259,7 @@ C.__int__ = lambda self: 100 self.assertEqual(int(a), 100) self.assertEqual(a.foobar, 2) - self.assertFalse(hasattr(a, "spam")) + self.assertNotHasAttr(a, "spam") def mygetattr(self, name): if name == "spam": return "spam" @@ -1521,7 +1529,7 @@ self.assertEqual(a.x, 10) self.assertEqual(a.x, 11) del a.x - self.assertEqual(hasattr(a, 'x'), 0) + self.assertNotHasAttr(a, 'x') def test_newslots(self): # Testing __new__ slot override... @@ -1797,18 +1805,18 @@ raise IndexError c1 = C() c2 = C() - self.assertTrue(not not c1) # What? + self.assertFalse(not c1) self.assertNotEqual(id(c1), id(c2)) hash(c1) hash(c2) self.assertEqual(cmp(c1, c2), cmp(id(c1), id(c2))) self.assertEqual(c1, c1) self.assertTrue(c1 != c2) - self.assertTrue(not c1 != c1) - self.assertTrue(not c1 == c2) + self.assertFalse(c1 != c1) + self.assertFalse(c1 == c2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. - self.assertTrue(str(c1).find('C object at ') >= 0) + self.assertGreaterEqual(str(c1).find('C object at '), 0) self.assertEqual(str(c1), repr(c1)) self.assertNotIn(-1, c1) for i in range(10): @@ -1821,18 +1829,18 @@ raise IndexError d1 = D() d2 = D() - self.assertTrue(not not d1) + self.assertFalse(not d1) self.assertNotEqual(id(d1), id(d2)) hash(d1) hash(d2) self.assertEqual(cmp(d1, d2), cmp(id(d1), id(d2))) self.assertEqual(d1, d1) self.assertNotEqual(d1, d2) - self.assertTrue(not d1 != d1) - self.assertTrue(not d1 == d2) + self.assertFalse(d1 != d1) + self.assertFalse(d1 == d2) # Note that the module name appears in str/repr, and that varies # depending on whether this test is run standalone or from a framework. 
- self.assertTrue(str(d1).find('D object at ') >= 0) + self.assertGreaterEqual(str(d1).find('D object at '), 0) self.assertEqual(str(d1), repr(d1)) self.assertNotIn(-1, d1) for i in range(10): @@ -1862,11 +1870,11 @@ p1 = Proxy(1) p_1 = Proxy(-1) self.assertFalse(p0) - self.assertTrue(not not p1) + self.assertFalse(not p1) self.assertEqual(hash(p0), hash(0)) self.assertEqual(p0, p0) self.assertNotEqual(p0, p1) - self.assertTrue(not p0 != p0) + self.assertFalse(p0 != p0) self.assertEqual(not p0, p1) self.assertEqual(cmp(p0, p1), -1) self.assertEqual(cmp(p0, p0), 0) @@ -1902,7 +1910,7 @@ p1 = DProxy(1) p_1 = DProxy(-1) self.assertFalse(p0) - self.assertTrue(not not p1) + self.assertFalse(not p1) self.assertEqual(hash(p0), hash(0)) self.assertEqual(p0, p0) self.assertNotEqual(p0, p1) @@ -1995,7 +2003,7 @@ try: weakref.ref(no) except TypeError, msg: - self.assertTrue(str(msg).find("weak reference") >= 0) + self.assertIn("weak reference", str(msg)) else: self.fail("weakref.ref(no) should be illegal") class Weak(object): @@ -2019,17 +2027,17 @@ del self.__x x = property(getx, setx, delx, doc="I'm the x property.") a = C() - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") a.x = 42 self.assertEqual(a._C__x, 42) self.assertEqual(a.x, 42) del a.x - self.assertFalse(hasattr(a, "x")) - self.assertFalse(hasattr(a, "_C__x")) + self.assertNotHasAttr(a, "x") + self.assertNotHasAttr(a, "_C__x") C.x.__set__(a, 100) self.assertEqual(C.x.__get__(a), 100) C.x.__delete__(a) - self.assertFalse(hasattr(a, "x")) + self.assertNotHasAttr(a, "x") raw = C.__dict__['x'] self.assertIsInstance(raw, property) @@ -2041,9 +2049,9 @@ self.assertIn("fdel", attrs) self.assertEqual(raw.__doc__, "I'm the x property.") - self.assertTrue(raw.fget is C.__dict__['getx']) - self.assertTrue(raw.fset is C.__dict__['setx']) - self.assertTrue(raw.fdel is C.__dict__['delx']) + self.assertIs(raw.fget, C.__dict__['getx']) + self.assertIs(raw.fset, C.__dict__['setx']) + self.assertIs(raw.fdel, C.__dict__['delx']) for attr in "__doc__", "fget", "fset", "fdel": try: @@ -2107,14 +2115,14 @@ del self._foo c = C() self.assertEqual(C.foo.__doc__, "hello") - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, "foo") c.foo = -42 - self.assertTrue(hasattr(c, '_foo')) + self.assertHasAttr(c, '_foo') self.assertEqual(c._foo, 42) self.assertEqual(c.foo, 42) del c.foo - self.assertFalse(hasattr(c, '_foo')) - self.assertFalse(hasattr(c, "foo")) + self.assertNotHasAttr(c, '_foo') + self.assertNotHasAttr(c, "foo") class D(C): @C.foo.deleter @@ -2500,13 +2508,13 @@ a = hexint(12345) self.assertEqual(a, 12345) self.assertEqual(int(a), 12345) - self.assertTrue(int(a).__class__ is int) + self.assertIs(int(a).__class__, int) self.assertEqual(hash(a), hash(12345)) - self.assertTrue((+a).__class__ is int) - self.assertTrue((a >> 0).__class__ is int) - self.assertTrue((a << 0).__class__ is int) - self.assertTrue((hexint(0) << 12).__class__ is int) - self.assertTrue((hexint(0) >> 12).__class__ is int) + self.assertIs((+a).__class__, int) + self.assertIs((a >> 0).__class__, int) + self.assertIs((a << 0).__class__, int) + self.assertIs((hexint(0) << 12).__class__, int) + self.assertIs((hexint(0) >> 12).__class__, int) class octlong(long): __slots__ = [] @@ -2526,31 +2534,31 @@ self.assertEqual(a, 12345L) self.assertEqual(long(a), 12345L) self.assertEqual(hash(a), hash(12345L)) - self.assertTrue(long(a).__class__ is long) - self.assertTrue((+a).__class__ is long) - self.assertTrue((-a).__class__ is long) - 
self.assertTrue((-octlong(0)).__class__ is long) - self.assertTrue((a >> 0).__class__ is long) - self.assertTrue((a << 0).__class__ is long) - self.assertTrue((a - 0).__class__ is long) - self.assertTrue((a * 1).__class__ is long) - self.assertTrue((a ** 1).__class__ is long) - self.assertTrue((a // 1).__class__ is long) - self.assertTrue((1 * a).__class__ is long) - self.assertTrue((a | 0).__class__ is long) - self.assertTrue((a ^ 0).__class__ is long) - self.assertTrue((a & -1L).__class__ is long) - self.assertTrue((octlong(0) << 12).__class__ is long) - self.assertTrue((octlong(0) >> 12).__class__ is long) - self.assertTrue(abs(octlong(0)).__class__ is long) + self.assertIs(long(a).__class__, long) + self.assertIs((+a).__class__, long) + self.assertIs((-a).__class__, long) + self.assertIs((-octlong(0)).__class__, long) + self.assertIs((a >> 0).__class__, long) + self.assertIs((a << 0).__class__, long) + self.assertIs((a - 0).__class__, long) + self.assertIs((a * 1).__class__, long) + self.assertIs((a ** 1).__class__, long) + self.assertIs((a // 1).__class__, long) + self.assertIs((1 * a).__class__, long) + self.assertIs((a | 0).__class__, long) + self.assertIs((a ^ 0).__class__, long) + self.assertIs((a & -1L).__class__, long) + self.assertIs((octlong(0) << 12).__class__, long) + self.assertIs((octlong(0) >> 12).__class__, long) + self.assertIs(abs(octlong(0)).__class__, long) # Because octlong overrides __add__, we can't check the absence of +0 # optimizations using octlong. class longclone(long): pass a = longclone(1) - self.assertTrue((a + 0).__class__ is long) - self.assertTrue((0 + a).__class__ is long) + self.assertIs((a + 0).__class__, long) + self.assertIs((0 + a).__class__, long) # Check that negative clones don't segfault a = longclone(-1) @@ -2567,9 +2575,9 @@ a = precfloat(12345) self.assertEqual(a, 12345.0) self.assertEqual(float(a), 12345.0) - self.assertTrue(float(a).__class__ is float) + self.assertIs(float(a).__class__, float) self.assertEqual(hash(a), hash(12345.0)) - self.assertTrue((+a).__class__ is float) + self.assertIs((+a).__class__, float) class madcomplex(complex): def __repr__(self): @@ -2617,20 +2625,20 @@ self.assertEqual(v, t) a = madtuple((1,2,3,4,5)) self.assertEqual(tuple(a), (1,2,3,4,5)) - self.assertTrue(tuple(a).__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) self.assertEqual(hash(a), hash((1,2,3,4,5))) - self.assertTrue(a[:].__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a + ()).__class__ is tuple) + self.assertIs(a[:].__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a + ()).__class__, tuple) a = madtuple(()) self.assertEqual(tuple(a), ()) - self.assertTrue(tuple(a).__class__ is tuple) - self.assertTrue((a + a).__class__ is tuple) - self.assertTrue((a * 0).__class__ is tuple) - self.assertTrue((a * 1).__class__ is tuple) - self.assertTrue((a * 2).__class__ is tuple) - self.assertTrue(a[:].__class__ is tuple) + self.assertIs(tuple(a).__class__, tuple) + self.assertIs((a + a).__class__, tuple) + self.assertIs((a * 0).__class__, tuple) + self.assertIs((a * 1).__class__, tuple) + self.assertIs((a * 2).__class__, tuple) + self.assertIs(a[:].__class__, tuple) class madstring(str): _rev = None @@ -2652,51 +2660,51 @@ self.assertEqual(u, s) s = madstring("12345") self.assertEqual(str(s), "12345") - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) base = "\x00" * 
5 s = madstring(base) self.assertEqual(s, base) self.assertEqual(str(s), base) - self.assertTrue(str(s).__class__ is str) + self.assertIs(str(s).__class__, str) self.assertEqual(hash(s), hash(base)) self.assertEqual({s: 1}[base], 1) self.assertEqual({base: 1}[s], 1) - self.assertTrue((s + "").__class__ is str) + self.assertIs((s + "").__class__, str) self.assertEqual(s + "", base) - self.assertTrue(("" + s).__class__ is str) + self.assertIs(("" + s).__class__, str) self.assertEqual("" + s, base) - self.assertTrue((s * 0).__class__ is str) + self.assertIs((s * 0).__class__, str) self.assertEqual(s * 0, "") - self.assertTrue((s * 1).__class__ is str) + self.assertIs((s * 1).__class__, str) self.assertEqual(s * 1, base) - self.assertTrue((s * 2).__class__ is str) + self.assertIs((s * 2).__class__, str) self.assertEqual(s * 2, base + base) - self.assertTrue(s[:].__class__ is str) + self.assertIs(s[:].__class__, str) self.assertEqual(s[:], base) - self.assertTrue(s[0:0].__class__ is str) + self.assertIs(s[0:0].__class__, str) self.assertEqual(s[0:0], "") - self.assertTrue(s.strip().__class__ is str) + self.assertIs(s.strip().__class__, str) self.assertEqual(s.strip(), base) - self.assertTrue(s.lstrip().__class__ is str) + self.assertIs(s.lstrip().__class__, str) self.assertEqual(s.lstrip(), base) - self.assertTrue(s.rstrip().__class__ is str) + self.assertIs(s.rstrip().__class__, str) self.assertEqual(s.rstrip(), base) identitytab = ''.join([chr(i) for i in range(256)]) - self.assertTrue(s.translate(identitytab).__class__ is str) + self.assertIs(s.translate(identitytab).__class__, str) self.assertEqual(s.translate(identitytab), base) - self.assertTrue(s.translate(identitytab, "x").__class__ is str) + self.assertIs(s.translate(identitytab, "x").__class__, str) self.assertEqual(s.translate(identitytab, "x"), base) self.assertEqual(s.translate(identitytab, "\x00"), "") - self.assertTrue(s.replace("x", "x").__class__ is str) + self.assertIs(s.replace("x", "x").__class__, str) self.assertEqual(s.replace("x", "x"), base) - self.assertTrue(s.ljust(len(s)).__class__ is str) + self.assertIs(s.ljust(len(s)).__class__, str) self.assertEqual(s.ljust(len(s)), base) - self.assertTrue(s.rjust(len(s)).__class__ is str) + self.assertIs(s.rjust(len(s)).__class__, str) self.assertEqual(s.rjust(len(s)), base) - self.assertTrue(s.center(len(s)).__class__ is str) + self.assertIs(s.center(len(s)).__class__, str) self.assertEqual(s.center(len(s)), base) - self.assertTrue(s.lower().__class__ is str) + self.assertIs(s.lower().__class__, str) self.assertEqual(s.lower(), base) class madunicode(unicode): @@ -2715,47 +2723,47 @@ base = u"12345" u = madunicode(base) self.assertEqual(unicode(u), base) - self.assertTrue(unicode(u).__class__ is unicode) + self.assertIs(unicode(u).__class__, unicode) self.assertEqual(hash(u), hash(base)) self.assertEqual({u: 1}[base], 1) self.assertEqual({base: 1}[u], 1) - self.assertTrue(u.strip().__class__ is unicode) + self.assertIs(u.strip().__class__, unicode) self.assertEqual(u.strip(), base) - self.assertTrue(u.lstrip().__class__ is unicode) + self.assertIs(u.lstrip().__class__, unicode) self.assertEqual(u.lstrip(), base) - self.assertTrue(u.rstrip().__class__ is unicode) + self.assertIs(u.rstrip().__class__, unicode) self.assertEqual(u.rstrip(), base) - self.assertTrue(u.replace(u"x", u"x").__class__ is unicode) + self.assertIs(u.replace(u"x", u"x").__class__, unicode) self.assertEqual(u.replace(u"x", u"x"), base) - self.assertTrue(u.replace(u"xy", u"xy").__class__ is unicode) + 
self.assertIs(u.replace(u"xy", u"xy").__class__, unicode) self.assertEqual(u.replace(u"xy", u"xy"), base) - self.assertTrue(u.center(len(u)).__class__ is unicode) + self.assertIs(u.center(len(u)).__class__, unicode) self.assertEqual(u.center(len(u)), base) - self.assertTrue(u.ljust(len(u)).__class__ is unicode) + self.assertIs(u.ljust(len(u)).__class__, unicode) self.assertEqual(u.ljust(len(u)), base) - self.assertTrue(u.rjust(len(u)).__class__ is unicode) + self.assertIs(u.rjust(len(u)).__class__, unicode) self.assertEqual(u.rjust(len(u)), base) - self.assertTrue(u.lower().__class__ is unicode) + self.assertIs(u.lower().__class__, unicode) self.assertEqual(u.lower(), base) - self.assertTrue(u.upper().__class__ is unicode) + self.assertIs(u.upper().__class__, unicode) self.assertEqual(u.upper(), base) - self.assertTrue(u.capitalize().__class__ is unicode) + self.assertIs(u.capitalize().__class__, unicode) self.assertEqual(u.capitalize(), base) - self.assertTrue(u.title().__class__ is unicode) + self.assertIs(u.title().__class__, unicode) self.assertEqual(u.title(), base) - self.assertTrue((u + u"").__class__ is unicode) + self.assertIs((u + u"").__class__, unicode) self.assertEqual(u + u"", base) - self.assertTrue((u"" + u).__class__ is unicode) + self.assertIs((u"" + u).__class__, unicode) self.assertEqual(u"" + u, base) - self.assertTrue((u * 0).__class__ is unicode) + self.assertIs((u * 0).__class__, unicode) self.assertEqual(u * 0, u"") - self.assertTrue((u * 1).__class__ is unicode) + self.assertIs((u * 1).__class__, unicode) self.assertEqual(u * 1, base) - self.assertTrue((u * 2).__class__ is unicode) + self.assertIs((u * 2).__class__, unicode) self.assertEqual(u * 2, base + base) - self.assertTrue(u[:].__class__ is unicode) + self.assertIs(u[:].__class__, unicode) self.assertEqual(u[:], base) - self.assertTrue(u[0:0].__class__ is unicode) + self.assertIs(u[0:0].__class__, unicode) self.assertEqual(u[0:0], u"") class sublist(list): @@ -2901,12 +2909,16 @@ c = {1: c1, 2: c2, 3: c3} for x in 1, 2, 3: for y in 1, 2, 3: - self.assertTrue(cmp(c[x], c[y]) == cmp(x, y), "x=%d, y=%d" % (x, y)) + self.assertEqual(cmp(c[x], c[y]), cmp(x, y), + "x=%d, y=%d" % (x, y)) for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(cmp(c[x], y) == cmp(x, y), "x=%d, y=%d" % (x, y)) - self.assertTrue(cmp(x, c[y]) == cmp(x, y), "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(cmp(c[x], y), cmp(x, y), + "x=%d, y=%d" % (x, y)) + self.assertEqual(cmp(x, c[y]), cmp(x, y), + "x=%d, y=%d" % (x, y)) def test_rich_comparisons(self): # Testing rich comparisons... @@ -2979,12 +2991,15 @@ for x in 1, 2, 3: for y in 1, 2, 3: for op in "<", "<=", "==", "!=", ">", ">=": - self.assertTrue(eval("c[x] %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("c[x] %s y" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) - self.assertTrue(eval("x %s c[y]" % op) == eval("x %s y" % op), - "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("c[x] %s y" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) + self.assertEqual(eval("x %s c[y]" % op), + eval("x %s y" % op), + "x=%d, y=%d" % (x, y)) def test_coercions(self): # Testing coercions... 
@@ -3049,9 +3064,9 @@ for cls2 in C, D, E, F: x = cls() x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2) + self.assertIs(x.__class__, cls2) x.__class__ = cls - self.assertTrue(x.__class__ is cls) + self.assertIs(x.__class__, cls) def cant(x, C): try: x.__class__ = C @@ -3113,11 +3128,11 @@ x = cls() x.a = 1 x.__class__ = cls2 - self.assertTrue(x.__class__ is cls2, + self.assertIs(x.__class__, cls2, "assigning %r as __class__ for %r silently failed" % (cls2, x)) self.assertEqual(x.a, 1) x.__class__ = cls - self.assertTrue(x.__class__ is cls, + self.assertIs(x.__class__, cls, "assigning %r as __class__ for %r silently failed" % (cls, x)) self.assertEqual(x.a, 1) for cls in G, J, K, L, M, N, P, R, list, Int: @@ -3287,7 +3302,7 @@ for cls in C, C1, C2: s = p.dumps(cls, bin) cls2 = p.loads(s) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3317,7 +3332,7 @@ import copy for cls in C, C1, C2: cls2 = copy.deepcopy(cls) - self.assertTrue(cls2 is cls) + self.assertIs(cls2, cls) a = C1(1, 2); a.append(42); a.append(24) b = C2("hello", "world", 42) @@ -3388,9 +3403,9 @@ # Now it should work x = C() y = pickle.loads(pickle.dumps(x)) - self.assertEqual(hasattr(y, 'a'), 0) + self.assertNotHasAttr(y, 'a') y = cPickle.loads(cPickle.dumps(x)) - self.assertEqual(hasattr(y, 'a'), 0) + self.assertNotHasAttr(y, 'a') x.a = 42 y = pickle.loads(pickle.dumps(x)) self.assertEqual(y.a, 42) @@ -3706,9 +3721,9 @@ from types import ModuleType as M m = M.__new__(M) str(m) - self.assertEqual(hasattr(m, "__name__"), 0) - self.assertEqual(hasattr(m, "__file__"), 0) - self.assertEqual(hasattr(m, "foo"), 0) + self.assertNotHasAttr(m, "__name__") + self.assertNotHasAttr(m, "__file__") + self.assertNotHasAttr(m, "foo") self.assertFalse(m.__dict__) # None or {} are both reasonable answers m.foo = 1 self.assertEqual(m.__dict__, {"foo": 1}) @@ -3888,8 +3903,8 @@ __slots__=() if test_support.check_impl_detail(): self.assertEqual(C.__basicsize__, B.__basicsize__) - self.assertTrue(hasattr(C, '__dict__')) - self.assertTrue(hasattr(C, '__weakref__')) + self.assertHasAttr(C, '__dict__') + self.assertHasAttr(C, '__weakref__') C().x = 2 def test_rmul(self): @@ -4390,7 +4405,7 @@ self.assertEqual(c.attr, 1) # this makes a crash more likely: test_support.gc_collect() - self.assertEqual(hasattr(c, 'attr'), False) + self.assertNotHasAttr(c, 'attr') def test_init(self): # SF 1155938 @@ -4411,17 +4426,17 @@ l = [] self.assertEqual(l.__add__, l.__add__) self.assertEqual(l.__add__, [].__add__) - self.assertTrue(l.__add__ != [5].__add__) - self.assertTrue(l.__add__ != l.__mul__) - self.assertTrue(l.__add__.__name__ == '__add__') + self.assertNotEqual(l.__add__, [5].__add__) + self.assertNotEqual(l.__add__, l.__mul__) + self.assertEqual(l.__add__.__name__, '__add__') if hasattr(l.__add__, '__self__'): # CPython - self.assertTrue(l.__add__.__self__ is l) - self.assertTrue(l.__add__.__objclass__ is list) + self.assertIs(l.__add__.__self__, l) + self.assertIs(l.__add__.__objclass__, list) else: # Python implementations where [].__add__ is a normal bound method - self.assertTrue(l.__add__.im_self is l) - self.assertTrue(l.__add__.im_class is list) + self.assertIs(l.__add__.im_self, l) + self.assertIs(l.__add__.im_class, list) self.assertEqual(l.__add__.__doc__, list.__add__.__doc__) try: hash(l.__add__) @@ -4604,7 +4619,7 @@ fake_str = FakeStr() # isinstance() reads __class__ on new style classes - self.assertTrue(isinstance(fake_str, str)) + 
self.assertIsInstance(fake_str, str) # call a method descriptor with self.assertRaises(TypeError): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 22:58:10 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 17 Nov 2013 22:58:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_compilatio?= =?utf-8?q?n_error_under_gcc_of_the_ctypes_module_bundled_libffi_for_arm?= =?utf-8?q?=2E?= Message-ID: <3dN6cL3z7Gz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/5e71f5cfd233 changeset: 87222:5e71f5cfd233 branch: 3.3 parent: 87219:2407ecebcf7d user: Gregory P. Smith date: Sun Nov 17 21:56:07 2013 +0000 summary: Fix compilation error under gcc of the ctypes module bundled libffi for arm. A variable was declared below the top of a block and one function was using a K&R C style function declaration! files: Misc/NEWS | 2 ++ Modules/_ctypes/libffi/src/arm/ffi.c | 14 +++++++------- 2 files changed, 9 insertions(+), 7 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,8 @@ Library ------- +- Fix compilation error under gcc of the ctypes module bundled libffi for arm. + - Issue #19523: Closed FileHandler leak which occurred when delay was set. - Issue #13674: Prevented time.strftime from crashing on Windows when given diff --git a/Modules/_ctypes/libffi/src/arm/ffi.c b/Modules/_ctypes/libffi/src/arm/ffi.c --- a/Modules/_ctypes/libffi/src/arm/ffi.c +++ b/Modules/_ctypes/libffi/src/arm/ffi.c @@ -221,11 +221,11 @@ int vfp_struct = (cif->flags == FFI_TYPE_STRUCT_VFP_FLOAT || cif->flags == FFI_TYPE_STRUCT_VFP_DOUBLE); + unsigned int temp; + ecif.cif = cif; ecif.avalue = avalue; - unsigned int temp; - /* If the return value is a struct and we don't have a return */ /* value address then we need to make one */ @@ -278,11 +278,11 @@ /* This function is jumped to by the trampoline */ unsigned int -ffi_closure_SYSV_inner (closure, respp, args, vfp_args) - ffi_closure *closure; - void **respp; - void *args; - void *vfp_args; +ffi_closure_SYSV_inner( + ffi_closure *closure, + void **respp, + void *args, + void *vfp_args) { // our various things... ffi_cif *cif; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 22:58:11 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 17 Nov 2013 22:58:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_compilation_error_under_gcc_of_the_ctypes_module_bun?= =?utf-8?q?dled_libffi_for_arm=2E?= Message-ID: <3dN6cM5y3Mz7LnD@mail.python.org> http://hg.python.org/cpython/rev/0fbdff3fde3a changeset: 87223:0fbdff3fde3a parent: 87220:6049c954a703 parent: 87222:5e71f5cfd233 user: Gregory P. Smith date: Sun Nov 17 21:57:43 2013 +0000 summary: Fix compilation error under gcc of the ctypes module bundled libffi for arm. A variable was declared below the top of a block and one function was using a K&R C style function declaration! files: Misc/NEWS | 2 ++ Modules/_ctypes/libffi/src/arm/ffi.c | 14 +++++++------- 2 files changed, 9 insertions(+), 7 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,8 @@ Library ------- +- Fix compilation error under gcc of the ctypes module bundled libffi for arm. + - Issue #19448: Add private API to SSL module to lookup ASN.1 objects by OID, NID, short name and long name. 
diff --git a/Modules/_ctypes/libffi/src/arm/ffi.c b/Modules/_ctypes/libffi/src/arm/ffi.c --- a/Modules/_ctypes/libffi/src/arm/ffi.c +++ b/Modules/_ctypes/libffi/src/arm/ffi.c @@ -221,11 +221,11 @@ int vfp_struct = (cif->flags == FFI_TYPE_STRUCT_VFP_FLOAT || cif->flags == FFI_TYPE_STRUCT_VFP_DOUBLE); + unsigned int temp; + ecif.cif = cif; ecif.avalue = avalue; - unsigned int temp; - /* If the return value is a struct and we don't have a return */ /* value address then we need to make one */ @@ -278,11 +278,11 @@ /* This function is jumped to by the trampoline */ unsigned int -ffi_closure_SYSV_inner (closure, respp, args, vfp_args) - ffi_closure *closure; - void **respp; - void *args; - void *vfp_args; +ffi_closure_SYSV_inner( + ffi_closure *closure, + void **respp, + void *args, + void *vfp_args) { // our various things... ffi_cif *cif; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 23:21:25 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 17 Nov 2013 23:21:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogRml4IHRlc3Quc3Vw?= =?utf-8?q?port=2Ebind=5Fport=28=29_to_not_cause_an_error_when_Python_was_?= =?utf-8?q?compiled?= Message-ID: <3dN77922Lrz7Ljs@mail.python.org> http://hg.python.org/cpython/rev/9791c5d55f52 changeset: 87224:9791c5d55f52 branch: 3.3 parent: 87222:5e71f5cfd233 user: Gregory P. Smith date: Sun Nov 17 22:19:32 2013 +0000 summary: Fix test.support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that reasonably new socket option. files: Lib/test/support/__init__.py | 12 +++++++++--- Misc/NEWS | 4 ++++ 2 files changed, 13 insertions(+), 3 deletions(-) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -570,9 +570,15 @@ raise TestFailed("tests should never set the SO_REUSEADDR " \ "socket option on TCP/IP sockets!") if hasattr(socket, 'SO_REUSEPORT'): - if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: - raise TestFailed("tests should never set the SO_REUSEPORT " \ - "socket option on TCP/IP sockets!") + try: + if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: + raise TestFailed("tests should never set the SO_REUSEPORT " \ + "socket option on TCP/IP sockets!") + except OSError: + # Python's socket module was compiled using modern headers + # thus defining SO_REUSEPORT but this process is running + # under an older kernel that does not support SO_REUSEPORT. + pass if hasattr(socket, 'SO_EXCLUSIVEADDRUSE'): sock.setsockopt(socket.SOL_SOCKET, socket.SO_EXCLUSIVEADDRUSE, 1) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,10 @@ Library ------- +- Fix test.support.bind_port() to not cause an error when Python was compiled + on a system with SO_REUSEPORT defined in the headers but run on a system + with an OS kernel that does not support that reasonably new socket option. + - Fix compilation error under gcc of the ctypes module bundled libffi for arm. - Issue #19523: Closed FileHandler leak which occurred when delay was set. 
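The guarded getsockopt() call in the patch above is a general pattern for socket options that may be defined at build time yet rejected by the running kernel. A standalone sketch of the same idea (this is not the actual test.support code, just an assumed helper name):

    import socket

    def reuseport_is_set(sock):
        # SO_REUSEPORT may be absent from the socket module entirely
        # (built against old headers) ...
        if not hasattr(socket, 'SO_REUSEPORT'):
            return False
        try:
            return sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1
        except OSError:
            # ... or defined at compile time but unsupported by an older
            # kernel at run time: treat the option as unset.
            return False

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        print(reuseport_is_set(sock))
    finally:
        sock.close()
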
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 23:21:26 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sun, 17 Nov 2013 23:21:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_test=2Esupport=2Ebind=5Fport=28=29_to_not_cause_an_e?= =?utf-8?q?rror_when_Python_was_compiled?= Message-ID: <3dN77B3z1Tz7Lnd@mail.python.org> http://hg.python.org/cpython/rev/00766fa3366b changeset: 87225:00766fa3366b parent: 87223:0fbdff3fde3a parent: 87224:9791c5d55f52 user: Gregory P. Smith date: Sun Nov 17 22:21:02 2013 +0000 summary: Fix test.support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that reasonably new socket option. files: Lib/test/support/__init__.py | 12 +++++++++--- Misc/NEWS | 4 ++++ 2 files changed, 13 insertions(+), 3 deletions(-) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -594,9 +594,15 @@ raise TestFailed("tests should never set the SO_REUSEADDR " \ "socket option on TCP/IP sockets!") if hasattr(socket, 'SO_REUSEPORT'): - if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: - raise TestFailed("tests should never set the SO_REUSEPORT " \ - "socket option on TCP/IP sockets!") + try: + if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: + raise TestFailed("tests should never set the SO_REUSEPORT " \ + "socket option on TCP/IP sockets!") + except OSError: + # Python's socket module was compiled using modern headers + # thus defining SO_REUSEPORT but this process is running + # under an older kernel that does not support SO_REUSEPORT. + pass if hasattr(socket, 'SO_EXCLUSIVEADDRUSE'): sock.setsockopt(socket.SOL_SOCKET, socket.SO_EXCLUSIVEADDRUSE, 1) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,10 @@ Library ------- +- Fix test.support.bind_port() to not cause an error when Python was compiled + on a system with SO_REUSEPORT defined in the headers but run on a system + with an OS kernel that does not support that reasonably new socket option. + - Fix compilation error under gcc of the ctypes module bundled libffi for arm. - Issue #19448: Add private API to SSL module to lookup ASN.1 objects by OID, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 23:40:16 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 17 Nov 2013 23:40:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319634=3A_time=2Es?= =?utf-8?q?trftime=28=22=25y=22=29_now_raises_a_ValueError_on_AIX_when_giv?= =?utf-8?q?en_a?= Message-ID: <3dN7Xw0gDbz7Lwd@mail.python.org> http://hg.python.org/cpython/rev/fd9ce1d4b820 changeset: 87226:fd9ce1d4b820 user: Victor Stinner date: Sun Nov 17 23:39:21 2013 +0100 summary: Issue #19634: time.strftime("%y") now raises a ValueError on AIX when given a year before 1900. 
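As the test change below shows, the result for a pre-1900 year is platform-dependent: Windows (and, after this change, AIX) reject "%y" with a ValueError, while most other platforms format the two-digit year. A small illustration of the behaviour; the output depends on the platform it runs on:

    import time

    t = (1899, 1, 1, 0, 0, 0, 0, 0, 0)
    try:
        # "99" on most POSIX systems
        print(time.strftime("%y", t))
    except ValueError as exc:
        # Raised on Windows, and on AIX after this change
        print("rejected:", exc)
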
files: Lib/test/test_strftime.py | 16 +++++++--------- Misc/NEWS | 3 +++ Modules/timemodule.c | 14 ++++++++++++++ 3 files changed, 24 insertions(+), 9 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -182,15 +182,13 @@ a date before 1900 is passed with a format string containing "%y" """ - @unittest.skipUnless(sys.platform == "win32", "Only applies to Windows") - def test_y_before_1900_win(self): - with self.assertRaises(ValueError): - time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)) - - @unittest.skipIf(sys.platform == "win32", "Doesn't apply on Windows") - def test_y_before_1900_nonwin(self): - self.assertEqual( - time.strftime("%y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)), "99") + def test_y_before_1900(self): + t = (1899, 1, 1, 0, 0, 0, 0, 0, 0) + if sys.platform == "win32" or sys.platform.startswith("aix"): + with self.assertRaises(ValueError): + time.strftime("%y", t) + else: + self.assertEqual(time.strftime("%y", t), "99") def test_y_1900(self): self.assertEqual( diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,9 @@ Library ------- +- Issue #19634: time.strftime("%y") now raises a ValueError on AIX when given a + year before 1900. + - Fix test.support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that reasonably new socket option. diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -650,6 +650,20 @@ return NULL; } } +#elif defined(_AIX) + for(outbuf = wcschr(fmt, '%'); + outbuf != NULL; + outbuf = wcschr(outbuf+2, '%')) + { + /* Issue #19634: On AIX, wcsftime("y", (1899, 1, 1, 0, 0, 0, 0, 0, 0)) + returns "0/" instead of "99" */ + if (outbuf[1] == L'y' && buf.tm_year < 0) { + PyErr_SetString(PyExc_ValueError, + "format %y requires year >= 1900 on AIX"); + Py_DECREF(format); + return NULL; + } + } #endif fmtlen = time_strlen(fmt); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 23:47:03 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 17 Nov 2013 23:47:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_test=5Fselectors=3A_test?= =?utf-8?q?=5Ftimeout_fails_sometimes_on_busy_=28slow=29_buildbots=2C_rela?= =?utf-8?q?x?= Message-ID: <3dN7hl6lnpz7LjR@mail.python.org> http://hg.python.org/cpython/rev/f2fdc675deab changeset: 87227:f2fdc675deab user: Victor Stinner date: Sun Nov 17 23:46:34 2013 +0100 summary: test_selectors: test_timeout fails sometimes on busy (slow) buildbots, relax the unit test on max time (but be more strict on mon time). 
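The relaxed bounds follow the usual approach for timing assertions on loaded machines: keep a reasonably strict lower bound and a generous upper bound around the nominal timeout. A simplified, POSIX-only sketch of such a check, using select() directly rather than a selector object:

    import select
    import socket
    import time

    # A connected pair with no pending data, so select() must wait out
    # the full timeout.
    r, w = socket.socketpair()
    try:
        t0 = time.monotonic()
        select.select([r], [], [], 1.0)
        dt = time.monotonic() - t0
        # Strict-ish minimum, generous maximum, as in the test change below.
        assert 0.8 <= dt <= 1.6, dt
    finally:
        r.close()
        w.close()
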
Example of failure: http://buildbot.python.org/all/builders/x86%20OpenIndiana%203.x/builds/6978/steps/test/logs/stdio ====================================================================== FAIL: test_timeout (test.test_selectors.PollSelectorTestCase) ---------------------------------------------------------------------- Traceback (most recent call last): File "/export/home/buildbot/32bits/3.x.cea-indiana-x86/build/Lib/test/test_selectors.py", line 316, in test_timeout self.assertTrue(0.5 < t1 - t0 < 1.5, t1 - t0) AssertionError: False is not true : 1.5033390671014786 files: Lib/test/test_selectors.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -313,7 +313,8 @@ t0 = time() self.assertFalse(s.select(1)) t1 = time() - self.assertTrue(0.5 < t1 - t0 < 1.5, t1 - t0) + dt = t1 - t0 + self.assertTrue(0.8 <= dt <= 1.6, dt) @unittest.skipUnless(hasattr(signal, "alarm"), "signal.alarm() required for this test") -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 17 23:54:19 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 17 Nov 2013 23:54:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317618=3A_Add_Base?= =?utf-8?q?85_and_Ascii85_encoding/decoding_to_the_base64_module=2E?= Message-ID: <3dN7s70YRYz7Lkq@mail.python.org> http://hg.python.org/cpython/rev/42366e293b7b changeset: 87228:42366e293b7b user: Antoine Pitrou date: Sun Nov 17 23:52:25 2013 +0100 summary: Issue #17618: Add Base85 and Ascii85 encoding/decoding to the base64 module. files: Doc/library/base64.rst | 70 +++++++ Lib/base64.py | 191 ++++++++++++++++++++- Lib/test/test_base64.py | 254 +++++++++++++++++++++++++++- Misc/NEWS | 2 + 4 files changed, 514 insertions(+), 3 deletions(-) diff --git a/Doc/library/base64.rst b/Doc/library/base64.rst --- a/Doc/library/base64.rst +++ b/Doc/library/base64.rst @@ -132,6 +132,76 @@ string. +.. function:: a85encode(s, *, foldspaces=False, wrapcol=0, pad=False, adobe=False) + + Encode a byte string using Ascii85. + + *s* is the string to encode. The encoded byte string is returned. + + *foldspaces* is an optional flag that uses the special short sequence 'y' + instead of 4 consecutive spaces (ASCII 0x20) as supported by 'btoa'. This + feature is not supported by the "standard" Ascii85 encoding. + + *wrapcol* controls whether the output should have newline ('\n') + characters added to it. If this is non-zero, each output line will be + at most this many characters long. + + *pad* controls whether the input string is padded to a multiple of 4 + before encoding. Note that the ``btoa`` implementation always pads. + + *adobe* controls whether the encoded byte sequence is framed with ``<~`` + and ``~>``, which is used by the Adobe implementation. + + .. versionadded:: 3.4 + + +.. function:: a85decode(s, *, foldspaces=False, adobe=False, ignorechars=b' \t\n\r\v') + + Decode an Ascii85 encoded byte string. + + *s* is the byte string to decode. + + *foldspaces* is a flag that specifies whether the 'y' short sequence + should be accepted as shorthand for 4 consecutive spaces (ASCII 0x20). + This feature is not supported by the "standard" Ascii85 encoding. + + *adobe* controls whether the input sequence is in Adobe Ascii85 format + (i.e. is framed with <~ and ~>). + + *ignorechars* should be a byte string containing characters to ignore + from the input. 
This should only contain whitespace characters, and by + default contains all whitespace characters in ASCII. + + .. versionadded:: 3.4 + + +.. function:: b85encode(s, pad=False) + + Encode a byte string using base85, as used in e.g. git-style binary + diffs. + + If *pad* is true, the input is padded with "\\0" so its length is a + multiple of 4 characters before encoding. + + .. versionadded:: 3.4 + + +.. function:: b85decode(b) + + Decode base85-encoded byte string. Padding is implicitly removed, if + necessary. + + .. versionadded:: 3.4 + + +.. note:: + Both Base85 and Ascii85 have an expansion factor of 5 to 4 (5 Base85 or + Ascii85 characters can encode 4 binary bytes), while the better-known + Base64 has an expansion factor of 6 to 4. They are therefore more + efficient when space expensive. They differ by details such as the + character map used for encoding. + + The legacy interface: .. function:: decode(input, output) diff --git a/Lib/base64.py b/Lib/base64.py --- a/Lib/base64.py +++ b/Lib/base64.py @@ -1,6 +1,6 @@ #! /usr/bin/env python3 -"""RFC 3548: Base16, Base32, Base64 Data Encodings""" +"""Base16, Base32, Base64 (RFC 3548), Base85 and Ascii85 data encodings""" # Modified 04-Oct-1995 by Jack Jansen to use binascii module # Modified 30-Dec-2003 by Barry Warsaw to add full RFC 3548 support @@ -9,6 +9,7 @@ import re import struct import binascii +import itertools __all__ = [ @@ -17,6 +18,8 @@ # Generalized interface for other encodings 'b64encode', 'b64decode', 'b32encode', 'b32decode', 'b16encode', 'b16decode', + # Base85 and Ascii85 encodings + 'b85encode', 'b85decode', 'a85encode', 'a85decode', # Standard Base64 encoding 'standard_b64encode', 'standard_b64decode', # Some common Base64 alternatives. As referenced by RFC 3458, see thread @@ -268,7 +271,193 @@ raise binascii.Error('Non-base16 digit found') return binascii.unhexlify(s) +# +# Ascii85 encoding/decoding +# +def _85encode(b, chars, chars2, pad=False, foldnuls=False, foldspaces=False): + # Helper function for a85encode and b85encode + if not isinstance(b, bytes_types): + b = memoryview(b).tobytes() + + padding = (-len(b)) % 4 + if padding: + b = b + b'\0' * padding + words = struct.Struct('!%dI' % (len(b) // 4)).unpack(b) + + a85chars2 = _a85chars2 + a85chars = _a85chars + chunks = [b'z' if foldnuls and not word else + b'y' if foldspaces and word == 0x20202020 else + (chars2[word // 614125] + + chars2[word // 85 % 7225] + + chars[word % 85]) + for word in words] + + if padding and not pad: + if chunks[-1] == b'z': + chunks[-1] = chars[0] * 5 + chunks[-1] = chunks[-1][:-padding] + + return b''.join(chunks) + +_A85START = b"<~" +_A85END = b"~>" +_a85chars = [bytes([i]) for i in range(33, 118)] +_a85chars2 = [(a + b) for a in _a85chars for b in _a85chars] + +def a85encode(b, *, foldspaces=False, wrapcol=0, pad=False, adobe=False): + """Encode a byte string using Ascii85. + + b is the byte string to encode. The encoded byte string is returned. + + foldspaces is an optional flag that uses the special short sequence 'y' + instead of 4 consecutive spaces (ASCII 0x20) as supported by 'btoa'. This + feature is not supported by the "standard" Adobe encoding. + + wrapcol controls whether the output should have newline ('\n') characters + added to it. If this is non-zero, each output line will be at most this + many characters long. + + pad controls whether the input string is padded to a multiple of 4 before + encoding. Note that the btoa implementation always pads. 
+ + adobe controls whether the encoded byte sequence is framed with <~ and ~>, + which is used by the Adobe implementation. + """ + result = _85encode(b, _a85chars, _a85chars2, pad, True, foldspaces) + + if adobe: + result = _A85START + result + if wrapcol: + wrapcol = max(2 if adobe else 1, wrapcol) + chunks = [result[i: i + wrapcol] + for i in range(0, len(result), wrapcol)] + if adobe: + if len(chunks[-1]) + 2 > wrapcol: + chunks.append(b'') + result = b'\n'.join(chunks) + if adobe: + result += _A85END + + return result + +def a85decode(b, *, foldspaces=False, adobe=False, ignorechars=b' \t\n\r\v'): + """Decode an Ascii85 encoded byte string. + + s is the byte string to decode. + + foldspaces is a flag that specifies whether the 'y' short sequence should be + accepted as shorthand for 4 consecutive spaces (ASCII 0x20). This feature is + not supported by the "standard" Adobe encoding. + + adobe controls whether the input sequence is in Adobe Ascii85 format (i.e. + is framed with <~ and ~>). + + ignorechars should be a byte string containing characters to ignore from the + input. This should only contain whitespace characters, and by default + contains all whitespace characters in ASCII. + """ + b = _bytes_from_decode_data(b) + if adobe: + if not (b.startswith(_A85START) and b.endswith(_A85END)): + raise ValueError("Ascii85 encoded byte sequences must be bracketed " + "by {} and {}".format(_A85START, _A85END)) + b = b[2:-2] # Strip off start/end markers + # + # We have to go through this stepwise, so as to ignore spaces and handle + # special short sequences + # + packI = struct.Struct('!I').pack + decoded = [] + decoded_append = decoded.append + curr = [] + curr_append = curr.append + curr_clear = curr.clear + for x in b + b'u' * 4: + if b'!'[0] <= x <= b'u'[0]: + curr_append(x) + if len(curr) == 5: + acc = 0 + for x in curr: + acc = 85 * acc + (x - 33) + try: + decoded_append(packI(acc)) + except struct.error: + raise ValueError('Ascii85 overflow') from None + curr_clear() + elif x == b'z'[0]: + if curr: + raise ValueError('z inside Ascii85 5-tuple') + decoded_append(b'\0\0\0\0') + elif foldspaces and x == b'y'[0]: + if curr: + raise ValueError('y inside Ascii85 5-tuple') + decoded_append(b'\x20\x20\x20\x20') + elif x in ignorechars: + # Skip whitespace + continue + else: + raise ValueError('Non-Ascii85 digit found: %c' % x) + + result = b''.join(decoded) + padding = 4 - len(curr) + if padding: + # Throw away the extra padding + result = result[:-padding] + return result + +# The following code is originally taken (with permission) from Mercurial + +_b85chars = b"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ" \ + b"abcdefghijklmnopqrstuvwxyz!#$%&()*+-;<=>?@^_`{|}~" +_b85chars = [bytes([i]) for i in _b85chars] +_b85chars2 = [(a + b) for a in _b85chars for b in _b85chars] +_b85dec = None + +def b85encode(b, pad=False): + """Encode an ASCII-encoded byte array in base85 format. + + If pad is true, the input is padded with "\0" so its length is a multiple of + 4 characters before encoding. 
+ """ + return _85encode(b, _b85chars, _b85chars2, pad) + +def b85decode(b): + """Decode base85-encoded byte array""" + b = _bytes_from_decode_data(b) + global _b85dec + if _b85dec is None: + _b85dec = [None] * 256 + for i, c in enumerate(_b85chars): + _b85dec[c[0]] = i + + padding = (-len(b)) % 5 + b = b + b'~' * padding + out = [] + packI = struct.Struct('!I').pack + for i in range(0, len(b), 5): + chunk = b[i:i + 5] + acc = 0 + try: + for c in chunk: + acc = acc * 85 + _b85dec[c] + except TypeError: + for j, c in enumerate(chunk): + if _b85dec[c] is None: + raise ValueError('bad base85 character at position %d' + % (i + j)) from None + raise + try: + out.append(packI(acc)) + except struct.error: + raise ValueError('base85 overflow in hunk starting at byte %d' + % i) from None + + result = b''.join(out) + if padding: + result = result[:-padding] + return result # Legacy interface. This code could be cleaned up since I don't believe # binascii has any line length limitations. It just doesn't seem worth it diff --git a/Lib/test/test_base64.py b/Lib/test/test_base64.py --- a/Lib/test/test_base64.py +++ b/Lib/test/test_base64.py @@ -100,9 +100,13 @@ def check_other_types(self, f, bytes_data, expected): eq = self.assertEqual - eq(f(bytearray(bytes_data)), expected) + b = bytearray(bytes_data) + eq(f(b), expected) + # The bytearray wasn't mutated + eq(b, bytes_data) eq(f(memoryview(bytes_data)), expected) eq(f(array('B', bytes_data)), expected) + # XXX why is b64encode hardcoded here? self.check_nonbyte_element_format(base64.b64encode, bytes_data) self.check_multidimensional(base64.b64encode, bytes_data) @@ -359,12 +363,258 @@ eq(base64.b16decode(array('B', b"0102abcdef"), True), b'\x01\x02\xab\xcd\xef') + def test_a85encode(self): + eq = self.assertEqual + + tests = { + b'': b'', + b"www.python.org": b'GB\\6`E-ZP=Df.1GEb>', + bytes(range(255)): b"""!!*-'"9eu7#RLhG$k3[W&.oNg'GVB"(`=52*$$""" + b"""(B+<_pR,UFcb-n-Vr/1iJ-0JP==1c70M3&s#]4?Ykm5X at _(6q'R884cE""" + b"""H9MJ8X:f1+h<)lt#=BSg3>[:ZC?t!MSA7]@cBPD3sCi+'.E,fo>FEMbN""" + b"""G^4U^I!pHnJ:W<)KS>/9Ll%"IN/`jYOHG]iPa.Q$R$jD4S=Q7DTV8*TU""" + b"""nsrdW2ZetXKAY/Yd(L?['d?O\\@K2_]Y2%o^qmn*`5Ta:aN;TJbg"GZd""" + b"""*^:jeCE.%f\\,!5gtgiEi8N\\UjQ5OekiqBum-X60nF?)@o_%qPq"ad`""" + b"""r;HT""", + b"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ" + b"0123456789!@#0^&*();:<>,. []{}": + b'@:E_WAS,RgBkhF"D/O92EH6,BF`qtRH$VbC6UX at 47n?3D92&&T' + b":Jand;cHat='/U/0JP==1c70M3&r-I,;,E,oN2F(oQ1z', + b"zero compression\0\0\0": b'H=_,8+Cf>,E,oN2F(oQ1!!!!', + b"Boundary:\0\0\0\0": b'6>q!aA79M(3WK-[!!', + b"Space compr: ": b';fH/TAKYK$D/aMV+', data) + + self.check_other_types(base64.a85encode, b"www.python.org", + b'GB\\6`E-ZP=Df.1GEb>') + + self.assertRaises(TypeError, base64.a85encode, "") + + eq(base64.a85encode(b"www.python.org", wrapcol=7, adobe=False), + b'GB\\6`E-\nZP=Df.1\nGEb>') + eq(base64.a85encode(b"\0\0\0\0www.python.org", wrapcol=7, adobe=False), + b'zGB\\6`E\n-ZP=Df.\n1GEb>') + eq(base64.a85encode(b"www.python.org", wrapcol=7, adobe=True), + b'<~GB\\6`\nE-ZP=Df\n.1GEb>\n~>') + + eq(base64.a85encode(b' '*8, foldspaces=True, adobe=False), b'yy') + eq(base64.a85encode(b' '*7, foldspaces=True, adobe=False), b'y+}2SXo+ITwPvYU}0ioWMyV&XlZI|Y;A6DaB*^Tbai%j""" + b"""czJqze0_d at fPsR8goTEOh>41ejE#,. 
[]{}""": + b"""VPa!sWoBn+X=-b1ZEkOHadLBXb#`}nd3r%YLqtVJM at UIZOH55pPf$@(""" + b"""Q&d$}S6EqEFflSSG&MFiI5{CeBQRbjDkv#CIy^osE+AW7dwl""", + b'no padding..': b'Zf_uPVPs@!Zf7no', + b'zero compression\x00\x00\x00\x00': b'dS!BNAY*TBaB^jHb7^mG00000', + b'zero compression\x00\x00\x00': b'dS!BNAY*TBaB^jHb7^mG0000', + b"""Boundary:\x00\x00\x00\x00""": b"""LT`0$WMOi7IsgCw00""", + b'Space compr: ': b'Q*dEpWgug3ZE$irARr(h', + b'\xff': b'{{', + b'\xff'*2: b'|Nj', + b'\xff'*3: b'|Ns9', + b'\xff'*4: b'|NsC0', + } + + for data, res in tests.items(): + eq(base64.b85encode(data), res) + + self.check_other_types(base64.b85encode, b"www.python.org", + b'cXxL#aCvlSZ*DGca%T') + + def test_a85decode(self): + eq = self.assertEqual + + tests = { + b'': b'', + b'GB\\6`E-ZP=Df.1GEb>': b'www.python.org', + b"""! ! * -'"\n\t\t9eu\r\n7# RL\vhG$k3[W&.oNg'GVB"(`=52*$$""" + b"""(B+<_pR,UFcb-n-Vr/1iJ-0JP==1c70M3&s#]4?Ykm5X at _(6q'R884cE""" + b"""H9MJ8X:f1+h<)lt#=BSg3>[:ZC?t!MSA7]@cBPD3sCi+'.E,fo>FEMbN""" + b"""G^4U^I!pHnJ:W<)KS>/9Ll%"IN/`jYOHG]iPa.Q$R$jD4S=Q7DTV8*TU""" + b"""nsrdW2ZetXKAY/Yd(L?['d?O\\@K2_]Y2%o^qmn*`5Ta:aN;TJbg"GZd""" + b"""*^:jeCE.%f\\,!5gtgiEi8N\\UjQ5OekiqBum-X60nF?)@o_%qPq"ad`""" + b"""r;HT""": bytes(range(255)), + b"""@:E_WAS,RgBkhF"D/O92EH6,BF`qtRH$VbC6UX at 47n?3D92&&T:Jand;c""" + b"""Hat='/U/0JP==1c70M3&r-I,;,. []{}', + b'DJpY:@:Wn_DJ(RS': b'no padding..', + b'H=_,8+Cf>,E,oN2F(oQ1z': b'zero compression\x00\x00\x00\x00', + b'H=_,8+Cf>,E,oN2F(oQ1!!!!': b'zero compression\x00\x00\x00', + b'6>q!aA79M(3WK-[!!': b"Boundary:\x00\x00\x00\x00", + b';fH/TAKYK$D/aMV+', adobe=True), res, data) + eq(base64.a85decode('<~%s~>' % data.decode("ascii"), adobe=True), + res, data) + + eq(base64.a85decode(b'yy', foldspaces=True, adobe=False), b' '*8) + eq(base64.a85decode(b'y+', + b"www.python.org") + + def test_b85decode(self): + eq = self.assertEqual + + tests = { + b'': b'', + b'cXxL#aCvlSZ*DGca%T': b'www.python.org', + b"""009C61O)~M2nh-c3=Iws5D^j+6crX17#SKH9337X""" + b"""AR!_nBqb&%C at Cr{EG;fCFflSSG&MFiI5|2yJUu=?KtV!7L`6nNNJ&ad""" + b"""OifNtP*GA-R8>}2SXo+ITwPvYU}0ioWMyV&XlZI|Y;A6DaB*^Tbai%j""" + b"""czJqze0_d at fPsR8goTEOh>41ejE#,. 
[]{}""", + b'Zf_uPVPs@!Zf7no': b'no padding..', + b'dS!BNAY*TBaB^jHb7^mG00000': b'zero compression\x00\x00\x00\x00', + b'dS!BNAY*TBaB^jHb7^mG0000': b'zero compression\x00\x00\x00', + b"""LT`0$WMOi7IsgCw00""": b"""Boundary:\x00\x00\x00\x00""", + b'Q*dEpWgug3ZE$irARr(h': b'Space compr: ', + b'{{': b'\xff', + b'|Nj': b'\xff'*2, + b'|Ns9': b'\xff'*3, + b'|NsC0': b'\xff'*4, + } + + for data, res in tests.items(): + eq(base64.b85decode(data), res) + eq(base64.b85decode(data.decode("ascii")), res) + + self.check_other_types(base64.b85decode, b'cXxL#aCvlSZ*DGca%T', + b"www.python.org") + + def test_a85_padding(self): + eq = self.assertEqual + + eq(base64.a85encode(b"x", pad=True), b'GQ7^D') + eq(base64.a85encode(b"xx", pad=True), b"G^'2g") + eq(base64.a85encode(b"xxx", pad=True), b'G^+H5') + eq(base64.a85encode(b"xxxx", pad=True), b'G^+IX') + eq(base64.a85encode(b"xxxxx", pad=True), b'G^+IXGQ7^D') + + eq(base64.a85decode(b'GQ7^D'), b"x\x00\x00\x00") + eq(base64.a85decode(b"G^'2g"), b"xx\x00\x00") + eq(base64.a85decode(b'G^+H5'), b"xxx\x00") + eq(base64.a85decode(b'G^+IX'), b"xxxx") + eq(base64.a85decode(b'G^+IXGQ7^D'), b"xxxxx\x00\x00\x00") + + def test_b85_padding(self): + eq = self.assertEqual + + eq(base64.b85encode(b"x", pad=True), b'cmMzZ') + eq(base64.b85encode(b"xx", pad=True), b'cz6H+') + eq(base64.b85encode(b"xxx", pad=True), b'czAdK') + eq(base64.b85encode(b"xxxx", pad=True), b'czAet') + eq(base64.b85encode(b"xxxxx", pad=True), b'czAetcmMzZ') + + eq(base64.b85decode(b'cmMzZ'), b"x\x00\x00\x00") + eq(base64.b85decode(b'cz6H+'), b"xx\x00\x00") + eq(base64.b85decode(b'czAdK'), b"xxx\x00") + eq(base64.b85decode(b'czAet'), b"xxxx") + eq(base64.b85decode(b'czAetcmMzZ'), b"xxxxx\x00\x00\x00") + + def test_a85decode_errors(self): + illegal = (set(range(32)) | set(range(118, 256))) - set(b' \t\n\r\v') + for c in illegal: + with self.assertRaises(ValueError, msg=bytes([c])): + base64.a85decode(b'!!!!' + bytes([c])) + with self.assertRaises(ValueError, msg=bytes([c])): + base64.a85decode(b'!!!!' + bytes([c]), adobe=False) + with self.assertRaises(ValueError, msg=bytes([c])): + base64.a85decode(b'<~!!!!' 
+ bytes([c]) + b'~>', adobe=True) + + self.assertRaises(ValueError, base64.a85decode, + b"malformed", adobe=True) + self.assertRaises(ValueError, base64.a85decode, + b"<~still malformed", adobe=True) + self.assertRaises(ValueError, base64.a85decode, + b"also malformed~>", adobe=True) + + # With adobe=False (the default), Adobe framing markers are disallowed + self.assertRaises(ValueError, base64.a85decode, + b"<~~>") + self.assertRaises(ValueError, base64.a85decode, + b"<~~>", adobe=False) + base64.a85decode(b"<~~>", adobe=True) # sanity check + + self.assertRaises(ValueError, base64.a85decode, + b"abcx", adobe=False) + self.assertRaises(ValueError, base64.a85decode, + b"abcdey", adobe=False) + self.assertRaises(ValueError, base64.a85decode, + b"a b\nc", adobe=False, ignorechars=b"") + + self.assertRaises(ValueError, base64.a85decode, b's', adobe=False) + self.assertRaises(ValueError, base64.a85decode, b's8', adobe=False) + self.assertRaises(ValueError, base64.a85decode, b's8W', adobe=False) + self.assertRaises(ValueError, base64.a85decode, b's8W-', adobe=False) + self.assertRaises(ValueError, base64.a85decode, b's8W-"', adobe=False) + + def test_b85decode_errors(self): + illegal = list(range(33)) + \ + list(b'"\',./:[\\]') + \ + list(range(128, 256)) + for c in illegal: + with self.assertRaises(ValueError, msg=bytes([c])): + base64.b85decode(b'0000' + bytes([c])) + + self.assertRaises(ValueError, base64.b85decode, b'|') + self.assertRaises(ValueError, base64.b85decode, b'|N') + self.assertRaises(ValueError, base64.b85decode, b'|Ns') + self.assertRaises(ValueError, base64.b85decode, b'|NsC') + self.assertRaises(ValueError, base64.b85decode, b'|NsC1') + def test_decode_nonascii_str(self): decode_funcs = (base64.b64decode, base64.standard_b64decode, base64.urlsafe_b64decode, base64.b32decode, - base64.b16decode) + base64.b16decode, + base64.b85decode, + base64.a85decode) for f in decode_funcs: self.assertRaises(ValueError, f, 'with non-ascii \xcb') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,8 @@ Library ------- +- Issue #17618: Add Base85 and Ascii85 encoding/decoding to the base64 module. + - Issue #19634: time.strftime("%y") now raises a ValueError on AIX when given a year before 1900. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:08:12 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:08:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_compiler_warnings_on_W?= =?utf-8?q?indows_64-bit_in_grammar=2Ec?= Message-ID: <3dN9VN0wJTzQPC@mail.python.org> http://hg.python.org/cpython/rev/ac4272df1af6 changeset: 87229:ac4272df1af6 user: Victor Stinner date: Mon Nov 18 01:07:38 2013 +0100 summary: Fix compiler warnings on Windows 64-bit in grammar.c INT_MAX states and labels should be enough for everyone files: Parser/grammar.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Parser/grammar.c b/Parser/grammar.c --- a/Parser/grammar.c +++ b/Parser/grammar.c @@ -63,7 +63,7 @@ s->s_upper = 0; s->s_accel = NULL; s->s_accept = 0; - return s - d->d_state; + return Py_SAFE_DOWNCAST(s - d->d_state, Py_intptr_t, int); } void @@ -105,7 +105,7 @@ if (Py_DebugFlag) printf("Label @ %8p, %d: %s\n", ll, ll->ll_nlabels, PyGrammar_LabelRepr(lb)); - return lb - ll->ll_label; + return Py_SAFE_DOWNCAST(lb - ll->ll_label, Py_intptr_t, int); } /* Same, but rather dies than adds */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:10:20 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:10:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_a_compiler_warning_on_?= =?utf-8?q?Windows_64-bit_in_parsetok=2Ec?= Message-ID: <3dN9Xr4BmfzQRJ@mail.python.org> http://hg.python.org/cpython/rev/19e900e3033f changeset: 87230:19e900e3033f user: Victor Stinner date: Mon Nov 18 01:09:51 2013 +0100 summary: Fix a compiler warning on Windows 64-bit in parsetok.c Python parser doesn't support lines longer than INT_MAX bytes yet files: Parser/parsetok.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Parser/parsetok.c b/Parser/parsetok.c --- a/Parser/parsetok.c +++ b/Parser/parsetok.c @@ -254,7 +254,8 @@ } #endif if (a >= tok->line_start) - col_offset = a - tok->line_start; + col_offset = Py_SAFE_DOWNCAST(a - tok->line_start, + Py_intptr_t, int); else col_offset = -1; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:21:23 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:21:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Use_Py=5Fssize=5Ft_type_fo?= =?utf-8?q?r_sizes_in_getargs=2Ec?= Message-ID: <3dN9nb3SBFzR8P@mail.python.org> http://hg.python.org/cpython/rev/103998db4407 changeset: 87231:103998db4407 user: Victor Stinner date: Mon Nov 18 01:21:12 2013 +0100 summary: Use Py_ssize_t type for sizes in getargs.c Fix compiler warnings on Windows 64-bit files: Python/getargs.c | 25 ++++++++++++++++++------- 1 files changed, 18 insertions(+), 7 deletions(-) diff --git a/Python/getargs.c b/Python/getargs.c --- a/Python/getargs.c +++ b/Python/getargs.c @@ -421,6 +421,7 @@ int n = 0; const char *format = *p_format; int i; + Py_ssize_t len; for (;;) { int c = *format++; @@ -450,12 +451,20 @@ return msgbuf; } - if ((i = PySequence_Size(arg)) != n) { + len = PySequence_Size(arg); + if (len != n) { levels[0] = 0; - PyOS_snprintf(msgbuf, bufsize, - toplevel ? 
"expected %d arguments, not %d" : - "must be sequence of length %d, not %d", - n, i); + if (toplevel) { + PyOS_snprintf(msgbuf, bufsize, + "expected %d arguments, not %" PY_FORMAT_SIZE_T "d", + n, len); + } + else { + PyOS_snprintf(msgbuf, bufsize, + "must be sequence of length %d, " + "not %" PY_FORMAT_SIZE_T "d", + n, len); + } return msgbuf; } @@ -1426,7 +1435,8 @@ const char *fname, *msg, *custom_msg, *keyword; int min = INT_MAX; int max = INT_MAX; - int i, len, nargs, nkeywords; + int i, len; + Py_ssize_t nargs, nkeywords; PyObject *current_arg; freelistentry_t static_entries[STATIC_FREELIST_ENTRIES]; freelist_t freelist = {static_entries, 0, 0}; @@ -1466,7 +1476,8 @@ nkeywords = (keywords == NULL) ? 0 : PyDict_Size(keywords); if (nargs + nkeywords > len) { PyErr_Format(PyExc_TypeError, - "%s%s takes at most %d argument%s (%d given)", + "%s%s takes at most %d argument%s " + "(%" PY_FORMAT_SIZE_T "d given)", (fname == NULL) ? "function" : fname, (fname == NULL) ? "" : "()", len, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:24:53 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:24:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_sqlite=3A_raise_an_Overflo?= =?utf-8?q?wError_if_the_result_is_longer_than_INT=5FMAX_bytes?= Message-ID: <3dN9sd0lhYzQsH@mail.python.org> http://hg.python.org/cpython/rev/0f7f1f2121a1 changeset: 87232:0f7f1f2121a1 user: Victor Stinner date: Mon Nov 18 01:24:31 2013 +0100 summary: sqlite: raise an OverflowError if the result is longer than INT_MAX bytes Fix a compiler warning on Windows 64-bit files: Modules/_sqlite/connection.c | 10 ++++++++-- 1 files changed, 8 insertions(+), 2 deletions(-) diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c --- a/Modules/_sqlite/connection.c +++ b/Modules/_sqlite/connection.c @@ -522,10 +522,16 @@ const char* buffer; Py_ssize_t buflen; if (PyObject_AsCharBuffer(py_val, &buffer, &buflen) != 0) { - PyErr_SetString(PyExc_ValueError, "could not convert BLOB to buffer"); + PyErr_SetString(PyExc_ValueError, + "could not convert BLOB to buffer"); return -1; } - sqlite3_result_blob(context, buffer, buflen, SQLITE_TRANSIENT); + if (buflen > INT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "BLOB longer than INT_MAX bytes"); + return -1; + } + sqlite3_result_blob(context, buffer, (int)buflen, SQLITE_TRANSIENT); } else { return -1; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:56:50 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:56:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_a_compiler_warning_on_?= =?utf-8?q?Windows_64-bit=3A_=5Fsqlite_module?= Message-ID: <3dNBZV3Yp7zRT4@mail.python.org> http://hg.python.org/cpython/rev/855e172bcac4 changeset: 87233:855e172bcac4 user: Victor Stinner date: Mon Nov 18 01:27:30 2013 +0100 summary: Fix a compiler warning on Windows 64-bit: _sqlite module files: Modules/_sqlite/row.c | 13 +++++++++---- 1 files changed, 9 insertions(+), 4 deletions(-) diff --git a/Modules/_sqlite/row.c b/Modules/_sqlite/row.c --- a/Modules/_sqlite/row.c +++ b/Modules/_sqlite/row.c @@ -67,7 +67,7 @@ { long _idx; char* key; - int nitems, i; + Py_ssize_t nitems, i; char* compare_key; char* p1; @@ -88,7 +88,10 @@ nitems = PyTuple_Size(self->description); for (i = 0; i < nitems; i++) { - compare_key = _PyUnicode_AsString(PyTuple_GET_ITEM(PyTuple_GET_ITEM(self->description, i), 0)); + PyObject 
*obj; + obj = PyTuple_GET_ITEM(self->description, i); + obj = PyTuple_GET_ITEM(obj, 0); + compare_key = _PyUnicode_AsString(obj); if (!compare_key) { return NULL; } @@ -120,10 +123,12 @@ PyErr_SetString(PyExc_IndexError, "No item with that key"); return NULL; - } else if (PySlice_Check(idx)) { + } + else if (PySlice_Check(idx)) { PyErr_SetString(PyExc_ValueError, "slices not implemented, yet"); return NULL; - } else { + } + else { PyErr_SetString(PyExc_IndexError, "Index must be int or string"); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 01:56:51 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 01:56:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_sqlite=3A_raise_an_Overflo?= =?utf-8?q?wError_if_a_string_or_a_BLOB_is_longer_than_INT=5FMAX?= Message-ID: <3dNBZW5PvJzQdB@mail.python.org> http://hg.python.org/cpython/rev/40d25b2b93f0 changeset: 87234:40d25b2b93f0 user: Victor Stinner date: Mon Nov 18 01:36:29 2013 +0100 summary: sqlite: raise an OverflowError if a string or a BLOB is longer than INT_MAX bytes Fix compiler warnings on Windows 64-bit files: Modules/_sqlite/statement.c | 24 ++++++++++++++++-------- 1 files changed, 16 insertions(+), 8 deletions(-) diff --git a/Modules/_sqlite/statement.c b/Modules/_sqlite/statement.c --- a/Modules/_sqlite/statement.c +++ b/Modules/_sqlite/statement.c @@ -132,18 +132,26 @@ break; case TYPE_UNICODE: string = _PyUnicode_AsStringAndSize(parameter, &buflen); - if (string != NULL) - rc = sqlite3_bind_text(self->st, pos, string, buflen, SQLITE_TRANSIENT); - else - rc = -1; + if (string == NULL) + return -1; + if (buflen > INT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "string longer than INT_MAX bytes"); + return -1; + } + rc = sqlite3_bind_text(self->st, pos, string, (int)buflen, SQLITE_TRANSIENT); break; case TYPE_BUFFER: - if (PyObject_AsCharBuffer(parameter, &buffer, &buflen) == 0) { - rc = sqlite3_bind_blob(self->st, pos, buffer, buflen, SQLITE_TRANSIENT); - } else { + if (PyObject_AsCharBuffer(parameter, &buffer, &buflen) != 0) { PyErr_SetString(PyExc_ValueError, "could not convert BLOB to buffer"); - rc = -1; + return -1; } + if (buflen > INT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "BLOB longer than INT_MAX bytes"); + return -1; + } + rc = sqlite3_bind_blob(self->st, pos, buffer, buflen, SQLITE_TRANSIENT); break; case TYPE_UNKNOWN: rc = -1; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 02:05:45 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 02:05:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_PY=5FFORMAT=5FSIZE=5FT_sho?= =?utf-8?q?uld_not_be_used_with_PyErr=5FFormat=28=29=2C_PyErr=5FFormat=28?= =?utf-8?b?IiV6ZCIpIGlz?= Message-ID: <3dNBmn1DRJzRf4@mail.python.org> http://hg.python.org/cpython/rev/d1dc7888656f changeset: 87235:d1dc7888656f user: Victor Stinner date: Mon Nov 18 02:05:31 2013 +0100 summary: PY_FORMAT_SIZE_T should not be used with PyErr_Format(), PyErr_Format("%zd") is portable files: Python/getargs.c | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Python/getargs.c b/Python/getargs.c --- a/Python/getargs.c +++ b/Python/getargs.c @@ -1476,8 +1476,7 @@ nkeywords = (keywords == NULL) ? 
0 : PyDict_Size(keywords); if (nargs + nkeywords > len) { PyErr_Format(PyExc_TypeError, - "%s%s takes at most %d argument%s " - "(%" PY_FORMAT_SIZE_T "d given)", + "%s%s takes at most %d argument%s (%zd given)", (fname == NULL) ? "function" : fname, (fname == NULL) ? "" : "()", len, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 02:07:44 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 02:07:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_sqlite=3A_Use_Py=5Fssize?= =?utf-8?q?=5Ft_to_store_a_size_instead_of_an_int?= Message-ID: <3dNBq46T96zR8P@mail.python.org> http://hg.python.org/cpython/rev/2a01ca4b0edc changeset: 87236:2a01ca4b0edc user: Victor Stinner date: Mon Nov 18 02:07:29 2013 +0100 summary: sqlite: Use Py_ssize_t to store a size instead of an int Fix a compiler warning on Windows 64-bit files: Modules/_sqlite/statement.c | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Modules/_sqlite/statement.c b/Modules/_sqlite/statement.c --- a/Modules/_sqlite/statement.c +++ b/Modules/_sqlite/statement.c @@ -184,7 +184,7 @@ int i; int rc; int num_params_needed; - int num_params; + Py_ssize_t num_params; Py_BEGIN_ALLOW_THREADS num_params_needed = sqlite3_bind_parameter_count(self->st); @@ -200,7 +200,9 @@ num_params = PySequence_Size(parameters); } if (num_params != num_params_needed) { - PyErr_Format(pysqlite_ProgrammingError, "Incorrect number of bindings supplied. The current statement uses %d, and there are %d supplied.", + PyErr_Format(pysqlite_ProgrammingError, + "Incorrect number of bindings supplied. The current " + "statement uses %d, and there are %zd supplied.", num_params_needed, num_params); return; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 02:11:55 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 18 Nov 2013 02:11:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Skip_test=5Fasyncio_if_con?= =?utf-8?q?current=2Efutures_can=27t_be_imported=2E_Hopeful_fix_for?= Message-ID: <3dNBvv0PHCzQRJ@mail.python.org> http://hg.python.org/cpython/rev/0031ac40806a changeset: 87237:0031ac40806a user: Guido van Rossum date: Sun Nov 17 17:00:21 2013 -0800 summary: Skip test_asyncio if concurrent.futures can't be imported. Hopeful fix for issue 19645. files: Lib/test/test_asyncio/__init__.py | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_asyncio/__init__.py b/Lib/test/test_asyncio/__init__.py --- a/Lib/test/test_asyncio/__init__.py +++ b/Lib/test/test_asyncio/__init__.py @@ -1,12 +1,12 @@ import os import sys import unittest -from test.support import run_unittest +from test.support import run_unittest, import_module -try: - import threading -except ImportError: - raise unittest.SkipTest("No module named '_thread'") +# Skip tests if we don't have threading. +import_module('threading') +# Skip tests if we don't have concurrent.futures. 
+import_module('concurrent.futures') def suite(): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 02:44:26 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 02:44:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319634=3A_Fix_time?= =?utf-8?q?=5Fstrftime=28=29_on_AIX=2C_format_is_a_wchar=5Ft*_not_a_PyObje?= =?utf-8?q?ct*?= Message-ID: <3dNCdQ75VjzQsH@mail.python.org> http://hg.python.org/cpython/rev/652de09a3a1a changeset: 87238:652de09a3a1a user: Victor Stinner date: Mon Nov 18 02:43:29 2013 +0100 summary: Issue #19634: Fix time_strftime() on AIX, format is a wchar_t* not a PyObject* files: Modules/timemodule.c | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -650,7 +650,7 @@ return NULL; } } -#elif defined(_AIX) +#elif defined(_AIX) && defined(HAVE_WCSFTIME) for(outbuf = wcschr(fmt, '%'); outbuf != NULL; outbuf = wcschr(outbuf+2, '%')) @@ -660,7 +660,6 @@ if (outbuf[1] == L'y' && buf.tm_year < 0) { PyErr_SetString(PyExc_ValueError, "format %y requires year >= 1900 on AIX"); - Py_DECREF(format); return NULL; } } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 05:08:32 2013 From: python-checkins at python.org (zach.ware) Date: Mon, 18 Nov 2013 05:08:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319520=3A_Fix_=28t?= =?utf-8?q?he_last!=29_compiler_warning_on_32bit_Windows=2C_in_=5Fsha3?= Message-ID: <3dNGqh0mx8zPZD@mail.python.org> http://hg.python.org/cpython/rev/259c82332199 changeset: 87239:259c82332199 user: Zachary Ware date: Sun Nov 17 16:08:23 2013 -0600 summary: Issue #19520: Fix (the last!) compiler warning on 32bit Windows, in _sha3 files: Misc/NEWS | 2 ++ Modules/_sha3/keccak/KeccakF-1600-opt32.c | 2 +- 2 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -285,6 +285,8 @@ Build ----- +- Issue #19520: Fix compiler warning in the _sha3 module on 32bit Windows. + - Issue #19356: Avoid using a C variabled named "_self", it's a reserved word in some C compilers. 
diff --git a/Modules/_sha3/keccak/KeccakF-1600-opt32.c b/Modules/_sha3/keccak/KeccakF-1600-opt32.c --- a/Modules/_sha3/keccak/KeccakF-1600-opt32.c +++ b/Modules/_sha3/keccak/KeccakF-1600-opt32.c @@ -212,7 +212,7 @@ #define extractLanes(laneCount, state, data) \ { \ - int i; \ + unsigned int i; \ for(i=0; i<(laneCount); i++) \ setInterleavedWordsInto8bytes(data+i*8, (UINT32*)state+i*2); \ } -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Mon Nov 18 07:37:41 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 18 Nov 2013 07:37:41 +0100 Subject: [Python-checkins] Daily reference leaks (652de09a3a1a): sum=0 Message-ID: results for 652de09a3a1a on branch "default" -------------------------------------------- test_site leaked [2, -2, 0] references, sum=0 test_site leaked [2, -2, 0] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflog3O1l4v', '-x'] From python-checkins at python.org Mon Nov 18 10:04:17 2013 From: python-checkins at python.org (christian.heimes) Date: Mon, 18 Nov 2013 10:04:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_unused_code_path_fr?= =?utf-8?q?om_PBKDF2_that_is_causing_a_warning_on_Win64?= Message-ID: <3dNPNx1xLtzPtV@mail.python.org> http://hg.python.org/cpython/rev/ce19003cbfc3 changeset: 87240:ce19003cbfc3 user: Christian Heimes date: Mon Nov 18 09:59:44 2013 +0100 summary: Remove unused code path from PBKDF2 that is causing a warning on Win64 files: Modules/_hashopenssl.c | 6 +----- 1 files changed, 1 insertions(+), 5 deletions(-) diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -504,10 +504,6 @@ HMAC_CTX_init(&hctx); p = out; tkeylen = keylen; - if (!pass) - passlen = 0; - else if(passlen == -1) - passlen = strlen(pass); if (!HMAC_Init_ex(&hctx_tpl, pass, passlen, digest, NULL)) { HMAC_CTX_cleanup(&hctx_tpl); return 0; @@ -671,7 +667,7 @@ Py_BEGIN_ALLOW_THREADS retval = PKCS5_PBKDF2_HMAC_fast((char*)password.buf, (int)password.len, - (unsigned char *)salt.buf, salt.len, + (unsigned char *)salt.buf, (int)salt.len, iterations, digest, dklen, (unsigned char *)key); Py_END_ALLOW_THREADS -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 10:04:18 2013 From: python-checkins at python.org (christian.heimes) Date: Mon, 18 Nov 2013 10:04:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Safely_downcast_SOCKET=5FT?= =?utf-8?q?_to_int_in_=5Fssl_module?= Message-ID: <3dNPNy3fL6zPtV@mail.python.org> http://hg.python.org/cpython/rev/00348c0518f8 changeset: 87241:00348c0518f8 user: Christian Heimes date: Mon Nov 18 10:04:07 2013 +0100 summary: Safely downcast SOCKET_T to int in _ssl module files: Modules/_ssl.c | 8 +++++--- 1 files changed, 5 insertions(+), 3 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -499,7 +499,7 @@ self->ssl = SSL_new(ctx); PySSL_END_ALLOW_THREADS SSL_set_app_data(self->ssl,self); - SSL_set_fd(self->ssl, sock->sock_fd); + SSL_set_fd(self->ssl, Py_SAFE_DOWNCAST(sock->sock_fd, SOCKET_T, int)); mode = SSL_MODE_ACCEPT_MOVING_WRITE_BUFFER; #ifdef SSL_MODE_AUTO_RETRY mode |= SSL_MODE_AUTO_RETRY; @@ -1378,9 +1378,11 @@ /* See if the socket is ready */ PySSL_BEGIN_ALLOW_THREADS if (writing) - rc = select(s->sock_fd+1, NULL, &fds, NULL, &tv); + rc = select(Py_SAFE_DOWNCAST(s->sock_fd+1, SOCKET_T, int), + NULL, &fds, NULL, &tv); else - rc = 
select(s->sock_fd+1, &fds, NULL, NULL, &tv); + rc = select(Py_SAFE_DOWNCAST(s->sock_fd+1, SOCKET_T, int), + &fds, NULL, NULL, &tv); PySSL_END_ALLOW_THREADS #ifdef HAVE_POLL -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 10:30:50 2013 From: python-checkins at python.org (christian.heimes) Date: Mon, 18 Nov 2013 10:30:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_GetVolumePathNameW=3A_down?= =?utf-8?q?cast_bufsize_to_DWORD?= Message-ID: <3dNPzZ5pJwzPYf@mail.python.org> http://hg.python.org/cpython/rev/947bd57a145c changeset: 87242:947bd57a145c user: Christian Heimes date: Mon Nov 18 10:30:42 2013 +0100 summary: GetVolumePathNameW: downcast bufsize to DWORD files: Modules/posixmodule.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -4054,7 +4054,8 @@ return PyErr_NoMemory(); Py_BEGIN_ALLOW_THREADS - ret = GetVolumePathNameW(path, mountpath, bufsize); + ret = GetVolumePathNameW(path, mountpath, + Py_SAFE_DOWNCAST(bufsize, size_t, DWORD)); Py_END_ALLOW_THREADS if (!ret) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 11:06:06 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 11:06:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Relax_timing_on_test=5Fasy?= =?utf-8?q?ncio_for_busy_=28slow=29_Windows_buildbots?= Message-ID: <3dNQmG33RBz7Mdy@mail.python.org> http://hg.python.org/cpython/rev/df3d792b2468 changeset: 87243:df3d792b2468 user: Victor Stinner date: Mon Nov 18 11:05:22 2013 +0100 summary: Relax timing on test_asyncio for busy (slow) Windows buildbots http://buildbot.python.org/all/builders/AMD64%20Windows%20Server%202008%20%5BSB%5D%203.x/builds/1649/steps/test/logs/stdio ====================================================================== FAIL: test_wait_for_handle (test.test_asyncio.test_windows_events.ProactorTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "E:\home\cpython\buildslave\x64\3.x.snakebite-win2k8r2sp1-amd64\build\lib\test\test_asyncio\test_windows_events.py", line 112, in test_wait_for_handle self.assertTrue(0.18 < elapsed < 0.22, elapsed) AssertionError: False is not true : 0.25 files: Lib/test/test_asyncio/test_windows_events.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_windows_events.py b/Lib/test/test_asyncio/test_windows_events.py --- a/Lib/test/test_asyncio/test_windows_events.py +++ b/Lib/test/test_asyncio/test_windows_events.py @@ -109,7 +109,7 @@ self.loop.run_until_complete(f) elapsed = self.loop.time() - start self.assertFalse(f.result()) - self.assertTrue(0.18 < elapsed < 0.22, elapsed) + self.assertTrue(0.18 < elapsed < 0.5, elapsed) _overlapped.SetEvent(event) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 12:07:38 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Mon, 18 Nov 2013 12:07:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=238402=3A_Added_the?= =?utf-8?q?_escape=28=29_function_to_the_glob_module=2E?= Message-ID: <3dNS7G1N1Zz7Mdv@mail.python.org> http://hg.python.org/cpython/rev/5fda36bff39d changeset: 87244:5fda36bff39d user: Serhiy Storchaka date: Mon Nov 18 13:06:43 2013 +0200 summary: Issue #8402: Added the escape() function to the glob module. 
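For readers skimming the patch, a minimal usage sketch of the new glob.escape() — based on the docstring and tests added in this changeset, and assuming a build that already ships it (3.4+):

import glob

# A literal file name containing glob metacharacters...
name = 'report[2013].txt'

# ...can now be matched safely by escaping it first.
pattern = glob.escape(name)     # 'report[[]2013].txt'
matches = glob.glob(pattern)    # matches only the literal name, not a pattern

# Drive/UNC prefixes are deliberately left untouched (from the new docs):
# glob.escape('//?/c:/Quo vadis?.txt') == '//?/c:/Quo vadis[?].txt'

Escaping by wrapping each of '*', '?' and '[' in square brackets (rather than with a backslash) fits fnmatch-style patterns, which have no escape character.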
files: Doc/library/glob.rst | 11 +++++++++++ Lib/glob.py | 16 ++++++++++++++-- Lib/test/test_glob.py | 22 ++++++++++++++++++++++ Misc/NEWS | 2 ++ 4 files changed, 49 insertions(+), 2 deletions(-) diff --git a/Doc/library/glob.rst b/Doc/library/glob.rst --- a/Doc/library/glob.rst +++ b/Doc/library/glob.rst @@ -40,6 +40,17 @@ without actually storing them all simultaneously. +.. function:: escape(pathname) + + Escape all special characters (``'?'``, ``'*'`` and ``'['``). + This is useful if you want to match an arbitrary literal string that may + have special characters in it. Special characters in drive/UNC + sharepoints are not escaped, e.g. on Windows + ``escape('//?/c:/Quo vadis?.txt')`` returns ``'//?/c:/Quo vadis[?].txt'``. + + .. versionadded:: 3.4 + + For example, consider a directory containing only the following files: :file:`1.gif`, :file:`2.txt`, and :file:`card.gif`. :func:`glob` will produce the following results. Notice how any leading components of the path are diff --git a/Lib/glob.py b/Lib/glob.py --- a/Lib/glob.py +++ b/Lib/glob.py @@ -79,8 +79,8 @@ return [] -magic_check = re.compile('[*?[]') -magic_check_bytes = re.compile(b'[*?[]') +magic_check = re.compile('([*?[])') +magic_check_bytes = re.compile(b'([*?[])') def has_magic(s): if isinstance(s, bytes): @@ -91,3 +91,15 @@ def _ishidden(path): return path[0] in ('.', b'.'[0]) + +def escape(pathname): + """Escape all special characters. + """ + # Escaping is done by wrapping any of "*?[" between square brackets. + # Metacharacters do not work in the drive part and shouldn't be escaped. + drive, pathname = os.path.splitdrive(pathname) + if isinstance(pathname, bytes): + pathname = magic_check_bytes.sub(br'[\1]', pathname) + else: + pathname = magic_check.sub(r'[\1]', pathname) + return drive + pathname diff --git a/Lib/test/test_glob.py b/Lib/test/test_glob.py --- a/Lib/test/test_glob.py +++ b/Lib/test/test_glob.py @@ -169,6 +169,28 @@ eq(glob.glob('\\\\*\\*\\'), []) eq(glob.glob(b'\\\\*\\*\\'), []) + def check_escape(self, arg, expected): + self.assertEqual(glob.escape(arg), expected) + self.assertEqual(glob.escape(os.fsencode(arg)), os.fsencode(expected)) + + def test_escape(self): + check = self.check_escape + check('abc', 'abc') + check('[', '[[]') + check('?', '[?]') + check('*', '[*]') + check('[[_/*?*/_]]', '[[][[]_/[*][?][*]/_]]') + check('/[[_/*?*/_]]/', '/[[][[]_/[*][?][*]/_]]/') + + @unittest.skipUnless(sys.platform == "win32", "Win32 specific test") + def test_escape_windows(self): + check = self.check_escape + check('?:?', '?:[?]') + check('*:*', '*:[*]') + check(r'\\?\c:\?', r'\\?\c:\[?]') + check(r'\\*\*\*', r'\\*\*\[*]') + check('//?/c:/?', '//?/c:/[?]') + check('//*/*/*', '//*/*/[*]') def test_main(): run_unittest(GlobTests) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,8 @@ Library ------- +- Issue #8402: Added the escape() function to the glob module. + - Issue #17618: Add Base85 and Ascii85 encoding/decoding to the base64 module. 
- Issue #19634: time.strftime("%y") now raises a ValueError on AIX when given a -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 18:32:31 2013 From: python-checkins at python.org (larry.hastings) Date: Mon, 18 Nov 2013 18:32:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Argument_Clinic=3A_rename_?= =?utf-8?q?=22self=22_to_=22module=22_for_module-level_functions=2E?= Message-ID: <3dNcgM5ghWz7Mtv@mail.python.org> http://hg.python.org/cpython/rev/64c853a88a10 changeset: 87245:64c853a88a10 user: Larry Hastings date: Mon Nov 18 09:32:13 2013 -0800 summary: Argument Clinic: rename "self" to "module" for module-level functions. files: Modules/_cursesmodule.c | 1 - Modules/_datetimemodule.c | 23 +++++++------- Modules/_dbmmodule.c | 10 +++--- Modules/_weakref.c | 10 +++--- Modules/posixmodule.c | 42 +++++++++++++------------- Modules/unicodedata.c | 23 +++++++------- Modules/zlibmodule.c | 40 +++++++++++++------------ Objects/dictobject.c | 2 +- Objects/unicodeobject.c | 2 +- Tools/clinic/clinic.py | 26 ++++++++++------ 10 files changed, 94 insertions(+), 85 deletions(-) diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -551,7 +551,6 @@ /*[clinic] module curses - class curses.window curses.window.addch diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -4141,9 +4141,10 @@ /*[clinic] module datetime +class datetime.datetime @classmethod -datetime.now +datetime.datetime.now tz: object = None Timezone object. @@ -4153,23 +4154,23 @@ If no tz is specified, uses local timezone. [clinic]*/ -PyDoc_STRVAR(datetime_now__doc__, +PyDoc_STRVAR(datetime_datetime_now__doc__, "Returns new datetime object representing current time local to tz.\n" "\n" -"datetime.now(tz=None)\n" +"datetime.datetime.now(tz=None)\n" " tz\n" " Timezone object.\n" "\n" "If no tz is specified, uses local timezone."); -#define DATETIME_NOW_METHODDEF \ - {"now", (PyCFunction)datetime_now, METH_VARARGS|METH_KEYWORDS|METH_CLASS, datetime_now__doc__}, +#define DATETIME_DATETIME_NOW_METHODDEF \ + {"now", (PyCFunction)datetime_datetime_now, METH_VARARGS|METH_KEYWORDS|METH_CLASS, datetime_datetime_now__doc__}, static PyObject * -datetime_now_impl(PyObject *cls, PyObject *tz); +datetime_datetime_now_impl(PyObject *cls, PyObject *tz); static PyObject * -datetime_now(PyObject *cls, PyObject *args, PyObject *kwargs) +datetime_datetime_now(PyObject *cls, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"tz", NULL}; @@ -4179,15 +4180,15 @@ "|O:now", _keywords, &tz)) goto exit; - return_value = datetime_now_impl(cls, tz); + return_value = datetime_datetime_now_impl(cls, tz); exit: return return_value; } static PyObject * -datetime_now_impl(PyObject *cls, PyObject *tz) -/*[clinic checksum: 328b54387f4c2f8cb534997e1bd55f8cb38c4992]*/ +datetime_datetime_now_impl(PyObject *cls, PyObject *tz) +/*[clinic checksum: cde1daca68c9b7dca6df51759db2de1d43a39774]*/ { PyObject *self; @@ -5037,7 +5038,7 @@ /* Class methods: */ - DATETIME_NOW_METHODDEF + DATETIME_DATETIME_NOW_METHODDEF {"utcnow", (PyCFunction)datetime_utcnow, METH_NOARGS | METH_CLASS, diff --git a/Modules/_dbmmodule.c b/Modules/_dbmmodule.c --- a/Modules/_dbmmodule.c +++ b/Modules/_dbmmodule.c @@ -415,10 +415,10 @@ {"open", (PyCFunction)dbmopen, METH_VARARGS, dbmopen__doc__}, static PyObject * -dbmopen_impl(PyObject *self, const char 
*filename, const char *flags, int mode); +dbmopen_impl(PyObject *module, const char *filename, const char *flags, int mode); static PyObject * -dbmopen(PyObject *self, PyObject *args) +dbmopen(PyObject *module, PyObject *args) { PyObject *return_value = NULL; const char *filename; @@ -429,15 +429,15 @@ "s|si:open", &filename, &flags, &mode)) goto exit; - return_value = dbmopen_impl(self, filename, flags, mode); + return_value = dbmopen_impl(module, filename, flags, mode); exit: return return_value; } static PyObject * -dbmopen_impl(PyObject *self, const char *filename, const char *flags, int mode) -/*[clinic checksum: 61007c796d38af85c8035afa769fb4bb453429ee]*/ +dbmopen_impl(PyObject *module, const char *filename, const char *flags, int mode) +/*[clinic checksum: 2b0ec9e3c6ecd19e06d16c9f0ba33848245cb1ab]*/ { int iflags; diff --git a/Modules/_weakref.c b/Modules/_weakref.c --- a/Modules/_weakref.c +++ b/Modules/_weakref.c @@ -25,14 +25,14 @@ {"getweakrefcount", (PyCFunction)_weakref_getweakrefcount, METH_O, _weakref_getweakrefcount__doc__}, static Py_ssize_t -_weakref_getweakrefcount_impl(PyObject *self, PyObject *object); +_weakref_getweakrefcount_impl(PyObject *module, PyObject *object); static PyObject * -_weakref_getweakrefcount(PyObject *self, PyObject *object) +_weakref_getweakrefcount(PyObject *module, PyObject *object) { PyObject *return_value = NULL; Py_ssize_t _return_value; - _return_value = _weakref_getweakrefcount_impl(self, object); + _return_value = _weakref_getweakrefcount_impl(module, object); if ((_return_value == -1) && PyErr_Occurred()) goto exit; return_value = PyLong_FromSsize_t(_return_value); @@ -42,8 +42,8 @@ } static Py_ssize_t -_weakref_getweakrefcount_impl(PyObject *self, PyObject *object) -/*[clinic checksum: 0b7e7ddd87d483719ebac0fba364fff0ed0182d9]*/ +_weakref_getweakrefcount_impl(PyObject *module, PyObject *object) +/*[clinic checksum: 05cffbc3a4b193a0b7e645da81be281748704f69]*/ { PyWeakReference **list; diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -2460,10 +2460,10 @@ {"stat", (PyCFunction)os_stat, METH_VARARGS|METH_KEYWORDS, os_stat__doc__}, static PyObject * -os_stat_impl(PyObject *self, path_t *path, int dir_fd, int follow_symlinks); - -static PyObject * -os_stat(PyObject *self, PyObject *args, PyObject *kwargs) +os_stat_impl(PyObject *module, path_t *path, int dir_fd, int follow_symlinks); + +static PyObject * +os_stat(PyObject *module, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"path", "dir_fd", "follow_symlinks", NULL}; @@ -2475,7 +2475,7 @@ "O&|$O&p:stat", _keywords, path_converter, &path, OS_STAT_DIR_FD_CONVERTER, &dir_fd, &follow_symlinks)) goto exit; - return_value = os_stat_impl(self, &path, dir_fd, follow_symlinks); + return_value = os_stat_impl(module, &path, dir_fd, follow_symlinks); exit: /* Cleanup for path */ @@ -2485,8 +2485,8 @@ } static PyObject * -os_stat_impl(PyObject *self, path_t *path, int dir_fd, int follow_symlinks) -/*[clinic checksum: 9d9af08e8cfafd12f94e73ea3065eb3056f99515]*/ +os_stat_impl(PyObject *module, path_t *path, int dir_fd, int follow_symlinks) +/*[clinic checksum: 89390f78327e3f045a81974d758d3996e2a71f68]*/ { return posix_do_stat("stat", path, dir_fd, follow_symlinks); } @@ -2600,10 +2600,10 @@ {"access", (PyCFunction)os_access, METH_VARARGS|METH_KEYWORDS, os_access__doc__}, static PyObject * -os_access_impl(PyObject *self, path_t *path, int mode, int dir_fd, int effective_ids, int 
follow_symlinks); - -static PyObject * -os_access(PyObject *self, PyObject *args, PyObject *kwargs) +os_access_impl(PyObject *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks); + +static PyObject * +os_access(PyObject *module, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"path", "mode", "dir_fd", "effective_ids", "follow_symlinks", NULL}; @@ -2617,7 +2617,7 @@ "O&i|$O&pp:access", _keywords, path_converter, &path, &mode, OS_STAT_DIR_FD_CONVERTER, &dir_fd, &effective_ids, &follow_symlinks)) goto exit; - return_value = os_access_impl(self, &path, mode, dir_fd, effective_ids, follow_symlinks); + return_value = os_access_impl(module, &path, mode, dir_fd, effective_ids, follow_symlinks); exit: /* Cleanup for path */ @@ -2627,8 +2627,8 @@ } static PyObject * -os_access_impl(PyObject *self, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks) -/*[clinic checksum: 0147557eb43243df57ba616cc7c35f232c69bc6a]*/ +os_access_impl(PyObject *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks) +/*[clinic checksum: aa3e145816a748172e62df8e44af74169c7e1247]*/ { PyObject *return_value = NULL; @@ -2734,10 +2734,10 @@ {"ttyname", (PyCFunction)os_ttyname, METH_VARARGS, os_ttyname__doc__}, static char * -os_ttyname_impl(PyObject *self, int fd); - -static PyObject * -os_ttyname(PyObject *self, PyObject *args) +os_ttyname_impl(PyObject *module, int fd); + +static PyObject * +os_ttyname(PyObject *module, PyObject *args) { PyObject *return_value = NULL; int fd; @@ -2747,7 +2747,7 @@ "i:ttyname", &fd)) goto exit; - _return_value = os_ttyname_impl(self, fd); + _return_value = os_ttyname_impl(module, fd); if (_return_value == NULL) goto exit; return_value = PyUnicode_DecodeFSDefault(_return_value); @@ -2757,8 +2757,8 @@ } static char * -os_ttyname_impl(PyObject *self, int fd) -/*[clinic checksum: ea680155d87bb733f542d67653eca732dd0981a8]*/ +os_ttyname_impl(PyObject *module, int fd) +/*[clinic checksum: c742dd621ec98d0f81d37d264e1d3c89c7a5fb1a]*/ { char *ret; diff --git a/Modules/unicodedata.c b/Modules/unicodedata.c --- a/Modules/unicodedata.c +++ b/Modules/unicodedata.c @@ -109,7 +109,8 @@ /*[clinic] module unicodedata -unicodedata.decimal +class unicodedata.UCD +unicodedata.UCD.decimal unichr: object(type='str') default: object=NULL @@ -122,23 +123,23 @@ not given, ValueError is raised. [clinic]*/ -PyDoc_STRVAR(unicodedata_decimal__doc__, +PyDoc_STRVAR(unicodedata_UCD_decimal__doc__, "Converts a Unicode character into its equivalent decimal value.\n" "\n" -"unicodedata.decimal(unichr, default=None)\n" +"unicodedata.UCD.decimal(unichr, default=None)\n" "\n" "Returns the decimal value assigned to the Unicode character unichr\n" "as integer. 
If no such value is defined, default is returned, or, if\n" "not given, ValueError is raised."); -#define UNICODEDATA_DECIMAL_METHODDEF \ - {"decimal", (PyCFunction)unicodedata_decimal, METH_VARARGS, unicodedata_decimal__doc__}, +#define UNICODEDATA_UCD_DECIMAL_METHODDEF \ + {"decimal", (PyCFunction)unicodedata_UCD_decimal, METH_VARARGS, unicodedata_UCD_decimal__doc__}, static PyObject * -unicodedata_decimal_impl(PyObject *self, PyObject *unichr, PyObject *default_value); +unicodedata_UCD_decimal_impl(PyObject *self, PyObject *unichr, PyObject *default_value); static PyObject * -unicodedata_decimal(PyObject *self, PyObject *args) +unicodedata_UCD_decimal(PyObject *self, PyObject *args) { PyObject *return_value = NULL; PyObject *unichr; @@ -148,15 +149,15 @@ "O!|O:decimal", &PyUnicode_Type, &unichr, &default_value)) goto exit; - return_value = unicodedata_decimal_impl(self, unichr, default_value); + return_value = unicodedata_UCD_decimal_impl(self, unichr, default_value); exit: return return_value; } static PyObject * -unicodedata_decimal_impl(PyObject *self, PyObject *unichr, PyObject *default_value) -/*[clinic checksum: 76c8d1c3dbee495d4cfd86ca6829543a3129344a]*/ +unicodedata_UCD_decimal_impl(PyObject *self, PyObject *unichr, PyObject *default_value) +/*[clinic checksum: a0980c387387287e2ac230c37d95b26f6903e0d2]*/ { PyUnicodeObject *v = (PyUnicodeObject *)unichr; int have_old = 0; @@ -1288,7 +1289,7 @@ /* XXX Add doc strings. */ static PyMethodDef unicodedata_functions[] = { - UNICODEDATA_DECIMAL_METHODDEF + UNICODEDATA_UCD_DECIMAL_METHODDEF {"digit", unicodedata_digit, METH_VARARGS, unicodedata_digit__doc__}, {"numeric", unicodedata_numeric, METH_VARARGS, unicodedata_numeric__doc__}, {"category", unicodedata_category, METH_VARARGS, diff --git a/Modules/zlibmodule.c b/Modules/zlibmodule.c --- a/Modules/zlibmodule.c +++ b/Modules/zlibmodule.c @@ -630,8 +630,9 @@ /*[clinic] module zlib +class zlib.Decompress -zlib.decompress +zlib.Decompress.decompress data: Py_buffer The binary data to decompress. @@ -648,10 +649,10 @@ Call the flush() method to clear these buffers. 
[clinic]*/ -PyDoc_STRVAR(zlib_decompress__doc__, +PyDoc_STRVAR(zlib_Decompress_decompress__doc__, "Return a string containing the decompressed version of the data.\n" "\n" -"zlib.decompress(data, max_length=0)\n" +"zlib.Decompress.decompress(data, max_length=0)\n" " data\n" " The binary data to decompress.\n" " max_length\n" @@ -663,14 +664,14 @@ "internal buffers for later processing.\n" "Call the flush() method to clear these buffers."); -#define ZLIB_DECOMPRESS_METHODDEF \ - {"decompress", (PyCFunction)zlib_decompress, METH_VARARGS, zlib_decompress__doc__}, +#define ZLIB_DECOMPRESS_DECOMPRESS_METHODDEF \ + {"decompress", (PyCFunction)zlib_Decompress_decompress, METH_VARARGS, zlib_Decompress_decompress__doc__}, static PyObject * -zlib_decompress_impl(PyObject *self, Py_buffer *data, int max_length); +zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, int max_length); static PyObject * -zlib_decompress(PyObject *self, PyObject *args) +zlib_Decompress_decompress(PyObject *self, PyObject *args) { PyObject *return_value = NULL; Py_buffer data; @@ -680,7 +681,7 @@ "y*|i:decompress", &data, &max_length)) goto exit; - return_value = zlib_decompress_impl(self, &data, max_length); + return_value = zlib_Decompress_decompress_impl(self, &data, max_length); exit: /* Cleanup for data */ @@ -690,8 +691,8 @@ } static PyObject * -zlib_decompress_impl(PyObject *self, Py_buffer *data, int max_length) -/*[clinic checksum: 168d093d400739dde947cca1f4fb0f9d51cdc2c9]*/ +zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, int max_length) +/*[clinic checksum: bfac7a0f07e891869d87c665a76dc2611014420f]*/ { compobject *zself = (compobject *)self; int err; @@ -907,22 +908,23 @@ /*[clinic] -zlib.copy +class zlib.Compress +zlib.Compress.copy Return a copy of the compression object. 
[clinic]*/ -PyDoc_STRVAR(zlib_copy__doc__, +PyDoc_STRVAR(zlib_Compress_copy__doc__, "Return a copy of the compression object.\n" "\n" -"zlib.copy()"); +"zlib.Compress.copy()"); -#define ZLIB_COPY_METHODDEF \ - {"copy", (PyCFunction)zlib_copy, METH_NOARGS, zlib_copy__doc__}, +#define ZLIB_COMPRESS_COPY_METHODDEF \ + {"copy", (PyCFunction)zlib_Compress_copy, METH_NOARGS, zlib_Compress_copy__doc__}, static PyObject * -zlib_copy(PyObject *self) -/*[clinic checksum: 7b648de2c1f933ba2b9fa17331ff1a44d9a4a740]*/ +zlib_Compress_copy(PyObject *self) +/*[clinic checksum: 2551952e72329f0f2beb48a1dde3780e485a220b]*/ { compobject *zself = (compobject *)self; compobject *retval = NULL; @@ -1118,14 +1120,14 @@ {"flush", (binaryfunc)PyZlib_flush, METH_VARARGS, comp_flush__doc__}, #ifdef HAVE_ZLIB_COPY - ZLIB_COPY_METHODDEF + ZLIB_COMPRESS_COPY_METHODDEF #endif {NULL, NULL} }; static PyMethodDef Decomp_methods[] = { - ZLIB_DECOMPRESS_METHODDEF + ZLIB_DECOMPRESS_DECOMPRESS_METHODDEF {"flush", (binaryfunc)PyZlib_unflush, METH_VARARGS, decomp_flush__doc__}, #ifdef HAVE_ZLIB_COPY diff --git a/Objects/dictobject.c b/Objects/dictobject.c --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -2161,7 +2161,7 @@ } /*[clinic] -module dict +class dict @coexist dict.__contains__ diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -12722,7 +12722,7 @@ } /*[clinic] -module str +class str @staticmethod str.maketrans as unicode_maketrans diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -127,13 +127,13 @@ def is_legal_py_identifier(s): return all(is_legal_c_identifier(field) for field in s.split('.')) -# added "self", "cls", and "null" just to be safe +# added "module", "self", "cls", and "null" just to be safe # (clinic will generate variables with these names) c_keywords = set(""" asm auto break case char cls const continue default do double -else enum extern float for goto if inline int long null register -return self short signed sizeof static struct switch typedef -typeof union unsigned void volatile while +else enum extern float for goto if inline int long module null +register return self short signed sizeof static struct switch +typedef typeof union unsigned void volatile while """.strip().split()) def ensure_legal_c_identifier(s): @@ -620,7 +620,7 @@ else: if f.kind == CALLABLE: meth_flags = '' - self_name = "self" + self_name = "self" if f.cls else "module" elif f.kind == CLASS_METHOD: meth_flags = 'METH_CLASS' self_name = "cls" @@ -1028,6 +1028,7 @@ self.verify = verify self.filename = filename self.modules = collections.OrderedDict() + self.classes = collections.OrderedDict() global clinic clinic = self @@ -1064,7 +1065,7 @@ if not in_classes: child = parent.modules.get(field) if child: - module = child + parent = module = child continue in_classes = True if not hasattr(parent, 'classes'): @@ -1129,6 +1130,9 @@ self.classes = collections.OrderedDict() self.functions = [] + def __repr__(self): + return "" + class Class: def __init__(self, name, module=None, cls=None): self.name = name @@ -1139,6 +1143,10 @@ self.classes = collections.OrderedDict() self.functions = [] + def __repr__(self): + return "" + + DATA, CALLABLE, METHOD, STATIC_METHOD, CLASS_METHOD = range(5) class Function: @@ -1808,13 +1816,11 @@ so_far = [] module, cls = self.clinic._module_and_class(fields) - if not module: - fail("You must explicitly specify the module for the class.") - c = Class(name, 
module, cls) - module.classes[name] = c if cls: cls.classes[name] = c + else: + module.classes[name] = c self.block.signatures.append(c) def at_classmethod(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 18:41:28 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 18:41:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_erro?= =?utf-8?q?r_handling_of_CDataType=5Ffrom=5Fbuffer=28=29?= Message-ID: <3dNcsh0Tqrz7Mv1@mail.python.org> http://hg.python.org/cpython/rev/8f770cdb7a19 changeset: 87246:8f770cdb7a19 user: Victor Stinner date: Mon Nov 18 18:35:55 2013 +0100 summary: Issue #19437: Fix error handling of CDataType_from_buffer() KeepRef() decreases the reference counter of its 'keep' parameter on error files: Modules/_ctypes/_ctypes.c | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -454,7 +454,6 @@ Py_INCREF(obj); if (-1 == KeepRef((CDataObject *)result, -1, obj)) { - Py_DECREF(result); return NULL; } return result; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 18:41:29 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 18:41:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319437=3A_Fix_erro?= =?utf-8?q?r_handling_of_PyCArrayType=5Fnew=28=29=2C_don=27t_decreases_the?= Message-ID: <3dNcsj2Fmsz7Mv1@mail.python.org> http://hg.python.org/cpython/rev/556bdd8d0dde changeset: 87247:556bdd8d0dde user: Victor Stinner date: Mon Nov 18 18:37:33 2013 +0100 summary: Issue #19437: Fix error handling of PyCArrayType_new(), don't decreases the reference counter of stgdict after result stole a reference to it files: Modules/_ctypes/_ctypes.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -1342,7 +1342,8 @@ if (-1 == PyDict_Update((PyObject *)stgdict, result->tp_dict)) goto error; Py_DECREF(result->tp_dict); - result->tp_dict = (PyObject *)stgdict; + result->tp_dict = (PyObject *)stgdict; /* steal the reference */ + stgdict = NULL; /* Special case for character arrays. A permanent annoyance: char arrays are also strings! -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 19:02:04 2013 From: python-checkins at python.org (charles-francois.natali) Date: Mon, 18 Nov 2013 19:02:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_selectors=3A_use_a_single_?= =?utf-8?q?return=2E?= Message-ID: <3dNdKS33g0z7PM0@mail.python.org> http://hg.python.org/cpython/rev/a3bfac8228a1 changeset: 87248:a3bfac8228a1 user: Charles-Fran?ois Natali date: Mon Nov 18 18:59:43 2013 +0100 summary: selectors: use a single return. files: Lib/selectors.py | 6 ++---- 1 files changed, 2 insertions(+), 4 deletions(-) diff --git a/Lib/selectors.py b/Lib/selectors.py --- a/Lib/selectors.py +++ b/Lib/selectors.py @@ -140,14 +140,12 @@ raise KeyError("{!r} is not registered".format(fileobj)) from None if events != key.events: self.unregister(fileobj) - return self.register(fileobj, events, data) + key = self.register(fileobj, events, data) elif data != key.data: # Use a shortcut to update the data. 
key = key._replace(data=data) self._fd_to_key[key.fd] = key - return key - else: - return key + return key @abstractmethod def select(self, timeout=None): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 21:19:07 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 21:19:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319581=3A_Change_t?= =?utf-8?q?he_overallocation_factor_of_=5FPyUnicodeWriter_on_Windows?= Message-ID: <3dNhMb6Svbz7N2K@mail.python.org> http://hg.python.org/cpython/rev/093b9838a41c changeset: 87249:093b9838a41c user: Victor Stinner date: Mon Nov 18 21:08:39 2013 +0100 summary: Issue #19581: Change the overallocation factor of _PyUnicodeWriter on Windows On Windows, a factor of 50% gives best performances. files: Objects/unicodeobject.c | 23 +++++++++++++++++------ 1 files changed, 17 insertions(+), 6 deletions(-) diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -13106,6 +13106,13 @@ _PyUnicodeWriter_PrepareInternal(_PyUnicodeWriter *writer, Py_ssize_t length, Py_UCS4 maxchar) { +#ifdef MS_WINDOWS + /* On Windows, overallocate by 50% is the best factor */ +# define OVERALLOCATE_FACTOR 2 +#else + /* On Linux, overallocate by 25% is the best factor */ +# define OVERALLOCATE_FACTOR 4 +#endif Py_ssize_t newlen; PyObject *newbuffer; @@ -13121,9 +13128,10 @@ if (writer->buffer == NULL) { assert(!writer->readonly); - if (writer->overallocate && newlen <= (PY_SSIZE_T_MAX - newlen / 4)) { - /* overallocate 25% to limit the number of resize */ - newlen += newlen / 4; + if (writer->overallocate + && newlen <= (PY_SSIZE_T_MAX - newlen / OVERALLOCATE_FACTOR)) { + /* overallocate to limit the number of realloc() */ + newlen += newlen / OVERALLOCATE_FACTOR; } if (newlen < writer->min_length) newlen = writer->min_length; @@ -13133,9 +13141,10 @@ return -1; } else if (newlen > writer->size) { - if (writer->overallocate && newlen <= (PY_SSIZE_T_MAX - newlen / 4)) { - /* overallocate 25% to limit the number of resize */ - newlen += newlen / 4; + if (writer->overallocate + && newlen <= (PY_SSIZE_T_MAX - newlen / OVERALLOCATE_FACTOR)) { + /* overallocate to limit the number of realloc() */ + newlen += newlen / OVERALLOCATE_FACTOR; } if (newlen < writer->min_length) newlen = writer->min_length; @@ -13169,6 +13178,8 @@ } _PyUnicodeWriter_Update(writer); return 0; + +#undef OVERALLOCATE_FACTOR } Py_LOCAL_INLINE(int) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 21:19:09 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 21:19:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319513=3A_repr=28l?= =?utf-8?q?ist=29_now_uses_the_PyUnicodeWriter_API=2C_it_is_faster_than?= Message-ID: <3dNhMd1Fvxz7Mtt@mail.python.org> http://hg.python.org/cpython/rev/fc7ceb001eec changeset: 87250:fc7ceb001eec user: Victor Stinner date: Mon Nov 18 21:11:57 2013 +0100 summary: Issue #19513: repr(list) now uses the PyUnicodeWriter API, it is faster than the PyAccu API files: Objects/listobject.c | 46 ++++++++++++++++++++----------- 1 files changed, 29 insertions(+), 17 deletions(-) diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -338,9 +338,9 @@ list_repr(PyListObject *v) { Py_ssize_t i; - PyObject *s = NULL; - _PyAccu acc; + PyObject *s; static PyObject *sep = NULL; + _PyUnicodeWriter 
writer; if (Py_SIZE(v) == 0) { return PyUnicode_FromString("[]"); @@ -357,38 +357,50 @@ return i > 0 ? PyUnicode_FromString("[...]") : NULL; } - if (_PyAccu_Init(&acc)) + _PyUnicodeWriter_Init(&writer); + writer.overallocate = 1; + if (Py_SIZE(v) > 1) { + /* "[" + "1" + ", 2" * (len - 1) + "]" */ + writer.min_length = 1 + 1 + (2 + 1) * (Py_SIZE(v) - 1) + 1; + } + else { + /* "[1]" */ + writer.min_length = 3; + } + + if (_PyUnicodeWriter_WriteChar(&writer, '[') < 0) goto error; - s = PyUnicode_FromString("["); - if (s == NULL || _PyAccu_Accumulate(&acc, s)) - goto error; - Py_CLEAR(s); - /* Do repr() on each element. Note that this may mutate the list, so must refetch the list size on each iteration. */ for (i = 0; i < Py_SIZE(v); ++i) { + if (i > 0) { + if (_PyUnicodeWriter_WriteStr(&writer, sep) < 0) + goto error; + } + if (Py_EnterRecursiveCall(" while getting the repr of a list")) goto error; s = PyObject_Repr(v->ob_item[i]); Py_LeaveRecursiveCall(); - if (i > 0 && _PyAccu_Accumulate(&acc, sep)) + if (s == NULL) goto error; - if (s == NULL || _PyAccu_Accumulate(&acc, s)) + + if (_PyUnicodeWriter_WriteStr(&writer, s) < 0) { + Py_DECREF(s); goto error; - Py_CLEAR(s); + } + Py_DECREF(s); } - s = PyUnicode_FromString("]"); - if (s == NULL || _PyAccu_Accumulate(&acc, s)) + + if (_PyUnicodeWriter_WriteChar(&writer, ']') < 0) goto error; - Py_CLEAR(s); Py_ReprLeave((PyObject *)v); - return _PyAccu_Finish(&acc); + return _PyUnicodeWriter_Finish(&writer); error: - _PyAccu_Destroy(&acc); - Py_XDECREF(s); + _PyUnicodeWriter_Dealloc(&writer); Py_ReprLeave((PyObject *)v); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 21:48:54 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 18 Nov 2013 21:48:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Mention_that_resolve=28=29_ca?= =?utf-8?q?nonicalizes_the_path_under_Windows?= Message-ID: <3dNj1y6GjKzPqp@mail.python.org> http://hg.python.org/peps/rev/97e81c1494aa changeset: 5286:97e81c1494aa user: Antoine Pitrou date: Mon Nov 18 21:48:50 2013 +0100 summary: Mention that resolve() canonicalizes the path under Windows files: pep-0428.txt | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -600,7 +600,8 @@ The ``resolve()`` method makes a path absolute, resolving any symlink on the way (like the POSIX realpath() call). It is the only operation which -will remove "``..``" path components. +will remove "``..``" path components. On Windows, this method will also +take care to return the canonical path (with the right casing). 
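To make the new sentence concrete, a small illustrative sketch (the paths are invented, not taken from the PEP):

from pathlib import Path

p = Path('Doc/../setup.py')
p.resolve()
# POSIX: e.g. PosixPath('/home/user/cpython/setup.py') -- absolute, with
# symlinks followed and the '..' component removed.
# Windows: additionally returns the canonical on-disk casing, e.g.
# WindowsPath('C:/Users/User/cpython/setup.py') even if the path was
# spelled 'c:/users/user/CPYTHON/setup.py'.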
Directory walking -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 18 21:49:45 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 18 Nov 2013 21:49:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?cGVwczogQWRkIGlzX3NvY2soKSwgaXNfZmlm?= =?utf-8?b?bygpLCBpc19ibG9ja19kZXZpY2UoKSwgaXNfY2hhcl9kZXZpY2UoKQ==?= Message-ID: <3dNj2x37lqzRF0@mail.python.org> http://hg.python.org/peps/rev/467f635c23c7 changeset: 5287:467f635c23c7 user: Antoine Pitrou date: Mon Nov 18 21:49:41 2013 +0100 summary: Add is_sock(), is_fifo(), is_block_device(), is_char_device() files: pep-0428.txt | 8 ++++++++ 1 files changed, 8 insertions(+), 0 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -584,6 +584,14 @@ False >>> p.is_symlink() False + >>> p.is_sock() + False + >>> p.is_fifo() + False + >>> p.is_block_device() + False + >>> p.is_char_device() + False The file owner and group names (rather than numeric ids) are queried through matching properties:: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 18 22:10:52 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 18 Nov 2013 22:10:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Cleanup_this_t?= =?utf-8?q?est=27s_modification_of_os=2Eenviron_in_teardown_=28this_alread?= =?utf-8?q?y?= Message-ID: <3dNjWJ2v9sz7Mtt@mail.python.org> http://hg.python.org/cpython/rev/615b5918b2ae changeset: 87251:615b5918b2ae branch: 3.3 parent: 87224:9791c5d55f52 user: Gregory P. Smith date: Mon Nov 18 21:10:04 2013 +0000 summary: Cleanup this test's modification of os.environ in teardown (this already exists in 3.4 but apparently wasn't done for 3.3). files: Lib/test/test_urllib2_localnet.py | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_urllib2_localnet.py b/Lib/test/test_urllib2_localnet.py --- a/Lib/test/test_urllib2_localnet.py +++ b/Lib/test/test_urllib2_localnet.py @@ -353,12 +353,15 @@ def setUp(self): super(TestUrlopen, self).setUp() # Ignore proxies for localhost tests. + self.old_environ = os.environ.copy() os.environ['NO_PROXY'] = '*' self.server = None def tearDown(self): if self.server is not None: self.server.stop() + os.environ.clear() + os.environ.update(self.old_environ) super(TestUrlopen, self).tearDown() def urlopen(self, url, data=None, **kwargs): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 22:10:53 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 18 Nov 2013 22:10:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_null_merge_=28already_in_3=2E4=29?= Message-ID: <3dNjWK5VFbz7N2K@mail.python.org> http://hg.python.org/cpython/rev/6507c6c03885 changeset: 87252:6507c6c03885 parent: 87250:fc7ceb001eec parent: 87251:615b5918b2ae user: Gregory P. 
Smith date: Mon Nov 18 21:10:29 2013 +0000 summary: null merge (already in 3.4) files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 22:16:32 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 18 Nov 2013 22:16:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319513=3A_Simplify?= =?utf-8?b?IGxpc3RfcmVwcigp?= Message-ID: <3dNjdr5mBlz7Ljc@mail.python.org> http://hg.python.org/cpython/rev/ead9043f69df changeset: 87253:ead9043f69df user: Victor Stinner date: Mon Nov 18 22:15:44 2013 +0100 summary: Issue #19513: Simplify list_repr() files: Objects/listobject.c | 10 ++-------- 1 files changed, 2 insertions(+), 8 deletions(-) diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -359,14 +359,8 @@ _PyUnicodeWriter_Init(&writer); writer.overallocate = 1; - if (Py_SIZE(v) > 1) { - /* "[" + "1" + ", 2" * (len - 1) + "]" */ - writer.min_length = 1 + 1 + (2 + 1) * (Py_SIZE(v) - 1) + 1; - } - else { - /* "[1]" */ - writer.min_length = 3; - } + /* "[" + "1" + ", 2" * (len - 1) + "]" */ + writer.min_length = 1 + 1 + (2 + 1) * (Py_SIZE(v) - 1) + 1; if (_PyUnicodeWriter_WriteChar(&writer, '[') < 0) goto error; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 18 23:49:06 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 18 Nov 2013 23:49:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Make_owner=28=29_and_group=28?= =?utf-8?q?=29_methods=2C_rather_than_properties=2E?= Message-ID: <3dNlhf3fkVzP38@mail.python.org> http://hg.python.org/peps/rev/bf0cb331281b changeset: 5288:bf0cb331281b user: Antoine Pitrou date: Mon Nov 18 23:49:02 2013 +0100 summary: Make owner() and group() methods, rather than properties. It didn't make sense to keep them as properties as all other metadata-querying APIs are now exposed under the form of methods (e.g. is_dir(), etc.), and there's no stat() caching anymore. files: pep-0428.txt | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -594,12 +594,12 @@ False The file owner and group names (rather than numeric ids) are queried -through matching properties:: +through corresponding methods:: >>> p = Path('/etc/shadow') - >>> p.owner + >>> p.owner() 'root' - >>> p.group + >>> p.group() 'shadow' -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 04:47:54 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 19 Nov 2013 04:47:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTk2?= =?utf-8?q?=3A_Set_untestable_tests_in_test=5Fimportlib_to_None?= Message-ID: <3dNtKQ2sCxz7PN7@mail.python.org> http://hg.python.org/cpython/rev/1ac4f0645519 changeset: 87254:1ac4f0645519 branch: 3.3 parent: 87251:615b5918b2ae user: Zachary Ware date: Mon Nov 18 21:44:38 2013 -0600 summary: Issue #19596: Set untestable tests in test_importlib to None to avoid reporting success on empty tests. 
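The pattern applied throughout the diff below, in minimal generic form (the class and test names here are invented for illustration):

import unittest

class LoaderTests:                      # shared base holding the common tests
    def test_package(self):
        ...                             # meaningful for other subclasses

class BuiltinLoaderTests(LoaderTests, unittest.TestCase):
    # Before: an empty override ("def test_package(self): pass") still ran
    # and was reported as a pass. After: masking the attribute with None
    # makes unittest's loader skip it, since only callable test_* attributes
    # are collected.
    test_package = None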
files: Lib/test/test_importlib/builtin/test_finder.py | 18 ++------- Lib/test/test_importlib/builtin/test_loader.py | 14 ++----- Lib/test/test_importlib/extension/test_finder.py | 19 +++------ Lib/test/test_importlib/extension/test_loader.py | 15 +++---- Lib/test/test_importlib/frozen/test_finder.py | 10 ++--- Lib/test/test_importlib/frozen/test_loader.py | 5 +- Misc/NEWS | 3 + 7 files changed, 29 insertions(+), 55 deletions(-) diff --git a/Lib/test/test_importlib/builtin/test_finder.py b/Lib/test/test_importlib/builtin/test_finder.py --- a/Lib/test/test_importlib/builtin/test_finder.py +++ b/Lib/test/test_importlib/builtin/test_finder.py @@ -16,21 +16,11 @@ found = machinery.BuiltinImporter.find_module(builtin_util.NAME) self.assertTrue(found) - def test_package(self): - # Built-in modules cannot be a package. - pass + # Built-in modules cannot be a package. + test_package = test_package_in_package = test_package_over_module = None - def test_module_in_package(self): - # Built-in modules cannobt be in a package. - pass - - def test_package_in_package(self): - # Built-in modules cannot be a package. - pass - - def test_package_over_module(self): - # Built-in modules cannot be a package. - pass + # Built-in modules cannot be in a package. + test_module_in_package = None def test_failure(self): assert 'importlib' not in sys.builtin_module_names diff --git a/Lib/test/test_importlib/builtin/test_loader.py b/Lib/test/test_importlib/builtin/test_loader.py --- a/Lib/test/test_importlib/builtin/test_loader.py +++ b/Lib/test/test_importlib/builtin/test_loader.py @@ -32,17 +32,11 @@ module = self.load_module(builtin_util.NAME) self.verify(module) - def test_package(self): - # Built-in modules cannot be a package. - pass + # Built-in modules cannot be a package. + test_package = test_lacking_parent = None - def test_lacking_parent(self): - # Built-in modules cannot be a package. - pass - - def test_state_after_failure(self): - # Not way to force an imoprt failure. - pass + # No way to force an import failure. + test_state_after_failure = None def test_module_reuse(self): # Test that the same module is used in a reload. diff --git a/Lib/test/test_importlib/extension/test_finder.py b/Lib/test/test_importlib/extension/test_finder.py --- a/Lib/test/test_importlib/extension/test_finder.py +++ b/Lib/test/test_importlib/extension/test_finder.py @@ -17,21 +17,14 @@ def test_module(self): self.assertTrue(self.find_module(util.NAME)) - def test_package(self): - # No extension module as an __init__ available for testing. - pass + # No extension module as an __init__ available for testing. + test_package = test_package_in_package = None - def test_module_in_package(self): - # No extension module in a package available for testing. - pass + # No extension module in a package available for testing. + test_module_in_package = None - def test_package_in_package(self): - # No extension module as an __init__ available for testing. - pass - - def test_package_over_module(self): - # Extension modules cannot be an __init__ for a package. - pass + # Extension modules cannot be an __init__ for a package. 
+ test_package_over_module = None def test_failure(self): self.assertIsNone(self.find_module('asdfjkl;')) diff --git a/Lib/test/test_importlib/extension/test_loader.py b/Lib/test/test_importlib/extension/test_loader.py --- a/Lib/test/test_importlib/extension/test_loader.py +++ b/Lib/test/test_importlib/extension/test_loader.py @@ -38,13 +38,11 @@ self.assertIsInstance(module.__loader__, machinery.ExtensionFileLoader) - def test_package(self): - # No extension module as __init__ available for testing. - pass + # No extension module as __init__ available for testing. + test_package = None - def test_lacking_parent(self): - # No extension module in a package available for testing. - pass + # No extension module in a package available for testing. + test_lacking_parent = None def test_module_reuse(self): with util.uncache(ext_util.NAME): @@ -52,9 +50,8 @@ module2 = self.load_module(ext_util.NAME) self.assertIs(module1, module2) - def test_state_after_failure(self): - # No easy way to trigger a failure after a successful import. - pass + # No easy way to trigger a failure after a successful import. + test_state_after_failure = None def test_unloadable(self): name = 'asdfjkl;' diff --git a/Lib/test/test_importlib/frozen/test_finder.py b/Lib/test/test_importlib/frozen/test_finder.py --- a/Lib/test/test_importlib/frozen/test_finder.py +++ b/Lib/test/test_importlib/frozen/test_finder.py @@ -25,13 +25,11 @@ loader = self.find('__phello__.spam', ['__phello__']) self.assertTrue(hasattr(loader, 'load_module')) - def test_package_in_package(self): - # No frozen package within another package to test with. - pass + # No frozen package within another package to test with. + test_package_in_package = None - def test_package_over_module(self): - # No easy way to test. - pass + # No easy way to test. + test_package_over_module = None def test_failure(self): loader = self.find('') diff --git a/Lib/test/test_importlib/frozen/test_loader.py b/Lib/test/test_importlib/frozen/test_loader.py --- a/Lib/test/test_importlib/frozen/test_loader.py +++ b/Lib/test/test_importlib/frozen/test_loader.py @@ -65,9 +65,8 @@ self.assertEqual(repr(module), "") - def test_state_after_failure(self): - # No way to trigger an error in a frozen module. - pass + # No way to trigger an error in a frozen module. + test_state_after_failure = None def test_unloadable(self): assert machinery.FrozenImporter.find_module('_not_real') is None diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -51,6 +51,9 @@ Tests ----- +- Issue #19596: Set untestable tests in test_importlib to None to avoid + reporting success on empty tests. + - Issue #19440: Clean up test_capi by removing an unnecessary __future__ import, converting from test_main to unittest.main, and running the _testcapi module tests within a unittest TestCase. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 04:47:55 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 19 Nov 2013 04:47:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319596=3A_Null_merge_with_3=2E3?= Message-ID: <3dNtKR4kBgz7QXJ@mail.python.org> http://hg.python.org/cpython/rev/34a65109d191 changeset: 87255:34a65109d191 parent: 87253:ead9043f69df parent: 87254:1ac4f0645519 user: Zachary Ware date: Mon Nov 18 21:47:35 2013 -0600 summary: Issue #19596: Null merge with 3.3 This will be merged into default when PEP451 is merged in. 
See changeset 5d38989191bb in features/pep-451 files: -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Tue Nov 19 07:37:41 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 19 Nov 2013 07:37:41 +0100 Subject: [Python-checkins] Daily reference leaks (ead9043f69df): sum=8 Message-ID: results for ead9043f69df on branch "default" -------------------------------------------- test_asyncio leaked [4, 0, 0] memory blocks, sum=4 test_site leaked [2, -2, 2] references, sum=2 test_site leaked [2, -2, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogsAqdJf', '-x'] From python-checkins at python.org Tue Nov 19 08:55:37 2013 From: python-checkins at python.org (matthias.klose) Date: Tue, 19 Nov 2013 08:55:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_-_Update_confi?= =?utf-8?q?g=2E=7Bguess=2Csub=7D_for_new_ports=2E?= Message-ID: <3dNzqF0sLsz7S1B@mail.python.org> http://hg.python.org/cpython/rev/d47349232171 changeset: 87256:d47349232171 branch: 2.7 parent: 87221:38fcb9a63398 user: doko at ubuntu.com date: Tue Nov 19 08:54:06 2013 +0100 summary: - Update config.{guess,sub} for new ports. files: config.guess | 151 ++++++++++++++++++++++---------------- config.sub | 41 +++++----- 2 files changed, 108 insertions(+), 84 deletions(-) diff --git a/config.guess b/config.guess --- a/config.guess +++ b/config.guess @@ -1,10 +1,8 @@ #! /bin/sh # Attempt to guess a canonical system name. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. -timestamp='2012-12-29' +timestamp='2013-06-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -26,7 +24,7 @@ # program. This Exception is an additional permission under section 7 # of the GNU General Public License, version 3 ("GPLv3"). # -# Originally written by Per Bothner. +# Originally written by Per Bothner. # # You can get the latest version of this script from: # http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD @@ -52,9 +50,7 @@ GNU config.guess ($timestamp) Originally written by Per Bothner. -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -136,6 +132,27 @@ UNAME_SYSTEM=`(uname -s) 2>/dev/null` || UNAME_SYSTEM=unknown UNAME_VERSION=`(uname -v) 2>/dev/null` || UNAME_VERSION=unknown +case "${UNAME_SYSTEM}" in +Linux|GNU|GNU/*) + # If the system lacks a compiler, then just pick glibc. + # We could probably try harder. + LIBC=gnu + + eval $set_cc_for_build + cat <<-EOF > $dummy.c + #include + #if defined(__UCLIBC__) + LIBC=uclibc + #elif defined(__dietlibc__) + LIBC=dietlibc + #else + LIBC=gnu + #endif + EOF + eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` + ;; +esac + # Note: order is significant - the case branches are not exclusive. 
case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in @@ -857,21 +874,21 @@ exit ;; *:GNU:*:*) # the GNU system - echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-gnu`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` + echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-${LIBC}`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` exit ;; *:GNU/*:*:*) # other systems with GNU libc and userland - echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-gnu + echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-${LIBC} exit ;; i*86:Minix:*:*) echo ${UNAME_MACHINE}-pc-minix exit ;; aarch64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; aarch64_be:Linux:*:*) UNAME_MACHINE=aarch64_be - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; alpha:Linux:*:*) case `sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' < /proc/cpuinfo` in @@ -884,59 +901,54 @@ EV68*) UNAME_MACHINE=alphaev68 ;; esac objdump --private-headers /bin/sh | grep -q ld.so.1 - if test "$?" = 0 ; then LIBC="libc1" ; else LIBC="" ; fi - echo ${UNAME_MACHINE}-unknown-linux-gnu${LIBC} + if test "$?" = 0 ; then LIBC="gnulibc1" ; fi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + arc:Linux:*:* | arceb:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; arm*:Linux:*:*) eval $set_cc_for_build if echo __ARM_EABI__ | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_EABI__ then - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} else if echo __ARM_PCS_VFP | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_PCS_VFP then - echo ${UNAME_MACHINE}-unknown-linux-gnueabi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabi else - echo ${UNAME_MACHINE}-unknown-linux-gnueabihf + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabihf fi fi exit ;; avr32*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; cris:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; crisv32:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; frv:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; hexagon:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:Linux:*:*) - LIBC=gnu - eval $set_cc_for_build - sed 's/^ //' << EOF >$dummy.c - #ifdef __dietlibc__ - LIBC=dietlibc - #endif -EOF - eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` - echo "${UNAME_MACHINE}-pc-linux-${LIBC}" + echo ${UNAME_MACHINE}-pc-linux-${LIBC} exit ;; ia64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m32r*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m68*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; mips:Linux:*:* | mips64:Linux:*:*) eval $set_cc_for_build @@ -955,54 +967,63 @@ #endif EOF eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^CPU'` - test x"${CPU}" != x && { echo "${CPU}-unknown-linux-gnu"; exit; } + test x"${CPU}" != x && { echo "${CPU}-unknown-linux-${LIBC}"; exit; } ;; + or1k:Linux:*:*) + echo 
${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; or32:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; padre:Linux:*:*) - echo sparc-unknown-linux-gnu + echo sparc-unknown-linux-${LIBC} exit ;; parisc64:Linux:*:* | hppa64:Linux:*:*) - echo hppa64-unknown-linux-gnu + echo hppa64-unknown-linux-${LIBC} exit ;; parisc:Linux:*:* | hppa:Linux:*:*) # Look for CPU level case `grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2` in - PA7*) echo hppa1.1-unknown-linux-gnu ;; - PA8*) echo hppa2.0-unknown-linux-gnu ;; - *) echo hppa-unknown-linux-gnu ;; + PA7*) echo hppa1.1-unknown-linux-${LIBC} ;; + PA8*) echo hppa2.0-unknown-linux-${LIBC} ;; + *) echo hppa-unknown-linux-${LIBC} ;; esac exit ;; ppc64:Linux:*:*) - echo powerpc64-unknown-linux-gnu + echo powerpc64-unknown-linux-${LIBC} exit ;; ppc:Linux:*:*) - echo powerpc-unknown-linux-gnu + echo powerpc-unknown-linux-${LIBC} + exit ;; + ppc64le:Linux:*:*) + echo powerpc64le-unknown-linux-${LIBC} + exit ;; + ppcle:Linux:*:*) + echo powerpcle-unknown-linux-${LIBC} exit ;; s390:Linux:*:* | s390x:Linux:*:*) - echo ${UNAME_MACHINE}-ibm-linux + echo ${UNAME_MACHINE}-ibm-linux-${LIBC} exit ;; sh64*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sh*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sparc:Linux:*:* | sparc64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; tile*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; vax:Linux:*:*) - echo ${UNAME_MACHINE}-dec-linux-gnu + echo ${UNAME_MACHINE}-dec-linux-${LIBC} exit ;; x86_64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; xtensa*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:DYNIX/ptx:4*:*) # ptx 4.0 does uname -s correctly, with DYNIX/ptx in there. @@ -1235,19 +1256,21 @@ exit ;; *:Darwin:*:*) UNAME_PROCESSOR=`uname -p` || UNAME_PROCESSOR=unknown - case $UNAME_PROCESSOR in - i386) - eval $set_cc_for_build - if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then - if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ - (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ - grep IS_64BIT_ARCH >/dev/null - then - UNAME_PROCESSOR="x86_64" - fi - fi ;; - unknown) UNAME_PROCESSOR=powerpc ;; - esac + eval $set_cc_for_build + if test "$UNAME_PROCESSOR" = unknown ; then + UNAME_PROCESSOR=powerpc + fi + if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then + if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ + (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ + grep IS_64BIT_ARCH >/dev/null + then + case $UNAME_PROCESSOR in + i386) UNAME_PROCESSOR=x86_64 ;; + powerpc) UNAME_PROCESSOR=powerpc64 ;; + esac + fi + fi echo ${UNAME_PROCESSOR}-apple-darwin${UNAME_RELEASE} exit ;; *:procnto*:*:* | *:QNX:[0123456789]*:*) diff --git a/config.sub b/config.sub --- a/config.sub +++ b/config.sub @@ -1,10 +1,8 @@ #! /bin/sh # Configuration validation subroutine script. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. 
-timestamp='2012-12-29' +timestamp='2013-08-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -70,9 +68,7 @@ version="\ GNU config.sub ($timestamp) -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -256,12 +252,12 @@ | alpha | alphaev[4-8] | alphaev56 | alphaev6[78] | alphapca5[67] \ | alpha64 | alpha64ev[4-8] | alpha64ev56 | alpha64ev6[78] | alpha64pca5[67] \ | am33_2.0 \ - | arc \ + | arc | arceb \ | arm | arm[bl]e | arme[lb] | armv[2-8] | armv[3-8][lb] | armv7[arm] \ | avr | avr32 \ | be32 | be64 \ | bfin \ - | c4x | clipper \ + | c4x | c8051 | clipper \ | d10v | d30v | dlx | dsp16xx \ | epiphany \ | fido | fr30 | frv \ @@ -290,16 +286,17 @@ | mipsisa64r2 | mipsisa64r2el \ | mipsisa64sb1 | mipsisa64sb1el \ | mipsisa64sr71k | mipsisa64sr71kel \ + | mipsr5900 | mipsr5900el \ | mipstx39 | mipstx39el \ | mn10200 | mn10300 \ | moxie \ | mt \ | msp430 \ | nds32 | nds32le | nds32be \ - | nios | nios2 \ + | nios | nios2 | nios2eb | nios2el \ | ns16k | ns32k \ | open8 \ - | or32 \ + | or1k | or32 \ | pdp10 | pdp11 | pj | pjl \ | powerpc | powerpc64 | powerpc64le | powerpcle \ | pyramid \ @@ -369,13 +366,13 @@ | aarch64-* | aarch64_be-* \ | alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \ | alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \ - | alphapca5[67]-* | alpha64pca5[67]-* | arc-* \ + | alphapca5[67]-* | alpha64pca5[67]-* | arc-* | arceb-* \ | arm-* | armbe-* | armle-* | armeb-* | armv*-* \ | avr-* | avr32-* \ | be32-* | be64-* \ | bfin-* | bs2000-* \ | c[123]* | c30-* | [cjt]90-* | c4x-* \ - | clipper-* | craynv-* | cydra-* \ + | c8051-* | clipper-* | craynv-* | cydra-* \ | d10v-* | d30v-* | dlx-* \ | elxsi-* \ | f30[01]-* | f700-* | fido-* | fr30-* | frv-* | fx80-* \ @@ -407,12 +404,13 @@ | mipsisa64r2-* | mipsisa64r2el-* \ | mipsisa64sb1-* | mipsisa64sb1el-* \ | mipsisa64sr71k-* | mipsisa64sr71kel-* \ + | mipsr5900-* | mipsr5900el-* \ | mipstx39-* | mipstx39el-* \ | mmix-* \ | mt-* \ | msp430-* \ | nds32-* | nds32le-* | nds32be-* \ - | nios-* | nios2-* \ + | nios-* | nios2-* | nios2eb-* | nios2el-* \ | none-* | np1-* | ns16k-* | ns32k-* \ | open8-* \ | orion-* \ @@ -796,7 +794,7 @@ os=-mingw64 ;; mingw32) - basic_machine=i386-pc + basic_machine=i686-pc os=-mingw32 ;; mingw32ce) @@ -832,7 +830,7 @@ basic_machine=`echo $basic_machine | sed -e 's/ms1-/mt-/'` ;; msys) - basic_machine=i386-pc + basic_machine=i686-pc os=-msys ;; mvs) @@ -1354,7 +1352,7 @@ -gnu* | -bsd* | -mach* | -minix* | -genix* | -ultrix* | -irix* \ | -*vms* | -sco* | -esix* | -isc* | -aix* | -cnk* | -sunos | -sunos[34]*\ | -hpux* | -unos* | -osf* | -luna* | -dgux* | -auroraux* | -solaris* \ - | -sym* | -kopensolaris* \ + | -sym* | -kopensolaris* | -plan9* \ | -amigaos* | -amigados* | -msdos* | -newsos* | -unicos* | -aof* \ | -aos* | -aros* \ | -nindy* | -vxsim* | -vxworks* | -ebmon* | -hms* | -mvs* \ @@ -1500,9 +1498,6 @@ -aros*) os=-aros ;; - -kaos*) - os=-kaos - ;; -zvmoe) os=-zvmoe ;; @@ -1551,6 +1546,9 @@ c4x-* | tic4x-*) os=-coff ;; + c8051-*) + os=-elf + ;; hexagon-*) os=-elf ;; @@ -1594,6 +1592,9 @@ mips*-*) os=-elf ;; + or1k-*) + os=-elf + ;; or32-*) os=-coff ;; -- 
Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 08:55:38 2013 From: python-checkins at python.org (matthias.klose) Date: Tue, 19 Nov 2013 08:55:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_-_Update_confi?= =?utf-8?q?g=2E=7Bguess=2Csub=7D_for_new_ports=2E?= Message-ID: <3dNzqG4tssz7bG7@mail.python.org> http://hg.python.org/cpython/rev/5adb2fca72c2 changeset: 87257:5adb2fca72c2 branch: 3.3 parent: 87254:1ac4f0645519 user: doko at ubuntu.com date: Tue Nov 19 08:54:38 2013 +0100 summary: - Update config.{guess,sub} for new ports. files: config.guess | 151 ++++++++++++++++++++++---------------- config.sub | 41 +++++----- 2 files changed, 108 insertions(+), 84 deletions(-) diff --git a/config.guess b/config.guess --- a/config.guess +++ b/config.guess @@ -1,10 +1,8 @@ #! /bin/sh # Attempt to guess a canonical system name. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. -timestamp='2012-12-29' +timestamp='2013-06-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -26,7 +24,7 @@ # program. This Exception is an additional permission under section 7 # of the GNU General Public License, version 3 ("GPLv3"). # -# Originally written by Per Bothner. +# Originally written by Per Bothner. # # You can get the latest version of this script from: # http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD @@ -52,9 +50,7 @@ GNU config.guess ($timestamp) Originally written by Per Bothner. -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -136,6 +132,27 @@ UNAME_SYSTEM=`(uname -s) 2>/dev/null` || UNAME_SYSTEM=unknown UNAME_VERSION=`(uname -v) 2>/dev/null` || UNAME_VERSION=unknown +case "${UNAME_SYSTEM}" in +Linux|GNU|GNU/*) + # If the system lacks a compiler, then just pick glibc. + # We could probably try harder. + LIBC=gnu + + eval $set_cc_for_build + cat <<-EOF > $dummy.c + #include + #if defined(__UCLIBC__) + LIBC=uclibc + #elif defined(__dietlibc__) + LIBC=dietlibc + #else + LIBC=gnu + #endif + EOF + eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` + ;; +esac + # Note: order is significant - the case branches are not exclusive. 
case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in @@ -857,21 +874,21 @@ exit ;; *:GNU:*:*) # the GNU system - echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-gnu`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` + echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-${LIBC}`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` exit ;; *:GNU/*:*:*) # other systems with GNU libc and userland - echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-gnu + echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-${LIBC} exit ;; i*86:Minix:*:*) echo ${UNAME_MACHINE}-pc-minix exit ;; aarch64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; aarch64_be:Linux:*:*) UNAME_MACHINE=aarch64_be - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; alpha:Linux:*:*) case `sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' < /proc/cpuinfo` in @@ -884,59 +901,54 @@ EV68*) UNAME_MACHINE=alphaev68 ;; esac objdump --private-headers /bin/sh | grep -q ld.so.1 - if test "$?" = 0 ; then LIBC="libc1" ; else LIBC="" ; fi - echo ${UNAME_MACHINE}-unknown-linux-gnu${LIBC} + if test "$?" = 0 ; then LIBC="gnulibc1" ; fi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + arc:Linux:*:* | arceb:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; arm*:Linux:*:*) eval $set_cc_for_build if echo __ARM_EABI__ | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_EABI__ then - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} else if echo __ARM_PCS_VFP | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_PCS_VFP then - echo ${UNAME_MACHINE}-unknown-linux-gnueabi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabi else - echo ${UNAME_MACHINE}-unknown-linux-gnueabihf + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabihf fi fi exit ;; avr32*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; cris:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; crisv32:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; frv:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; hexagon:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:Linux:*:*) - LIBC=gnu - eval $set_cc_for_build - sed 's/^ //' << EOF >$dummy.c - #ifdef __dietlibc__ - LIBC=dietlibc - #endif -EOF - eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` - echo "${UNAME_MACHINE}-pc-linux-${LIBC}" + echo ${UNAME_MACHINE}-pc-linux-${LIBC} exit ;; ia64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m32r*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m68*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; mips:Linux:*:* | mips64:Linux:*:*) eval $set_cc_for_build @@ -955,54 +967,63 @@ #endif EOF eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^CPU'` - test x"${CPU}" != x && { echo "${CPU}-unknown-linux-gnu"; exit; } + test x"${CPU}" != x && { echo "${CPU}-unknown-linux-${LIBC}"; exit; } ;; + or1k:Linux:*:*) + echo 
${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; or32:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; padre:Linux:*:*) - echo sparc-unknown-linux-gnu + echo sparc-unknown-linux-${LIBC} exit ;; parisc64:Linux:*:* | hppa64:Linux:*:*) - echo hppa64-unknown-linux-gnu + echo hppa64-unknown-linux-${LIBC} exit ;; parisc:Linux:*:* | hppa:Linux:*:*) # Look for CPU level case `grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2` in - PA7*) echo hppa1.1-unknown-linux-gnu ;; - PA8*) echo hppa2.0-unknown-linux-gnu ;; - *) echo hppa-unknown-linux-gnu ;; + PA7*) echo hppa1.1-unknown-linux-${LIBC} ;; + PA8*) echo hppa2.0-unknown-linux-${LIBC} ;; + *) echo hppa-unknown-linux-${LIBC} ;; esac exit ;; ppc64:Linux:*:*) - echo powerpc64-unknown-linux-gnu + echo powerpc64-unknown-linux-${LIBC} exit ;; ppc:Linux:*:*) - echo powerpc-unknown-linux-gnu + echo powerpc-unknown-linux-${LIBC} + exit ;; + ppc64le:Linux:*:*) + echo powerpc64le-unknown-linux-${LIBC} + exit ;; + ppcle:Linux:*:*) + echo powerpcle-unknown-linux-${LIBC} exit ;; s390:Linux:*:* | s390x:Linux:*:*) - echo ${UNAME_MACHINE}-ibm-linux + echo ${UNAME_MACHINE}-ibm-linux-${LIBC} exit ;; sh64*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sh*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sparc:Linux:*:* | sparc64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; tile*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; vax:Linux:*:*) - echo ${UNAME_MACHINE}-dec-linux-gnu + echo ${UNAME_MACHINE}-dec-linux-${LIBC} exit ;; x86_64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; xtensa*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:DYNIX/ptx:4*:*) # ptx 4.0 does uname -s correctly, with DYNIX/ptx in there. @@ -1235,19 +1256,21 @@ exit ;; *:Darwin:*:*) UNAME_PROCESSOR=`uname -p` || UNAME_PROCESSOR=unknown - case $UNAME_PROCESSOR in - i386) - eval $set_cc_for_build - if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then - if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ - (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ - grep IS_64BIT_ARCH >/dev/null - then - UNAME_PROCESSOR="x86_64" - fi - fi ;; - unknown) UNAME_PROCESSOR=powerpc ;; - esac + eval $set_cc_for_build + if test "$UNAME_PROCESSOR" = unknown ; then + UNAME_PROCESSOR=powerpc + fi + if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then + if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ + (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ + grep IS_64BIT_ARCH >/dev/null + then + case $UNAME_PROCESSOR in + i386) UNAME_PROCESSOR=x86_64 ;; + powerpc) UNAME_PROCESSOR=powerpc64 ;; + esac + fi + fi echo ${UNAME_PROCESSOR}-apple-darwin${UNAME_RELEASE} exit ;; *:procnto*:*:* | *:QNX:[0123456789]*:*) diff --git a/config.sub b/config.sub --- a/config.sub +++ b/config.sub @@ -1,10 +1,8 @@ #! /bin/sh # Configuration validation subroutine script. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. 
-timestamp='2012-12-29' +timestamp='2013-08-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -70,9 +68,7 @@ version="\ GNU config.sub ($timestamp) -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -256,12 +252,12 @@ | alpha | alphaev[4-8] | alphaev56 | alphaev6[78] | alphapca5[67] \ | alpha64 | alpha64ev[4-8] | alpha64ev56 | alpha64ev6[78] | alpha64pca5[67] \ | am33_2.0 \ - | arc \ + | arc | arceb \ | arm | arm[bl]e | arme[lb] | armv[2-8] | armv[3-8][lb] | armv7[arm] \ | avr | avr32 \ | be32 | be64 \ | bfin \ - | c4x | clipper \ + | c4x | c8051 | clipper \ | d10v | d30v | dlx | dsp16xx \ | epiphany \ | fido | fr30 | frv \ @@ -290,16 +286,17 @@ | mipsisa64r2 | mipsisa64r2el \ | mipsisa64sb1 | mipsisa64sb1el \ | mipsisa64sr71k | mipsisa64sr71kel \ + | mipsr5900 | mipsr5900el \ | mipstx39 | mipstx39el \ | mn10200 | mn10300 \ | moxie \ | mt \ | msp430 \ | nds32 | nds32le | nds32be \ - | nios | nios2 \ + | nios | nios2 | nios2eb | nios2el \ | ns16k | ns32k \ | open8 \ - | or32 \ + | or1k | or32 \ | pdp10 | pdp11 | pj | pjl \ | powerpc | powerpc64 | powerpc64le | powerpcle \ | pyramid \ @@ -369,13 +366,13 @@ | aarch64-* | aarch64_be-* \ | alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \ | alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \ - | alphapca5[67]-* | alpha64pca5[67]-* | arc-* \ + | alphapca5[67]-* | alpha64pca5[67]-* | arc-* | arceb-* \ | arm-* | armbe-* | armle-* | armeb-* | armv*-* \ | avr-* | avr32-* \ | be32-* | be64-* \ | bfin-* | bs2000-* \ | c[123]* | c30-* | [cjt]90-* | c4x-* \ - | clipper-* | craynv-* | cydra-* \ + | c8051-* | clipper-* | craynv-* | cydra-* \ | d10v-* | d30v-* | dlx-* \ | elxsi-* \ | f30[01]-* | f700-* | fido-* | fr30-* | frv-* | fx80-* \ @@ -407,12 +404,13 @@ | mipsisa64r2-* | mipsisa64r2el-* \ | mipsisa64sb1-* | mipsisa64sb1el-* \ | mipsisa64sr71k-* | mipsisa64sr71kel-* \ + | mipsr5900-* | mipsr5900el-* \ | mipstx39-* | mipstx39el-* \ | mmix-* \ | mt-* \ | msp430-* \ | nds32-* | nds32le-* | nds32be-* \ - | nios-* | nios2-* \ + | nios-* | nios2-* | nios2eb-* | nios2el-* \ | none-* | np1-* | ns16k-* | ns32k-* \ | open8-* \ | orion-* \ @@ -796,7 +794,7 @@ os=-mingw64 ;; mingw32) - basic_machine=i386-pc + basic_machine=i686-pc os=-mingw32 ;; mingw32ce) @@ -832,7 +830,7 @@ basic_machine=`echo $basic_machine | sed -e 's/ms1-/mt-/'` ;; msys) - basic_machine=i386-pc + basic_machine=i686-pc os=-msys ;; mvs) @@ -1354,7 +1352,7 @@ -gnu* | -bsd* | -mach* | -minix* | -genix* | -ultrix* | -irix* \ | -*vms* | -sco* | -esix* | -isc* | -aix* | -cnk* | -sunos | -sunos[34]*\ | -hpux* | -unos* | -osf* | -luna* | -dgux* | -auroraux* | -solaris* \ - | -sym* | -kopensolaris* \ + | -sym* | -kopensolaris* | -plan9* \ | -amigaos* | -amigados* | -msdos* | -newsos* | -unicos* | -aof* \ | -aos* | -aros* \ | -nindy* | -vxsim* | -vxworks* | -ebmon* | -hms* | -mvs* \ @@ -1500,9 +1498,6 @@ -aros*) os=-aros ;; - -kaos*) - os=-kaos - ;; -zvmoe) os=-zvmoe ;; @@ -1551,6 +1546,9 @@ c4x-* | tic4x-*) os=-coff ;; + c8051-*) + os=-elf + ;; hexagon-*) os=-elf ;; @@ -1594,6 +1592,9 @@ mips*-*) os=-elf ;; + or1k-*) + os=-elf + ;; or32-*) os=-coff ;; -- 
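For reference only (this note and the snippet below are not part of the patch): the easiest way to see what the refreshed config.{guess,sub} change is to canonicalise a few triplets with them. A minimal sketch, run from the top of a CPython checkout containing these script versions, assuming a POSIX sh on PATH; the printed triplets are illustrative and depend on the build host.

    import subprocess

    def canon(*args):
        # Run one of the autoconf helper scripts and return the triplet it prints.
        return subprocess.check_output(["sh"] + list(args)).decode().strip()

    print(canon("config.guess"))            # e.g. x86_64-unknown-linux-gnu, or ...-linux-uclibc on a uClibc system
    print(canon("config.sub", "mingw32"))   # now canonicalises to i686-pc-mingw32 (previously i386-pc-mingw32)
    print(canon("config.sub", "or1k"))      # or1k-unknown-elf, one of the newly recognised ports
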
Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 08:55:40 2013 From: python-checkins at python.org (matthias.klose) Date: Tue, 19 Nov 2013 08:55:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_-_Update_config=2E=7Bguess=2Csub=7D_for_new_ports=2E?= Message-ID: <3dNzqJ1qPKz7bbC@mail.python.org> http://hg.python.org/cpython/rev/62acba59660f changeset: 87258:62acba59660f parent: 87255:34a65109d191 parent: 87257:5adb2fca72c2 user: doko at ubuntu.com date: Tue Nov 19 08:55:06 2013 +0100 summary: - Update config.{guess,sub} for new ports. files: config.guess | 151 ++++++++++++++++++++++---------------- config.sub | 41 +++++----- 2 files changed, 108 insertions(+), 84 deletions(-) diff --git a/config.guess b/config.guess --- a/config.guess +++ b/config.guess @@ -1,10 +1,8 @@ #! /bin/sh # Attempt to guess a canonical system name. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. -timestamp='2012-12-29' +timestamp='2013-06-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -26,7 +24,7 @@ # program. This Exception is an additional permission under section 7 # of the GNU General Public License, version 3 ("GPLv3"). # -# Originally written by Per Bothner. +# Originally written by Per Bothner. # # You can get the latest version of this script from: # http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD @@ -52,9 +50,7 @@ GNU config.guess ($timestamp) Originally written by Per Bothner. -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -136,6 +132,27 @@ UNAME_SYSTEM=`(uname -s) 2>/dev/null` || UNAME_SYSTEM=unknown UNAME_VERSION=`(uname -v) 2>/dev/null` || UNAME_VERSION=unknown +case "${UNAME_SYSTEM}" in +Linux|GNU|GNU/*) + # If the system lacks a compiler, then just pick glibc. + # We could probably try harder. + LIBC=gnu + + eval $set_cc_for_build + cat <<-EOF > $dummy.c + #include + #if defined(__UCLIBC__) + LIBC=uclibc + #elif defined(__dietlibc__) + LIBC=dietlibc + #else + LIBC=gnu + #endif + EOF + eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` + ;; +esac + # Note: order is significant - the case branches are not exclusive. 
case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in @@ -857,21 +874,21 @@ exit ;; *:GNU:*:*) # the GNU system - echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-gnu`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` + echo `echo ${UNAME_MACHINE}|sed -e 's,[-/].*$,,'`-unknown-${LIBC}`echo ${UNAME_RELEASE}|sed -e 's,/.*$,,'` exit ;; *:GNU/*:*:*) # other systems with GNU libc and userland - echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-gnu + echo ${UNAME_MACHINE}-unknown-`echo ${UNAME_SYSTEM} | sed 's,^[^/]*/,,' | tr '[A-Z]' '[a-z]'``echo ${UNAME_RELEASE}|sed -e 's/[-(].*//'`-${LIBC} exit ;; i*86:Minix:*:*) echo ${UNAME_MACHINE}-pc-minix exit ;; aarch64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; aarch64_be:Linux:*:*) UNAME_MACHINE=aarch64_be - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; alpha:Linux:*:*) case `sed -n '/^cpu model/s/^.*: \(.*\)/\1/p' < /proc/cpuinfo` in @@ -884,59 +901,54 @@ EV68*) UNAME_MACHINE=alphaev68 ;; esac objdump --private-headers /bin/sh | grep -q ld.so.1 - if test "$?" = 0 ; then LIBC="libc1" ; else LIBC="" ; fi - echo ${UNAME_MACHINE}-unknown-linux-gnu${LIBC} + if test "$?" = 0 ; then LIBC="gnulibc1" ; fi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; + arc:Linux:*:* | arceb:Linux:*:*) + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; arm*:Linux:*:*) eval $set_cc_for_build if echo __ARM_EABI__ | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_EABI__ then - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} else if echo __ARM_PCS_VFP | $CC_FOR_BUILD -E - 2>/dev/null \ | grep -q __ARM_PCS_VFP then - echo ${UNAME_MACHINE}-unknown-linux-gnueabi + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabi else - echo ${UNAME_MACHINE}-unknown-linux-gnueabihf + echo ${UNAME_MACHINE}-unknown-linux-${LIBC}eabihf fi fi exit ;; avr32*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; cris:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; crisv32:Linux:*:*) - echo ${UNAME_MACHINE}-axis-linux-gnu + echo ${UNAME_MACHINE}-axis-linux-${LIBC} exit ;; frv:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; hexagon:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:Linux:*:*) - LIBC=gnu - eval $set_cc_for_build - sed 's/^ //' << EOF >$dummy.c - #ifdef __dietlibc__ - LIBC=dietlibc - #endif -EOF - eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'` - echo "${UNAME_MACHINE}-pc-linux-${LIBC}" + echo ${UNAME_MACHINE}-pc-linux-${LIBC} exit ;; ia64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m32r*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; m68*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; mips:Linux:*:* | mips64:Linux:*:*) eval $set_cc_for_build @@ -955,54 +967,63 @@ #endif EOF eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^CPU'` - test x"${CPU}" != x && { echo "${CPU}-unknown-linux-gnu"; exit; } + test x"${CPU}" != x && { echo "${CPU}-unknown-linux-${LIBC}"; exit; } ;; + or1k:Linux:*:*) + echo 
${UNAME_MACHINE}-unknown-linux-${LIBC} + exit ;; or32:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; padre:Linux:*:*) - echo sparc-unknown-linux-gnu + echo sparc-unknown-linux-${LIBC} exit ;; parisc64:Linux:*:* | hppa64:Linux:*:*) - echo hppa64-unknown-linux-gnu + echo hppa64-unknown-linux-${LIBC} exit ;; parisc:Linux:*:* | hppa:Linux:*:*) # Look for CPU level case `grep '^cpu[^a-z]*:' /proc/cpuinfo 2>/dev/null | cut -d' ' -f2` in - PA7*) echo hppa1.1-unknown-linux-gnu ;; - PA8*) echo hppa2.0-unknown-linux-gnu ;; - *) echo hppa-unknown-linux-gnu ;; + PA7*) echo hppa1.1-unknown-linux-${LIBC} ;; + PA8*) echo hppa2.0-unknown-linux-${LIBC} ;; + *) echo hppa-unknown-linux-${LIBC} ;; esac exit ;; ppc64:Linux:*:*) - echo powerpc64-unknown-linux-gnu + echo powerpc64-unknown-linux-${LIBC} exit ;; ppc:Linux:*:*) - echo powerpc-unknown-linux-gnu + echo powerpc-unknown-linux-${LIBC} + exit ;; + ppc64le:Linux:*:*) + echo powerpc64le-unknown-linux-${LIBC} + exit ;; + ppcle:Linux:*:*) + echo powerpcle-unknown-linux-${LIBC} exit ;; s390:Linux:*:* | s390x:Linux:*:*) - echo ${UNAME_MACHINE}-ibm-linux + echo ${UNAME_MACHINE}-ibm-linux-${LIBC} exit ;; sh64*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sh*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; sparc:Linux:*:* | sparc64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; tile*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; vax:Linux:*:*) - echo ${UNAME_MACHINE}-dec-linux-gnu + echo ${UNAME_MACHINE}-dec-linux-${LIBC} exit ;; x86_64:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; xtensa*:Linux:*:*) - echo ${UNAME_MACHINE}-unknown-linux-gnu + echo ${UNAME_MACHINE}-unknown-linux-${LIBC} exit ;; i*86:DYNIX/ptx:4*:*) # ptx 4.0 does uname -s correctly, with DYNIX/ptx in there. @@ -1235,19 +1256,21 @@ exit ;; *:Darwin:*:*) UNAME_PROCESSOR=`uname -p` || UNAME_PROCESSOR=unknown - case $UNAME_PROCESSOR in - i386) - eval $set_cc_for_build - if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then - if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ - (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ - grep IS_64BIT_ARCH >/dev/null - then - UNAME_PROCESSOR="x86_64" - fi - fi ;; - unknown) UNAME_PROCESSOR=powerpc ;; - esac + eval $set_cc_for_build + if test "$UNAME_PROCESSOR" = unknown ; then + UNAME_PROCESSOR=powerpc + fi + if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then + if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \ + (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \ + grep IS_64BIT_ARCH >/dev/null + then + case $UNAME_PROCESSOR in + i386) UNAME_PROCESSOR=x86_64 ;; + powerpc) UNAME_PROCESSOR=powerpc64 ;; + esac + fi + fi echo ${UNAME_PROCESSOR}-apple-darwin${UNAME_RELEASE} exit ;; *:procnto*:*:* | *:QNX:[0123456789]*:*) diff --git a/config.sub b/config.sub --- a/config.sub +++ b/config.sub @@ -1,10 +1,8 @@ #! /bin/sh # Configuration validation subroutine script. -# Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, -# 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, -# 2011, 2012, 2013 Free Software Foundation, Inc. +# Copyright 1992-2013 Free Software Foundation, Inc. 
-timestamp='2012-12-29' +timestamp='2013-08-10' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -70,9 +68,7 @@ version="\ GNU config.sub ($timestamp) -Copyright (C) 1992, 1993, 1994, 1995, 1996, 1997, 1998, 1999, 2000, -2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, -2012, 2013 Free Software Foundation, Inc. +Copyright 1992-2013 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE." @@ -256,12 +252,12 @@ | alpha | alphaev[4-8] | alphaev56 | alphaev6[78] | alphapca5[67] \ | alpha64 | alpha64ev[4-8] | alpha64ev56 | alpha64ev6[78] | alpha64pca5[67] \ | am33_2.0 \ - | arc \ + | arc | arceb \ | arm | arm[bl]e | arme[lb] | armv[2-8] | armv[3-8][lb] | armv7[arm] \ | avr | avr32 \ | be32 | be64 \ | bfin \ - | c4x | clipper \ + | c4x | c8051 | clipper \ | d10v | d30v | dlx | dsp16xx \ | epiphany \ | fido | fr30 | frv \ @@ -290,16 +286,17 @@ | mipsisa64r2 | mipsisa64r2el \ | mipsisa64sb1 | mipsisa64sb1el \ | mipsisa64sr71k | mipsisa64sr71kel \ + | mipsr5900 | mipsr5900el \ | mipstx39 | mipstx39el \ | mn10200 | mn10300 \ | moxie \ | mt \ | msp430 \ | nds32 | nds32le | nds32be \ - | nios | nios2 \ + | nios | nios2 | nios2eb | nios2el \ | ns16k | ns32k \ | open8 \ - | or32 \ + | or1k | or32 \ | pdp10 | pdp11 | pj | pjl \ | powerpc | powerpc64 | powerpc64le | powerpcle \ | pyramid \ @@ -369,13 +366,13 @@ | aarch64-* | aarch64_be-* \ | alpha-* | alphaev[4-8]-* | alphaev56-* | alphaev6[78]-* \ | alpha64-* | alpha64ev[4-8]-* | alpha64ev56-* | alpha64ev6[78]-* \ - | alphapca5[67]-* | alpha64pca5[67]-* | arc-* \ + | alphapca5[67]-* | alpha64pca5[67]-* | arc-* | arceb-* \ | arm-* | armbe-* | armle-* | armeb-* | armv*-* \ | avr-* | avr32-* \ | be32-* | be64-* \ | bfin-* | bs2000-* \ | c[123]* | c30-* | [cjt]90-* | c4x-* \ - | clipper-* | craynv-* | cydra-* \ + | c8051-* | clipper-* | craynv-* | cydra-* \ | d10v-* | d30v-* | dlx-* \ | elxsi-* \ | f30[01]-* | f700-* | fido-* | fr30-* | frv-* | fx80-* \ @@ -407,12 +404,13 @@ | mipsisa64r2-* | mipsisa64r2el-* \ | mipsisa64sb1-* | mipsisa64sb1el-* \ | mipsisa64sr71k-* | mipsisa64sr71kel-* \ + | mipsr5900-* | mipsr5900el-* \ | mipstx39-* | mipstx39el-* \ | mmix-* \ | mt-* \ | msp430-* \ | nds32-* | nds32le-* | nds32be-* \ - | nios-* | nios2-* \ + | nios-* | nios2-* | nios2eb-* | nios2el-* \ | none-* | np1-* | ns16k-* | ns32k-* \ | open8-* \ | orion-* \ @@ -796,7 +794,7 @@ os=-mingw64 ;; mingw32) - basic_machine=i386-pc + basic_machine=i686-pc os=-mingw32 ;; mingw32ce) @@ -832,7 +830,7 @@ basic_machine=`echo $basic_machine | sed -e 's/ms1-/mt-/'` ;; msys) - basic_machine=i386-pc + basic_machine=i686-pc os=-msys ;; mvs) @@ -1354,7 +1352,7 @@ -gnu* | -bsd* | -mach* | -minix* | -genix* | -ultrix* | -irix* \ | -*vms* | -sco* | -esix* | -isc* | -aix* | -cnk* | -sunos | -sunos[34]*\ | -hpux* | -unos* | -osf* | -luna* | -dgux* | -auroraux* | -solaris* \ - | -sym* | -kopensolaris* \ + | -sym* | -kopensolaris* | -plan9* \ | -amigaos* | -amigados* | -msdos* | -newsos* | -unicos* | -aof* \ | -aos* | -aros* \ | -nindy* | -vxsim* | -vxworks* | -ebmon* | -hms* | -mvs* \ @@ -1500,9 +1498,6 @@ -aros*) os=-aros ;; - -kaos*) - os=-kaos - ;; -zvmoe) os=-zvmoe ;; @@ -1551,6 +1546,9 @@ c4x-* | tic4x-*) os=-coff ;; + c8051-*) + os=-elf + ;; hexagon-*) os=-elf ;; @@ -1594,6 +1592,9 @@ mips*-*) os=-elf ;; + or1k-*) + os=-elf + ;; or32-*) os=-coff ;; -- 
Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 09:13:17 2013 From: python-checkins at python.org (matthias.klose) Date: Tue, 19 Nov 2013 09:13:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_-_Remove_execu?= =?utf-8?q?te_permissions_from_test=5Fdbm=5Fgnu=2Epy_and_test=5Fdbm=5Fndbm?= =?utf-8?b?LnB5?= Message-ID: <3dP0Cd2zCpz7S1B@mail.python.org> http://hg.python.org/cpython/rev/26c73799715f changeset: 87259:26c73799715f branch: 3.3 parent: 87257:5adb2fca72c2 user: doko at ubuntu.com date: Tue Nov 19 09:12:28 2013 +0100 summary: - Remove execute permissions from test_dbm_gnu.py and test_dbm_ndbm.py files: Lib/test/test_dbm_gnu.py | 0 Lib/test/test_dbm_ndbm.py | 0 2 files changed, 0 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_dbm_gnu.py b/Lib/test/test_dbm_gnu.py old mode 100755 new mode 100644 diff --git a/Lib/test/test_dbm_ndbm.py b/Lib/test/test_dbm_ndbm.py old mode 100755 new mode 100644 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 09:13:18 2013 From: python-checkins at python.org (matthias.klose) Date: Tue, 19 Nov 2013 09:13:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_-_Remove_execute_permissions_from_test=5Fdbm=5Fgnu=2Epy_?= =?utf-8?q?and_test=5Fdbm=5Fndbm=2Epy?= Message-ID: <3dP0Cf4n3Tz7bFh@mail.python.org> http://hg.python.org/cpython/rev/f25c1cb21c14 changeset: 87260:f25c1cb21c14 parent: 87258:62acba59660f parent: 87259:26c73799715f user: doko at ubuntu.com date: Tue Nov 19 09:12:50 2013 +0100 summary: - Remove execute permissions from test_dbm_gnu.py and test_dbm_ndbm.py files: Lib/test/test_dbm_gnu.py | 0 Lib/test/test_dbm_ndbm.py | 0 2 files changed, 0 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_dbm_gnu.py b/Lib/test/test_dbm_gnu.py old mode 100755 new mode 100644 diff --git a/Lib/test/test_dbm_ndbm.py b/Lib/test/test_dbm_ndbm.py old mode 100755 new mode 100644 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 10:33:21 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 19 Nov 2013 10:33:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2312892=3A_The_utf-?= =?utf-8?q?16*_and_utf-32*_codecs_now_reject_=28lone=29_surrogates=2E?= Message-ID: <3dP2012GN5zSW4@mail.python.org> http://hg.python.org/cpython/rev/0d9624f2ff43 changeset: 87261:0d9624f2ff43 user: Serhiy Storchaka date: Tue Nov 19 11:32:41 2013 +0200 summary: Issue #12892: The utf-16* and utf-32* codecs now reject (lone) surrogates. The utf-16* and utf-32* encoders no longer allow surrogate code points (U+D800-U+DFFF) to be encoded. The utf-32* decoders no longer decode byte sequences that correspond to surrogate code points. The surrogatepass error handler now works with the utf-16* and utf-32* codecs. Based on patches by Victor Stinner and Kang-Hao (Kenny) Lu. files: Doc/library/codecs.rst | 25 +- Doc/whatsnew/3.4.rst | 7 + Lib/test/test_codecs.py | 68 +++++- Misc/ACKS | 1 + Misc/NEWS | 6 + Objects/stringlib/codecs.h | 198 +++++++++++++++++- Objects/unicodeobject.c | 257 ++++++++++++++++++++++-- Python/codecs.c | 163 ++++++++++++++- 8 files changed, 643 insertions(+), 82 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -365,18 +365,23 @@ | | in :pep:`383`. 
| +-------------------------+-----------------------------------------------+ -In addition, the following error handlers are specific to a single codec: +In addition, the following error handlers are specific to Unicode encoding +schemes: -+-------------------+---------+-------------------------------------------+ -| Value | Codec | Meaning | -+===================+=========+===========================================+ -|``'surrogatepass'``| utf-8 | Allow encoding and decoding of surrogate | -| | | codes in UTF-8. | -+-------------------+---------+-------------------------------------------+ ++-------------------+------------------------+-------------------------------------------+ +| Value | Codec | Meaning | ++===================+========================+===========================================+ +|``'surrogatepass'``| utf-8, utf-16, utf-32, | Allow encoding and decoding of surrogate | +| | utf-16-be, utf-16-le, | codes in all the Unicode encoding schemes.| +| | utf-32-be, utf-32-le | | ++-------------------+------------------------+-------------------------------------------+ .. versionadded:: 3.1 The ``'surrogateescape'`` and ``'surrogatepass'`` error handlers. +.. versionchanged:: 3.4 + The ``'surrogatepass'`` error handlers now works with utf-16\* and utf-32\* codecs. + The set of allowed values can be extended via :meth:`register_error`. @@ -1167,6 +1172,12 @@ | utf_8_sig | | all languages | +-----------------+--------------------------------+--------------------------------+ +.. versionchanged:: 3.4 + The utf-16\* and utf-32\* encoders no longer allow surrogate code points + (U+D800--U+DFFF) to be encoded. The utf-32\* decoders no longer decode + byte sequences that correspond to surrogate code points. + + Python Specific Encodings ------------------------- diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -253,6 +253,13 @@ ``__main__.__file__`` when a script has been executed directly using a relative path (Contributed by Brett Cannon in :issue:`18416`). +* Now all the UTF-\* codecs (except UTF-7) reject surrogates during both + encoding and decoding unless the ``surrogatepass`` error handler is used, + with the exception of the UTF-16 decoder that accepts valid surrogate pairs, + and the UTF-16 encoder that produces them while encoding non-BMP characters. + Contributed by Victor Stinner, Kang-Hao (Kenny) Lu and Serhiy Storchaka in + :issue:`12892`. 
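As a concrete illustration of the behaviour described in the paragraph above (not part of the patch itself): with this changeset applied, the strict utf-16* and utf-32* codecs refuse lone surrogates, the 'surrogatepass' handler lets them round-trip, and well-formed surrogate pairs produced by the UTF-16 encoder for non-BMP characters keep working. A minimal sketch, assuming an interpreter built from this revision:

    # Lone surrogates are now rejected by the strict utf-16*/utf-32* codecs.
    try:
        "\ud800".encode("utf-16-le")
    except UnicodeEncodeError as err:
        print(err.reason)                    # "surrogates not allowed"

    # The 'surrogatepass' handler now round-trips them for these codecs too.
    blob = "a\udc80b".encode("utf-32-le", "surrogatepass")
    assert blob.decode("utf-32-le", "surrogatepass") == "a\udc80b"

    # Valid surrogate pairs for non-BMP characters are unaffected.
    assert "\U0001f49d".encode("utf-16").decode("utf-16") == "\U0001f49d"
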
+ New Modules =========== diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -300,8 +300,46 @@ self.assertEqual(reader.readline(), s5) self.assertEqual(reader.readline(), "") + ill_formed_sequence_replace = "\ufffd" + + def test_lone_surrogates(self): + self.assertRaises(UnicodeEncodeError, "\ud800".encode, self.encoding) + self.assertEqual("[\uDC80]".encode(self.encoding, "backslashreplace"), + "[\\udc80]".encode(self.encoding)) + self.assertEqual("[\uDC80]".encode(self.encoding, "xmlcharrefreplace"), + "[�]".encode(self.encoding)) + self.assertEqual("[\uDC80]".encode(self.encoding, "ignore"), + "[]".encode(self.encoding)) + self.assertEqual("[\uDC80]".encode(self.encoding, "replace"), + "[?]".encode(self.encoding)) + + bom = "".encode(self.encoding) + for before, after in [("\U00010fff", "A"), ("[", "]"), + ("A", "\U00010fff")]: + before_sequence = before.encode(self.encoding)[len(bom):] + after_sequence = after.encode(self.encoding)[len(bom):] + test_string = before + "\uDC80" + after + test_sequence = (bom + before_sequence + + self.ill_formed_sequence + after_sequence) + self.assertRaises(UnicodeDecodeError, test_sequence.decode, + self.encoding) + self.assertEqual(test_string.encode(self.encoding, + "surrogatepass"), + test_sequence) + self.assertEqual(test_sequence.decode(self.encoding, + "surrogatepass"), + test_string) + self.assertEqual(test_sequence.decode(self.encoding, "ignore"), + before + after) + self.assertEqual(test_sequence.decode(self.encoding, "replace"), + before + self.ill_formed_sequence_replace + after) + class UTF32Test(ReadTest, unittest.TestCase): encoding = "utf-32" + if sys.byteorder == 'little': + ill_formed_sequence = b"\x80\xdc\x00\x00" + else: + ill_formed_sequence = b"\x00\x00\xdc\x80" spamle = (b'\xff\xfe\x00\x00' b's\x00\x00\x00p\x00\x00\x00a\x00\x00\x00m\x00\x00\x00' @@ -393,6 +431,7 @@ class UTF32LETest(ReadTest, unittest.TestCase): encoding = "utf-32-le" + ill_formed_sequence = b"\x80\xdc\x00\x00" def test_partial(self): self.check_partial( @@ -437,6 +476,7 @@ class UTF32BETest(ReadTest, unittest.TestCase): encoding = "utf-32-be" + ill_formed_sequence = b"\x00\x00\xdc\x80" def test_partial(self): self.check_partial( @@ -482,6 +522,10 @@ class UTF16Test(ReadTest, unittest.TestCase): encoding = "utf-16" + if sys.byteorder == 'little': + ill_formed_sequence = b"\x80\xdc" + else: + ill_formed_sequence = b"\xdc\x80" spamle = b'\xff\xfes\x00p\x00a\x00m\x00s\x00p\x00a\x00m\x00' spambe = b'\xfe\xff\x00s\x00p\x00a\x00m\x00s\x00p\x00a\x00m' @@ -562,6 +606,7 @@ class UTF16LETest(ReadTest, unittest.TestCase): encoding = "utf-16-le" + ill_formed_sequence = b"\x80\xdc" def test_partial(self): self.check_partial( @@ -605,6 +650,7 @@ class UTF16BETest(ReadTest, unittest.TestCase): encoding = "utf-16-be" + ill_formed_sequence = b"\xdc\x80" def test_partial(self): self.check_partial( @@ -648,6 +694,8 @@ class UTF8Test(ReadTest, unittest.TestCase): encoding = "utf-8" + ill_formed_sequence = b"\xed\xb2\x80" + ill_formed_sequence_replace = "\ufffd" * 3 def test_partial(self): self.check_partial( @@ -677,18 +725,11 @@ u, u.encode(self.encoding)) def test_lone_surrogates(self): - self.assertRaises(UnicodeEncodeError, "\ud800".encode, "utf-8") - self.assertRaises(UnicodeDecodeError, b"\xed\xa0\x80".decode, "utf-8") - self.assertEqual("[\uDC80]".encode("utf-8", "backslashreplace"), - b'[\\udc80]') - self.assertEqual("[\uDC80]".encode("utf-8", "xmlcharrefreplace"), - b'[�]') - 
self.assertEqual("[\uDC80]".encode("utf-8", "surrogateescape"), + super().test_lone_surrogates() + # not sure if this is making sense for + # UTF-16 and UTF-32 + self.assertEqual("[\uDC80]".encode('utf-8', "surrogateescape"), b'[\x80]') - self.assertEqual("[\uDC80]".encode("utf-8", "ignore"), - b'[]') - self.assertEqual("[\uDC80]".encode("utf-8", "replace"), - b'[?]') def test_surrogatepass_handler(self): self.assertEqual("abc\ud800def".encode("utf-8", "surrogatepass"), @@ -851,6 +892,9 @@ self.assertEqual('\ud801\udca0'.encode(self.encoding), b'+2AHcoA-') self.assertEqual(b'+2AHcoA-'.decode(self.encoding), '\U000104A0') + test_lone_surrogates = None + + class UTF16ExTest(unittest.TestCase): def test_errors(self): @@ -875,7 +919,7 @@ self.assertRaises(TypeError, codecs.readbuffer_encode) self.assertRaises(TypeError, codecs.readbuffer_encode, 42) -class UTF8SigTest(ReadTest, unittest.TestCase): +class UTF8SigTest(UTF8Test, unittest.TestCase): encoding = "utf-8-sig" def test_partial(self): diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -783,6 +783,7 @@ Jason Lowe Tony Lownds Ray Loyzaga +Kang-Hao (Kenny) Lu Lukas Lueg Loren Luke Fredrik Lundh diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,12 @@ Core and Builtins ----------------- +- Issue #12892: The utf-16* and utf-32* encoders no longer allow surrogate code + points (U+D800-U+DFFF) to be encoded. The utf-32* decoders no longer decode + byte sequences that correspond to surrogate code points. The surrogatepass + error handler now works with the utf-16* and utf-32* codecs. Based on + patches by Victor Stinner and Kang-Hao (Kenny) Lu. + - Issue #17806: Added keyword-argument support for "tabsize" to str/bytes.expandtabs(). diff --git a/Objects/stringlib/codecs.h b/Objects/stringlib/codecs.h --- a/Objects/stringlib/codecs.h +++ b/Objects/stringlib/codecs.h @@ -596,26 +596,134 @@ #undef SWAB -Py_LOCAL_INLINE(void) -STRINGLIB(utf16_encode)(unsigned short *out, - const STRINGLIB_CHAR *in, +#if STRINGLIB_MAX_CHAR >= 0x80 +Py_LOCAL_INLINE(Py_ssize_t) +STRINGLIB(utf16_encode_)(const STRINGLIB_CHAR *in, Py_ssize_t len, + unsigned short **outptr, int native_ordering) { + unsigned short *out = *outptr; const STRINGLIB_CHAR *end = in + len; #if STRINGLIB_SIZEOF_CHAR == 1 # define SWAB2(CH) ((CH) << 8) #else # define SWAB2(CH) (((CH) << 8) | ((CH) >> 8)) #endif + if (native_ordering) { #if STRINGLIB_MAX_CHAR < 0x10000 + const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); + while (in < unrolled_end) { +# if STRINGLIB_MAX_CHAR >= 0xd800 + if (((in[0] ^ 0xd800) & + (in[1] ^ 0xd800) & + (in[2] ^ 0xd800) & + (in[3] ^ 0xd800) & 0xf800) == 0) + break; +# endif + out[0] = in[0]; + out[1] = in[1]; + out[2] = in[2]; + out[3] = in[3]; + in += 4; out += 4; + } +#endif + while (in < end) { + Py_UCS4 ch; + ch = *in++; +#if STRINGLIB_MAX_CHAR >= 0xd800 + if (ch < 0xd800) + *out++ = ch; + else if (ch < 0xe000) + /* reject surrogate characters (U+DC800-U+DFFF) */ + goto fail; +# if STRINGLIB_MAX_CHAR >= 0x10000 + else if (ch >= 0x10000) { + out[0] = Py_UNICODE_HIGH_SURROGATE(ch); + out[1] = Py_UNICODE_LOW_SURROGATE(ch); + out += 2; + } +# endif + else +#endif + *out++ = ch; + } + } else { +#if STRINGLIB_MAX_CHAR < 0x10000 + const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); + while (in < unrolled_end) { +# if STRINGLIB_MAX_CHAR >= 0xd800 + if (((in[0] ^ 0xd800) & + (in[1] ^ 0xd800) & + (in[2] ^ 0xd800) & + (in[3] ^ 0xd800) & 0xf800) == 0) + break; +# endif + out[0] = 
SWAB2(in[0]); + out[1] = SWAB2(in[1]); + out[2] = SWAB2(in[2]); + out[3] = SWAB2(in[3]); + in += 4; out += 4; + } +#endif + while (in < end) { + Py_UCS4 ch = *in++; +#if STRINGLIB_MAX_CHAR >= 0xd800 + if (ch < 0xd800) + *out++ = SWAB2((Py_UCS2)ch); + else if (ch < 0xe000) + /* reject surrogate characters (U+DC800-U+DFFF) */ + goto fail; +# if STRINGLIB_MAX_CHAR >= 0x10000 + else if (ch >= 0x10000) { + Py_UCS2 ch1 = Py_UNICODE_HIGH_SURROGATE(ch); + Py_UCS2 ch2 = Py_UNICODE_LOW_SURROGATE(ch); + out[0] = SWAB2(ch1); + out[1] = SWAB2(ch2); + out += 2; + } +# endif + else +#endif + *out++ = SWAB2((Py_UCS2)ch); + } + } + *outptr = out; + return len; +#if STRINGLIB_MAX_CHAR >= 0xd800 + fail: +#endif + *outptr = out; + return len - (end - in + 1); +} +#endif + +#undef SWAB2 + +#if STRINGLIB_MAX_CHAR >= 0x80 +Py_LOCAL_INLINE(Py_ssize_t) +STRINGLIB(utf16_encode)(const STRINGLIB_CHAR *in, + Py_ssize_t len, + unsigned short **outptr, + int native_ordering) +{ + unsigned short *out = *outptr; + const STRINGLIB_CHAR *end = in + len; +#if STRINGLIB_SIZEOF_CHAR == 1 if (native_ordering) { -# if STRINGLIB_SIZEOF_CHAR == 2 - Py_MEMCPY(out, in, 2 * len); -# else - _PyUnicode_CONVERT_BYTES(STRINGLIB_CHAR, unsigned short, in, end, out); -# endif + const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); + while (in < unrolled_end) { + out[0] = in[0]; + out[1] = in[1]; + out[2] = in[2]; + out[3] = in[3]; + in += 4; out += 4; + } + while (in < end) { + *out++ = *in++; + } } else { +# define SWAB2(CH) ((CH) << 8) /* high byte is zero */ const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); while (in < unrolled_end) { out[0] = SWAB2(in[0]); @@ -625,37 +733,95 @@ in += 4; out += 4; } while (in < end) { - *out++ = SWAB2(*in); - ++in; + Py_UCS4 ch = *in++; + *out++ = SWAB2((Py_UCS2)ch); } +#undef SWAB2 } + *outptr = out; + return len; #else if (native_ordering) { +#if STRINGLIB_MAX_CHAR < 0x10000 + const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); + while (in < unrolled_end) { + /* check if any character is a surrogate character */ + if (((in[0] ^ 0xd800) & + (in[1] ^ 0xd800) & + (in[2] ^ 0xd800) & + (in[3] ^ 0xd800) & 0xf800) == 0) + break; + out[0] = in[0]; + out[1] = in[1]; + out[2] = in[2]; + out[3] = in[3]; + in += 4; out += 4; + } +#endif while (in < end) { - Py_UCS4 ch = *in++; - if (ch < 0x10000) + Py_UCS4 ch; + ch = *in++; + if (ch < 0xd800) *out++ = ch; - else { + else if (ch < 0xe000) + /* reject surrogate characters (U+DC800-U+DFFF) */ + goto fail; +#if STRINGLIB_MAX_CHAR >= 0x10000 + else if (ch >= 0x10000) { out[0] = Py_UNICODE_HIGH_SURROGATE(ch); out[1] = Py_UNICODE_LOW_SURROGATE(ch); out += 2; } +#endif + else + *out++ = ch; } } else { +#define SWAB2(CH) (((CH) << 8) | ((CH) >> 8)) +#if STRINGLIB_MAX_CHAR < 0x10000 + const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); + while (in < unrolled_end) { + /* check if any character is a surrogate character */ + if (((in[0] ^ 0xd800) & + (in[1] ^ 0xd800) & + (in[2] ^ 0xd800) & + (in[3] ^ 0xd800) & 0xf800) == 0) + break; + out[0] = SWAB2(in[0]); + out[1] = SWAB2(in[1]); + out[2] = SWAB2(in[2]); + out[3] = SWAB2(in[3]); + in += 4; out += 4; + } +#endif while (in < end) { Py_UCS4 ch = *in++; - if (ch < 0x10000) + if (ch < 0xd800) *out++ = SWAB2((Py_UCS2)ch); - else { + else if (ch < 0xe000) + /* reject surrogate characters (U+DC800-U+DFFF) */ + goto fail; +#if STRINGLIB_MAX_CHAR >= 0x10000 + else if (ch >= 0x10000) { Py_UCS2 ch1 = Py_UNICODE_HIGH_SURROGATE(ch); Py_UCS2 ch2 = 
Py_UNICODE_LOW_SURROGATE(ch); out[0] = SWAB2(ch1); out[1] = SWAB2(ch2); out += 2; } +#endif + else + *out++ = SWAB2((Py_UCS2)ch); } +#undef SWAB2 } + *outptr = out; + return len; + fail: + *outptr = out; + return len - (end - in + 1); #endif -#undef SWAB2 } +#endif + #endif /* STRINGLIB_IS_UNICODE */ diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -4963,6 +4963,7 @@ _PyUnicodeWriter writer; const unsigned char *q, *e; int le, bo = 0; /* assume native ordering by default */ + const char *encoding; const char *errmsg = ""; PyObject *errorHandler = NULL; PyObject *exc = NULL; @@ -5002,6 +5003,7 @@ #else le = bo <= 0; #endif + encoding = le ? "utf-32-le" : "utf-32-be"; _PyUnicodeWriter_Init(&writer); writer.min_length = (e - q + 3) / 4; @@ -5022,6 +5024,9 @@ ch = (q[3] << 24) | (q[2] << 16) | (q[1] << 8) | q[0]; if (ch > maxch) break; + if (kind != PyUnicode_1BYTE_KIND && + Py_UNICODE_IS_SURROGATE(ch)) + break; PyUnicode_WRITE(kind, data, pos++, ch); q += 4; } while (q <= last); @@ -5031,6 +5036,9 @@ ch = (q[0] << 24) | (q[1] << 16) | (q[2] << 8) | q[3]; if (ch > maxch) break; + if (kind != PyUnicode_1BYTE_KIND && + Py_UNICODE_IS_SURROGATE(ch)) + break; PyUnicode_WRITE(kind, data, pos++, ch); q += 4; } while (q <= last); @@ -5038,7 +5046,12 @@ writer.pos = pos; } - if (ch <= maxch) { + if (Py_UNICODE_IS_SURROGATE(ch)) { + errmsg = "codepoint in surrogate code point range(0xd800, 0xe000)"; + startinpos = ((const char *)q) - starts; + endinpos = startinpos + 4; + } + else if (ch <= maxch) { if (q == e || consumed) break; /* remaining bytes at the end? (size should be divisible by 4) */ @@ -5062,7 +5075,7 @@ chooses to skip the input */ if (unicode_decode_call_errorhandler_writer( errors, &errorHandler, - "utf32", errmsg, + encoding, errmsg, &starts, (const char **)&e, &startinpos, &endinpos, &exc, (const char **)&q, &writer)) goto onError; @@ -5099,6 +5112,10 @@ #else int iorder[] = {3, 2, 1, 0}; #endif + const char *encoding; + PyObject *errorHandler = NULL; + PyObject *exc = NULL; + PyObject *rep = NULL; #define STORECHAR(CH) \ do { \ @@ -5130,7 +5147,7 @@ if (byteorder == 0) STORECHAR(0xFEFF); if (len == 0) - goto done; + return v; if (byteorder == -1) { /* force LE */ @@ -5138,6 +5155,7 @@ iorder[1] = 1; iorder[2] = 2; iorder[3] = 3; + encoding = "utf-32-le"; } else if (byteorder == 1) { /* force BE */ @@ -5145,13 +5163,103 @@ iorder[1] = 2; iorder[2] = 1; iorder[3] = 0; - } - - for (i = 0; i < len; i++) - STORECHAR(PyUnicode_READ(kind, data, i)); - - done: + encoding = "utf-32-be"; + } + else + encoding = "utf-32"; + + if (kind == PyUnicode_1BYTE_KIND) { + for (i = 0; i < len; i++) + STORECHAR(PyUnicode_READ(kind, data, i)); + return v; + } + + for (i = 0; i < len;) { + Py_ssize_t repsize, moreunits; + Py_UCS4 ch = PyUnicode_READ(kind, data, i); + i++; + assert(ch <= MAX_UNICODE); + if (!Py_UNICODE_IS_SURROGATE(ch)) { + STORECHAR(ch); + continue; + } + + rep = unicode_encode_call_errorhandler( + errors, &errorHandler, + encoding, "surrogates not allowed", + str, &exc, i-1, i, &i); + + if (!rep) + goto error; + + if (PyBytes_Check(rep)) { + repsize = PyBytes_GET_SIZE(rep); + if (repsize & 3) { + raise_encode_exception(&exc, encoding, + str, i - 1, i, + "surrogates not allowed"); + goto error; + } + moreunits = repsize / 4; + } + else { + assert(PyUnicode_Check(rep)); + if (PyUnicode_READY(rep) < 0) + goto error; + moreunits = repsize = PyUnicode_GET_LENGTH(rep); + if (!PyUnicode_IS_ASCII(rep)) { + 
raise_encode_exception(&exc, encoding, + str, i - 1, i, + "surrogates not allowed"); + goto error; + } + } + + /* four bytes are reserved for each surrogate */ + if (moreunits > 1) { + Py_ssize_t outpos = p - (unsigned char*) PyBytes_AS_STRING(v); + Py_ssize_t morebytes = 4 * (moreunits - 1); + if (PyBytes_GET_SIZE(v) > PY_SSIZE_T_MAX - morebytes) { + /* integer overflow */ + PyErr_NoMemory(); + goto error; + } + if (_PyBytes_Resize(&v, PyBytes_GET_SIZE(v) + morebytes) < 0) + goto error; + p = (unsigned char*) PyBytes_AS_STRING(v) + outpos; + } + + if (PyBytes_Check(rep)) { + Py_MEMCPY(p, PyBytes_AS_STRING(rep), repsize); + p += repsize; + } else /* rep is unicode */ { + const Py_UCS1 *repdata; + assert(PyUnicode_KIND(rep) == PyUnicode_1BYTE_KIND); + repdata = PyUnicode_1BYTE_DATA(rep); + while (repsize--) { + Py_UCS4 ch = *repdata++; + STORECHAR(ch); + } + } + + Py_CLEAR(rep); + } + + /* Cut back to size actually needed. This is necessary for, for example, + encoding of a string containing isolated surrogates and the 'ignore' + handler is used. */ + nsize = p - (unsigned char*) PyBytes_AS_STRING(v); + if (nsize != PyBytes_GET_SIZE(v)) + _PyBytes_Resize(&v, nsize); + Py_XDECREF(errorHandler); + Py_XDECREF(exc); return v; + error: + Py_XDECREF(rep); + Py_XDECREF(errorHandler); + Py_XDECREF(exc); + Py_XDECREF(v); + return NULL; #undef STORECHAR } @@ -5204,6 +5312,7 @@ const char *errmsg = ""; PyObject *errorHandler = NULL; PyObject *exc = NULL; + const char *encoding; q = (unsigned char *)s; e = q + size; @@ -5237,8 +5346,10 @@ #if PY_LITTLE_ENDIAN native_ordering = bo <= 0; + encoding = bo <= 0 ? "utf-16-le" : "utf-16-be"; #else native_ordering = bo >= 0; + encoding = bo >= 0 ? "utf-16-be" : "utf-16-le"; #endif /* Note: size will always be longer than the resulting Unicode @@ -5312,7 +5423,7 @@ if (unicode_decode_call_errorhandler_writer( errors, &errorHandler, - "utf16", errmsg, + encoding, errmsg, &starts, (const char **)&e, &startinpos, @@ -5348,13 +5459,17 @@ Py_ssize_t len; PyObject *v; unsigned short *out; - Py_ssize_t bytesize; Py_ssize_t pairs; #if PY_BIG_ENDIAN int native_ordering = byteorder >= 0; #else int native_ordering = byteorder <= 0; #endif + const char *encoding; + Py_ssize_t nsize, pos; + PyObject *errorHandler = NULL; + PyObject *exc = NULL; + PyObject *rep = NULL; if (!PyUnicode_Check(str)) { PyErr_BadArgument(); @@ -5376,8 +5491,8 @@ } if (len > PY_SSIZE_T_MAX / 2 - pairs - (byteorder == 0)) return PyErr_NoMemory(); - bytesize = (len + pairs + (byteorder == 0)) * 2; - v = PyBytes_FromStringAndSize(NULL, bytesize); + nsize = len + pairs + (byteorder == 0); + v = PyBytes_FromStringAndSize(NULL, nsize * 2); if (v == NULL) return NULL; @@ -5389,25 +5504,107 @@ if (len == 0) goto done; - switch (kind) { - case PyUnicode_1BYTE_KIND: { - ucs1lib_utf16_encode(out, (const Py_UCS1 *)data, len, native_ordering); - break; - } - case PyUnicode_2BYTE_KIND: { - ucs2lib_utf16_encode(out, (const Py_UCS2 *)data, len, native_ordering); - break; - } - case PyUnicode_4BYTE_KIND: { - ucs4lib_utf16_encode(out, (const Py_UCS4 *)data, len, native_ordering); - break; - } - default: - assert(0); - } - + if (kind == PyUnicode_1BYTE_KIND) { + ucs1lib_utf16_encode((const Py_UCS1 *)data, len, &out, native_ordering); + goto done; + } + + if (byteorder < 0) + encoding = "utf-16-le"; + else if (byteorder > 0) + encoding = "utf-16-be"; + else + encoding = "utf-16"; + + pos = 0; + while (pos < len) { + Py_ssize_t repsize, moreunits; + + if (kind == PyUnicode_2BYTE_KIND) { + pos += 
ucs2lib_utf16_encode((const Py_UCS2 *)data + pos, len - pos, + &out, native_ordering); + } + else { + assert(kind == PyUnicode_4BYTE_KIND); + pos += ucs4lib_utf16_encode((const Py_UCS4 *)data + pos, len - pos, + &out, native_ordering); + } + if (pos == len) + break; + + rep = unicode_encode_call_errorhandler( + errors, &errorHandler, + encoding, "surrogates not allowed", + str, &exc, pos, pos + 1, &pos); + if (!rep) + goto error; + + if (PyBytes_Check(rep)) { + repsize = PyBytes_GET_SIZE(rep); + if (repsize & 1) { + raise_encode_exception(&exc, encoding, + str, pos - 1, pos, + "surrogates not allowed"); + goto error; + } + moreunits = repsize / 2; + } + else { + assert(PyUnicode_Check(rep)); + if (PyUnicode_READY(rep) < 0) + goto error; + moreunits = repsize = PyUnicode_GET_LENGTH(rep); + if (!PyUnicode_IS_ASCII(rep)) { + raise_encode_exception(&exc, encoding, + str, pos - 1, pos, + "surrogates not allowed"); + goto error; + } + } + + /* two bytes are reserved for each surrogate */ + if (moreunits > 1) { + Py_ssize_t outpos = out - (unsigned short*) PyBytes_AS_STRING(v); + Py_ssize_t morebytes = 2 * (moreunits - 1); + if (PyBytes_GET_SIZE(v) > PY_SSIZE_T_MAX - morebytes) { + /* integer overflow */ + PyErr_NoMemory(); + goto error; + } + if (_PyBytes_Resize(&v, PyBytes_GET_SIZE(v) + morebytes) < 0) + goto error; + out = (unsigned short*) PyBytes_AS_STRING(v) + outpos; + } + + if (PyBytes_Check(rep)) { + Py_MEMCPY(out, PyBytes_AS_STRING(rep), repsize); + out += moreunits; + } else /* rep is unicode */ { + assert(PyUnicode_KIND(rep) == PyUnicode_1BYTE_KIND); + ucs1lib_utf16_encode(PyUnicode_1BYTE_DATA(rep), repsize, + &out, native_ordering); + } + + Py_CLEAR(rep); + } + + /* Cut back to size actually needed. This is necessary for, for example, + encoding of a string containing isolated surrogates and the 'ignore' handler + is used. 
*/ + nsize = (unsigned char*) out - (unsigned char*) PyBytes_AS_STRING(v); + if (nsize != PyBytes_GET_SIZE(v)) + _PyBytes_Resize(&v, nsize); + Py_XDECREF(errorHandler); + Py_XDECREF(exc); done: return v; + error: + Py_XDECREF(rep); + Py_XDECREF(errorHandler); + Py_XDECREF(exc); + Py_XDECREF(v); + return NULL; +#undef STORECHAR } PyObject * diff --git a/Python/codecs.c b/Python/codecs.c --- a/Python/codecs.c +++ b/Python/codecs.c @@ -753,6 +753,65 @@ } } +#define ENC_UTF8 0 +#define ENC_UTF16BE 1 +#define ENC_UTF16LE 2 +#define ENC_UTF32BE 3 +#define ENC_UTF32LE 4 + +static int +get_standard_encoding(const char *encoding, int *bytelength) +{ + if (Py_TOLOWER(encoding[0]) == 'u' && + Py_TOLOWER(encoding[1]) == 't' && + Py_TOLOWER(encoding[2]) == 'f') { + encoding += 3; + if (*encoding == '-' || *encoding == '_' ) + encoding++; + if (encoding[0] == '1' && encoding[1] == '6') { + encoding += 2; + *bytelength = 2; + if (*encoding == '\0') { +#ifdef WORDS_BIGENDIAN + return ENC_UTF16BE; +#else + return ENC_UTF16LE; +#endif + } + if (*encoding == '-' || *encoding == '_' ) + encoding++; + if (Py_TOLOWER(encoding[1]) == 'e' && encoding[2] == '\0') { + if (Py_TOLOWER(encoding[0]) == 'b') + return ENC_UTF16BE; + if (Py_TOLOWER(encoding[0]) == 'l') + return ENC_UTF16LE; + } + } + else if (encoding[0] == '3' && encoding[1] == '2') { + encoding += 2; + *bytelength = 4; + if (*encoding == '\0') { +#ifdef WORDS_BIGENDIAN + return ENC_UTF32BE; +#else + return ENC_UTF32LE; +#endif + } + if (*encoding == '-' || *encoding == '_' ) + encoding++; + if (Py_TOLOWER(encoding[1]) == 'e' && encoding[2] == '\0') { + if (Py_TOLOWER(encoding[0]) == 'b') + return ENC_UTF32BE; + if (Py_TOLOWER(encoding[0]) == 'l') + return ENC_UTF32LE; + } + } + } + /* utf-8 */ + *bytelength = 3; + return ENC_UTF8; +} + /* This handler is declared static until someone demonstrates a need to call it directly. 
*/ static PyObject * @@ -760,24 +819,40 @@ { PyObject *restuple; PyObject *object; + PyObject *encode; + char *encoding; + int code; + int bytelength; Py_ssize_t i; Py_ssize_t start; Py_ssize_t end; PyObject *res; if (PyObject_IsInstance(exc, PyExc_UnicodeEncodeError)) { - char *outp; + unsigned char *outp; if (PyUnicodeEncodeError_GetStart(exc, &start)) return NULL; if (PyUnicodeEncodeError_GetEnd(exc, &end)) return NULL; if (!(object = PyUnicodeEncodeError_GetObject(exc))) return NULL; - res = PyBytes_FromStringAndSize(NULL, 3*(end-start)); + if (!(encode = PyUnicodeEncodeError_GetEncoding(exc))) { + Py_DECREF(object); + return NULL; + } + if (!(encoding = PyUnicode_AsUTF8(encode))) { + Py_DECREF(object); + Py_DECREF(encode); + return NULL; + } + code = get_standard_encoding(encoding, &bytelength); + Py_DECREF(encode); + + res = PyBytes_FromStringAndSize(NULL, bytelength*(end-start)); if (!res) { Py_DECREF(object); return NULL; } - outp = PyBytes_AsString(res); + outp = (unsigned char*)PyBytes_AsString(res); for (i = start; i < end; i++) { /* object is guaranteed to be "ready" */ Py_UCS4 ch = PyUnicode_READ_CHAR(object, i); @@ -788,9 +863,33 @@ Py_DECREF(object); return NULL; } - *outp++ = (char)(0xe0 | (ch >> 12)); - *outp++ = (char)(0x80 | ((ch >> 6) & 0x3f)); - *outp++ = (char)(0x80 | (ch & 0x3f)); + switch (code) { + case ENC_UTF8: + *outp++ = (unsigned char)(0xe0 | (ch >> 12)); + *outp++ = (unsigned char)(0x80 | ((ch >> 6) & 0x3f)); + *outp++ = (unsigned char)(0x80 | (ch & 0x3f)); + break; + case ENC_UTF16LE: + *outp++ = (unsigned char) ch; + *outp++ = (unsigned char)(ch >> 8); + break; + case ENC_UTF16BE: + *outp++ = (unsigned char)(ch >> 8); + *outp++ = (unsigned char) ch; + break; + case ENC_UTF32LE: + *outp++ = (unsigned char) ch; + *outp++ = (unsigned char)(ch >> 8); + *outp++ = (unsigned char)(ch >> 16); + *outp++ = (unsigned char)(ch >> 24); + break; + case ENC_UTF32BE: + *outp++ = (unsigned char)(ch >> 24); + *outp++ = (unsigned char)(ch >> 16); + *outp++ = (unsigned char)(ch >> 8); + *outp++ = (unsigned char) ch; + break; + } } restuple = Py_BuildValue("(On)", res, end); Py_DECREF(res); @@ -802,34 +901,64 @@ Py_UCS4 ch = 0; if (PyUnicodeDecodeError_GetStart(exc, &start)) return NULL; + if (PyUnicodeDecodeError_GetEnd(exc, &end)) + return NULL; if (!(object = PyUnicodeDecodeError_GetObject(exc))) return NULL; if (!(p = (unsigned char*)PyBytes_AsString(object))) { Py_DECREF(object); return NULL; } + if (!(encode = PyUnicodeDecodeError_GetEncoding(exc))) { + Py_DECREF(object); + return NULL; + } + if (!(encoding = PyUnicode_AsUTF8(encode))) { + Py_DECREF(object); + Py_DECREF(encode); + return NULL; + } + code = get_standard_encoding(encoding, &bytelength); + Py_DECREF(encode); + /* Try decoding a single surrogate character. If there are more, let the codec call us again. 
*/ p += start; - if (PyBytes_GET_SIZE(object) - start >= 3 && - (p[0] & 0xf0) == 0xe0 && - (p[1] & 0xc0) == 0x80 && - (p[2] & 0xc0) == 0x80) { - /* it's a three-byte code */ - ch = ((p[0] & 0x0f) << 12) + ((p[1] & 0x3f) << 6) + (p[2] & 0x3f); - if (!Py_UNICODE_IS_SURROGATE(ch)) - /* it's not a surrogate - fail */ - ch = 0; + if (PyBytes_GET_SIZE(object) - start >= bytelength) { + switch (code) { + case ENC_UTF8: + if ((p[0] & 0xf0) == 0xe0 && + (p[1] & 0xc0) == 0x80 && + (p[2] & 0xc0) == 0x80) { + /* it's a three-byte code */ + ch = ((p[0] & 0x0f) << 12) + ((p[1] & 0x3f) << 6) + (p[2] & 0x3f); + } + break; + case ENC_UTF16LE: + ch = p[1] << 8 | p[0]; + break; + case ENC_UTF16BE: + ch = p[0] << 8 | p[1]; + break; + case ENC_UTF32LE: + ch = (p[3] << 24) | (p[2] << 16) | (p[1] << 8) | p[0]; + break; + case ENC_UTF32BE: + ch = (p[0] << 24) | (p[1] << 16) | (p[2] << 8) | p[3]; + break; + } } + Py_DECREF(object); - if (ch == 0) { + if (!Py_UNICODE_IS_SURROGATE(ch)) { + /* it's not a surrogate - fail */ PyErr_SetObject(PyExceptionInstance_Class(exc), exc); return NULL; } res = PyUnicode_FromOrdinal(ch); if (res == NULL) return NULL; - return Py_BuildValue("(Nn)", res, start+3); + return Py_BuildValue("(Nn)", res, start + bytelength); } else { wrong_exception_type(exc); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 12:13:12 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 12:13:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319513=3A_Disable_?= =?utf-8?q?overallocation_of_the_PyUnicodeWriter_before_the_last?= Message-ID: <3dP4CD0DSwzSn4@mail.python.org> http://hg.python.org/cpython/rev/27461e6a7763 changeset: 87262:27461e6a7763 user: Victor Stinner date: Tue Nov 19 12:09:00 2013 +0100 summary: Issue #19513: Disable overallocation of the PyUnicodeWriter before the last write files: Objects/listobject.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -387,6 +387,7 @@ Py_DECREF(s); } + writer.overallocate = 0; if (_PyUnicodeWriter_WriteChar(&writer, ']') < 0) goto error; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 12:55:29 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 12:55:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_=5FPyUnicodeWriter=5FW?= =?utf-8?q?riteASCIIString=28=29_function?= Message-ID: <3dP5812b1Dz7Mgj@mail.python.org> http://hg.python.org/cpython/rev/d1ca05428c38 changeset: 87263:d1ca05428c38 user: Victor Stinner date: Tue Nov 19 12:54:53 2013 +0100 summary: Add _PyUnicodeWriter_WriteASCIIString() function files: Include/unicodeobject.h | 17 ++++- Objects/listobject.c | 9 +-- Objects/unicodeobject.c | 90 ++++++++++++++++++++----- Python/formatter_unicode.c | 22 +++--- 4 files changed, 98 insertions(+), 40 deletions(-) diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -962,12 +962,20 @@ Py_ssize_t end ); +/* Append a ASCII-encoded byte string. + Return 0 on success, raise an exception and return -1 on error. */ +PyAPI_FUNC(int) +_PyUnicodeWriter_WriteASCIIString(_PyUnicodeWriter *writer, + const char *str, /* ASCII-encoded byte string */ + Py_ssize_t len /* number of bytes, or -1 if unknown */ + ); + /* Append a latin1-encoded byte string. 
Return 0 on success, raise an exception and return -1 on error. */ PyAPI_FUNC(int) -_PyUnicodeWriter_WriteCstr(_PyUnicodeWriter *writer, - const char *str, /* latin1-encoded byte string */ - Py_ssize_t len /* length in bytes */ +_PyUnicodeWriter_WriteLatin1String(_PyUnicodeWriter *writer, + const char *str, /* latin1-encoded byte string */ + Py_ssize_t len /* length in bytes */ ); /* Get the value of the writer as an Unicode string. Clear the @@ -979,6 +987,9 @@ /* Deallocate memory of a writer (clear its internal buffer). */ PyAPI_FUNC(void) _PyUnicodeWriter_Dealloc(_PyUnicodeWriter *writer); + +PyAPI_FUNC(int) _PyObject_ReprWriter(_PyUnicodeWriter *writer, + PyObject *v); #endif #ifndef Py_LIMITED_API diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -339,19 +339,12 @@ { Py_ssize_t i; PyObject *s; - static PyObject *sep = NULL; _PyUnicodeWriter writer; if (Py_SIZE(v) == 0) { return PyUnicode_FromString("[]"); } - if (sep == NULL) { - sep = PyUnicode_FromString(", "); - if (sep == NULL) - return NULL; - } - i = Py_ReprEnter((PyObject*)v); if (i != 0) { return i > 0 ? PyUnicode_FromString("[...]") : NULL; @@ -369,7 +362,7 @@ so must refetch the list size on each iteration. */ for (i = 0; i < Py_SIZE(v); ++i) { if (i > 0) { - if (_PyUnicodeWriter_WriteStr(&writer, sep) < 0) + if (_PyUnicodeWriter_WriteASCIIString(&writer, ", ", 2) < 0) goto error; } diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -140,9 +140,9 @@ buffer where the result characters are written to. */ #define _PyUnicode_CONVERT_BYTES(from_type, to_type, begin, end, to) \ do { \ - to_type *_to = (to_type *) to; \ - const from_type *_iter = (begin); \ - const from_type *_end = (end); \ + to_type *_to = (to_type *)(to); \ + const from_type *_iter = (from_type *)(begin); \ + const from_type *_end = (from_type *)(end); \ Py_ssize_t n = (_end) - (_iter); \ const from_type *_unrolled_end = \ _iter + _Py_SIZE_ROUND_DOWN(n, 4); \ @@ -2562,7 +2562,6 @@ precision = len; arglen = Py_MAX(precision, width); - assert(ucs1lib_find_max_char((Py_UCS1*)buffer, (Py_UCS1*)buffer + len) <= 127); if (_PyUnicodeWriter_Prepare(writer, arglen, 127) == -1) return NULL; @@ -2581,8 +2580,8 @@ writer->pos += fill; } - unicode_write_cstr(writer->buffer, writer->pos, buffer, len); - writer->pos += len; + if (_PyUnicodeWriter_WriteASCIIString(writer, buffer, len) < 0) + return NULL; break; } @@ -2604,11 +2603,8 @@ len += 2; } - assert(ucs1lib_find_max_char((Py_UCS1*)number, (Py_UCS1*)number + len) <= 127); - if (_PyUnicodeWriter_Prepare(writer, len, 127) == -1) + if (_PyUnicodeWriter_WriteASCIIString(writer, number, len) < 0) return NULL; - unicode_write_cstr(writer->buffer, writer->pos, number, len); - writer->pos += len; break; } @@ -2707,7 +2703,7 @@ skip the code, since there's no way to know what's in the argument list) */ len = strlen(p); - if (_PyUnicodeWriter_WriteCstr(writer, p, len) == -1) + if (_PyUnicodeWriter_WriteLatin1String(writer, p, len) == -1) return NULL; f = p+len; return f; @@ -2759,10 +2755,9 @@ if (*p == '\0') writer.overallocate = 0; - if (_PyUnicodeWriter_Prepare(&writer, len, 127) == -1) + + if (_PyUnicodeWriter_WriteASCIIString(&writer, f, len) < 0) goto fail; - unicode_write_cstr(writer.buffer, writer.pos, f, len); - writer.pos += len; f = p; } @@ -13461,7 +13456,68 @@ } int -_PyUnicodeWriter_WriteCstr(_PyUnicodeWriter *writer, const char *str, Py_ssize_t len) 
+_PyUnicodeWriter_WriteASCIIString(_PyUnicodeWriter *writer, + const char *ascii, Py_ssize_t len) +{ + if (len == -1) + len = strlen(ascii); + + assert(ucs1lib_find_max_char((Py_UCS1*)ascii, (Py_UCS1*)ascii + len) < 128); + + if (writer->buffer == NULL && !writer->overallocate) { + PyObject *str; + + str = _PyUnicode_FromASCII(ascii, len); + if (str == NULL) + return -1; + + writer->readonly = 1; + writer->buffer = str; + _PyUnicodeWriter_Update(writer); + writer->pos += len; + return 0; + } + + if (_PyUnicodeWriter_Prepare(writer, len, 127) == -1) + return -1; + + switch (writer->kind) + { + case PyUnicode_1BYTE_KIND: + { + const Py_UCS1 *str = (const Py_UCS1 *)ascii; + Py_UCS1 *data = writer->data; + + Py_MEMCPY(data + writer->pos, str, len); + break; + } + case PyUnicode_2BYTE_KIND: + { + _PyUnicode_CONVERT_BYTES( + Py_UCS1, Py_UCS2, + ascii, ascii + len, + (Py_UCS2 *)writer->data + writer->pos); + break; + } + case PyUnicode_4BYTE_KIND: + { + _PyUnicode_CONVERT_BYTES( + Py_UCS1, Py_UCS4, + ascii, ascii + len, + (Py_UCS4 *)writer->data + writer->pos); + break; + } + default: + assert(0); + } + + writer->pos += len; + return 0; +} + +int +_PyUnicodeWriter_WriteLatin1String(_PyUnicodeWriter *writer, + const char *str, Py_ssize_t len) { Py_UCS4 maxchar; @@ -13828,12 +13884,10 @@ return -1; len = strlen(p); if (writer) { - if (_PyUnicodeWriter_Prepare(writer, len, 127) == -1) { + if (_PyUnicodeWriter_WriteASCIIString(writer, p, len) < 0) { PyMem_Free(p); return -1; } - unicode_write_cstr(writer->buffer, writer->pos, p, len); - writer->pos += len; } else *p_output = _PyUnicode_FromASCII(p, len); diff --git a/Python/formatter_unicode.c b/Python/formatter_unicode.c --- a/Python/formatter_unicode.c +++ b/Python/formatter_unicode.c @@ -1053,6 +1053,17 @@ n_digits += 1; } + if (format->sign != '+' && format->sign != ' ' + && format->width == -1 + && format->type != 'n' + && !format->thousands_separators) + { + /* Fast path */ + result = _PyUnicodeWriter_WriteASCIIString(writer, buf, n_digits); + PyMem_Free(buf); + return result; + } + /* Since there is no unicode version of PyOS_double_to_string, just use the 8 bit version and then convert to unicode. */ unicode_tmp = _PyUnicode_FromASCII(buf, n_digits); @@ -1060,17 +1071,6 @@ if (unicode_tmp == NULL) goto done; - if (format->sign != '+' && format->sign != ' ' - && format->width == -1 - && format->type != 'n' - && !format->thousands_separators) - { - /* Fast path */ - result = _PyUnicodeWriter_WriteStr(writer, unicode_tmp); - Py_DECREF(unicode_tmp); - return result; - } - /* Is a sign character present in the output? 
If so, remember it and skip it */ index = 0; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 13:00:17 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 13:00:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319513=3A_repr=28t?= =?utf-8?q?uple=29_now_uses_=5FPyUnicodeWriter_for_better_performances?= Message-ID: <3dP5FY3YvTzRvP@mail.python.org> http://hg.python.org/cpython/rev/99141ab08e21 changeset: 87264:99141ab08e21 user: Victor Stinner date: Tue Nov 19 12:59:46 2013 +0100 summary: Issue #19513: repr(tuple) now uses _PyUnicodeWriter for better performances files: Objects/tupleobject.c | 66 +++++++++++++++++------------- 1 files changed, 38 insertions(+), 28 deletions(-) diff --git a/Objects/tupleobject.c b/Objects/tupleobject.c --- a/Objects/tupleobject.c +++ b/Objects/tupleobject.c @@ -255,20 +255,12 @@ tuplerepr(PyTupleObject *v) { Py_ssize_t i, n; - PyObject *s = NULL; - _PyAccu acc; - static PyObject *sep = NULL; + _PyUnicodeWriter writer; n = Py_SIZE(v); if (n == 0) return PyUnicode_FromString("()"); - if (sep == NULL) { - sep = PyUnicode_FromString(", "); - if (sep == NULL) - return NULL; - } - /* While not mutable, it is still possible to end up with a cycle in a tuple through an object that stores itself within a tuple (and thus infinitely asks for the repr of itself). This should only be @@ -278,40 +270,58 @@ return i > 0 ? PyUnicode_FromString("(...)") : NULL; } - if (_PyAccu_Init(&acc)) + _PyUnicodeWriter_Init(&writer); + writer.overallocate = 1; + if (Py_SIZE(v) > 1) { + /* "(" + "1" + ", 2" * (len - 1) + ")" */ + writer.min_length = 1 + 1 + (2 + 1) * (Py_SIZE(v) - 1) + 1; + } + else { + /* "(1,)" */ + writer.min_length = 4; + } + + if (_PyUnicodeWriter_WriteChar(&writer, '(') < 0) goto error; - s = PyUnicode_FromString("("); - if (s == NULL || _PyAccu_Accumulate(&acc, s)) - goto error; - Py_CLEAR(s); - /* Do repr() on each element. 
*/ for (i = 0; i < n; ++i) { + PyObject *s; + + if (i > 0) { + if (_PyUnicodeWriter_WriteASCIIString(&writer, ", ", 2) < 0) + goto error; + } + if (Py_EnterRecursiveCall(" while getting the repr of a tuple")) goto error; s = PyObject_Repr(v->ob_item[i]); Py_LeaveRecursiveCall(); - if (i > 0 && _PyAccu_Accumulate(&acc, sep)) + if (s == NULL) goto error; - if (s == NULL || _PyAccu_Accumulate(&acc, s)) + + if (_PyUnicodeWriter_WriteStr(&writer, s) < 0) { + Py_DECREF(s); goto error; - Py_CLEAR(s); + } + Py_DECREF(s); } - if (n > 1) - s = PyUnicode_FromString(")"); - else - s = PyUnicode_FromString(",)"); - if (s == NULL || _PyAccu_Accumulate(&acc, s)) - goto error; - Py_CLEAR(s); + + writer.overallocate = 0; + if (n > 1) { + if (_PyUnicodeWriter_WriteChar(&writer, ')') < 0) + goto error; + } + else { + if (_PyUnicodeWriter_WriteASCIIString(&writer, ",)", 2) < 0) + goto error; + } Py_ReprLeave((PyObject *)v); - return _PyAccu_Finish(&acc); + return _PyUnicodeWriter_Finish(&writer); error: - _PyAccu_Destroy(&acc); - Py_XDECREF(s); + _PyUnicodeWriter_Dealloc(&writer); Py_ReprLeave((PyObject *)v); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 13:11:32 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 13:11:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319646=3A_repr=28d?= =?utf-8?q?ict=29_now_uses_=5FPyUnicodeWriter_API_for_better_performances?= Message-ID: <3dP5VX2Y7szR4Y@mail.python.org> http://hg.python.org/cpython/rev/3a354b879d1f changeset: 87265:3a354b879d1f user: Victor Stinner date: Tue Nov 19 13:07:38 2013 +0100 summary: Issue #19646: repr(dict) now uses _PyUnicodeWriter API for better performances files: Objects/dictobject.c | 109 +++++++++++++++--------------- 1 files changed, 54 insertions(+), 55 deletions(-) diff --git a/Objects/dictobject.c b/Objects/dictobject.c --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -1397,9 +1397,9 @@ dict_repr(PyDictObject *mp) { Py_ssize_t i; - PyObject *s, *temp, *colon = NULL; - PyObject *pieces = NULL, *result = NULL; - PyObject *key, *value; + PyObject *key = NULL, *value = NULL; + _PyUnicodeWriter writer; + int first; i = Py_ReprEnter((PyObject *)mp); if (i != 0) { @@ -1407,74 +1407,73 @@ } if (mp->ma_used == 0) { - result = PyUnicode_FromString("{}"); - goto Done; + Py_ReprLeave((PyObject *)mp); + return PyUnicode_FromString("{}"); } - pieces = PyList_New(0); - if (pieces == NULL) - goto Done; - - colon = PyUnicode_FromString(": "); - if (colon == NULL) - goto Done; + _PyUnicodeWriter_Init(&writer); + writer.overallocate = 1; + /* "{" + "1: 2" + ", 3: 4" * (len - 1) + "}" */ + writer.min_length = 1 + 4 + (2 + 4) * (mp->ma_used - 1) + 1; + + if (_PyUnicodeWriter_WriteChar(&writer, '{') < 0) + goto error; /* Do repr() on each key+value pair, and insert ": " between them. Note that repr may mutate the dict. */ i = 0; + first = 1; while (PyDict_Next((PyObject *)mp, &i, &key, &value)) { - int status; + PyObject *s; + int res; + /* Prevent repr from deleting key or value during key format. 
*/ Py_INCREF(key); Py_INCREF(value); + + if (!first) { + if (_PyUnicodeWriter_WriteASCIIString(&writer, ", ", 2) < 0) + goto error; + } + first = 0; + s = PyObject_Repr(key); - PyUnicode_Append(&s, colon); if (s == NULL) - goto Done; - - PyUnicode_AppendAndDel(&s, PyObject_Repr(value)); - Py_DECREF(key); - Py_DECREF(value); + goto error; + res = _PyUnicodeWriter_WriteStr(&writer, s); + Py_DECREF(s); + if (res < 0) + goto error; + + if (_PyUnicodeWriter_WriteASCIIString(&writer, ": ", 2) < 0) + goto error; + + s = PyObject_Repr(value); if (s == NULL) - goto Done; - status = PyList_Append(pieces, s); - Py_DECREF(s); /* append created a new ref */ - if (status < 0) - goto Done; + goto error; + res = _PyUnicodeWriter_WriteStr(&writer, s); + Py_DECREF(s); + if (res < 0) + goto error; + + Py_CLEAR(key); + Py_CLEAR(value); } - /* Add "{}" decorations to the first and last items. */ - assert(PyList_GET_SIZE(pieces) > 0); - s = PyUnicode_FromString("{"); - if (s == NULL) - goto Done; - temp = PyList_GET_ITEM(pieces, 0); - PyUnicode_AppendAndDel(&s, temp); - PyList_SET_ITEM(pieces, 0, s); - if (s == NULL) - goto Done; - - s = PyUnicode_FromString("}"); - if (s == NULL) - goto Done; - temp = PyList_GET_ITEM(pieces, PyList_GET_SIZE(pieces) - 1); - PyUnicode_AppendAndDel(&temp, s); - PyList_SET_ITEM(pieces, PyList_GET_SIZE(pieces) - 1, temp); - if (temp == NULL) - goto Done; - - /* Paste them all together with ", " between. */ - s = PyUnicode_FromString(", "); - if (s == NULL) - goto Done; - result = PyUnicode_Join(s, pieces); - Py_DECREF(s); - -Done: - Py_XDECREF(pieces); - Py_XDECREF(colon); + writer.overallocate = 0; + if (_PyUnicodeWriter_WriteChar(&writer, '}') < 0) + goto error; + Py_ReprLeave((PyObject *)mp); - return result; + + return _PyUnicodeWriter_Finish(&writer); + +error: + Py_ReprLeave((PyObject *)mp); + _PyUnicodeWriter_Dealloc(&writer); + Py_XDECREF(key); + Py_XDECREF(value); + return NULL; } static Py_ssize_t -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 13:20:19 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 13:20:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_oops=2C_remove_=5FPyObject?= =?utf-8?q?=5FReprWriter=28=29_definition_=28unwanted_change=29?= Message-ID: <3dP5hg1wtZzRmv@mail.python.org> http://hg.python.org/cpython/rev/b0eb26a44aff changeset: 87266:b0eb26a44aff user: Victor Stinner date: Tue Nov 19 13:18:45 2013 +0100 summary: oops, remove _PyObject_ReprWriter() definition (unwanted change) files: Include/unicodeobject.h | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -987,9 +987,6 @@ /* Deallocate memory of a writer (clear its internal buffer). 
*/ PyAPI_FUNC(void) _PyUnicodeWriter_Dealloc(_PyUnicodeWriter *writer); - -PyAPI_FUNC(int) _PyObject_ReprWriter(_PyUnicodeWriter *writer, - PyObject *v); #endif #ifndef Py_LIMITED_API -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 13:34:42 2013 From: python-checkins at python.org (nick.coghlan) Date: Tue, 19 Nov 2013 13:34:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Also_chain_codec_exception?= =?utf-8?q?s_that_allow_weakrefs?= Message-ID: <3dP61G2Xs5zSFM@mail.python.org> http://hg.python.org/cpython/rev/04e1f701aeaa changeset: 87267:04e1f701aeaa user: Nick Coghlan date: Tue Nov 19 22:33:10 2013 +1000 summary: Also chain codec exceptions that allow weakrefs The zlib and hex codecs throw custom exception types with weakref support if the input type is valid, but the data fails validation. Make sure the exception chaining in the codec infrastructure can wrap those as well. files: Lib/test/test_codecs.py | 41 +++++++++++++++++++++++++--- Objects/exceptions.c | 17 +++++++++-- 2 files changed, 50 insertions(+), 8 deletions(-) diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -2402,6 +2402,25 @@ self.assertTrue(isinstance(failure.exception.__cause__, AttributeError)) + def test_custom_zlib_error_is_wrapped(self): + # Check zlib codec gives a good error for malformed input + msg = "^decoding with 'zlib_codec' codec failed" + with self.assertRaisesRegex(Exception, msg) as failure: + b"hello".decode("zlib_codec") + self.assertTrue(isinstance(failure.exception.__cause__, + type(failure.exception))) + + def test_custom_hex_error_is_wrapped(self): + # Check hex codec gives a good error for malformed input + msg = "^decoding with 'hex_codec' codec failed" + with self.assertRaisesRegex(Exception, msg) as failure: + b"hello".decode("hex_codec") + self.assertTrue(isinstance(failure.exception.__cause__, + type(failure.exception))) + + # Unfortunately, the bz2 module throws OSError, which the codec + # machinery currently can't wrap :( + def test_bad_decoding_output_type(self): # Check bytes.decode and bytearray.decode give a good error # message for binary -> binary codecs @@ -2466,15 +2485,15 @@ with self.assertRaisesRegex(exc_type, full_msg) as caught: yield caught - def check_wrapped(self, obj_to_raise, msg): + def check_wrapped(self, obj_to_raise, msg, exc_type=RuntimeError): self.set_codec(obj_to_raise) - with self.assertWrapped("encoding", RuntimeError, msg): + with self.assertWrapped("encoding", exc_type, msg): "str_input".encode(self.codec_name) - with self.assertWrapped("encoding", RuntimeError, msg): + with self.assertWrapped("encoding", exc_type, msg): codecs.encode("str_input", self.codec_name) - with self.assertWrapped("decoding", RuntimeError, msg): + with self.assertWrapped("decoding", exc_type, msg): b"bytes input".decode(self.codec_name) - with self.assertWrapped("decoding", RuntimeError, msg): + with self.assertWrapped("decoding", exc_type, msg): codecs.decode(b"bytes input", self.codec_name) def test_raise_by_type(self): @@ -2484,6 +2503,18 @@ msg = "This should be wrapped" self.check_wrapped(RuntimeError(msg), msg) + def test_raise_grandchild_subclass_exact_size(self): + msg = "This should be wrapped" + class MyRuntimeError(RuntimeError): + __slots__ = () + self.check_wrapped(MyRuntimeError(msg), msg, MyRuntimeError) + + def test_raise_subclass_with_weakref_support(self): + msg = "This should be wrapped" + class MyRuntimeError(RuntimeError): + 
pass + self.check_wrapped(MyRuntimeError(msg), msg, MyRuntimeError) + @contextlib.contextmanager def assertNotWrapped(self, operation, exc_type, msg_re, msg=None): if msg is None: diff --git a/Objects/exceptions.c b/Objects/exceptions.c --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -2630,16 +2630,27 @@ PyTypeObject *caught_type; PyObject **dictptr; PyObject *instance_args; - Py_ssize_t num_args; + Py_ssize_t num_args, caught_type_size, base_exc_size; PyObject *new_exc, *new_val, *new_tb; va_list vargs; + int same_basic_size; PyErr_Fetch(&exc, &val, &tb); caught_type = (PyTypeObject *)exc; - /* Ensure type info indicates no extra state is stored at the C level */ + /* Ensure type info indicates no extra state is stored at the C level + * and that the type can be reinstantiated using PyErr_Format + */ + caught_type_size = caught_type->tp_basicsize; + base_exc_size = _PyExc_BaseException.tp_basicsize; + same_basic_size = ( + caught_type_size == base_exc_size || + (PyType_SUPPORTS_WEAKREFS(caught_type) && + (caught_type_size == base_exc_size + sizeof(PyObject *)) + ) + ); if (caught_type->tp_init != (initproc)BaseException_init || caught_type->tp_new != BaseException_new || - caught_type->tp_basicsize != _PyExc_BaseException.tp_basicsize || + !same_basic_size || caught_type->tp_itemsize != _PyExc_BaseException.tp_itemsize) { /* We can't be sure we can wrap this safely, since it may contain * more state than just the exception type. Accordingly, we just -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 14:56:12 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 19 Nov 2013 14:56:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_dead_code_committed?= =?utf-8?q?_in_issue_=2312892=2E?= Message-ID: <3dP7qJ12bhz7M2t@mail.python.org> http://hg.python.org/cpython/rev/130597102dac changeset: 87268:130597102dac user: Serhiy Storchaka date: Tue Nov 19 15:56:05 2013 +0200 summary: Remove dead code committed in issue #12892. 
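Issue #12892 concerned lone surrogates in the UTF-16 and UTF-32 codecs, the same area covered by the surrogatepass changes earlier in this series (get_standard_encoding() and the per-encoding branches in Python/codecs.c). A minimal sketch of the resulting Python-level behaviour, assuming an interpreter that already includes those changes (3.4+):

    # Round-trip a lone surrogate through UTF-16 with the "surrogatepass"
    # error handler; before these changes it only worked for UTF-8.
    lone = '\ud83d'                                   # unpaired high surrogate
    data = lone.encode('utf-16-le', 'surrogatepass')
    print(data)                                       # b'=\xd8' (0xD83D, little endian)
    print(data.decode('utf-16-le', 'surrogatepass') == lone)   # True

    # The default strict handler still rejects lone surrogates:
    try:
        lone.encode('utf-16-le')
    except UnicodeEncodeError as err:
        print(err.reason)                             # surrogates not allowed
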
files: Objects/stringlib/codecs.h | 104 ------------------------- 1 files changed, 0 insertions(+), 104 deletions(-) diff --git a/Objects/stringlib/codecs.h b/Objects/stringlib/codecs.h --- a/Objects/stringlib/codecs.h +++ b/Objects/stringlib/codecs.h @@ -598,110 +598,6 @@ #if STRINGLIB_MAX_CHAR >= 0x80 Py_LOCAL_INLINE(Py_ssize_t) -STRINGLIB(utf16_encode_)(const STRINGLIB_CHAR *in, - Py_ssize_t len, - unsigned short **outptr, - int native_ordering) -{ - unsigned short *out = *outptr; - const STRINGLIB_CHAR *end = in + len; -#if STRINGLIB_SIZEOF_CHAR == 1 -# define SWAB2(CH) ((CH) << 8) -#else -# define SWAB2(CH) (((CH) << 8) | ((CH) >> 8)) -#endif - if (native_ordering) { -#if STRINGLIB_MAX_CHAR < 0x10000 - const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); - while (in < unrolled_end) { -# if STRINGLIB_MAX_CHAR >= 0xd800 - if (((in[0] ^ 0xd800) & - (in[1] ^ 0xd800) & - (in[2] ^ 0xd800) & - (in[3] ^ 0xd800) & 0xf800) == 0) - break; -# endif - out[0] = in[0]; - out[1] = in[1]; - out[2] = in[2]; - out[3] = in[3]; - in += 4; out += 4; - } -#endif - while (in < end) { - Py_UCS4 ch; - ch = *in++; -#if STRINGLIB_MAX_CHAR >= 0xd800 - if (ch < 0xd800) - *out++ = ch; - else if (ch < 0xe000) - /* reject surrogate characters (U+DC800-U+DFFF) */ - goto fail; -# if STRINGLIB_MAX_CHAR >= 0x10000 - else if (ch >= 0x10000) { - out[0] = Py_UNICODE_HIGH_SURROGATE(ch); - out[1] = Py_UNICODE_LOW_SURROGATE(ch); - out += 2; - } -# endif - else -#endif - *out++ = ch; - } - } else { -#if STRINGLIB_MAX_CHAR < 0x10000 - const STRINGLIB_CHAR *unrolled_end = in + _Py_SIZE_ROUND_DOWN(len, 4); - while (in < unrolled_end) { -# if STRINGLIB_MAX_CHAR >= 0xd800 - if (((in[0] ^ 0xd800) & - (in[1] ^ 0xd800) & - (in[2] ^ 0xd800) & - (in[3] ^ 0xd800) & 0xf800) == 0) - break; -# endif - out[0] = SWAB2(in[0]); - out[1] = SWAB2(in[1]); - out[2] = SWAB2(in[2]); - out[3] = SWAB2(in[3]); - in += 4; out += 4; - } -#endif - while (in < end) { - Py_UCS4 ch = *in++; -#if STRINGLIB_MAX_CHAR >= 0xd800 - if (ch < 0xd800) - *out++ = SWAB2((Py_UCS2)ch); - else if (ch < 0xe000) - /* reject surrogate characters (U+DC800-U+DFFF) */ - goto fail; -# if STRINGLIB_MAX_CHAR >= 0x10000 - else if (ch >= 0x10000) { - Py_UCS2 ch1 = Py_UNICODE_HIGH_SURROGATE(ch); - Py_UCS2 ch2 = Py_UNICODE_LOW_SURROGATE(ch); - out[0] = SWAB2(ch1); - out[1] = SWAB2(ch2); - out += 2; - } -# endif - else -#endif - *out++ = SWAB2((Py_UCS2)ch); - } - } - *outptr = out; - return len; -#if STRINGLIB_MAX_CHAR >= 0xd800 - fail: -#endif - *outptr = out; - return len - (end - in + 1); -} -#endif - -#undef SWAB2 - -#if STRINGLIB_MAX_CHAR >= 0x80 -Py_LOCAL_INLINE(Py_ssize_t) STRINGLIB(utf16_encode)(const STRINGLIB_CHAR *in, Py_ssize_t len, unsigned short **outptr, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 16:40:51 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 16:40:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Replace_=22NT_path=22_with_?= =?utf-8?q?=22Windows_path=22?= Message-ID: <3dPB832Frhz7M4J@mail.python.org> http://hg.python.org/peps/rev/19a4004e5c2b changeset: 5289:19a4004e5c2b user: Antoine Pitrou date: Tue Nov 19 16:40:46 2013 +0100 summary: Replace "NT path" with "Windows path" files: pep-0428.txt | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -127,9 +127,9 @@ of pure classes, plus operations that do I/O. 
* a path class is of a given flavour according to the kind of operating - system paths it represents. `pathlib`_ implements two flavours: NT paths - for the filesystem semantics embodied in Windows systems, POSIX paths for - other systems (``os.name``'s terminology is re-used here). + system paths it represents. `pathlib`_ implements two flavours: Windows + paths for the filesystem semantics embodied in Windows systems, POSIX + paths for other systems. Any pure class can be instantiated on any system: for example, you can manipulate ``PurePosixPath`` objects under Windows, ``PureWindowsPath`` -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 16:45:51 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 16:45:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Replace_=22seven=22_with_=22s?= =?utf-8?q?everal=22_=3A=29?= Message-ID: <3dPBFq6LP4zRmv@mail.python.org> http://hg.python.org/peps/rev/1ed43b2af279 changeset: 5290:1ed43b2af279 user: Antoine Pitrou date: Tue Nov 19 16:45:47 2013 +0100 summary: Replace "seven" with "several" :) files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -368,7 +368,7 @@ Properties ---------- -Seven simple properties are provided on every path (each can be empty):: +Several simple properties are provided on every path (each can be empty):: >>> p = PureWindowsPath('c:/Downloads/pathlib.tar.gz') >>> p.drive -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 16:52:12 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 16:52:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_s/is=5Fsock/is=5Fsocket/?= Message-ID: <3dPBP85RLdz7M5L@mail.python.org> http://hg.python.org/peps/rev/7e2389da5ce2 changeset: 5291:7e2389da5ce2 user: Antoine Pitrou date: Tue Nov 19 16:52:03 2013 +0100 summary: s/is_sock/is_socket/ files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -584,7 +584,7 @@ False >>> p.is_symlink() False - >>> p.is_sock() + >>> p.is_socket() False >>> p.is_fifo() False -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 17:03:07 2013 From: python-checkins at python.org (martin.v.loewis) Date: Tue, 19 Nov 2013 17:03:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319550=3A_Implemen?= =?utf-8?q?t_Windows_installer_changes_of_PEP_453_=28ensurepip=29=2E?= Message-ID: <3dPBdl2PKtz7M3k@mail.python.org> http://hg.python.org/cpython/rev/e0c4a5b2b739 changeset: 87269:e0c4a5b2b739 user: Martin v. L?wis date: Tue Nov 19 17:02:36 2013 +0100 summary: Issue #19550: Implement Windows installer changes of PEP 453 (ensurepip). files: Misc/NEWS | 2 ++ Tools/msi/msi.py | 29 ++++++++++++++++++++++++----- 2 files changed, 26 insertions(+), 5 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -293,6 +293,8 @@ Build ----- +- Issue #19550: Implement Windows installer changes of PEP 453 (ensurepip). + - Issue #19520: Fix compiler warning in the _sha3 module on 32bit Windows. 
- Issue #19356: Avoid using a C variabled named "_self", it's a reserved diff --git a/Tools/msi/msi.py b/Tools/msi/msi.py --- a/Tools/msi/msi.py +++ b/Tools/msi/msi.py @@ -420,6 +420,8 @@ compileargs = r'-Wi "[TARGETDIR]Lib\compileall.py" -f -x "bad_coding|badsyntax|site-packages|py2_|lib2to3\\tests|venv\\scripts" "[TARGETDIR]Lib"' lib2to3args = r'-c "import lib2to3.pygram, lib2to3.patcomp;lib2to3.patcomp.PatternCompiler()"' + updatepipargs = r'-m ensurepip -U' + removepipargs = r'-m ensurepip -r' # does not yet work # See "CustomAction Table" add_data(db, "CustomAction", [ # msidbCustomActionTypeFirstSequence + msidbCustomActionTypeTextData + msidbCustomActionTypeProperty @@ -436,6 +438,9 @@ ("CompilePyc", 18, "python.exe", compileargs), ("CompilePyo", 18, "python.exe", "-O "+compileargs), ("CompileGrammar", 18, "python.exe", lib2to3args), + # msidbCustomActionTypeInScript (1024); run during actual installation + ("UpdatePip", 18+1024, "python.exe", updatepipargs), + #("RemovePip", 18, "python.exe", removepipargs), ]) # UI Sequences, see "InstallUISequence Table", "Using a Sequence Table" @@ -462,7 +467,7 @@ # Prepend TARGETDIR to the system path, and remove it on uninstall. add_data(db, "Environment", - [("PathAddition", "=-*Path", "[TARGETDIR];[~]", "REGISTRY.path")]) + [("PathAddition", "=-*Path", "[TARGETDIR];[TARGETDIR]Scripts;[~]", "REGISTRY.path")]) # Execute Sequences add_data(db, "InstallExecuteSequence", @@ -472,6 +477,12 @@ ("SetLauncherDirToWindows", 'LAUNCHERDIR="" and ' + sys32cond, 753), ("SetLauncherDirToTarget", 'LAUNCHERDIR="" and not ' + sys32cond, 754), ("UpdateEditIDLE", None, 1050), + # run command if install state of pip changes to INSTALLSTATE_LOCAL + # run after InstallFiles + ("UpdatePip", "&pip=3", 4001), + # remove pip when state changes to INSTALLSTATE_ABSENT + # run before RemoveFiles + #("RemovePip", "&pip=2", 3499), ("CompilePyc", "COMPILEALL", 6800), ("CompilePyo", "COMPILEALL", 6801), ("CompileGrammar", "COMPILEALL", 6802), @@ -751,7 +762,8 @@ advanced = PyDialog(db, "AdvancedDlg", x, y, w, h, modal, title, "CompilePyc", "Ok", "Ok") advanced.title("Advanced Options for [ProductName]") - # A radio group with two options: allusers, justme + + # A checkbox whether to build pyc files advanced.checkbox("CompilePyc", 135, 60, 230, 50, 3, "COMPILEALL", "Compile .py files to byte code after installation", "Ok") @@ -848,7 +860,8 @@ # (i.e. additional Python libraries) need to follow the parent feature. # Features that have no advertisement trigger (e.g. 
the test suite) # must not support advertisement - global default_feature, tcltk, htmlfiles, tools, testsuite, ext_feature, private_crt, prepend_path + global default_feature, tcltk, htmlfiles, tools, testsuite + global ext_feature, private_crt, prepend_path, update_pip default_feature = Feature(db, "DefaultFeature", "Python", "Python Interpreter and Libraries", 1, directory = "TARGETDIR") @@ -870,8 +883,14 @@ tools = Feature(db, "Tools", "Utility Scripts", "Python utility scripts (Tools/)", 9, parent = default_feature, attributes=2) + # pip installation isn't enabled by default until a clean uninstall procedure + # becomes possible + update_pip = Feature(db, "pip", "pip", + "Install (or upgrade from an earlier version) pip, " + "a tool for installing and managing Python packages.", 11, + parent = default_feature, attributes=2|8, level=2) testsuite = Feature(db, "Testsuite", "Test suite", - "Python test suite (Lib/test/)", 11, + "Python test suite (Lib/test/)", 13, parent = default_feature, attributes=2|8) # prepend_path is an additional feature which is to be off by default. # Since the default level for the above features is 1, this needs to be @@ -879,7 +898,7 @@ prepend_path = Feature(db, "PrependPath", "Add python.exe to Path", "Prepend [TARGETDIR] to the system Path variable. " "This allows you to type 'python' into a command " - "prompt without needing the full path.", 13, + "prompt without needing the full path.", 15, parent = default_feature, attributes=2|8, level=2) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 17:05:41 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 19 Nov 2013 17:05:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzEwOTg3NDk6IHJl?= =?utf-8?q?-word_gettext_docs_to_not_encourage_using_pygettext_so_much=2E?= Message-ID: <3dPBhj4XfPz7M3k@mail.python.org> http://hg.python.org/cpython/rev/4fe87b5df2d0 changeset: 87270:4fe87b5df2d0 branch: 3.3 parent: 87259:26c73799715f user: Andrew Kuchling date: Tue Nov 19 11:05:20 2013 -0500 summary: #1098749: re-word gettext docs to not encourage using pygettext so much. Also, add a link to the Babel package. files: Doc/library/gettext.rst | 97 ++++++++++++++-------------- 1 files changed, 48 insertions(+), 49 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -94,9 +94,10 @@ The Plural formula is taken from the catalog header. It is a C or Python expression that has a free variable *n*; the expression evaluates to the index - of the plural in the catalog. See the GNU gettext documentation for the precise - syntax to be used in :file:`.po` files and the formulas for a variety of - languages. + of the plural in the catalog. See + `the GNU gettext documentation `__ + for the precise syntax to be used in :file:`.po` files and the + formulas for a variety of languages. .. function:: lngettext(singular, plural, n) @@ -451,35 +452,42 @@ In this example, the string ``'writing a log message'`` is marked as a candidate for translation, while the strings ``'mylog.txt'`` and ``'w'`` are not. -The Python distribution comes with two tools which help you generate the message -catalogs once you've prepared your source code. These may or may not be -available from a binary distribution, but they can be found in a source -distribution, in the :file:`Tools/i18n` directory. +There are a few tools to extract the strings meant for translation. 
+The original GNU :program:`gettext` only supported C or C++ source +code but its extended version :program:`xgettext` scans code written +in a number of languages, including Python, to find strings marked as +translatable. `Babel `__ is a Python +internationalization library that includes a :file:`pybabel` script to +extract and compile message catalogs. Fran?ois Pinard's program +called :program:`xpot` does a similar job and is available as part of +his `po-utils package `__. -The :program:`pygettext` [#]_ program scans all your Python source code looking -for the strings you previously marked as translatable. It is similar to the GNU -:program:`gettext` program except that it understands all the intricacies of -Python source code, but knows nothing about C or C++ source code. You don't -need GNU ``gettext`` unless you're also going to be translating C code (such as -C extension modules). +(Python also includes pure-Python versions of these programs, called +:program:`pygettext.py` and :program:`msgfmt.py`; some Python distributions +will install them for you. :program:`pygettext.py` is similar to +:program:`xgettext`, but only understands Python source code and +cannot handle other programming languages such as C or C++. +:program:`pygettext.py` supports a command-line interface similar to +:program:`xgettext`; for details on its use, run ``pygettext.py +--help``. :program:`msgfmt.py` is binary compatible with GNU +:program:`msgfmt`. With these two programs, you may not need the GNU +:program:`gettext` package to internationalize your Python +applications.) -:program:`pygettext` generates textual Uniforum-style human readable message -catalog :file:`.pot` files, essentially structured human readable files which -contain every marked string in the source code, along with a placeholder for the -translation strings. :program:`pygettext` is a command line script that supports -a similar command line interface as :program:`xgettext`; for details on its use, -run:: +:program:`xgettext`, :program:`pygettext`, and similar tools generate +:file:`.po` files that are message catalogs. They are structured +:human-readable files that contain every marked string in the source +:code, along with a placeholder for the translated versions of these +:strings. - pygettext.py --help - -Copies of these :file:`.pot` files are then handed over to the individual human -translators who write language-specific versions for every supported natural -language. They send you back the filled in language-specific versions as a -:file:`.po` file. Using the :program:`msgfmt.py` [#]_ program (in the -:file:`Tools/i18n` directory), you take the :file:`.po` files from your -translators and generate the machine-readable :file:`.mo` binary catalog files. -The :file:`.mo` files are what the :mod:`gettext` module uses for the actual -translation processing during run-time. +Copies of these :file:`.po` files are then handed over to the +individual human translators who write translations for every +supported natural language. They send back the completed +language-specific versions as a :file:`.po` file that's +compiled into a machine-readable :file:`.mo` binary catalog file using +the :program:`msgfmt` program. The :file:`.mo` files are used by the +:mod:`gettext` module for the actual translation processing at +run-time. How you use the :mod:`gettext` module in your code depends on whether you are internationalizing a single module or your entire application. 
The next two @@ -517,7 +525,7 @@ import gettext gettext.install('myapplication') -If you need to set the locale directory, you can pass these into the +If you need to set the locale directory, you can pass it into the :func:`install` function:: import gettext @@ -590,7 +598,8 @@ namespace. Note that the second use of :func:`_` will not identify "a" as being -translatable to the :program:`pygettext` program, since it is not a string. +translatable to the :program:`gettext` program, because the parameter +is not a string literal. Another way to handle this is with the following example:: @@ -606,11 +615,14 @@ for a in animals: print(_(a)) -In this case, you are marking translatable strings with the function :func:`N_`, -[#]_ which won't conflict with any definition of :func:`_`. However, you will -need to teach your message extraction program to look for translatable strings -marked with :func:`N_`. :program:`pygettext` and :program:`xpot` both support -this through the use of command line switches. +In this case, you are marking translatable strings with the function +:func:`N_`, which won't conflict with any definition of :func:`_`. +However, you will need to teach your message extraction program to +look for translatable strings marked with :func:`N_`. :program:`xgettext`, +:program:`pygettext`, ``pybabel extract``, and :program:`xpot` all +support this through the use of the :option:`-k` command-line switch. +The choice of :func:`N_` here is totally arbitrary; it could have just +as easily been :func:`MarkThisStringForTranslation`. Acknowledgements @@ -645,16 +657,3 @@ absolute path at the start of your application. .. [#] See the footnote for :func:`bindtextdomain` above. - -.. [#] Fran?ois Pinard has written a program called :program:`xpot` which does a - similar job. It is available as part of his `po-utils package - `_. - -.. [#] :program:`msgfmt.py` is binary compatible with GNU :program:`msgfmt` except that - it provides a simpler, all-Python implementation. With this and - :program:`pygettext.py`, you generally won't need to install the GNU - :program:`gettext` package to internationalize your Python applications. - -.. [#] The choice of :func:`N_` here is totally arbitrary; it could have just as easily - been :func:`MarkThisStringForTranslation`. - -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 17:06:57 2013 From: python-checkins at python.org (andrew.kuchling) Date: Tue, 19 Nov 2013 17:06:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_from_3=2E3?= Message-ID: <3dPBk95XB7z7M43@mail.python.org> http://hg.python.org/cpython/rev/177e0254fdee changeset: 87271:177e0254fdee parent: 87269:e0c4a5b2b739 parent: 87270:4fe87b5df2d0 user: Andrew Kuchling date: Tue Nov 19 11:06:44 2013 -0500 summary: Merge from 3.3 files: Doc/library/gettext.rst | 97 ++++++++++++++-------------- 1 files changed, 48 insertions(+), 49 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -94,9 +94,10 @@ The Plural formula is taken from the catalog header. It is a C or Python expression that has a free variable *n*; the expression evaluates to the index - of the plural in the catalog. See the GNU gettext documentation for the precise - syntax to be used in :file:`.po` files and the formulas for a variety of - languages. + of the plural in the catalog. 
See + `the GNU gettext documentation `__ + for the precise syntax to be used in :file:`.po` files and the + formulas for a variety of languages. .. function:: lngettext(singular, plural, n) @@ -451,35 +452,42 @@ In this example, the string ``'writing a log message'`` is marked as a candidate for translation, while the strings ``'mylog.txt'`` and ``'w'`` are not. -The Python distribution comes with two tools which help you generate the message -catalogs once you've prepared your source code. These may or may not be -available from a binary distribution, but they can be found in a source -distribution, in the :file:`Tools/i18n` directory. +There are a few tools to extract the strings meant for translation. +The original GNU :program:`gettext` only supported C or C++ source +code but its extended version :program:`xgettext` scans code written +in a number of languages, including Python, to find strings marked as +translatable. `Babel `__ is a Python +internationalization library that includes a :file:`pybabel` script to +extract and compile message catalogs. Fran?ois Pinard's program +called :program:`xpot` does a similar job and is available as part of +his `po-utils package `__. -The :program:`pygettext` [#]_ program scans all your Python source code looking -for the strings you previously marked as translatable. It is similar to the GNU -:program:`gettext` program except that it understands all the intricacies of -Python source code, but knows nothing about C or C++ source code. You don't -need GNU ``gettext`` unless you're also going to be translating C code (such as -C extension modules). +(Python also includes pure-Python versions of these programs, called +:program:`pygettext.py` and :program:`msgfmt.py`; some Python distributions +will install them for you. :program:`pygettext.py` is similar to +:program:`xgettext`, but only understands Python source code and +cannot handle other programming languages such as C or C++. +:program:`pygettext.py` supports a command-line interface similar to +:program:`xgettext`; for details on its use, run ``pygettext.py +--help``. :program:`msgfmt.py` is binary compatible with GNU +:program:`msgfmt`. With these two programs, you may not need the GNU +:program:`gettext` package to internationalize your Python +applications.) -:program:`pygettext` generates textual Uniforum-style human readable message -catalog :file:`.pot` files, essentially structured human readable files which -contain every marked string in the source code, along with a placeholder for the -translation strings. :program:`pygettext` is a command line script that supports -a similar command line interface as :program:`xgettext`; for details on its use, -run:: +:program:`xgettext`, :program:`pygettext`, and similar tools generate +:file:`.po` files that are message catalogs. They are structured +:human-readable files that contain every marked string in the source +:code, along with a placeholder for the translated versions of these +:strings. - pygettext.py --help - -Copies of these :file:`.pot` files are then handed over to the individual human -translators who write language-specific versions for every supported natural -language. They send you back the filled in language-specific versions as a -:file:`.po` file. Using the :program:`msgfmt.py` [#]_ program (in the -:file:`Tools/i18n` directory), you take the :file:`.po` files from your -translators and generate the machine-readable :file:`.mo` binary catalog files. 
-The :file:`.mo` files are what the :mod:`gettext` module uses for the actual -translation processing during run-time. +Copies of these :file:`.po` files are then handed over to the +individual human translators who write translations for every +supported natural language. They send back the completed +language-specific versions as a :file:`.po` file that's +compiled into a machine-readable :file:`.mo` binary catalog file using +the :program:`msgfmt` program. The :file:`.mo` files are used by the +:mod:`gettext` module for the actual translation processing at +run-time. How you use the :mod:`gettext` module in your code depends on whether you are internationalizing a single module or your entire application. The next two @@ -517,7 +525,7 @@ import gettext gettext.install('myapplication') -If you need to set the locale directory, you can pass these into the +If you need to set the locale directory, you can pass it into the :func:`install` function:: import gettext @@ -590,7 +598,8 @@ namespace. Note that the second use of :func:`_` will not identify "a" as being -translatable to the :program:`pygettext` program, since it is not a string. +translatable to the :program:`gettext` program, because the parameter +is not a string literal. Another way to handle this is with the following example:: @@ -606,11 +615,14 @@ for a in animals: print(_(a)) -In this case, you are marking translatable strings with the function :func:`N_`, -[#]_ which won't conflict with any definition of :func:`_`. However, you will -need to teach your message extraction program to look for translatable strings -marked with :func:`N_`. :program:`pygettext` and :program:`xpot` both support -this through the use of command line switches. +In this case, you are marking translatable strings with the function +:func:`N_`, which won't conflict with any definition of :func:`_`. +However, you will need to teach your message extraction program to +look for translatable strings marked with :func:`N_`. :program:`xgettext`, +:program:`pygettext`, ``pybabel extract``, and :program:`xpot` all +support this through the use of the :option:`-k` command-line switch. +The choice of :func:`N_` here is totally arbitrary; it could have just +as easily been :func:`MarkThisStringForTranslation`. Acknowledgements @@ -645,16 +657,3 @@ absolute path at the start of your application. .. [#] See the footnote for :func:`bindtextdomain` above. - -.. [#] Fran?ois Pinard has written a program called :program:`xpot` which does a - similar job. It is available as part of his `po-utils package - `_. - -.. [#] :program:`msgfmt.py` is binary compatible with GNU :program:`msgfmt` except that - it provides a simpler, all-Python implementation. With this and - :program:`pygettext.py`, you generally won't need to install the GNU - :program:`gettext` package to internationalize your Python applications. - -.. [#] The choice of :func:`N_` here is totally arbitrary; it could have just as easily - been :func:`MarkThisStringForTranslation`. 
- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 19:25:52 2013 From: python-checkins at python.org (r.david.murray) Date: Tue, 19 Nov 2013 19:25:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5NDQ5OiBIYW5k?= =?utf-8?q?le_non-string_keys_when_generating_=27fieldnames=27_error=2E?= Message-ID: <3dPFpS3t2YzS4l@mail.python.org> http://hg.python.org/cpython/rev/6e5afeada7ca changeset: 87272:6e5afeada7ca branch: 3.3 parent: 87270:4fe87b5df2d0 user: R David Murray date: Tue Nov 19 13:16:20 2013 -0500 summary: #19449: Handle non-string keys when generating 'fieldnames' error. csv was handling non-string keys fine except for the error message generated when a non-string key was not in 'fieldnames'. Fix by Tomas Grahn, full patch-with-test by Vajrasky Kok (tweaked slightly). files: Lib/csv.py | 2 +- Lib/test/test_csv.py | 12 ++++++++++++ Misc/NEWS | 3 +++ 3 files changed, 16 insertions(+), 1 deletions(-) diff --git a/Lib/csv.py b/Lib/csv.py --- a/Lib/csv.py +++ b/Lib/csv.py @@ -146,7 +146,7 @@ wrong_fields = [k for k in rowdict if k not in self.fieldnames] if wrong_fields: raise ValueError("dict contains fields not in fieldnames: " - + ", ".join(wrong_fields)) + + ", ".join([repr(x) for x in wrong_fields])) return [rowdict.get(key, self.restval) for key in self.fieldnames] def writerow(self, rowdict): diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -570,6 +570,18 @@ fileobj = StringIO() self.assertRaises(TypeError, csv.DictWriter, fileobj) + def test_write_fields_not_in_fieldnames(self): + with TemporaryFile("w+", newline='') as fileobj: + writer = csv.DictWriter(fileobj, fieldnames = ["f1", "f2", "f3"]) + # Of special note is the non-string key (issue 19449) + with self.assertRaises(ValueError) as cx: + writer.writerow({"f4": 10, "f2": "spam", 1: "abc"}) + exception = str(cx.exception) + self.assertIn("fieldnames", exception) + self.assertIn("'f4'", exception) + self.assertNotIn("'f2'", exception) + self.assertIn("1", exception) + def test_read_dict_fields(self): with TemporaryFile("w+") as fileobj: fileobj.write("1,2,abc\r\n") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #19449: in csv's writerow, handle non-string keys when generating the + error message that certain keys are not in the 'fieldnames' list. + - Fix test.support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that reasonably new socket option. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 19:25:53 2013 From: python-checkins at python.org (r.david.murray) Date: Tue, 19 Nov 2013 19:25:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge=3A_=2319449=3A_Handle_non-string_keys_when_generat?= =?utf-8?q?ing_=27fieldnames=27_error=2E?= Message-ID: <3dPFpT5lZrz7M4Z@mail.python.org> http://hg.python.org/cpython/rev/ee2c80eeca2a changeset: 87273:ee2c80eeca2a parent: 87271:177e0254fdee parent: 87272:6e5afeada7ca user: R David Murray date: Tue Nov 19 13:17:26 2013 -0500 summary: Merge: #19449: Handle non-string keys when generating 'fieldnames' error. 
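The effect of the change being merged here can be seen with a short sketch (hypothetical field names; requires a build that includes this fix):

    import csv, io

    writer = csv.DictWriter(io.StringIO(), fieldnames=["f1", "f2", "f3"])
    try:
        # "f4" and the non-string key 1 are not declared in fieldnames.
        writer.writerow({"f4": 10, "f2": "spam", 1: "abc"})
    except ValueError as err:
        # Previously, building this message raised TypeError because the
        # non-string key could not be joined into the text; now every
        # offending key is shown via repr() (order may vary):
        print(err)   # dict contains fields not in fieldnames: 'f4', 1
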
files: Lib/csv.py | 2 +- Lib/test/test_csv.py | 12 ++++++++++++ Misc/NEWS | 3 +++ 3 files changed, 16 insertions(+), 1 deletions(-) diff --git a/Lib/csv.py b/Lib/csv.py --- a/Lib/csv.py +++ b/Lib/csv.py @@ -146,7 +146,7 @@ wrong_fields = [k for k in rowdict if k not in self.fieldnames] if wrong_fields: raise ValueError("dict contains fields not in fieldnames: " - + ", ".join(wrong_fields)) + + ", ".join([repr(x) for x in wrong_fields])) return [rowdict.get(key, self.restval) for key in self.fieldnames] def writerow(self, rowdict): diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -579,6 +579,18 @@ fileobj = StringIO() self.assertRaises(TypeError, csv.DictWriter, fileobj) + def test_write_fields_not_in_fieldnames(self): + with TemporaryFile("w+", newline='') as fileobj: + writer = csv.DictWriter(fileobj, fieldnames = ["f1", "f2", "f3"]) + # Of special note is the non-string key (issue 19449) + with self.assertRaises(ValueError) as cx: + writer.writerow({"f4": 10, "f2": "spam", 1: "abc"}) + exception = str(cx.exception) + self.assertIn("fieldnames", exception) + self.assertIn("'f4'", exception) + self.assertNotIn("'f2'", exception) + self.assertIn("1", exception) + def test_read_dict_fields(self): with TemporaryFile("w+") as fileobj: fileobj.write("1,2,abc\r\n") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -56,6 +56,9 @@ Library ------- +- Issue #19449: in csv's writerow, handle non-string keys when generating the + error message that certain keys are not in the 'fieldnames' list. + - Issue #8402: Added the escape() function to the glob module. - Issue #17618: Add Base85 and Ascii85 encoding/decoding to the base64 module. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 19:25:55 2013 From: python-checkins at python.org (r.david.murray) Date: Tue, 19 Nov 2013 19:25:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogIzE5NDQ5OiBIYW5k?= =?utf-8?q?le_non-string_keys_when_generating_=27fieldnames=27_error=2E?= Message-ID: <3dPFpW0lckz7M5C@mail.python.org> http://hg.python.org/cpython/rev/e52d7b173ab5 changeset: 87274:e52d7b173ab5 branch: 2.7 parent: 87256:d47349232171 user: R David Murray date: Tue Nov 19 13:25:24 2013 -0500 summary: #19449: Handle non-string keys when generating 'fieldnames' error. Backport from 3.3 6e5afeada7ca. 
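The same change backported to 2.7. Callers that would rather drop undeclared keys than handle the (now clearer) ValueError can keep using DictWriter's extrasaction='ignore' option; a small Python 2.7 sketch:

    import csv, StringIO

    buf = StringIO.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["f1", "f2"],
                            extrasaction="ignore")
    writer.writerow({"f1": 1, "f2": 2, "extra": "dropped"})
    print buf.getvalue()        # "1,2\r\n" -- the undeclared key is ignored
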
files: Lib/csv.py | 4 ++-- Lib/test/test_csv.py | 17 +++++++++++++++++ Misc/NEWS | 3 +++ 3 files changed, 22 insertions(+), 2 deletions(-) diff --git a/Lib/csv.py b/Lib/csv.py --- a/Lib/csv.py +++ b/Lib/csv.py @@ -140,8 +140,8 @@ if self.extrasaction == "raise": wrong_fields = [k for k in rowdict if k not in self.fieldnames] if wrong_fields: - raise ValueError("dict contains fields not in fieldnames: " + - ", ".join(wrong_fields)) + raise ValueError("dict contains fields not in fieldnames: " + + ", ".join([repr(x) for x in wrong_fields])) return [rowdict.get(key, self.restval) for key in self.fieldnames] def writerow(self, rowdict): diff --git a/Lib/test/test_csv.py b/Lib/test/test_csv.py --- a/Lib/test/test_csv.py +++ b/Lib/test/test_csv.py @@ -630,6 +630,23 @@ fileobj = StringIO() self.assertRaises(TypeError, csv.DictWriter, fileobj) + def test_write_fields_not_in_fieldnames(self): + fd, name = tempfile.mkstemp() + fileobj = os.fdopen(fd, "w+b") + try: + writer = csv.DictWriter(fileobj, fieldnames = ["f1", "f2", "f3"]) + # Of special note is the non-string key (issue 19449) + with self.assertRaises(ValueError) as cx: + writer.writerow({"f4": 10, "f2": "spam", 1: "abc"}) + exception = str(cx.exception) + self.assertIn("fieldnames", exception) + self.assertIn("'f4'", exception) + self.assertNotIn("'f2'", exception) + self.assertIn("1", exception) + finally: + fileobj.close() + os.unlink(name) + def test_read_dict_fields(self): fd, name = tempfile.mkstemp() fileobj = os.fdopen(fd, "w+b") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,9 @@ Library ------- +- Issue #19449: in csv's writerow, handle non-string keys when generating the + error message that certain keys are not in the 'fieldnames' list. + - Issue #12853: Fix NameError in distutils.command.upload. - Issue #19523: Closed FileHandler leak which occurred when delay was set. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 19:29:16 2013 From: python-checkins at python.org (ezio.melotti) Date: Tue, 19 Nov 2013 19:29:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=232927=3A_Added_the_unesc?= =?utf-8?q?ape=28=29_function_to_the_html_module=2E?= Message-ID: <3dPFtN62SlzRfN@mail.python.org> http://hg.python.org/cpython/rev/7b9235852b3b changeset: 87275:7b9235852b3b parent: 87273:ee2c80eeca2a user: Ezio Melotti date: Tue Nov 19 20:28:45 2013 +0200 summary: #2927: Added the unescape() function to the html module. files: Doc/library/html.entities.rst | 1 + Doc/library/html.rst | 11 ++ Lib/html/__init__.py | 114 +++++++++++++++++++++- Lib/html/parser.py | 38 +------ Lib/test/test_html.py | 86 ++++++++++++++++- Lib/test/test_htmlparser.py | 12 -- Misc/NEWS | 2 + 7 files changed, 215 insertions(+), 49 deletions(-) diff --git a/Doc/library/html.entities.rst b/Doc/library/html.entities.rst --- a/Doc/library/html.entities.rst +++ b/Doc/library/html.entities.rst @@ -20,6 +20,7 @@ Note that the trailing semicolon is included in the name (e.g. ``'gt;'``), however some of the names are accepted by the standard even without the semicolon: in this case the name is present with and without the ``';'``. + See also :func:`html.unescape`. .. versionadded:: 3.3 diff --git a/Doc/library/html.rst b/Doc/library/html.rst --- a/Doc/library/html.rst +++ b/Doc/library/html.rst @@ -20,6 +20,17 @@ .. versionadded:: 3.2 + +.. function:: unescape(s) + + Convert all named and numeric character references (e.g. 
``>``, + ``>``, ``&x3e;``) in the string *s* to the corresponding unicode + characters. This function uses the rules defined by the HTML 5 standard + for both valid and invalid character references, and the :data:`list of + HTML 5 named character references `. + + .. versionadded:: 3.4 + -------------- Submodules in the ``html`` package are: diff --git a/Lib/html/__init__.py b/Lib/html/__init__.py --- a/Lib/html/__init__.py +++ b/Lib/html/__init__.py @@ -2,7 +2,12 @@ General functions for HTML manipulation. """ -# NB: this is a candidate for a bytes/string polymorphic interface +import re as _re +from html.entities import html5 as _html5 + + +__all__ = ['escape', 'unescape'] + def escape(s, quote=True): """ @@ -18,3 +23,110 @@ s = s.replace('"', """) s = s.replace('\'', "'") return s + + +# see http://www.w3.org/TR/html5/syntax.html#tokenizing-character-references + +_invalid_charrefs = { + 0x00: '\ufffd', # REPLACEMENT CHARACTER + 0x0d: '\r', # CARRIAGE RETURN + 0x80: '\u20ac', # EURO SIGN + 0x81: '\x81', # + 0x82: '\u201a', # SINGLE LOW-9 QUOTATION MARK + 0x83: '\u0192', # LATIN SMALL LETTER F WITH HOOK + 0x84: '\u201e', # DOUBLE LOW-9 QUOTATION MARK + 0x85: '\u2026', # HORIZONTAL ELLIPSIS + 0x86: '\u2020', # DAGGER + 0x87: '\u2021', # DOUBLE DAGGER + 0x88: '\u02c6', # MODIFIER LETTER CIRCUMFLEX ACCENT + 0x89: '\u2030', # PER MILLE SIGN + 0x8a: '\u0160', # LATIN CAPITAL LETTER S WITH CARON + 0x8b: '\u2039', # SINGLE LEFT-POINTING ANGLE QUOTATION MARK + 0x8c: '\u0152', # LATIN CAPITAL LIGATURE OE + 0x8d: '\x8d', # + 0x8e: '\u017d', # LATIN CAPITAL LETTER Z WITH CARON + 0x8f: '\x8f', # + 0x90: '\x90', # + 0x91: '\u2018', # LEFT SINGLE QUOTATION MARK + 0x92: '\u2019', # RIGHT SINGLE QUOTATION MARK + 0x93: '\u201c', # LEFT DOUBLE QUOTATION MARK + 0x94: '\u201d', # RIGHT DOUBLE QUOTATION MARK + 0x95: '\u2022', # BULLET + 0x96: '\u2013', # EN DASH + 0x97: '\u2014', # EM DASH + 0x98: '\u02dc', # SMALL TILDE + 0x99: '\u2122', # TRADE MARK SIGN + 0x9a: '\u0161', # LATIN SMALL LETTER S WITH CARON + 0x9b: '\u203a', # SINGLE RIGHT-POINTING ANGLE QUOTATION MARK + 0x9c: '\u0153', # LATIN SMALL LIGATURE OE + 0x9d: '\x9d', # + 0x9e: '\u017e', # LATIN SMALL LETTER Z WITH CARON + 0x9f: '\u0178', # LATIN CAPITAL LETTER Y WITH DIAERESIS +} + +_invalid_codepoints = { + # 0x0001 to 0x0008 + 0x1, 0x2, 0x3, 0x4, 0x5, 0x6, 0x7, 0x8, + # 0x000E to 0x001F + 0xe, 0xf, 0x10, 0x11, 0x12, 0x13, 0x14, 0x15, 0x16, 0x17, 0x18, 0x19, + 0x1a, 0x1b, 0x1c, 0x1d, 0x1e, 0x1f, + # 0x007F to 0x009F + 0x7f, 0x80, 0x81, 0x82, 0x83, 0x84, 0x85, 0x86, 0x87, 0x88, 0x89, 0x8a, + 0x8b, 0x8c, 0x8d, 0x8e, 0x8f, 0x90, 0x91, 0x92, 0x93, 0x94, 0x95, 0x96, + 0x97, 0x98, 0x99, 0x9a, 0x9b, 0x9c, 0x9d, 0x9e, 0x9f, + # 0xFDD0 to 0xFDEF + 0xfdd0, 0xfdd1, 0xfdd2, 0xfdd3, 0xfdd4, 0xfdd5, 0xfdd6, 0xfdd7, 0xfdd8, + 0xfdd9, 0xfdda, 0xfddb, 0xfddc, 0xfddd, 0xfdde, 0xfddf, 0xfde0, 0xfde1, + 0xfde2, 0xfde3, 0xfde4, 0xfde5, 0xfde6, 0xfde7, 0xfde8, 0xfde9, 0xfdea, + 0xfdeb, 0xfdec, 0xfded, 0xfdee, 0xfdef, + # others + 0xb, 0xfffe, 0xffff, 0x1fffe, 0x1ffff, 0x2fffe, 0x2ffff, 0x3fffe, 0x3ffff, + 0x4fffe, 0x4ffff, 0x5fffe, 0x5ffff, 0x6fffe, 0x6ffff, 0x7fffe, 0x7ffff, + 0x8fffe, 0x8ffff, 0x9fffe, 0x9ffff, 0xafffe, 0xaffff, 0xbfffe, 0xbffff, + 0xcfffe, 0xcffff, 0xdfffe, 0xdffff, 0xefffe, 0xeffff, 0xffffe, 0xfffff, + 0x10fffe, 0x10ffff +} + + +def _replace_charref(s): + s = s.group(1) + if s[0] == '#': + # numeric charref + if s[1] in 'xX': + num = int(s[2:].rstrip(';'), 16) + else: + num = int(s[1:].rstrip(';')) + if num in _invalid_charrefs: + return 
_invalid_charrefs[num] + if 0xD800 <= num <= 0xDFFF or num > 0x10FFFF: + return '\uFFFD' + if num in _invalid_codepoints: + return '' + return chr(num) + else: + # named charref + if s in _html5: + return _html5[s] + # find the longest matching name (as defined by the standard) + for x in range(len(s)-1, 1, -1): + if s[:x] in _html5: + return _html5[s[:x]] + s[x:] + else: + return '&' + s + + +_charref = _re.compile(r'&(#[0-9]+;?' + r'|#[xX][0-9a-fA-F]+;?' + r'|[^\t\n\f <&#;]{1,32};?)') + +def unescape(s): + """ + Convert all named and numeric character references (e.g. >, >, + &x3e;) in the string s to the corresponding unicode characters. + This function uses the rules defined by the HTML 5 standard + for both valid and invalid character references, and the list of + HTML 5 named character references defined in html.entities.html5. + """ + if '&' not in s: + return s + return _charref.sub(_replace_charref, s) diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -8,9 +8,12 @@ # and CDATA (character data -- only end tags are special). -import _markupbase import re import warnings +import _markupbase + +from html import unescape + __all__ = ['HTMLParser'] @@ -357,7 +360,7 @@ attrvalue[:1] == '"' == attrvalue[-1:]: attrvalue = attrvalue[1:-1] if attrvalue: - attrvalue = self.unescape(attrvalue) + attrvalue = unescape(attrvalue) attrs.append((attrname.lower(), attrvalue)) k = m.end() @@ -510,34 +513,3 @@ def unknown_decl(self, data): if self.strict: self.error("unknown declaration: %r" % (data,)) - - # Internal -- helper to remove special character quoting - def unescape(self, s): - if '&' not in s: - return s - def replaceEntities(s): - s = s.groups()[0] - try: - if s[0] == "#": - s = s[1:] - if s[0] in ['x','X']: - c = int(s[1:].rstrip(';'), 16) - else: - c = int(s.rstrip(';')) - return chr(c) - except ValueError: - return '&#' + s - else: - from html.entities import html5 - if s in html5: - return html5[s] - elif s.endswith(';'): - return '&' + s - for x in range(2, len(s)): - if s[:x] in html5: - return html5[s[:x]] + s[x:] - else: - return '&' + s - - return re.sub(r"&(#?[xX]?(?:[0-9a-fA-F]+;|\w{1,32};?))", - replaceEntities, s, flags=re.ASCII) diff --git a/Lib/test/test_html.py b/Lib/test/test_html.py --- a/Lib/test/test_html.py +++ b/Lib/test/test_html.py @@ -16,9 +16,89 @@ html.escape('\'\'', False), '\'<script>"&foo;"</script>\'') + def test_unescape(self): + numeric_formats = ['&#%d', '&#%d;', '&#x%x', '&#x%x;'] + errmsg = 'unescape(%r) should have returned %r' + def check(text, expected): + self.assertEqual(html.unescape(text), expected, + msg=errmsg % (text, expected)) + def check_num(num, expected): + for format in numeric_formats: + text = format % num + self.assertEqual(html.unescape(text), expected, + msg=errmsg % (text, expected)) + # check text with no character references + check('no character references', 'no character references') + # check & followed by invalid chars + check('&\n&\t& &&', '&\n&\t& &&') + # check & followed by numbers and letters + check('&0 &9 &a &0; &9; &a;', '&0 &9 &a &0; &9; &a;') + # check incomplete entities at the end of the string + for x in ['&', '&#', '&#x', '&#X', '&#y', '&#xy', '&#Xy']: + check(x, x) + check(x+';', x+';') + # check several combinations of numeric character references, + # possibly followed by different characters + formats = ['&#%d', '&#%07d', '&#%d;', '&#%07d;', + '&#x%x', '&#x%06x', '&#x%x;', '&#x%06x;', + '&#x%X', '&#x%06X', '&#X%x;', '&#X%06x;'] + for num, char in 
zip([65, 97, 34, 38, 0x2603, 0x101234], + ['A', 'a', '"', '&', '\u2603', '\U00101234']): + for s in formats: + check(s % num, char) + for end in [' ', 'X']: + check((s+end) % num, char+end) + # check invalid codepoints + for cp in [0xD800, 0xDB00, 0xDC00, 0xDFFF, 0x110000]: + check_num(cp, '\uFFFD') + # check more invalid codepoints + for cp in [0x1, 0xb, 0xe, 0x7f, 0xfffe, 0xffff, 0x10fffe, 0x10ffff]: + check_num(cp, '') + # check invalid numbers + for num, ch in zip([0x0d, 0x80, 0x95, 0x9d], '\r\u20ac\u2022\x9d'): + check_num(num, ch) + # check small numbers + check_num(0, '\uFFFD') + check_num(9, '\t') + # check a big number + check_num(1000000000000000000, '\uFFFD') + # check that multiple trailing semicolons are handled correctly + for e in ['";', '";', '";', '";']: + check(e, '";') + # check that semicolons in the middle don't create problems + for e in ['"quot;', '"quot;', '"quot;', '"quot;']: + check(e, '"quot;') + # check triple adjacent charrefs + for e in ['"', '"', '"', '"']: + check(e*3, '"""') + check((e+';')*3, '"""') + # check that the case is respected + for e in ['&', '&', '&', '&']: + check(e, '&') + for e in ['&Amp', '&Amp;']: + check(e, e) + # check that non-existent named entities are returned unchanged + check('&svadilfari;', '&svadilfari;') + # the following examples are in the html5 specs + check('¬it', '?it') + check('¬it;', '?it;') + check('¬in', '?in') + check('∉', '?') + # a similar example with a long name + check('¬ReallyAnExistingNamedCharacterReference;', + '?ReallyAnExistingNamedCharacterReference;') + # longest valid name + check('∳', '?') + # check a charref that maps to two unicode chars + check('∾̳', '\u223E\u0333') + check('&acE', '&acE') + # see #12888 + check('{ ' * 1050, '{ ' * 1050) + # see #15156 + check('ÉricÉric&alphacentauriαcentauri', + '?ric?ric&alphacentauri?centauri') + check('&co;', '&co;') -def test_main(): - run_unittest(HtmlTests) if __name__ == '__main__': - test_main() + unittest.main() diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -569,18 +569,6 @@ for html, expected in data: self._run_check(html, expected) - def test_unescape_function(self): - p = self.get_collector() - self.assertEqual(p.unescape('&#bad;'),'&#bad;') - self.assertEqual(p.unescape('&'),'&') - # see #12888 - self.assertEqual(p.unescape('{ ' * 1050), '{ ' * 1050) - # see #15156 - self.assertEqual(p.unescape('ÉricÉric' - '&alphacentauriαcentauri'), - '?ric?ric&alphacentauri?centauri') - self.assertEqual(p.unescape('&co;'), '&co;') - def test_broken_comments(self): html = ('' '' diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,8 @@ - Issue #19449: in csv's writerow, handle non-string keys when generating the error message that certain keys are not in the 'fieldnames' list. +- Issue #2927: Added the unescape() function to the html module. + - Issue #8402: Added the escape() function to the glob module. - Issue #17618: Add Base85 and Ascii85 encoding/decoding to the base64 module. 
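A quick usage sketch of the new function (illustrative only, not part of the commit), assuming the html module as extended above:

    import html

    print(html.unescape('&gt; &#62; &#x3e;'))             # -> '> > >'
    print(html.unescape(html.escape('<spam> & "eggs"')))  # round-trips with html.escape()
    # Unknown named references are left alone; invalid numeric ones follow the HTML5 rules:
    print(html.unescape('&unknownentity; &#0;'))          # -> '&unknownentity; \ufffd'
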
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 20:43:45 2013 From: python-checkins at python.org (guido.van.rossum) Date: Tue, 19 Nov 2013 20:43:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_streams=2Es?= =?utf-8?q?tart=5Fserver=28=29=2C_by_Gustavo_Carneiro=2E?= Message-ID: <3dPHXK1LgQz7LjM@mail.python.org> http://hg.python.org/cpython/rev/2012e85638d9 changeset: 87276:2012e85638d9 user: Guido van Rossum date: Tue Nov 19 11:43:38 2013 -0800 summary: asyncio: Add streams.start_server(), by Gustavo Carneiro. files: Lib/asyncio/streams.py | 53 +++++++++++- Lib/test/test_asyncio/test_streams.py | 66 +++++++++++++++ 2 files changed, 117 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -1,6 +1,8 @@ """Stream-related things.""" -__all__ = ['StreamReader', 'StreamReaderProtocol', 'open_connection'] +__all__ = ['StreamReader', 'StreamReaderProtocol', + 'open_connection', 'start_server', + ] import collections @@ -43,6 +45,42 @@ return reader, writer + at tasks.coroutine +def start_server(client_connected_cb, host=None, port=None, *, + loop=None, limit=_DEFAULT_LIMIT, **kwds): + """Start a socket server, call back for each client connected. + + The first parameter, `client_connected_cb`, takes two parameters: + client_reader, client_writer. client_reader is a StreamReader + object, while client_writer is a StreamWriter object. This + parameter can either be a plain callback function or a coroutine; + if it is a coroutine, it will be automatically converted into a + Task. + + The rest of the arguments are all the usual arguments to + loop.create_server() except protocol_factory; most common are + positional host and port, with various optional keyword arguments + following. The return value is the same as loop.create_server(). + + Additional optional keyword arguments are loop (to set the event loop + instance to use) and limit (to set the buffer limit passed to the + StreamReader). + + The return value is the same as loop.create_server(), i.e. a + Server object which can be used to stop the service. + """ + if loop is None: + loop = events.get_event_loop() + + def factory(): + reader = StreamReader(limit=limit, loop=loop) + protocol = StreamReaderProtocol(reader, client_connected_cb, + loop=loop) + return protocol + + return (yield from loop.create_server(factory, host, port, **kwds)) + + class StreamReaderProtocol(protocols.Protocol): """Trivial helper class to adapt between Protocol and StreamReader. @@ -52,13 +90,24 @@ call inappropriate methods of the protocol.) """ - def __init__(self, stream_reader): + def __init__(self, stream_reader, client_connected_cb=None, loop=None): self._stream_reader = stream_reader + self._stream_writer = None self._drain_waiter = None self._paused = False + self._client_connected_cb = client_connected_cb + self._loop = loop # May be None; we may never need it. 
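# Aside (not part of the patch above): a minimal end-to-end sketch of the new
# streams.start_server() helper, assuming it is re-exported as
# asyncio.start_server() in Python 3.4.
import asyncio

@asyncio.coroutine
def handle_echo(reader, writer):
    data = yield from reader.readline()   # reader is a StreamReader
    writer.write(data)                    # writer is a StreamWriter
    writer.close()

loop = asyncio.get_event_loop()
server = loop.run_until_complete(
    asyncio.start_server(handle_echo, '127.0.0.1', 8888, loop=loop))
# ... later, shut the server down:
server.close()
loop.run_until_complete(server.wait_closed())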
def connection_made(self, transport): self._stream_reader.set_transport(transport) + if self._client_connected_cb is not None: + self._stream_writer = StreamWriter(transport, self, + self._stream_reader, + self._loop) + res = self._client_connected_cb(self._stream_reader, + self._stream_writer) + if tasks.iscoroutine(res): + tasks.Task(res, loop=self._loop) def connection_lost(self, exc): if exc is None: diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -359,6 +359,72 @@ test_utils.run_briefly(self.loop) self.assertIs(stream._waiter, None) + def test_start_server(self): + + class MyServer: + + def __init__(self, loop): + self.server = None + self.loop = loop + + @tasks.coroutine + def handle_client(self, client_reader, client_writer): + data = yield from client_reader.readline() + client_writer.write(data) + + def start(self): + self.server = self.loop.run_until_complete( + streams.start_server(self.handle_client, + '127.0.0.1', 12345, + loop=self.loop)) + + def handle_client_callback(self, client_reader, client_writer): + task = tasks.Task(client_reader.readline(), loop=self.loop) + + def done(task): + client_writer.write(task.result()) + + task.add_done_callback(done) + + def start_callback(self): + self.server = self.loop.run_until_complete( + streams.start_server(self.handle_client_callback, + '127.0.0.1', 12345, + loop=self.loop)) + + def stop(self): + if self.server is not None: + self.server.close() + self.loop.run_until_complete(self.server.wait_closed()) + self.server = None + + @tasks.coroutine + def client(): + reader, writer = yield from streams.open_connection( + '127.0.0.1', 12345, loop=self.loop) + # send a line + writer.write(b"hello world!\n") + # read it back + msgback = yield from reader.readline() + writer.close() + return msgback + + # test the server variant with a coroutine as client handler + server = MyServer(self.loop) + server.start() + msg = self.loop.run_until_complete(tasks.Task(client(), + loop=self.loop)) + server.stop() + self.assertEqual(msg, b"hello world!\n") + + # test the server variant with a callback as client handler + server = MyServer(self.loop) + server.start_callback() + msg = self.loop.run_until_complete(tasks.Task(client(), + loop=self.loop)) + server.stop() + self.assertEqual(msg, b"hello world!\n") + if __name__ == '__main__': unittest.main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 21:49:03 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 21:49:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_The_=22parts=22_property_now_?= =?utf-8?q?simply_returns_a_tuple_=28the_slightly_magic_slicing?= Message-ID: <3dPJzg4jDvz7M4Z@mail.python.org> http://hg.python.org/peps/rev/5b9b0c2978d3 changeset: 5292:5b9b0c2978d3 user: Antoine Pitrou date: Tue Nov 19 21:48:55 2013 +0100 summary: The "parts" property now simply returns a tuple (the slightly magic slicing behaviour is gone) files: pep-0428.txt | 21 +++++---------------- 1 files changed, 5 insertions(+), 16 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -216,7 +216,7 @@ >>> p.root '/' >>> p.parts - + ('/', 'home', 'antoine', 'pathlib', 'setup.py') >>> p.relative_to('/home/antoine') PosixPath('pathlib/setup.py') >>> p.exists() @@ -469,29 +469,18 @@ Sequence-like access -------------------- -The ``parts`` property provides read-only 
sequence access to a path object:: +The ``parts`` property returns a tuple providing read-only sequence access +to a path's components:: >>> p = PurePosixPath('/etc/init.d') >>> p.parts - - -Simple indexing returns the invidual path component as a string, while -slicing returns a new path object constructed from the selected components:: - - >>> p.parts[-1] - 'init.d' - >>> p.parts[:-1] - PurePosixPath('/etc') + ('/', 'etc', 'init.d') Windows paths handle the drive and the root as a single path component:: >>> p = PureWindowsPath('c:/setup.py') >>> p.parts - - >>> p.root - '\\' - >>> p.parts[0] - 'c:\\' + ('c:\\', 'setup.py') (separating them would be wrong, since ``C:`` is not the parent of ``C:\\``). -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 21:51:23 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 21:51:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=22parents=22_is_now_a_sequen?= =?utf-8?q?ce_property_providing_access_to_the_path=27s_ancestors?= Message-ID: <3dPK2M61hxz7M5y@mail.python.org> http://hg.python.org/peps/rev/fc4d7db90ff2 changeset: 5293:fc4d7db90ff2 user: Antoine Pitrou date: Tue Nov 19 21:51:19 2013 +0100 summary: "parents" is now a sequence property providing access to the path's ancestors files: pep-0428.txt | 11 +++++++---- 1 files changed, 7 insertions(+), 4 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -493,14 +493,17 @@ >>> p.parent(3) PureWindowsPath('c:/') -The ``parents()`` method automates repeated invocations of ``parent()``, until -the anchor is reached:: +The ``parents`` property returns an immutable sequence of the path's +logical ancestors:: >>> p = PureWindowsPath('c:/python33/bin/python.exe') - >>> for parent in p.parents(): parent - ... + >>> len(p.parents) + 3 + >>> p.parents[0] PureWindowsPath('c:/python33/bin') + >>> p.parents[1] PureWindowsPath('c:/python33') + >>> p.parents[2] PureWindowsPath('c:/') -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 22:01:19 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 22:01:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_=60parent=60_is_now_a_propert?= =?utf-8?q?y_returning_the_logical_parent_of_the_path?= Message-ID: <3dPKFq0yVkz7M4Z@mail.python.org> http://hg.python.org/peps/rev/2c57a645cce8 changeset: 5294:2c57a645cce8 user: Antoine Pitrou date: Tue Nov 19 22:01:14 2013 +0100 summary: `parent` is now a property returning the logical parent of the path files: pep-0428.txt | 9 +++------ 1 files changed, 3 insertions(+), 6 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -484,14 +484,11 @@ (separating them would be wrong, since ``C:`` is not the parent of ``C:\\``). 
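As an illustrative aside (not part of the PEP diff), the three accessors combine as follows, assuming the pathlib module that implements this PEP::

    >>> from pathlib import PurePosixPath
    >>> p = PurePosixPath('/etc/init.d/apache2')
    >>> p.parts
    ('/', 'etc', 'init.d', 'apache2')
    >>> p.parent
    PurePosixPath('/etc/init.d')
    >>> list(p.parents)
    [PurePosixPath('/etc/init.d'), PurePosixPath('/etc'), PurePosixPath('/')]
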
-The ``parent()`` method returns an ancestor of the path:: +The ``parent`` property returns the logical parent of the path:: - >>> p.parent() + >>> p = PureWindowsPath('c:/python33/bin/python.exe') + >>> p.parent PureWindowsPath('c:/python33/bin') - >>> p.parent(2) - PureWindowsPath('c:/python33') - >>> p.parent(3) - PureWindowsPath('c:/') The ``parents`` property returns an immutable sequence of the path's logical ancestors:: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 22:23:37 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 22:23:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=239566=3A_compile?= =?utf-8?q?=2Ec_uses_Py=5Fssize=5Ft_instead_of_int_to_store_sizes_to_fix?= Message-ID: <3dPKlY0N7Bz7M56@mail.python.org> http://hg.python.org/cpython/rev/68fd86a83ece changeset: 87277:68fd86a83ece user: Victor Stinner date: Tue Nov 19 22:23:20 2013 +0100 summary: Issue #9566: compile.c uses Py_ssize_t instead of int to store sizes to fix compiler warnings on Windows 64-bit. Use Py_SAFE_DOWNCAST() where the final downcast is needed. The bytecode doesn't support integer parameters larger than 32-bit yet. files: Python/compile.c | 116 ++++++++++++++++++++-------------- 1 files changed, 68 insertions(+), 48 deletions(-) diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -170,7 +170,7 @@ static int compiler_next_instr(struct compiler *, basicblock *); static int compiler_addop(struct compiler *, int); static int compiler_addop_o(struct compiler *, int, PyObject *, PyObject *); -static int compiler_addop_i(struct compiler *, int, int); +static int compiler_addop_i(struct compiler *, Py_ssize_t, Py_ssize_t); static int compiler_addop_j(struct compiler *, int, basicblock *, int); static basicblock *compiler_use_new_block(struct compiler *); static int compiler_error(struct compiler *, const char *); @@ -195,7 +195,7 @@ static int expr_constant(struct compiler *, expr_ty); static int compiler_with(struct compiler *, stmt_ty, int); -static int compiler_call_helper(struct compiler *c, int n, +static int compiler_call_helper(struct compiler *c, Py_ssize_t n, asdl_seq *args, asdl_seq *keywords, expr_ty starargs, @@ -388,7 +388,7 @@ n = PyList_Size(list); for (i = 0; i < n; i++) { - v = PyLong_FromLong(i); + v = PyLong_FromSsize_t(i); if (!v) { Py_DECREF(dict); return NULL; @@ -416,7 +416,7 @@ */ static PyObject * -dictbytype(PyObject *src, int scope_type, int flag, int offset) +dictbytype(PyObject *src, int scope_type, int flag, Py_ssize_t offset) { Py_ssize_t i = offset, scope, num_keys, key_i; PyObject *k, *v, *dest = PyDict_New(); @@ -450,7 +450,7 @@ scope = (vi >> SCOPE_OFFSET) & SCOPE_MASK; if (scope == scope_type || vi & flag) { - PyObject *tuple, *item = PyLong_FromLong(i); + PyObject *tuple, *item = PyLong_FromSsize_t(i); if (item == NULL) { Py_DECREF(sorted_keys); Py_DECREF(dest); @@ -635,7 +635,7 @@ static void compiler_exit_scope(struct compiler *c) { - int n; + Py_ssize_t n; PyObject *capsule; c->c_nestlevel--; @@ -1128,7 +1128,7 @@ if (PyErr_Occurred()) return -1; arg = PyDict_Size(dict); - v = PyLong_FromLong(arg); + v = PyLong_FromSsize_t(arg); if (!v) { Py_DECREF(t); return -1; @@ -1150,7 +1150,7 @@ compiler_addop_o(struct compiler *c, int opcode, PyObject *dict, PyObject *o) { - int arg = compiler_add_o(c, dict, o); + Py_ssize_t arg = compiler_add_o(c, dict, o); if (arg < 0) return 0; return compiler_addop_i(c, opcode, arg); @@ -1160,7 +1160,7 @@ 
compiler_addop_name(struct compiler *c, int opcode, PyObject *dict, PyObject *o) { - int arg; + Py_ssize_t arg; PyObject *mangled = _Py_Mangle(c->u->u_private, o); if (!mangled) return 0; @@ -1176,15 +1176,21 @@ */ static int -compiler_addop_i(struct compiler *c, int opcode, int oparg) +compiler_addop_i(struct compiler *c, Py_ssize_t opcode, Py_ssize_t oparg) { struct instr *i; int off; + + /* Integer arguments are limit to 16-bit. There is an extension for 32-bit + integer arguments. */ + assert(INT32_MIN <= opcode); + assert(opcode <= INT32_MAX); + off = compiler_next_instr(c, c->u->u_curblock); if (off < 0) return 0; i = &c->u->u_curblock->b_instr[off]; - i->i_opcode = opcode; + i->i_opcode = Py_SAFE_DOWNCAST(opcode, Py_ssize_t, int); i->i_oparg = oparg; i->i_hasarg = 1; compiler_set_lineno(c, off); @@ -1436,9 +1442,9 @@ } static int -compiler_make_closure(struct compiler *c, PyCodeObject *co, int args, PyObject *qualname) +compiler_make_closure(struct compiler *c, PyCodeObject *co, Py_ssize_t args, PyObject *qualname) { - int i, free = PyCode_GetNumFree(co); + Py_ssize_t i, free = PyCode_GetNumFree(co); if (qualname == NULL) qualname = co->co_name; @@ -1565,7 +1571,7 @@ */ static identifier return_str; PyObject *names; - int len; + Py_ssize_t len; names = PyList_New(0); if (!names) return -1; @@ -1602,7 +1608,7 @@ if (len) { /* convert names to a tuple and place on stack */ PyObject *elt; - int i; + Py_ssize_t i; PyObject *s = PyTuple_New(len); if (!s) goto error; @@ -1616,7 +1622,9 @@ len++; /* include the just-pushed tuple */ } Py_DECREF(names); - return len; + + /* We just checked that len <= 65535, see above */ + return Py_SAFE_DOWNCAST(len, Py_ssize_t, int); error: Py_DECREF(names); @@ -1632,7 +1640,8 @@ expr_ty returns = s->v.FunctionDef.returns; asdl_seq* decos = s->v.FunctionDef.decorator_list; stmt_ty st; - int i, n, docstring, kw_default_count = 0, arglength; + Py_ssize_t i, n, arglength; + int docstring, kw_default_count = 0; int num_annotations; assert(s->kind == FunctionDef_kind); @@ -1851,7 +1860,8 @@ PyCodeObject *co; PyObject *qualname; static identifier name; - int kw_default_count = 0, arglength; + int kw_default_count = 0; + Py_ssize_t arglength; arguments_ty args = e->v.Lambda.args; assert(e->kind == Lambda_kind); @@ -2162,7 +2172,7 @@ compiler_try_except(struct compiler *c, stmt_ty s) { basicblock *body, *orelse, *except, *end; - int i, n; + Py_ssize_t i, n; body = compiler_new_block(c); except = compiler_new_block(c); @@ -2328,7 +2338,7 @@ module names. XXX Perhaps change the representation to make this case simpler? */ - int i, n = asdl_seq_LEN(s->v.Import.names); + Py_ssize_t i, n = asdl_seq_LEN(s->v.Import.names); for (i = 0; i < n; i++) { alias_ty alias = (alias_ty)asdl_seq_GET(s->v.Import.names, i); @@ -2372,7 +2382,7 @@ static int compiler_from_import(struct compiler *c, stmt_ty s) { - int i, n = asdl_seq_LEN(s->v.ImportFrom.names); + Py_ssize_t i, n = asdl_seq_LEN(s->v.ImportFrom.names); PyObject *names = PyTuple_New(n); PyObject *level; @@ -2489,7 +2499,7 @@ static int compiler_visit_stmt(struct compiler *c, stmt_ty s) { - int i, n; + Py_ssize_t i, n; /* Always assign a lineno to the next instruction for a stmt. 
*/ c->u->u_lineno = s->lineno; @@ -2537,12 +2547,12 @@ if (s->v.Raise.exc) { VISIT(c, expr, s->v.Raise.exc); n++; - if (s->v.Raise.cause) { - VISIT(c, expr, s->v.Raise.cause); - n++; + if (s->v.Raise.cause) { + VISIT(c, expr, s->v.Raise.cause); + n++; + } } - } - ADDOP_I(c, RAISE_VARARGS, n); + ADDOP_I(c, RAISE_VARARGS, (int)n); break; case Try_kind: return compiler_try(c, s); @@ -2702,7 +2712,8 @@ static int compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) { - int op, scope, arg; + int op, scope; + Py_ssize_t arg; enum { OP_FAST, OP_GLOBAL, OP_DEREF, OP_NAME } optype; PyObject *dict = c->u->u_names; @@ -2828,7 +2839,8 @@ compiler_boolop(struct compiler *c, expr_ty e) { basicblock *end; - int jumpi, i, n; + int jumpi; + Py_ssize_t i, n; asdl_seq *s; assert(e->kind == BoolOp_kind); @@ -2854,7 +2866,7 @@ static int compiler_list(struct compiler *c, expr_ty e) { - int n = asdl_seq_LEN(e->v.List.elts); + Py_ssize_t n = asdl_seq_LEN(e->v.List.elts); if (e->v.List.ctx == Store) { int i, seen_star = 0; for (i = 0; i < n; i++) { @@ -2887,7 +2899,7 @@ static int compiler_tuple(struct compiler *c, expr_ty e) { - int n = asdl_seq_LEN(e->v.Tuple.elts); + Py_ssize_t n = asdl_seq_LEN(e->v.Tuple.elts); if (e->v.Tuple.ctx == Store) { int i, seen_star = 0; for (i = 0; i < n; i++) { @@ -2920,7 +2932,7 @@ static int compiler_compare(struct compiler *c, expr_ty e) { - int i, n; + Py_ssize_t i, n; basicblock *cleanup = NULL; /* XXX the logic can be cleaned up for 1 or multiple comparisons */ @@ -2976,11 +2988,11 @@ /* shared code between compiler_call and compiler_class */ static int compiler_call_helper(struct compiler *c, - int n, /* Args already pushed */ - asdl_seq *args, - asdl_seq *keywords, - expr_ty starargs, - expr_ty kwargs) + Py_ssize_t n, /* Args already pushed */ + asdl_seq *args, + asdl_seq *keywords, + expr_ty starargs, + expr_ty kwargs) { int code = 0; @@ -3041,7 +3053,7 @@ comprehension_ty gen; basicblock *start, *anchor, *skip, *if_cleanup; - int i, n; + Py_ssize_t i, n; start = compiler_new_block(c); skip = compiler_new_block(c); @@ -3380,7 +3392,7 @@ static int compiler_visit_expr(struct compiler *c, expr_ty e) { - int i, n; + Py_ssize_t i, n; /* If expr e has a different line number than the last expr/stmt, set a new line number for the next instruction. @@ -3409,6 +3421,8 @@ return compiler_ifexp(c, e); case Dict_kind: n = asdl_seq_LEN(e->v.Dict.values); + /* BUILD_MAP parameter is only used to preallocate the dictionary, + it doesn't need to be exact */ ADDOP_I(c, BUILD_MAP, (n>0xFFFF ? 
0xFFFF : n)); for (i = 0; i < n; i++) { VISIT(c, expr, @@ -3756,7 +3770,7 @@ case ExtSlice_kind: kindname = "extended slice"; if (ctx != AugStore) { - int i, n = asdl_seq_LEN(s->v.ExtSlice.dims); + Py_ssize_t i, n = asdl_seq_LEN(s->v.ExtSlice.dims); for (i = 0; i < n; i++) { slice_ty sub = (slice_ty)asdl_seq_GET( s->v.ExtSlice.dims, i); @@ -3934,7 +3948,7 @@ assemble_lnotab(struct assembler *a, struct instr *i) { int d_bytecode, d_lineno; - int len; + Py_ssize_t len; unsigned char *lnotab; d_bytecode = a->a_offset - a->a_lineno_off; @@ -4125,7 +4139,7 @@ } static PyObject * -dict_keys_inorder(PyObject *dict, int offset) +dict_keys_inorder(PyObject *dict, Py_ssize_t offset) { PyObject *tuple, *k, *v; Py_ssize_t i, pos = 0, size = PyDict_Size(dict); @@ -4150,7 +4164,8 @@ compute_code_flags(struct compiler *c) { PySTEntryObject *ste = c->u->u_ste; - int flags = 0, n; + int flags = 0; + Py_ssize_t n; if (ste->ste_type == FunctionBlock) { flags |= CO_NEWLOCALS; if (!ste->ste_unoptimized) @@ -4174,9 +4189,9 @@ if (n == 0) { n = PyDict_Size(c->u->u_cellvars); if (n < 0) - return -1; + return -1; if (n == 0) { - flags |= CO_NOFREE; + flags |= CO_NOFREE; } } @@ -4195,7 +4210,9 @@ PyObject *freevars = NULL; PyObject *cellvars = NULL; PyObject *bytecode = NULL; - int nlocals, flags; + Py_ssize_t nlocals; + int nlocals_int; + int flags; tmp = dict_keys_inorder(c->u->u_consts, 0); if (!tmp) @@ -4214,7 +4231,11 @@ freevars = dict_keys_inorder(c->u->u_freevars, PyTuple_Size(cellvars)); if (!freevars) goto error; + nlocals = PyDict_Size(c->u->u_varnames); + assert(nlocals < INT_MAX); + nlocals_int = Py_SAFE_DOWNCAST(nlocals, Py_ssize_t, int); + flags = compute_code_flags(c); if (flags < 0) goto error; @@ -4230,7 +4251,7 @@ consts = tmp; co = PyCode_New(c->u->u_argcount, c->u->u_kwonlyargcount, - nlocals, stackdepth(c), flags, + nlocals_int, stackdepth(c), flags, bytecode, consts, names, varnames, freevars, cellvars, c->c_filename, c->u->u_name, @@ -4349,4 +4370,3 @@ return PyAST_CompileEx(mod, filename, flags, -1, arena); } - -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 22:28:15 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 22:28:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=239566=2C_=2319617?= =?utf-8?q?=3A_Fix_compilation_on_Windows?= Message-ID: <3dPKrv3mSdz7Lmt@mail.python.org> http://hg.python.org/cpython/rev/8d3e85dfa46f changeset: 87278:8d3e85dfa46f user: Victor Stinner date: Tue Nov 19 22:28:01 2013 +0100 summary: Issue #9566, #19617: Fix compilation on Windows INT32_MIN and INT32_MAX constants are unknown on Windows. files: Python/compile.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -1183,8 +1183,8 @@ /* Integer arguments are limit to 16-bit. There is an extension for 32-bit integer arguments. 
*/ - assert(INT32_MIN <= opcode); - assert(opcode <= INT32_MAX); + assert(-2147483648 <= opcode); + assert(opcode <= 2147483647); off = compiler_next_instr(c, c->u->u_curblock); if (off < 0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 23:03:48 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 23:03:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=239566=2C_=2319617?= =?utf-8?q?=3A_New_try_to_fix_compilation_on_Windows?= Message-ID: <3dPLdw1bcrz7M6H@mail.python.org> http://hg.python.org/cpython/rev/ee4da7291211 changeset: 87279:ee4da7291211 user: Victor Stinner date: Tue Nov 19 23:03:25 2013 +0100 summary: Issue #9566, #19617: New try to fix compilation on Windows Some compilers (ex: Visual Studio) decode -2147483648 as a unsigned integer instead of an signed integer. files: Python/compile.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -1183,7 +1183,7 @@ /* Integer arguments are limit to 16-bit. There is an extension for 32-bit integer arguments. */ - assert(-2147483648 <= opcode); + assert((-2147483647-1) <= opcode); assert(opcode <= 2147483647); off = compiler_next_instr(c, c->u->u_curblock); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 23:04:18 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 23:04:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Fix_outdated_example?= Message-ID: <3dPLfV2XHYz7M5t@mail.python.org> http://hg.python.org/peps/rev/6f43e38a4ae6 changeset: 5295:6f43e38a4ae6 user: Antoine Pitrou date: Tue Nov 19 23:04:11 2013 +0100 summary: Fix outdated example files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -211,7 +211,7 @@ >>> p = Path('/home/antoine/pathlib/setup.py') >>> p.name 'setup.py' - >>> p.ext + >>> p.suffix '.py' >>> p.root '/' -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 23:05:56 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 19 Nov 2013 23:05:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Fix_typo?= Message-ID: <3dPLhN6PdBz7M6X@mail.python.org> http://hg.python.org/peps/rev/1037304d357c changeset: 5296:1037304d357c user: Antoine Pitrou date: Tue Nov 19 23:05:51 2013 +0100 summary: Fix typo files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -152,7 +152,7 @@ In this proposal, the path classes do not derive from a builtin type. This contrasts with some other Path class proposals which were derived from ``str``. They also do not pretend to implement the sequence protocol: -if you want a path to act as a sequence, you have to lookup a dedicate +if you want a path to act as a sequence, you have to lookup a dedicated attribute (the ``parts`` attribute). 
By avoiding to pass as builtin types, the path classes minimize the potential -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 19 23:47:06 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 23:47:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319637=3A_fix_test?= =?utf-8?q?=5Fundecodable=5Fenv=28=29_of_test=5Fsubprocess_on_AIX?= Message-ID: <3dPMbt3tGYz7M7L@mail.python.org> http://hg.python.org/cpython/rev/e651036191ad changeset: 87280:e651036191ad user: Victor Stinner date: Tue Nov 19 23:46:06 2013 +0100 summary: Issue #19637: fix test_undecodable_env() of test_subprocess on AIX On AIX, the C locale encoding uses the ISO-8859-1 encoding, not ASCII. files: Lib/test/test_subprocess.py | 17 ++++++++++++----- 1 files changed, 12 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -1681,31 +1681,38 @@ def test_undecodable_env(self): for key, value in (('test', 'abc\uDCFF'), ('test\uDCFF', '42')): + encoded_value = value.encode("ascii", "surrogateescape") + # test str with surrogates script = "import os; print(ascii(os.getenv(%s)))" % repr(key) env = os.environ.copy() env[key] = value - # Use C locale to get ascii for the locale encoding to force + # Use C locale to get ASCII for the locale encoding to force # surrogate-escaping of \xFF in the child process; otherwise it can # be decoded as-is if the default locale is latin-1. env['LC_ALL'] = 'C' + if sys.platform.startswith("aix"): + # On AIX, the C locale uses the Latin1 encoding + decoded_value = encoded_value.decode("latin1", "surrogateescape") + else: + # On other UNIXes, the C locale uses the ASCII encoding + decoded_value = value stdout = subprocess.check_output( [sys.executable, "-c", script], env=env) stdout = stdout.rstrip(b'\n\r') - self.assertEqual(stdout.decode('ascii'), ascii(value)) + self.assertEqual(stdout.decode('ascii'), ascii(decoded_value)) # test bytes key = key.encode("ascii", "surrogateescape") - value = value.encode("ascii", "surrogateescape") script = "import os; print(ascii(os.getenvb(%s)))" % repr(key) env = os.environ.copy() - env[key] = value + env[key] = encoded_value stdout = subprocess.check_output( [sys.executable, "-c", script], env=env) stdout = stdout.rstrip(b'\n\r') - self.assertEqual(stdout.decode('ascii'), ascii(value)) + self.assertEqual(stdout.decode('ascii'), ascii(encoded_value)) def test_bytes_program(self): abs_program = os.fsencode(sys.executable) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 19 23:56:57 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 19 Nov 2013 23:56:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=239566=2C_=2319617?= =?utf-8?q?=3A_Fix_more_compiler_warnings_in_compile=2Ec_on_Windows_64-bit?= Message-ID: <3dPMqF3FZrz7M43@mail.python.org> http://hg.python.org/cpython/rev/116bd550e309 changeset: 87281:116bd550e309 user: Victor Stinner date: Tue Nov 19 23:56:34 2013 +0100 summary: Issue #9566, #19617: Fix more compiler warnings in compile.c on Windows 64-bit files: Python/compile.c | 23 +++++++++++++---------- 1 files changed, 13 insertions(+), 10 deletions(-) diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -120,8 +120,8 @@ PyObject *u_private; /* for private name mangling */ - int u_argcount; /* number of arguments for block */ - int 
u_kwonlyargcount; /* number of keyword only arguments for block */ + Py_ssize_t u_argcount; /* number of arguments for block */ + Py_ssize_t u_kwonlyargcount; /* number of keyword only arguments for block */ /* Pointer to the most recently allocated block. By following b_list members, you can reach all early allocated blocks. */ basicblock *u_blocks; @@ -170,7 +170,7 @@ static int compiler_next_instr(struct compiler *, basicblock *); static int compiler_addop(struct compiler *, int); static int compiler_addop_o(struct compiler *, int, PyObject *, PyObject *); -static int compiler_addop_i(struct compiler *, Py_ssize_t, Py_ssize_t); +static int compiler_addop_i(struct compiler *, int, Py_ssize_t); static int compiler_addop_j(struct compiler *, int, basicblock *, int); static basicblock *compiler_use_new_block(struct compiler *); static int compiler_error(struct compiler *, const char *); @@ -1074,7 +1074,7 @@ return 1; } -static int +static Py_ssize_t compiler_add_o(struct compiler *c, PyObject *dict, PyObject *o) { PyObject *t, *v; @@ -1176,22 +1176,22 @@ */ static int -compiler_addop_i(struct compiler *c, Py_ssize_t opcode, Py_ssize_t oparg) +compiler_addop_i(struct compiler *c, int opcode, Py_ssize_t oparg) { struct instr *i; int off; /* Integer arguments are limit to 16-bit. There is an extension for 32-bit integer arguments. */ - assert((-2147483647-1) <= opcode); - assert(opcode <= 2147483647); + assert((-2147483647-1) <= oparg); + assert(oparg <= 2147483647); off = compiler_next_instr(c, c->u->u_curblock); if (off < 0) return 0; i = &c->u->u_curblock->b_instr[off]; - i->i_opcode = Py_SAFE_DOWNCAST(opcode, Py_ssize_t, int); - i->i_oparg = oparg; + i->i_opcode = opcode; + i->i_oparg = Py_SAFE_DOWNCAST(oparg, Py_ssize_t, int); i->i_hasarg = 1; compiler_set_lineno(c, off); return 1; @@ -4213,6 +4213,7 @@ Py_ssize_t nlocals; int nlocals_int; int flags; + int argcount, kwonlyargcount; tmp = dict_keys_inorder(c->u->u_consts, 0); if (!tmp) @@ -4250,7 +4251,9 @@ Py_DECREF(consts); consts = tmp; - co = PyCode_New(c->u->u_argcount, c->u->u_kwonlyargcount, + argcount = Py_SAFE_DOWNCAST(c->u->u_argcount, Py_ssize_t, int); + kwonlyargcount = Py_SAFE_DOWNCAST(c->u->u_kwonlyargcount, Py_ssize_t, int); + co = PyCode_New(argcount, kwonlyargcount, nlocals_int, stackdepth(c), flags, bytecode, consts, names, varnames, freevars, cellvars, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 00:15:11 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 20 Nov 2013 00:15:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=5Fmsi=2Ec=3A_Fix_compiler?= =?utf-8?q?_warnings_on_Windows_64-bit?= Message-ID: <3dPNDH5KRWz7M4q@mail.python.org> http://hg.python.org/cpython/rev/c0e63937efa7 changeset: 87282:c0e63937efa7 user: Victor Stinner date: Wed Nov 20 00:14:49 2013 +0100 summary: _msi.c: Fix compiler warnings on Windows 64-bit "hf" type is INT_PTR, it is used to store an int in _msi.c. 
files: PC/_msi.c | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/PC/_msi.c b/PC/_msi.c --- a/PC/_msi.c +++ b/PC/_msi.c @@ -63,7 +63,7 @@ static FNFCIREAD(cb_read) { - UINT result = (UINT)_read(hf, memory, cb); + UINT result = (UINT)_read((int)hf, memory, cb); if (result != cb) *err = errno; return result; @@ -71,7 +71,7 @@ static FNFCIWRITE(cb_write) { - UINT result = (UINT)_write(hf, memory, cb); + UINT result = (UINT)_write((int)hf, memory, cb); if (result != cb) *err = errno; return result; @@ -79,7 +79,7 @@ static FNFCICLOSE(cb_close) { - int result = _close(hf); + int result = _close((int)hf); if (result != 0) *err = errno; return result; @@ -87,7 +87,7 @@ static FNFCISEEK(cb_seek) { - long result = (long)_lseek(hf, dist, seektype); + long result = (long)_lseek((int)hf, dist, seektype); if (result == -1) *err = errno; return result; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 00:46:44 2013 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 20 Nov 2013 00:46:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Assorted_comments_by_Jim_Jewe?= =?utf-8?q?tt?= Message-ID: <3dPNwh5Rq7z7Lmt@mail.python.org> http://hg.python.org/peps/rev/39106c2eb5a1 changeset: 5297:39106c2eb5a1 user: Antoine Pitrou date: Wed Nov 20 00:46:39 2013 +0100 summary: Assorted comments by Jim Jewett files: pep-0428.txt | 10 +++++++--- 1 files changed, 7 insertions(+), 3 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -155,8 +155,8 @@ if you want a path to act as a sequence, you have to lookup a dedicated attribute (the ``parts`` attribute). -By avoiding to pass as builtin types, the path classes minimize the potential -for confusion if they are combined by accident with genuine builtin types. +Not behaving like one of the basic builtin types also minimizes the potential +for confusion if a path is combined by accident with genuine builtin types. Immutability @@ -201,6 +201,9 @@ File "", line 1, in TypeError: unorderable types: PurePosixPath() < PureWindowsPath() +Paths compare unequal to, and are not orderable with instances of builtin +types (such as ``str``) and any other types. + Useful notations ---------------- @@ -315,7 +318,8 @@ PurePosixPath('//some/path') Calling the constructor without any argument creates a path object pointing -to the logical "current directory":: +to the logical "current directory" (without looking up its absolute path, +which is the job of the ``cwd()`` classmethod on concrete paths):: >>> PurePosixPath() PurePosixPath('.') -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 00:50:51 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 00:50:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Add_workaround?= =?utf-8?q?_for_VS_2010_nmake_clean_issue=2E_VS_2010_doesn=27t_set_up_PATH?= =?utf-8?q?_for?= Message-ID: <3dPP1R4DDLz7M5b@mail.python.org> http://hg.python.org/cpython/rev/748f007b23fd changeset: 87283:748f007b23fd branch: 3.3 parent: 87272:6e5afeada7ca user: Christian Heimes date: Wed Nov 20 00:41:29 2013 +0100 summary: Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH for nmake.exe correctly. 
files: Misc/NEWS | 3 +++ PC/python3.mak | 6 +++++- PCbuild/python3dll.vcxproj | 12 ++++++------ 3 files changed, 14 insertions(+), 7 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Build ----- +- Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH + for nmake.exe correctly. + What's New in Python 3.3.3? =========================== diff --git a/PC/python3.mak b/PC/python3.mak --- a/PC/python3.mak +++ b/PC/python3.mak @@ -5,6 +5,10 @@ lib /def:python33stub.def /out:$(OutDir)python33stub.lib /MACHINE:$(MACHINE) clean: - del $(OutDir)python3.dll $(OutDir)python3.lib $(OutDir)python33stub.lib $(OutDir)python3.exp $(OutDir)python33stub.exp + IF EXIST $(OutDir)python3.dll del $(OutDir)python3.dll + IF EXIST $(OutDir)python3.lib del $(OutDir)python3.lib + IF EXIST $(OutDir)python33stub.lib del $(OutDir)python33stub.lib + IF EXIST $(OutDir)python3.exp del $(OutDir)python3.exp + IF EXIST $(OutDir)python33stub.exp del $(OutDir)python33stub.exp rebuild: clean $(OutDir)python3.dll diff --git a/PCbuild/python3dll.vcxproj b/PCbuild/python3dll.vcxproj --- a/PCbuild/python3dll.vcxproj +++ b/PCbuild/python3dll.vcxproj @@ -99,7 +99,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -111,7 +111,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -123,7 +123,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -135,7 +135,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -147,7 +147,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -159,7 +159,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 00:50:53 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 00:50:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= 
=?utf-8?q?=29=3A_Add_workaround_for_VS_2010_nmake_clean_issue=2E_VS_2010_?= =?utf-8?q?doesn=27t_set_up_PATH_for?= Message-ID: <3dPP1T07CZz7M5b@mail.python.org> http://hg.python.org/cpython/rev/5645792a524f changeset: 87284:5645792a524f parent: 87282:c0e63937efa7 parent: 87283:748f007b23fd user: Christian Heimes date: Wed Nov 20 00:50:38 2013 +0100 summary: Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH for nmake.exe correctly. files: Misc/NEWS | 3 +++ PC/python3.mak | 6 +++++- PCbuild/python3dll.vcxproj | 12 ++++++------ 3 files changed, 14 insertions(+), 7 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -298,6 +298,9 @@ Build ----- +- Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH + for nmake.exe correctly. + - Issue #19550: Implement Windows installer changes of PEP 453 (ensurepip). - Issue #19520: Fix compiler warning in the _sha3 module on 32bit Windows. diff --git a/PC/python3.mak b/PC/python3.mak --- a/PC/python3.mak +++ b/PC/python3.mak @@ -5,6 +5,10 @@ lib /def:python34stub.def /out:$(OutDir)python34stub.lib /MACHINE:$(MACHINE) clean: - del $(OutDir)python3.dll $(OutDir)python3.lib $(OutDir)python34stub.lib $(OutDir)python3.exp $(OutDir)python34stub.exp + IF EXIST $(OutDir)python3.dll del $(OutDir)python3.dll + IF EXIST $(OutDir)python3.lib del $(OutDir)python3.lib + IF EXIST $(OutDir)python34stub.lib del $(OutDir)python34stub.lib + IF EXIST $(OutDir)python3.exp del $(OutDir)python3.exp + IF EXIST $(OutDir)python34stub.exp del $(OutDir)python34stub.exp rebuild: clean $(OutDir)python3.dll diff --git a/PCbuild/python3dll.vcxproj b/PCbuild/python3dll.vcxproj --- a/PCbuild/python3dll.vcxproj +++ b/PCbuild/python3dll.vcxproj @@ -99,7 +99,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -111,7 +111,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -123,7 +123,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -135,7 +135,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -147,7 +147,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x86 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) @@ -159,7 +159,7 @@ cd $(ProjectDir)\..\PC nmake /f python3.mak MACHINE=x64 
OutDir=$(OutDir) rebuild cd $(ProjectDir)\..\PC -nmake /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean +"$(VSInstallDir)\VC\bin\nmake.exe" /f python3.mak MACHINE=x64 OutDir=$(OutDir) clean $(OutDir)python3.dll $(NMakePreprocessorDefinitions) $(NMakeIncludeSearchPath) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 01:12:34 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 01:12:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogY29uZmlndXJlOiBl?= =?utf-8?q?cho_message_to_AS=5FMESSAGE=5FFD=2E_--silent_redirects_fd_to_/d?= =?utf-8?q?ev/null=2E?= Message-ID: <3dPPVV0Dmgz7M5S@mail.python.org> http://hg.python.org/cpython/rev/5517027519a4 changeset: 87285:5517027519a4 branch: 3.3 parent: 87283:748f007b23fd user: Christian Heimes date: Wed Nov 20 01:11:18 2013 +0100 summary: configure: echo message to AS_MESSAGE_FD. --silent redirects fd to /dev/null. files: configure | 6 +++--- configure.ac | 6 +++--- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/configure b/configure --- a/configure +++ b/configure @@ -16489,19 +16489,19 @@ fi -echo "creating Modules/Setup" +echo "creating Modules/Setup" >&6 if test ! -f Modules/Setup then cp $srcdir/Modules/Setup.dist Modules/Setup fi -echo "creating Modules/Setup.local" +echo "creating Modules/Setup.local" >&6 if test ! -f Modules/Setup.local then echo "# Edit this file for local setup changes" >Modules/Setup.local fi -echo "creating Makefile" +echo "creating Makefile" >&6 $SHELL $srcdir/Modules/makesetup -c $srcdir/Modules/config.c.in \ -s Modules Modules/Setup.config \ Modules/Setup.local Modules/Setup diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -4666,19 +4666,19 @@ AC_CONFIG_FILES([Modules/ld_so_aix], [chmod +x Modules/ld_so_aix]) AC_OUTPUT -echo "creating Modules/Setup" +echo "creating Modules/Setup" >&AS_MESSAGE_FD if test ! -f Modules/Setup then cp $srcdir/Modules/Setup.dist Modules/Setup fi -echo "creating Modules/Setup.local" +echo "creating Modules/Setup.local" >&AS_MESSAGE_FD if test ! -f Modules/Setup.local then echo "# Edit this file for local setup changes" >Modules/Setup.local fi -echo "creating Makefile" +echo "creating Makefile" >&AS_MESSAGE_FD $SHELL $srcdir/Modules/makesetup -c $srcdir/Modules/config.c.in \ -s Modules Modules/Setup.config \ Modules/Setup.local Modules/Setup -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 01:12:35 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 01:12:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_configure=3A_echo_message_to_AS=5FMESSAGE=5FFD=2E_--sile?= =?utf-8?q?nt_redirects_fd_to_/dev/null=2E?= Message-ID: <3dPPVW2mG4z7M68@mail.python.org> http://hg.python.org/cpython/rev/c9731d508785 changeset: 87286:c9731d508785 parent: 87284:5645792a524f parent: 87285:5517027519a4 user: Christian Heimes date: Wed Nov 20 01:11:33 2013 +0100 summary: configure: echo message to AS_MESSAGE_FD. --silent redirects fd to /dev/null. files: configure | 6 +++--- configure.ac | 6 +++--- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/configure b/configure --- a/configure +++ b/configure @@ -16530,19 +16530,19 @@ fi -echo "creating Modules/Setup" +echo "creating Modules/Setup" >&6 if test ! 
-f Modules/Setup then cp $srcdir/Modules/Setup.dist Modules/Setup fi -echo "creating Modules/Setup.local" +echo "creating Modules/Setup.local" >&6 if test ! -f Modules/Setup.local then echo "# Edit this file for local setup changes" >Modules/Setup.local fi -echo "creating Makefile" +echo "creating Makefile" >&6 $SHELL $srcdir/Modules/makesetup -c $srcdir/Modules/config.c.in \ -s Modules Modules/Setup.config \ Modules/Setup.local Modules/Setup diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -4698,19 +4698,19 @@ AC_CONFIG_FILES([Modules/ld_so_aix], [chmod +x Modules/ld_so_aix]) AC_OUTPUT -echo "creating Modules/Setup" +echo "creating Modules/Setup" >&AS_MESSAGE_FD if test ! -f Modules/Setup then cp $srcdir/Modules/Setup.dist Modules/Setup fi -echo "creating Modules/Setup.local" +echo "creating Modules/Setup.local" >&AS_MESSAGE_FD if test ! -f Modules/Setup.local then echo "# Edit this file for local setup changes" >Modules/Setup.local fi -echo "creating Makefile" +echo "creating Makefile" >&AS_MESSAGE_FD $SHELL $srcdir/Modules/makesetup -c $srcdir/Modules/config.c.in \ -s Modules Modules/Setup.config \ Modules/Setup.local Modules/Setup -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 01:18:34 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 01:18:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_configure=3A_use_AC=5FMSG?= =?utf-8?q?=5FNOTICE=28=29_instead_of_AC=5FMSG=5FWARN=28=29_to_inform_user?= =?utf-8?q?_about?= Message-ID: <3dPPdQ6CsSzQs2@mail.python.org> http://hg.python.org/cpython/rev/1300d94d6e88 changeset: 87287:1300d94d6e88 user: Christian Heimes date: Wed Nov 20 01:18:26 2013 +0100 summary: configure: use AC_MSG_NOTICE() instead of AC_MSG_WARN() to inform user about C++ compiler. Now './configure --silent && make -s' doesn't print any message to stdout or stderr. files: configure | 6 +++--- configure.ac | 2 +- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/configure b/configure --- a/configure +++ b/configure @@ -4803,16 +4803,16 @@ fi if test "$preset_cxx" != "$CXX" then - { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: + { $as_echo "$as_me:${as_lineno-$LINENO}: By default, distutils will build C++ extension modules with \"$CXX\". If this is not intended, then set CXX on the configure command line. " >&5 -$as_echo "$as_me: WARNING: +$as_echo "$as_me: By default, distutils will build C++ extension modules with \"$CXX\". If this is not intended, then set CXX on the configure command line. - " >&2;} + " >&6;} fi diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -693,7 +693,7 @@ fi if test "$preset_cxx" != "$CXX" then - AC_MSG_WARN([ + AC_MSG_NOTICE([ By default, distutils will build C++ extension modules with "$CXX". If this is not intended, then set CXX on the configure command line. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 02:46:53 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 02:46:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_explain_performance_implicati?= =?utf-8?q?ons?= Message-ID: <3dPRbK0wddzQLQ@mail.python.org> http://hg.python.org/peps/rev/fbe779221a7a changeset: 5298:fbe779221a7a user: Christian Heimes date: Wed Nov 20 02:46:44 2013 +0100 summary: explain performance implications move some paragraphs around link to benchmark and benchmark results files: pep-0456.txt | 67 ++++++++++++++++++++++++--------------- 1 files changed, 41 insertions(+), 26 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -296,9 +296,14 @@ However a fast function like DJBX33A is not as secure as SipHash24. A cutoff at about 5 to 7 bytes should provide a decent safety margin and speed up at the same time. The PEP's reference implementation provides such a cutoff with -``Py_HASH_CUTOFF`` but disables the optimization by default. Multiple runs of -Python's benchmark suite shows an average speedups between 3% and 5% for -benchmarks such as django_v2, mako and etree with a cutoff of 7 on 64 bit Linux. +``Py_HASH_CUTOFF``. The optimization is disabled by default for several +reasons. For one the security implications are unclear yet and should be +thoroughly studied before the optimization is enabled by default. Secondly +the performance benefits vary. On 64 bit Linux system with Intel Core i7 +multiple runs of Python's benchmark suite [pybench]_ show an average speedups +between 3% and 5% for benchmarks such as django_v2, mako and etree with a +cutoff of 7. Benchmarks with X86 binaries and Windows X86_64 builds on the +same machine are a bit slower with small string optimization. C API additions @@ -506,33 +511,23 @@ buffer. -Further things to consider -========================== - -ASCII str / bytes hash collision --------------------------------- - -Since the implementation of [pep-0393]_ bytes and ASCII text have the same -memory layout. Because of this the new hashing API will keep the invariant:: - - hash("ascii string") == hash(b"ascii string") - -for ASCII string and ASCII bytes. Equal hash values result in a hash collision -and therefore cause a minor speed penalty for dicts and sets with mixed keys. -The cause of the collision could be removed by e.g. subtracting ``2`` from -the hash value of bytes. (``-2`` because ``hash(b"") == 0`` and ``-1`` is -reserved.) - - Performance =========== -TBD +In general the PEP 456 code with SipHash24 is about as fast as the old code +with FNV. SipHash24 seems to make better use of modern compilers, CPUs and +large L1 cache. Several benchmarks show a small speed improvement on 64 bit +CPUs such as Intel Core i5 and Intel Core i7 processes. 32 bit builds and +benchmarks on older CPUs such as an AMD Athlon X2 are slightly slower with +SipHash24. The performance increase or decrease are so small that they should +not affect any application code. -First tests suggest that SipHash performs a bit faster on 64-bit CPUs when -it is fed with medium size byte strings as well as ASCII and UCS2 Unicode -strings. For very short strings the setup cost for SipHash dominates its -speed but it is still in the same order of magnitude as the current FNV code. +The benchmarks were conducted on CPython default branch revision b08868fd5994 +and the PEP repository [pep-456-repos]_. 
All upstream changes were merged +into the pep-456 branch. The "performance" CPU governor was configured and +almost all programs were stopped so the benchmarks were able to utilize +TurboBoost and the CPU caches as much as possible. The raw benchmark results +of multiple machines and platforms are made available at [benchmarks]_. Hash value distribution @@ -630,6 +625,20 @@ under rare circumstances. The PEP implementation is optimized and simplified for the common case. +ASCII str / bytes hash collision +-------------------------------- + +Since the implementation of [pep-0393]_ bytes and ASCII text have the same +memory layout. Because of this the new hashing API will keep the invariant:: + + hash("ascii string") == hash(b"ascii string") + +for ASCII string and ASCII bytes. Equal hash values result in a hash collision +and therefore cause a minor speed penalty for dicts and sets with mixed keys. +The cause of the collision could be removed by e.g. subtracting ``2`` from +the hash value of bytes. ``-2`` because ``hash(b"") == 0`` and ``-1`` is +reserved. The PEP doesn't change the hash value. + References ========== @@ -672,6 +681,12 @@ .. [alignmentmyth] http://lemire.me/blog/archives/2012/05/31/data-alignment-for-speed-myth-or-reality/ +.. [pybench] http://hg.python.org/benchmarks/ + +.. [benchmarks] https://bitbucket.org/tiran/pep-456-benchmarks/src + +.. [pep-456-repos] http://hg.python.org/features/pep-456 + Copyright ========= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 02:57:49 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 02:57:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_small_string_optimization_wil?= =?utf-8?q?l_either_be_enabled_or_removed_before_beta_2_is?= Message-ID: <3dPRqx3MsczQBQ@mail.python.org> http://hg.python.org/peps/rev/bd006cee937b changeset: 5299:bd006cee937b user: Christian Heimes date: Wed Nov 20 02:57:42 2013 +0100 summary: small string optimization will either be enabled or removed before beta 2 is released files: pep-0456.txt | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -9,7 +9,7 @@ Content-Type: text/x-rst Created: 27-Sep-2013 Python-Version: 3.4 -Post-History: 06-Oct-2013, 14-Nov-2013 +Post-History: 06-Oct-2013, 14-Nov-2013, 20-Nov-2013 Abstract @@ -305,6 +305,10 @@ cutoff of 7. Benchmarks with X86 binaries and Windows X86_64 builds on the same machine are a bit slower with small string optimization. +The state of small string optimization will be assessed during the beta phase +of Python 3.4. The feature will either be enabled with appropriate values +or the code will be removed before beta 2 is released. 
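For reference, the small-string path discussed in the paragraphs above amounts to a few lines of DJBX33A. A hedged Python transliteration of the inline routine in the PEP's reference implementation (the function name and the 64-bit mask are mine for illustration; the C code operates on Py_uhash_t and takes its suffix from _Py_HashSecret.djbx33a, and it only runs for inputs shorter than Py_HASH_CUTOFF):

    def djbx33a_small(data, suffix, mask=2**64 - 1):
        # DJBX33A starts with 5381; each step is hash = hash * 33 + byte,
        # which the C code spells as ((hash << 5) + hash) + *p.
        h = 5381
        for byte in data:
            h = (h * 33 + byte) & mask
        h ^= len(data)    # mix in the length, as the C version does
        h ^= suffix       # per-process random suffix from the hash secret
        return h

    # Example: hash a short ASCII byte string under an arbitrary suffix.
    print(djbx33a_small(b"abc", suffix=0x1234))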
+ C API additions =============== -- Repository URL: http://hg.python.org/peps From solipsis at pitrou.net Wed Nov 20 07:38:14 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 20 Nov 2013 07:38:14 +0100 Subject: [Python-checkins] Daily reference leaks (1300d94d6e88): sum=55 Message-ID: results for 1300d94d6e88 on branch "default" -------------------------------------------- test_multiprocessing_spawn leaked [0, 38, 0] references, sum=38 test_multiprocessing_spawn leaked [0, 17, 0] memory blocks, sum=17 test_site leaked [0, 2, -2] references, sum=0 test_site leaked [0, 2, -2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogFeTXuz', '-x'] From victor.stinner at gmail.com Tue Nov 12 23:52:56 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Nov 2013 23:52:56 +0100 Subject: [Python-checkins] cpython: Provide a more readable representation of socket on repr(). In-Reply-To: <3dK35t3mfqz7LkL@mail.python.org> References: <3dK35t3mfqz7LkL@mail.python.org> Message-ID: Hi Giampaolo, You forgot to update tests after your change in repr(socket). Tests are failing on buildbots, just one example: ====================================================================== FAIL: test_repr (test.test_socket.GeneralModuleTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/var/lib/buildslave/3.x.murray-gentoo/build/Lib/test/test_socket.py", line 653, in test_repr self.assertIn('family=%i' % socket.AF_INET, repr(s)) AssertionError: 'family=2' not found in "" ---------------------------------------------------------------------- Victor 2013/11/12 giampaolo.rodola : > http://hg.python.org/cpython/rev/c5751f01b09b > changeset: 87074:c5751f01b09b > parent: 85942:0d079c66dc23 > user: Giampaolo Rodola' > date: Thu Oct 03 21:01:43 2013 +0200 > summary: > Provide a more readable representation of socket on repr(). > > Before: > > > Now: > > > files: > Lib/socket.py | 2 +- > 1 files changed, 1 insertions(+), 1 deletions(-) > > > diff --git a/Lib/socket.py b/Lib/socket.py > --- a/Lib/socket.py > +++ b/Lib/socket.py > @@ -136,7 +136,7 @@ > address(es). 
> """ > closed = getattr(self, '_closed', False) > - s = "<%s.%s%s fd=%i, family=%i, type=%i, proto=%i" \ > + s = "<%s.%s%s fd=%i, family=%s, type=%s, proto=%i" \ > % (self.__class__.__module__, > self.__class__.__name__, > " [closed]" if closed else "", > > -- > Repository URL: http://hg.python.org/cpython > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > From python-checkins at python.org Wed Nov 20 11:37:27 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 11:37:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Nick_has_accepted_my_PEP=2C_t?= =?utf-8?q?hanks_a_lot?= Message-ID: <3dPgMW5Qd9zSG3@mail.python.org> http://hg.python.org/peps/rev/2877998a58b5 changeset: 5300:2877998a58b5 user: Christian Heimes date: Wed Nov 20 11:37:19 2013 +0100 summary: Nick has accepted my PEP, thanks a lot files: pep-0456.txt | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/pep-0456.txt b/pep-0456.txt --- a/pep-0456.txt +++ b/pep-0456.txt @@ -4,12 +4,13 @@ Last-Modified: $Date$ Author: Christian Heimes BDFL-Delegate: Nick Coghlan -Status: Draft +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 27-Sep-2013 Python-Version: 3.4 Post-History: 06-Oct-2013, 14-Nov-2013, 20-Nov-2013 +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130400.html Abstract -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 11:46:32 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 11:46:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_ssue_=2319183=3A_Implement?= =?utf-8?q?_PEP_456_=27secure_and_interchangeable_hash_algorithm=27=2E?= Message-ID: <3dPgZ03w2RzSpt@mail.python.org> http://hg.python.org/cpython/rev/adb471b9cba1 changeset: 87288:adb471b9cba1 user: Christian Heimes date: Wed Nov 20 11:46:18 2013 +0100 summary: ssue #19183: Implement PEP 456 'secure and interchangeable hash algorithm'. Python now uses SipHash24 on all major platforms. 
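For readers who prefer to study the algorithm outside of C, the following is a rough pure-Python transliteration of the SipHash-2-4 routine added below in Python/pyhash.c. It is only an illustration: the names are mine, 64-bit arithmetic is emulated with masking, and the real implementation additionally narrows the result to Py_hash_t (XOR-folding the upper half on 32-bit builds) and remaps -1 to -2 in _Py_HashBytes().

    MASK64 = 2**64 - 1

    def _rotl(x, b):
        return ((x << b) | (x >> (64 - b))) & MASK64

    def _round(v0, v1, v2, v3):
        # One SipRound; the C code packs two of these into DOUBLE_ROUND.
        v0 = (v0 + v1) & MASK64
        v1 = _rotl(v1, 13) ^ v0
        v0 = _rotl(v0, 32)
        v2 = (v2 + v3) & MASK64
        v3 = _rotl(v3, 16) ^ v2
        v0 = (v0 + v3) & MASK64
        v3 = _rotl(v3, 21) ^ v0
        v2 = (v2 + v1) & MASK64
        v1 = _rotl(v1, 17) ^ v2
        v2 = _rotl(v2, 32)
        return v0, v1, v2, v3

    def siphash24(k0, k1, data):
        # Initialize the four lanes from the 128-bit key (two 64-bit halves).
        v0 = k0 ^ 0x736f6d6570736575
        v1 = k1 ^ 0x646f72616e646f6d
        v2 = k0 ^ 0x6c7967656e657261
        v3 = k1 ^ 0x7465646279746573
        # Compression: two rounds per 8-byte little-endian word.
        full = len(data) - (len(data) % 8)
        for i in range(0, full, 8):
            mi = int.from_bytes(data[i:i + 8], 'little')
            v3 ^= mi
            v0, v1, v2, v3 = _round(v0, v1, v2, v3)
            v0, v1, v2, v3 = _round(v0, v1, v2, v3)
            v0 ^= mi
        # Final word: leftover bytes plus the message length in the top byte.
        b = ((len(data) & 0xff) << 56) | int.from_bytes(data[full:], 'little')
        v3 ^= b
        v0, v1, v2, v3 = _round(v0, v1, v2, v3)
        v0, v1, v2, v3 = _round(v0, v1, v2, v3)
        v0 ^= b
        # Finalization: four more rounds, then XOR the lanes together.
        v2 ^= 0xff
        for _ in range(4):
            v0, v1, v2, v3 = _round(v0, v1, v2, v3)
        return v0 ^ v1 ^ v2 ^ v3

    # Example with an arbitrary key; k0/k1 stand in for the randomized
    # _Py_HashSecret.siphash halves the interpreter draws at startup.
    print(hex(siphash24(0x0706050403020100, 0x0f0e0d0c0b0a0908, b"abc")))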
files: Doc/library/sys.rst | 11 + Doc/license.rst | 29 + Doc/whatsnew/3.4.rst | 1 + Include/Python.h | 1 + Include/object.h | 17 - Include/pyhash.h | 147 +++++ Include/pyport.h | 19 +- Lib/test/regrtest.py | 2 + Lib/test/test_hash.py | 150 +++++- Lib/test/test_sys.py | 23 +- Makefile.pre.in | 2 + Misc/ACKS | 1 + Misc/NEWS | 3 + Modules/pyexpat.c | 2 +- Objects/bytesobject.c | 2 +- Objects/memoryobject.c | 2 +- Objects/object.c | 146 ----- Objects/unicodeobject.c | 35 +- PCbuild/pythoncore.vcxproj | 2 + PCbuild/pythoncore.vcxproj.filters | 6 + Python/pyhash.c | 430 +++++++++++++++++ Python/pythonrun.c | 3 + Python/random.c | 13 +- Python/sysmodule.c | 20 +- configure | 116 ++++- configure.ac | 72 ++- pyconfig.h.in | 16 + 27 files changed, 1029 insertions(+), 242 deletions(-) diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -594,9 +594,20 @@ | :const:`imag` | multiplier used for the imaginary part of a | | | complex number | +---------------------+--------------------------------------------------+ + | :const:`algorithm` | name of the algorithm for hashing of str, bytes, | + | | and memoryview | + +---------------------+--------------------------------------------------+ + | :const:`hash_bits` | internal output size of the hash algorithm | + +---------------------+--------------------------------------------------+ + | :const:`seed_bits` | size of the seed key of the hash algorithm | + +---------------------+--------------------------------------------------+ + .. versionadded:: 3.2 + .. versionchanged: 3.4 + Added *algorithm*, *hash_bits* and *seed_bits* + .. data:: hexversion diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -609,6 +609,35 @@ http://creativecommons.org/publicdomain/zero/1.0/ +SipHash24 +--------- + +The file :file:`Python/pyhash.c` contains Marek Majkowski' implementation of +Dan Bernstein's SipHash24 algorithm. The contains the following note:: + + + Copyright (c) 2013 Marek Majkowski + + Permission is hereby granted, free of charge, to any person obtaining a copy + of this software and associated documentation files (the "Software"), to deal + in the Software without restriction, including without limitation the rights + to use, copy, modify, merge, publish, distribute, sublicense, and/or sell + copies of the Software, and to permit persons to whom the Software is + furnished to do so, subject to the following conditions: + + The above copyright notice and this permission notice shall be included in + all copies or substantial portions of the Software. + + + Original location: + https://github.com/majek/csiphash/ + + Solution inspired by code from: + Samuel Neves (supercop/crypto_auth/siphash24/little) + djb (supercop/crypto_auth/siphash24/little2) + Jean-Philippe Aumasson (https://131002.net/siphash/siphash24.c) + + strtod and dtoa --------------- diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -116,6 +116,7 @@ * :ref:`PEP 442: Safe object finalization ` * :ref:`PEP 445: Configurable memory allocators ` +* :pep:`456` Secure and interchangeable hash algorithm * Improve finalization of Python modules to avoid setting their globals to None, in most cases (:issue:`18214`). * A more efficient :mod:`marshal` format (:issue:`16475`). 
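The sys.rst hunk above documents three new fields, and the sysmodule.c change later in this patch adds a fourth (cutoff), so sys.hash_info grows from five to nine entries. A quick interactive check on a build that includes this patch; the values shown in the comments are typical, not guaranteed:

    import sys

    info = sys.hash_info
    print(len(info))        # 9 fields with this patch, up from 5
    print(info.algorithm)   # usually 'siphash24'; 'fnv' where it is unusable
    print(info.hash_bits)   # internal output size, 64 for siphash24
    print(info.seed_bits)   # size of the randomized key, 128 for siphash24
    print(info.cutoff)      # small-string optimization cutoff, 0 when disabled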
diff --git a/Include/Python.h b/Include/Python.h --- a/Include/Python.h +++ b/Include/Python.h @@ -68,6 +68,7 @@ #include "object.h" #include "objimpl.h" #include "typeslots.h" +#include "pyhash.h" #include "pydebug.h" diff --git a/Include/object.h b/Include/object.h --- a/Include/object.h +++ b/Include/object.h @@ -562,23 +562,6 @@ PyAPI_FUNC(int) Py_ReprEnter(PyObject *); PyAPI_FUNC(void) Py_ReprLeave(PyObject *); -/* Helpers for hash functions */ -#ifndef Py_LIMITED_API -PyAPI_FUNC(Py_hash_t) _Py_HashDouble(double); -PyAPI_FUNC(Py_hash_t) _Py_HashPointer(void*); -PyAPI_FUNC(Py_hash_t) _Py_HashBytes(unsigned char*, Py_ssize_t); -#endif - -typedef struct { - Py_hash_t prefix; - Py_hash_t suffix; -} _Py_HashSecret_t; -PyAPI_DATA(_Py_HashSecret_t) _Py_HashSecret; - -#ifdef Py_DEBUG -PyAPI_DATA(int) _Py_HashSecret_Initialized; -#endif - /* Helper for passing objects to printf and the like */ #define PyObject_REPR(obj) _PyUnicode_AsString(PyObject_Repr(obj)) diff --git a/Include/pyhash.h b/Include/pyhash.h new file mode 100644 --- /dev/null +++ b/Include/pyhash.h @@ -0,0 +1,147 @@ +#ifndef Py_HASH_H + +#define Py_HASH_H +#ifdef __cplusplus +extern "C" { +#endif + +/* Helpers for hash functions */ +#ifndef Py_LIMITED_API +PyAPI_FUNC(Py_hash_t) _Py_HashDouble(double); +PyAPI_FUNC(Py_hash_t) _Py_HashPointer(void*); +PyAPI_FUNC(Py_hash_t) _Py_HashBytes(const void*, Py_ssize_t); +#endif + +/* Prime multiplier used in string and various other hashes. */ +#define _PyHASH_MULTIPLIER 1000003UL /* 0xf4243 */ + +/* Parameters used for the numeric hash implementation. See notes for + _Py_HashDouble in Objects/object.c. Numeric hashes are based on + reduction modulo the prime 2**_PyHASH_BITS - 1. */ + +#if SIZEOF_VOID_P >= 8 +# define _PyHASH_BITS 61 +#else +# define _PyHASH_BITS 31 +#endif + +#define _PyHASH_MODULUS (((size_t)1 << _PyHASH_BITS) - 1) +#define _PyHASH_INF 314159 +#define _PyHASH_NAN 0 +#define _PyHASH_IMAG _PyHASH_MULTIPLIER + + +/* hash secret + * + * memory layout on 64 bit systems + * cccccccc cccccccc cccccccc uc -- unsigned char[24] + * pppppppp ssssssss ........ fnv -- two Py_hash_t + * k0k0k0k0 k1k1k1k1 ........ siphash -- two PY_UINT64_T + * ........ ........ ssssssss djbx33a -- 16 bytes padding + one Py_hash_t + * ........ ........ eeeeeeee pyexpat XML hash salt + * + * memory layout on 32 bit systems + * cccccccc cccccccc cccccccc uc + * ppppssss ........ ........ fnv -- two Py_hash_t + * k0k0k0k0 k1k1k1k1 ........ siphash -- two PY_UINT64_T (*) + * ........ ........ ssss.... djbx33a -- 16 bytes padding + one Py_hash_t + * ........ ........ eeee.... pyexpat XML hash salt + * + * (*) The siphash member may not be available on 32 bit platforms without + * an unsigned int64 data type. + */ +typedef union { + /* ensure 24 bytes */ + unsigned char uc[24]; + /* two Py_hash_t for FNV */ + struct { + Py_hash_t prefix; + Py_hash_t suffix; + } fnv; +#ifdef PY_UINT64_T + /* two uint64 for SipHash24 */ + struct { + PY_UINT64_T k0; + PY_UINT64_T k1; + } siphash; +#endif + /* a different (!) 
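The test changes that follow exercise hash randomization by spawning child interpreters with a controlled PYTHONHASHSEED, which is also the easiest way to watch the new hashing at work. A stand-alone version of that pattern (the helper name is mine; the mechanism mirrors what the updated Lib/test/test_hash.py does):

    import os
    import subprocess
    import sys

    def hash_of_abc(seed=None):
        # Run a child interpreter, optionally pinning PYTHONHASHSEED, and
        # report what it computes for hash('abc').
        env = os.environ.copy()
        if seed is None:
            env.pop('PYTHONHASHSEED', None)     # leave randomization enabled
        else:
            env['PYTHONHASHSEED'] = str(seed)   # 0 disables randomization
        out = subprocess.check_output(
            [sys.executable, '-c', "print(hash('abc'))"], env=env)
        return int(out)

    print(hash_of_abc(0))    # stable across runs: randomization disabled
    print(hash_of_abc())     # randomized: almost certainly changes per run
    print(hash_of_abc())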
Py_hash_t for small string optimization */ + struct { + unsigned char padding[16]; + Py_hash_t suffix; + } djbx33a; + struct { + unsigned char padding[16]; + Py_hash_t hashsalt; + } expat; +} _Py_HashSecret_t; +PyAPI_DATA(_Py_HashSecret_t) _Py_HashSecret; + +#ifdef Py_DEBUG +PyAPI_DATA(int) _Py_HashSecret_Initialized; +#endif + + +/* hash function definition */ +#ifndef Py_LIMITED_API +typedef struct { + Py_hash_t (*const hash)(const void *, Py_ssize_t); + const char *name; + const int hash_bits; + const int seed_bits; +} PyHash_FuncDef; + +PyAPI_FUNC(PyHash_FuncDef*) PyHash_GetFuncDef(void); +#endif + + +/* cutoff for small string DJBX33A optimization in range [1, cutoff). + * + * About 50% of the strings in a typical Python application are smaller than + * 6 to 7 chars. However DJBX33A is vulnerable to hash collision attacks. + * NEVER use DJBX33A for long strings! + * + * A Py_HASH_CUTOFF of 0 disables small string optimization. 32 bit platforms + * should use a smaller cutoff because it is easier to create colliding + * strings. A cutoff of 7 on 64bit platforms and 5 on 32bit platforms should + * provide a decent safety margin. + */ +#ifndef Py_HASH_CUTOFF +# define Py_HASH_CUTOFF 0 +#elif (Py_HASH_CUTOFF > 7 || Py_HASH_CUTOFF < 0) +# error Py_HASH_CUTOFF must in range 0...7. +#endif /* Py_HASH_CUTOFF */ + + +/* hash algorithm selection + * + * The values for Py_HASH_SIPHASH24 and Py_HASH_FNV are hard-coded in the + * configure script. + * + * - FNV is available on all platforms and architectures. + * - SIPHASH24 only works on plaforms that provide PY_UINT64_T and doesn't + * require aligned memory for integers. + * - With EXTERNAL embedders can provide an alternative implementation with:: + * + * PyHash_FuncDef PyHash_Func = {...}; + * + * XXX: Figure out __declspec() for extern PyHash_FuncDef. + */ +#define Py_HASH_EXTERNAL 0 +#define Py_HASH_SIPHASH24 1 +#define Py_HASH_FNV 2 + +#ifndef Py_HASH_ALGORITHM +# if (defined(PY_UINT64_T) && defined(PY_UINT32_T) \ + && !defined(HAVE_ALIGNED_REQUIRED)) +# define Py_HASH_ALGORITHM Py_HASH_SIPHASH24 +# else +# define Py_HASH_ALGORITHM Py_HASH_FNV +# endif /* uint64_t && uint32_t && aligned */ +#endif /* Py_HASH_ALGORITHM */ + +#ifdef __cplusplus +} +#endif + +#endif /* !Py_HASH_H */ diff --git a/Include/pyport.h b/Include/pyport.h --- a/Include/pyport.h +++ b/Include/pyport.h @@ -144,23 +144,6 @@ #endif #endif -/* Prime multiplier used in string and various other hashes. */ -#define _PyHASH_MULTIPLIER 1000003UL /* 0xf4243 */ - -/* Parameters used for the numeric hash implementation. See notes for - _Py_HashDouble in Objects/object.c. Numeric hashes are based on - reduction modulo the prime 2**_PyHASH_BITS - 1. */ - -#if SIZEOF_VOID_P >= 8 -#define _PyHASH_BITS 61 -#else -#define _PyHASH_BITS 31 -#endif -#define _PyHASH_MODULUS (((size_t)1 << _PyHASH_BITS) - 1) -#define _PyHASH_INF 314159 -#define _PyHASH_NAN 0 -#define _PyHASH_IMAG _PyHASH_MULTIPLIER - /* uintptr_t is the C9X name for an unsigned integral type such that a * legitimate void* can be cast to uintptr_t and then back to void* again * without loss of information. Similarly for intptr_t, wrt a signed @@ -199,8 +182,10 @@ #endif /* Py_hash_t is the same size as a pointer. */ +#define SIZEOF_PY_HASH_T SIZEOF_SIZE_T typedef Py_ssize_t Py_hash_t; /* Py_uhash_t is the unsigned equivalent needed to calculate numeric hash. */ +#define SIZEOF_PY_UHASH_T SIZEOF_SIZE_T typedef size_t Py_uhash_t; /* Largest possible value of size_t. 
diff --git a/Lib/test/regrtest.py b/Lib/test/regrtest.py --- a/Lib/test/regrtest.py +++ b/Lib/test/regrtest.py @@ -601,6 +601,8 @@ print("==", platform.python_implementation(), *sys.version.split()) print("== ", platform.platform(aliased=True), "%s-endian" % sys.byteorder) + print("== ", "hash algorithm:", sys.hash_info.algorithm, + "64bit" if sys.maxsize > 2**32 else "32bit") print("== ", os.getcwd()) print("Testing with flags:", sys.flags) diff --git a/Lib/test/test_hash.py b/Lib/test/test_hash.py --- a/Lib/test/test_hash.py +++ b/Lib/test/test_hash.py @@ -12,6 +12,40 @@ IS_64BIT = sys.maxsize > 2**32 +def lcg(x, length=16): + """Linear congruential generator""" + if x == 0: + return bytes(length) + out = bytearray(length) + for i in range(length): + x = (214013 * x + 2531011) & 0x7fffffff + out[i] = (x >> 16) & 0xff + return bytes(out) + +def pysiphash(uint64): + """Convert SipHash24 output to Py_hash_t + """ + assert 0 <= uint64 < (1 << 64) + # simple unsigned to signed int64 + if uint64 > (1 << 63) - 1: + int64 = uint64 - (1 << 64) + else: + int64 = uint64 + # mangle uint64 to uint32 + uint32 = (uint64 ^ uint64 >> 32) & 0xffffffff + # simple unsigned to signed int32 + if uint32 > (1 << 31) - 1: + int32 = uint32 - (1 << 32) + else: + int32 = uint32 + return int32, int64 + +def skip_unless_internalhash(test): + """Skip decorator for tests that depend on SipHash24 or FNV""" + ok = sys.hash_info.algorithm in {"fnv", "siphash24"} + msg = "Requires SipHash24 or FNV" + return test if ok else unittest.skip(msg)(test) + class HashEqualityTestCase(unittest.TestCase): @@ -138,7 +172,7 @@ # an object to be tested def get_hash_command(self, repr_): - return 'print(hash(%s))' % repr_ + return 'print(hash(eval(%s.decode("utf-8"))))' % repr_.encode("utf-8") def get_hash(self, repr_, seed=None): env = os.environ.copy() @@ -161,12 +195,67 @@ self.assertNotEqual(run1, run2) class StringlikeHashRandomizationTests(HashRandomizationTests): + repr_ = None + repr_long = None + + # 32bit little, 64bit little, 32bit big, 64bit big + known_hashes = { + 'djba33x': [ # only used for small strings + # seed 0, 'abc' + [193485960, 193485960, 193485960, 193485960], + # seed 42, 'abc' + [-678966196, 573763426263223372, -820489388, -4282905804826039665], + ], + 'siphash24': [ + # seed 0, 'abc' + [2025351752, 4596069200710135518, 1433332804, + -3481057401533226760], + # seed 42, 'abc' + [-774632014, -4501618152524544106, 1054608210, + -1493500025205289231], + # seed 42, 'abcdefghijk' + [-1436007334, 4436719588892876975, -1436007334, + 4436719588892876975], + # seed 0, '????', PyUCS2 layout depends on endianess + [1386693832, 5749986484189612790, 1776982909, + -5915111450199468540], + # seed 42, '????' + [1260387190, -2947981342227738144, 1430287772, + -4296699217652516017], + ], + 'fnv': [ + # seed 0, 'abc' + [-1600925533, 1453079729188098211, -1600925533, + 1453079729188098211], + # seed 42, 'abc' + [-206076799, -4410911502303878509, -1024014457, + -3570150969479994130], + # seed 42, 'abcdefghijk' + [811136751, -5046230049376118746, -77208053 , + -4779029615281019666], + # seed 0, '????' + [44402817, 8998297579845987431, -1956240331, + -782697888614047887], + # seed 42, '????' 
+ [-283066365, -4576729883824601543, -271871407, None], + ] + } + + def get_expected_hash(self, position, length): + if length < sys.hash_info.cutoff: + algorithm = "djba33x" + else: + algorithm = sys.hash_info.algorithm + if sys.byteorder == 'little': + platform = 1 if IS_64BIT else 0 + else: + assert(sys.byteorder == 'big') + platform = 3 if IS_64BIT else 2 + return self.known_hashes[algorithm][position][platform] + def test_null_hash(self): # PYTHONHASHSEED=0 disables the randomized hash - if IS_64BIT: - known_hash_of_obj = 1453079729188098211 - else: - known_hash_of_obj = -1600925533 + known_hash_of_obj = self.get_expected_hash(0, 3) # Randomization is enabled by default: self.assertNotEqual(self.get_hash(self.repr_), known_hash_of_obj) @@ -174,39 +263,53 @@ # It can also be disabled by setting the seed to 0: self.assertEqual(self.get_hash(self.repr_, seed=0), known_hash_of_obj) + @skip_unless_internalhash def test_fixed_hash(self): # test a fixed seed for the randomized hash # Note that all types share the same values: - if IS_64BIT: - if sys.byteorder == 'little': - h = -4410911502303878509 - else: - h = -3570150969479994130 - else: - if sys.byteorder == 'little': - h = -206076799 - else: - h = -1024014457 + h = self.get_expected_hash(1, 3) self.assertEqual(self.get_hash(self.repr_, seed=42), h) + @skip_unless_internalhash + def test_long_fixed_hash(self): + if self.repr_long is None: + return + h = self.get_expected_hash(2, 11) + self.assertEqual(self.get_hash(self.repr_long, seed=42), h) + + class StrHashRandomizationTests(StringlikeHashRandomizationTests, unittest.TestCase): repr_ = repr('abc') + repr_long = repr('abcdefghijk') + repr_ucs2 = repr('????') + @skip_unless_internalhash def test_empty_string(self): self.assertEqual(hash(""), 0) + @skip_unless_internalhash + def test_ucs2_string(self): + h = self.get_expected_hash(3, 6) + self.assertEqual(self.get_hash(self.repr_ucs2, seed=0), h) + h = self.get_expected_hash(4, 6) + self.assertEqual(self.get_hash(self.repr_ucs2, seed=42), h) + class BytesHashRandomizationTests(StringlikeHashRandomizationTests, unittest.TestCase): repr_ = repr(b'abc') + repr_long = repr(b'abcdefghijk') + @skip_unless_internalhash def test_empty_string(self): self.assertEqual(hash(b""), 0) class MemoryviewHashRandomizationTests(StringlikeHashRandomizationTests, unittest.TestCase): repr_ = "memoryview(b'abc')" + repr_long = "memoryview(b'abcdefghijk')" + @skip_unless_internalhash def test_empty_string(self): self.assertEqual(hash(memoryview(b"")), 0) @@ -224,5 +327,22 @@ repr_ = repr(datetime.time(0)) +class HashDistributionTestCase(unittest.TestCase): + + def test_hash_distribution(self): + # check for hash collision + base = "abcdefghabcdefg" + for i in range(1, len(base)): + prefix = base[:i] + s15 = set() + s255 = set() + for c in range(256): + h = hash(prefix + chr(c)) + s15.add(h & 0xf) + s255.add(h & 0xff) + # SipHash24 distribution depends on key, usually > 60% + self.assertGreater(len(s15), 8, prefix) + self.assertGreater(len(s255), 128, prefix) + if __name__ == "__main__": unittest.main() diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -8,6 +8,7 @@ import codecs import gc import sysconfig +import platform # count the number of test runs, used to create unique # strings to intern in test_intern() @@ -431,7 +432,7 @@ self.assertEqual(type(sys.int_info.sizeof_digit), int) self.assertIsInstance(sys.hexversion, int) - self.assertEqual(len(sys.hash_info), 5) + 
self.assertEqual(len(sys.hash_info), 9) self.assertLess(sys.hash_info.modulus, 2**sys.hash_info.width) # sys.hash_info.modulus should be a prime; we do a quick # probable primality test (doesn't exclude the possibility of @@ -446,6 +447,26 @@ self.assertIsInstance(sys.hash_info.inf, int) self.assertIsInstance(sys.hash_info.nan, int) self.assertIsInstance(sys.hash_info.imag, int) + algo = sysconfig.get_config_var("PY_HASH_ALGORITHM") + if sys.hash_info.algorithm in {"fnv", "siphash24"}: + self.assertIn(sys.hash_info.hash_bits, {32, 64}) + self.assertIn(sys.hash_info.seed_bits, {32, 64, 128}) + + if algo == 1: + self.assertEqual(sys.hash_info.algorithm, "siphash24") + elif algo == 2: + self.assertEqual(sys.hash_info.algorithm, "fnv") + else: + processor = platform.processor().lower() + if processor in {"sparc", "mips"}: + self.assertEqual(sys.hash_info.algorithm, "fnv") + else: + self.assertEqual(sys.hash_info.algorithm, "siphash24") + else: + # PY_HASH_EXTERNAL + self.assertEqual(algo, 0) + self.assertGreaterEqual(sys.hash_info.cutoff, 0) + self.assertLess(sys.hash_info.cutoff, 8) self.assertIsInstance(sys.maxsize, int) self.assertIsInstance(sys.maxunicode, int) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -366,6 +366,7 @@ Python/pyarena.o \ Python/pyctype.o \ Python/pyfpe.o \ + Python/pyhash.o \ Python/pymath.o \ Python/pystate.o \ Python/pythonrun.o \ @@ -868,6 +869,7 @@ $(srcdir)/Include/pydebug.h \ $(srcdir)/Include/pyerrors.h \ $(srcdir)/Include/pyfpe.h \ + $(srcdir)/Include/pyhash.h \ $(srcdir)/Include/pymath.h \ $(srcdir)/Include/pygetopt.h \ $(srcdir)/Include/pymacro.h \ diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -802,6 +802,7 @@ Don MacMillen Tomasz Ma?kowiak Steve Majewski +Marek Majkowski Grzegorz Makarewicz David Malcolm Greg Malcolm diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #19183: Implement PEP 456 'secure and interchangeable hash algorithm'. + Python now uses SipHash24 on all major platforms. + - Issue #12892: The utf-16* and utf-32* encoders no longer allow surrogate code points (U+D800-U+DFFF) to be encoded. The utf-32* decoders no longer decode byte sequences that correspond to surrogate code points. The surrogatepass diff --git a/Modules/pyexpat.c b/Modules/pyexpat.c --- a/Modules/pyexpat.c +++ b/Modules/pyexpat.c @@ -1218,7 +1218,7 @@ * has a backport of this feature where we also define XML_HAS_SET_HASH_SALT * to indicate that we can still use it. 
*/ XML_SetHashSalt(self->itself, - (unsigned long)_Py_HashSecret.prefix); + (unsigned long)_Py_HashSecret.expat.hashsalt); #endif XML_SetUserData(self->itself, (void *)self); XML_SetUnknownEncodingHandler(self->itself, diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -897,7 +897,7 @@ { if (a->ob_shash == -1) { /* Can't fail */ - a->ob_shash = _Py_HashBytes((unsigned char *) a->ob_sval, Py_SIZE(a)); + a->ob_shash = _Py_HashBytes(a->ob_sval, Py_SIZE(a)); } return a->ob_shash; } diff --git a/Objects/memoryobject.c b/Objects/memoryobject.c --- a/Objects/memoryobject.c +++ b/Objects/memoryobject.c @@ -2742,7 +2742,7 @@ } /* Can't fail */ - self->hash = _Py_HashBytes((unsigned char *)mem, view->len); + self->hash = _Py_HashBytes(mem, view->len); if (mem != view->buf) PyMem_Free(mem); diff --git a/Objects/object.c b/Objects/object.c --- a/Objects/object.c +++ b/Objects/object.c @@ -731,150 +731,6 @@ return ok; } -/* Set of hash utility functions to help maintaining the invariant that - if a==b then hash(a)==hash(b) - - All the utility functions (_Py_Hash*()) return "-1" to signify an error. -*/ - -/* For numeric types, the hash of a number x is based on the reduction - of x modulo the prime P = 2**_PyHASH_BITS - 1. It's designed so that - hash(x) == hash(y) whenever x and y are numerically equal, even if - x and y have different types. - - A quick summary of the hashing strategy: - - (1) First define the 'reduction of x modulo P' for any rational - number x; this is a standard extension of the usual notion of - reduction modulo P for integers. If x == p/q (written in lowest - terms), the reduction is interpreted as the reduction of p times - the inverse of the reduction of q, all modulo P; if q is exactly - divisible by P then define the reduction to be infinity. So we've - got a well-defined map - - reduce : { rational numbers } -> { 0, 1, 2, ..., P-1, infinity }. - - (2) Now for a rational number x, define hash(x) by: - - reduce(x) if x >= 0 - -reduce(-x) if x < 0 - - If the result of the reduction is infinity (this is impossible for - integers, floats and Decimals) then use the predefined hash value - _PyHASH_INF for x >= 0, or -_PyHASH_INF for x < 0, instead. - _PyHASH_INF, -_PyHASH_INF and _PyHASH_NAN are also used for the - hashes of float and Decimal infinities and nans. - - A selling point for the above strategy is that it makes it possible - to compute hashes of decimal and binary floating-point numbers - efficiently, even if the exponent of the binary or decimal number - is large. The key point is that - - reduce(x * y) == reduce(x) * reduce(y) (modulo _PyHASH_MODULUS) - - provided that {reduce(x), reduce(y)} != {0, infinity}. The reduction of a - binary or decimal float is never infinity, since the denominator is a power - of 2 (for binary) or a divisor of a power of 10 (for decimal). So we have, - for nonnegative x, - - reduce(x * 2**e) == reduce(x) * reduce(2**e) % _PyHASH_MODULUS - - reduce(x * 10**e) == reduce(x) * reduce(10**e) % _PyHASH_MODULUS - - and reduce(10**e) can be computed efficiently by the usual modular - exponentiation algorithm. For reduce(2**e) it's even better: since - P is of the form 2**n-1, reduce(2**e) is 2**(e mod n), and multiplication - by 2**(e mod n) modulo 2**n-1 just amounts to a rotation of bits. - - */ - -Py_hash_t -_Py_HashDouble(double v) -{ - int e, sign; - double m; - Py_uhash_t x, y; - - if (!Py_IS_FINITE(v)) { - if (Py_IS_INFINITY(v)) - return v > 0 ? 
_PyHASH_INF : -_PyHASH_INF; - else - return _PyHASH_NAN; - } - - m = frexp(v, &e); - - sign = 1; - if (m < 0) { - sign = -1; - m = -m; - } - - /* process 28 bits at a time; this should work well both for binary - and hexadecimal floating point. */ - x = 0; - while (m) { - x = ((x << 28) & _PyHASH_MODULUS) | x >> (_PyHASH_BITS - 28); - m *= 268435456.0; /* 2**28 */ - e -= 28; - y = (Py_uhash_t)m; /* pull out integer part */ - m -= y; - x += y; - if (x >= _PyHASH_MODULUS) - x -= _PyHASH_MODULUS; - } - - /* adjust for the exponent; first reduce it modulo _PyHASH_BITS */ - e = e >= 0 ? e % _PyHASH_BITS : _PyHASH_BITS-1-((-1-e) % _PyHASH_BITS); - x = ((x << e) & _PyHASH_MODULUS) | x >> (_PyHASH_BITS - e); - - x = x * sign; - if (x == (Py_uhash_t)-1) - x = (Py_uhash_t)-2; - return (Py_hash_t)x; -} - -Py_hash_t -_Py_HashPointer(void *p) -{ - Py_hash_t x; - size_t y = (size_t)p; - /* bottom 3 or 4 bits are likely to be 0; rotate y by 4 to avoid - excessive hash collisions for dicts and sets */ - y = (y >> 4) | (y << (8 * SIZEOF_VOID_P - 4)); - x = (Py_hash_t)y; - if (x == -1) - x = -2; - return x; -} - -Py_hash_t -_Py_HashBytes(unsigned char *p, Py_ssize_t len) -{ - Py_uhash_t x; - Py_ssize_t i; - - /* - We make the hash of the empty string be 0, rather than using - (prefix ^ suffix), since this slightly obfuscates the hash secret - */ -#ifdef Py_DEBUG - assert(_Py_HashSecret_Initialized); -#endif - if (len == 0) { - return 0; - } - x = (Py_uhash_t) _Py_HashSecret.prefix; - x ^= (Py_uhash_t) *p << 7; - for (i = 0; i < len; i++) - x = (_PyHASH_MULTIPLIER * x) ^ (Py_uhash_t) *p++; - x ^= (Py_uhash_t) len; - x ^= (Py_uhash_t) _Py_HashSecret.suffix; - if (x == -1) - x = -2; - return x; -} - Py_hash_t PyObject_HashNotImplemented(PyObject *v) { @@ -883,8 +739,6 @@ return -1; } -_Py_HashSecret_t _Py_HashSecret; - Py_hash_t PyObject_Hash(PyObject *v) { diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -11386,39 +11386,8 @@ _PyUnicode_HASH(self) = 0; return 0; } - - /* The hash function as a macro, gets expanded three times below. 
*/ -#define HASH(P) \ - x ^= (Py_uhash_t) *P << 7; \ - while (--len >= 0) \ - x = (_PyHASH_MULTIPLIER * x) ^ (Py_uhash_t) *P++; \ - - x = (Py_uhash_t) _Py_HashSecret.prefix; - switch (PyUnicode_KIND(self)) { - case PyUnicode_1BYTE_KIND: { - const unsigned char *c = PyUnicode_1BYTE_DATA(self); - HASH(c); - break; - } - case PyUnicode_2BYTE_KIND: { - const Py_UCS2 *s = PyUnicode_2BYTE_DATA(self); - HASH(s); - break; - } - default: { - Py_UCS4 *l; - assert(PyUnicode_KIND(self) == PyUnicode_4BYTE_KIND && - "Impossible switch case in unicode_hash"); - l = PyUnicode_4BYTE_DATA(self); - HASH(l); - break; - } - } - x ^= (Py_uhash_t) PyUnicode_GET_LENGTH(self); - x ^= (Py_uhash_t) _Py_HashSecret.suffix; - - if (x == -1) - x = -2; + x = _Py_HashBytes(PyUnicode_DATA(self), + PyUnicode_GET_LENGTH(self) * PyUnicode_KIND(self)); _PyUnicode_HASH(self) = x; return x; } diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -412,6 +412,7 @@ + @@ -616,6 +617,7 @@ + diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -421,6 +421,9 @@ Python + + Include + @@ -931,6 +934,9 @@ Modules + + Python + diff --git a/Python/pyhash.c b/Python/pyhash.c new file mode 100644 --- /dev/null +++ b/Python/pyhash.c @@ -0,0 +1,430 @@ +/* Set of hash utility functions to help maintaining the invariant that + if a==b then hash(a)==hash(b) + + All the utility functions (_Py_Hash*()) return "-1" to signify an error. +*/ +#include "Python.h" + +#ifdef __APPLE__ +# include +#elif defined(HAVE_LE64TOH) && defined(HAVE_ENDIAN_H) +# include +#elif defined(HAVE_LE64TOH) && defined(HAVE_SYS_ENDIAN_H) +# include +#endif + +#ifdef __cplusplus +extern "C" { +#endif + +_Py_HashSecret_t _Py_HashSecret; + +#if Py_HASH_ALGORITHM == Py_HASH_EXTERNAL +extern PyHash_FuncDef PyHash_Func; +#else +static PyHash_FuncDef PyHash_Func; +#endif + +/* Count _Py_HashBytes() calls */ +#ifdef Py_HASH_STATS +#define Py_HASH_STATS_MAX 32 +static Py_ssize_t hashstats[Py_HASH_STATS_MAX + 1] = {0}; +#endif + +/* For numeric types, the hash of a number x is based on the reduction + of x modulo the prime P = 2**_PyHASH_BITS - 1. It's designed so that + hash(x) == hash(y) whenever x and y are numerically equal, even if + x and y have different types. + + A quick summary of the hashing strategy: + + (1) First define the 'reduction of x modulo P' for any rational + number x; this is a standard extension of the usual notion of + reduction modulo P for integers. If x == p/q (written in lowest + terms), the reduction is interpreted as the reduction of p times + the inverse of the reduction of q, all modulo P; if q is exactly + divisible by P then define the reduction to be infinity. So we've + got a well-defined map + + reduce : { rational numbers } -> { 0, 1, 2, ..., P-1, infinity }. + + (2) Now for a rational number x, define hash(x) by: + + reduce(x) if x >= 0 + -reduce(-x) if x < 0 + + If the result of the reduction is infinity (this is impossible for + integers, floats and Decimals) then use the predefined hash value + _PyHASH_INF for x >= 0, or -_PyHASH_INF for x < 0, instead. + _PyHASH_INF, -_PyHASH_INF and _PyHASH_NAN are also used for the + hashes of float and Decimal infinities and nans. 
+ + A selling point for the above strategy is that it makes it possible + to compute hashes of decimal and binary floating-point numbers + efficiently, even if the exponent of the binary or decimal number + is large. The key point is that + + reduce(x * y) == reduce(x) * reduce(y) (modulo _PyHASH_MODULUS) + + provided that {reduce(x), reduce(y)} != {0, infinity}. The reduction of a + binary or decimal float is never infinity, since the denominator is a power + of 2 (for binary) or a divisor of a power of 10 (for decimal). So we have, + for nonnegative x, + + reduce(x * 2**e) == reduce(x) * reduce(2**e) % _PyHASH_MODULUS + + reduce(x * 10**e) == reduce(x) * reduce(10**e) % _PyHASH_MODULUS + + and reduce(10**e) can be computed efficiently by the usual modular + exponentiation algorithm. For reduce(2**e) it's even better: since + P is of the form 2**n-1, reduce(2**e) is 2**(e mod n), and multiplication + by 2**(e mod n) modulo 2**n-1 just amounts to a rotation of bits. + + */ + +Py_hash_t +_Py_HashDouble(double v) +{ + int e, sign; + double m; + Py_uhash_t x, y; + + if (!Py_IS_FINITE(v)) { + if (Py_IS_INFINITY(v)) + return v > 0 ? _PyHASH_INF : -_PyHASH_INF; + else + return _PyHASH_NAN; + } + + m = frexp(v, &e); + + sign = 1; + if (m < 0) { + sign = -1; + m = -m; + } + + /* process 28 bits at a time; this should work well both for binary + and hexadecimal floating point. */ + x = 0; + while (m) { + x = ((x << 28) & _PyHASH_MODULUS) | x >> (_PyHASH_BITS - 28); + m *= 268435456.0; /* 2**28 */ + e -= 28; + y = (Py_uhash_t)m; /* pull out integer part */ + m -= y; + x += y; + if (x >= _PyHASH_MODULUS) + x -= _PyHASH_MODULUS; + } + + /* adjust for the exponent; first reduce it modulo _PyHASH_BITS */ + e = e >= 0 ? e % _PyHASH_BITS : _PyHASH_BITS-1-((-1-e) % _PyHASH_BITS); + x = ((x << e) & _PyHASH_MODULUS) | x >> (_PyHASH_BITS - e); + + x = x * sign; + if (x == (Py_uhash_t)-1) + x = (Py_uhash_t)-2; + return (Py_hash_t)x; +} + +Py_hash_t +_Py_HashPointer(void *p) +{ + Py_hash_t x; + size_t y = (size_t)p; + /* bottom 3 or 4 bits are likely to be 0; rotate y by 4 to avoid + excessive hash collisions for dicts and sets */ + y = (y >> 4) | (y << (8 * SIZEOF_VOID_P - 4)); + x = (Py_hash_t)y; + if (x == -1) + x = -2; + return x; +} + +Py_hash_t +_Py_HashBytes(const void *src, Py_ssize_t len) +{ + Py_hash_t x; + /* + We make the hash of the empty string be 0, rather than using + (prefix ^ suffix), since this slightly obfuscates the hash secret + */ + if (len == 0) { + return 0; + } + +#ifdef Py_HASH_STATS + hashstats[(len <= Py_HASH_STATS_MAX) ? len : 0]++; +#endif + +#if Py_HASH_CUTOFF > 0 + if (len < Py_HASH_CUTOFF) { + /* Optimize hashing of very small strings with inline DJBX33A. 
*/ + Py_uhash_t hash; + const unsigned char *p = src; + hash = 5381; /* DJBX33A starts with 5381 */ + + switch(len) { + /* ((hash << 5) + hash) + *p == hash * 33 + *p */ + case 7: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 6: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 5: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 4: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 3: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 2: hash = ((hash << 5) + hash) + *p++; /* fallthrough */ + case 1: hash = ((hash << 5) + hash) + *p++; break; + default: + assert(0); + } + hash ^= len; + hash ^= (Py_uhash_t) _Py_HashSecret.djbx33a.suffix; + x = (Py_hash_t)hash; + } + else +#endif /* Py_HASH_CUTOFF */ + x = PyHash_Func.hash(src, len); + + if (x == -1) + return -2; + return x; +} + +void +_PyHash_Fini(void) +{ +#ifdef Py_HASH_STATS + int i; + Py_ssize_t total = 0; + char *fmt = "%2i %8" PY_FORMAT_SIZE_T "d %8" PY_FORMAT_SIZE_T "d\n"; + + fprintf(stderr, "len calls total\n"); + for (i = 1; i <= Py_HASH_STATS_MAX; i++) { + total += hashstats[i]; + fprintf(stderr, fmt, i, hashstats[i], total); + } + total += hashstats[0]; + fprintf(stderr, "> %8" PY_FORMAT_SIZE_T "d %8" PY_FORMAT_SIZE_T "d\n", + hashstats[0], total); +#endif +} + +PyHash_FuncDef * +PyHash_GetFuncDef(void) +{ + return &PyHash_Func; +} + +/* Optimized memcpy() for Windows */ +#ifdef _MSC_VER +# if SIZEOF_PY_UHASH_T == 4 +# define PY_UHASH_CPY(dst, src) do { \ + dst[0] = src[0]; dst[1] = src[1]; dst[2] = src[2]; dst[3] = src[3]; \ + } while(0) +# elif SIZEOF_PY_UHASH_T == 8 +# define PY_UHASH_CPY(dst, src) do { \ + dst[0] = src[0]; dst[1] = src[1]; dst[2] = src[2]; dst[3] = src[3]; \ + dst[4] = src[4]; dst[5] = src[5]; dst[6] = src[6]; dst[7] = src[7]; \ + } while(0) +# else +# error SIZEOF_PY_UHASH_T must be 4 or 8 +# endif /* SIZEOF_PY_UHASH_T */ +#else /* not Windows */ +# define PY_UHASH_CPY(dst, src) memcpy(dst, src, SIZEOF_PY_UHASH_T) +#endif /* _MSC_VER */ + + +#if Py_HASH_ALGORITHM == Py_HASH_FNV +/* ************************************************************************** + * Modified Fowler-Noll-Vo (FNV) hash function + */ +static Py_hash_t +fnv(const void *src, Py_ssize_t len) +{ + const unsigned char *p = src; + Py_uhash_t x; + Py_ssize_t remainder, blocks; + union { + Py_uhash_t value; + unsigned char bytes[SIZEOF_PY_UHASH_T]; + } block; + +#ifdef Py_DEBUG + assert(_Py_HashSecret_Initialized); +#endif + remainder = len % SIZEOF_PY_UHASH_T; + if (remainder == 0) { + /* Process at least one block byte by byte to reduce hash collisions + * for strings with common prefixes. 
*/ + remainder = SIZEOF_PY_UHASH_T; + } + blocks = (len - remainder) / SIZEOF_PY_UHASH_T; + + x = (Py_uhash_t) _Py_HashSecret.fnv.prefix; + x ^= (Py_uhash_t) *p << 7; + while (blocks--) { + PY_UHASH_CPY(block.bytes, p); + x = (_PyHASH_MULTIPLIER * x) ^ block.value; + p += SIZEOF_PY_UHASH_T; + } + /* add remainder */ + for (; remainder > 0; remainder--) + x = (_PyHASH_MULTIPLIER * x) ^ (Py_uhash_t) *p++; + x ^= (Py_uhash_t) len; + x ^= (Py_uhash_t) _Py_HashSecret.fnv.suffix; + if (x == -1) { + x = -2; + } + return x; +} + +static PyHash_FuncDef PyHash_Func = {fnv, "fnv", 8 * SIZEOF_PY_HASH_T, + 16 * SIZEOF_PY_HASH_T}; + +#endif /* Py_HASH_ALGORITHM == Py_HASH_FNV */ + + +#if Py_HASH_ALGORITHM == Py_HASH_SIPHASH24 +/* ************************************************************************** + + Copyright (c) 2013 Marek Majkowski + + Permission is hereby granted, free of charge, to any person obtaining a copy + of this software and associated documentation files (the "Software"), to deal + in the Software without restriction, including without limitation the rights + to use, copy, modify, merge, publish, distribute, sublicense, and/or sell + copies of the Software, and to permit persons to whom the Software is + furnished to do so, subject to the following conditions: + + The above copyright notice and this permission notice shall be included in + all copies or substantial portions of the Software. + + THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, + OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN + THE SOFTWARE. + + + Original location: + https://github.com/majek/csiphash/ + + Solution inspired by code from: + Samuel Neves (supercop/crypto_auth/siphash24/little) + djb (supercop/crypto_auth/siphash24/little2) + Jean-Philippe Aumasson (https://131002.net/siphash/siphash24.c) + + Modified for Python by Christian Heimes: + - C89 / MSVC compatibility + - PY_UINT64_T, PY_UINT32_T and PY_UINT8_T + - _rotl64() on Windows + - letoh64() fallback +*/ + +typedef unsigned char PY_UINT8_T; + +/* byte swap little endian to host endian + * Endian conversion not only ensures that the hash function returns the same + * value on all platforms. It is also required to for a good dispersion of + * the hash values' least significant bits. 
+ */ +#if PY_LITTLE_ENDIAN +# define _le64toh(x) ((PY_UINT64_T)(x)) +#elif defined(__APPLE__) +# define _le64toh(x) OSSwapLittleToHostInt64(x) +#elif defined(HAVE_LETOH64) +# define _le64toh(x) le64toh(x) +#else +# define _le64toh(x) (((PY_UINT64_T)(x) << 56) | \ + (((PY_UINT64_T)(x) << 40) & 0xff000000000000ULL) | \ + (((PY_UINT64_T)(x) << 24) & 0xff0000000000ULL) | \ + (((PY_UINT64_T)(x) << 8) & 0xff00000000ULL) | \ + (((PY_UINT64_T)(x) >> 8) & 0xff000000ULL) | \ + (((PY_UINT64_T)(x) >> 24) & 0xff0000ULL) | \ + (((PY_UINT64_T)(x) >> 40) & 0xff00ULL) | \ + ((PY_UINT64_T)(x) >> 56)) +#endif + + +#ifdef _MSC_VER +# define ROTATE(x, b) _rotl64(x, b) +#else +# define ROTATE(x, b) (PY_UINT64_T)( ((x) << (b)) | ( (x) >> (64 - (b))) ) +#endif + +#define HALF_ROUND(a,b,c,d,s,t) \ + a += b; c += d; \ + b = ROTATE(b, s) ^ a; \ + d = ROTATE(d, t) ^ c; \ + a = ROTATE(a, 32); + +#define DOUBLE_ROUND(v0,v1,v2,v3) \ + HALF_ROUND(v0,v1,v2,v3,13,16); \ + HALF_ROUND(v2,v1,v0,v3,17,21); \ + HALF_ROUND(v0,v1,v2,v3,13,16); \ + HALF_ROUND(v2,v1,v0,v3,17,21); + + +static Py_hash_t +siphash24(const void *src, Py_ssize_t src_sz) { + PY_UINT64_T k0 = _le64toh(_Py_HashSecret.siphash.k0); + PY_UINT64_T k1 = _le64toh(_Py_HashSecret.siphash.k1); + PY_UINT64_T b = (PY_UINT64_T)src_sz << 56; + const PY_UINT64_T *in = (PY_UINT64_T*)src; + + PY_UINT64_T v0 = k0 ^ 0x736f6d6570736575ULL; + PY_UINT64_T v1 = k1 ^ 0x646f72616e646f6dULL; + PY_UINT64_T v2 = k0 ^ 0x6c7967656e657261ULL; + PY_UINT64_T v3 = k1 ^ 0x7465646279746573ULL; + + PY_UINT64_T t; + PY_UINT8_T *pt; + PY_UINT8_T *m; + + while (src_sz >= 8) { + PY_UINT64_T mi = _le64toh(*in); + in += 1; + src_sz -= 8; + v3 ^= mi; + DOUBLE_ROUND(v0,v1,v2,v3); + v0 ^= mi; + } + + t = 0; + pt = (PY_UINT8_T *)&t; + m = (PY_UINT8_T *)in; + switch (src_sz) { + case 7: pt[6] = m[6]; + case 6: pt[5] = m[5]; + case 5: pt[4] = m[4]; + case 4: *((PY_UINT32_T*)&pt[0]) = *((PY_UINT32_T*)&m[0]); break; + case 3: pt[2] = m[2]; + case 2: pt[1] = m[1]; + case 1: pt[0] = m[0]; + } + b |= _le64toh(t); + + v3 ^= b; + DOUBLE_ROUND(v0,v1,v2,v3); + v0 ^= b; + v2 ^= 0xff; + DOUBLE_ROUND(v0,v1,v2,v3); + DOUBLE_ROUND(v0,v1,v2,v3); + + /* modified */ + t = (v0 ^ v1) ^ (v2 ^ v3); +#if SIZEOF_VOID_P == 4 + t ^= (t >> 32); +#endif + return (Py_hash_t)t; +} + +static PyHash_FuncDef PyHash_Func = {siphash24, "siphash24", 64, 128}; + +#endif /* Py_HASH_ALGORITHM == Py_HASH_SIPHASH24 */ + +#ifdef __cplusplus +} +#endif diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -104,6 +104,7 @@ extern void PyLong_Fini(void); extern int _PyFaulthandler_Init(void); extern void _PyFaulthandler_Fini(void); +extern void _PyHash_Fini(void); #ifdef WITH_THREAD extern void _PyGILState_Init(PyInterpreterState *, PyThreadState *); @@ -650,6 +651,8 @@ #ifdef COUNT_ALLOCS dump_counts(stdout); #endif + /* dump hash stats */ + _PyHash_Fini(); PRINT_TOTAL_REFS(); diff --git a/Python/random.c b/Python/random.c --- a/Python/random.c +++ b/Python/random.c @@ -95,7 +95,7 @@ /* Read size bytes from /dev/urandom into buffer. Call Py_FatalError() on error. 
*/ static void -dev_urandom_noraise(char *buffer, Py_ssize_t size) +dev_urandom_noraise(unsigned char *buffer, Py_ssize_t size) { int fd; Py_ssize_t n; @@ -249,8 +249,9 @@ _PyRandom_Init(void) { char *env; - void *secret = &_Py_HashSecret; + unsigned char *secret = (unsigned char *)&_Py_HashSecret.uc; Py_ssize_t secret_size = sizeof(_Py_HashSecret_t); + assert(secret_size == sizeof(_Py_HashSecret.uc)); if (_Py_HashSecret_Initialized) return; @@ -278,17 +279,17 @@ memset(secret, 0, secret_size); } else { - lcg_urandom(seed, (unsigned char*)secret, secret_size); + lcg_urandom(seed, secret, secret_size); } } else { #ifdef MS_WINDOWS - (void)win32_urandom((unsigned char *)secret, secret_size, 0); + (void)win32_urandom(secret, secret_size, 0); #else /* #ifdef MS_WINDOWS */ # ifdef __VMS - vms_urandom((unsigned char *)secret, secret_size, 0); + vms_urandom(secret, secret_size, 0); # else - dev_urandom_noraise((char*)secret, secret_size); + dev_urandom_noraise(secret, secret_size); # endif #endif } diff --git a/Python/sysmodule.c b/Python/sysmodule.c --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -658,7 +658,7 @@ "hash_info\n\ \n\ A struct sequence providing parameters used for computing\n\ -numeric hashes. The attributes are read only."); +hashes. The attributes are read only."); static PyStructSequence_Field hash_info_fields[] = { {"width", "width of the type used for hashing, in bits"}, @@ -667,6 +667,11 @@ {"inf", "value to be used for hash of a positive infinity"}, {"nan", "value to be used for hash of a nan"}, {"imag", "multiplier used for the imaginary part of a complex number"}, + {"algorithm", "name of the algorithm for hashing of str, bytes and " + "memoryviews"}, + {"hash_bits", "internal output size of hash algorithm"}, + {"seed_bits", "seed size of hash algorithm"}, + {"cutoff", "small string optimization cutoff"}, {NULL, NULL} }; @@ -674,7 +679,7 @@ "sys.hash_info", hash_info_doc, hash_info_fields, - 5, + 9, }; static PyObject * @@ -682,9 +687,11 @@ { PyObject *hash_info; int field = 0; + PyHash_FuncDef *hashfunc; hash_info = PyStructSequence_New(&Hash_InfoType); if (hash_info == NULL) return NULL; + hashfunc = PyHash_GetFuncDef(); PyStructSequence_SET_ITEM(hash_info, field++, PyLong_FromLong(8*sizeof(Py_hash_t))); PyStructSequence_SET_ITEM(hash_info, field++, @@ -695,6 +702,14 @@ PyLong_FromLong(_PyHASH_NAN)); PyStructSequence_SET_ITEM(hash_info, field++, PyLong_FromLong(_PyHASH_IMAG)); + PyStructSequence_SET_ITEM(hash_info, field++, + PyUnicode_FromString(hashfunc->name)); + PyStructSequence_SET_ITEM(hash_info, field++, + PyLong_FromLong(hashfunc->hash_bits)); + PyStructSequence_SET_ITEM(hash_info, field++, + PyLong_FromLong(hashfunc->seed_bits)); + PyStructSequence_SET_ITEM(hash_info, field++, + PyLong_FromLong(Py_HASH_CUTOFF)); if (PyErr_Occurred()) { Py_CLEAR(hash_info); return NULL; @@ -1338,6 +1353,7 @@ executable -- absolute path of the executable binary of the Python interpreter\n\ float_info -- a struct sequence with information about the float implementation.\n\ float_repr_style -- string indicating the style of repr() output for floats\n\ +hash_info -- a struct sequence with information about the hash algorithm.\n\ hexversion -- version information encoded as a single integer\n\ implementation -- Python implementation information.\n\ int_info -- a struct sequence with information about the int implementation.\n\ diff --git a/configure b/configure --- a/configure +++ b/configure @@ -792,6 +792,7 @@ enable_shared enable_profiling with_pydebug +with_hash_algorithm 
with_libs with_system_expat with_system_ffi @@ -1465,6 +1466,8 @@ compiler --with-suffix=.exe set executable suffix --with-pydebug build with Py_DEBUG defined + --with-hash-algorithm=[fnv|siphash24] + select hash algorithm --with-libs='lib1 ...' link against additional libs --with-system-expat build pyexpat module using an installed expat library @@ -6956,7 +6959,8 @@ sys/stat.h sys/syscall.h sys/sys_domain.h sys/termio.h sys/time.h \ sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \ libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ -bluetooth/bluetooth.h linux/tipc.h spawn.h util.h alloca.h +bluetooth/bluetooth.h linux/tipc.h spawn.h util.h alloca.h endian.h \ +sys/endian.h do : as_ac_Header=`$as_echo "ac_cv_header_$ac_header" | $as_tr_sh` ac_fn_c_check_header_mongrel "$LINENO" "$ac_header" "$as_ac_Header" "$ac_includes_default" @@ -7330,6 +7334,43 @@ fi +# byte swapping +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for le64toh" >&5 +$as_echo_n "checking for le64toh... " >&6; } +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +#ifdef HAVE_ENDIAN_H +#include +#elif defined(HAVE_SYS_ENDIAN_H) +#include +#endif + +int +main () +{ + + le64toh(1) + ; + return 0; +} + +_ACEOF +if ac_fn_c_try_link "$LINENO"; then : + ac_cv_has_le64toh=yes +else + ac_cv_has_le64toh=no +fi +rm -f core conftest.err conftest.$ac_objext \ + conftest$ac_exeext conftest.$ac_ext +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_has_le64toh" >&5 +$as_echo "$ac_cv_has_le64toh" >&6; } +if test "$ac_cv_has_le64toh" = "yes"; then + +$as_echo "#define HAVE_HTOLE64 1" >>confdefs.h + +fi + # Enabling LFS on Solaris (2.6 to 9) with gcc 2.95 triggers a bug in # the system headers: If _XOPEN_SOURCE and _LARGEFILE_SOURCE are # defined, but the compiler does not support pragma redefine_extname, @@ -8987,6 +9028,79 @@ *) ;; esac +# check for systems that require aligned memory access +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking aligned memory access is required" >&5 +$as_echo_n "checking aligned memory access is required... " >&6; } +if test "$cross_compiling" = yes; then : + aligned_required=yes +else + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int main() +{ + char s[16]; + int i, *p1, *p2; + for (i=0; i < 16; i++) + s[i] = i; + p1 = (int*)(s+1); + p2 = (int*)(s+2); + if (*p1 == *p2) + return 1; + return 0; +} + +_ACEOF +if ac_fn_c_try_run "$LINENO"; then : + aligned_required=no +else + aligned_required=yes +fi +rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ + conftest.$ac_objext conftest.beam conftest.$ac_ext +fi + + +if test "$aligned_required" = yes ; then + +$as_echo "#define HAVE_ALIGNED_REQUIRED 1" >>confdefs.h + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $aligned_required" >&5 +$as_echo "$aligned_required" >&6; } + + +# str, bytes and memoryview hash algorithm + + +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-hash-algorithm" >&5 +$as_echo_n "checking for --with-hash-algorithm... " >&6; } + +# Check whether --with-hash_algorithm was given. +if test "${with_hash_algorithm+set}" = set; then : + withval=$with_hash_algorithm; +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $withval" >&5 +$as_echo "$withval" >&6; } +case "$withval" in + siphash24) + $as_echo "#define Py_HASH_ALGORITHM 1" >>confdefs.h + + ;; + fnv) + $as_echo "#define Py_HASH_ALGORITHM 2" >>confdefs.h + + ;; + *) + as_fn_error $? 
"unknown hash algorithm '$withval'" "$LINENO" 5 + ;; +esac + +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: default" >&5 +$as_echo "default" >&6; } +fi + + # Most SVR4 platforms (e.g. Solaris) need -lsocket and -lnsl. { $as_echo "$as_me:${as_lineno-$LINENO}: checking for t_open in -lnsl" >&5 $as_echo_n "checking for t_open in -lnsl... " >&6; } diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -1543,7 +1543,8 @@ sys/stat.h sys/syscall.h sys/sys_domain.h sys/termio.h sys/time.h \ sys/times.h sys/types.h sys/uio.h sys/un.h sys/utsname.h sys/wait.h pty.h \ libutil.h sys/resource.h netpacket/packet.h sysexits.h bluetooth.h \ -bluetooth/bluetooth.h linux/tipc.h spawn.h util.h alloca.h) +bluetooth/bluetooth.h linux/tipc.h spawn.h util.h alloca.h endian.h \ +sys/endian.h) CPPFLAGS=$ac_save_cppflags AC_HEADER_DIRENT AC_HEADER_MAJOR @@ -1614,6 +1615,22 @@ AC_DEFINE(HAVE_MAKEDEV, 1, [Define this if you have the makedev macro.]) fi +# byte swapping +AC_MSG_CHECKING(for le64toh) +AC_LINK_IFELSE([AC_LANG_PROGRAM([[ +#ifdef HAVE_ENDIAN_H +#include +#elif defined(HAVE_SYS_ENDIAN_H) +#include +#endif +]], [[ + le64toh(1) ]]) +],[ac_cv_has_le64toh=yes],[ac_cv_has_le64toh=no]) +AC_MSG_RESULT($ac_cv_has_le64toh) +if test "$ac_cv_has_le64toh" = "yes"; then + AC_DEFINE(HAVE_HTOLE64, 1, [Define this if you have le64toh()]) +fi + # Enabling LFS on Solaris (2.6 to 9) with gcc 2.95 triggers a bug in # the system headers: If _XOPEN_SOURCE and _LARGEFILE_SOURCE are # defined, but the compiler does not support pragma redefine_extname, @@ -2229,6 +2246,59 @@ *) ;; esac +# check for systems that require aligned memory access +AC_MSG_CHECKING(aligned memory access is required) +AC_TRY_RUN([ +int main() +{ + char s[16]; + int i, *p1, *p2; + for (i=0; i < 16; i++) + s[i] = i; + p1 = (int*)(s+1); + p2 = (int*)(s+2); + if (*p1 == *p2) + return 1; + return 0; +} + ], + [aligned_required=no], + [aligned_required=yes], + [aligned_required=yes]) + +if test "$aligned_required" = yes ; then + AC_DEFINE([HAVE_ALIGNED_REQUIRED], [1], + [Define if aligned memory access is required]) +fi +AC_MSG_RESULT($aligned_required) + + +# str, bytes and memoryview hash algorithm +AH_TEMPLATE(Py_HASH_ALGORITHM, + [Define hash algorithm for str, bytes and memoryview. + SipHash24: 1, FNV: 2, externally defined: 0]) + +AC_MSG_CHECKING(for --with-hash-algorithm) +dnl quadrigraphs "@<:@" and "@:>@" produce "[" and "]" in the output +AC_ARG_WITH(hash_algorithm, + AS_HELP_STRING([--with-hash-algorithm=@<:@fnv|siphash24@:>@], + [select hash algorithm]), +[ +AC_MSG_RESULT($withval) +case "$withval" in + siphash24) + AC_DEFINE(Py_HASH_ALGORITHM, 1) + ;; + fnv) + AC_DEFINE(Py_HASH_ALGORITHM, 2) + ;; + *) + AC_MSG_ERROR([unknown hash algorithm '$withval']) + ;; +esac +], +[AC_MSG_RESULT(default)]) + # Most SVR4 platforms (e.g. Solaris) need -lsocket and -lnsl. AC_CHECK_LIB(nsl, t_open, [LIBS="-lnsl $LIBS"]) # SVR4 AC_CHECK_LIB(socket, socket, [LIBS="-lsocket $LIBS"], [], $LIBS) # SVR4 sockets diff --git a/pyconfig.h.in b/pyconfig.h.in --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -49,6 +49,9 @@ /* Define to 1 if you have the `alarm' function. */ #undef HAVE_ALARM +/* Define if aligned memory access is required */ +#undef HAVE_ALIGNED_REQUIRED + /* Define to 1 if you have the header file. */ #undef HAVE_ALLOCA_H @@ -199,6 +202,9 @@ /* Defined when any dynamic module loading is enabled. */ #undef HAVE_DYNAMIC_LOADING +/* Define to 1 if you have the header file. 
*/ +#undef HAVE_ENDIAN_H + /* Define if you have the 'epoll' functions. */ #undef HAVE_EPOLL @@ -408,6 +414,9 @@ /* Define if you have the 'hstrerror' function. */ #undef HAVE_HSTRERROR +/* Define this if you have le64toh() */ +#undef HAVE_HTOLE64 + /* Define to 1 if you have the `hypot' function. */ #undef HAVE_HYPOT @@ -927,6 +936,9 @@ */ #undef HAVE_SYS_DIR_H +/* Define to 1 if you have the header file. */ +#undef HAVE_SYS_ENDIAN_H + /* Define to 1 if you have the header file. */ #undef HAVE_SYS_EPOLL_H @@ -1193,6 +1205,10 @@ /* Defined if Python is built as a shared library. */ #undef Py_ENABLE_SHARED +/* Define hash algorithm for str, bytes and memoryview. SipHash24: 1, FNV: 2, + externally defined: 0 */ +#undef Py_HASH_ALGORITHM + /* assume C89 semantics that RETSIGTYPE is always void */ #undef RETSIGTYPE -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 12:00:45 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 12:00:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319183=3A_test=5Fg?= =?utf-8?q?db=27s_test=5Fdict_was_failing_on_some_machines_as_the_order_or?= Message-ID: <3dPgtP3TVvz7LkL@mail.python.org> http://hg.python.org/cpython/rev/422ed27b62ce changeset: 87289:422ed27b62ce user: Christian Heimes date: Wed Nov 20 12:00:35 2013 +0100 summary: Issue #19183: test_gdb's test_dict was failing on some machines as the order or dict keys has changed again. files: Lib/test/test_gdb.py | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -248,8 +248,7 @@ 'Verify the pretty-printing of dictionaries' self.assertGdbRepr({}) self.assertGdbRepr({'foo': 'bar'}) - self.assertGdbRepr({'foo': 'bar', 'douglas': 42}, - "{'foo': 'bar', 'douglas': 42}") + self.assertGdbRepr({'foo': 'bar', 'douglas': 42}), def test_lists(self): 'Verify the pretty-printing of lists' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 12:28:23 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 20 Nov 2013 12:28:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319183=3A_Fix_repr?= =?utf-8?q?=28=29_tests_of_test=5Fgdb=2C_hash=28=29_is_now_platform_depend?= =?utf-8?q?ent?= Message-ID: <3dPhVH3tC2zSFM@mail.python.org> http://hg.python.org/cpython/rev/11cb1c8faf11 changeset: 87290:11cb1c8faf11 user: Victor Stinner date: Wed Nov 20 12:27:48 2013 +0100 summary: Issue #19183: Fix repr() tests of test_gdb, hash() is now platform dependent files: Lib/test/test_gdb.py | 49 ++++++++++++++++++------------- 1 files changed, 29 insertions(+), 20 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -42,6 +42,8 @@ checkout_hook_path = os.path.join(os.path.dirname(sys.executable), 'python-gdb.py') +PYTHONHASHSEED = '123' + def run_gdb(*args, **env_vars): """Runs gdb in --batch mode with the additional arguments given by *args. 
@@ -146,7 +148,7 @@ # print ' '.join(args) # Use "args" to invoke gdb, capturing stdout, stderr: - out, err = run_gdb(*args, PYTHONHASHSEED='0') + out, err = run_gdb(*args, PYTHONHASHSEED=PYTHONHASHSEED) errlines = err.splitlines() unexpected_errlines = [] @@ -219,41 +221,48 @@ gdb_output = self.get_stack_trace('id(42)') self.assertTrue(BREAKPOINT_FN in gdb_output) + def get_python_repr(self, val): + args = [sys.executable, '-c', 'print(repr(%a))' % (val,)] + env = os.environ.copy() + env['PYTHONHASHSEED'] = PYTHONHASHSEED + output = subprocess.check_output(args, env=env, universal_newlines=True) + return output.rstrip() + def assertGdbRepr(self, val, exp_repr=None, cmds_after_breakpoint=None): # Ensure that gdb's rendering of the value in a debugged process # matches repr(value) in this process: gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')', cmds_after_breakpoint) if not exp_repr: - exp_repr = repr(val) + exp_repr = self.get_python_repr(val) self.assertEqual(gdb_repr, exp_repr, ('%r did not equal expected %r; full output was:\n%s' % (gdb_repr, exp_repr, gdb_output))) def test_int(self): 'Verify the pretty-printing of various int values' - self.assertGdbRepr(42) - self.assertGdbRepr(0) - self.assertGdbRepr(-7) - self.assertGdbRepr(1000000000000) - self.assertGdbRepr(-1000000000000000) + self.assertGdbRepr(42, '42') + self.assertGdbRepr(0, '0') + self.assertGdbRepr(-7, '-7') + self.assertGdbRepr(1000000000000, '1000000000000') + self.assertGdbRepr(-1000000000000000, '-1000000000000000') def test_singletons(self): 'Verify the pretty-printing of True, False and None' - self.assertGdbRepr(True) - self.assertGdbRepr(False) - self.assertGdbRepr(None) + self.assertGdbRepr(True, 'True') + self.assertGdbRepr(False, 'False') + self.assertGdbRepr(None, 'None') def test_dicts(self): 'Verify the pretty-printing of dictionaries' - self.assertGdbRepr({}) + self.assertGdbRepr({}, '{}') self.assertGdbRepr({'foo': 'bar'}) - self.assertGdbRepr({'foo': 'bar', 'douglas': 42}), + self.assertGdbRepr({'foo': 'bar', 'douglas': 42}) def test_lists(self): 'Verify the pretty-printing of lists' - self.assertGdbRepr([]) - self.assertGdbRepr(list(range(5))) + self.assertGdbRepr([], '[]') + self.assertGdbRepr(list(range(5)), '[0, 1, 2, 3, 4]') def test_bytes(self): 'Verify the pretty-printing of bytes' @@ -303,7 +312,7 @@ def test_tuples(self): 'Verify the pretty-printing of tuples' - self.assertGdbRepr(tuple()) + self.assertGdbRepr(tuple(), '()') self.assertGdbRepr((1,), '(1,)') self.assertGdbRepr(('foo', 'bar', 'baz')) @@ -312,13 +321,13 @@ if (gdb_major_version, gdb_minor_version) < (7, 3): self.skipTest("pretty-printing of sets needs gdb 7.3 or later") self.assertGdbRepr(set()) - self.assertGdbRepr(set(['a', 'b']), "{'a', 'b'}") - self.assertGdbRepr(set([4, 5, 6]), "{4, 5, 6}") + self.assertGdbRepr(set(['a', 'b'])) + self.assertGdbRepr(set([4, 5, 6])) # Ensure that we handle sets containing the "dummy" key value, # which happens on deletion: gdb_repr, gdb_output = self.get_gdb_repr('''s = set(['a','b']) -s.pop() +s.remove('a') id(s)''') self.assertEqual(gdb_repr, "{'b'}") @@ -327,8 +336,8 @@ if (gdb_major_version, gdb_minor_version) < (7, 3): self.skipTest("pretty-printing of frozensets needs gdb 7.3 or later") self.assertGdbRepr(frozenset()) - self.assertGdbRepr(frozenset(['a', 'b']), "frozenset({'a', 'b'})") - self.assertGdbRepr(frozenset([4, 5, 6]), "frozenset({4, 5, 6})") + self.assertGdbRepr(frozenset(['a', 'b'])) + self.assertGdbRepr(frozenset([4, 5, 6])) def test_exceptions(self): # Test a 
RuntimeError -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 12:49:14 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 12:49:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319183=3A_too_many?= =?utf-8?q?_tests_depend_on_the_sort_order_of_repr=28=29=2E?= Message-ID: <3dPhyL0FyDz7LjY@mail.python.org> http://hg.python.org/cpython/rev/961d832d8734 changeset: 87291:961d832d8734 user: Christian Heimes date: Wed Nov 20 12:49:05 2013 +0100 summary: Issue #19183: too many tests depend on the sort order of repr(). The bitshift and xor op for 32bit builds has changed the order of hash values. files: Python/pyhash.c | 3 --- 1 files changed, 0 insertions(+), 3 deletions(-) diff --git a/Python/pyhash.c b/Python/pyhash.c --- a/Python/pyhash.c +++ b/Python/pyhash.c @@ -415,9 +415,6 @@ /* modified */ t = (v0 ^ v1) ^ (v2 ^ v3); -#if SIZEOF_VOID_P == 4 - t ^= (t >> 32); -#endif return (Py_hash_t)t; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 13:47:23 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 13:47:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_update_siphash24_test_valu?= =?utf-8?q?es?= Message-ID: <3dPkFR36YSz7Ljs@mail.python.org> http://hg.python.org/cpython/rev/b878f206b71f changeset: 87292:b878f206b71f user: Christian Heimes date: Wed Nov 20 13:47:13 2013 +0100 summary: update siphash24 test values files: Lib/test/test_hash.py | 18 +++++++----------- 1 files changed, 7 insertions(+), 11 deletions(-) diff --git a/Lib/test/test_hash.py b/Lib/test/test_hash.py --- a/Lib/test/test_hash.py +++ b/Lib/test/test_hash.py @@ -207,21 +207,17 @@ [-678966196, 573763426263223372, -820489388, -4282905804826039665], ], 'siphash24': [ + # NOTE: PyUCS2 layout depends on endianess # seed 0, 'abc' - [2025351752, 4596069200710135518, 1433332804, - -3481057401533226760], + [1198583518, 4596069200710135518, 1198583518, 4596069200710135518], # seed 42, 'abc' - [-774632014, -4501618152524544106, 1054608210, - -1493500025205289231], + [273876886, -4501618152524544106, 273876886, -4501618152524544106], # seed 42, 'abcdefghijk' - [-1436007334, 4436719588892876975, -1436007334, - 4436719588892876975], - # seed 0, '????', PyUCS2 layout depends on endianess - [1386693832, 5749986484189612790, 1776982909, - -5915111450199468540], + [-1745215313, 4436719588892876975, -1745215313, 4436719588892876975], + # seed 0, '????' + [493570806, 5749986484189612790, -1006381564, -5915111450199468540], # seed 42, '????' - [1260387190, -2947981342227738144, 1430287772, - -4296699217652516017], + [-1677110816, -2947981342227738144, -1860207793, -4296699217652516017], ], 'fnv': [ # seed 0, 'abc' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 16:44:58 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 20 Nov 2013 16:44:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Print_Tk_patch?= =?utf-8?q?level_in_test=5Ftcl_in_verbose_mode_=28issue19654=29=2E?= Message-ID: <3dPpBL4XXyz7LnP@mail.python.org> http://hg.python.org/cpython/rev/74b76a726285 changeset: 87293:74b76a726285 branch: 3.3 parent: 87285:5517027519a4 user: Serhiy Storchaka date: Wed Nov 20 17:43:49 2013 +0200 summary: Print Tk patchlevel in test_tcl in verbose mode (issue19654). 
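For illustration, a minimal sketch (not part of any changeset above; the seed value 123 is arbitrary) of the pattern behind the test_gdb and test_hash fixes earlier in this batch: with the hash algorithm now configurable and seeded, hash-dependent output such as set and dict ordering is only reproducible when the child interpreter is pinned to the same PYTHONHASHSEED, and the new sys.hash_info fields report which algorithm a build uses.

    import os, subprocess, sys

    def repr_with_fixed_seed(value, seed="123"):
        # Run a child interpreter with a pinned PYTHONHASHSEED so that
        # hash-dependent output (e.g. set/dict ordering) is reproducible.
        env = os.environ.copy()
        env["PYTHONHASHSEED"] = seed
        out = subprocess.check_output(
            [sys.executable, "-c", "print(repr(%a))" % (value,)],
            env=env, universal_newlines=True)
        return out.rstrip()

    # The new sys.hash_info fields identify the build's configuration:
    print(sys.hash_info.algorithm)      # e.g. 'siphash24' or 'fnv'
    print(sys.hash_info.hash_bits, sys.hash_info.seed_bits, sys.hash_info.cutoff)
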
files: Lib/test/test_tcl.py | 6 ++++++ 1 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_tcl.py b/Lib/test/test_tcl.py --- a/Lib/test/test_tcl.py +++ b/Lib/test/test_tcl.py @@ -302,6 +302,12 @@ self.assertRaises(OverflowError, self.interp.call, 'set', '_', value) +def setUpModule(): + if support.verbose: + tcl = Tcl() + print('patchlevel =', tcl.call('info', 'patchlevel')) + + def test_main(): support.run_unittest(TclTest, TkinterTest, BigmemTclTest) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 16:44:59 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 20 Nov 2013 16:44:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Print_Tk_patchlevel_in_test=5Ftcl_in_verbose_mode_=28iss?= =?utf-8?b?dWUxOTY1NCku?= Message-ID: <3dPpBM70sMz7Llg@mail.python.org> http://hg.python.org/cpython/rev/1b58f14f5d60 changeset: 87294:1b58f14f5d60 parent: 87292:b878f206b71f parent: 87293:74b76a726285 user: Serhiy Storchaka date: Wed Nov 20 17:44:28 2013 +0200 summary: Print Tk patchlevel in test_tcl in verbose mode (issue19654). files: Lib/test/test_tcl.py | 6 ++++++ 1 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_tcl.py b/Lib/test/test_tcl.py --- a/Lib/test/test_tcl.py +++ b/Lib/test/test_tcl.py @@ -268,6 +268,12 @@ self.assertRaises(OverflowError, self.interp.call, 'set', '_', value) +def setUpModule(): + if support.verbose: + tcl = Tcl() + print('patchlevel =', tcl.call('info', 'patchlevel')) + + def test_main(): support.run_unittest(TclTest, TkinterTest, BigmemTclTest) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 16:45:01 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 20 Nov 2013 16:45:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Print_Tk_patch?= =?utf-8?q?level_in_test=5Ftcl_in_verbose_mode_=28issue19654=29=2E?= Message-ID: <3dPpBP2P3Sz7Lmm@mail.python.org> http://hg.python.org/cpython/rev/78c906600183 changeset: 87295:78c906600183 branch: 2.7 parent: 87274:e52d7b173ab5 user: Serhiy Storchaka date: Wed Nov 20 17:44:38 2013 +0200 summary: Print Tk patchlevel in test_tcl in verbose mode (issue19654). files: Lib/test/test_tcl.py | 6 ++++++ 1 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_tcl.py b/Lib/test/test_tcl.py --- a/Lib/test/test_tcl.py +++ b/Lib/test/test_tcl.py @@ -282,6 +282,12 @@ self.assertRaises(OverflowError, self.interp.call, 'set', '_', value) +def setUpModule(): + if test_support.verbose: + tcl = Tcl() + print 'patchlevel =', tcl.call('info', 'patchlevel') + + def test_main(): test_support.run_unittest(TclTest, TkinterTest, BigmemTclTest) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 17:23:32 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 17:23:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317276=3A_MD5_as_d?= =?utf-8?q?efault_digestmod_for_HMAC_is_deprecated=2E_The_HMAC?= Message-ID: <3dPq2r3RWRz7Lld@mail.python.org> http://hg.python.org/cpython/rev/86107e7e6ee5 changeset: 87296:86107e7e6ee5 parent: 87294:1b58f14f5d60 user: Christian Heimes date: Wed Nov 20 17:23:06 2013 +0100 summary: Issue #17276: MD5 as default digestmod for HMAC is deprecated. The HMAC module supports digestmod names, e.g. hmac.HMAC('sha1'). 
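For illustration, a minimal usage sketch of the new digestmod handling (not part of this changeset; the key and message values are arbitrary): a hash name is now accepted alongside a constructor or module, and omitting digestmod triggers a PendingDeprecationWarning.

    import hashlib, hmac, warnings

    key, msg = b"key", b"message"

    # Equivalent ways of selecting the digest explicitly:
    by_name = hmac.new(key, msg, digestmod="sha256")        # by name, new in 3.4
    by_ctor = hmac.new(key, msg, digestmod=hashlib.sha256)  # by constructor
    assert by_name.hexdigest() == by_ctor.hexdigest()

    # Relying on the implicit MD5 default is now deprecated:
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        hmac.new(key, msg)
    assert any(issubclass(w.category, PendingDeprecationWarning) for w in caught)
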
files: Doc/library/hmac.rst | 11 ++++++- Doc/whatsnew/3.4.rst | 3 ++ Lib/hmac.py | 13 +++++++-- Lib/imaplib.py | 2 +- Lib/multiprocessing/connection.py | 4 +- Lib/smtplib.py | 2 +- Lib/test/test_hmac.py | 27 ++++++++++++++---- Lib/test/test_pep247.py | 12 ++++--- Misc/NEWS | 3 ++ 9 files changed, 57 insertions(+), 20 deletions(-) diff --git a/Doc/library/hmac.rst b/Doc/library/hmac.rst --- a/Doc/library/hmac.rst +++ b/Doc/library/hmac.rst @@ -18,13 +18,20 @@ Return a new hmac object. *key* is a bytes or bytearray object giving the secret key. If *msg* is present, the method call ``update(msg)`` is made. - *digestmod* is the digest constructor or module for the HMAC object to use. - It defaults to the :data:`hashlib.md5` constructor. + *digestmod* is the digest name, digest constructor or module for the HMAC + object to use. It supports any name suitable to :func:`hashlib.new` and + defaults to the :data:`hashlib.md5` constructor. .. versionchanged:: 3.4 Parameter *key* can be a bytes or bytearray object. Parameter *msg* can be of any type supported by :mod:`hashlib`. + Paramter *digestmod* can be the name of a hash algorithm. + + .. deprecated:: 3.4 + MD5 as implicit default digest for *digestmod* is deprecated. + + An HMAC object has the following methods: .. method:: HMAC.update(msg) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -842,6 +842,9 @@ * The :mod:`formatter` module is pending deprecation and is slated for removal in Python 3.6. +* MD5 as default digestmod for :mod:`hmac` is deprecated. Python 3.6 will + require an explicit digest name or constructor as *digestmod* argument. + Deprecated functions and types of the C API ------------------------------------------- diff --git a/Lib/hmac.py b/Lib/hmac.py --- a/Lib/hmac.py +++ b/Lib/hmac.py @@ -5,6 +5,7 @@ import warnings as _warnings from _operator import _compare_digest as compare_digest +import hashlib as _hashlib trans_5C = bytes((x ^ 0x5C) for x in range(256)) trans_36 = bytes((x ^ 0x36) for x in range(256)) @@ -28,8 +29,11 @@ key: key for the keyed hash object. msg: Initial input for the hash, if provided. digestmod: A module supporting PEP 247. *OR* - A hashlib constructor returning a new hash object. + A hashlib constructor returning a new hash object. *OR* + A hash name suitable for hashlib.new(). Defaults to hashlib.md5. + Implicit default to hashlib.md5 is deprecated and will be + removed in Python 3.6. Note: key and msg must be a bytes or bytearray objects. 
""" @@ -38,11 +42,14 @@ raise TypeError("key: expected bytes or bytearray, but got %r" % type(key).__name__) if digestmod is None: - import hashlib - digestmod = hashlib.md5 + _warnings.warn("HMAC() without an explicit digestmod argument " + "is deprecated.", PendingDeprecationWarning, 2) + digestmod = _hashlib.md5 if callable(digestmod): self.digest_cons = digestmod + elif isinstance(digestmod, str): + self.digest_cons = lambda d=b'': _hashlib.new(digestmod, d) else: self.digest_cons = lambda d=b'': digestmod.new(d) diff --git a/Lib/imaplib.py b/Lib/imaplib.py --- a/Lib/imaplib.py +++ b/Lib/imaplib.py @@ -554,7 +554,7 @@ import hmac pwd = (self.password.encode('ASCII') if isinstance(self.password, str) else self.password) - return self.user + " " + hmac.HMAC(pwd, challenge).hexdigest() + return self.user + " " + hmac.HMAC(pwd, challenge, 'md5').hexdigest() def logout(self): diff --git a/Lib/multiprocessing/connection.py b/Lib/multiprocessing/connection.py --- a/Lib/multiprocessing/connection.py +++ b/Lib/multiprocessing/connection.py @@ -719,7 +719,7 @@ assert isinstance(authkey, bytes) message = os.urandom(MESSAGE_LENGTH) connection.send_bytes(CHALLENGE + message) - digest = hmac.new(authkey, message).digest() + digest = hmac.new(authkey, message, 'md5').digest() response = connection.recv_bytes(256) # reject large message if response == digest: connection.send_bytes(WELCOME) @@ -733,7 +733,7 @@ message = connection.recv_bytes(256) # reject large message assert message[:len(CHALLENGE)] == CHALLENGE, 'message = %r' % message message = message[len(CHALLENGE):] - digest = hmac.new(authkey, message).digest() + digest = hmac.new(authkey, message, 'md5').digest() connection.send_bytes(digest) response = connection.recv_bytes(256) # reject large message if response != WELCOME: diff --git a/Lib/smtplib.py b/Lib/smtplib.py --- a/Lib/smtplib.py +++ b/Lib/smtplib.py @@ -579,7 +579,7 @@ def encode_cram_md5(challenge, user, password): challenge = base64.decodebytes(challenge) response = user + " " + hmac.HMAC(password.encode('ascii'), - challenge).hexdigest() + challenge, 'md5').hexdigest() return encode_base64(response.encode('ascii'), eol='') def encode_plain(user, password): diff --git a/Lib/test/test_hmac.py b/Lib/test/test_hmac.py --- a/Lib/test/test_hmac.py +++ b/Lib/test/test_hmac.py @@ -10,7 +10,9 @@ # Test the HMAC module against test vectors from the RFC. def md5test(key, data, digest): - h = hmac.HMAC(key, data) + h = hmac.HMAC(key, data, digestmod=hashlib.md5) + self.assertEqual(h.hexdigest().upper(), digest.upper()) + h = hmac.HMAC(key, data, digestmod='md5') self.assertEqual(h.hexdigest().upper(), digest.upper()) md5test(b"\x0b" * 16, @@ -46,6 +48,9 @@ def shatest(key, data, digest): h = hmac.HMAC(key, data, digestmod=hashlib.sha1) self.assertEqual(h.hexdigest().upper(), digest.upper()) + h = hmac.HMAC(key, data, digestmod='sha1') + self.assertEqual(h.hexdigest().upper(), digest.upper()) + shatest(b"\x0b" * 20, b"Hi There", @@ -76,10 +81,13 @@ b"and Larger Than One Block-Size Data"), "e8e99d0f45237d786d6bbaa7965c7808bbff1a91") - def _rfc4231_test_cases(self, hashfunc): + def _rfc4231_test_cases(self, hashfunc, hashname): def hmactest(key, data, hexdigests): h = hmac.HMAC(key, data, digestmod=hashfunc) self.assertEqual(h.hexdigest().lower(), hexdigests[hashfunc]) + h = hmac.HMAC(key, data, digestmod=hashname) + self.assertEqual(h.hexdigest().lower(), hexdigests[hashfunc]) + # 4.2. 
Test Case 1 hmactest(key = b'\x0b'*20, @@ -189,16 +197,16 @@ }) def test_sha224_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha224) + self._rfc4231_test_cases(hashlib.sha224, 'sha224') def test_sha256_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha256) + self._rfc4231_test_cases(hashlib.sha256, 'sha256') def test_sha384_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha384) + self._rfc4231_test_cases(hashlib.sha384, 'sha384') def test_sha512_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha512) + self._rfc4231_test_cases(hashlib.sha512, 'sha512') def test_legacy_block_size_warnings(self): class MockCrazyHash(object): @@ -222,6 +230,13 @@ hmac.HMAC(b'a', b'b', digestmod=MockCrazyHash) self.fail('Expected warning about small block_size') + def test_with_digestmod_warning(self): + with self.assertWarns(PendingDeprecationWarning): + key = b"\x0b" * 16 + data = b"Hi There" + digest = "9294727A3638BB1C13F48EF8158BFC9D" + h = hmac.HMAC(key, data) + self.assertEqual(h.hexdigest().upper(), digest) class ConstructorTestCase(unittest.TestCase): diff --git a/Lib/test/test_pep247.py b/Lib/test/test_pep247.py --- a/Lib/test/test_pep247.py +++ b/Lib/test/test_pep247.py @@ -15,12 +15,14 @@ self.assertTrue(module.digest_size is None or module.digest_size > 0) self.check_object(module.new, module.digest_size, key) - def check_object(self, cls, digest_size, key): + def check_object(self, cls, digest_size, key, digestmod=None): if key is not None: - obj1 = cls(key) - obj2 = cls(key, b'string') - h1 = cls(key, b'string').digest() - obj3 = cls(key) + if digestmod is None: + digestmod = md5 + obj1 = cls(key, digestmod=digestmod) + obj2 = cls(key, b'string', digestmod=digestmod) + h1 = cls(key, b'string', digestmod=digestmod).digest() + obj3 = cls(key, digestmod=digestmod) obj3.update(b'string') h2 = obj3.digest() else: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #17276: MD5 as default digestmod for HMAC is deprecated. The HMAC + module supports digestmod names, e.g. hmac.HMAC('sha1'). + - Issue #19449: in csv's writerow, handle non-string keys when generating the error message that certain keys are not in the 'fieldnames' list. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 17:35:17 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 17:35:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318775=3A_Add_name?= =?utf-8?q?_and_block=5Fsize_attribute_to_HMAC_object=2E_They_now?= Message-ID: <3dPqJP15JJzR1Q@mail.python.org> http://hg.python.org/cpython/rev/71f4a805d262 changeset: 87297:71f4a805d262 user: Christian Heimes date: Wed Nov 20 17:35:06 2013 +0100 summary: Issue #18775: Add name and block_size attribute to HMAC object. They now provide the same API elements as non-keyed cryptographic hash functions. files: Doc/library/hmac.rst | 19 ++++++++++++++++ Lib/hmac.py | 8 +++++++ Lib/test/test_hmac.py | 35 +++++++++++++++++++++++++----- Misc/NEWS | 3 ++ 4 files changed, 59 insertions(+), 6 deletions(-) diff --git a/Doc/library/hmac.rst b/Doc/library/hmac.rst --- a/Doc/library/hmac.rst +++ b/Doc/library/hmac.rst @@ -79,6 +79,25 @@ compute the digests of strings that share a common initial substring. +A hash object has the following attributes: + +.. attribute:: HMAC.digest_size + + The size of the resulting HMAC digest in bytes. + +.. attribute:: HMAC.block_size + + The internal block size of the hash algorithm in bytes. + + .. 
versionadded:: 3.4 + +.. attribute:: HMAC.name + + The canonical name of this HMAC, always lowercase, e.g. ``hmac-md5``. + + .. versionadded:: 3.4 + + This module also provides the following helper function: .. function:: compare_digest(a, b) diff --git a/Lib/hmac.py b/Lib/hmac.py --- a/Lib/hmac.py +++ b/Lib/hmac.py @@ -70,6 +70,10 @@ RuntimeWarning, 2) blocksize = self.blocksize + # self.blocksize is the default blocksize. self.block_size is + # effective block size as well as the public API attribute. + self.block_size = blocksize + if len(key) > blocksize: key = self.digest_cons(key).digest() @@ -79,6 +83,10 @@ if msg is not None: self.update(msg) + @property + def name(self): + return "hmac-" + self.inner.name + def update(self, msg): """Update this hashing object with the string msg. """ diff --git a/Lib/test/test_hmac.py b/Lib/test/test_hmac.py --- a/Lib/test/test_hmac.py +++ b/Lib/test/test_hmac.py @@ -12,8 +12,16 @@ def md5test(key, data, digest): h = hmac.HMAC(key, data, digestmod=hashlib.md5) self.assertEqual(h.hexdigest().upper(), digest.upper()) + self.assertEqual(h.name, "hmac-md5") + self.assertEqual(h.digest_size, 16) + self.assertEqual(h.block_size, 64) + h = hmac.HMAC(key, data, digestmod='md5') self.assertEqual(h.hexdigest().upper(), digest.upper()) + self.assertEqual(h.name, "hmac-md5") + self.assertEqual(h.digest_size, 16) + self.assertEqual(h.block_size, 64) + md5test(b"\x0b" * 16, b"Hi There", @@ -48,8 +56,15 @@ def shatest(key, data, digest): h = hmac.HMAC(key, data, digestmod=hashlib.sha1) self.assertEqual(h.hexdigest().upper(), digest.upper()) + self.assertEqual(h.name, "hmac-sha1") + self.assertEqual(h.digest_size, 20) + self.assertEqual(h.block_size, 64) + h = hmac.HMAC(key, data, digestmod='sha1') self.assertEqual(h.hexdigest().upper(), digest.upper()) + self.assertEqual(h.name, "hmac-sha1") + self.assertEqual(h.digest_size, 20) + self.assertEqual(h.block_size, 64) shatest(b"\x0b" * 20, @@ -81,12 +96,20 @@ b"and Larger Than One Block-Size Data"), "e8e99d0f45237d786d6bbaa7965c7808bbff1a91") - def _rfc4231_test_cases(self, hashfunc, hashname): + def _rfc4231_test_cases(self, hashfunc, hash_name, digest_size, block_size): def hmactest(key, data, hexdigests): + hmac_name = "hmac-" + hash_name h = hmac.HMAC(key, data, digestmod=hashfunc) self.assertEqual(h.hexdigest().lower(), hexdigests[hashfunc]) - h = hmac.HMAC(key, data, digestmod=hashname) + self.assertEqual(h.name, hmac_name) + self.assertEqual(h.digest_size, digest_size) + self.assertEqual(h.block_size, block_size) + + h = hmac.HMAC(key, data, digestmod=hash_name) self.assertEqual(h.hexdigest().lower(), hexdigests[hashfunc]) + self.assertEqual(h.name, hmac_name) + self.assertEqual(h.digest_size, digest_size) + self.assertEqual(h.block_size, block_size) # 4.2. 
Test Case 1 @@ -197,16 +220,16 @@ }) def test_sha224_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha224, 'sha224') + self._rfc4231_test_cases(hashlib.sha224, 'sha224', 28, 64) def test_sha256_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha256, 'sha256') + self._rfc4231_test_cases(hashlib.sha256, 'sha256', 32, 64) def test_sha384_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha384, 'sha384') + self._rfc4231_test_cases(hashlib.sha384, 'sha384', 48, 128) def test_sha512_rfc4231(self): - self._rfc4231_test_cases(hashlib.sha512, 'sha512') + self._rfc4231_test_cases(hashlib.sha512, 'sha512', 64, 128) def test_legacy_block_size_warnings(self): class MockCrazyHash(object): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #18775: Add name and block_size attribute to HMAC object. They now + provide the same API elements as non-keyed cryptographic hash functions. + - Issue #17276: MD5 as default digestmod for HMAC is deprecated. The HMAC module supports digestmod names, e.g. hmac.HMAC('sha1'). -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 17:40:42 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 17:40:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317791=3A_Drop_PRE?= =?utf-8?q?FIX_and_EXEC=5FPREFIX_definitions_from_PC/pyconfig=2Eh?= Message-ID: <3dPqQf4xDDz7Ll5@mail.python.org> http://hg.python.org/cpython/rev/fbd856e817a1 changeset: 87298:fbd856e817a1 user: Christian Heimes date: Wed Nov 20 17:40:31 2013 +0100 summary: Issue #17791: Drop PREFIX and EXEC_PREFIX definitions from PC/pyconfig.h files: Misc/NEWS | 2 ++ PC/pyconfig.h | 2 -- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -307,6 +307,8 @@ Build ----- +- Issue #17791: Drop PREFIX and EXEC_PREFIX definitions from PC/pyconfig.h + - Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH for nmake.exe correctly. diff --git a/PC/pyconfig.h b/PC/pyconfig.h --- a/PC/pyconfig.h +++ b/PC/pyconfig.h @@ -74,8 +74,6 @@ #define DONT_HAVE_SIG_PAUSE #define LONG_BIT 32 #define WORD_BIT 32 -#define PREFIX "" -#define EXEC_PREFIX "" #define MS_WIN32 /* only support win32 and greater. */ #define MS_WINDOWS -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 17:43:33 2013 From: python-checkins at python.org (christian.heimes) Date: Wed, 20 Nov 2013 17:43:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316632=3A_Enable_D?= =?utf-8?q?EP_and_ASLR_on_Windows=2E?= Message-ID: <3dPqTx2jcXz7Ljr@mail.python.org> http://hg.python.org/cpython/rev/cb1691d42101 changeset: 87299:cb1691d42101 user: Christian Heimes date: Wed Nov 20 17:43:23 2013 +0100 summary: Issue #16632: Enable DEP and ASLR on Windows. files: Misc/NEWS | 2 ++ PCbuild/pyproject.props | 5 ++--- 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -307,6 +307,8 @@ Build ----- +- Issue #16632: Enable DEP and ASLR on Windows. + - Issue #17791: Drop PREFIX and EXEC_PREFIX definitions from PC/pyconfig.h - Add workaround for VS 2010 nmake clean issue. 
VS 2010 doesn't set up PATH diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -49,9 +49,8 @@ true $(OutDir)$(TargetName).pdb Windows - false - - + true + true MachineX86 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 18:14:31 2013 From: python-checkins at python.org (larry.hastings) Date: Wed, 20 Nov 2013 18:14:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319474=3A_Argument?= =?utf-8?q?_Clinic_now_always_specifies_a_default_value_for?= Message-ID: <3dPr9g6chRz7Lk4@mail.python.org> http://hg.python.org/cpython/rev/4f0f496e482e changeset: 87300:4f0f496e482e user: Larry Hastings date: Wed Nov 20 09:13:52 2013 -0800 summary: Issue #19474: Argument Clinic now always specifies a default value for variables in option groups, to prevent "uninitialized value" warnings. files: Modules/_cursesmodule.c | 8 ++-- Tools/clinic/clinic.py | 42 +++++++++++++++++++++++++--- 2 files changed, 41 insertions(+), 9 deletions(-) diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -608,11 +608,11 @@ { PyObject *return_value = NULL; int group_left_1 = 0; - int x; - int y; + int x = 0; + int y = 0; PyObject *ch; int group_right_1 = 0; - long attr; + long attr = 0; switch (PyTuple_Size(args)) { case 1: @@ -646,7 +646,7 @@ static PyObject * curses_window_addch_impl(PyObject *self, int group_left_1, int x, int y, PyObject *ch, int group_right_1, long attr) -/*[clinic checksum: 98ade780397a48d0be48439763424b3b00c92089]*/ +/*[clinic checksum: 094d012af1019387c0219a9c0bc76e90729c833f]*/ { PyCursesWindowObject *cwself = (PyCursesWindowObject *)self; int coordinates_group = group_left_1; diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -1281,17 +1281,29 @@ # Or "unspecified" if there is no default. default = unspecified - # "default" converted into a str for rendering into Python code. - py_default = None - # "default" as it should appear in the documentation, as a string. # Or None if there is no default. doc_default = None + # "default" converted into a str for rendering into Python code. + py_default = None + # "default" converted into a C value, as a string. # Or None if there is no default. c_default = None + # The default value used to initialize the C variable when + # there is no default, but not specifying a default may + # result in an "uninitialized variable" warning. This can + # easily happen when using option groups--although + # properly-written code won't actually use the variable, + # the variable does get passed in to the _impl. (Ah, if + # only dataflow analysis could inline the static function!) + # + # This value is specified as a string. + # Every non-abstract subclass should supply a valid value. + c_ignored_default = 'NULL' + # The C converter *function* to be used, if any. # (If this is not None, format_unit must be 'O&'.) converter = None @@ -1327,6 +1339,7 @@ parameter is a clinic.Parameter instance. data is a CRenderData instance. """ + self.parameter = parameter name = ensure_legal_c_identifier(self.name) # declarations @@ -1401,9 +1414,12 @@ The C statement to declare this variable. 
""" declaration = [self.simple_declaration()] - if self.c_default: + default = self.c_default + if not default and self.parameter.group: + default = self.c_ignored_default + if default: declaration.append(" = ") - declaration.append(self.c_default) + declaration.append(default) declaration.append(";") return "".join(declaration) @@ -1427,6 +1443,7 @@ class bool_converter(CConverter): type = 'int' format_unit = 'p' + c_ignored_default = '0' def converter_init(self): self.default = bool(self.default) @@ -1435,11 +1452,13 @@ class char_converter(CConverter): type = 'char' format_unit = 'c' + c_ignored_default = "'\0'" @add_legacy_c_converter('B', bitwise=True) class byte_converter(CConverter): type = 'byte' format_unit = 'b' + c_ignored_default = "'\0'" def converter_init(self, *, bitwise=False): if bitwise: @@ -1448,10 +1467,12 @@ class short_converter(CConverter): type = 'short' format_unit = 'h' + c_ignored_default = "0" class unsigned_short_converter(CConverter): type = 'unsigned short' format_unit = 'H' + c_ignored_default = "0" def converter_init(self, *, bitwise=False): if not bitwise: @@ -1461,6 +1482,7 @@ class int_converter(CConverter): type = 'int' format_unit = 'i' + c_ignored_default = "0" def converter_init(self, *, from_str=False): if from_str: @@ -1469,6 +1491,7 @@ class unsigned_int_converter(CConverter): type = 'unsigned int' format_unit = 'I' + c_ignored_default = "0" def converter_init(self, *, bitwise=False): if not bitwise: @@ -1477,10 +1500,12 @@ class long_converter(CConverter): type = 'long' format_unit = 'l' + c_ignored_default = "0" class unsigned_long_converter(CConverter): type = 'unsigned long' format_unit = 'k' + c_ignored_default = "0" def converter_init(self, *, bitwise=False): if not bitwise: @@ -1489,10 +1514,12 @@ class PY_LONG_LONG_converter(CConverter): type = 'PY_LONG_LONG' format_unit = 'L' + c_ignored_default = "0" class unsigned_PY_LONG_LONG_converter(CConverter): type = 'unsigned PY_LONG_LONG' format_unit = 'K' + c_ignored_default = "0" def converter_init(self, *, bitwise=False): if not bitwise: @@ -1501,20 +1528,24 @@ class Py_ssize_t_converter(CConverter): type = 'Py_ssize_t' format_unit = 'n' + c_ignored_default = "0" class float_converter(CConverter): type = 'float' format_unit = 'f' + c_ignored_default = "0.0" class double_converter(CConverter): type = 'double' format_unit = 'd' + c_ignored_default = "0.0" class Py_complex_converter(CConverter): type = 'Py_complex' format_unit = 'D' + c_ignored_default = "{0.0, 0.0}" class object_converter(CConverter): @@ -1579,6 +1610,7 @@ type = 'Py_buffer' format_unit = 'y*' impl_by_reference = True + c_ignored_default = "{NULL, NULL, 0, 0, 0, 0, NULL, NULL, NULL, NULL, NULL}" def converter_init(self, *, str=False, zeroes=False, nullable=False, read_write=False): if not str: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 19:03:54 2013 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 20 Nov 2013 19:03:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Mark_PEP_3156_accepted?= Message-ID: <3dPsGf44Cmz7Ljs@mail.python.org> http://hg.python.org/peps/rev/9852e3b990aa changeset: 5301:9852e3b990aa user: Antoine Pitrou date: Wed Nov 20 19:03:47 2013 +0100 summary: Mark PEP 3156 accepted files: pep-3156.txt | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -3,8 +3,9 @@ Version: $Revision$ Last-Modified: $Date$ Author: Guido van Rossum +BDFL-Delegate: Antoine 
Pitrou Discussions-To: -Status: Draft +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 12-Dec-2012 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 19:11:34 2013 From: python-checkins at python.org (antoine.pitrou) Date: Wed, 20 Nov 2013 19:11:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_Resolution_header?= Message-ID: <3dPsRV0xnPz7LkW@mail.python.org> http://hg.python.org/peps/rev/0a885384824b changeset: 5302:0a885384824b user: Antoine Pitrou date: Wed Nov 20 19:11:28 2013 +0100 summary: Add Resolution header files: pep-3156.txt | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -10,6 +10,7 @@ Content-Type: text/x-rst Created: 12-Dec-2012 Post-History: 21-Dec-2012 +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130419.html Abstract ======== -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 19:34:30 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 20 Nov 2013 19:34:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_Gustavo_Carneiro_=28Gambi?= =?utf-8?q?t_Research=29_to_the_acknowledgments=2E?= Message-ID: <3dPsxy1gxNz7Ljj@mail.python.org> http://hg.python.org/peps/rev/b90ab4ee8c4d changeset: 5303:b90ab4ee8c4d user: Guido van Rossum date: Wed Nov 20 10:34:19 2013 -0800 summary: Add Gustavo Carneiro (Gambit Research) to the acknowledgments. files: pep-3156.txt | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1761,6 +1761,7 @@ Contributors to the implementation include Eli Bendersky, +Gustavo Carneiro (Gambit Research), Sa?l Ibarra Corretg?, Geert Jansen, A. Jesse Jiryu Davis, -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 20:53:42 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 20 Nov 2013 20:53:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Clarify_where_=22Documenti?= =?utf-8?q?ng_Python=22_can_be_found=2E?= Message-ID: <3dPvjL4LqLzSNd@mail.python.org> http://hg.python.org/cpython/rev/1e79ca2bc494 changeset: 87301:1e79ca2bc494 user: Guido van Rossum date: Wed Nov 20 11:53:31 2013 -0800 summary: Clarify where "Documenting Python" can be found. files: Doc/README.txt | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Doc/README.txt b/Doc/README.txt --- a/Doc/README.txt +++ b/Doc/README.txt @@ -7,7 +7,8 @@ Documentation on the authoring Python documentation, including information about both style and markup, is available in the "Documenting Python" chapter of the -documentation. There's also a chapter intended to point out differences to +developers guide (http://docs.python.org/devguide/documenting.html). +There's also a chapter intended to point out differences to those familiar with the previous docs written in LaTeX. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 20 22:38:22 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 20 Nov 2013 22:38:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Accept_PEP_428_=28pathlib=29?= =?utf-8?q?=2E?= Message-ID: <3dPy266M1tz7LjQ@mail.python.org> http://hg.python.org/peps/rev/e7026f5de3ef changeset: 5304:e7026f5de3ef user: Guido van Rossum date: Wed Nov 20 13:38:17 2013 -0800 summary: Accept PEP 428 (pathlib). 
files: pep-0428.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -3,7 +3,7 @@ Version: $Revision$ Last-Modified: $Date$ Author: Antoine Pitrou -Status: Draft +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 30-July-2012 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 20 22:44:37 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 20 Nov 2013 22:44:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_Resolution_header_to_PEP_?= =?utf-8?q?428_=28pathlib=29=2E?= Message-ID: <3dPy9K6Lllz7LlZ@mail.python.org> http://hg.python.org/peps/rev/29d4b5762a72 changeset: 5305:29d4b5762a72 user: Guido van Rossum date: Wed Nov 20 13:44:22 2013 -0800 summary: Add Resolution header to PEP 428 (pathlib). files: pep-0428.txt | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -9,6 +9,7 @@ Created: 30-July-2012 Python-Version: 3.4 Post-History: http://mail.python.org/pipermail/python-ideas/2012-October/016338.html +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130424.html Abstract -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 00:59:12 2013 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 21 Nov 2013 00:59:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_a_FRAME_opcode_for_framin?= =?utf-8?q?g=2C_and_document_Alexandre=27s_new_MEMOIZE_opcode?= Message-ID: <3dQ18c5sB9z7LjP@mail.python.org> http://hg.python.org/peps/rev/c8d77b44a329 changeset: 5306:c8d77b44a329 user: Antoine Pitrou date: Thu Nov 21 00:58:38 2013 +0100 summary: Add a FRAME opcode for framing, and document Alexandre's new MEMOIZE opcode files: pep-3154.txt | 26 +++++++++++++++++++++++--- 1 files changed, 23 insertions(+), 3 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -58,15 +58,19 @@ of a pickle is thus the following:: +------+------+ - | 0x80 | 0x04 | protocol header (2 bytes) + | 0x80 | 0x04 | protocol header (2 bytes) + +------+------+ + | OP | FRAME opcode (1 byte) +------+------+-----------+ | MM MM MM MM MM MM MM MM | frame size (8 bytes, little-endian) +------+------------------+ - | .... | first frame contents (M bytes) + | .... | first frame contents (M bytes) + +------+ + | OP | FRAME opcode (1 byte) +------+------+-----------+ | NN NN NN NN NN NN NN NN | frame size (8 bytes, little-endian) +------+------------------+ - | .... | second frame contents (N bytes) + | .... | second frame contents (N bytes) +------+ etc. @@ -142,6 +146,16 @@ integer, which is wasteful. A specific opcode with a 1-byte length would make many pickles smaller. +Smaller memoization +------------------- + +The PUT opcodes all require an explicit index to select in which entry +of the memo dictionary the top-of-stack is memoized. However, in practice +those numbers are allocated in sequential order. A new opcode, MEMOIZE, +will instead store the top-of-stack in at the index equal to the current +size of the memo dictionary. This allows for shorter pickles, since PUT +opcodes are emitted for all non-atomic datatypes. 
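For illustration, a toy sketch (not part of the PEP or its implementation) of how a reader consumes the framing layout described above: each frame is one FRAME opcode byte followed by an 8-byte little-endian payload size, so a whole frame can be fetched with a single length read and a single payload read. The opcode byte value below is only a stand-in; the real value is assigned by the pickle opcode table.

    import io
    import struct

    def read_frame(stream, frame_opcode):
        # FRAME opcode, then 8-byte little-endian size, then the payload.
        op = stream.read(1)
        if not op:
            return None                      # end of stream
        if op != frame_opcode:
            raise ValueError("expected FRAME opcode, got %r" % (op,))
        (size,) = struct.unpack("<Q", stream.read(8))
        return stream.read(size)

    OP = b"\xff"                             # placeholder opcode value
    buf = io.BytesIO(OP + struct.pack("<Q", 5) + b"hello")
    assert read_frame(buf, OP) == b"hello"
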
+ Summary of new opcodes ====================== @@ -149,6 +163,9 @@ These reflect the state of the proposed implementation (thanks mostly to Alexandre Vassalotti's work): +* ``FRAME``: introduce a new frame (followed by the 8-byte frame size + and the frame contents). + * ``SHORT_BINUNICODE``: push a utf8-encoded str object with a one-byte size prefix (therefore less than 256 bytes long). @@ -178,6 +195,9 @@ ``qualname``, and push the result of looking up the dotted ``qualname`` in the module named ``module_name``. +* ``MEMOIZE``: store the top-of-stack object in the memo dictionary with + an index equal to the current size of the memo dictionary. + Alternative ideas ================= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 01:48:52 2013 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 21 Nov 2013 01:48:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_EMPTY=5FFROZENSET_is_gone_=28?= =?utf-8?q?Alexandre=29?= Message-ID: <3dQ2Fw3bfLz7LnL@mail.python.org> http://hg.python.org/peps/rev/23ed6838856b changeset: 5307:23ed6838856b user: Antoine Pitrou date: Thu Nov 21 01:48:24 2013 +0100 summary: EMPTY_FROZENSET is gone (Alexandre) files: pep-3154.txt | 2 -- 1 files changed, 0 insertions(+), 2 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -182,8 +182,6 @@ * ``ADDITEMS``: add the topmost stack items to the set (to be used with ``EMPTY_SET``). -* ``EMPTY_FROZENSET``: push a new empty frozenset object on the stack. - * ``FROZENSET``: create a frozenset object from the topmost stack items, and push it on the stack. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 01:49:28 2013 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 21 Nov 2013 01:49:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Mark_accepted_=28Tim=29?= Message-ID: <3dQ2Gc2lCTz7LlK@mail.python.org> http://hg.python.org/peps/rev/826ec23e731d changeset: 5308:826ec23e731d user: Antoine Pitrou date: Thu Nov 21 01:49:24 2013 +0100 summary: Mark accepted (Tim) files: pep-3154.txt | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -3,14 +3,14 @@ Version: $Revision$ Last-Modified: $Date$ Author: Antoine Pitrou -Status: Draft +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 2011-08-11 Python-Version: 3.4 Post-History: http://mail.python.org/pipermail/python-dev/2011-August/112821.html -Resolution: TBD +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130439.html Abstract -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 02:06:11 2013 From: python-checkins at python.org (antoine.pitrou) Date: Thu, 21 Nov 2013 02:06:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Remove_question_marks_and_exp?= =?utf-8?q?lain_=5F=5Fgetnewargs=5Fex=5F=5F?= Message-ID: <3dQ2dv6VWYz7Lnd@mail.python.org> http://hg.python.org/peps/rev/6fd82e83ccca changeset: 5309:6fd82e83ccca user: Antoine Pitrou date: Thu Nov 21 02:06:06 2013 +0100 summary: Remove question marks and explain __getnewargs_ex__ files: pep-3154.txt | 9 ++++++--- 1 files changed, 6 insertions(+), 3 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -134,10 +134,13 @@ Calling __new__ with keyword arguments -------------------------------------- -Currently, classes whose __new__ mandates the 
use of keyword-only +Currently, classes whose ``__new__`` mandates the use of keyword-only arguments can not be pickled (or, rather, unpickled) [3]_. Both a new -special method (``__getnewargs_ex__`` ?) and a new opcode (NEWOBJ_EX ?) -are needed. +special method (``__getnewargs_ex__``) and a new opcode (NEWOBJ_EX) +are needed. The ``__getnewargs_ex__`` method, if it exists, must +return a two-tuple ``(args, kwargs)`` where the first item is the +tuple of positional arguments and the second item is the dict of +keyword arguments for the class's ``__new__`` method. Better string encoding ---------------------- -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 03:35:12 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 03:35:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318138=3A_Implemen?= =?utf-8?q?t_cadata_argument_of_SSLContext=2Eload=5Fverify=5Flocation=28?= =?utf-8?q?=29?= Message-ID: <3dQ4cc2TVzz7LjN@mail.python.org> http://hg.python.org/cpython/rev/234e3c8dc52f changeset: 87302:234e3c8dc52f user: Christian Heimes date: Thu Nov 21 03:35:02 2013 +0100 summary: Issue #18138: Implement cadata argument of SSLContext.load_verify_location() to load CA certificates and CRL from memory. It supports PEM and DER encoded strings. files: Doc/library/ssl.rst | 11 +- Lib/test/test_ssl.py | 88 ++++++++++++- Misc/NEWS | 4 + Modules/_ssl.c | 208 +++++++++++++++++++++++++----- 4 files changed, 274 insertions(+), 37 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -821,6 +821,7 @@ .. versionadded:: 3.4 + .. method:: SSLContext.load_cert_chain(certfile, keyfile=None, password=None) Load a private key and the corresponding certificate. The *certfile* @@ -851,7 +852,7 @@ .. versionchanged:: 3.3 New optional argument *password*. -.. method:: SSLContext.load_verify_locations(cafile=None, capath=None) +.. method:: SSLContext.load_verify_locations(cafile=None, capath=None, cadata=None) Load a set of "certification authority" (CA) certificates used to validate other peers' certificates when :data:`verify_mode` is other than @@ -867,6 +868,14 @@ following an `OpenSSL specific layout `_. + The *cadata* object, if present, is either an ASCII string of one or more + PEM-encoded certificates or a bytes-like object of DER-encoded + certificates. Like with *capath* extra lines around PEM-encoded + certificates are ignored but at least one certificate must be present. + + .. versionchanged:: 3.4 + New optional argument *cadata* + .. method:: SSLContext.get_ca_certs(binary_form=False) Get a list of loaded "certification authority" (CA) certificates. If the diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -25,7 +25,8 @@ PROTOCOLS = sorted(ssl._PROTOCOL_NAMES) HOST = support.HOST -data_file = lambda name: os.path.join(os.path.dirname(__file__), name) +def data_file(*name): + return os.path.join(os.path.dirname(__file__), *name) # The custom key and certificate files used in test_ssl are generated # using Lib/test/make_ssl_certs.py. 
@@ -43,6 +44,9 @@ KEY_PASSWORD = "somepass" CAPATH = data_file("capath") BYTES_CAPATH = os.fsencode(CAPATH) +CAFILE_NEURONIO = data_file("capath", "4e1295a3.0") +CAFILE_CACERT = data_file("capath", "5ed36f99.0") + # Two keys and certs signed by the same CA (for SNI tests) SIGNED_CERTFILE = data_file("keycert3.pem") @@ -726,7 +730,7 @@ ctx.load_verify_locations(BYTES_CERTFILE) ctx.load_verify_locations(cafile=BYTES_CERTFILE, capath=None) self.assertRaises(TypeError, ctx.load_verify_locations) - self.assertRaises(TypeError, ctx.load_verify_locations, None, None) + self.assertRaises(TypeError, ctx.load_verify_locations, None, None, None) with self.assertRaises(OSError) as cm: ctx.load_verify_locations(WRONGCERT) self.assertEqual(cm.exception.errno, errno.ENOENT) @@ -738,6 +742,64 @@ # Issue #10989: crash if the second argument type is invalid self.assertRaises(TypeError, ctx.load_verify_locations, None, True) + def test_load_verify_cadata(self): + # test cadata + with open(CAFILE_CACERT) as f: + cacert_pem = f.read() + cacert_der = ssl.PEM_cert_to_DER_cert(cacert_pem) + with open(CAFILE_NEURONIO) as f: + neuronio_pem = f.read() + neuronio_der = ssl.PEM_cert_to_DER_cert(neuronio_pem) + + # test PEM + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 0) + ctx.load_verify_locations(cadata=cacert_pem) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 1) + ctx.load_verify_locations(cadata=neuronio_pem) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + # cert already in hash table + ctx.load_verify_locations(cadata=neuronio_pem) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + + # combined + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + combined = "\n".join((cacert_pem, neuronio_pem)) + ctx.load_verify_locations(cadata=combined) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + + # with junk around the certs + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + combined = ["head", cacert_pem, "other", neuronio_pem, "again", + neuronio_pem, "tail"] + ctx.load_verify_locations(cadata="\n".join(combined)) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + + # test DER + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.load_verify_locations(cadata=cacert_der) + ctx.load_verify_locations(cadata=neuronio_der) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + # cert already in hash table + ctx.load_verify_locations(cadata=cacert_der) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + + # combined + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + combined = b"".join((cacert_der, neuronio_der)) + ctx.load_verify_locations(cadata=combined) + self.assertEqual(ctx.cert_store_stats()["x509_ca"], 2) + + # error cases + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertRaises(TypeError, ctx.load_verify_locations, cadata=object) + + with self.assertRaisesRegex(ssl.SSLError, "no start line"): + ctx.load_verify_locations(cadata="broken") + with self.assertRaisesRegex(ssl.SSLError, "not enough data"): + ctx.load_verify_locations(cadata=b"broken") + + def test_load_dh_params(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) ctx.load_dh_params(DHFILE) @@ -1057,6 +1119,28 @@ finally: s.close() + def test_connect_cadata(self): + with open(CAFILE_CACERT) as f: + pem = f.read() + der = ssl.PEM_cert_to_DER_cert(pem) + with support.transient_internet("svn.python.org"): + ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + ctx.verify_mode = ssl.CERT_REQUIRED + ctx.load_verify_locations(cadata=pem) + with ctx.wrap_socket(socket.socket(socket.AF_INET)) 
as s: + s.connect(("svn.python.org", 443)) + cert = s.getpeercert() + self.assertTrue(cert) + + # same with DER + ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + ctx.verify_mode = ssl.CERT_REQUIRED + ctx.load_verify_locations(cadata=der) + with ctx.wrap_socket(socket.socket(socket.AF_INET)) as s: + s.connect(("svn.python.org", 443)) + cert = s.getpeercert() + self.assertTrue(cert) + @unittest.skipIf(os.name == "nt", "Can't use a socket as a file under Windows") def test_makefile_close(self): # Issue #5238: creating a file-like object with makefile() shouldn't diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,10 @@ Library ------- +- Issue #18138: Implement cadata argument of SSLContext.load_verify_location() + to load CA certificates and CRL from memory. It supports PEM and DER + encoded strings. + - Issue #18775: Add name and block_size attribute to HMAC object. They now provide the same API elements as non-keyed cryptographic hash functions. diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -2304,60 +2304,200 @@ return NULL; } +/* internal helper function, returns -1 on error + */ +static int +_add_ca_certs(PySSLContext *self, void *data, Py_ssize_t len, + int filetype) +{ + BIO *biobuf = NULL; + X509_STORE *store; + int retval = 0, err, loaded = 0; + + assert(filetype == SSL_FILETYPE_ASN1 || filetype == SSL_FILETYPE_PEM); + + if (len <= 0) { + PyErr_SetString(PyExc_ValueError, + "Empty certificate data"); + return -1; + } else if (len > INT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "Certificate data is too long."); + return -1; + } + + biobuf = BIO_new_mem_buf(data, len); + if (biobuf == NULL) { + _setSSLError("Can't allocate buffer", 0, __FILE__, __LINE__); + return -1; + } + + store = SSL_CTX_get_cert_store(self->ctx); + assert(store != NULL); + + while (1) { + X509 *cert = NULL; + int r; + + if (filetype == SSL_FILETYPE_ASN1) { + cert = d2i_X509_bio(biobuf, NULL); + } else { + cert = PEM_read_bio_X509(biobuf, NULL, + self->ctx->default_passwd_callback, + self->ctx->default_passwd_callback_userdata); + } + if (cert == NULL) { + break; + } + r = X509_STORE_add_cert(store, cert); + X509_free(cert); + if (!r) { + err = ERR_peek_last_error(); + if ((ERR_GET_LIB(err) == ERR_LIB_X509) && + (ERR_GET_REASON(err) == X509_R_CERT_ALREADY_IN_HASH_TABLE)) { + /* cert already in hash table, not an error */ + ERR_clear_error(); + } else { + break; + } + } + loaded++; + } + + err = ERR_peek_last_error(); + if ((filetype == SSL_FILETYPE_ASN1) && + (loaded > 0) && + (ERR_GET_LIB(err) == ERR_LIB_ASN1) && + (ERR_GET_REASON(err) == ASN1_R_HEADER_TOO_LONG)) { + /* EOF ASN1 file, not an error */ + ERR_clear_error(); + retval = 0; + } else if ((filetype == SSL_FILETYPE_PEM) && + (loaded > 0) && + (ERR_GET_LIB(err) == ERR_LIB_PEM) && + (ERR_GET_REASON(err) == PEM_R_NO_START_LINE)) { + /* EOF PEM file, not an error */ + ERR_clear_error(); + retval = 0; + } else { + _setSSLError(NULL, 0, __FILE__, __LINE__); + retval = -1; + } + + BIO_free(biobuf); + return retval; +} + + static PyObject * load_verify_locations(PySSLContext *self, PyObject *args, PyObject *kwds) { - char *kwlist[] = {"cafile", "capath", NULL}; - PyObject *cafile = NULL, *capath = NULL; + char *kwlist[] = {"cafile", "capath", "cadata", NULL}; + PyObject *cafile = NULL, *capath = NULL, *cadata = NULL; PyObject *cafile_bytes = NULL, *capath_bytes = NULL; const char *cafile_buf = NULL, *capath_buf = NULL; - int r; + int r = 0, ok = 1; errno = 0; if 
(!PyArg_ParseTupleAndKeywords(args, kwds, - "|OO:load_verify_locations", kwlist, - &cafile, &capath)) + "|OOO:load_verify_locations", kwlist, + &cafile, &capath, &cadata)) return NULL; + if (cafile == Py_None) cafile = NULL; if (capath == Py_None) capath = NULL; - if (cafile == NULL && capath == NULL) { + if (cadata == Py_None) + cadata = NULL; + + if (cafile == NULL && capath == NULL && cadata == NULL) { PyErr_SetString(PyExc_TypeError, - "cafile and capath cannot be both omitted"); - return NULL; + "cafile, capath and cadata cannot be all omitted"); + goto error; } if (cafile && !PyUnicode_FSConverter(cafile, &cafile_bytes)) { PyErr_SetString(PyExc_TypeError, "cafile should be a valid filesystem path"); + goto error; + } + if (capath && !PyUnicode_FSConverter(capath, &capath_bytes)) { + PyErr_SetString(PyExc_TypeError, + "capath should be a valid filesystem path"); + goto error; + } + + /* validata cadata type and load cadata */ + if (cadata) { + Py_buffer buf; + PyObject *cadata_ascii = NULL; + + if (PyObject_GetBuffer(cadata, &buf, PyBUF_SIMPLE) == 0) { + if (!PyBuffer_IsContiguous(&buf, 'C') || buf.ndim > 1) { + PyBuffer_Release(&buf); + PyErr_SetString(PyExc_TypeError, + "cadata should be a contiguous buffer with " + "a single dimension"); + goto error; + } + r = _add_ca_certs(self, buf.buf, buf.len, SSL_FILETYPE_ASN1); + PyBuffer_Release(&buf); + if (r == -1) { + goto error; + } + } else { + PyErr_Clear(); + cadata_ascii = PyUnicode_AsASCIIString(cadata); + if (cadata_ascii == NULL) { + PyErr_SetString(PyExc_TypeError, + "cadata should be a ASCII string or a " + "bytes-like object"); + goto error; + } + r = _add_ca_certs(self, + PyBytes_AS_STRING(cadata_ascii), + PyBytes_GET_SIZE(cadata_ascii), + SSL_FILETYPE_PEM); + Py_DECREF(cadata_ascii); + if (r == -1) { + goto error; + } + } + } + + /* load cafile or capath */ + if (cafile || capath) { + if (cafile) + cafile_buf = PyBytes_AS_STRING(cafile_bytes); + if (capath) + capath_buf = PyBytes_AS_STRING(capath_bytes); + PySSL_BEGIN_ALLOW_THREADS + r = SSL_CTX_load_verify_locations(self->ctx, cafile_buf, capath_buf); + PySSL_END_ALLOW_THREADS + if (r != 1) { + ok = 0; + if (errno != 0) { + ERR_clear_error(); + PyErr_SetFromErrno(PyExc_IOError); + } + else { + _setSSLError(NULL, 0, __FILE__, __LINE__); + } + goto error; + } + } + goto end; + + error: + ok = 0; + end: + Py_XDECREF(cafile_bytes); + Py_XDECREF(capath_bytes); + if (ok) { + Py_RETURN_NONE; + } else { return NULL; } - if (capath && !PyUnicode_FSConverter(capath, &capath_bytes)) { - Py_XDECREF(cafile_bytes); - PyErr_SetString(PyExc_TypeError, - "capath should be a valid filesystem path"); - return NULL; - } - if (cafile) - cafile_buf = PyBytes_AS_STRING(cafile_bytes); - if (capath) - capath_buf = PyBytes_AS_STRING(capath_bytes); - PySSL_BEGIN_ALLOW_THREADS - r = SSL_CTX_load_verify_locations(self->ctx, cafile_buf, capath_buf); - PySSL_END_ALLOW_THREADS - Py_XDECREF(cafile_bytes); - Py_XDECREF(capath_bytes); - if (r != 1) { - if (errno != 0) { - ERR_clear_error(); - PyErr_SetFromErrno(PyExc_IOError); - } - else { - _setSSLError(NULL, 0, __FILE__, __LINE__); - } - return NULL; - } - Py_RETURN_NONE; } static PyObject * -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 03:40:29 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 03:40:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318379=3A_SSLSocke?= =?utf-8?q?t=2Egetpeercert=28=29_returns_CA_issuer_AIA_fields=2C_OCSP?= Message-ID: 
<3dQ4kj5G49z7Lky@mail.python.org> http://hg.python.org/cpython/rev/468d18bffdea changeset: 87303:468d18bffdea user: Christian Heimes date: Thu Nov 21 03:40:15 2013 +0100 summary: Issue #18379: SSLSocket.getpeercert() returns CA issuer AIA fields, OCSP and CRL distribution points. files: Doc/library/ssl.rst | 4 + Lib/test/test_ssl.py | 8 +- Misc/NEWS | 3 + Modules/_ssl.c | 152 ++++++++++++++++++++++++++++++- 4 files changed, 165 insertions(+), 2 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -733,6 +733,10 @@ .. versionchanged:: 3.4 :exc:`ValueError` is raised when the handshake isn't done. + .. versionchanged:: 3.4 + The returned dictionary includes additional X509v3 extension items + such as ``crlDistributionPoints``, ``caIssuers`` and ``OCSP`` URIs. + .. method:: SSLSocket.cipher() Returns a three-value tuple containing the name of the cipher being used, the diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -212,6 +212,12 @@ (('DNS', 'projects.developer.nokia.com'), ('DNS', 'projects.forum.nokia.com')) ) + # extra OCSP and AIA fields + self.assertEqual(p['OCSP'], ('http://ocsp.verisign.com',)) + self.assertEqual(p['caIssuers'], + ('http://SVRIntl-G3-aia.verisign.com/SVRIntlG3.cer',)) + self.assertEqual(p['crlDistributionPoints'], + ('http://SVRIntl-G3-crl.verisign.com/SVRIntlG3.crl',)) def test_parse_cert_CVE_2013_4238(self): p = ssl._ssl._test_decode_cert(NULLBYTECERT) @@ -905,6 +911,7 @@ 'notAfter': asn1time('Mar 29 12:29:49 2033 GMT'), 'notBefore': asn1time('Mar 30 12:29:49 2003 GMT'), 'serialNumber': '00', + 'crlDistributionPoints': ('https://www.cacert.org/revoke.crl',), 'subject': ((('organizationName', 'Root CA'),), (('organizationalUnitName', 'http://www.cacert.org'),), (('commonName', 'CA Cert Signing Authority'),), @@ -1269,7 +1276,6 @@ s.close() self.assertEqual(len(ctx.get_ca_certs()), 1) - try: import threading except ImportError: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #18379: SSLSocket.getpeercert() returns CA issuer AIA fields, OCSP + and CRL distribution points. + - Issue #18138: Implement cadata argument of SSLContext.load_verify_location() to load CA certificates and CRL from memory. It supports PEM and DER encoded strings. 
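
For illustration, a minimal sketch of loading an in-memory CA bundle through the new cadata keyword described in the Issue #18138 entry above. The CA file name is only a placeholder; cadata takes PEM text as a str, while a bytes-like object is treated as DER:

    import ssl

    with open("cacert.pem") as f:        # hypothetical CA bundle on disk
        pem_bundle = f.read()

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1)
    ctx.verify_mode = ssl.CERT_REQUIRED
    # Load the certificates from memory instead of pointing at a file/dir.
    ctx.load_verify_locations(cadata=pem_bundle)
    # Number of CA certificates now present in the context's store.
    print(ctx.cert_store_stats()["x509_ca"])
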
diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -965,6 +965,120 @@ } static PyObject * +_get_aia_uri(X509 *certificate, int nid) { + PyObject *lst = NULL, *ostr = NULL; + int i, result; + AUTHORITY_INFO_ACCESS *info; + + info = X509_get_ext_d2i(certificate, NID_info_access, NULL, NULL); + if ((info == NULL) || (sk_ACCESS_DESCRIPTION_num(info) == 0)) { + return Py_None; + } + + if ((lst = PyList_New(0)) == NULL) { + goto fail; + } + + for (i = 0; i < sk_ACCESS_DESCRIPTION_num(info); i++) { + ACCESS_DESCRIPTION *ad = sk_ACCESS_DESCRIPTION_value(info, i); + ASN1_IA5STRING *uri; + + if ((OBJ_obj2nid(ad->method) != nid) || + (ad->location->type != GEN_URI)) { + continue; + } + uri = ad->location->d.uniformResourceIdentifier; + ostr = PyUnicode_FromStringAndSize((char *)uri->data, + uri->length); + if (ostr == NULL) { + goto fail; + } + result = PyList_Append(lst, ostr); + Py_DECREF(ostr); + if (result < 0) { + goto fail; + } + } + AUTHORITY_INFO_ACCESS_free(info); + + /* convert to tuple or None */ + if (PyList_Size(lst) == 0) { + Py_DECREF(lst); + return Py_None; + } else { + PyObject *tup; + tup = PyList_AsTuple(lst); + Py_DECREF(lst); + return tup; + } + + fail: + AUTHORITY_INFO_ACCESS_free(info); + Py_DECREF(lst); + return NULL; +} + +static PyObject * +_get_crl_dp(X509 *certificate) { + STACK_OF(DIST_POINT) *dps; + int i, j, result; + PyObject *lst; + + /* Calls x509v3_cache_extensions and sets up crldp */ + X509_check_ca(certificate); + dps = certificate->crldp; + if (dps == NULL) { + return Py_None; + } + + if ((lst = PyList_New(0)) == NULL) { + return NULL; + } + + for (i=0; i < sk_DIST_POINT_num(dps); i++) { + DIST_POINT *dp; + STACK_OF(GENERAL_NAME) *gns; + + dp = sk_DIST_POINT_value(dps, i); + gns = dp->distpoint->name.fullname; + + for (j=0; j < sk_GENERAL_NAME_num(gns); j++) { + GENERAL_NAME *gn; + ASN1_IA5STRING *uri; + PyObject *ouri; + + gn = sk_GENERAL_NAME_value(gns, j); + if (gn->type != GEN_URI) { + continue; + } + uri = gn->d.uniformResourceIdentifier; + ouri = PyUnicode_FromStringAndSize((char *)uri->data, + uri->length); + if (ouri == NULL) { + Py_DECREF(lst); + return NULL; + } + result = PyList_Append(lst, ouri); + Py_DECREF(ouri); + if (result < 0) { + Py_DECREF(lst); + return NULL; + } + } + } + /* convert to tuple or None */ + if (PyList_Size(lst) == 0) { + Py_DECREF(lst); + return Py_None; + } else { + PyObject *tup; + tup = PyList_AsTuple(lst); + Py_DECREF(lst); + return tup; + } +} + +static PyObject * _decode_certificate(X509 *certificate) { PyObject *retval = NULL; @@ -974,9 +1088,10 @@ PyObject *issuer; PyObject *version; PyObject *sn_obj; + PyObject *obj; ASN1_INTEGER *serialNumber; char buf[2048]; - int len; + int len, result; ASN1_TIME *notBefore, *notAfter; PyObject *pnotBefore, *pnotAfter; @@ -1082,6 +1197,41 @@ Py_DECREF(peer_alt_names); } + /* Authority Information Access: OCSP URIs */ + obj = _get_aia_uri(certificate, NID_ad_OCSP); + if (obj == NULL) { + goto fail1; + } else if (obj != Py_None) { + result = PyDict_SetItemString(retval, "OCSP", obj); + Py_DECREF(obj); + if (result < 0) { + goto fail1; + } + } + + obj = _get_aia_uri(certificate, NID_ad_ca_issuers); + if (obj == NULL) { + goto fail1; + } else if (obj != Py_None) { + result = PyDict_SetItemString(retval, "caIssuers", obj); + Py_DECREF(obj); + if (result < 0) { + goto fail1; + } + } + + /* CDP (CRL distribution points) */ + obj = _get_crl_dp(certificate); + if (obj == NULL) { + goto fail1; + } else if (obj != Py_None) { + result = 
PyDict_SetItemString(retval, "crlDistributionPoints", obj); + Py_DECREF(obj); + if (result < 0) { + goto fail1; + } + } + BIO_free(biobuf); return retval; -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Thu Nov 21 07:37:23 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 21 Nov 2013 07:37:23 +0100 Subject: [Python-checkins] Daily reference leaks (1e79ca2bc494): sum=4 Message-ID: results for 1e79ca2bc494 on branch "default" -------------------------------------------- test_site leaked [2, 0, 0] references, sum=2 test_site leaked [2, 0, 0] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogVneMA4', '-x'] From python-checkins at python.org Thu Nov 21 10:05:19 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 10:05:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjMz?= =?utf-8?q?=3A_Fixed_writing_not_compressed_16-_and_32-bit_wave_files_on?= Message-ID: <3dQFGl73jBz7LkP@mail.python.org> http://hg.python.org/cpython/rev/7b040bc289e8 changeset: 87304:7b040bc289e8 branch: 3.3 parent: 87293:74b76a726285 user: Serhiy Storchaka date: Thu Nov 21 11:02:30 2013 +0200 summary: Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. Temporary forbidden test_unseekable_incompleted_write fornot compressed 16- and 32-bit wave file on big-endian platforms. files: Lib/test/audiotests.py | 6 ++++-- Lib/test/test_wave.py | 11 +++++++++++ Lib/wave.py | 4 +++- Misc/NEWS | 3 +++ 4 files changed, 21 insertions(+), 3 deletions(-) diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -6,7 +6,8 @@ import sys def byteswap2(data): - a = array.array('h', data) + a = array.array('h') + a.frombytes(data) a.byteswap() return a.tobytes() @@ -17,7 +18,8 @@ return bytes(ba) def byteswap4(data): - a = array.array('i', data) + a = array.array('i') + a.frombytes(data) a.byteswap() return a.tobytes() diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -48,6 +48,12 @@ if sys.byteorder != 'big': frames = audiotests.byteswap2(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + + class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, @@ -108,6 +114,11 @@ if sys.byteorder != 'big': frames = audiotests.byteswap4(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + if __name__ == '__main__': unittest.main() diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -424,7 +424,9 @@ data = self._convert(data) if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array - data = array.array(_array_fmts[self._sampwidth], data) + a = array.array(_array_fmts[self._sampwidth]) + a.frombytes(data) + data = a assert data.itemsize == self._sampwidth data.byteswap() data.tofile(self._file) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on + big-endian platforms. 
+ - Issue #19449: in csv's writerow, handle non-string keys when generating the error message that certain keys are not in the 'fieldnames' list. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 10:05:21 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 10:05:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319633=3A_Fixed_writing_not_compressed_16-_and_3?= =?utf-8?q?2-bit_wave_files_on?= Message-ID: <3dQFGn1s19z7Llb@mail.python.org> http://hg.python.org/cpython/rev/7cf7f19445ba changeset: 87305:7cf7f19445ba parent: 87303:468d18bffdea parent: 87304:7b040bc289e8 user: Serhiy Storchaka date: Thu Nov 21 11:04:22 2013 +0200 summary: Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. Temporary forbidden test_unseekable_incompleted_write fornot compressed 16- and 32-bit wave file on big-endian platforms. files: Lib/test/audiotests.py | 6 ++++-- Lib/test/test_wave.py | 11 +++++++++++ Lib/wave.py | 4 +++- Misc/NEWS | 3 +++ 4 files changed, 21 insertions(+), 3 deletions(-) diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -6,7 +6,8 @@ import sys def byteswap2(data): - a = array.array('h', data) + a = array.array('h') + a.frombytes(data) a.byteswap() return a.tobytes() @@ -17,7 +18,8 @@ return bytes(ba) def byteswap4(data): - a = array.array('i', data) + a = array.array('i') + a.frombytes(data) a.byteswap() return a.tobytes() diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -48,6 +48,12 @@ if sys.byteorder != 'big': frames = audiotests.byteswap2(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + + class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, @@ -108,6 +114,11 @@ if sys.byteorder != 'big': frames = audiotests.byteswap4(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + if __name__ == '__main__': unittest.main() diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -443,7 +443,9 @@ data = self._convert(data) if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array - data = array.array(_array_fmts[self._sampwidth], data) + a = array.array(_array_fmts[self._sampwidth]) + a.frombytes(data) + data = a assert data.itemsize == self._sampwidth data.byteswap() data.tofile(self._file) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on + big-endian platforms. + - Issue #18379: SSLSocket.getpeercert() returns CA issuer AIA fields, OCSP and CRL distribution points. 
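
A rough usage sketch for the SSLSocket.getpeercert() additions mentioned in the Issue #18379 entry above. The host name and CA bundle path are placeholders, and the new keys are simply absent when the peer certificate lacks the corresponding extensions:

    import socket
    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.load_verify_locations("cacert.pem")      # placeholder CA bundle
    with ctx.wrap_socket(socket.socket(socket.AF_INET)) as s:
        s.connect(("svn.python.org", 443))
        cert = s.getpeercert()
        # Each of these is a tuple of URIs when the extension is present.
        print(cert.get("OCSP"))
        print(cert.get("caIssuers"))
        print(cert.get("crlDistributionPoints"))
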
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 10:05:22 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 10:05:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjMz?= =?utf-8?q?=3A_Fixed_writing_not_compressed_16-_and_32-bit_wave_files_on?= Message-ID: <3dQFGp3lJfz7LmT@mail.python.org> http://hg.python.org/cpython/rev/03a32ead9c7d changeset: 87306:03a32ead9c7d branch: 2.7 parent: 87295:78c906600183 user: Serhiy Storchaka date: Thu Nov 21 11:04:37 2013 +0200 summary: Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. Temporary forbidden test_unseekable_incompleted_write fornot compressed 16- and 32-bit wave file on big-endian platforms. files: Lib/test/audiotests.py | 6 ++++-- Lib/test/test_wave.py | 11 +++++++++++ Lib/wave.py | 4 +++- Misc/NEWS | 3 +++ 4 files changed, 21 insertions(+), 3 deletions(-) diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -10,7 +10,8 @@ return base64.b16decode(s.replace(' ', '')) def byteswap2(data): - a = array.array('h', data) + a = array.array('h') + a.fromstring(data) a.byteswap() return a.tostring() @@ -21,7 +22,8 @@ return bytes(ba) def byteswap4(data): - a = array.array('i', data) + a = array.array('i') + a.fromstring(data) a.byteswap() return a.tostring() diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -48,6 +48,12 @@ if sys.byteorder != 'big': frames = audiotests.byteswap2(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + + class WavePCM24Test(audiotests.AudioWriteTests, audiotests.AudioTestsWithSourceFile, @@ -108,6 +114,11 @@ if sys.byteorder != 'big': frames = audiotests.byteswap4(frames) + if sys.byteorder == 'big': + @unittest.expectedFailure + def test_unseekable_incompleted_write(self): + super().test_unseekable_incompleted_write() + def test_main(): run_unittest(WavePCM8Test, WavePCM16Test, WavePCM24Test, WavePCM32Test) diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -424,7 +424,9 @@ data = self._convert(data) if self._sampwidth in (2, 4) and sys.byteorder == 'big': import array - data = array.array(_array_fmts[self._sampwidth], data) + a = array.array(_array_fmts[self._sampwidth]) + a.fromstring(data) + data = a assert data.itemsize == self._sampwidth data.byteswap() data.tofile(self._file) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,9 @@ Library ------- +- Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on + big-endian platforms. + - Issue #19449: in csv's writerow, handle non-string keys when generating the error message that certain keys are not in the 'fieldnames' list. 
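
To exercise the code path touched by the Issue #19633 fix, a small sketch that writes uncompressed 16-bit frames; the sample bytes are made up, and on big-endian hosts the write goes through the byte-swapping branch changed in wave.py:

    import wave

    # Four 16-bit mono samples in native byte order (made-up data).
    frames = b'\x01\x00\x02\x00\x03\x00\x04\x00'
    w = wave.open('out.wav', 'wb')
    w.setnchannels(1)
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(8000)
    w.writeframes(frames)
    w.close()
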
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 10:29:52 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 21 Nov 2013 10:29:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319183=3A_Simplify?= =?utf-8?q?_test=5Fgdb?= Message-ID: <3dQFq42PHnz7Lkx@mail.python.org> http://hg.python.org/cpython/rev/eec4758e3a45 changeset: 87307:eec4758e3a45 parent: 87305:7cf7f19445ba user: Victor Stinner date: Thu Nov 21 10:25:09 2013 +0100 summary: Issue #19183: Simplify test_gdb repr() is no more platform dependent, SipHash has been fixed files: Lib/test/test_gdb.py | 47 +++++++++++++------------------ 1 files changed, 20 insertions(+), 27 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -221,48 +221,41 @@ gdb_output = self.get_stack_trace('id(42)') self.assertTrue(BREAKPOINT_FN in gdb_output) - def get_python_repr(self, val): - args = [sys.executable, '-c', 'print(repr(%a))' % (val,)] - env = os.environ.copy() - env['PYTHONHASHSEED'] = PYTHONHASHSEED - output = subprocess.check_output(args, env=env, universal_newlines=True) - return output.rstrip() - def assertGdbRepr(self, val, exp_repr=None, cmds_after_breakpoint=None): # Ensure that gdb's rendering of the value in a debugged process # matches repr(value) in this process: gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')', cmds_after_breakpoint) if not exp_repr: - exp_repr = self.get_python_repr(val) + exp_repr = repr(val) self.assertEqual(gdb_repr, exp_repr, ('%r did not equal expected %r; full output was:\n%s' % (gdb_repr, exp_repr, gdb_output))) def test_int(self): 'Verify the pretty-printing of various int values' - self.assertGdbRepr(42, '42') - self.assertGdbRepr(0, '0') - self.assertGdbRepr(-7, '-7') - self.assertGdbRepr(1000000000000, '1000000000000') - self.assertGdbRepr(-1000000000000000, '-1000000000000000') + self.assertGdbRepr(42) + self.assertGdbRepr(0) + self.assertGdbRepr(-7) + self.assertGdbRepr(1000000000000) + self.assertGdbRepr(-1000000000000000) def test_singletons(self): 'Verify the pretty-printing of True, False and None' - self.assertGdbRepr(True, 'True') - self.assertGdbRepr(False, 'False') - self.assertGdbRepr(None, 'None') + self.assertGdbRepr(True) + self.assertGdbRepr(False) + self.assertGdbRepr(None) def test_dicts(self): 'Verify the pretty-printing of dictionaries' - self.assertGdbRepr({}, '{}') - self.assertGdbRepr({'foo': 'bar'}) - self.assertGdbRepr({'foo': 'bar', 'douglas': 42}) + self.assertGdbRepr({}) + self.assertGdbRepr({'foo': 'bar'}, "{'foo': 'bar'}") + self.assertGdbRepr({'foo': 'bar', 'douglas': 42}, "{'douglas': 42, 'foo': 'bar'}") def test_lists(self): 'Verify the pretty-printing of lists' - self.assertGdbRepr([], '[]') - self.assertGdbRepr(list(range(5)), '[0, 1, 2, 3, 4]') + self.assertGdbRepr([]) + self.assertGdbRepr(list(range(5))) def test_bytes(self): 'Verify the pretty-printing of bytes' @@ -320,9 +313,9 @@ 'Verify the pretty-printing of sets' if (gdb_major_version, gdb_minor_version) < (7, 3): self.skipTest("pretty-printing of sets needs gdb 7.3 or later") - self.assertGdbRepr(set()) - self.assertGdbRepr(set(['a', 'b'])) - self.assertGdbRepr(set([4, 5, 6])) + self.assertGdbRepr(set(), 'set()') + self.assertGdbRepr(set(['a', 'b']), "{'a', 'b'}") + self.assertGdbRepr(set([4, 5, 6]), "{4, 5, 6}") # Ensure that we handle sets containing the "dummy" key value, # which happens on deletion: @@ -335,9 +328,9 @@ 'Verify the 
pretty-printing of frozensets' if (gdb_major_version, gdb_minor_version) < (7, 3): self.skipTest("pretty-printing of frozensets needs gdb 7.3 or later") - self.assertGdbRepr(frozenset()) - self.assertGdbRepr(frozenset(['a', 'b'])) - self.assertGdbRepr(frozenset([4, 5, 6])) + self.assertGdbRepr(frozenset(), 'frozenset()') + self.assertGdbRepr(frozenset(['a', 'b']), "frozenset({'a', 'b'})") + self.assertGdbRepr(frozenset([4, 5, 6]), "frozenset({4, 5, 6})") def test_exceptions(self): # Test a RuntimeError -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 12:16:52 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 21 Nov 2013 12:16:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319578=3A_Fix_list?= =?utf-8?q?=5Fass=5Fsubscript=28=29=2C_handle_list=5Fresize=28=29_failure?= Message-ID: <3dQJBX4vPHz7Lkf@mail.python.org> http://hg.python.org/cpython/rev/b508253f2876 changeset: 87308:b508253f2876 user: Victor Stinner date: Thu Nov 21 12:16:35 2013 +0100 summary: Close #19578: Fix list_ass_subscript(), handle list_resize() failure Notify the caller of the failure (MemoryError exception). files: Objects/listobject.c | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -2483,6 +2483,7 @@ PyObject **garbage; size_t cur; Py_ssize_t i; + int res; if (slicelength <= 0) return 0; @@ -2533,14 +2534,14 @@ } Py_SIZE(self) -= slicelength; - list_resize(self, Py_SIZE(self)); + res = list_resize(self, Py_SIZE(self)); for (i = 0; i < slicelength; i++) { Py_DECREF(garbage[i]); } PyMem_FREE(garbage); - return 0; + return res; } else { /* assign slice */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 12:32:14 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 21 Nov 2013 12:32:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319568=3A_Fix_byte?= =?utf-8?q?array=5Fsetslice=5Flinear=28=29=2C_fix_handling_of?= Message-ID: <3dQJXG51zvz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/ab73b7fd7523 changeset: 87309:ab73b7fd7523 user: Victor Stinner date: Thu Nov 21 12:29:51 2013 +0100 summary: Close #19568: Fix bytearray_setslice_linear(), fix handling of PyByteArray_Resize() failure: leave the bytearray object in an consistent state. If growth < 0, handling the memory allocation failure is tricky here because the bytearray object has already been modified. If lo != 0, the operation is completed, but a MemoryError is still raised and the memory block is not shrinked. If lo == 0, the bytearray is restored in its previous state and a MemoryError is raised. 
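
For orientation, a small Python-level sketch of the two shrinking cases the summary describes; the byte values are arbitrary:

    b = bytearray(b'0123456789')
    b[0:4] = b'ab'    # shrink with lo == 0: the buffer's logical start is advanced
    c = bytearray(b'0123456789')
    c[3:7] = b'x'     # shrink with lo != 0: the tail is moved down before resizing
    print(b, c)       # bytearray(b'ab456789') bytearray(b'012x789')
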
files: Objects/bytearrayobject.c | 100 ++++++++++++++++--------- 1 files changed, 63 insertions(+), 37 deletions(-) diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -453,54 +453,80 @@ Py_ssize_t avail = hi - lo; char *buf = PyByteArray_AS_STRING(self); Py_ssize_t growth = bytes_len - avail; + int res = 0; assert(avail >= 0); - if (growth != 0) { - if (growth < 0) { - if (!_canresize(self)) + if (growth < 0) { + if (!_canresize(self)) + return -1; + + if (lo == 0) { + /* Shrink the buffer by advancing its logical start */ + self->ob_start -= growth; + /* + 0 lo hi old_size + | |<----avail----->|<-----tail------>| + | |<-bytes_len->|<-----tail------>| + 0 new_lo new_hi new_size + */ + } + else { + /* + 0 lo hi old_size + | |<----avail----->|<-----tomove------>| + | |<-bytes_len->|<-----tomove------>| + 0 lo new_hi new_size + */ + memmove(buf + lo + bytes_len, buf + hi, + Py_SIZE(self) - hi); + } + if (PyByteArray_Resize((PyObject *)self, + Py_SIZE(self) + growth) < 0) { + /* Issue #19578: Handling the memory allocation failure here is + tricky here because the bytearray object has already been + modified. Depending on growth and lo, the behaviour is + different. + + If growth < 0 and lo != 0, the operation is completed, but a + MemoryError is still raised and the memory block is not + shrinked. Otherwise, the bytearray is restored in its previous + state and a MemoryError is raised. */ + if (lo == 0) { + self->ob_start += growth; return -1; - if (lo == 0) { - /* Shrink the buffer by advancing its logical start */ - self->ob_start -= growth; - /* - 0 lo hi old_size - | |<----avail----->|<-----tail------>| - | |<-bytes_len->|<-----tail------>| - 0 new_lo new_hi new_size - */ } - else { - /* - 0 lo hi old_size - | |<----avail----->|<-----tomove------>| - | |<-bytes_len->|<-----tomove------>| - 0 lo new_hi new_size - */ - memmove(buf + lo + bytes_len, buf + hi, - Py_SIZE(self) - hi); - } + /* memmove() removed bytes, the bytearray object cannot be + restored in its previous state. */ + Py_SIZE(self) += growth; + res = -1; } - /* XXX(nnorwitz): need to verify this can't overflow! 
*/ - if (PyByteArray_Resize( - (PyObject *)self, Py_SIZE(self) + growth) < 0) + buf = PyByteArray_AS_STRING(self); + } + else if (growth > 0) { + if (Py_SIZE(self) > (Py_ssize_t)PY_SSIZE_T_MAX - growth) { + PyErr_NoMemory(); return -1; + } + + if (PyByteArray_Resize((PyObject *)self, + Py_SIZE(self) + growth) < 0) { + return -1; + } buf = PyByteArray_AS_STRING(self); - if (growth > 0) { - /* Make the place for the additional bytes */ - /* - 0 lo hi old_size - | |<-avail->|<-----tomove------>| - | |<---bytes_len-->|<-----tomove------>| - 0 lo new_hi new_size - */ - memmove(buf + lo + bytes_len, buf + hi, - Py_SIZE(self) - lo - bytes_len); - } + /* Make the place for the additional bytes */ + /* + 0 lo hi old_size + | |<-avail->|<-----tomove------>| + | |<---bytes_len-->|<-----tomove------>| + 0 lo new_hi new_size + */ + memmove(buf + lo + bytes_len, buf + hi, + Py_SIZE(self) - lo - bytes_len); } if (bytes_len > 0) memcpy(buf + lo, bytes, bytes_len); - return 0; + return res; } static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 15:47:08 2013 From: python-checkins at python.org (ronald.oussoren) Date: Thu, 21 Nov 2013 15:47:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2314455=3A_plistlib?= =?utf-8?q?_now_supports_binary_plists_and_has_an_updated_API=2E?= Message-ID: <3dQNs86nNCz7Lkt@mail.python.org> http://hg.python.org/cpython/rev/673ca119dbd0 changeset: 87310:673ca119dbd0 user: Ronald Oussoren date: Thu Nov 21 15:46:49 2013 +0100 summary: Issue #14455: plistlib now supports binary plists and has an updated API. This patch adds support for binary plists on OSX to plistlib (based on a patch by 'dpounces'). The patch also cleans up the API for the plistlib module. files: Doc/library/plistlib.rst | 181 +- Lib/plistlib.py | 1098 ++++++++-- Lib/test/test_plistlib.py | 497 +++- Mac/Tools/plistlib_generate_testdata.py | 94 + Misc/NEWS | 2 + 5 files changed, 1446 insertions(+), 426 deletions(-) diff --git a/Doc/library/plistlib.rst b/Doc/library/plistlib.rst --- a/Doc/library/plistlib.rst +++ b/Doc/library/plistlib.rst @@ -16,26 +16,21 @@ -------------- This module provides an interface for reading and writing the "property list" -XML files used mainly by Mac OS X. +files used mainly by Mac OS X and supports both binary and XML plist files. -The property list (``.plist``) file format is a simple XML pickle supporting +The property list (``.plist``) file format is a simple serialization supporting basic object types, like dictionaries, lists, numbers and strings. Usually the top level object is a dictionary. -To write out and to parse a plist file, use the :func:`writePlist` and -:func:`readPlist` functions. +To write out and to parse a plist file, use the :func:`dump` and +:func:`load` functions. -To work with plist data in bytes objects, use :func:`writePlistToBytes` -and :func:`readPlistFromBytes`. +To work with plist data in bytes objects, use :func:`dumps` +and :func:`loads`. Values can be strings, integers, floats, booleans, tuples, lists, dictionaries -(but only with string keys), :class:`Data` or :class:`datetime.datetime` -objects. String values (including dictionary keys) have to be unicode strings -- -they will be written out as UTF-8. - -The ```` plist type is supported through the :class:`Data` class. This is -a thin wrapper around a Python bytes object. Use :class:`Data` if your strings -contain control characters. 
+(but only with string keys), :class:`Data`, :class:`bytes`, :class:`bytesarray` +or :class:`datetime.datetime` objects. .. seealso:: @@ -45,37 +40,145 @@ This module defines the following functions: +.. function:: load(fp, \*, fmt=None, use_builtin_types=True, dict_type=dict) + + Read a plist file. *fp* should be a readable and binary file object. + Return the unpacked root object (which usually is a + dictionary). + + The *fmt* is the format of the file and the following values are valid: + + * :data:`None`: Autodetect the file format + + * :data:`FMT_XML`: XML file format + + * :data:`FMT_BINARY`: Binary plist format + + If *use_builtin_types* is True (the default) binary data will be returned + as instances of :class:`bytes`, otherwise it is returned as instances of + :class:`Data`. + + The *dict_type* is the type used for dictionaries that are read from the + plist file. The exact structure of the plist can be recovered by using + :class:`collections.OrderedDict` (although the order of keys shouldn't be + important in plist files). + + XML data for the :data:`FMT_XML` format is parsed using the Expat parser + from :mod:`xml.parsers.expat` -- see its documentation for possible + exceptions on ill-formed XML. Unknown elements will simply be ignored + by the plist parser. + + The parser for the binary format raises :exc:`InvalidFileException` + when the file cannot be parsed. + + .. versionadded:: 3.4 + + +.. function:: loads(data, \*, fmt=None, use_builtin_types=True, dict_type=dict) + + Load a plist from a bytes object. See :func:`load` for an explanation of + the keyword arguments. + + +.. function:: dump(value, fp, \*, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + Write *value* to a plist file. *Fp* should be a writable, binary + file object. + + The *fmt* argument specifies the format of the plist file and can be + one of the following values: + + * :data:`FMT_XML`: XML formatted plist file + + * :data:`FMT_BINARY`: Binary formatted plist file + + When *sort_keys* is true (the default) the keys for dictionaries will be + written to the plist in sorted order, otherwise they will be written in + the iteration order of the dictionary. + + When *skipkeys* is false (the default) the function raises :exc:`TypeError` + when a key of a dictionary is not a string, otherwise such keys are skipped. + + A :exc:`TypeError` will be raised if the object is of an unsupported type or + a container that contains objects of unsupported types. + + .. versionchanged:: 3.4 + Added the *fmt*, *sort_keys* and *skipkeys* arguments. + + +.. function:: dumps(value, \*, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + Return *value* as a plist-formatted bytes object. See + the documentation for :func:`dump` for an explanation of the keyword + arguments of this function. + + +The following functions are deprecated: + .. function:: readPlist(pathOrFile) - Read a plist file. *pathOrFile* may either be a file name or a (readable and - binary) file object. Return the unpacked root object (which usually is a - dictionary). + Read a plist file. *pathOrFile* may be either a file name or a (readable + and binary) file object. Returns the unpacked root object (which usually + is a dictionary). - The XML data is parsed using the Expat parser from :mod:`xml.parsers.expat` - -- see its documentation for possible exceptions on ill-formed XML. - Unknown elements will simply be ignored by the plist parser. 
+ This function calls :func:`load` to do the actual work, the the documentation + of :func:`that function ` for an explanation of the keyword arguments. + + .. note:: + + Dict values in the result have a ``__getattr__`` method that defers + to ``__getitem_``. This means that you can use attribute access to + access items of these dictionaries. + + .. deprecated: 3.4 Use :func:`load` instead. .. function:: writePlist(rootObject, pathOrFile) - Write *rootObject* to a plist file. *pathOrFile* may either be a file name - or a (writable and binary) file object. + Write *rootObject* to an XML plist file. *pathOrFile* may be either a file name + or a (writable and binary) file object - A :exc:`TypeError` will be raised if the object is of an unsupported type or - a container that contains objects of unsupported types. + .. deprecated: 3.4 Use :func:`dump` instead. .. function:: readPlistFromBytes(data) Read a plist data from a bytes object. Return the root object. + See :func:`load` for a description of the keyword arguments. + + .. note:: + + Dict values in the result have a ``__getattr__`` method that defers + to ``__getitem_``. This means that you can use attribute access to + access items of these dictionaries. + + .. deprecated:: 3.4 Use :func:`loads` instead. + .. function:: writePlistToBytes(rootObject) - Return *rootObject* as a plist-formatted bytes object. + Return *rootObject* as an XML plist-formatted bytes object. + .. deprecated:: 3.4 Use :func:`dumps` instead. -The following class is available: + .. versionchanged:: 3.4 + Added the *fmt*, *sort_keys* and *skipkeys* arguments. + + +The following classes are available: + +.. class:: Dict([dict]): + + Return an extended mapping object with the same value as dictionary + *dict*. + + This class is a subclass of :class:`dict` where attribute access can + be used to access items. That is, ``aDict.key`` is the same as + ``aDict['key']`` for getting, setting and deleting items in the mapping. + + .. deprecated:: 3.0 + .. class:: Data(data) @@ -86,6 +189,24 @@ It has one attribute, :attr:`data`, that can be used to retrieve the Python bytes object stored in it. + .. deprecated:: 3.4 Use a :class:`bytes` object instead + + +The following constants are avaiable: + +.. data:: FMT_XML + + The XML format for plist files. + + .. versionadded:: 3.4 + + +.. data:: FMT_BINARY + + The binary format for plist files + + .. versionadded:: 3.4 + Examples -------- @@ -103,13 +224,15 @@ aTrueValue = True, aFalseValue = False, ), - someData = Data(b""), - someMoreData = Data(b"" * 10), + someData = b"", + someMoreData = b"" * 10, aDate = datetime.datetime.fromtimestamp(time.mktime(time.gmtime())), ) - writePlist(pl, fileName) + with open(fileName, 'wb') as fp: + dump(pl, fp) Parsing a plist:: - pl = readPlist(pathOrFile) + with open(fileName, 'rb') as fp: + pl = load(fp) print(pl["aKey"]) diff --git a/Lib/plistlib.py b/Lib/plistlib.py --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -4,25 +4,20 @@ basic object types, like dictionaries, lists, numbers and strings. Usually the top level object is a dictionary. -To write out a plist file, use the writePlist(rootObject, pathOrFile) -function. 'rootObject' is the top level object, 'pathOrFile' is a -filename or a (writable) file object. +To write out a plist file, use the dump(value, file) +function. 'value' is the top level object, 'file' is +a (writable) file object. -To parse a plist from a file, use the readPlist(pathOrFile) function, -with a file name or a (readable) file object as the only argument. 
It +To parse a plist from a file, use the load(file) function, +with a (readable) file object as the only argument. It returns the top level object (again, usually a dictionary). -To work with plist data in bytes objects, you can use readPlistFromBytes() -and writePlistToBytes(). +To work with plist data in bytes objects, you can use loads() +and dumps(). Values can be strings, integers, floats, booleans, tuples, lists, -dictionaries (but only with string keys), Data or datetime.datetime objects. -String values (including dictionary keys) have to be unicode strings -- they -will be written out as UTF-8. - -The plist type is supported through the Data class. This is a -thin wrapper around a Python bytes object. Use 'Data' if your strings -contain control characters. +dictionaries (but only with string keys), Data, bytes, bytearray, or +datetime.datetime objects. Generate Plist example: @@ -37,226 +32,48 @@ aTrueValue = True, aFalseValue = False, ), - someData = Data(b""), - someMoreData = Data(b"" * 10), + someData = b"", + someMoreData = b"" * 10, aDate = datetime.datetime.fromtimestamp(time.mktime(time.gmtime())), ) - writePlist(pl, fileName) + with open(fileName, 'wb') as fp: + dump(pl, fp) Parse Plist example: - pl = readPlist(pathOrFile) - print pl["aKey"] + with open(fileName, 'rb') as fp: + pl = load(fp) + print(pl["aKey"]) """ - - __all__ = [ "readPlist", "writePlist", "readPlistFromBytes", "writePlistToBytes", - "Plist", "Data", "Dict" + "Plist", "Data", "Dict", "FMT_XML", "FMT_BINARY", + "load", "dump", "loads", "dumps" ] -# Note: the Plist and Dict classes have been deprecated. import binascii +import codecs +import contextlib import datetime +import enum from io import BytesIO +import itertools +import os import re +import struct +from warnings import warn +from xml.parsers.expat import ParserCreate -def readPlist(pathOrFile): - """Read a .plist file. 'pathOrFile' may either be a file name or a - (readable) file object. Return the unpacked root object (which - usually is a dictionary). - """ - didOpen = False - try: - if isinstance(pathOrFile, str): - pathOrFile = open(pathOrFile, 'rb') - didOpen = True - p = PlistParser() - rootObject = p.parse(pathOrFile) - finally: - if didOpen: - pathOrFile.close() - return rootObject +PlistFormat = enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__) +globals().update(PlistFormat.__members__) -def writePlist(rootObject, pathOrFile): - """Write 'rootObject' to a .plist file. 'pathOrFile' may either be a - file name or a (writable) file object. - """ - didOpen = False - try: - if isinstance(pathOrFile, str): - pathOrFile = open(pathOrFile, 'wb') - didOpen = True - writer = PlistWriter(pathOrFile) - writer.writeln("") - writer.writeValue(rootObject) - writer.writeln("") - finally: - if didOpen: - pathOrFile.close() - - -def readPlistFromBytes(data): - """Read a plist data from a bytes object. Return the root object. - """ - return readPlist(BytesIO(data)) - - -def writePlistToBytes(rootObject): - """Return 'rootObject' as a plist-formatted bytes object. 
- """ - f = BytesIO() - writePlist(rootObject, f) - return f.getvalue() - - -class DumbXMLWriter: - def __init__(self, file, indentLevel=0, indent="\t"): - self.file = file - self.stack = [] - self.indentLevel = indentLevel - self.indent = indent - - def beginElement(self, element): - self.stack.append(element) - self.writeln("<%s>" % element) - self.indentLevel += 1 - - def endElement(self, element): - assert self.indentLevel > 0 - assert self.stack.pop() == element - self.indentLevel -= 1 - self.writeln("" % element) - - def simpleElement(self, element, value=None): - if value is not None: - value = _escape(value) - self.writeln("<%s>%s" % (element, value, element)) - else: - self.writeln("<%s/>" % element) - - def writeln(self, line): - if line: - # plist has fixed encoding of utf-8 - if isinstance(line, str): - line = line.encode('utf-8') - self.file.write(self.indentLevel * self.indent) - self.file.write(line) - self.file.write(b'\n') - - -# Contents should conform to a subset of ISO 8601 -# (in particular, YYYY '-' MM '-' DD 'T' HH ':' MM ':' SS 'Z'. Smaller units may be omitted with -# a loss of precision) -_dateParser = re.compile(r"(?P\d\d\d\d)(?:-(?P\d\d)(?:-(?P\d\d)(?:T(?P\d\d)(?::(?P\d\d)(?::(?P\d\d))?)?)?)?)?Z", re.ASCII) - -def _dateFromString(s): - order = ('year', 'month', 'day', 'hour', 'minute', 'second') - gd = _dateParser.match(s).groupdict() - lst = [] - for key in order: - val = gd[key] - if val is None: - break - lst.append(int(val)) - return datetime.datetime(*lst) - -def _dateToString(d): - return '%04d-%02d-%02dT%02d:%02d:%02dZ' % ( - d.year, d.month, d.day, - d.hour, d.minute, d.second - ) - - -# Regex to find any control chars, except for \t \n and \r -_controlCharPat = re.compile( - r"[\x00\x01\x02\x03\x04\x05\x06\x07\x08\x0b\x0c\x0e\x0f" - r"\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f]") - -def _escape(text): - m = _controlCharPat.search(text) - if m is not None: - raise ValueError("strings can't contains control characters; " - "use plistlib.Data instead") - text = text.replace("\r\n", "\n") # convert DOS line endings - text = text.replace("\r", "\n") # convert Mac line endings - text = text.replace("&", "&") # escape '&' - text = text.replace("<", "<") # escape '<' - text = text.replace(">", ">") # escape '>' - return text - - -PLISTHEADER = b"""\ - - -""" - -class PlistWriter(DumbXMLWriter): - - def __init__(self, file, indentLevel=0, indent=b"\t", writeHeader=1): - if writeHeader: - file.write(PLISTHEADER) - DumbXMLWriter.__init__(self, file, indentLevel, indent) - - def writeValue(self, value): - if isinstance(value, str): - self.simpleElement("string", value) - elif isinstance(value, bool): - # must switch for bool before int, as bool is a - # subclass of int... 
- if value: - self.simpleElement("true") - else: - self.simpleElement("false") - elif isinstance(value, int): - self.simpleElement("integer", "%d" % value) - elif isinstance(value, float): - self.simpleElement("real", repr(value)) - elif isinstance(value, dict): - self.writeDict(value) - elif isinstance(value, Data): - self.writeData(value) - elif isinstance(value, datetime.datetime): - self.simpleElement("date", _dateToString(value)) - elif isinstance(value, (tuple, list)): - self.writeArray(value) - else: - raise TypeError("unsupported type: %s" % type(value)) - - def writeData(self, data): - self.beginElement("data") - self.indentLevel -= 1 - maxlinelength = max(16, 76 - len(self.indent.replace(b"\t", b" " * 8) * - self.indentLevel)) - for line in data.asBase64(maxlinelength).split(b"\n"): - if line: - self.writeln(line) - self.indentLevel += 1 - self.endElement("data") - - def writeDict(self, d): - if d: - self.beginElement("dict") - items = sorted(d.items()) - for key, value in items: - if not isinstance(key, str): - raise TypeError("keys must be strings") - self.simpleElement("key", key) - self.writeValue(value) - self.endElement("dict") - else: - self.simpleElement("dict") - - def writeArray(self, array): - if array: - self.beginElement("array") - for value in array: - self.writeValue(value) - self.endElement("array") - else: - self.simpleElement("array") +# +# +# Deprecated functionality +# +# class _InternalDict(dict): @@ -264,19 +81,18 @@ # This class is needed while Dict is scheduled for deprecation: # we only need to warn when a *user* instantiates Dict or when # the "attribute notation for dict keys" is used. + __slots__ = () def __getattr__(self, attr): try: value = self[attr] except KeyError: raise AttributeError(attr) - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) return value def __setattr__(self, attr, value): - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) self[attr] = value @@ -286,56 +102,111 @@ del self[attr] except KeyError: raise AttributeError(attr) - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) + class Dict(_InternalDict): def __init__(self, **kwargs): - from warnings import warn warn("The plistlib.Dict class is deprecated, use builtin dict instead", DeprecationWarning, 2) super().__init__(**kwargs) + at contextlib.contextmanager +def _maybe_open(pathOrFile, mode): + if isinstance(pathOrFile, str): + with open(pathOrFile, mode) as fp: + yield fp + + else: + yield pathOrFile + + class Plist(_InternalDict): - - """This class has been deprecated. Use readPlist() and writePlist() + """This class has been deprecated. Use dump() and load() functions instead, together with regular dict objects. """ def __init__(self, **kwargs): - from warnings import warn - warn("The Plist class is deprecated, use the readPlist() and " - "writePlist() functions instead", DeprecationWarning, 2) + warn("The Plist class is deprecated, use the load() and " + "dump() functions instead", DeprecationWarning, 2) super().__init__(**kwargs) + @classmethod def fromFile(cls, pathOrFile): - """Deprecated. Use the readPlist() function instead.""" - rootObject = readPlist(pathOrFile) + """Deprecated. 
Use the load() function instead.""" + with maybe_open(pathOrFile, 'rb') as fp: + value = load(fp) plist = cls() - plist.update(rootObject) + plist.update(value) return plist - fromFile = classmethod(fromFile) def write(self, pathOrFile): - """Deprecated. Use the writePlist() function instead.""" - writePlist(self, pathOrFile) + """Deprecated. Use the dump() function instead.""" + with _maybe_open(pathOrFile, 'wb') as fp: + dump(self, fp) -def _encodeBase64(s, maxlinelength=76): - # copied from base64.encodebytes(), with added maxlinelength argument - maxbinsize = (maxlinelength//4)*3 - pieces = [] - for i in range(0, len(s), maxbinsize): - chunk = s[i : i + maxbinsize] - pieces.append(binascii.b2a_base64(chunk)) - return b''.join(pieces) +def readPlist(pathOrFile): + """ + Read a .plist from a path or file. pathOrFile should either + be a file name, or a readable binary file object. + + This function is deprecated, use load instead. + """ + warn("The readPlist function is deprecated, use load() instead", + DeprecationWarning, 2) + + with _maybe_open(pathOrFile, 'rb') as fp: + return load(fp, fmt=None, use_builtin_types=False, + dict_type=_InternalDict) + +def writePlist(value, pathOrFile): + """ + Write 'value' to a .plist file. 'pathOrFile' may either be a + file name or a (writable) file object. + + This function is deprecated, use dump instead. + """ + warn("The writePlist function is deprecated, use dump() instead", + DeprecationWarning, 2) + with _maybe_open(pathOrFile, 'wb') as fp: + dump(value, fp, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + +def readPlistFromBytes(data): + """ + Read a plist data from a bytes object. Return the root object. + + This function is deprecated, use loads instead. + """ + warn("The readPlistFromBytes function is deprecated, use loads() instead", + DeprecationWarning, 2) + return load(BytesIO(data), fmt=None, use_builtin_types=False, + dict_type=_InternalDict) + + +def writePlistToBytes(value): + """ + Return 'value' as a plist-formatted bytes object. + + This function is deprecated, use dumps instead. + """ + warn("The writePlistToBytes function is deprecated, use dumps() instead", + DeprecationWarning, 2) + f = BytesIO() + dump(value, f, fmt=FMT_XML, sort_keys=True, skipkeys=False) + return f.getvalue() + class Data: + """ + Wrapper for binary data. - """Wrapper for binary data.""" + This class is deprecated, use a bytes object instead. + """ def __init__(self, data): if not isinstance(data, bytes): @@ -346,10 +217,10 @@ def fromBase64(cls, data): # base64.decodebytes just calls binascii.a2b_base64; # it seems overkill to use both base64 and binascii. 
- return cls(binascii.a2b_base64(data)) + return cls(_decode_base64(data)) def asBase64(self, maxlinelength=76): - return _encodeBase64(self.data, maxlinelength) + return _encode_base64(self.data, maxlinelength) def __eq__(self, other): if isinstance(other, self.__class__): @@ -362,43 +233,119 @@ def __repr__(self): return "%s(%s)" % (self.__class__.__name__, repr(self.data)) -class PlistParser: +# +# +# End of deprecated functionality +# +# - def __init__(self): + +# +# XML support +# + + +# XML 'header' +PLISTHEADER = b"""\ + + +""" + + +# Regex to find any control chars, except for \t \n and \r +_controlCharPat = re.compile( + r"[\x00\x01\x02\x03\x04\x05\x06\x07\x08\x0b\x0c\x0e\x0f" + r"\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f]") + +def _encode_base64(s, maxlinelength=76): + # copied from base64.encodebytes(), with added maxlinelength argument + maxbinsize = (maxlinelength//4)*3 + pieces = [] + for i in range(0, len(s), maxbinsize): + chunk = s[i : i + maxbinsize] + pieces.append(binascii.b2a_base64(chunk)) + return b''.join(pieces) + +def _decode_base64(s): + if isinstance(s, str): + return binascii.a2b_base64(s.encode("utf-8")) + + else: + return binascii.a2b_base64(s) + +# Contents should conform to a subset of ISO 8601 +# (in particular, YYYY '-' MM '-' DD 'T' HH ':' MM ':' SS 'Z'. Smaller units +# may be omitted with # a loss of precision) +_dateParser = re.compile(r"(?P\d\d\d\d)(?:-(?P\d\d)(?:-(?P\d\d)(?:T(?P\d\d)(?::(?P\d\d)(?::(?P\d\d))?)?)?)?)?Z", re.ASCII) + + +def _date_from_string(s): + order = ('year', 'month', 'day', 'hour', 'minute', 'second') + gd = _dateParser.match(s).groupdict() + lst = [] + for key in order: + val = gd[key] + if val is None: + break + lst.append(int(val)) + return datetime.datetime(*lst) + + +def _date_to_string(d): + return '%04d-%02d-%02dT%02d:%02d:%02dZ' % ( + d.year, d.month, d.day, + d.hour, d.minute, d.second + ) + +def _escape(text): + m = _controlCharPat.search(text) + if m is not None: + raise ValueError("strings can't contains control characters; " + "use bytes instead") + text = text.replace("\r\n", "\n") # convert DOS line endings + text = text.replace("\r", "\n") # convert Mac line endings + text = text.replace("&", "&") # escape '&' + text = text.replace("<", "<") # escape '<' + text = text.replace(">", ">") # escape '>' + return text + +class _PlistParser: + def __init__(self, use_builtin_types, dict_type): self.stack = [] - self.currentKey = None + self.current_key = None self.root = None + self._use_builtin_types = use_builtin_types + self._dict_type = dict_type def parse(self, fileobj): - from xml.parsers.expat import ParserCreate self.parser = ParserCreate() - self.parser.StartElementHandler = self.handleBeginElement - self.parser.EndElementHandler = self.handleEndElement - self.parser.CharacterDataHandler = self.handleData + self.parser.StartElementHandler = self.handle_begin_element + self.parser.EndElementHandler = self.handle_end_element + self.parser.CharacterDataHandler = self.handle_data self.parser.ParseFile(fileobj) return self.root - def handleBeginElement(self, element, attrs): + def handle_begin_element(self, element, attrs): self.data = [] handler = getattr(self, "begin_" + element, None) if handler is not None: handler(attrs) - def handleEndElement(self, element): + def handle_end_element(self, element): handler = getattr(self, "end_" + element, None) if handler is not None: handler() - def handleData(self, data): + def handle_data(self, data): self.data.append(data) - def addObject(self, 
value): - if self.currentKey is not None: + def add_object(self, value): + if self.current_key is not None: if not isinstance(self.stack[-1], type({})): raise ValueError("unexpected element at line %d" % self.parser.CurrentLineNumber) - self.stack[-1][self.currentKey] = value - self.currentKey = None + self.stack[-1][self.current_key] = value + self.current_key = None elif not self.stack: # this is the root object self.root = value @@ -408,7 +355,7 @@ self.parser.CurrentLineNumber) self.stack[-1].append(value) - def getData(self): + def get_data(self): data = ''.join(self.data) self.data = [] return data @@ -416,39 +363,648 @@ # element handlers def begin_dict(self, attrs): - d = _InternalDict() - self.addObject(d) + d = self._dict_type() + self.add_object(d) self.stack.append(d) + def end_dict(self): - if self.currentKey: + if self.current_key: raise ValueError("missing value for key '%s' at line %d" % - (self.currentKey,self.parser.CurrentLineNumber)) + (self.current_key,self.parser.CurrentLineNumber)) self.stack.pop() def end_key(self): - if self.currentKey or not isinstance(self.stack[-1], type({})): + if self.current_key or not isinstance(self.stack[-1], type({})): raise ValueError("unexpected key at line %d" % self.parser.CurrentLineNumber) - self.currentKey = self.getData() + self.current_key = self.get_data() def begin_array(self, attrs): a = [] - self.addObject(a) + self.add_object(a) self.stack.append(a) + def end_array(self): self.stack.pop() def end_true(self): - self.addObject(True) + self.add_object(True) + def end_false(self): - self.addObject(False) + self.add_object(False) + def end_integer(self): - self.addObject(int(self.getData())) + self.add_object(int(self.get_data())) + def end_real(self): - self.addObject(float(self.getData())) + self.add_object(float(self.get_data())) + def end_string(self): - self.addObject(self.getData()) + self.add_object(self.get_data()) + def end_data(self): - self.addObject(Data.fromBase64(self.getData().encode("utf-8"))) + if self._use_builtin_types: + self.add_object(_decode_base64(self.get_data())) + + else: + self.add_object(Data.fromBase64(self.get_data())) + def end_date(self): - self.addObject(_dateFromString(self.getData())) + self.add_object(_date_from_string(self.get_data())) + + +class _DumbXMLWriter: + def __init__(self, file, indent_level=0, indent="\t"): + self.file = file + self.stack = [] + self._indent_level = indent_level + self.indent = indent + + def begin_element(self, element): + self.stack.append(element) + self.writeln("<%s>" % element) + self._indent_level += 1 + + def end_element(self, element): + assert self._indent_level > 0 + assert self.stack.pop() == element + self._indent_level -= 1 + self.writeln("" % element) + + def simple_element(self, element, value=None): + if value is not None: + value = _escape(value) + self.writeln("<%s>%s" % (element, value, element)) + + else: + self.writeln("<%s/>" % element) + + def writeln(self, line): + if line: + # plist has fixed encoding of utf-8 + + # XXX: is this test needed? 
+ if isinstance(line, str): + line = line.encode('utf-8') + self.file.write(self._indent_level * self.indent) + self.file.write(line) + self.file.write(b'\n') + + +class _PlistWriter(_DumbXMLWriter): + def __init__( + self, file, indent_level=0, indent=b"\t", writeHeader=1, + sort_keys=True, skipkeys=False): + + if writeHeader: + file.write(PLISTHEADER) + _DumbXMLWriter.__init__(self, file, indent_level, indent) + self._sort_keys = sort_keys + self._skipkeys = skipkeys + + def write(self, value): + self.writeln("") + self.write_value(value) + self.writeln("") + + def write_value(self, value): + if isinstance(value, str): + self.simple_element("string", value) + + elif value is True: + self.simple_element("true") + + elif value is False: + self.simple_element("false") + + elif isinstance(value, int): + self.simple_element("integer", "%d" % value) + + elif isinstance(value, float): + self.simple_element("real", repr(value)) + + elif isinstance(value, dict): + self.write_dict(value) + + elif isinstance(value, Data): + self.write_data(value) + + elif isinstance(value, (bytes, bytearray)): + self.write_bytes(value) + + elif isinstance(value, datetime.datetime): + self.simple_element("date", _date_to_string(value)) + + elif isinstance(value, (tuple, list)): + self.write_array(value) + + else: + raise TypeError("unsupported type: %s" % type(value)) + + def write_data(self, data): + self.write_bytes(data.data) + + def write_bytes(self, data): + self.begin_element("data") + self._indent_level -= 1 + maxlinelength = max( + 16, + 76 - len(self.indent.replace(b"\t", b" " * 8) * self._indent_level)) + + for line in _encode_base64(data, maxlinelength).split(b"\n"): + if line: + self.writeln(line) + self._indent_level += 1 + self.end_element("data") + + def write_dict(self, d): + if d: + self.begin_element("dict") + if self._sort_keys: + items = sorted(d.items()) + else: + items = d.items() + + for key, value in items: + if not isinstance(key, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + self.simple_element("key", key) + self.write_value(value) + self.end_element("dict") + + else: + self.simple_element("dict") + + def write_array(self, array): + if array: + self.begin_element("array") + for value in array: + self.write_value(value) + self.end_element("array") + + else: + self.simple_element("array") + + +def _is_fmt_xml(header): + prefixes = (b'offset... + # TRAILER + self._fp = fp + self._fp.seek(-32, os.SEEK_END) + trailer = self._fp.read(32) + if len(trailer) != 32: + raise InvalidFileException() + ( + offset_size, self._ref_size, num_objects, top_object, + offset_table_offset + ) = struct.unpack('>6xBBQQQ', trailer) + self._fp.seek(offset_table_offset) + offset_format = '>' + _BINARY_FORMAT[offset_size] * num_objects + self._ref_format = _BINARY_FORMAT[self._ref_size] + self._object_offsets = struct.unpack( + offset_format, self._fp.read(offset_size * num_objects)) + return self._read_object(self._object_offsets[top_object]) + + except (OSError, IndexError, struct.error): + raise InvalidFileException() + + def _get_size(self, tokenL): + """ return the size of the next object.""" + if tokenL == 0xF: + m = self._fp.read(1)[0] & 0x3 + s = 1 << m + f = '>' + _BINARY_FORMAT[s] + return struct.unpack(f, self._fp.read(s))[0] + + return tokenL + + def _read_refs(self, n): + return struct.unpack( + '>' + self._ref_format * n, self._fp.read(n * self._ref_size)) + + def _read_object(self, offset): + """ + read the object at offset. 
+ + May recursively read sub-objects (content of an array/dict/set) + """ + self._fp.seek(offset) + token = self._fp.read(1)[0] + tokenH, tokenL = token & 0xF0, token & 0x0F + + if token == 0x00: + return None + + elif token == 0x08: + return False + + elif token == 0x09: + return True + + # The referenced source code also mentions URL (0x0c, 0x0d) and + # UUID (0x0e), but neither can be generated using the Cocoa libraries. + + elif token == 0x0f: + return b'' + + elif tokenH == 0x10: # int + return int.from_bytes(self._fp.read(1 << tokenL), 'big') + + elif token == 0x22: # real + return struct.unpack('>f', self._fp.read(4))[0] + + elif token == 0x23: # real + return struct.unpack('>d', self._fp.read(8))[0] + + elif token == 0x33: # date + f = struct.unpack('>d', self._fp.read(8))[0] + # timestamp 0 of binary plists corresponds to 1/1/2001 + # (year of Mac OS X 10.0), instead of 1/1/1970. + return datetime.datetime.utcfromtimestamp(f + (31 * 365 + 8) * 86400) + + elif tokenH == 0x40: # data + s = self._get_size(tokenL) + if self._use_builtin_types: + return self._fp.read(s) + else: + return Data(self._fp.read(s)) + + elif tokenH == 0x50: # ascii string + s = self._get_size(tokenL) + result = self._fp.read(s).decode('ascii') + return result + + elif tokenH == 0x60: # unicode string + s = self._get_size(tokenL) + return self._fp.read(s * 2).decode('utf-16be') + + # tokenH == 0x80 is documented as 'UID' and appears to be used for + # keyed-archiving, not in plists. + + elif tokenH == 0xA0: # array + s = self._get_size(tokenL) + obj_refs = self._read_refs(s) + return [self._read_object(self._object_offsets[x]) + for x in obj_refs] + + # tokenH == 0xB0 is documented as 'ordset', but is not actually + # implemented in the Apple reference code. + + # tokenH == 0xC0 is documented as 'set', but sets cannot be used in + # plists. + + elif tokenH == 0xD0: # dict + s = self._get_size(tokenL) + key_refs = self._read_refs(s) + obj_refs = self._read_refs(s) + result = self._dict_type() + for k, o in zip(key_refs, obj_refs): + result[self._read_object(self._object_offsets[k]) + ] = self._read_object(self._object_offsets[o]) + return result + + raise InvalidFileException() + +def _count_to_size(count): + if count < 1 << 8: + return 1 + + elif count < 1 << 16: + return 2 + + elif count << 1 << 32: + return 4 + + else: + return 8 + +class _BinaryPlistWriter (object): + def __init__(self, fp, sort_keys, skipkeys): + self._fp = fp + self._sort_keys = sort_keys + self._skipkeys = skipkeys + + def write(self, value): + + # Flattened object list: + self._objlist = [] + + # Mappings from object->objectid + # First dict has (type(object), object) as the key, + # second dict is used when object is not hashable and + # has id(object) as the key. + self._objtable = {} + self._objidtable = {} + + # Create list of all objects in the plist + self._flatten(value) + + # Size of object references in serialized containers + # depends on the number of objects in the plist. 
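        # [Editor's note: a minimal illustration of the sizing rule described
        # above, assuming the _BINARY_FORMAT table {1: 'B', 2: 'H', 4: 'L',
        # 8: 'Q'} that the rest of this patch indexes with the chosen size:
        #
        #     >>> import struct
        #     >>> refs = (0, 1, 2)                        # three object refs
        #     >>> _count_to_size(len(refs))               # 1 byte per ref
        #     1
        #     >>> struct.pack('>' + 'B' * len(refs), *refs)
        #     b'\x00\x01\x02'
        #
        # With more than 255 objects _count_to_size() returns 2 and the
        # references are packed with 'H' instead.]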
+ num_objects = len(self._objlist) + self._object_offsets = [0]*num_objects + self._ref_size = _count_to_size(num_objects) + + self._ref_format = _BINARY_FORMAT[self._ref_size] + + # Write file header + self._fp.write(b'bplist00') + + # Write object list + for obj in self._objlist: + self._write_object(obj) + + # Write refnum->object offset table + top_object = self._getrefnum(value) + offset_table_offset = self._fp.tell() + offset_size = _count_to_size(offset_table_offset) + offset_format = '>' + _BINARY_FORMAT[offset_size] * num_objects + self._fp.write(struct.pack(offset_format, *self._object_offsets)) + + # Write trailer + sort_version = 0 + trailer = ( + sort_version, offset_size, self._ref_size, num_objects, + top_object, offset_table_offset + ) + self._fp.write(struct.pack('>5xBBBQQQ', *trailer)) + + def _flatten(self, value): + # First check if the object is in the object table, not used for + # containers to ensure that two subcontainers with the same contents + # will be serialized as distinct values. + if isinstance(value, ( + str, int, float, datetime.datetime, bytes, bytearray)): + if (type(value), value) in self._objtable: + return + + elif isinstance(value, Data): + if (type(value.data), value.data) in self._objtable: + return + + # Add to objectreference map + refnum = len(self._objlist) + self._objlist.append(value) + try: + if isinstance(value, Data): + self._objtable[(type(value.data), value.data)] = refnum + else: + self._objtable[(type(value), value)] = refnum + except TypeError: + self._objidtable[id(value)] = refnum + + # And finally recurse into containers + if isinstance(value, dict): + keys = [] + values = [] + items = value.items() + if self._sort_keys: + items = sorted(items) + + for k, v in items: + if not isinstance(k, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + keys.append(k) + values.append(v) + + for o in itertools.chain(keys, values): + self._flatten(o) + + elif isinstance(value, (list, tuple)): + for o in value: + self._flatten(o) + + def _getrefnum(self, value): + try: + if isinstance(value, Data): + return self._objtable[(type(value.data), value.data)] + else: + return self._objtable[(type(value), value)] + except TypeError: + return self._objidtable[id(value)] + + def _write_size(self, token, size): + if size < 15: + self._fp.write(struct.pack('>B', token | size)) + + elif size < 1 << 8: + self._fp.write(struct.pack('>BBB', token | 0xF, 0x10, size)) + + elif size < 1 << 16: + self._fp.write(struct.pack('>BBH', token | 0xF, 0x11, size)) + + elif size < 1 << 32: + self._fp.write(struct.pack('>BBL', token | 0xF, 0x12, size)) + + else: + self._fp.write(struct.pack('>BBQ', token | 0xF, 0x13, size)) + + def _write_object(self, value): + ref = self._getrefnum(value) + self._object_offsets[ref] = self._fp.tell() + if value is None: + self._fp.write(b'\x00') + + elif value is False: + self._fp.write(b'\x08') + + elif value is True: + self._fp.write(b'\x09') + + elif isinstance(value, int): + if value < 1 << 8: + self._fp.write(struct.pack('>BB', 0x10, value)) + elif value < 1 << 16: + self._fp.write(struct.pack('>BH', 0x11, value)) + elif value < 1 << 32: + self._fp.write(struct.pack('>BL', 0x12, value)) + else: + self._fp.write(struct.pack('>BQ', 0x13, value)) + + elif isinstance(value, float): + self._fp.write(struct.pack('>Bd', 0x23, value)) + + elif isinstance(value, datetime.datetime): + f = (value - datetime.datetime(2001, 1, 1)).total_seconds() + self._fp.write(struct.pack('>Bd', 0x33, f)) + + elif isinstance(value, 
Data): + self._write_size(0x40, len(value.data)) + self._fp.write(value.data) + + elif isinstance(value, (bytes, bytearray)): + self._write_size(0x40, len(value)) + self._fp.write(value) + + elif isinstance(value, str): + try: + t = value.encode('ascii') + self._write_size(0x50, len(value)) + except UnicodeEncodeError: + t = value.encode('utf-16be') + self._write_size(0x60, len(value)) + + self._fp.write(t) + + elif isinstance(value, (list, tuple)): + refs = [self._getrefnum(o) for o in value] + s = len(refs) + self._write_size(0xA0, s) + self._fp.write(struct.pack('>' + self._ref_format * s, *refs)) + + elif isinstance(value, dict): + keyRefs, valRefs = [], [] + + if self._sort_keys: + rootItems = sorted(value.items()) + else: + rootItems = value.items() + + for k, v in rootItems: + if not isinstance(k, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + keyRefs.append(self._getrefnum(k)) + valRefs.append(self._getrefnum(v)) + + s = len(keyRefs) + self._write_size(0xD0, s) + self._fp.write(struct.pack('>' + self._ref_format * s, *keyRefs)) + self._fp.write(struct.pack('>' + self._ref_format * s, *valRefs)) + + else: + raise InvalidFileException() + + +def _is_fmt_binary(header): + return header[:8] == b'bplist00' + + +# +# Generic bits +# + +_FORMATS={ + FMT_XML: dict( + detect=_is_fmt_xml, + parser=_PlistParser, + writer=_PlistWriter, + ), + FMT_BINARY: dict( + detect=_is_fmt_binary, + parser=_BinaryPlistParser, + writer=_BinaryPlistWriter, + ) +} + + +def load(fp, *, fmt=None, use_builtin_types=True, dict_type=dict): + """Read a .plist file. 'fp' should be (readable) file object. + Return the unpacked root object (which usually is a dictionary). + """ + if fmt is None: + header = fp.read(32) + fp.seek(0) + for info in _FORMATS.values(): + if info['detect'](header): + p = info['parser']( + use_builtin_types=use_builtin_types, + dict_type=dict_type, + ) + break + + else: + raise InvalidFileException() + + else: + p = _FORMATS[fmt]['parser'](use_builtin_types=use_builtin_types) + + return p.parse(fp) + + +def loads(value, *, fmt=None, use_builtin_types=True, dict_type=dict): + """Read a .plist file from a bytes object. + Return the unpacked root object (which usually is a dictionary). + """ + fp = BytesIO(value) + return load( + fp, fmt=fmt, use_builtin_types=use_builtin_types, dict_type=dict_type) + + +def dump(value, fp, *, fmt=FMT_XML, sort_keys=True, skipkeys=False): + """Write 'value' to a .plist file. 'fp' should be a (writable) + file object. + """ + if fmt not in _FORMATS: + raise ValueError("Unsupported format: %r"%(fmt,)) + + writer = _FORMATS[fmt]["writer"](fp, sort_keys=sort_keys, skipkeys=skipkeys) + writer.write(value) + + +def dumps(value, *, fmt=FMT_XML, skipkeys=False, sort_keys=True): + """Return a bytes object with the contents for a .plist file. 
+ """ + fp = BytesIO() + dump(value, fp, fmt=fmt, skipkeys=skipkeys, sort_keys=sort_keys) + return fp.getvalue() diff --git a/Lib/test/test_plistlib.py b/Lib/test/test_plistlib.py --- a/Lib/test/test_plistlib.py +++ b/Lib/test/test_plistlib.py @@ -1,94 +1,87 @@ -# Copyright (C) 2003 Python Software Foundation +# Copyright (C) 2003-2013 Python Software Foundation import unittest import plistlib import os import datetime +import codecs +import binascii +import collections from test import support +from io import BytesIO +ALL_FORMATS=(plistlib.FMT_XML, plistlib.FMT_BINARY) -# This test data was generated through Cocoa's NSDictionary class -TESTDATA = b""" - - - - aDate - 2004-10-26T10:33:33Z - aDict - - aFalseValue - - aTrueValue - - aUnicodeValue - M\xc3\xa4ssig, Ma\xc3\x9f - anotherString - <hello & 'hi' there!> - deeperDict - - a - 17 - b - 32.5 - c - - 1 - 2 - text - - - - aFloat - 0.5 - aList - - A - B - 12 - 32.5 - - 1 - 2 - 3 - - - aString - Doodah - anEmptyDict - - anEmptyList - - anInt - 728 - nestedData - - - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5r - PgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5 - IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBi - aW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3Rz - IG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQID - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAw== - - - someData - - PGJpbmFyeSBndW5rPg== - - someMoreData - - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8 - bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxs - b3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxv - dHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90 - cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAw== - - \xc3\x85benraa - That was a unicode key. 
- - -""".replace(b" " * 8, b"\t") # Apple as well as plistlib.py output hard tabs +# The testdata is generated using Mac/Tools/plistlib_generate_testdata.py +# (which using PyObjC to control the Cocoa classes for generating plists) +TESTDATA={ + plistlib.FMT_XML: binascii.a2b_base64(b''' + PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPCFET0NU + WVBFIHBsaXN0IFBVQkxJQyAiLS8vQXBwbGUvL0RURCBQTElTVCAxLjAvL0VO + IiAiaHR0cDovL3d3dy5hcHBsZS5jb20vRFREcy9Qcm9wZXJ0eUxpc3QtMS4w + LmR0ZCI+CjxwbGlzdCB2ZXJzaW9uPSIxLjAiPgo8ZGljdD4KCTxrZXk+YURh + dGU8L2tleT4KCTxkYXRlPjIwMDQtMTAtMjZUMTA6MzM6MzNaPC9kYXRlPgoJ + PGtleT5hRGljdDwva2V5PgoJPGRpY3Q+CgkJPGtleT5hRmFsc2VWYWx1ZTwv + a2V5PgoJCTxmYWxzZS8+CgkJPGtleT5hVHJ1ZVZhbHVlPC9rZXk+CgkJPHRy + dWUvPgoJCTxrZXk+YVVuaWNvZGVWYWx1ZTwva2V5PgoJCTxzdHJpbmc+TcOk + c3NpZywgTWHDnzwvc3RyaW5nPgoJCTxrZXk+YW5vdGhlclN0cmluZzwva2V5 + PgoJCTxzdHJpbmc+Jmx0O2hlbGxvICZhbXA7ICdoaScgdGhlcmUhJmd0Ozwv + c3RyaW5nPgoJCTxrZXk+ZGVlcGVyRGljdDwva2V5PgoJCTxkaWN0PgoJCQk8 + a2V5PmE8L2tleT4KCQkJPGludGVnZXI+MTc8L2ludGVnZXI+CgkJCTxrZXk+ + Yjwva2V5PgoJCQk8cmVhbD4zMi41PC9yZWFsPgoJCQk8a2V5PmM8L2tleT4K + CQkJPGFycmF5PgoJCQkJPGludGVnZXI+MTwvaW50ZWdlcj4KCQkJCTxpbnRl + Z2VyPjI8L2ludGVnZXI+CgkJCQk8c3RyaW5nPnRleHQ8L3N0cmluZz4KCQkJ + PC9hcnJheT4KCQk8L2RpY3Q+Cgk8L2RpY3Q+Cgk8a2V5PmFGbG9hdDwva2V5 + PgoJPHJlYWw+MC41PC9yZWFsPgoJPGtleT5hTGlzdDwva2V5PgoJPGFycmF5 + PgoJCTxzdHJpbmc+QTwvc3RyaW5nPgoJCTxzdHJpbmc+Qjwvc3RyaW5nPgoJ + CTxpbnRlZ2VyPjEyPC9pbnRlZ2VyPgoJCTxyZWFsPjMyLjU8L3JlYWw+CgkJ + PGFycmF5PgoJCQk8aW50ZWdlcj4xPC9pbnRlZ2VyPgoJCQk8aW50ZWdlcj4y + PC9pbnRlZ2VyPgoJCQk8aW50ZWdlcj4zPC9pbnRlZ2VyPgoJCTwvYXJyYXk+ + Cgk8L2FycmF5PgoJPGtleT5hU3RyaW5nPC9rZXk+Cgk8c3RyaW5nPkRvb2Rh + aDwvc3RyaW5nPgoJPGtleT5hbkVtcHR5RGljdDwva2V5PgoJPGRpY3QvPgoJ + PGtleT5hbkVtcHR5TGlzdDwva2V5PgoJPGFycmF5Lz4KCTxrZXk+YW5JbnQ8 + L2tleT4KCTxpbnRlZ2VyPjcyODwvaW50ZWdlcj4KCTxrZXk+bmVzdGVkRGF0 + YTwva2V5PgoJPGFycmF5PgoJCTxkYXRhPgoJCVBHeHZkSE1nYjJZZ1ltbHVZ + WEo1SUdkMWJtcytBQUVDQXp4c2IzUnpJRzltSUdKcGJtRnllU0JuZFc1cgoJ + CVBnQUJBZ004Ykc5MGN5QnZaaUJpYVc1aGNua2daM1Z1YXo0QUFRSURQR3h2 + ZEhNZ2IyWWdZbWx1WVhKNQoJCUlHZDFibXMrQUFFQ0F6eHNiM1J6SUc5bUlH + SnBibUZ5ZVNCbmRXNXJQZ0FCQWdNOGJHOTBjeUJ2WmlCaQoJCWFXNWhjbmtn + WjNWdWF6NEFBUUlEUEd4dmRITWdiMllnWW1sdVlYSjVJR2QxYm1zK0FBRUNB + enhzYjNSegoJCUlHOW1JR0pwYm1GeWVTQm5kVzVyUGdBQkFnTThiRzkwY3lC + dlppQmlhVzVoY25rZ1ozVnVhejRBQVFJRAoJCVBHeHZkSE1nYjJZZ1ltbHVZ + WEo1SUdkMWJtcytBQUVDQXc9PQoJCTwvZGF0YT4KCTwvYXJyYXk+Cgk8a2V5 + PnNvbWVEYXRhPC9rZXk+Cgk8ZGF0YT4KCVBHSnBibUZ5ZVNCbmRXNXJQZz09 + Cgk8L2RhdGE+Cgk8a2V5PnNvbWVNb3JlRGF0YTwva2V5PgoJPGRhdGE+CglQ + R3h2ZEhNZ2IyWWdZbWx1WVhKNUlHZDFibXMrQUFFQ0F6eHNiM1J6SUc5bUlH + SnBibUZ5ZVNCbmRXNXJQZ0FCQWdNOAoJYkc5MGN5QnZaaUJpYVc1aGNua2da + M1Z1YXo0QUFRSURQR3h2ZEhNZ2IyWWdZbWx1WVhKNUlHZDFibXMrQUFFQ0F6 + eHMKCWIzUnpJRzltSUdKcGJtRnllU0JuZFc1clBnQUJBZ004Ykc5MGN5QnZa + aUJpYVc1aGNua2daM1Z1YXo0QUFRSURQR3h2CglkSE1nYjJZZ1ltbHVZWEo1 + SUdkMWJtcytBQUVDQXp4c2IzUnpJRzltSUdKcGJtRnllU0JuZFc1clBnQUJB + Z004Ykc5MAoJY3lCdlppQmlhVzVoY25rZ1ozVnVhejRBQVFJRFBHeHZkSE1n + YjJZZ1ltbHVZWEo1SUdkMWJtcytBQUVDQXc9PQoJPC9kYXRhPgoJPGtleT7D + hWJlbnJhYTwva2V5PgoJPHN0cmluZz5UaGF0IHdhcyBhIHVuaWNvZGUga2V5 + Ljwvc3RyaW5nPgo8L2RpY3Q+CjwvcGxpc3Q+Cg=='''), + plistlib.FMT_BINARY: binascii.a2b_base64(b''' + YnBsaXN0MDDcAQIDBAUGBwgJCgsMDQ4iIykqKywtLy4wVWFEYXRlVWFEaWN0 + VmFGbG9hdFVhTGlzdFdhU3RyaW5nW2FuRW1wdHlEaWN0W2FuRW1wdHlMaXN0 + VWFuSW50Wm5lc3RlZERhdGFYc29tZURhdGFcc29tZU1vcmVEYXRhZwDFAGIA + ZQBuAHIAYQBhM0GcuX30AAAA1Q8QERITFBUWFxhbYUZhbHNlVmFsdWVaYVRy + dWVWYWx1ZV1hVW5pY29kZVZhbHVlXWFub3RoZXJTdHJpbmdaZGVlcGVyRGlj + 
dAgJawBNAOQAcwBzAGkAZwAsACAATQBhAN9fEBU8aGVsbG8gJiAnaGknIHRo + ZXJlIT7TGRobHB0eUWFRYlFjEBEjQEBAAAAAAACjHyAhEAEQAlR0ZXh0Iz/g + AAAAAAAApSQlJh0nUUFRQhAMox8gKBADVkRvb2RhaNCgEQLYoS5PEPo8bG90 + cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAEC + Azxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vu + az4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFy + eSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2Yg + YmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90 + cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDTTxiaW5hcnkgZ3Vuaz5fEBdUaGF0IHdh + cyBhIHVuaWNvZGUga2V5LgAIACEAJwAtADQAOgBCAE4AWgBgAGsAdACBAJAA + mQCkALAAuwDJANcA4gDjAOQA+wETARoBHAEeASABIgErAS8BMQEzATgBQQFH + AUkBSwFNAVEBUwFaAVsBXAFfAWECXgJsAAAAAAAAAgEAAAAAAAAAMQAAAAAA + AAAAAAAAAAAAAoY='''), +} class TestPlistlib(unittest.TestCase): @@ -99,7 +92,7 @@ except: pass - def _create(self): + def _create(self, fmt=None): pl = dict( aString="Doodah", aList=["A", "B", 12, 32.5, [1, 2, 3]], @@ -112,9 +105,9 @@ aFalseValue=False, deeperDict=dict(a=17, b=32.5, c=[1, 2, "text"]), ), - someData = plistlib.Data(b""), - someMoreData = plistlib.Data(b"\0\1\2\3" * 10), - nestedData = [plistlib.Data(b"\0\1\2\3" * 10)], + someData = b"", + someMoreData = b"\0\1\2\3" * 10, + nestedData = [b"\0\1\2\3" * 10], aDate = datetime.datetime(2004, 10, 26, 10, 33, 33), anEmptyDict = dict(), anEmptyList = list() @@ -129,49 +122,191 @@ def test_io(self): pl = self._create() - plistlib.writePlist(pl, support.TESTFN) - pl2 = plistlib.readPlist(support.TESTFN) + with open(support.TESTFN, 'wb') as fp: + plistlib.dump(pl, fp) + + with open(support.TESTFN, 'rb') as fp: + pl2 = plistlib.load(fp) + self.assertEqual(dict(pl), dict(pl2)) + self.assertRaises(AttributeError, plistlib.dump, pl, 'filename') + self.assertRaises(AttributeError, plistlib.load, 'filename') + + def test_bytes(self): pl = self._create() - data = plistlib.writePlistToBytes(pl) - pl2 = plistlib.readPlistFromBytes(data) + data = plistlib.dumps(pl) + pl2 = plistlib.loads(data) + self.assertNotIsInstance(pl, plistlib._InternalDict) self.assertEqual(dict(pl), dict(pl2)) - data2 = plistlib.writePlistToBytes(pl2) + data2 = plistlib.dumps(pl2) self.assertEqual(data, data2) def test_indentation_array(self): - data = [[[[[[[[{'test': plistlib.Data(b'aaaaaa')}]]]]]]]] - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = [[[[[[[[{'test': b'aaaaaa'}]]]]]]]] + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_indentation_dict(self): - data = {'1': {'2': {'3': {'4': {'5': {'6': {'7': {'8': {'9': plistlib.Data(b'aaaaaa')}}}}}}}}} - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = {'1': {'2': {'3': {'4': {'5': {'6': {'7': {'8': {'9': b'aaaaaa'}}}}}}}}} + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_indentation_dict_mix(self): - data = {'1': {'2': [{'3': [[[[[{'test': plistlib.Data(b'aaaaaa')}]]]]]}]}} - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = {'1': {'2': [{'3': [[[[[{'test': b'aaaaaa'}]]]]]}]}} + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_appleformatting(self): - pl = plistlib.readPlistFromBytes(TESTDATA) - data = plistlib.writePlistToBytes(pl) - self.assertEqual(data, TESTDATA, - "generated data was not identical to Apple's output") + for use_builtin_types in (True, False): + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt, use_builtin_types=use_builtin_types): + pl = 
plistlib.loads(TESTDATA[fmt], + use_builtin_types=use_builtin_types) + data = plistlib.dumps(pl, fmt=fmt) + self.assertEqual(data, TESTDATA[fmt], + "generated data was not identical to Apple's output") + def test_appleformattingfromliteral(self): - pl = self._create() - pl2 = plistlib.readPlistFromBytes(TESTDATA) - self.assertEqual(dict(pl), dict(pl2), - "generated data was not identical to Apple's output") + self.maxDiff = None + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + pl = self._create(fmt=fmt) + pl2 = plistlib.loads(TESTDATA[fmt]) + self.assertEqual(dict(pl), dict(pl2), + "generated data was not identical to Apple's output") def test_bytesio(self): - from io import BytesIO - b = BytesIO() - pl = self._create() - plistlib.writePlist(pl, b) - pl2 = plistlib.readPlist(BytesIO(b.getvalue())) - self.assertEqual(dict(pl), dict(pl2)) + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + b = BytesIO() + pl = self._create(fmt=fmt) + plistlib.dump(pl, b, fmt=fmt) + pl2 = plistlib.load(BytesIO(b.getvalue())) + self.assertEqual(dict(pl), dict(pl2)) + + def test_keysort_bytesio(self): + pl = collections.OrderedDict() + pl['b'] = 1 + pl['a'] = 2 + pl['c'] = 3 + + for fmt in ALL_FORMATS: + for sort_keys in (False, True): + with self.subTest(fmt=fmt, sort_keys=sort_keys): + b = BytesIO() + + plistlib.dump(pl, b, fmt=fmt, sort_keys=sort_keys) + pl2 = plistlib.load(BytesIO(b.getvalue()), + dict_type=collections.OrderedDict) + + self.assertEqual(dict(pl), dict(pl2)) + if sort_keys: + self.assertEqual(list(pl2.keys()), ['a', 'b', 'c']) + else: + self.assertEqual(list(pl2.keys()), ['b', 'a', 'c']) + + def test_keysort(self): + pl = collections.OrderedDict() + pl['b'] = 1 + pl['a'] = 2 + pl['c'] = 3 + + for fmt in ALL_FORMATS: + for sort_keys in (False, True): + with self.subTest(fmt=fmt, sort_keys=sort_keys): + data = plistlib.dumps(pl, fmt=fmt, sort_keys=sort_keys) + pl2 = plistlib.loads(data, dict_type=collections.OrderedDict) + + self.assertEqual(dict(pl), dict(pl2)) + if sort_keys: + self.assertEqual(list(pl2.keys()), ['a', 'b', 'c']) + else: + self.assertEqual(list(pl2.keys()), ['b', 'a', 'c']) + + def test_keys_no_string(self): + pl = { 42: 'aNumber' } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + self.assertRaises(TypeError, plistlib.dumps, pl, fmt=fmt) + + b = BytesIO() + self.assertRaises(TypeError, plistlib.dump, pl, b, fmt=fmt) + + def test_skipkeys(self): + pl = { + 42: 'aNumber', + 'snake': 'aWord', + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps( + pl, fmt=fmt, skipkeys=True, sort_keys=False) + + pl2 = plistlib.loads(data) + self.assertEqual(pl2, {'snake': 'aWord'}) + + fp = BytesIO() + plistlib.dump( + pl, fp, fmt=fmt, skipkeys=True, sort_keys=False) + data = fp.getvalue() + pl2 = plistlib.loads(fp.getvalue()) + self.assertEqual(pl2, {'snake': 'aWord'}) + + def test_tuple_members(self): + pl = { + 'first': (1, 2), + 'second': (1, 2), + 'third': (3, 4), + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + }) + self.assertIsNot(pl2['first'], pl2['second']) + + def test_list_members(self): + pl = { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + 
}) + self.assertIsNot(pl2['first'], pl2['second']) + + def test_dict_members(self): + pl = { + 'first': {'a': 1}, + 'second': {'a': 1}, + 'third': {'b': 2 }, + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': {'a': 1}, + 'second': {'a': 1}, + 'third': {'b': 2 }, + }) + self.assertIsNot(pl2['first'], pl2['second']) def test_controlcharacters(self): for i in range(128): @@ -179,25 +314,27 @@ testString = "string containing %s" % c if i >= 32 or c in "\r\n\t": # \r, \n and \t are the only legal control chars in XML - plistlib.writePlistToBytes(testString) + plistlib.dumps(testString, fmt=plistlib.FMT_XML) else: self.assertRaises(ValueError, - plistlib.writePlistToBytes, + plistlib.dumps, testString) def test_nondictroot(self): - test1 = "abc" - test2 = [1, 2, 3, "abc"] - result1 = plistlib.readPlistFromBytes(plistlib.writePlistToBytes(test1)) - result2 = plistlib.readPlistFromBytes(plistlib.writePlistToBytes(test2)) - self.assertEqual(test1, result1) - self.assertEqual(test2, result2) + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + test1 = "abc" + test2 = [1, 2, 3, "abc"] + result1 = plistlib.loads(plistlib.dumps(test1, fmt=fmt)) + result2 = plistlib.loads(plistlib.dumps(test2, fmt=fmt)) + self.assertEqual(test1, result1) + self.assertEqual(test2, result2) def test_invalidarray(self): for i in ["key inside an array", "key inside an array23", "key inside an array3"]: - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) def test_invaliddict(self): @@ -206,22 +343,130 @@ "missing key", "k1v15.3" "k1k2double key"]: - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) def test_invalidinteger(self): - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, b"not integer") def test_invalidreal(self): - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, b"not real") + def test_xml_encodings(self): + base = TESTDATA[plistlib.FMT_XML] + + for xml_encoding, encoding, bom in [ + (b'utf-8', 'utf-8', codecs.BOM_UTF8), + (b'utf-16', 'utf-16-le', codecs.BOM_UTF16_LE), + (b'utf-16', 'utf-16-be', codecs.BOM_UTF16_BE), + # Expat does not support UTF-32 + #(b'utf-32', 'utf-32-le', codecs.BOM_UTF32_LE), + #(b'utf-32', 'utf-32-be', codecs.BOM_UTF32_BE), + ]: + + pl = self._create(fmt=plistlib.FMT_XML) + with self.subTest(encoding=encoding): + data = base.replace(b'UTF-8', xml_encoding) + data = bom + data.decode('utf-8').encode(encoding) + pl2 = plistlib.loads(data) + self.assertEqual(dict(pl), dict(pl2)) + + +class TestPlistlibDeprecated(unittest.TestCase): + def test_io_deprecated(self): + pl_in = { + 'key': 42, + 'sub': { + 'key': 9, + 'alt': 'value', + 'data': b'buffer', + } + } + pl_out = plistlib._InternalDict({ + 'key': 42, + 'sub': plistlib._InternalDict({ + 'key': 9, + 'alt': 'value', + 'data': plistlib.Data(b'buffer'), + }) + }) + + self.addCleanup(support.unlink, support.TESTFN) + with self.assertWarns(DeprecationWarning): + plistlib.writePlist(pl_in, support.TESTFN) + + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlist(support.TESTFN) + + self.assertEqual(pl_out, pl2) + + 
os.unlink(support.TESTFN) + + with open(support.TESTFN, 'wb') as fp: + with self.assertWarns(DeprecationWarning): + plistlib.writePlist(pl_in, fp) + + with open(support.TESTFN, 'rb') as fp: + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlist(fp) + + self.assertEqual(pl_out, pl2) + + def test_bytes_deprecated(self): + pl = { + 'key': 42, + 'sub': { + 'key': 9, + 'alt': 'value', + 'data': b'buffer', + } + } + with self.assertWarns(DeprecationWarning): + data = plistlib.writePlistToBytes(pl) + + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlistFromBytes(data) + + self.assertIsInstance(pl2, plistlib._InternalDict) + self.assertEqual(pl2, plistlib._InternalDict( + key=42, + sub=plistlib._InternalDict( + key=9, + alt='value', + data=plistlib.Data(b'buffer'), + ) + )) + + with self.assertWarns(DeprecationWarning): + data2 = plistlib.writePlistToBytes(pl2) + self.assertEqual(data, data2) + + def test_dataobject_deprecated(self): + in_data = { 'key': plistlib.Data(b'hello') } + out_data = { 'key': b'hello' } + + buf = plistlib.dumps(in_data) + + cur = plistlib.loads(buf) + self.assertEqual(cur, out_data) + self.assertNotEqual(cur, in_data) + + cur = plistlib.loads(buf, use_builtin_types=False) + self.assertNotEqual(cur, out_data) + self.assertEqual(cur, in_data) + + with self.assertWarns(DeprecationWarning): + cur = plistlib.readPlistFromBytes(buf) + self.assertNotEqual(cur, out_data) + self.assertEqual(cur, in_data) + def test_main(): - support.run_unittest(TestPlistlib) + support.run_unittest(TestPlistlib, TestPlistlibDeprecated) if __name__ == '__main__': diff --git a/Mac/Tools/plistlib_generate_testdata.py b/Mac/Tools/plistlib_generate_testdata.py new file mode 100644 --- /dev/null +++ b/Mac/Tools/plistlib_generate_testdata.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python3 + +from Cocoa import NSMutableDictionary, NSMutableArray, NSString, NSDate +from Cocoa import NSPropertyListSerialization, NSPropertyListOpenStepFormat +from Cocoa import NSPropertyListXMLFormat_v1_0, NSPropertyListBinaryFormat_v1_0 +from Cocoa import CFUUIDCreateFromString, NSNull, NSUUID, CFPropertyListCreateData +from Cocoa import NSURL + +import datetime +from collections import OrderedDict +import binascii + +FORMATS=[ +# ('openstep', NSPropertyListOpenStepFormat), + ('plistlib.FMT_XML', NSPropertyListXMLFormat_v1_0), + ('plistlib.FMT_BINARY', NSPropertyListBinaryFormat_v1_0), +] + +def nsstr(value): + return NSString.alloc().initWithString_(value) + + +def main(): + pl = OrderedDict() + + seconds = datetime.datetime(2004, 10, 26, 10, 33, 33, tzinfo=datetime.timezone(datetime.timedelta(0))).timestamp() + pl[nsstr('aDate')] = NSDate.dateWithTimeIntervalSince1970_(seconds) + + pl[nsstr('aDict')] = d = OrderedDict() + d[nsstr('aFalseValue')] = False + d[nsstr('aTrueValue')] = True + d[nsstr('aUnicodeValue')] = "M\xe4ssig, Ma\xdf" + d[nsstr('anotherString')] = "" + d[nsstr('deeperDict')] = dd = OrderedDict() + dd[nsstr('a')] = 17 + dd[nsstr('b')] = 32.5 + dd[nsstr('c')] = a = NSMutableArray.alloc().init() + a.append(1) + a.append(2) + a.append(nsstr('text')) + + pl[nsstr('aFloat')] = 0.5 + + pl[nsstr('aList')] = a = NSMutableArray.alloc().init() + a.append(nsstr('A')) + a.append(nsstr('B')) + a.append(12) + a.append(32.5) + aa = NSMutableArray.alloc().init() + a.append(aa) + aa.append(1) + aa.append(2) + aa.append(3) + + pl[nsstr('aString')] = nsstr('Doodah') + + pl[nsstr('anEmptyDict')] = NSMutableDictionary.alloc().init() + + pl[nsstr('anEmptyList')] = NSMutableArray.alloc().init() + + 
pl[nsstr('anInt')] = 728 + + pl[nsstr('nestedData')] = a = NSMutableArray.alloc().init() + a.append(b'''\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03''') + + + pl[nsstr('someData')] = b'' + + pl[nsstr('someMoreData')] = b'''\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03''' + + pl[nsstr('\xc5benraa')] = nsstr("That was a unicode key.") + + print("TESTDATA={") + for fmt_name, fmt_key in FORMATS: + data, error = NSPropertyListSerialization.dataWithPropertyList_format_options_error_( + pl, fmt_key, 0, None) + if data is None: + print("Cannot serialize", fmt_name, error) + + else: + print(" %s: binascii.a2b_base64(b'''\n %s'''),"%(fmt_name, _encode_base64(bytes(data)).decode('ascii')[:-1])) + + print("}") + print() + +def _encode_base64(s, maxlinelength=60): + maxbinsize = (maxlinelength//4)*3 + pieces = [] + for i in range(0, len(s), maxbinsize): + chunk = s[i : i + maxbinsize] + pieces.append(binascii.b2a_base64(chunk)) + return b' '.join(pieces) + +main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,8 @@ Library ------- +- Issue #14455: plistlib now supports binary plists and has an updated API. + - Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 16:27:43 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 16:27:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319682=3A_Fix_comp?= =?utf-8?q?atibility_issue_with_old_version_of_OpenSSL_that?= Message-ID: <3dQPlz53w5z7LjV@mail.python.org> http://hg.python.org/cpython/rev/40bfddda43d4 changeset: 87311:40bfddda43d4 parent: 87307:eec4758e3a45 user: Christian Heimes date: Thu Nov 21 16:26:51 2013 +0100 summary: Issue #19682: Fix compatibility issue with old version of OpenSSL that was introduced by Issue #18379. files: Misc/NEWS | 3 +++ Modules/_ssl.c | 6 ++++++ 2 files changed, 9 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #19682: Fix compatibility issue with old version of OpenSSL that + was introduced by Issue #18379. + - Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. 
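[Editor's note: a quick way to see which side of the OpenSSL version guard in
the patch below applies to a given interpreter is to inspect the constants the
standard ssl module already exposes; a minimal sketch:

    import ssl

    print(ssl.OPENSSL_VERSION)
    if ssl.OPENSSL_VERSION_NUMBER < 0x10001000:
        # pre-1.0.1 OpenSSL: the new X509_get_ext_d2i() fallback path is taken
        print("old OpenSSL, fallback path")
    else:
        # 1.0.1 or newer: crldp is read from the cached certificate extensions
        print("OpenSSL 1.0.1+, cached-extension path")
]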
diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1024,9 +1024,15 @@ int i, j, result; PyObject *lst; +#if OPENSSL_VERSION_NUMBER < 0x10001000L + dps = X509_get_ext_d2i(certificate, NID_crl_distribution_points, + NULL, NULL); +#else /* Calls x509v3_cache_extensions and sets up crldp */ X509_check_ca(certificate); dps = certificate->crldp; +#endif + if (dps == NULL) { return Py_None; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 16:27:45 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 16:27:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dQPm16jdPz7Lkt@mail.python.org> http://hg.python.org/cpython/rev/0c7202b64b29 changeset: 87312:0c7202b64b29 parent: 87311:40bfddda43d4 parent: 87310:673ca119dbd0 user: Christian Heimes date: Thu Nov 21 16:27:33 2013 +0100 summary: merge files: Doc/library/plistlib.rst | 181 +- Lib/plistlib.py | 1098 ++++++++-- Lib/test/test_plistlib.py | 497 +++- Mac/Tools/plistlib_generate_testdata.py | 94 + Misc/NEWS | 2 + Objects/bytearrayobject.c | 100 +- Objects/listobject.c | 5 +- 7 files changed, 1512 insertions(+), 465 deletions(-) diff --git a/Doc/library/plistlib.rst b/Doc/library/plistlib.rst --- a/Doc/library/plistlib.rst +++ b/Doc/library/plistlib.rst @@ -16,26 +16,21 @@ -------------- This module provides an interface for reading and writing the "property list" -XML files used mainly by Mac OS X. +files used mainly by Mac OS X and supports both binary and XML plist files. -The property list (``.plist``) file format is a simple XML pickle supporting +The property list (``.plist``) file format is a simple serialization supporting basic object types, like dictionaries, lists, numbers and strings. Usually the top level object is a dictionary. -To write out and to parse a plist file, use the :func:`writePlist` and -:func:`readPlist` functions. +To write out and to parse a plist file, use the :func:`dump` and +:func:`load` functions. -To work with plist data in bytes objects, use :func:`writePlistToBytes` -and :func:`readPlistFromBytes`. +To work with plist data in bytes objects, use :func:`dumps` +and :func:`loads`. Values can be strings, integers, floats, booleans, tuples, lists, dictionaries -(but only with string keys), :class:`Data` or :class:`datetime.datetime` -objects. String values (including dictionary keys) have to be unicode strings -- -they will be written out as UTF-8. - -The ```` plist type is supported through the :class:`Data` class. This is -a thin wrapper around a Python bytes object. Use :class:`Data` if your strings -contain control characters. +(but only with string keys), :class:`Data`, :class:`bytes`, :class:`bytesarray` +or :class:`datetime.datetime` objects. .. seealso:: @@ -45,37 +40,145 @@ This module defines the following functions: +.. function:: load(fp, \*, fmt=None, use_builtin_types=True, dict_type=dict) + + Read a plist file. *fp* should be a readable and binary file object. + Return the unpacked root object (which usually is a + dictionary). + + The *fmt* is the format of the file and the following values are valid: + + * :data:`None`: Autodetect the file format + + * :data:`FMT_XML`: XML file format + + * :data:`FMT_BINARY`: Binary plist format + + If *use_builtin_types* is True (the default) binary data will be returned + as instances of :class:`bytes`, otherwise it is returned as instances of + :class:`Data`. 
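[Editor's note: a short sketch of the *use_builtin_types* behaviour described
above, using the functions added by this patch:

    import plistlib

    buf = plistlib.dumps({'blob': b'\x00\x01'})
    plistlib.loads(buf)['blob']                            # -> b'\x00\x01'
    plistlib.loads(buf, use_builtin_types=False)['blob']   # -> plistlib.Data wrapper
]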
+ + The *dict_type* is the type used for dictionaries that are read from the + plist file. The exact structure of the plist can be recovered by using + :class:`collections.OrderedDict` (although the order of keys shouldn't be + important in plist files). + + XML data for the :data:`FMT_XML` format is parsed using the Expat parser + from :mod:`xml.parsers.expat` -- see its documentation for possible + exceptions on ill-formed XML. Unknown elements will simply be ignored + by the plist parser. + + The parser for the binary format raises :exc:`InvalidFileException` + when the file cannot be parsed. + + .. versionadded:: 3.4 + + +.. function:: loads(data, \*, fmt=None, use_builtin_types=True, dict_type=dict) + + Load a plist from a bytes object. See :func:`load` for an explanation of + the keyword arguments. + + +.. function:: dump(value, fp, \*, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + Write *value* to a plist file. *Fp* should be a writable, binary + file object. + + The *fmt* argument specifies the format of the plist file and can be + one of the following values: + + * :data:`FMT_XML`: XML formatted plist file + + * :data:`FMT_BINARY`: Binary formatted plist file + + When *sort_keys* is true (the default) the keys for dictionaries will be + written to the plist in sorted order, otherwise they will be written in + the iteration order of the dictionary. + + When *skipkeys* is false (the default) the function raises :exc:`TypeError` + when a key of a dictionary is not a string, otherwise such keys are skipped. + + A :exc:`TypeError` will be raised if the object is of an unsupported type or + a container that contains objects of unsupported types. + + .. versionchanged:: 3.4 + Added the *fmt*, *sort_keys* and *skipkeys* arguments. + + +.. function:: dumps(value, \*, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + Return *value* as a plist-formatted bytes object. See + the documentation for :func:`dump` for an explanation of the keyword + arguments of this function. + + +The following functions are deprecated: + .. function:: readPlist(pathOrFile) - Read a plist file. *pathOrFile* may either be a file name or a (readable and - binary) file object. Return the unpacked root object (which usually is a - dictionary). + Read a plist file. *pathOrFile* may be either a file name or a (readable + and binary) file object. Returns the unpacked root object (which usually + is a dictionary). - The XML data is parsed using the Expat parser from :mod:`xml.parsers.expat` - -- see its documentation for possible exceptions on ill-formed XML. - Unknown elements will simply be ignored by the plist parser. + This function calls :func:`load` to do the actual work, the the documentation + of :func:`that function ` for an explanation of the keyword arguments. + + .. note:: + + Dict values in the result have a ``__getattr__`` method that defers + to ``__getitem_``. This means that you can use attribute access to + access items of these dictionaries. + + .. deprecated: 3.4 Use :func:`load` instead. .. function:: writePlist(rootObject, pathOrFile) - Write *rootObject* to a plist file. *pathOrFile* may either be a file name - or a (writable and binary) file object. + Write *rootObject* to an XML plist file. *pathOrFile* may be either a file name + or a (writable and binary) file object - A :exc:`TypeError` will be raised if the object is of an unsupported type or - a container that contains objects of unsupported types. + .. deprecated: 3.4 Use :func:`dump` instead. .. 
function:: readPlistFromBytes(data) Read a plist data from a bytes object. Return the root object. + See :func:`load` for a description of the keyword arguments. + + .. note:: + + Dict values in the result have a ``__getattr__`` method that defers + to ``__getitem_``. This means that you can use attribute access to + access items of these dictionaries. + + .. deprecated:: 3.4 Use :func:`loads` instead. + .. function:: writePlistToBytes(rootObject) - Return *rootObject* as a plist-formatted bytes object. + Return *rootObject* as an XML plist-formatted bytes object. + .. deprecated:: 3.4 Use :func:`dumps` instead. -The following class is available: + .. versionchanged:: 3.4 + Added the *fmt*, *sort_keys* and *skipkeys* arguments. + + +The following classes are available: + +.. class:: Dict([dict]): + + Return an extended mapping object with the same value as dictionary + *dict*. + + This class is a subclass of :class:`dict` where attribute access can + be used to access items. That is, ``aDict.key`` is the same as + ``aDict['key']`` for getting, setting and deleting items in the mapping. + + .. deprecated:: 3.0 + .. class:: Data(data) @@ -86,6 +189,24 @@ It has one attribute, :attr:`data`, that can be used to retrieve the Python bytes object stored in it. + .. deprecated:: 3.4 Use a :class:`bytes` object instead + + +The following constants are avaiable: + +.. data:: FMT_XML + + The XML format for plist files. + + .. versionadded:: 3.4 + + +.. data:: FMT_BINARY + + The binary format for plist files + + .. versionadded:: 3.4 + Examples -------- @@ -103,13 +224,15 @@ aTrueValue = True, aFalseValue = False, ), - someData = Data(b""), - someMoreData = Data(b"" * 10), + someData = b"", + someMoreData = b"" * 10, aDate = datetime.datetime.fromtimestamp(time.mktime(time.gmtime())), ) - writePlist(pl, fileName) + with open(fileName, 'wb') as fp: + dump(pl, fp) Parsing a plist:: - pl = readPlist(pathOrFile) + with open(fileName, 'rb') as fp: + pl = load(fp) print(pl["aKey"]) diff --git a/Lib/plistlib.py b/Lib/plistlib.py --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -4,25 +4,20 @@ basic object types, like dictionaries, lists, numbers and strings. Usually the top level object is a dictionary. -To write out a plist file, use the writePlist(rootObject, pathOrFile) -function. 'rootObject' is the top level object, 'pathOrFile' is a -filename or a (writable) file object. +To write out a plist file, use the dump(value, file) +function. 'value' is the top level object, 'file' is +a (writable) file object. -To parse a plist from a file, use the readPlist(pathOrFile) function, -with a file name or a (readable) file object as the only argument. It +To parse a plist from a file, use the load(file) function, +with a (readable) file object as the only argument. It returns the top level object (again, usually a dictionary). -To work with plist data in bytes objects, you can use readPlistFromBytes() -and writePlistToBytes(). +To work with plist data in bytes objects, you can use loads() +and dumps(). Values can be strings, integers, floats, booleans, tuples, lists, -dictionaries (but only with string keys), Data or datetime.datetime objects. -String values (including dictionary keys) have to be unicode strings -- they -will be written out as UTF-8. - -The plist type is supported through the Data class. This is a -thin wrapper around a Python bytes object. Use 'Data' if your strings -contain control characters. +dictionaries (but only with string keys), Data, bytes, bytearray, or +datetime.datetime objects. 
Generate Plist example: @@ -37,226 +32,48 @@ aTrueValue = True, aFalseValue = False, ), - someData = Data(b""), - someMoreData = Data(b"" * 10), + someData = b"", + someMoreData = b"" * 10, aDate = datetime.datetime.fromtimestamp(time.mktime(time.gmtime())), ) - writePlist(pl, fileName) + with open(fileName, 'wb') as fp: + dump(pl, fp) Parse Plist example: - pl = readPlist(pathOrFile) - print pl["aKey"] + with open(fileName, 'rb') as fp: + pl = load(fp) + print(pl["aKey"]) """ - - __all__ = [ "readPlist", "writePlist", "readPlistFromBytes", "writePlistToBytes", - "Plist", "Data", "Dict" + "Plist", "Data", "Dict", "FMT_XML", "FMT_BINARY", + "load", "dump", "loads", "dumps" ] -# Note: the Plist and Dict classes have been deprecated. import binascii +import codecs +import contextlib import datetime +import enum from io import BytesIO +import itertools +import os import re +import struct +from warnings import warn +from xml.parsers.expat import ParserCreate -def readPlist(pathOrFile): - """Read a .plist file. 'pathOrFile' may either be a file name or a - (readable) file object. Return the unpacked root object (which - usually is a dictionary). - """ - didOpen = False - try: - if isinstance(pathOrFile, str): - pathOrFile = open(pathOrFile, 'rb') - didOpen = True - p = PlistParser() - rootObject = p.parse(pathOrFile) - finally: - if didOpen: - pathOrFile.close() - return rootObject +PlistFormat = enum.Enum('PlistFormat', 'FMT_XML FMT_BINARY', module=__name__) +globals().update(PlistFormat.__members__) -def writePlist(rootObject, pathOrFile): - """Write 'rootObject' to a .plist file. 'pathOrFile' may either be a - file name or a (writable) file object. - """ - didOpen = False - try: - if isinstance(pathOrFile, str): - pathOrFile = open(pathOrFile, 'wb') - didOpen = True - writer = PlistWriter(pathOrFile) - writer.writeln("") - writer.writeValue(rootObject) - writer.writeln("") - finally: - if didOpen: - pathOrFile.close() - - -def readPlistFromBytes(data): - """Read a plist data from a bytes object. Return the root object. - """ - return readPlist(BytesIO(data)) - - -def writePlistToBytes(rootObject): - """Return 'rootObject' as a plist-formatted bytes object. - """ - f = BytesIO() - writePlist(rootObject, f) - return f.getvalue() - - -class DumbXMLWriter: - def __init__(self, file, indentLevel=0, indent="\t"): - self.file = file - self.stack = [] - self.indentLevel = indentLevel - self.indent = indent - - def beginElement(self, element): - self.stack.append(element) - self.writeln("<%s>" % element) - self.indentLevel += 1 - - def endElement(self, element): - assert self.indentLevel > 0 - assert self.stack.pop() == element - self.indentLevel -= 1 - self.writeln("" % element) - - def simpleElement(self, element, value=None): - if value is not None: - value = _escape(value) - self.writeln("<%s>%s" % (element, value, element)) - else: - self.writeln("<%s/>" % element) - - def writeln(self, line): - if line: - # plist has fixed encoding of utf-8 - if isinstance(line, str): - line = line.encode('utf-8') - self.file.write(self.indentLevel * self.indent) - self.file.write(line) - self.file.write(b'\n') - - -# Contents should conform to a subset of ISO 8601 -# (in particular, YYYY '-' MM '-' DD 'T' HH ':' MM ':' SS 'Z'. 
Smaller units may be omitted with -# a loss of precision) -_dateParser = re.compile(r"(?P\d\d\d\d)(?:-(?P\d\d)(?:-(?P\d\d)(?:T(?P\d\d)(?::(?P\d\d)(?::(?P\d\d))?)?)?)?)?Z", re.ASCII) - -def _dateFromString(s): - order = ('year', 'month', 'day', 'hour', 'minute', 'second') - gd = _dateParser.match(s).groupdict() - lst = [] - for key in order: - val = gd[key] - if val is None: - break - lst.append(int(val)) - return datetime.datetime(*lst) - -def _dateToString(d): - return '%04d-%02d-%02dT%02d:%02d:%02dZ' % ( - d.year, d.month, d.day, - d.hour, d.minute, d.second - ) - - -# Regex to find any control chars, except for \t \n and \r -_controlCharPat = re.compile( - r"[\x00\x01\x02\x03\x04\x05\x06\x07\x08\x0b\x0c\x0e\x0f" - r"\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f]") - -def _escape(text): - m = _controlCharPat.search(text) - if m is not None: - raise ValueError("strings can't contains control characters; " - "use plistlib.Data instead") - text = text.replace("\r\n", "\n") # convert DOS line endings - text = text.replace("\r", "\n") # convert Mac line endings - text = text.replace("&", "&") # escape '&' - text = text.replace("<", "<") # escape '<' - text = text.replace(">", ">") # escape '>' - return text - - -PLISTHEADER = b"""\ - - -""" - -class PlistWriter(DumbXMLWriter): - - def __init__(self, file, indentLevel=0, indent=b"\t", writeHeader=1): - if writeHeader: - file.write(PLISTHEADER) - DumbXMLWriter.__init__(self, file, indentLevel, indent) - - def writeValue(self, value): - if isinstance(value, str): - self.simpleElement("string", value) - elif isinstance(value, bool): - # must switch for bool before int, as bool is a - # subclass of int... - if value: - self.simpleElement("true") - else: - self.simpleElement("false") - elif isinstance(value, int): - self.simpleElement("integer", "%d" % value) - elif isinstance(value, float): - self.simpleElement("real", repr(value)) - elif isinstance(value, dict): - self.writeDict(value) - elif isinstance(value, Data): - self.writeData(value) - elif isinstance(value, datetime.datetime): - self.simpleElement("date", _dateToString(value)) - elif isinstance(value, (tuple, list)): - self.writeArray(value) - else: - raise TypeError("unsupported type: %s" % type(value)) - - def writeData(self, data): - self.beginElement("data") - self.indentLevel -= 1 - maxlinelength = max(16, 76 - len(self.indent.replace(b"\t", b" " * 8) * - self.indentLevel)) - for line in data.asBase64(maxlinelength).split(b"\n"): - if line: - self.writeln(line) - self.indentLevel += 1 - self.endElement("data") - - def writeDict(self, d): - if d: - self.beginElement("dict") - items = sorted(d.items()) - for key, value in items: - if not isinstance(key, str): - raise TypeError("keys must be strings") - self.simpleElement("key", key) - self.writeValue(value) - self.endElement("dict") - else: - self.simpleElement("dict") - - def writeArray(self, array): - if array: - self.beginElement("array") - for value in array: - self.writeValue(value) - self.endElement("array") - else: - self.simpleElement("array") +# +# +# Deprecated functionality +# +# class _InternalDict(dict): @@ -264,19 +81,18 @@ # This class is needed while Dict is scheduled for deprecation: # we only need to warn when a *user* instantiates Dict or when # the "attribute notation for dict keys" is used. 
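    # [Editor's note: a minimal sketch of the deprecation warning described
    # above, assuming this private _InternalDict class:
    #
    #     d = _InternalDict(key=1)
    #     d['key']    # plain dict access, no warning
    #     d.key       # still works, but __getattr__ emits a DeprecationWarning
    # ]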
+ __slots__ = () def __getattr__(self, attr): try: value = self[attr] except KeyError: raise AttributeError(attr) - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) return value def __setattr__(self, attr, value): - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) self[attr] = value @@ -286,56 +102,111 @@ del self[attr] except KeyError: raise AttributeError(attr) - from warnings import warn warn("Attribute access from plist dicts is deprecated, use d[key] " "notation instead", DeprecationWarning, 2) + class Dict(_InternalDict): def __init__(self, **kwargs): - from warnings import warn warn("The plistlib.Dict class is deprecated, use builtin dict instead", DeprecationWarning, 2) super().__init__(**kwargs) + at contextlib.contextmanager +def _maybe_open(pathOrFile, mode): + if isinstance(pathOrFile, str): + with open(pathOrFile, mode) as fp: + yield fp + + else: + yield pathOrFile + + class Plist(_InternalDict): - - """This class has been deprecated. Use readPlist() and writePlist() + """This class has been deprecated. Use dump() and load() functions instead, together with regular dict objects. """ def __init__(self, **kwargs): - from warnings import warn - warn("The Plist class is deprecated, use the readPlist() and " - "writePlist() functions instead", DeprecationWarning, 2) + warn("The Plist class is deprecated, use the load() and " + "dump() functions instead", DeprecationWarning, 2) super().__init__(**kwargs) + @classmethod def fromFile(cls, pathOrFile): - """Deprecated. Use the readPlist() function instead.""" - rootObject = readPlist(pathOrFile) + """Deprecated. Use the load() function instead.""" + with maybe_open(pathOrFile, 'rb') as fp: + value = load(fp) plist = cls() - plist.update(rootObject) + plist.update(value) return plist - fromFile = classmethod(fromFile) def write(self, pathOrFile): - """Deprecated. Use the writePlist() function instead.""" - writePlist(self, pathOrFile) + """Deprecated. Use the dump() function instead.""" + with _maybe_open(pathOrFile, 'wb') as fp: + dump(self, fp) -def _encodeBase64(s, maxlinelength=76): - # copied from base64.encodebytes(), with added maxlinelength argument - maxbinsize = (maxlinelength//4)*3 - pieces = [] - for i in range(0, len(s), maxbinsize): - chunk = s[i : i + maxbinsize] - pieces.append(binascii.b2a_base64(chunk)) - return b''.join(pieces) +def readPlist(pathOrFile): + """ + Read a .plist from a path or file. pathOrFile should either + be a file name, or a readable binary file object. + + This function is deprecated, use load instead. + """ + warn("The readPlist function is deprecated, use load() instead", + DeprecationWarning, 2) + + with _maybe_open(pathOrFile, 'rb') as fp: + return load(fp, fmt=None, use_builtin_types=False, + dict_type=_InternalDict) + +def writePlist(value, pathOrFile): + """ + Write 'value' to a .plist file. 'pathOrFile' may either be a + file name or a (writable) file object. + + This function is deprecated, use dump instead. + """ + warn("The writePlist function is deprecated, use dump() instead", + DeprecationWarning, 2) + with _maybe_open(pathOrFile, 'wb') as fp: + dump(value, fp, fmt=FMT_XML, sort_keys=True, skipkeys=False) + + +def readPlistFromBytes(data): + """ + Read a plist data from a bytes object. Return the root object. + + This function is deprecated, use loads instead. 
+ """ + warn("The readPlistFromBytes function is deprecated, use loads() instead", + DeprecationWarning, 2) + return load(BytesIO(data), fmt=None, use_builtin_types=False, + dict_type=_InternalDict) + + +def writePlistToBytes(value): + """ + Return 'value' as a plist-formatted bytes object. + + This function is deprecated, use dumps instead. + """ + warn("The writePlistToBytes function is deprecated, use dumps() instead", + DeprecationWarning, 2) + f = BytesIO() + dump(value, f, fmt=FMT_XML, sort_keys=True, skipkeys=False) + return f.getvalue() + class Data: + """ + Wrapper for binary data. - """Wrapper for binary data.""" + This class is deprecated, use a bytes object instead. + """ def __init__(self, data): if not isinstance(data, bytes): @@ -346,10 +217,10 @@ def fromBase64(cls, data): # base64.decodebytes just calls binascii.a2b_base64; # it seems overkill to use both base64 and binascii. - return cls(binascii.a2b_base64(data)) + return cls(_decode_base64(data)) def asBase64(self, maxlinelength=76): - return _encodeBase64(self.data, maxlinelength) + return _encode_base64(self.data, maxlinelength) def __eq__(self, other): if isinstance(other, self.__class__): @@ -362,43 +233,119 @@ def __repr__(self): return "%s(%s)" % (self.__class__.__name__, repr(self.data)) -class PlistParser: +# +# +# End of deprecated functionality +# +# - def __init__(self): + +# +# XML support +# + + +# XML 'header' +PLISTHEADER = b"""\ + + +""" + + +# Regex to find any control chars, except for \t \n and \r +_controlCharPat = re.compile( + r"[\x00\x01\x02\x03\x04\x05\x06\x07\x08\x0b\x0c\x0e\x0f" + r"\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f]") + +def _encode_base64(s, maxlinelength=76): + # copied from base64.encodebytes(), with added maxlinelength argument + maxbinsize = (maxlinelength//4)*3 + pieces = [] + for i in range(0, len(s), maxbinsize): + chunk = s[i : i + maxbinsize] + pieces.append(binascii.b2a_base64(chunk)) + return b''.join(pieces) + +def _decode_base64(s): + if isinstance(s, str): + return binascii.a2b_base64(s.encode("utf-8")) + + else: + return binascii.a2b_base64(s) + +# Contents should conform to a subset of ISO 8601 +# (in particular, YYYY '-' MM '-' DD 'T' HH ':' MM ':' SS 'Z'. 
Smaller units +# may be omitted with # a loss of precision) +_dateParser = re.compile(r"(?P\d\d\d\d)(?:-(?P\d\d)(?:-(?P\d\d)(?:T(?P\d\d)(?::(?P\d\d)(?::(?P\d\d))?)?)?)?)?Z", re.ASCII) + + +def _date_from_string(s): + order = ('year', 'month', 'day', 'hour', 'minute', 'second') + gd = _dateParser.match(s).groupdict() + lst = [] + for key in order: + val = gd[key] + if val is None: + break + lst.append(int(val)) + return datetime.datetime(*lst) + + +def _date_to_string(d): + return '%04d-%02d-%02dT%02d:%02d:%02dZ' % ( + d.year, d.month, d.day, + d.hour, d.minute, d.second + ) + +def _escape(text): + m = _controlCharPat.search(text) + if m is not None: + raise ValueError("strings can't contains control characters; " + "use bytes instead") + text = text.replace("\r\n", "\n") # convert DOS line endings + text = text.replace("\r", "\n") # convert Mac line endings + text = text.replace("&", "&") # escape '&' + text = text.replace("<", "<") # escape '<' + text = text.replace(">", ">") # escape '>' + return text + +class _PlistParser: + def __init__(self, use_builtin_types, dict_type): self.stack = [] - self.currentKey = None + self.current_key = None self.root = None + self._use_builtin_types = use_builtin_types + self._dict_type = dict_type def parse(self, fileobj): - from xml.parsers.expat import ParserCreate self.parser = ParserCreate() - self.parser.StartElementHandler = self.handleBeginElement - self.parser.EndElementHandler = self.handleEndElement - self.parser.CharacterDataHandler = self.handleData + self.parser.StartElementHandler = self.handle_begin_element + self.parser.EndElementHandler = self.handle_end_element + self.parser.CharacterDataHandler = self.handle_data self.parser.ParseFile(fileobj) return self.root - def handleBeginElement(self, element, attrs): + def handle_begin_element(self, element, attrs): self.data = [] handler = getattr(self, "begin_" + element, None) if handler is not None: handler(attrs) - def handleEndElement(self, element): + def handle_end_element(self, element): handler = getattr(self, "end_" + element, None) if handler is not None: handler() - def handleData(self, data): + def handle_data(self, data): self.data.append(data) - def addObject(self, value): - if self.currentKey is not None: + def add_object(self, value): + if self.current_key is not None: if not isinstance(self.stack[-1], type({})): raise ValueError("unexpected element at line %d" % self.parser.CurrentLineNumber) - self.stack[-1][self.currentKey] = value - self.currentKey = None + self.stack[-1][self.current_key] = value + self.current_key = None elif not self.stack: # this is the root object self.root = value @@ -408,7 +355,7 @@ self.parser.CurrentLineNumber) self.stack[-1].append(value) - def getData(self): + def get_data(self): data = ''.join(self.data) self.data = [] return data @@ -416,39 +363,648 @@ # element handlers def begin_dict(self, attrs): - d = _InternalDict() - self.addObject(d) + d = self._dict_type() + self.add_object(d) self.stack.append(d) + def end_dict(self): - if self.currentKey: + if self.current_key: raise ValueError("missing value for key '%s' at line %d" % - (self.currentKey,self.parser.CurrentLineNumber)) + (self.current_key,self.parser.CurrentLineNumber)) self.stack.pop() def end_key(self): - if self.currentKey or not isinstance(self.stack[-1], type({})): + if self.current_key or not isinstance(self.stack[-1], type({})): raise ValueError("unexpected key at line %d" % self.parser.CurrentLineNumber) - self.currentKey = self.getData() + self.current_key = self.get_data() 
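    # [Editor's note: the element handlers above and below are dispatched by
    # name -- handle_begin_element()/handle_end_element() look up
    # "begin_" + element and "end_" + element with getattr(), so <array>
    # maps to begin_array()/end_array() and elements without a handler are
    # silently ignored.  A tiny document exercising these handlers:
    #
    #     plistlib.loads(b'<plist version="1.0"><array><true/></array></plist>')
    #     # -> [True]
    # ]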
def begin_array(self, attrs): a = [] - self.addObject(a) + self.add_object(a) self.stack.append(a) + def end_array(self): self.stack.pop()
def end_true(self): - self.addObject(True) + self.add_object(True) + def end_false(self): - self.addObject(False) + self.add_object(False) + def end_integer(self): - self.addObject(int(self.getData())) + self.add_object(int(self.get_data())) + def end_real(self): - self.addObject(float(self.getData())) + self.add_object(float(self.get_data())) + def end_string(self): - self.addObject(self.getData()) + self.add_object(self.get_data()) + def end_data(self): - self.addObject(Data.fromBase64(self.getData().encode("utf-8"))) + if self._use_builtin_types: + self.add_object(_decode_base64(self.get_data())) + + else: + self.add_object(Data.fromBase64(self.get_data())) + def end_date(self): - self.addObject(_dateFromString(self.getData())) + self.add_object(_date_from_string(self.get_data())) + +
+class _DumbXMLWriter: + def __init__(self, file, indent_level=0, indent="\t"): + self.file = file + self.stack = [] + self._indent_level = indent_level + self.indent = indent + + def begin_element(self, element): + self.stack.append(element) + self.writeln("<%s>" % element) + self._indent_level += 1 + + def end_element(self, element): + assert self._indent_level > 0 + assert self.stack.pop() == element + self._indent_level -= 1 + self.writeln("</%s>" % element) + + def simple_element(self, element, value=None): + if value is not None: + value = _escape(value) + self.writeln("<%s>%s</%s>" % (element, value, element)) + + else: + self.writeln("<%s/>" % element) + + def writeln(self, line): + if line: + # plist has fixed encoding of utf-8 + + # XXX: is this test needed? + if isinstance(line, str): + line = line.encode('utf-8') + self.file.write(self._indent_level * self.indent) + self.file.write(line) + self.file.write(b'\n') + +
+class _PlistWriter(_DumbXMLWriter): + def __init__( + self, file, indent_level=0, indent=b"\t", writeHeader=1, + sort_keys=True, skipkeys=False): + + if writeHeader: + file.write(PLISTHEADER) + _DumbXMLWriter.__init__(self, file, indent_level, indent) + self._sort_keys = sort_keys + self._skipkeys = skipkeys + + def write(self, value): + self.writeln("<plist version=\"1.0\">") + self.write_value(value) + self.writeln("</plist>") +
+ def write_value(self, value): + if isinstance(value, str): + self.simple_element("string", value) + + elif value is True: + self.simple_element("true") + + elif value is False: + self.simple_element("false") + + elif isinstance(value, int): + self.simple_element("integer", "%d" % value) + + elif isinstance(value, float): + self.simple_element("real", repr(value)) + + elif isinstance(value, dict): + self.write_dict(value) + + elif isinstance(value, Data): + self.write_data(value) + + elif isinstance(value, (bytes, bytearray)): + self.write_bytes(value) + + elif isinstance(value, datetime.datetime): + self.simple_element("date", _date_to_string(value)) + + elif isinstance(value, (tuple, list)): + self.write_array(value) + + else: + raise TypeError("unsupported type: %s" % type(value)) +
+ def write_data(self, data): + self.write_bytes(data.data) + + def write_bytes(self, data): + self.begin_element("data") + self._indent_level -= 1 + maxlinelength = max( + 16, + 76 - len(self.indent.replace(b"\t", b" " * 8) * self._indent_level)) + + for line in _encode_base64(data, maxlinelength).split(b"\n"): + if line: + self.writeln(line) + self._indent_level += 1 + self.end_element("data") + + def write_dict(self, d): + if 
self._sort_keys: + items = sorted(d.items()) + else: + items = d.items() + + for key, value in items: + if not isinstance(key, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + self.simple_element("key", key) + self.write_value(value) + self.end_element("dict") + + else: + self.simple_element("dict") +
+ def write_array(self, array): + if array: + self.begin_element("array") + for value in array: + self.write_value(value) + self.end_element("array") + + else: + self.simple_element("array") + +
+def _is_fmt_xml(header): + prefixes = (b'<?xml', b'<plist') + + for pfx in prefixes: + if header.startswith(pfx): + return True + + # Also check for alternative XML encodings; expat does not + # support utf-32. + for bom, encoding in ( + (codecs.BOM_UTF8, "utf-8"), + (codecs.BOM_UTF16_BE, "utf-16-be"), + (codecs.BOM_UTF16_LE, "utf-16-le"), + ): + prefix = bom + '<?xml'.encode(encoding) + if header[:len(prefix)] == prefix: + return True + + return False + +
+# +# Binary Plist +# + +class InvalidFileException (ValueError): + def __init__(self, message="Invalid file"): + ValueError.__init__(self, message) + +_BINARY_FORMAT = {1: 'B', 2: 'H', 4: 'L', 8: 'Q'} +
+class _BinaryPlistParser: + """ + Read a binary plist file; raise InvalidFileException in case of error, + otherwise return the root object. + """ + def __init__(self, use_builtin_types, dict_type): + self._use_builtin_types = use_builtin_types + self._dict_type = dict_type + + def parse(self, fp): + try: + # The basic file format: + # HEADER + # object... + # refid->offset... + # TRAILER + self._fp = fp + self._fp.seek(-32, os.SEEK_END) + trailer = self._fp.read(32) + if len(trailer) != 32: + raise InvalidFileException() + ( + offset_size, self._ref_size, num_objects, top_object, + offset_table_offset + ) = struct.unpack('>6xBBQQQ', trailer) + self._fp.seek(offset_table_offset) + offset_format = '>' + _BINARY_FORMAT[offset_size] * num_objects + self._ref_format = _BINARY_FORMAT[self._ref_size] + self._object_offsets = struct.unpack( + offset_format, self._fp.read(offset_size * num_objects)) + return self._read_object(self._object_offsets[top_object]) + + except (OSError, IndexError, struct.error): + raise InvalidFileException() +
+ def _get_size(self, tokenL): + """ return the size of the next object.""" + if tokenL == 0xF: + m = self._fp.read(1)[0] & 0x3 + s = 1 << m + f = '>' + _BINARY_FORMAT[s] + return struct.unpack(f, self._fp.read(s))[0] + + return tokenL + + def _read_refs(self, n): + return struct.unpack( + '>' + self._ref_format * n, self._fp.read(n * self._ref_size)) +
+ def _read_object(self, offset): + """ + read the object at offset. + + May recursively read sub-objects (content of an array/dict/set) + """ + self._fp.seek(offset) + token = self._fp.read(1)[0] + tokenH, tokenL = token & 0xF0, token & 0x0F + + if token == 0x00: + return None + + elif token == 0x08: + return False + + elif token == 0x09: + return True + + # The referenced source code also mentions URL (0x0c, 0x0d) and + # UUID (0x0e), but neither can be generated using the Cocoa libraries. + + elif token == 0x0f: + return b'' + + elif tokenH == 0x10: # int + return int.from_bytes(self._fp.read(1 << tokenL), 'big') + + elif token == 0x22: # real + return struct.unpack('>f', self._fp.read(4))[0] + + elif token == 0x23: # real + return struct.unpack('>d', self._fp.read(8))[0] + + elif token == 0x33: # date + f = struct.unpack('>d', self._fp.read(8))[0] + # timestamp 0 of binary plists corresponds to 1/1/2001 + # (year of Mac OS X 10.0), instead of 1/1/1970. + return datetime.datetime.utcfromtimestamp(f + (31 * 365 + 8) * 86400) + + elif tokenH == 0x40: # data + s = self._get_size(tokenL) + if self._use_builtin_types: + return self._fp.read(s) + else: + return Data(self._fp.read(s)) + + elif tokenH == 0x50: # ascii string + s = self._get_size(tokenL) + result = self._fp.read(s).decode('ascii') + return result + + elif tokenH == 0x60: # unicode string + s = self._get_size(tokenL) + return self._fp.read(s * 2).decode('utf-16be') + + # tokenH == 0x80 is documented as 'UID' and appears to be used for + # keyed-archiving, not in plists. + + elif tokenH == 0xA0: # array + s = self._get_size(tokenL) + obj_refs = self._read_refs(s) + return [self._read_object(self._object_offsets[x]) + for x in obj_refs] + + # tokenH == 0xB0 is documented as 'ordset', but is not actually + # implemented in the Apple reference code. + + # tokenH == 0xC0 is documented as 'set', but sets cannot be used in + # plists. 
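(An aside, not taken from the patch: the trailer that parse() unpacks above with the '>6xBBQQQ' format can be inspected by hand on any binary plist. A minimal sketch, assuming a plistlib that already has the FMT_BINARY support added in this changeset:)

    import plistlib
    import struct

    data = plistlib.dumps({'answer': 42}, fmt=plistlib.FMT_BINARY)
    assert data[:8] == b'bplist00'   # magic + version written by the writer

    # Last 32 bytes: 6 unused bytes, offset-int size, object-ref size,
    # number of objects, index of the top object, offset of the offset table.
    (offset_size, ref_size, num_objects,
     top_object, table_offset) = struct.unpack('>6xBBQQQ', data[-32:])
    print(offset_size, ref_size, num_objects, top_object, table_offset)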
+ + elif tokenH == 0xD0: # dict + s = self._get_size(tokenL) + key_refs = self._read_refs(s) + obj_refs = self._read_refs(s) + result = self._dict_type() + for k, o in zip(key_refs, obj_refs): + result[self._read_object(self._object_offsets[k]) + ] = self._read_object(self._object_offsets[o]) + return result + + raise InvalidFileException() + +def _count_to_size(count): + if count < 1 << 8: + return 1 + + elif count < 1 << 16: + return 2 + + elif count << 1 << 32: + return 4 + + else: + return 8 + +class _BinaryPlistWriter (object): + def __init__(self, fp, sort_keys, skipkeys): + self._fp = fp + self._sort_keys = sort_keys + self._skipkeys = skipkeys + + def write(self, value): + + # Flattened object list: + self._objlist = [] + + # Mappings from object->objectid + # First dict has (type(object), object) as the key, + # second dict is used when object is not hashable and + # has id(object) as the key. + self._objtable = {} + self._objidtable = {} + + # Create list of all objects in the plist + self._flatten(value) + + # Size of object references in serialized containers + # depends on the number of objects in the plist. + num_objects = len(self._objlist) + self._object_offsets = [0]*num_objects + self._ref_size = _count_to_size(num_objects) + + self._ref_format = _BINARY_FORMAT[self._ref_size] + + # Write file header + self._fp.write(b'bplist00') + + # Write object list + for obj in self._objlist: + self._write_object(obj) + + # Write refnum->object offset table + top_object = self._getrefnum(value) + offset_table_offset = self._fp.tell() + offset_size = _count_to_size(offset_table_offset) + offset_format = '>' + _BINARY_FORMAT[offset_size] * num_objects + self._fp.write(struct.pack(offset_format, *self._object_offsets)) + + # Write trailer + sort_version = 0 + trailer = ( + sort_version, offset_size, self._ref_size, num_objects, + top_object, offset_table_offset + ) + self._fp.write(struct.pack('>5xBBBQQQ', *trailer)) + + def _flatten(self, value): + # First check if the object is in the object table, not used for + # containers to ensure that two subcontainers with the same contents + # will be serialized as distinct values. 
+ if isinstance(value, ( + str, int, float, datetime.datetime, bytes, bytearray)): + if (type(value), value) in self._objtable: + return + + elif isinstance(value, Data): + if (type(value.data), value.data) in self._objtable: + return + + # Add to objectreference map + refnum = len(self._objlist) + self._objlist.append(value) + try: + if isinstance(value, Data): + self._objtable[(type(value.data), value.data)] = refnum + else: + self._objtable[(type(value), value)] = refnum + except TypeError: + self._objidtable[id(value)] = refnum + + # And finally recurse into containers + if isinstance(value, dict): + keys = [] + values = [] + items = value.items() + if self._sort_keys: + items = sorted(items) + + for k, v in items: + if not isinstance(k, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + keys.append(k) + values.append(v) + + for o in itertools.chain(keys, values): + self._flatten(o) + + elif isinstance(value, (list, tuple)): + for o in value: + self._flatten(o) + + def _getrefnum(self, value): + try: + if isinstance(value, Data): + return self._objtable[(type(value.data), value.data)] + else: + return self._objtable[(type(value), value)] + except TypeError: + return self._objidtable[id(value)] + + def _write_size(self, token, size): + if size < 15: + self._fp.write(struct.pack('>B', token | size)) + + elif size < 1 << 8: + self._fp.write(struct.pack('>BBB', token | 0xF, 0x10, size)) + + elif size < 1 << 16: + self._fp.write(struct.pack('>BBH', token | 0xF, 0x11, size)) + + elif size < 1 << 32: + self._fp.write(struct.pack('>BBL', token | 0xF, 0x12, size)) + + else: + self._fp.write(struct.pack('>BBQ', token | 0xF, 0x13, size)) + + def _write_object(self, value): + ref = self._getrefnum(value) + self._object_offsets[ref] = self._fp.tell() + if value is None: + self._fp.write(b'\x00') + + elif value is False: + self._fp.write(b'\x08') + + elif value is True: + self._fp.write(b'\x09') + + elif isinstance(value, int): + if value < 1 << 8: + self._fp.write(struct.pack('>BB', 0x10, value)) + elif value < 1 << 16: + self._fp.write(struct.pack('>BH', 0x11, value)) + elif value < 1 << 32: + self._fp.write(struct.pack('>BL', 0x12, value)) + else: + self._fp.write(struct.pack('>BQ', 0x13, value)) + + elif isinstance(value, float): + self._fp.write(struct.pack('>Bd', 0x23, value)) + + elif isinstance(value, datetime.datetime): + f = (value - datetime.datetime(2001, 1, 1)).total_seconds() + self._fp.write(struct.pack('>Bd', 0x33, f)) + + elif isinstance(value, Data): + self._write_size(0x40, len(value.data)) + self._fp.write(value.data) + + elif isinstance(value, (bytes, bytearray)): + self._write_size(0x40, len(value)) + self._fp.write(value) + + elif isinstance(value, str): + try: + t = value.encode('ascii') + self._write_size(0x50, len(value)) + except UnicodeEncodeError: + t = value.encode('utf-16be') + self._write_size(0x60, len(value)) + + self._fp.write(t) + + elif isinstance(value, (list, tuple)): + refs = [self._getrefnum(o) for o in value] + s = len(refs) + self._write_size(0xA0, s) + self._fp.write(struct.pack('>' + self._ref_format * s, *refs)) + + elif isinstance(value, dict): + keyRefs, valRefs = [], [] + + if self._sort_keys: + rootItems = sorted(value.items()) + else: + rootItems = value.items() + + for k, v in rootItems: + if not isinstance(k, str): + if self._skipkeys: + continue + raise TypeError("keys must be strings") + keyRefs.append(self._getrefnum(k)) + valRefs.append(self._getrefnum(v)) + + s = len(keyRefs) + self._write_size(0xD0, s) + 
self._fp.write(struct.pack('>' + self._ref_format * s, *keyRefs)) + self._fp.write(struct.pack('>' + self._ref_format * s, *valRefs)) + + else: + raise InvalidFileException() + + +def _is_fmt_binary(header): + return header[:8] == b'bplist00' + + +# +# Generic bits +# + +_FORMATS={ + FMT_XML: dict( + detect=_is_fmt_xml, + parser=_PlistParser, + writer=_PlistWriter, + ), + FMT_BINARY: dict( + detect=_is_fmt_binary, + parser=_BinaryPlistParser, + writer=_BinaryPlistWriter, + ) +} + + +def load(fp, *, fmt=None, use_builtin_types=True, dict_type=dict): + """Read a .plist file. 'fp' should be (readable) file object. + Return the unpacked root object (which usually is a dictionary). + """ + if fmt is None: + header = fp.read(32) + fp.seek(0) + for info in _FORMATS.values(): + if info['detect'](header): + p = info['parser']( + use_builtin_types=use_builtin_types, + dict_type=dict_type, + ) + break + + else: + raise InvalidFileException() + + else: + p = _FORMATS[fmt]['parser'](use_builtin_types=use_builtin_types) + + return p.parse(fp) + + +def loads(value, *, fmt=None, use_builtin_types=True, dict_type=dict): + """Read a .plist file from a bytes object. + Return the unpacked root object (which usually is a dictionary). + """ + fp = BytesIO(value) + return load( + fp, fmt=fmt, use_builtin_types=use_builtin_types, dict_type=dict_type) + + +def dump(value, fp, *, fmt=FMT_XML, sort_keys=True, skipkeys=False): + """Write 'value' to a .plist file. 'fp' should be a (writable) + file object. + """ + if fmt not in _FORMATS: + raise ValueError("Unsupported format: %r"%(fmt,)) + + writer = _FORMATS[fmt]["writer"](fp, sort_keys=sort_keys, skipkeys=skipkeys) + writer.write(value) + + +def dumps(value, *, fmt=FMT_XML, skipkeys=False, sort_keys=True): + """Return a bytes object with the contents for a .plist file. 
+ """ + fp = BytesIO() + dump(value, fp, fmt=fmt, skipkeys=skipkeys, sort_keys=sort_keys) + return fp.getvalue() diff --git a/Lib/test/test_plistlib.py b/Lib/test/test_plistlib.py --- a/Lib/test/test_plistlib.py +++ b/Lib/test/test_plistlib.py @@ -1,94 +1,87 @@ -# Copyright (C) 2003 Python Software Foundation +# Copyright (C) 2003-2013 Python Software Foundation import unittest import plistlib import os import datetime +import codecs +import binascii +import collections from test import support +from io import BytesIO +ALL_FORMATS=(plistlib.FMT_XML, plistlib.FMT_BINARY) -# This test data was generated through Cocoa's NSDictionary class -TESTDATA = b""" - - - - aDate - 2004-10-26T10:33:33Z - aDict - - aFalseValue - - aTrueValue - - aUnicodeValue - M\xc3\xa4ssig, Ma\xc3\x9f - anotherString - <hello & 'hi' there!> - deeperDict - - a - 17 - b - 32.5 - c - - 1 - 2 - text - - - - aFloat - 0.5 - aList - - A - B - 12 - 32.5 - - 1 - 2 - 3 - - - aString - Doodah - anEmptyDict - - anEmptyList - - anInt - 728 - nestedData - - - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5r - PgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5 - IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBi - aW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3Rz - IG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQID - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAw== - - - someData - - PGJpbmFyeSBndW5rPg== - - someMoreData - - PGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8 - bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxs - b3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxv - dHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90 - cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAw== - - \xc3\x85benraa - That was a unicode key. 
- - -""".replace(b" " * 8, b"\t") # Apple as well as plistlib.py output hard tabs +# The testdata is generated using Mac/Tools/plistlib_generate_testdata.py +# (which using PyObjC to control the Cocoa classes for generating plists) +TESTDATA={ + plistlib.FMT_XML: binascii.a2b_base64(b''' + PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPCFET0NU + WVBFIHBsaXN0IFBVQkxJQyAiLS8vQXBwbGUvL0RURCBQTElTVCAxLjAvL0VO + IiAiaHR0cDovL3d3dy5hcHBsZS5jb20vRFREcy9Qcm9wZXJ0eUxpc3QtMS4w + LmR0ZCI+CjxwbGlzdCB2ZXJzaW9uPSIxLjAiPgo8ZGljdD4KCTxrZXk+YURh + dGU8L2tleT4KCTxkYXRlPjIwMDQtMTAtMjZUMTA6MzM6MzNaPC9kYXRlPgoJ + PGtleT5hRGljdDwva2V5PgoJPGRpY3Q+CgkJPGtleT5hRmFsc2VWYWx1ZTwv + a2V5PgoJCTxmYWxzZS8+CgkJPGtleT5hVHJ1ZVZhbHVlPC9rZXk+CgkJPHRy + dWUvPgoJCTxrZXk+YVVuaWNvZGVWYWx1ZTwva2V5PgoJCTxzdHJpbmc+TcOk + c3NpZywgTWHDnzwvc3RyaW5nPgoJCTxrZXk+YW5vdGhlclN0cmluZzwva2V5 + PgoJCTxzdHJpbmc+Jmx0O2hlbGxvICZhbXA7ICdoaScgdGhlcmUhJmd0Ozwv + c3RyaW5nPgoJCTxrZXk+ZGVlcGVyRGljdDwva2V5PgoJCTxkaWN0PgoJCQk8 + a2V5PmE8L2tleT4KCQkJPGludGVnZXI+MTc8L2ludGVnZXI+CgkJCTxrZXk+ + Yjwva2V5PgoJCQk8cmVhbD4zMi41PC9yZWFsPgoJCQk8a2V5PmM8L2tleT4K + CQkJPGFycmF5PgoJCQkJPGludGVnZXI+MTwvaW50ZWdlcj4KCQkJCTxpbnRl + Z2VyPjI8L2ludGVnZXI+CgkJCQk8c3RyaW5nPnRleHQ8L3N0cmluZz4KCQkJ + PC9hcnJheT4KCQk8L2RpY3Q+Cgk8L2RpY3Q+Cgk8a2V5PmFGbG9hdDwva2V5 + PgoJPHJlYWw+MC41PC9yZWFsPgoJPGtleT5hTGlzdDwva2V5PgoJPGFycmF5 + PgoJCTxzdHJpbmc+QTwvc3RyaW5nPgoJCTxzdHJpbmc+Qjwvc3RyaW5nPgoJ + CTxpbnRlZ2VyPjEyPC9pbnRlZ2VyPgoJCTxyZWFsPjMyLjU8L3JlYWw+CgkJ + PGFycmF5PgoJCQk8aW50ZWdlcj4xPC9pbnRlZ2VyPgoJCQk8aW50ZWdlcj4y + PC9pbnRlZ2VyPgoJCQk8aW50ZWdlcj4zPC9pbnRlZ2VyPgoJCTwvYXJyYXk+ + Cgk8L2FycmF5PgoJPGtleT5hU3RyaW5nPC9rZXk+Cgk8c3RyaW5nPkRvb2Rh + aDwvc3RyaW5nPgoJPGtleT5hbkVtcHR5RGljdDwva2V5PgoJPGRpY3QvPgoJ + PGtleT5hbkVtcHR5TGlzdDwva2V5PgoJPGFycmF5Lz4KCTxrZXk+YW5JbnQ8 + L2tleT4KCTxpbnRlZ2VyPjcyODwvaW50ZWdlcj4KCTxrZXk+bmVzdGVkRGF0 + YTwva2V5PgoJPGFycmF5PgoJCTxkYXRhPgoJCVBHeHZkSE1nYjJZZ1ltbHVZ + WEo1SUdkMWJtcytBQUVDQXp4c2IzUnpJRzltSUdKcGJtRnllU0JuZFc1cgoJ + CVBnQUJBZ004Ykc5MGN5QnZaaUJpYVc1aGNua2daM1Z1YXo0QUFRSURQR3h2 + ZEhNZ2IyWWdZbWx1WVhKNQoJCUlHZDFibXMrQUFFQ0F6eHNiM1J6SUc5bUlH + SnBibUZ5ZVNCbmRXNXJQZ0FCQWdNOGJHOTBjeUJ2WmlCaQoJCWFXNWhjbmtn + WjNWdWF6NEFBUUlEUEd4dmRITWdiMllnWW1sdVlYSjVJR2QxYm1zK0FBRUNB + enhzYjNSegoJCUlHOW1JR0pwYm1GeWVTQm5kVzVyUGdBQkFnTThiRzkwY3lC + dlppQmlhVzVoY25rZ1ozVnVhejRBQVFJRAoJCVBHeHZkSE1nYjJZZ1ltbHVZ + WEo1SUdkMWJtcytBQUVDQXc9PQoJCTwvZGF0YT4KCTwvYXJyYXk+Cgk8a2V5 + PnNvbWVEYXRhPC9rZXk+Cgk8ZGF0YT4KCVBHSnBibUZ5ZVNCbmRXNXJQZz09 + Cgk8L2RhdGE+Cgk8a2V5PnNvbWVNb3JlRGF0YTwva2V5PgoJPGRhdGE+CglQ + R3h2ZEhNZ2IyWWdZbWx1WVhKNUlHZDFibXMrQUFFQ0F6eHNiM1J6SUc5bUlH + SnBibUZ5ZVNCbmRXNXJQZ0FCQWdNOAoJYkc5MGN5QnZaaUJpYVc1aGNua2da + M1Z1YXo0QUFRSURQR3h2ZEhNZ2IyWWdZbWx1WVhKNUlHZDFibXMrQUFFQ0F6 + eHMKCWIzUnpJRzltSUdKcGJtRnllU0JuZFc1clBnQUJBZ004Ykc5MGN5QnZa + aUJpYVc1aGNua2daM1Z1YXo0QUFRSURQR3h2CglkSE1nYjJZZ1ltbHVZWEo1 + SUdkMWJtcytBQUVDQXp4c2IzUnpJRzltSUdKcGJtRnllU0JuZFc1clBnQUJB + Z004Ykc5MAoJY3lCdlppQmlhVzVoY25rZ1ozVnVhejRBQVFJRFBHeHZkSE1n + YjJZZ1ltbHVZWEo1SUdkMWJtcytBQUVDQXc9PQoJPC9kYXRhPgoJPGtleT7D + hWJlbnJhYTwva2V5PgoJPHN0cmluZz5UaGF0IHdhcyBhIHVuaWNvZGUga2V5 + Ljwvc3RyaW5nPgo8L2RpY3Q+CjwvcGxpc3Q+Cg=='''), + plistlib.FMT_BINARY: binascii.a2b_base64(b''' + YnBsaXN0MDDcAQIDBAUGBwgJCgsMDQ4iIykqKywtLy4wVWFEYXRlVWFEaWN0 + VmFGbG9hdFVhTGlzdFdhU3RyaW5nW2FuRW1wdHlEaWN0W2FuRW1wdHlMaXN0 + VWFuSW50Wm5lc3RlZERhdGFYc29tZURhdGFcc29tZU1vcmVEYXRhZwDFAGIA + ZQBuAHIAYQBhM0GcuX30AAAA1Q8QERITFBUWFxhbYUZhbHNlVmFsdWVaYVRy + dWVWYWx1ZV1hVW5pY29kZVZhbHVlXWFub3RoZXJTdHJpbmdaZGVlcGVyRGlj + 
dAgJawBNAOQAcwBzAGkAZwAsACAATQBhAN9fEBU8aGVsbG8gJiAnaGknIHRo + ZXJlIT7TGRobHB0eUWFRYlFjEBEjQEBAAAAAAACjHyAhEAEQAlR0ZXh0Iz/g + AAAAAAAApSQlJh0nUUFRQhAMox8gKBADVkRvb2RhaNCgEQLYoS5PEPo8bG90 + cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAEC + Azxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vu + az4AAQIDPGxvdHMgb2YgYmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFy + eSBndW5rPgABAgM8bG90cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDPGxvdHMgb2Yg + YmluYXJ5IGd1bms+AAECAzxsb3RzIG9mIGJpbmFyeSBndW5rPgABAgM8bG90 + cyBvZiBiaW5hcnkgZ3Vuaz4AAQIDTTxiaW5hcnkgZ3Vuaz5fEBdUaGF0IHdh + cyBhIHVuaWNvZGUga2V5LgAIACEAJwAtADQAOgBCAE4AWgBgAGsAdACBAJAA + mQCkALAAuwDJANcA4gDjAOQA+wETARoBHAEeASABIgErAS8BMQEzATgBQQFH + AUkBSwFNAVEBUwFaAVsBXAFfAWECXgJsAAAAAAAAAgEAAAAAAAAAMQAAAAAA + AAAAAAAAAAAAAoY='''), +} class TestPlistlib(unittest.TestCase): @@ -99,7 +92,7 @@ except: pass - def _create(self): + def _create(self, fmt=None): pl = dict( aString="Doodah", aList=["A", "B", 12, 32.5, [1, 2, 3]], @@ -112,9 +105,9 @@ aFalseValue=False, deeperDict=dict(a=17, b=32.5, c=[1, 2, "text"]), ), - someData = plistlib.Data(b""), - someMoreData = plistlib.Data(b"\0\1\2\3" * 10), - nestedData = [plistlib.Data(b"\0\1\2\3" * 10)], + someData = b"", + someMoreData = b"\0\1\2\3" * 10, + nestedData = [b"\0\1\2\3" * 10], aDate = datetime.datetime(2004, 10, 26, 10, 33, 33), anEmptyDict = dict(), anEmptyList = list() @@ -129,49 +122,191 @@ def test_io(self): pl = self._create() - plistlib.writePlist(pl, support.TESTFN) - pl2 = plistlib.readPlist(support.TESTFN) + with open(support.TESTFN, 'wb') as fp: + plistlib.dump(pl, fp) + + with open(support.TESTFN, 'rb') as fp: + pl2 = plistlib.load(fp) + self.assertEqual(dict(pl), dict(pl2)) + self.assertRaises(AttributeError, plistlib.dump, pl, 'filename') + self.assertRaises(AttributeError, plistlib.load, 'filename') + + def test_bytes(self): pl = self._create() - data = plistlib.writePlistToBytes(pl) - pl2 = plistlib.readPlistFromBytes(data) + data = plistlib.dumps(pl) + pl2 = plistlib.loads(data) + self.assertNotIsInstance(pl, plistlib._InternalDict) self.assertEqual(dict(pl), dict(pl2)) - data2 = plistlib.writePlistToBytes(pl2) + data2 = plistlib.dumps(pl2) self.assertEqual(data, data2) def test_indentation_array(self): - data = [[[[[[[[{'test': plistlib.Data(b'aaaaaa')}]]]]]]]] - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = [[[[[[[[{'test': b'aaaaaa'}]]]]]]]] + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_indentation_dict(self): - data = {'1': {'2': {'3': {'4': {'5': {'6': {'7': {'8': {'9': plistlib.Data(b'aaaaaa')}}}}}}}}} - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = {'1': {'2': {'3': {'4': {'5': {'6': {'7': {'8': {'9': b'aaaaaa'}}}}}}}}} + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_indentation_dict_mix(self): - data = {'1': {'2': [{'3': [[[[[{'test': plistlib.Data(b'aaaaaa')}]]]]]}]}} - self.assertEqual(plistlib.readPlistFromBytes(plistlib.writePlistToBytes(data)), data) + data = {'1': {'2': [{'3': [[[[[{'test': b'aaaaaa'}]]]]]}]}} + self.assertEqual(plistlib.loads(plistlib.dumps(data)), data) def test_appleformatting(self): - pl = plistlib.readPlistFromBytes(TESTDATA) - data = plistlib.writePlistToBytes(pl) - self.assertEqual(data, TESTDATA, - "generated data was not identical to Apple's output") + for use_builtin_types in (True, False): + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt, use_builtin_types=use_builtin_types): + pl = 
plistlib.loads(TESTDATA[fmt], + use_builtin_types=use_builtin_types) + data = plistlib.dumps(pl, fmt=fmt) + self.assertEqual(data, TESTDATA[fmt], + "generated data was not identical to Apple's output") + def test_appleformattingfromliteral(self): - pl = self._create() - pl2 = plistlib.readPlistFromBytes(TESTDATA) - self.assertEqual(dict(pl), dict(pl2), - "generated data was not identical to Apple's output") + self.maxDiff = None + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + pl = self._create(fmt=fmt) + pl2 = plistlib.loads(TESTDATA[fmt]) + self.assertEqual(dict(pl), dict(pl2), + "generated data was not identical to Apple's output") def test_bytesio(self): - from io import BytesIO - b = BytesIO() - pl = self._create() - plistlib.writePlist(pl, b) - pl2 = plistlib.readPlist(BytesIO(b.getvalue())) - self.assertEqual(dict(pl), dict(pl2)) + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + b = BytesIO() + pl = self._create(fmt=fmt) + plistlib.dump(pl, b, fmt=fmt) + pl2 = plistlib.load(BytesIO(b.getvalue())) + self.assertEqual(dict(pl), dict(pl2)) + + def test_keysort_bytesio(self): + pl = collections.OrderedDict() + pl['b'] = 1 + pl['a'] = 2 + pl['c'] = 3 + + for fmt in ALL_FORMATS: + for sort_keys in (False, True): + with self.subTest(fmt=fmt, sort_keys=sort_keys): + b = BytesIO() + + plistlib.dump(pl, b, fmt=fmt, sort_keys=sort_keys) + pl2 = plistlib.load(BytesIO(b.getvalue()), + dict_type=collections.OrderedDict) + + self.assertEqual(dict(pl), dict(pl2)) + if sort_keys: + self.assertEqual(list(pl2.keys()), ['a', 'b', 'c']) + else: + self.assertEqual(list(pl2.keys()), ['b', 'a', 'c']) + + def test_keysort(self): + pl = collections.OrderedDict() + pl['b'] = 1 + pl['a'] = 2 + pl['c'] = 3 + + for fmt in ALL_FORMATS: + for sort_keys in (False, True): + with self.subTest(fmt=fmt, sort_keys=sort_keys): + data = plistlib.dumps(pl, fmt=fmt, sort_keys=sort_keys) + pl2 = plistlib.loads(data, dict_type=collections.OrderedDict) + + self.assertEqual(dict(pl), dict(pl2)) + if sort_keys: + self.assertEqual(list(pl2.keys()), ['a', 'b', 'c']) + else: + self.assertEqual(list(pl2.keys()), ['b', 'a', 'c']) + + def test_keys_no_string(self): + pl = { 42: 'aNumber' } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + self.assertRaises(TypeError, plistlib.dumps, pl, fmt=fmt) + + b = BytesIO() + self.assertRaises(TypeError, plistlib.dump, pl, b, fmt=fmt) + + def test_skipkeys(self): + pl = { + 42: 'aNumber', + 'snake': 'aWord', + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps( + pl, fmt=fmt, skipkeys=True, sort_keys=False) + + pl2 = plistlib.loads(data) + self.assertEqual(pl2, {'snake': 'aWord'}) + + fp = BytesIO() + plistlib.dump( + pl, fp, fmt=fmt, skipkeys=True, sort_keys=False) + data = fp.getvalue() + pl2 = plistlib.loads(fp.getvalue()) + self.assertEqual(pl2, {'snake': 'aWord'}) + + def test_tuple_members(self): + pl = { + 'first': (1, 2), + 'second': (1, 2), + 'third': (3, 4), + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + }) + self.assertIsNot(pl2['first'], pl2['second']) + + def test_list_members(self): + pl = { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': [1, 2], + 'second': [1, 2], + 'third': [3, 4], + 
}) + self.assertIsNot(pl2['first'], pl2['second']) + + def test_dict_members(self): + pl = { + 'first': {'a': 1}, + 'second': {'a': 1}, + 'third': {'b': 2 }, + } + + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + data = plistlib.dumps(pl, fmt=fmt) + pl2 = plistlib.loads(data) + self.assertEqual(pl2, { + 'first': {'a': 1}, + 'second': {'a': 1}, + 'third': {'b': 2 }, + }) + self.assertIsNot(pl2['first'], pl2['second']) def test_controlcharacters(self): for i in range(128): @@ -179,25 +314,27 @@ testString = "string containing %s" % c if i >= 32 or c in "\r\n\t": # \r, \n and \t are the only legal control chars in XML - plistlib.writePlistToBytes(testString) + plistlib.dumps(testString, fmt=plistlib.FMT_XML) else: self.assertRaises(ValueError, - plistlib.writePlistToBytes, + plistlib.dumps, testString) def test_nondictroot(self): - test1 = "abc" - test2 = [1, 2, 3, "abc"] - result1 = plistlib.readPlistFromBytes(plistlib.writePlistToBytes(test1)) - result2 = plistlib.readPlistFromBytes(plistlib.writePlistToBytes(test2)) - self.assertEqual(test1, result1) - self.assertEqual(test2, result2) + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + test1 = "abc" + test2 = [1, 2, 3, "abc"] + result1 = plistlib.loads(plistlib.dumps(test1, fmt=fmt)) + result2 = plistlib.loads(plistlib.dumps(test2, fmt=fmt)) + self.assertEqual(test1, result1) + self.assertEqual(test2, result2) def test_invalidarray(self): for i in ["key inside an array", "key inside an array23", "key inside an array3"]: - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) def test_invaliddict(self): @@ -206,22 +343,130 @@ "missing key", "k1v15.3" "k1k2double key"]: - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, ("%s"%i).encode()) def test_invalidinteger(self): - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, b"not integer") def test_invalidreal(self): - self.assertRaises(ValueError, plistlib.readPlistFromBytes, + self.assertRaises(ValueError, plistlib.loads, b"not real") + def test_xml_encodings(self): + base = TESTDATA[plistlib.FMT_XML] + + for xml_encoding, encoding, bom in [ + (b'utf-8', 'utf-8', codecs.BOM_UTF8), + (b'utf-16', 'utf-16-le', codecs.BOM_UTF16_LE), + (b'utf-16', 'utf-16-be', codecs.BOM_UTF16_BE), + # Expat does not support UTF-32 + #(b'utf-32', 'utf-32-le', codecs.BOM_UTF32_LE), + #(b'utf-32', 'utf-32-be', codecs.BOM_UTF32_BE), + ]: + + pl = self._create(fmt=plistlib.FMT_XML) + with self.subTest(encoding=encoding): + data = base.replace(b'UTF-8', xml_encoding) + data = bom + data.decode('utf-8').encode(encoding) + pl2 = plistlib.loads(data) + self.assertEqual(dict(pl), dict(pl2)) + + +class TestPlistlibDeprecated(unittest.TestCase): + def test_io_deprecated(self): + pl_in = { + 'key': 42, + 'sub': { + 'key': 9, + 'alt': 'value', + 'data': b'buffer', + } + } + pl_out = plistlib._InternalDict({ + 'key': 42, + 'sub': plistlib._InternalDict({ + 'key': 9, + 'alt': 'value', + 'data': plistlib.Data(b'buffer'), + }) + }) + + self.addCleanup(support.unlink, support.TESTFN) + with self.assertWarns(DeprecationWarning): + plistlib.writePlist(pl_in, support.TESTFN) + + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlist(support.TESTFN) + + self.assertEqual(pl_out, pl2) + + 
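(For orientation, an editor's sketch of how the spellings exercised by these tests line up: the old functions keep working but emit DeprecationWarning, and Data wrappers correspond to plain bytes in the new API:)

    import plistlib
    import warnings

    value = {'key': 42, 'data': b'buffer'}

    new_bytes = plistlib.dumps(value)          # new API, FMT_XML by default
    assert plistlib.loads(new_bytes) == value  # bytes round-trip as bytes

    with warnings.catch_warnings():
        warnings.simplefilter('ignore', DeprecationWarning)
        old_bytes = plistlib.writePlistToBytes(value)   # deprecated spelling
        old_value = plistlib.readPlistFromBytes(old_bytes)

    assert old_value['data'] == plistlib.Data(b'buffer')  # old API wraps bytes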
os.unlink(support.TESTFN) + + with open(support.TESTFN, 'wb') as fp: + with self.assertWarns(DeprecationWarning): + plistlib.writePlist(pl_in, fp) + + with open(support.TESTFN, 'rb') as fp: + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlist(fp) + + self.assertEqual(pl_out, pl2) + + def test_bytes_deprecated(self): + pl = { + 'key': 42, + 'sub': { + 'key': 9, + 'alt': 'value', + 'data': b'buffer', + } + } + with self.assertWarns(DeprecationWarning): + data = plistlib.writePlistToBytes(pl) + + with self.assertWarns(DeprecationWarning): + pl2 = plistlib.readPlistFromBytes(data) + + self.assertIsInstance(pl2, plistlib._InternalDict) + self.assertEqual(pl2, plistlib._InternalDict( + key=42, + sub=plistlib._InternalDict( + key=9, + alt='value', + data=plistlib.Data(b'buffer'), + ) + )) + + with self.assertWarns(DeprecationWarning): + data2 = plistlib.writePlistToBytes(pl2) + self.assertEqual(data, data2) + + def test_dataobject_deprecated(self): + in_data = { 'key': plistlib.Data(b'hello') } + out_data = { 'key': b'hello' } + + buf = plistlib.dumps(in_data) + + cur = plistlib.loads(buf) + self.assertEqual(cur, out_data) + self.assertNotEqual(cur, in_data) + + cur = plistlib.loads(buf, use_builtin_types=False) + self.assertNotEqual(cur, out_data) + self.assertEqual(cur, in_data) + + with self.assertWarns(DeprecationWarning): + cur = plistlib.readPlistFromBytes(buf) + self.assertNotEqual(cur, out_data) + self.assertEqual(cur, in_data) + def test_main(): - support.run_unittest(TestPlistlib) + support.run_unittest(TestPlistlib, TestPlistlibDeprecated) if __name__ == '__main__': diff --git a/Mac/Tools/plistlib_generate_testdata.py b/Mac/Tools/plistlib_generate_testdata.py new file mode 100644 --- /dev/null +++ b/Mac/Tools/plistlib_generate_testdata.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python3 + +from Cocoa import NSMutableDictionary, NSMutableArray, NSString, NSDate +from Cocoa import NSPropertyListSerialization, NSPropertyListOpenStepFormat +from Cocoa import NSPropertyListXMLFormat_v1_0, NSPropertyListBinaryFormat_v1_0 +from Cocoa import CFUUIDCreateFromString, NSNull, NSUUID, CFPropertyListCreateData +from Cocoa import NSURL + +import datetime +from collections import OrderedDict +import binascii + +FORMATS=[ +# ('openstep', NSPropertyListOpenStepFormat), + ('plistlib.FMT_XML', NSPropertyListXMLFormat_v1_0), + ('plistlib.FMT_BINARY', NSPropertyListBinaryFormat_v1_0), +] + +def nsstr(value): + return NSString.alloc().initWithString_(value) + + +def main(): + pl = OrderedDict() + + seconds = datetime.datetime(2004, 10, 26, 10, 33, 33, tzinfo=datetime.timezone(datetime.timedelta(0))).timestamp() + pl[nsstr('aDate')] = NSDate.dateWithTimeIntervalSince1970_(seconds) + + pl[nsstr('aDict')] = d = OrderedDict() + d[nsstr('aFalseValue')] = False + d[nsstr('aTrueValue')] = True + d[nsstr('aUnicodeValue')] = "M\xe4ssig, Ma\xdf" + d[nsstr('anotherString')] = "" + d[nsstr('deeperDict')] = dd = OrderedDict() + dd[nsstr('a')] = 17 + dd[nsstr('b')] = 32.5 + dd[nsstr('c')] = a = NSMutableArray.alloc().init() + a.append(1) + a.append(2) + a.append(nsstr('text')) + + pl[nsstr('aFloat')] = 0.5 + + pl[nsstr('aList')] = a = NSMutableArray.alloc().init() + a.append(nsstr('A')) + a.append(nsstr('B')) + a.append(12) + a.append(32.5) + aa = NSMutableArray.alloc().init() + a.append(aa) + aa.append(1) + aa.append(2) + aa.append(3) + + pl[nsstr('aString')] = nsstr('Doodah') + + pl[nsstr('anEmptyDict')] = NSMutableDictionary.alloc().init() + + pl[nsstr('anEmptyList')] = NSMutableArray.alloc().init() + + 
pl[nsstr('anInt')] = 728 + + pl[nsstr('nestedData')] = a = NSMutableArray.alloc().init() + a.append(b'''\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03''') + + + pl[nsstr('someData')] = b'' + + pl[nsstr('someMoreData')] = b'''\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03\x00\x01\x02\x03''' + + pl[nsstr('\xc5benraa')] = nsstr("That was a unicode key.") + + print("TESTDATA={") + for fmt_name, fmt_key in FORMATS: + data, error = NSPropertyListSerialization.dataWithPropertyList_format_options_error_( + pl, fmt_key, 0, None) + if data is None: + print("Cannot serialize", fmt_name, error) + + else: + print(" %s: binascii.a2b_base64(b'''\n %s'''),"%(fmt_name, _encode_base64(bytes(data)).decode('ascii')[:-1])) + + print("}") + print() + +def _encode_base64(s, maxlinelength=60): + maxbinsize = (maxlinelength//4)*3 + pieces = [] + for i in range(0, len(s), maxbinsize): + chunk = s[i : i + maxbinsize] + pieces.append(binascii.b2a_base64(chunk)) + return b' '.join(pieces) + +main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -62,6 +62,8 @@ - Issue #19682: Fix compatibility issue with old version of OpenSSL that was introduced by Issue #18379. +- Issue #14455: plistlib now supports binary plists and has an updated API. + - Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -453,54 +453,80 @@ Py_ssize_t avail = hi - lo; char *buf = PyByteArray_AS_STRING(self); Py_ssize_t growth = bytes_len - avail; + int res = 0; assert(avail >= 0); - if (growth != 0) { - if (growth < 0) { - if (!_canresize(self)) + if (growth < 0) { + if (!_canresize(self)) + return -1; + + if (lo == 0) { + /* Shrink the buffer by advancing its logical start */ + self->ob_start -= growth; + /* + 0 lo hi old_size + | |<----avail----->|<-----tail------>| + | |<-bytes_len->|<-----tail------>| + 0 new_lo new_hi new_size + */ + } + else { + /* + 0 lo hi old_size + | |<----avail----->|<-----tomove------>| + | |<-bytes_len->|<-----tomove------>| + 0 lo new_hi new_size + */ + memmove(buf + lo + bytes_len, buf + hi, + Py_SIZE(self) - hi); + } + if (PyByteArray_Resize((PyObject *)self, + Py_SIZE(self) + growth) < 0) { + /* Issue #19578: Handling the memory allocation failure here is + tricky here because the bytearray object has already been + modified. Depending on growth and lo, the behaviour is + different. + + If growth < 0 and lo != 0, the operation is completed, but a + MemoryError is still raised and the memory block is not + shrinked. Otherwise, the bytearray is restored in its previous + state and a MemoryError is raised. */ + if (lo == 0) { + self->ob_start += growth; return -1; - if (lo == 0) { - /* Shrink the buffer by advancing its logical start */ - self->ob_start -= growth; - /* - 0 lo hi old_size - | |<----avail----->|<-----tail------>| - | |<-bytes_len->|<-----tail------>| - 0 new_lo new_hi new_size - */ } - else { - /* - 0 lo hi old_size - | |<----avail----->|<-----tomove------>| - | |<-bytes_len->|<-----tomove------>| - 0 lo new_hi new_size - */ - memmove(buf + lo + bytes_len, buf + hi, - Py_SIZE(self) - hi); - } + /* memmove() removed bytes, the bytearray object cannot be + restored in its previous state. 
*/ + Py_SIZE(self) += growth; + res = -1; } - /* XXX(nnorwitz): need to verify this can't overflow! */ - if (PyByteArray_Resize( - (PyObject *)self, Py_SIZE(self) + growth) < 0) + buf = PyByteArray_AS_STRING(self); + } + else if (growth > 0) { + if (Py_SIZE(self) > (Py_ssize_t)PY_SSIZE_T_MAX - growth) { + PyErr_NoMemory(); return -1; + } + + if (PyByteArray_Resize((PyObject *)self, + Py_SIZE(self) + growth) < 0) { + return -1; + } buf = PyByteArray_AS_STRING(self); - if (growth > 0) { - /* Make the place for the additional bytes */ - /* - 0 lo hi old_size - | |<-avail->|<-----tomove------>| - | |<---bytes_len-->|<-----tomove------>| - 0 lo new_hi new_size - */ - memmove(buf + lo + bytes_len, buf + hi, - Py_SIZE(self) - lo - bytes_len); - } + /* Make the place for the additional bytes */ + /* + 0 lo hi old_size + | |<-avail->|<-----tomove------>| + | |<---bytes_len-->|<-----tomove------>| + 0 lo new_hi new_size + */ + memmove(buf + lo + bytes_len, buf + hi, + Py_SIZE(self) - lo - bytes_len); } if (bytes_len > 0) memcpy(buf + lo, bytes, bytes_len); - return 0; + return res; } static int diff --git a/Objects/listobject.c b/Objects/listobject.c --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -2483,6 +2483,7 @@ PyObject **garbage; size_t cur; Py_ssize_t i; + int res; if (slicelength <= 0) return 0; @@ -2533,14 +2534,14 @@ } Py_SIZE(self) -= slicelength; - list_resize(self, Py_SIZE(self)); + res = list_resize(self, Py_SIZE(self)); for (i = 0; i < slicelength; i++) { Py_DECREF(garbage[i]); } PyMem_FREE(garbage); - return 0; + return res; } else { /* assign slice */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 17:36:49 2013 From: python-checkins at python.org (richard.oudkerk) Date: Thu, 21 Nov 2013 17:36:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTk5?= =?utf-8?q?=3A_Use_a_separate_pool_for_test=5Fterminate=28=29=2E?= Message-ID: <3dQRHj3ChczSdj@mail.python.org> http://hg.python.org/cpython/rev/086865ceefe1 changeset: 87313:086865ceefe1 branch: 2.7 parent: 87306:03a32ead9c7d user: Richard Oudkerk date: Thu Nov 21 16:35:12 2013 +0000 summary: Issue #19599: Use a separate pool for test_terminate(). files: Lib/test/test_multiprocessing.py | 16 ++++------------ 1 files changed, 4 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_multiprocessing.py b/Lib/test/test_multiprocessing.py --- a/Lib/test/test_multiprocessing.py +++ b/Lib/test/test_multiprocessing.py @@ -1176,20 +1176,12 @@ p.join() def test_terminate(self): - if self.TYPE == 'manager': - # On Unix a forked process increfs each shared object to - # which its parent process held a reference. If the - # forked process gets terminated then there is likely to - # be a reference leak. So to prevent - # _TestZZZNumberOfObjects from failing we skip this test - # when using a manager. 
- return - - result = self.pool.map_async( + p = self.Pool(4) + result = p.map_async( time.sleep, [0.1 for i in range(10000)], chunksize=1 ) - self.pool.terminate() - join = TimingWrapper(self.pool.join) + p.terminate() + join = TimingWrapper(p.join) join() self.assertTrue(join.elapsed < 0.2) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 18:24:32 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 18:24:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Print_Tk_patch?= =?utf-8?q?level_in_Tk_and_Ttk_tests_in_verbose_mode_=28issue19654=29=2E?= Message-ID: <3dQSLm5CkGz7LjM@mail.python.org> http://hg.python.org/cpython/rev/cfbd894f1df1 changeset: 87314:cfbd894f1df1 branch: 3.3 parent: 87304:7b040bc289e8 user: Serhiy Storchaka date: Thu Nov 21 19:23:19 2013 +0200 summary: Print Tk patchlevel in Tk and Ttk tests in verbose mode (issue19654). files: Lib/tkinter/test/test_tkinter/test_widgets.py | 3 ++- Lib/tkinter/test/test_ttk/test_widgets.py | 3 ++- Lib/tkinter/test/widget_tests.py | 6 ++++++ 3 files changed, 10 insertions(+), 2 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -7,7 +7,8 @@ get_tk_patchlevel, widget_eq) from tkinter.test.widget_tests import ( add_standard_options, noconv, pixels_round, - AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -8,7 +8,8 @@ from tkinter.test.test_ttk.test_functions import MockTclObj from tkinter.test.support import tcl_version from tkinter.test.widget_tests import (add_standard_options, noconv, - AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -491,3 +491,9 @@ setattr(cls, methodname, test) return cls return decorator + +def setUpModule(): + import test.support + if test.support.verbose: + tcl = tkinter.Tcl() + print('patchlevel =', tcl.call('info', 'patchlevel')) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 18:24:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 18:24:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Print_Tk_patchlevel_in_Tk_and_Ttk_tests_in_verbose_mode_?= =?utf-8?q?=28issue19654=29=2E?= Message-ID: <3dQSLn73Tmz7Ljb@mail.python.org> http://hg.python.org/cpython/rev/cf8ac1272e07 changeset: 87315:cf8ac1272e07 parent: 87312:0c7202b64b29 parent: 87314:cfbd894f1df1 user: Serhiy Storchaka date: Thu Nov 21 19:23:50 2013 +0200 summary: Print Tk patchlevel in Tk and Ttk tests in verbose mode (issue19654). 
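(For context, not part of the change itself: the patchlevel printed by the new setUpModule() helper comes straight from the Tcl interpreter and can be queried on its own like this:)

    import tkinter

    tcl = tkinter.Tcl()                    # no Tk window needed
    print(tcl.call('info', 'patchlevel'))  # e.g. '8.6.1'; exact value varies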
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 3 ++- Lib/tkinter/test/test_ttk/test_widgets.py | 3 ++- Lib/tkinter/test/widget_tests.py | 6 ++++++ 3 files changed, 10 insertions(+), 2 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -7,7 +7,8 @@ get_tk_patchlevel, widget_eq) from tkinter.test.widget_tests import ( add_standard_options, noconv, pixels_round, - AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/tkinter/test/test_ttk/test_widgets.py --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/tkinter/test/test_ttk/test_widgets.py @@ -8,7 +8,8 @@ from tkinter.test.test_ttk.test_functions import MockTclObj from tkinter.test.support import tcl_version from tkinter.test.widget_tests import (add_standard_options, noconv, - AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests) + AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -491,3 +491,9 @@ setattr(cls, methodname, test) return cls return decorator + +def setUpModule(): + import test.support + if test.support.verbose: + tcl = tkinter.Tcl() + print('patchlevel =', tcl.call('info', 'patchlevel')) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 18:24:35 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Thu, 21 Nov 2013 18:24:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Print_Tk_patch?= =?utf-8?q?level_in_Tk_and_Ttk_tests_in_verbose_mode_=28issue19654=29=2E?= Message-ID: <3dQSLq1qxMz7M6C@mail.python.org> http://hg.python.org/cpython/rev/08f282c96fd1 changeset: 87316:08f282c96fd1 branch: 2.7 parent: 87313:086865ceefe1 user: Serhiy Storchaka date: Thu Nov 21 19:24:04 2013 +0200 summary: Print Tk patchlevel in Tk and Ttk tests in verbose mode (issue19654). 
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 3 ++- Lib/lib-tk/test/test_ttk/test_widgets.py | 3 ++- Lib/lib-tk/test/widget_tests.py | 6 ++++++ 3 files changed, 10 insertions(+), 2 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -8,7 +8,8 @@ from widget_tests import ( add_standard_options, noconv, noconv_meth, int_round, pixels_round, AbstractWidgetTest, StandardOptionsTests, - IntegerSizeTests, PixelSizeTests) + IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/lib-tk/test/test_ttk/test_widgets.py b/Lib/lib-tk/test/test_ttk/test_widgets.py --- a/Lib/lib-tk/test/test_ttk/test_widgets.py +++ b/Lib/lib-tk/test/test_ttk/test_widgets.py @@ -9,7 +9,8 @@ from support import tcl_version from widget_tests import (add_standard_options, noconv, noconv_meth, AbstractWidgetTest, StandardOptionsTests, - IntegerSizeTests, PixelSizeTests) + IntegerSizeTests, PixelSizeTests, + setUpModule) requires('gui') diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -510,3 +510,9 @@ setattr(cls, methodname, test) return cls return decorator + +def setUpModule(): + import test.test_support + if test.test_support.verbose: + tcl = Tkinter.Tcl() + print 'patchlevel =', tcl.call('info', 'patchlevel') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 20:08:45 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 21 Nov 2013 20:08:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Make_Semaphore?= =?utf-8?q?=280=29_work_properly=2E?= Message-ID: <3dQVg11dCRz7Llk@mail.python.org> http://hg.python.org/cpython/rev/4dd5de61e5b5 changeset: 87317:4dd5de61e5b5 parent: 87315:cf8ac1272e07 user: Guido van Rossum date: Thu Nov 21 11:07:45 2013 -0800 summary: asyncio: Make Semaphore(0) work properly. 
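(Illustration, not taken from the patch: after this change a semaphore created with value 0 starts out locked, so acquire() only succeeds once release() has been called. A minimal sketch in the generator-based coroutine style of this asyncio version:)

    import asyncio

    @asyncio.coroutine
    def demo():
        sem = asyncio.Semaphore(0)      # starts out locked after this change
        assert sem.locked()
        sem.release()                   # now a single acquire() can succeed
        yield from sem.acquire()
        print('acquired')

    loop = asyncio.get_event_loop()
    loop.run_until_complete(demo())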
files: Lib/asyncio/locks.py | 4 ++-- Lib/test/test_asyncio/test_locks.py | 4 ++++ 2 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py --- a/Lib/asyncio/locks.py +++ b/Lib/asyncio/locks.py @@ -348,12 +348,12 @@ def __init__(self, value=1, bound=False, *, loop=None): if value < 0: - raise ValueError("Semaphore initial value must be > 0") + raise ValueError("Semaphore initial value must be >= 0") self._value = value self._bound = bound self._bound_value = value self._waiters = collections.deque() - self._locked = False + self._locked = (value == 0) if loop is not None: self._loop = loop else: diff --git a/Lib/test/test_asyncio/test_locks.py b/Lib/test/test_asyncio/test_locks.py --- a/Lib/test/test_asyncio/test_locks.py +++ b/Lib/test/test_asyncio/test_locks.py @@ -684,6 +684,10 @@ finally: events.set_event_loop(None) + def test_initial_value_zero(self): + sem = locks.Semaphore(0, loop=self.loop) + self.assertTrue(sem.locked()) + def test_repr(self): sem = locks.Semaphore(loop=self.loop) self.assertTrue(repr(sem).endswith('[unlocked,value:1]>')) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 20:30:43 2013 From: python-checkins at python.org (guido.van.rossum) Date: Thu, 21 Nov 2013 20:30:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Better_behavior_when_stepp?= =?utf-8?q?ing_over_yield=5Bfrom=5D=2E_Fixes_issue_16596=2E_By_Xavier_de?= Message-ID: <3dQW8M31hMz7LjY@mail.python.org> http://hg.python.org/cpython/rev/95eea8624d05 changeset: 87318:95eea8624d05 user: Guido van Rossum date: Thu Nov 21 11:30:06 2013 -0800 summary: Better behavior when stepping over yield[from]. Fixes issue 16596. By Xavier de Gaye. files: Lib/bdb.py | 36 +++- Lib/pdb.py | 12 +- Lib/test/test_pdb.py | 305 +++++++++++++++++++++++++++++++ Python/ceval.c | 5 + 4 files changed, 351 insertions(+), 7 deletions(-) diff --git a/Lib/bdb.py b/Lib/bdb.py --- a/Lib/bdb.py +++ b/Lib/bdb.py @@ -3,6 +3,7 @@ import fnmatch import sys import os +from inspect import CO_GENERATOR __all__ = ["BdbQuit", "Bdb", "Breakpoint"] @@ -75,24 +76,48 @@ if not (self.stop_here(frame) or self.break_anywhere(frame)): # No need to trace this function return # None + # Ignore call events in generator except when stepping. + if self.stopframe and frame.f_code.co_flags & CO_GENERATOR: + return self.trace_dispatch self.user_call(frame, arg) if self.quitting: raise BdbQuit return self.trace_dispatch def dispatch_return(self, frame, arg): if self.stop_here(frame) or frame == self.returnframe: + # Ignore return events in generator except when stepping. + if self.stopframe and frame.f_code.co_flags & CO_GENERATOR: + return self.trace_dispatch try: self.frame_returning = frame self.user_return(frame, arg) finally: self.frame_returning = None if self.quitting: raise BdbQuit + # The user issued a 'next' or 'until' command. + if self.stopframe is frame and self.stoplineno != -1: + self._set_stopinfo(None, None) return self.trace_dispatch def dispatch_exception(self, frame, arg): if self.stop_here(frame): + # When stepping with next/until/return in a generator frame, skip + # the internal StopIteration exception (with no traceback) + # triggered by a subiterator run with the 'yield from' statement. 
+ if not (frame.f_code.co_flags & CO_GENERATOR + and arg[0] is StopIteration and arg[2] is None): + self.user_exception(frame, arg) + if self.quitting: raise BdbQuit + # Stop at the StopIteration or GeneratorExit exception when the user + # has set stopframe in a generator by issuing a return command, or a + # next/until command at the last statement in the generator before the + # exception. + elif (self.stopframe and frame is not self.stopframe + and self.stopframe.f_code.co_flags & CO_GENERATOR + and arg[0] in (StopIteration, GeneratorExit)): self.user_exception(frame, arg) if self.quitting: raise BdbQuit + return self.trace_dispatch # Normally derived classes don't override the following @@ -115,10 +140,8 @@ if self.stoplineno == -1: return False return frame.f_lineno >= self.stoplineno - while frame is not None and frame is not self.stopframe: - if frame is self.botframe: - return True - frame = frame.f_back + if not self.stopframe: + return True return False def break_here(self, frame): @@ -207,7 +230,10 @@ def set_return(self, frame): """Stop when returning from the given frame.""" - self._set_stopinfo(frame.f_back, frame) + if frame.f_code.co_flags & CO_GENERATOR: + self._set_stopinfo(frame, None, -1) + else: + self._set_stopinfo(frame.f_back, frame) def set_trace(self, frame=None): """Start debugging from `frame`. diff --git a/Lib/pdb.py b/Lib/pdb.py --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -297,8 +297,16 @@ return exc_type, exc_value, exc_traceback = exc_info frame.f_locals['__exception__'] = exc_type, exc_value - self.message(traceback.format_exception_only(exc_type, - exc_value)[-1].strip()) + + # An 'Internal StopIteration' exception is an exception debug event + # issued by the interpreter when handling a subgenerator run with + # 'yield from' or a generator controled by a for loop. No exception has + # actually occured in this case. The debugger uses this debug event to + # stop when the debuggee is returning from such generators. + prefix = 'Internal ' if (not exc_traceback + and exc_type is StopIteration) else '' + self.message('%s%s' % (prefix, + traceback.format_exception_only(exc_type, exc_value)[-1].strip())) self.interaction(frame, exc_traceback) # General interaction function diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -600,6 +600,311 @@ (Pdb) continue """ +def test_next_until_return_at_return_event(): + """Test that pdb stops after a next/until/return issued at a return debug event. + + >>> def test_function_2(): + ... x = 1 + ... x = 2 + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... test_function_2() + ... test_function_2() + ... test_function_2() + ... end = 1 + + >>> with PdbTestInput(['break test_function_2', + ... 'continue', + ... 'return', + ... 'next', + ... 'continue', + ... 'return', + ... 'until', + ... 'continue', + ... 'return', + ... 'return', + ... 'continue']): + ... 
test_function() + > (3)test_function() + -> test_function_2() + (Pdb) break test_function_2 + Breakpoint 1 at :1 + (Pdb) continue + > (2)test_function_2() + -> x = 1 + (Pdb) return + --Return-- + > (3)test_function_2()->None + -> x = 2 + (Pdb) next + > (4)test_function() + -> test_function_2() + (Pdb) continue + > (2)test_function_2() + -> x = 1 + (Pdb) return + --Return-- + > (3)test_function_2()->None + -> x = 2 + (Pdb) until + > (5)test_function() + -> test_function_2() + (Pdb) continue + > (2)test_function_2() + -> x = 1 + (Pdb) return + --Return-- + > (3)test_function_2()->None + -> x = 2 + (Pdb) return + > (6)test_function() + -> end = 1 + (Pdb) continue + """ + +def test_pdb_next_command_for_generator(): + """Testing skip unwindng stack on yield for generators for "next" command + + >>> def test_gen(): + ... yield 0 + ... return 1 + ... yield 2 + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... it = test_gen() + ... try: + ... assert next(it) == 0 + ... next(it) + ... except StopIteration as ex: + ... assert ex.value == 1 + ... print("finished") + + >>> with PdbTestInput(['step', + ... 'step', + ... 'step', + ... 'next', + ... 'next', + ... 'step', + ... 'step', + ... 'continue']): + ... test_function() + > (3)test_function() + -> it = test_gen() + (Pdb) step + > (4)test_function() + -> try: + (Pdb) step + > (5)test_function() + -> assert next(it) == 0 + (Pdb) step + --Call-- + > (1)test_gen() + -> def test_gen(): + (Pdb) next + > (2)test_gen() + -> yield 0 + (Pdb) next + > (3)test_gen() + -> return 1 + (Pdb) step + --Return-- + > (3)test_gen()->1 + -> return 1 + (Pdb) step + StopIteration: 1 + > (6)test_function() + -> next(it) + (Pdb) continue + finished + """ + +def test_pdb_return_command_for_generator(): + """Testing no unwindng stack on yield for generators + for "return" command + + >>> def test_gen(): + ... yield 0 + ... return 1 + ... yield 2 + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... it = test_gen() + ... try: + ... assert next(it) == 0 + ... next(it) + ... except StopIteration as ex: + ... assert ex.value == 1 + ... print("finished") + + >>> with PdbTestInput(['step', + ... 'step', + ... 'step', + ... 'return', + ... 'step', + ... 'step', + ... 'continue']): + ... test_function() + > (3)test_function() + -> it = test_gen() + (Pdb) step + > (4)test_function() + -> try: + (Pdb) step + > (5)test_function() + -> assert next(it) == 0 + (Pdb) step + --Call-- + > (1)test_gen() + -> def test_gen(): + (Pdb) return + StopIteration: 1 + > (6)test_function() + -> next(it) + (Pdb) step + > (7)test_function() + -> except StopIteration as ex: + (Pdb) step + > (8)test_function() + -> assert ex.value == 1 + (Pdb) continue + finished + """ + +def test_pdb_until_command_for_generator(): + """Testing no unwindng stack on yield for generators + for "until" command if target breakpoing is not reached + + >>> def test_gen(): + ... yield 0 + ... yield 1 + ... yield 2 + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... for i in test_gen(): + ... print(i) + ... print("finished") + + >>> with PdbTestInput(['step', + ... 'until 4', + ... 'step', + ... 'step', + ... 'continue']): + ... 
test_function() + > (3)test_function() + -> for i in test_gen(): + (Pdb) step + --Call-- + > (1)test_gen() + -> def test_gen(): + (Pdb) until 4 + 0 + 1 + > (4)test_gen() + -> yield 2 + (Pdb) step + --Return-- + > (4)test_gen()->2 + -> yield 2 + (Pdb) step + > (4)test_function() + -> print(i) + (Pdb) continue + 2 + finished + """ + +def test_pdb_next_command_in_generator_for_loop(): + """The next command on returning from a generator controled by a for loop. + + >>> def test_gen(): + ... yield 0 + ... return 1 + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... for i in test_gen(): + ... print('value', i) + ... x = 123 + + >>> with PdbTestInput(['break test_gen', + ... 'continue', + ... 'next', + ... 'next', + ... 'next', + ... 'continue']): + ... test_function() + > (3)test_function() + -> for i in test_gen(): + (Pdb) break test_gen + Breakpoint 6 at :1 + (Pdb) continue + > (2)test_gen() + -> yield 0 + (Pdb) next + value 0 + > (3)test_gen() + -> return 1 + (Pdb) next + Internal StopIteration: 1 + > (3)test_function() + -> for i in test_gen(): + (Pdb) next + > (5)test_function() + -> x = 123 + (Pdb) continue + """ + +def test_pdb_next_command_subiterator(): + """The next command in a generator with a subiterator. + + >>> def test_subgenerator(): + ... yield 0 + ... return 1 + + >>> def test_gen(): + ... x = yield from test_subgenerator() + ... return x + + >>> def test_function(): + ... import pdb; pdb.Pdb(nosigint=True).set_trace() + ... for i in test_gen(): + ... print('value', i) + ... x = 123 + + >>> with PdbTestInput(['step', + ... 'step', + ... 'next', + ... 'next', + ... 'next', + ... 'continue']): + ... test_function() + > (3)test_function() + -> for i in test_gen(): + (Pdb) step + --Call-- + > (1)test_gen() + -> def test_gen(): + (Pdb) step + > (2)test_gen() + -> x = yield from test_subgenerator() + (Pdb) next + value 0 + > (3)test_gen() + -> return x + (Pdb) next + Internal StopIteration: 1 + > (3)test_function() + -> for i in test_gen(): + (Pdb) next + > (5)test_function() + -> x = 123 + (Pdb) continue + """ + class PdbTestCase(unittest.TestCase): diff --git a/Python/ceval.c b/Python/ceval.c --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1904,6 +1904,9 @@ Py_DECREF(v); if (retval == NULL) { PyObject *val; + if (tstate->c_tracefunc != NULL + && PyErr_ExceptionMatches(PyExc_StopIteration)) + call_exc_trace(tstate->c_tracefunc, tstate->c_traceobj, f); err = _PyGen_FetchStopIterationValue(&val); if (err < 0) goto error; @@ -2654,6 +2657,8 @@ if (PyErr_Occurred()) { if (!PyErr_ExceptionMatches(PyExc_StopIteration)) goto error; + else if (tstate->c_tracefunc != NULL) + call_exc_trace(tstate->c_tracefunc, tstate->c_traceobj, f); PyErr_Clear(); } /* iterator ended normally */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 22:38:51 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 21 Nov 2013 22:38:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2318294=3A_Fix_the_?= =?utf-8?q?zlib_module_to_make_it_64-bit_safe?= Message-ID: <3dQZ0C5G8cz7Ljt@mail.python.org> http://hg.python.org/cpython/rev/f947fe289db8 changeset: 87319:f947fe289db8 user: Victor Stinner date: Thu Nov 21 22:33:21 2013 +0100 summary: Close #18294: Fix the zlib module to make it 64-bit safe files: Misc/NEWS | 2 + Modules/zlibmodule.c | 180 +++++++++++++++++++++--------- 2 files changed, 127 insertions(+), 55 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,8 @@ 
Library ------- +- Issue #18294: Fix the zlib module to make it 64-bit safe. + - Issue #19682: Fix compatibility issue with old version of OpenSSL that was introduced by Issue #18379. diff --git a/Modules/zlibmodule.c b/Modules/zlibmodule.c --- a/Modules/zlibmodule.c +++ b/Modules/zlibmodule.c @@ -176,7 +176,7 @@ if (!PyArg_ParseTuple(args, "y*|i:compress", &pinput, &level)) return NULL; - if (pinput.len > UINT_MAX) { + if ((size_t)pinput.len > UINT_MAX) { PyErr_SetString(PyExc_OverflowError, "Size does not fit in an unsigned int"); goto error; @@ -245,6 +245,45 @@ return ReturnVal; } +/*[python] + +class uint_converter(CConverter): + type = 'unsigned int' + converter = 'uint_converter' + +[python]*/ +/*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + +static int +uint_converter(PyObject *obj, void *ptr) +{ + long val; + unsigned long uval; + + val = PyLong_AsLong(obj); + if (val == -1 && PyErr_Occurred()) { + uval = PyLong_AsUnsignedLong(obj); + if (uval == (unsigned long)-1 && PyErr_Occurred()) + return 0; + if (uval > UINT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "Python int too large for C unsigned int"); + return 0; + } + } + else { + if (val < 0) { + PyErr_SetString(PyExc_ValueError, + "value must be positive"); + return 0; + } + uval = (unsigned long)val; + } + + *(unsigned int *)ptr = Py_SAFE_DOWNCAST(uval, unsigned long, unsigned int); + return 1; +} + PyDoc_STRVAR(decompress__doc__, "decompress(string[, wbits[, bufsize]]) -- Return decompressed string.\n" "\n" @@ -260,14 +299,14 @@ unsigned int length; int err; int wsize=DEF_WBITS; - Py_ssize_t r_strlen=DEFAULTALLOC; + unsigned int bufsize = DEFAULTALLOC, new_bufsize; z_stream zst; - if (!PyArg_ParseTuple(args, "y*|in:decompress", - &pinput, &wsize, &r_strlen)) + if (!PyArg_ParseTuple(args, "y*|iO&:decompress", + &pinput, &wsize, uint_converter, &bufsize)) return NULL; - if (pinput.len > UINT_MAX) { + if ((size_t)pinput.len > UINT_MAX) { PyErr_SetString(PyExc_OverflowError, "Size does not fit in an unsigned int"); goto error; @@ -275,13 +314,13 @@ input = pinput.buf; length = (unsigned int)pinput.len; - if (r_strlen <= 0) - r_strlen = 1; + if (bufsize == 0) + bufsize = 1; zst.avail_in = length; - zst.avail_out = r_strlen; + zst.avail_out = bufsize; - if (!(result_str = PyBytes_FromStringAndSize(NULL, r_strlen))) + if (!(result_str = PyBytes_FromStringAndSize(NULL, bufsize))) goto error; zst.opaque = NULL; @@ -326,14 +365,18 @@ /* fall through */ case(Z_OK): /* need more memory */ - if (_PyBytes_Resize(&result_str, r_strlen << 1) < 0) { + if (bufsize <= (UINT_MAX >> 1)) + new_bufsize = bufsize << 1; + else + new_bufsize = UINT_MAX; + if (_PyBytes_Resize(&result_str, new_bufsize) < 0) { inflateEnd(&zst); goto error; } zst.next_out = - (unsigned char *)PyBytes_AS_STRING(result_str) + r_strlen; - zst.avail_out = r_strlen; - r_strlen = r_strlen << 1; + (unsigned char *)PyBytes_AS_STRING(result_str) + bufsize; + zst.avail_out = bufsize; + bufsize = new_bufsize; break; default: inflateEnd(&zst); @@ -363,7 +406,7 @@ static PyObject * PyZlib_compressobj(PyObject *selfptr, PyObject *args, PyObject *kwargs) { - compobject *self; + compobject *self = NULL; int level=Z_DEFAULT_COMPRESSION, method=DEFLATED; int wbits=MAX_WBITS, memLevel=DEF_MEM_LEVEL, strategy=0, err; Py_buffer zdict; @@ -376,6 +419,12 @@ &memLevel, &strategy, &zdict)) return NULL; + if (zdict.buf != NULL && (size_t)zdict.len > UINT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "zdict length does not fit in an unsigned int"); + goto error; + } + self = 
newcompobject(&Comptype); if (self==NULL) goto error; @@ -391,7 +440,8 @@ if (zdict.buf == NULL) { goto success; } else { - err = deflateSetDictionary(&self->zst, zdict.buf, zdict.len); + err = deflateSetDictionary(&self->zst, + zdict.buf, (unsigned int)zdict.len); switch (err) { case (Z_OK): goto success; @@ -515,7 +565,7 @@ { int err; unsigned int inplen; - Py_ssize_t length = DEFAULTALLOC; + unsigned int length = DEFAULTALLOC, new_length; PyObject *RetVal = NULL; Py_buffer pinput; Byte *input; @@ -523,13 +573,13 @@ if (!PyArg_ParseTuple(args, "y*:compress", &pinput)) return NULL; - if (pinput.len > UINT_MAX) { + if ((size_t)pinput.len > UINT_MAX) { PyErr_SetString(PyExc_OverflowError, "Size does not fit in an unsigned int"); goto error_outer; } input = pinput.buf; - inplen = pinput.len; + inplen = (unsigned int)pinput.len; if (!(RetVal = PyBytes_FromStringAndSize(NULL, length))) goto error_outer; @@ -549,14 +599,18 @@ /* while Z_OK and the output buffer is full, there might be more output, so extend the output buffer and try again */ while (err == Z_OK && self->zst.avail_out == 0) { - if (_PyBytes_Resize(&RetVal, length << 1) < 0) { + if (length <= (UINT_MAX >> 1)) + new_length = length << 1; + else + new_length = UINT_MAX; + if (_PyBytes_Resize(&RetVal, new_length) < 0) { Py_CLEAR(RetVal); goto error; } self->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal) + length; self->zst.avail_out = length; - length = length << 1; + length = new_length; Py_BEGIN_ALLOW_THREADS err = deflate(&(self->zst), Z_NO_FLUSH); @@ -596,7 +650,7 @@ Py_ssize_t old_size = PyBytes_GET_SIZE(self->unused_data); Py_ssize_t new_size; PyObject *new_data; - if ((Py_ssize_t)self->zst.avail_in > PY_SSIZE_T_MAX - old_size) { + if ((size_t)self->zst.avail_in > (size_t)UINT_MAX - (size_t)old_size) { PyErr_NoMemory(); return -1; } @@ -636,7 +690,7 @@ data: Py_buffer The binary data to decompress. - max_length: int = 0 + max_length: uint = 0 The maximum allowable length of the decompressed data. Unconsumed input data will be stored in the unconsumed_tail attribute. 
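At the Python level, the effect of uint_converter and the unsigned bufsize/max_length handling is simply that these arguments now accept any non-negative value that fits in a C unsigned int; behaviour is otherwise unchanged. A minimal sketch of the decompress()/unconsumed_tail pattern this code path serves (standard zlib API, nothing here is specific to the patch):

    import zlib

    data = zlib.compress(b"spam" * 1000)
    d = zlib.decompressobj()
    chunk = d.decompress(data, 64)          # max_length: at most 64 bytes returned
    rest = d.decompress(d.unconsumed_tail)  # continue with the unconsumed input
    assert chunk + rest == b"spam" * 1000
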
@@ -668,18 +722,18 @@ {"decompress", (PyCFunction)zlib_Decompress_decompress, METH_VARARGS, zlib_Decompress_decompress__doc__}, static PyObject * -zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, int max_length); +zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length); static PyObject * zlib_Decompress_decompress(PyObject *self, PyObject *args) { PyObject *return_value = NULL; Py_buffer data; - int max_length = 0; + unsigned int max_length = 0; if (!PyArg_ParseTuple(args, - "y*|i:decompress", - &data, &max_length)) + "y*|O&:decompress", + &data, uint_converter, &max_length)) goto exit; return_value = zlib_Decompress_decompress_impl(self, &data, max_length); @@ -691,29 +745,20 @@ } static PyObject * -zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, int max_length) -/*[clinic checksum: bfac7a0f07e891869d87c665a76dc2611014420f]*/ +zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length) +/*[clinic checksum: 76ca9259e3f5ca86bae9da3d0e75637b5d492234]*/ { compobject *zself = (compobject *)self; int err; - unsigned int inplen; - Py_ssize_t old_length, length = DEFAULTALLOC; + unsigned int old_length, length = DEFAULTALLOC; PyObject *RetVal = NULL; - Byte *input; unsigned long start_total_out; - if (data->len > UINT_MAX) { + if ((size_t)data->len > UINT_MAX) { PyErr_SetString(PyExc_OverflowError, "Size does not fit in an unsigned int"); return NULL; } - input = data->buf; - inplen = data->len; - if (max_length < 0) { - PyErr_SetString(PyExc_ValueError, - "max_length must be greater than zero"); - return NULL; - } /* limit amount of data allocated to max_length */ if (max_length && length > max_length) @@ -724,8 +769,8 @@ ENTER_ZLIB(zself); start_total_out = zself->zst.total_out; - zself->zst.avail_in = inplen; - zself->zst.next_in = input; + zself->zst.avail_in = (unsigned int)data->len; + zself->zst.next_in = data->buf; zself->zst.avail_out = length; zself->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal); @@ -740,12 +785,21 @@ RetVal = NULL; goto error; } - err = inflateSetDictionary(&(zself->zst), zdict_buf.buf, zdict_buf.len); + + if ((size_t)zdict_buf.len > UINT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "zdict length does not fit in an unsigned int"); + PyBuffer_Release(&zdict_buf); + Py_CLEAR(RetVal); + goto error; + } + + err = inflateSetDictionary(&(zself->zst), + zdict_buf.buf, (unsigned int)zdict_buf.len); PyBuffer_Release(&zdict_buf); if (err != Z_OK) { zlib_error(zself->zst, err, "while decompressing data"); - Py_DECREF(RetVal); - RetVal = NULL; + Py_CLEAR(RetVal); goto error; } /* Repeat the call to inflate. 
*/ @@ -824,7 +878,8 @@ static PyObject * PyZlib_flush(compobject *self, PyObject *args) { - int err, length = DEFAULTALLOC; + int err; + unsigned int length = DEFAULTALLOC, new_length; PyObject *RetVal; int flushmode = Z_FINISH; unsigned long start_total_out; @@ -855,14 +910,18 @@ /* while Z_OK and the output buffer is full, there might be more output, so extend the output buffer and try again */ while (err == Z_OK && self->zst.avail_out == 0) { - if (_PyBytes_Resize(&RetVal, length << 1) < 0) { + if (length <= (UINT_MAX >> 1)) + new_length = length << 1; + else + new_length = UINT_MAX; + if (_PyBytes_Resize(&RetVal, new_length) < 0) { Py_CLEAR(RetVal); goto error; } self->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal) + length; self->zst.avail_out = length; - length = length << 1; + length = new_length; Py_BEGIN_ALLOW_THREADS err = deflate(&(self->zst), flushmode); @@ -1041,24 +1100,31 @@ static PyObject * PyZlib_unflush(compobject *self, PyObject *args) { - int err, length = DEFAULTALLOC; + int err; + unsigned int length = DEFAULTALLOC, new_length; PyObject * retval = NULL; unsigned long start_total_out; + Py_ssize_t size; - if (!PyArg_ParseTuple(args, "|i:flush", &length)) + if (!PyArg_ParseTuple(args, "|O&:flush", uint_converter, &length)) return NULL; - if (length <= 0) { + if (length == 0) { PyErr_SetString(PyExc_ValueError, "length must be greater than zero"); return NULL; } + if (!(retval = PyBytes_FromStringAndSize(NULL, length))) return NULL; ENTER_ZLIB(self); + size = PyBytes_GET_SIZE(self->unconsumed_tail); + start_total_out = self->zst.total_out; - self->zst.avail_in = PyBytes_GET_SIZE(self->unconsumed_tail); + /* save_unconsumed_input() ensures that unconsumed_tail length is lesser + or equal than UINT_MAX */ + self->zst.avail_in = Py_SAFE_DOWNCAST(size, Py_ssize_t, unsigned int); self->zst.next_in = (Byte *)PyBytes_AS_STRING(self->unconsumed_tail); self->zst.avail_out = length; self->zst.next_out = (Byte *)PyBytes_AS_STRING(retval); @@ -1070,13 +1136,17 @@ /* while Z_OK and the output buffer is full, there might be more output, so extend the output buffer and try again */ while ((err == Z_OK || err == Z_BUF_ERROR) && self->zst.avail_out == 0) { - if (_PyBytes_Resize(&retval, length << 1) < 0) { + if (length <= (UINT_MAX >> 1)) + new_length = length << 1; + else + new_length = UINT_MAX; + if (_PyBytes_Resize(&retval, new_length) < 0) { Py_CLEAR(retval); goto error; } self->zst.next_out = (Byte *)PyBytes_AS_STRING(retval) + length; self->zst.avail_out = length; - length = length << 1; + length = new_length; Py_BEGIN_ALLOW_THREADS err = inflate(&(self->zst), Z_FINISH); @@ -1168,7 +1238,7 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. adler32() takes length as an unsigned int, which may be narrower than Py_ssize_t. */ - while (len > (size_t) UINT_MAX) { + while ((size_t)len > UINT_MAX) { adler32val = adler32(adler32val, buf, UINT_MAX); buf += (size_t) UINT_MAX; len -= (size_t) UINT_MAX; @@ -1206,7 +1276,7 @@ Py_BEGIN_ALLOW_THREADS /* Avoid truncation of length for very large buffers. crc32() takes length as an unsigned int, which may be narrower than Py_ssize_t. 
*/ - while (len > (size_t) UINT_MAX) { + while ((size_t)len > UINT_MAX) { crc32val = crc32(crc32val, buf, UINT_MAX); buf += (size_t) UINT_MAX; len -= (size_t) UINT_MAX; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 22:55:28 2013 From: python-checkins at python.org (victor.stinner) Date: Thu, 21 Nov 2013 22:55:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_Charles-Fran=C3=A7?= =?utf-8?q?ois_accepted_the_PEP?= Message-ID: <3dQZMN0pBXzPCm@mail.python.org> http://hg.python.org/peps/rev/1a066168e74e changeset: 5310:1a066168e74e user: Victor Stinner date: Thu Nov 21 22:55:10 2013 +0100 summary: PEP 454: Charles-Fran?ois accepted the PEP files: pep-0454.txt | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -3,11 +3,13 @@ Version: $Revision$ Last-Modified: $Date$ Author: Victor Stinner -Status: Draft +BDFL-Delegate: Charles-Fran?ois Natali +Status: Accepted Type: Standards Track Content-Type: text/x-rst Created: 3-September-2013 Python-Version: 3.4 +Resolution: https://mail.python.org/pipermail/python-dev/2013-November/130491.html Abstract -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Thu Nov 21 23:56:23 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 23:56:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=238813=3A_Add_SSLCo?= =?utf-8?q?ntext=2Everify=5Fflags_to_change_the_verification_flags?= Message-ID: <3dQbjg5WxXz7M7s@mail.python.org> http://hg.python.org/cpython/rev/83805c9d1f05 changeset: 87320:83805c9d1f05 user: Christian Heimes date: Thu Nov 21 23:56:13 2013 +0100 summary: Issue #8813: Add SSLContext.verify_flags to change the verification flags of the context in order to enable certification revocation list (CRL) checks or strict X509 rules. files: Doc/library/ssl.rst | 45 ++++++++++++++++++ Lib/ssl.py | 2 + Lib/test/make_ssl_certs.py | 6 ++ Lib/test/revocation.crl | 11 ++++ Lib/test/test_ssl.py | 63 +++++++++++++++++++++++++- Misc/NEWS | 4 + Modules/_ssl.c | 49 ++++++++++++++++++++ 7 files changed, 179 insertions(+), 1 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -423,6 +423,38 @@ be passed, either to :meth:`SSLContext.load_verify_locations` or as a value of the ``ca_certs`` parameter to :func:`wrap_socket`. +.. data:: VERIFY_DEFAULT + + Possible value for :attr:`SSLContext.verify_flags`. In this mode, + certificate revocation lists (CRLs) are not checked. By default OpenSSL + does neither require nor verify CRLs. + + .. versionadded:: 3.4 + +.. data:: VERIFY_CRL_CHECK_LEAF + + Possible value for :attr:`SSLContext.verify_flags`. In this mode, only the + peer cert is check but non of the intermediate CA certificates. The mode + requires a valid CRL that is signed by the peer cert's issuer (its direct + ancestor CA). If no proper has been loaded + :attr:`SSLContext.load_verify_locations`, validation will fail. + + .. versionadded:: 3.4 + +.. data:: VERIFY_CRL_CHECK_CHAIN + + Possible value for :attr:`SSLContext.verify_flags`. In this mode, CRLs of + all certificates in the peer cert chain are checked. + + .. versionadded:: 3.4 + +.. data:: VERIFY_X509_STRICT + + Possible value for :attr:`SSLContext.verify_flags` to disable workarounds + for broken X.509 certificates. + + .. versionadded:: 3.4 + .. 
data:: PROTOCOL_SSLv2 Selects SSL version 2 as the channel encryption protocol. @@ -862,6 +894,10 @@ other peers' certificates when :data:`verify_mode` is other than :data:`CERT_NONE`. At least one of *cafile* or *capath* must be specified. + This method can also load certification revocation lists (CRLs) in PEM or + or DER format. In order to make use of CRLs, :attr:`SSLContext.verify_flags` + must be configured properly. + The *cafile* string, if present, is the path to a file of concatenated CA certificates in PEM format. See the discussion of :ref:`ssl-certificates` for more information about how to arrange the @@ -880,6 +916,7 @@ .. versionchanged:: 3.4 New optional argument *cadata* + .. method:: SSLContext.get_ca_certs(binary_form=False) Get a list of loaded "certification authority" (CA) certificates. If the @@ -1057,6 +1094,14 @@ The protocol version chosen when constructing the context. This attribute is read-only. +.. attribute:: SSLContext.verify_flags + + The flags for certificate verification operations. You can set flags like + :data:`VERIFY_CRL_CHECK_LEAF` by ORing them together. By default OpenSSL + does neither require nor verify certificate revocation lists (CRLs). + + .. versionadded:: 3.4 + .. attribute:: SSLContext.verify_mode Whether to try to verify other peers' certificates and how to behave diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -102,6 +102,8 @@ SSLSyscallError, SSLEOFError, ) from _ssl import CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED +from _ssl import (VERIFY_DEFAULT, VERIFY_CRL_CHECK_LEAF, VERIFY_CRL_CHECK_CHAIN, + VERIFY_X509_STRICT) from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj from _ssl import RAND_status, RAND_egd, RAND_add, RAND_bytes, RAND_pseudo_bytes diff --git a/Lib/test/make_ssl_certs.py b/Lib/test/make_ssl_certs.py --- a/Lib/test/make_ssl_certs.py +++ b/Lib/test/make_ssl_certs.py @@ -28,8 +28,10 @@ [ CA_default ] dir = cadir database = $dir/index.txt + crlnumber = $dir/crl.txt default_md = sha1 default_days = 3600 + default_crl_days = 3600 certificate = pycacert.pem private_key = pycakey.pem serial = $dir/serial @@ -112,6 +114,8 @@ os.mkdir(TMP_CADIR) with open(os.path.join('cadir','index.txt'),'a+') as f: pass # empty file + with open(os.path.join('cadir','crl.txt'),'a+') as f: + r.write("00") with open(os.path.join('cadir','index.txt.attr'),'w+') as f: f.write('unique_subject = no') @@ -129,6 +133,8 @@ '-keyfile', 'pycakey.pem', '-days', '3650', '-selfsign', '-extensions', 'v3_ca', '-infiles', f.name ] check_call(['openssl'] + args) + args = ['ca', '-config', t.name, '-gencrl', '-out', 'revocation.crl'] + check_call(['openssl'] + args) if __name__ == '__main__': os.chdir(here) diff --git a/Lib/test/revocation.crl b/Lib/test/revocation.crl new file mode 100644 --- /dev/null +++ b/Lib/test/revocation.crl @@ -0,0 +1,11 @@ +-----BEGIN X509 CRL----- +MIIBpjCBjwIBATANBgkqhkiG9w0BAQUFADBNMQswCQYDVQQGEwJYWTEmMCQGA1UE +CgwdUHl0aG9uIFNvZnR3YXJlIEZvdW5kYXRpb24gQ0ExFjAUBgNVBAMMDW91ci1j +YS1zZXJ2ZXIXDTEzMTEyMTE3MDg0N1oXDTIzMDkzMDE3MDg0N1qgDjAMMAoGA1Ud +FAQDAgEAMA0GCSqGSIb3DQEBBQUAA4IBAQCNJXC2mVKauEeN3LlQ3ZtM5gkH3ExH ++i4bmJjtJn497WwvvoIeUdrmVXgJQR93RtV37hZwN0SXMLlNmUZPH4rHhihayw4m +unCzVj/OhCCY7/TPjKuJ1O/0XhaLBpBVjQN7R/1ujoRKbSia/CD3vcn7Fqxzw7LK +fSRCKRGTj1CZiuxrphtFchwALXSiFDy9mr2ZKhImcyq1PydfgEzU78APpOkMQsIC +UNJ/cf3c9emzf+dUtcMEcejQ3mynBo4eIGg1EW42bz4q4hSjzQlKcBV0muw5qXhc +HOxH2iTFhQ7SrvVuK/dM14rYM4B5mSX3nRC1kNmXpS9j3wJDhuwmjHed +-----END X509 CRL----- diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- 
a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -48,6 +48,9 @@ CAFILE_CACERT = data_file("capath", "5ed36f99.0") +# empty CRL +CRLFILE = data_file("revocation.crl") + # Two keys and certs signed by the same CA (for SNI tests) SIGNED_CERTFILE = data_file("keycert3.pem") SIGNED_CERTFILE2 = data_file("keycert4.pem") @@ -631,7 +634,7 @@ with self.assertRaises(ValueError): ctx.options = 0 - def test_verify(self): + def test_verify_mode(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) # Default value self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) @@ -646,6 +649,23 @@ with self.assertRaises(ValueError): ctx.verify_mode = 42 + def test_verify_flags(self): + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + # default value by OpenSSL + self.assertEqual(ctx.verify_flags, ssl.VERIFY_DEFAULT) + ctx.verify_flags = ssl.VERIFY_CRL_CHECK_LEAF + self.assertEqual(ctx.verify_flags, ssl.VERIFY_CRL_CHECK_LEAF) + ctx.verify_flags = ssl.VERIFY_CRL_CHECK_CHAIN + self.assertEqual(ctx.verify_flags, ssl.VERIFY_CRL_CHECK_CHAIN) + ctx.verify_flags = ssl.VERIFY_DEFAULT + self.assertEqual(ctx.verify_flags, ssl.VERIFY_DEFAULT) + # supports any value + ctx.verify_flags = ssl.VERIFY_CRL_CHECK_LEAF | ssl.VERIFY_X509_STRICT + self.assertEqual(ctx.verify_flags, + ssl.VERIFY_CRL_CHECK_LEAF | ssl.VERIFY_X509_STRICT) + with self.assertRaises(TypeError): + ctx.verify_flags = None + def test_load_cert_chain(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) # Combined key and cert in a single file @@ -1771,6 +1791,47 @@ self.assertLess(before, after) s.close() + def test_crl_check(self): + if support.verbose: + sys.stdout.write("\n") + + server_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + server_context.load_cert_chain(SIGNED_CERTFILE) + + context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + context.verify_mode = ssl.CERT_REQUIRED + context.load_verify_locations(SIGNING_CA) + context.verify_mode = ssl.CERT_REQUIRED + context.verify_flags = ssl.VERIFY_DEFAULT + + # VERIFY_DEFAULT should pass + server = ThreadedEchoServer(context=server_context, chatty=True) + with server: + with context.wrap_socket(socket.socket()) as s: + s.connect((HOST, server.port)) + cert = s.getpeercert() + self.assertTrue(cert, "Can't get peer certificate.") + + # VERIFY_CRL_CHECK_LEAF without a loaded CRL file fails + context.verify_flags = ssl.VERIFY_CRL_CHECK_LEAF + + server = ThreadedEchoServer(context=server_context, chatty=True) + with server: + with context.wrap_socket(socket.socket()) as s: + with self.assertRaisesRegex(ssl.SSLError, + "certificate verify failed"): + s.connect((HOST, server.port)) + + # now load a CRL file. The CRL file is signed by the CA. + context.load_verify_locations(CRLFILE) + + server = ThreadedEchoServer(context=server_context, chatty=True) + with server: + with context.wrap_socket(socket.socket()) as s: + s.connect((HOST, server.port)) + cert = s.getpeercert() + self.assertTrue(cert, "Can't get peer certificate.") + def test_empty_cert(self): """Connecting with an empty cert file""" bad_cert_test(os.path.join(os.path.dirname(__file__) or os.curdir, diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,10 @@ Library ------- +- Issue #8813: Add SSLContext.verify_flags to change the verification flags + of the context in order to enable certification revocation list (CRL) + checks or strict X509 rules. + - Issue #18294: Fix the zlib module to make it 64-bit safe. 
- Issue #19682: Fix compatibility issue with old version of OpenSSL that diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -2231,6 +2231,44 @@ } static PyObject * +get_verify_flags(PySSLContext *self, void *c) +{ + X509_STORE *store; + unsigned long flags; + + store = SSL_CTX_get_cert_store(self->ctx); + flags = X509_VERIFY_PARAM_get_flags(store->param); + return PyLong_FromUnsignedLong(flags); +} + +static int +set_verify_flags(PySSLContext *self, PyObject *arg, void *c) +{ + X509_STORE *store; + unsigned long new_flags, flags, set, clear; + + if (!PyArg_Parse(arg, "k", &new_flags)) + return -1; + store = SSL_CTX_get_cert_store(self->ctx); + flags = X509_VERIFY_PARAM_get_flags(store->param); + clear = flags & ~new_flags; + set = ~flags & new_flags; + if (clear) { + if (!X509_VERIFY_PARAM_clear_flags(store->param, clear)) { + _setSSLError(NULL, 0, __FILE__, __LINE__); + return -1; + } + } + if (set) { + if (!X509_VERIFY_PARAM_set_flags(store->param, set)) { + _setSSLError(NULL, 0, __FILE__, __LINE__); + return -1; + } + } + return 0; +} + +static PyObject * get_options(PySSLContext *self, void *c) { return PyLong_FromLong(SSL_CTX_get_options(self->ctx)); @@ -3048,6 +3086,8 @@ static PyGetSetDef context_getsetlist[] = { {"options", (getter) get_options, (setter) set_options, NULL}, + {"verify_flags", (getter) get_verify_flags, + (setter) set_verify_flags, NULL}, {"verify_mode", (getter) get_verify_mode, (setter) set_verify_mode, NULL}, {NULL}, /* sentinel */ @@ -3761,6 +3801,15 @@ PY_SSL_CERT_OPTIONAL); PyModule_AddIntConstant(m, "CERT_REQUIRED", PY_SSL_CERT_REQUIRED); + /* CRL verification for verification_flags */ + PyModule_AddIntConstant(m, "VERIFY_DEFAULT", + 0); + PyModule_AddIntConstant(m, "VERIFY_CRL_CHECK_LEAF", + X509_V_FLAG_CRL_CHECK); + PyModule_AddIntConstant(m, "VERIFY_CRL_CHECK_CHAIN", + X509_V_FLAG_CRL_CHECK|X509_V_FLAG_CRL_CHECK_ALL); + PyModule_AddIntConstant(m, "VERIFY_X509_STRICT", + X509_V_FLAG_X509_STRICT); #ifdef _MSC_VER /* Windows dwCertEncodingType */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 21 23:57:56 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 21 Nov 2013 23:57:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_lst_might_be_NULL_here?= Message-ID: <3dQblS6GL9z7M7t@mail.python.org> http://hg.python.org/cpython/rev/4d63ee2ff753 changeset: 87321:4d63ee2ff753 user: Christian Heimes date: Thu Nov 21 23:57:49 2013 +0100 summary: lst might be NULL here CID 1130752: Dereference after null check (FORWARD_NULL) files: Modules/_ssl.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -1014,7 +1014,7 @@ fail: AUTHORITY_INFO_ACCESS_free(info); - Py_DECREF(lst); + Py_XDECREF(lst); return NULL; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 00:21:15 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 00:21:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Update_list_of_accepted_but_n?= =?utf-8?q?ot_yet_implemented_or_merged_peps=2E?= Message-ID: <3dQcGM6M5pz7M8F@mail.python.org> http://hg.python.org/peps/rev/ea82fe7aa1ff changeset: 5311:ea82fe7aa1ff user: Barry Warsaw date: Thu Nov 21 18:21:06 2013 -0500 summary: Update list of accepted but not yet implemented or merged peps. 
files: pep-0429.txt | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -74,6 +74,15 @@ * PEP 446, make newly created file descriptors not inherited by default * PEP 450, basic statistics module for the standard library +Accepted but not yet implemented/merged: + +* PEP 428, the pathlib module -- object-oriented filesystem paths +* PEP 451, a ModuleSpec Type for the Import System +* PEP 453, pip bootstrapping/bundling with CPython +* PEP 454, the tracemalloc module for tracing Python memory allocations +* PEP 3154, Pickle protocol revision 4 +* PEP 3156, improved asynchronous IO support + Other final large-scale changes: * None so far @@ -85,12 +94,7 @@ * PEP 441, improved Python zip application support * PEP 447, support for __locallookup__ metaclass method * PEP 448, additional unpacking generalizations -* PEP 451, making custom import hooks easier to implement correctly -* PEP 453, pip bootstrapping/bundling with CPython -* PEP 454, PEP 445 based malloc tracing support * PEP 455, key transforming dictionary -* PEP 3154, Pickle protocol revision 4 -* PEP 3156, improved asynchronous IO support Other proposed large-scale changes: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 22 00:34:37 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 00:34:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_downcast_len_to_int=2E_The?= =?utf-8?q?_code_has_already_checked_that_len_=3C_INT=5FMAX?= Message-ID: <3dQcYn5P0lzQQj@mail.python.org> http://hg.python.org/cpython/rev/97b84261b794 changeset: 87322:97b84261b794 user: Christian Heimes date: Fri Nov 22 00:34:18 2013 +0100 summary: downcast len to int. 
The code has already checked that len < INT_MAX files: Modules/_ssl.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -2520,7 +2520,7 @@ return -1; } - biobuf = BIO_new_mem_buf(data, len); + biobuf = BIO_new_mem_buf(data, (int)len); if (biobuf == NULL) { _setSSLError("Can't allocate buffer", 0, __FILE__, __LINE__); return -1; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 00:39:46 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 00:39:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_one_CERT=5FREQUIRED_is_eno?= =?utf-8?q?ugh?= Message-ID: <3dQcgk5YpXz7LkR@mail.python.org> http://hg.python.org/cpython/rev/0cfc5d0c9e77 changeset: 87323:0cfc5d0c9e77 user: Christian Heimes date: Fri Nov 22 00:39:38 2013 +0100 summary: one CERT_REQUIRED is enough files: Lib/test/test_ssl.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1801,7 +1801,6 @@ context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) context.verify_mode = ssl.CERT_REQUIRED context.load_verify_locations(SIGNING_CA) - context.verify_mode = ssl.CERT_REQUIRED context.verify_flags = ssl.VERIFY_DEFAULT # VERIFY_DEFAULT should pass -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 00:46:28 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 00:46:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_silence_an_overflow_warnin?= =?utf-8?q?g=2E_slen_is_smaller_than_1MB?= Message-ID: <3dQcqS1X2zz7M6j@mail.python.org> http://hg.python.org/cpython/rev/ef03369385ba changeset: 87324:ef03369385ba user: Christian Heimes date: Fri Nov 22 00:46:18 2013 +0100 summary: silence an overflow warning. slen is smaller than 1MB files: Modules/pyexpat.c | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Modules/pyexpat.c b/Modules/pyexpat.c --- a/Modules/pyexpat.c +++ b/Modules/pyexpat.c @@ -835,7 +835,8 @@ s += MAX_CHUNK_SIZE; slen -= MAX_CHUNK_SIZE; } - rc = XML_Parse(self->itself, s, slen, isFinal); + assert(MAX_CHUNK_SIZE < INT_MAX && slen < INT_MAX); + rc = XML_Parse(self->itself, s, (int)slen, isFinal); done: if (view.buf != NULL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 00:57:49 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 00:57:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_-_Issue_=2319555=3A_Restor?= =?utf-8?q?e_sysconfig=2Eget=5Fconfig=5Fvar=28=27SO=27=29=2C_with_a?= Message-ID: <3dQd4Y5cLkz7M75@mail.python.org> http://hg.python.org/cpython/rev/fedc2b8fbb6e changeset: 87325:fedc2b8fbb6e parent: 87321:4d63ee2ff753 user: Barry Warsaw date: Thu Nov 21 18:57:14 2013 -0500 summary: - Issue #19555: Restore sysconfig.get_config_var('SO'), with a DeprecationWarning pointing people at $EXT_SUFFIX. 
files: Lib/sysconfig.py | 7 +++++++ Lib/test/test_sysconfig.py | 19 +++++++++++++++++++ Misc/NEWS | 3 +++ 3 files changed, 29 insertions(+), 0 deletions(-) diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py --- a/Lib/sysconfig.py +++ b/Lib/sysconfig.py @@ -409,6 +409,10 @@ # _sysconfigdata is generated at build time, see _generate_posix_vars() from _sysconfigdata import build_time_vars vars.update(build_time_vars) + # For backward compatibility, see issue19555 + SO = build_time_vars.get('EXT_SUFFIX') + if SO is not None: + vars['SO'] = SO def _init_non_posix(vars): """Initialize the module as appropriate for NT""" @@ -579,6 +583,9 @@ Equivalent to get_config_vars().get(name) """ + if name == 'SO': + import warnings + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) return get_config_vars().get(name) diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -369,6 +369,25 @@ os.chdir(cwd) self.assertEqual(srcdir, srcdir2) + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_deprecation(self): + self.assertWarns(DeprecationWarning, + sysconfig.get_config_var, 'SO') + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_value(self): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_in_vars(self): + vars = sysconfig.get_config_vars() + self.assertIsNotNone(vars['SO']) + self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) + class MakefileTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #19555: Restore sysconfig.get_config_var('SO'), with a + DeprecationWarning pointing people at $EXT_SUFFIX. + - Issue #8813: Add SSLContext.verify_flags to change the verification flags of the context in order to enable certification revocation list (CRL) checks or strict X509 rules. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 00:57:51 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 00:57:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_trunk_merge?= Message-ID: <3dQd4b0V9Vz7M7g@mail.python.org> http://hg.python.org/cpython/rev/f0090ff52128 changeset: 87326:f0090ff52128 parent: 87325:fedc2b8fbb6e parent: 87324:ef03369385ba user: Barry Warsaw date: Thu Nov 21 18:57:41 2013 -0500 summary: trunk merge files: Lib/test/test_ssl.py | 1 - Modules/_ssl.c | 2 +- Modules/pyexpat.c | 3 ++- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1801,7 +1801,6 @@ context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) context.verify_mode = ssl.CERT_REQUIRED context.load_verify_locations(SIGNING_CA) - context.verify_mode = ssl.CERT_REQUIRED context.verify_flags = ssl.VERIFY_DEFAULT # VERIFY_DEFAULT should pass diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -2520,7 +2520,7 @@ return -1; } - biobuf = BIO_new_mem_buf(data, len); + biobuf = BIO_new_mem_buf(data, (int)len); if (biobuf == NULL) { _setSSLError("Can't allocate buffer", 0, __FILE__, __LINE__); return -1; diff --git a/Modules/pyexpat.c b/Modules/pyexpat.c --- a/Modules/pyexpat.c +++ b/Modules/pyexpat.c @@ -835,7 +835,8 @@ s += MAX_CHUNK_SIZE; slen -= MAX_CHUNK_SIZE; } - rc = XML_Parse(self->itself, s, slen, isFinal); + assert(MAX_CHUNK_SIZE < INT_MAX && slen < INT_MAX); + rc = XML_Parse(self->itself, s, (int)slen, isFinal); done: if (view.buf != NULL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 01:18:04 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 01:18:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319664=3A_test=5Fu?= =?utf-8?q?serdict=27s_repr_test_no_longer_depends_on_the_order?= Message-ID: <3dQdWw0rDfz7M7S@mail.python.org> http://hg.python.org/cpython/rev/a0ec33b83ba4 changeset: 87327:a0ec33b83ba4 parent: 87324:ef03369385ba user: Christian Heimes date: Fri Nov 22 01:16:56 2013 +0100 summary: Issue #19664: test_userdict's repr test no longer depends on the order of dict elements. Original patch by Serhiy Storchaka files: Lib/test/test_userdict.py | 3 ++- Misc/NEWS | 3 +++ 2 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_userdict.py b/Lib/test/test_userdict.py --- a/Lib/test/test_userdict.py +++ b/Lib/test/test_userdict.py @@ -45,7 +45,8 @@ # Test __repr__ self.assertEqual(str(u0), str(d0)) self.assertEqual(repr(u1), repr(d1)) - self.assertEqual(repr(u2), repr(d2)) + self.assertIn(repr(u2), ("{'one': 1, 'two': 2}", + "{'two': 2, 'one': 1}")) # Test rich comparison and __len__ all = [d0, d1, d2, u, u0, u1, u2, uu, uu0, uu1, uu2] diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -306,6 +306,9 @@ Tests ----- +- Issue #19664: test_userdict's repr test no longer depends on the order + of dict elements. + - Issue #19440: Clean up test_capi by removing an unnecessary __future__ import, converting from test_main to unittest.main, and running the _testcapi module tests as subTests of a unittest TestCase method. 
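The #19664 fix above works because a two-key dict has two possible orderings under hash randomization, so the test now accepts either repr. A sketch of the property being asserted:

    from collections import UserDict

    u2 = UserDict({'one': 1, 'two': 2})
    # Either ordering can appear, depending on hash randomization:
    assert repr(u2) in ("{'one': 1, 'two': 2}", "{'two': 2, 'one': 1}")
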
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 01:18:05 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 01:18:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dQdWx2t4sz7M7r@mail.python.org> http://hg.python.org/cpython/rev/392212768451 changeset: 87328:392212768451 parent: 87327:a0ec33b83ba4 parent: 87326:f0090ff52128 user: Christian Heimes date: Fri Nov 22 01:17:34 2013 +0100 summary: merge files: Lib/sysconfig.py | 7 +++++++ Lib/test/test_sysconfig.py | 19 +++++++++++++++++++ Misc/NEWS | 3 +++ 3 files changed, 29 insertions(+), 0 deletions(-) diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py --- a/Lib/sysconfig.py +++ b/Lib/sysconfig.py @@ -409,6 +409,10 @@ # _sysconfigdata is generated at build time, see _generate_posix_vars() from _sysconfigdata import build_time_vars vars.update(build_time_vars) + # For backward compatibility, see issue19555 + SO = build_time_vars.get('EXT_SUFFIX') + if SO is not None: + vars['SO'] = SO def _init_non_posix(vars): """Initialize the module as appropriate for NT""" @@ -579,6 +583,9 @@ Equivalent to get_config_vars().get(name) """ + if name == 'SO': + import warnings + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) return get_config_vars().get(name) diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -369,6 +369,25 @@ os.chdir(cwd) self.assertEqual(srcdir, srcdir2) + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_deprecation(self): + self.assertWarns(DeprecationWarning, + sysconfig.get_config_var, 'SO') + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_value(self): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_in_vars(self): + vars = sysconfig.get_config_vars() + self.assertIsNotNone(vars['SO']) + self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) + class MakefileTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,9 @@ Library ------- +- Issue #19555: Restore sysconfig.get_config_var('SO'), with a + DeprecationWarning pointing people at $EXT_SUFFIX. + - Issue #8813: Add SSLContext.verify_flags to change the verification flags of the context in order to enable certification revocation list (CRL) checks or strict X509 rules. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 01:22:57 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 01:22:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319681=3A_Apply_a_?= =?utf-8?q?quick_and_minimal_band-aid=2E?= Message-ID: <3dQddY63jSzSTs@mail.python.org> http://hg.python.org/cpython/rev/18a44d65d34a changeset: 87329:18a44d65d34a user: Christian Heimes date: Fri Nov 22 01:22:47 2013 +0100 summary: Issue #19681: Apply a quick and minimal band-aid. The flaky buildbots make it hard to detect real issue. This is just a temporary fix until we agree on a permanent solution. 
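The flakiness mentioned in the summary above comes from repr() of a partial object formatting its keyword arguments from a dict, so with two or more keywords the ordering can differ between runs. A sketch of the order dependence the band-aid sidesteps (names are illustrative):

    import functools

    def f(*args, **kwargs):
        pass

    p = functools.partial(f, a=1, b=2)
    # Depending on dict iteration order, either repr may be produced:
    #   functools.partial(<function f at 0x...>, a=1, b=2)
    #   functools.partial(<function f at 0x...>, b=2, a=1)
    print(repr(p))
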
files: Lib/test/test_functools.py | 3 ++- 1 files changed, 2 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -154,7 +154,8 @@ def test_repr(self): args = (object(), object()) args_repr = ', '.join(repr(a) for a in args) - kwargs = {'a': object(), 'b': object()} + #kwargs = {'a': object(), 'b': object()} + kwargs = {'a': object()} kwargs_repr = ', '.join("%s=%r" % (k, v) for k, v in kwargs.items()) if self.partial is c_functools.partial: name = 'functools.partial' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 01:51:41 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 01:51:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317134=3A_Finalize?= =?utf-8?q?_interface_to_Windows=27_certificate_store=2E_Cert_and?= Message-ID: <3dQfGj5KJMz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/9adcb61ea741 changeset: 87330:9adcb61ea741 user: Christian Heimes date: Fri Nov 22 01:51:30 2013 +0100 summary: Issue #17134: Finalize interface to Windows' certificate store. Cert and CRL enumeration are now two functions. enum_certificates() also returns purpose flags as set of OIDs. files: Doc/library/ssl.rst | 45 ++- Lib/ssl.py | 2 +- Lib/test/test_ssl.py | 53 +++- Misc/NEWS | 4 + Modules/_ssl.c | 320 +++++++++++++++++++++--------- 5 files changed, 291 insertions(+), 133 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -372,21 +372,45 @@ .. versionadded:: 3.4 -.. function:: enum_cert_store(store_name, cert_type='certificate') +.. function:: enum_certificates(store_name) Retrieve certificates from Windows' system cert store. *store_name* may be one of ``CA``, ``ROOT`` or ``MY``. Windows may provide additional cert - stores, too. *cert_type* is either ``certificate`` for X.509 certificates - or ``crl`` for X.509 certificate revocation lists. + stores, too. - The function returns a list of (bytes, encoding_type) tuples. The - encoding_type flag can be interpreted with :const:`X509_ASN_ENCODING` or - :const:`PKCS_7_ASN_ENCODING`. + The function returns a list of (cert_bytes, encoding_type, trust) tuples. + The encoding_type specifies the encoding of cert_bytes. It is either + :const:`x509_asn` for X.509 ASN.1 data or :const:`pkcs_7_asn` for + PKCS#7 ASN.1 data. Trust specifies the purpose of the certificate as a set + of OIDS or exactly ``True`` if the certificate is trustworthy for all + purposes. + + Example:: + + >>> ssl.enum_certificates("CA") + [(b'data...', 'x509_asn', {'1.3.6.1.5.5.7.3.1', '1.3.6.1.5.5.7.3.2'}), + (b'data...', 'x509_asn', True)] Availability: Windows. .. versionadded:: 3.4 +.. function:: enum_crls(store_name) + + Retrieve CRLs from Windows' system cert store. *store_name* may be + one of ``CA``, ``ROOT`` or ``MY``. Windows may provide additional cert + stores, too. + + The function returns a list of (cert_bytes, encoding_type, trust) tuples. + The encoding_type specifies the encoding of cert_bytes. It is either + :const:`x509_asn` for X.509 ASN.1 data or :const:`pkcs_7_asn` for + PKCS#7 ASN.1 data. + + Availability: Windows. + + .. versionadded:: 3.4 + + Constants ^^^^^^^^^ @@ -657,15 +681,6 @@ .. versionadded:: 3.4 -.. data:: X509_ASN_ENCODING - PKCS_7_ASN_ENCODING - - Encoding flags for :func:`enum_cert_store`. - - Availability: Windows. - - .. 
versionadded:: 3.4 - SSL Sockets ----------- diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -144,7 +144,7 @@ _PROTOCOL_NAMES[PROTOCOL_TLSv1_2] = "TLSv1.2" if sys.platform == "win32": - from _ssl import enum_cert_store, X509_ASN_ENCODING, PKCS_7_ASN_ENCODING + from _ssl import enum_certificates, enum_crls from socket import getnameinfo as _getnameinfo from socket import socket, AF_INET, SOCK_STREAM, create_connection diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -528,29 +528,44 @@ self.assertEqual(paths.cafile, CERTFILE) self.assertEqual(paths.capath, CAPATH) + @unittest.skipUnless(sys.platform == "win32", "Windows specific") + def test_enum_certificates(self): + self.assertTrue(ssl.enum_certificates("CA")) + self.assertTrue(ssl.enum_certificates("ROOT")) + + self.assertRaises(TypeError, ssl.enum_certificates) + self.assertRaises(WindowsError, ssl.enum_certificates, "") + + names = set() + ca = ssl.enum_certificates("CA") + self.assertIsInstance(ca, list) + for element in ca: + self.assertIsInstance(element, tuple) + self.assertEqual(len(element), 3) + cert, enc, trust = element + self.assertIsInstance(cert, bytes) + self.assertIn(enc, {"x509_asn", "pkcs_7_asn"}) + self.assertIsInstance(trust, (set, bool)) + if isinstance(trust, set): + names.update(trust) + + serverAuth = "1.3.6.1.5.5.7.3.1" + self.assertIn(serverAuth, names) @unittest.skipUnless(sys.platform == "win32", "Windows specific") - def test_enum_cert_store(self): - self.assertEqual(ssl.X509_ASN_ENCODING, 1) - self.assertEqual(ssl.PKCS_7_ASN_ENCODING, 0x00010000) + def test_enum_crls(self): + self.assertTrue(ssl.enum_crls("CA")) + self.assertRaises(TypeError, ssl.enum_crls) + self.assertRaises(WindowsError, ssl.enum_crls, "") - self.assertEqual(ssl.enum_cert_store("CA"), - ssl.enum_cert_store("CA", "certificate")) - ssl.enum_cert_store("CA", "crl") - self.assertEqual(ssl.enum_cert_store("ROOT"), - ssl.enum_cert_store("ROOT", "certificate")) - ssl.enum_cert_store("ROOT", "crl") + crls = ssl.enum_crls("CA") + self.assertIsInstance(crls, list) + for element in crls: + self.assertIsInstance(element, tuple) + self.assertEqual(len(element), 2) + self.assertIsInstance(element[0], bytes) + self.assertIn(element[1], {"x509_asn", "pkcs_7_asn"}) - self.assertRaises(TypeError, ssl.enum_cert_store) - self.assertRaises(WindowsError, ssl.enum_cert_store, "") - self.assertRaises(ValueError, ssl.enum_cert_store, "CA", "wrong") - - ca = ssl.enum_cert_store("CA") - self.assertIsInstance(ca, list) - self.assertIsInstance(ca[0], tuple) - self.assertEqual(len(ca[0]), 2) - self.assertIsInstance(ca[0][0], bytes) - self.assertIsInstance(ca[0][1], int) def test_asn1object(self): expected = (129, 'serverAuth', 'TLS Web Server Authentication', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -59,6 +59,10 @@ Library ------- +- Issue #17134: Finalize interface to Windows' certificate store. Cert and + CRL enumeration are now two functions. enum_certificates() also returns + purpose flags as set of OIDs. + - Issue #19555: Restore sysconfig.get_config_var('SO'), with a DeprecationWarning pointing people at $EXT_SUFFIX. 
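Taken together, the new enumeration functions in this change can be driven roughly as follows (a sketch; Windows only, and the serverAuth OID is the one used in the tests above):

    import ssl
    import sys

    if sys.platform == "win32":
        server_auth = "1.3.6.1.5.5.7.3.1"        # TLS Web Server Authentication
        roots = []
        for cert, enc, trust in ssl.enum_certificates("ROOT"):
            # enc is "x509_asn" or "pkcs_7_asn"; trust is True or a set of OIDs
            if enc == "x509_asn" and (trust is True or server_auth in trust):
                roots.append(ssl.DER_cert_to_PEM_cert(cert))
        crls = [crl for crl, enc in ssl.enum_crls("CA")]
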
diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3422,130 +3422,258 @@ return result; } - #ifdef _MSC_VER -PyDoc_STRVAR(PySSL_enum_cert_store_doc, -"enum_cert_store(store_name, cert_type='certificate') -> []\n\ + +static PyObject* +certEncodingType(DWORD encodingType) +{ + static PyObject *x509_asn = NULL; + static PyObject *pkcs_7_asn = NULL; + + if (x509_asn == NULL) { + x509_asn = PyUnicode_InternFromString("x509_asn"); + if (x509_asn == NULL) + return NULL; + } + if (pkcs_7_asn == NULL) { + pkcs_7_asn = PyUnicode_InternFromString("pkcs_7_asn"); + if (pkcs_7_asn == NULL) + return NULL; + } + switch(encodingType) { + case X509_ASN_ENCODING: + Py_INCREF(x509_asn); + return x509_asn; + case PKCS_7_ASN_ENCODING: + Py_INCREF(pkcs_7_asn); + return pkcs_7_asn; + default: + return PyLong_FromLong(encodingType); + } +} + +static PyObject* +parseKeyUsage(PCCERT_CONTEXT pCertCtx, DWORD flags) +{ + CERT_ENHKEY_USAGE *usage; + DWORD size, error, i; + PyObject *retval; + + if (!CertGetEnhancedKeyUsage(pCertCtx, flags, NULL, &size)) { + error = GetLastError(); + if (error == CRYPT_E_NOT_FOUND) { + Py_RETURN_TRUE; + } + return PyErr_SetFromWindowsErr(error); + } + + usage = (CERT_ENHKEY_USAGE*)PyMem_Malloc(size); + if (usage == NULL) { + return PyErr_NoMemory(); + } + + /* Now get the actual enhanced usage property */ + if (!CertGetEnhancedKeyUsage(pCertCtx, flags, usage, &size)) { + PyMem_Free(usage); + error = GetLastError(); + if (error == CRYPT_E_NOT_FOUND) { + Py_RETURN_TRUE; + } + return PyErr_SetFromWindowsErr(error); + } + retval = PySet_New(NULL); + if (retval == NULL) { + goto error; + } + for (i = 0; i < usage->cUsageIdentifier; ++i) { + if (usage->rgpszUsageIdentifier[i]) { + PyObject *oid; + int err; + oid = PyUnicode_FromString(usage->rgpszUsageIdentifier[i]); + if (oid == NULL) { + Py_CLEAR(retval); + goto error; + } + err = PySet_Add(retval, oid); + Py_DECREF(oid); + if (err == -1) { + Py_CLEAR(retval); + goto error; + } + } + } + error: + PyMem_Free(usage); + return retval; +} + +PyDoc_STRVAR(PySSL_enum_certificates_doc, +"enum_certificates(store_name) -> []\n\ \n\ Retrieve certificates from Windows' cert store. store_name may be one of\n\ 'CA', 'ROOT' or 'MY'. The system may provide more cert storages, too.\n\ -cert_type must be either 'certificate' or 'crl'.\n\ +The function returns a list of (bytes, encoding_type, trust) tuples. The\n\ +encoding_type flag can be interpreted with X509_ASN_ENCODING or\n\ +PKCS_7_ASN_ENCODING. 
The trust setting is either a set of OIDs or the\n\ +boolean True."); + +static PyObject * +PySSL_enum_certificates(PyObject *self, PyObject *args, PyObject *kwds) +{ + char *kwlist[] = {"store_name", NULL}; + char *store_name; + HCERTSTORE hStore = NULL; + PCCERT_CONTEXT pCertCtx = NULL; + PyObject *keyusage = NULL, *cert = NULL, *enc = NULL, *tup = NULL; + PyObject *result = NULL; + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|s:enum_certificates", + kwlist, &store_name)) { + return NULL; + } + result = PyList_New(0); + if (result == NULL) { + return NULL; + } + hStore = CertOpenSystemStore((HCRYPTPROV)NULL, store_name); + if (hStore == NULL) { + Py_DECREF(result); + return PyErr_SetFromWindowsErr(GetLastError()); + } + + while (pCertCtx = CertEnumCertificatesInStore(hStore, pCertCtx)) { + cert = PyBytes_FromStringAndSize((const char*)pCertCtx->pbCertEncoded, + pCertCtx->cbCertEncoded); + if (!cert) { + Py_CLEAR(result); + break; + } + if ((enc = certEncodingType(pCertCtx->dwCertEncodingType)) == NULL) { + Py_CLEAR(result); + break; + } + keyusage = parseKeyUsage(pCertCtx, CERT_FIND_PROP_ONLY_ENHKEY_USAGE_FLAG); + if (keyusage == Py_True) { + Py_DECREF(keyusage); + keyusage = parseKeyUsage(pCertCtx, CERT_FIND_EXT_ONLY_ENHKEY_USAGE_FLAG); + } + if (keyusage == NULL) { + Py_CLEAR(result); + break; + } + if ((tup = PyTuple_New(3)) == NULL) { + Py_CLEAR(result); + break; + } + PyTuple_SET_ITEM(tup, 0, cert); + cert = NULL; + PyTuple_SET_ITEM(tup, 1, enc); + enc = NULL; + PyTuple_SET_ITEM(tup, 2, keyusage); + keyusage = NULL; + if (PyList_Append(result, tup) < 0) { + Py_CLEAR(result); + break; + } + Py_CLEAR(tup); + } + if (pCertCtx) { + /* loop ended with an error, need to clean up context manually */ + CertFreeCertificateContext(pCertCtx); + } + + /* In error cases cert, enc and tup may not be NULL */ + Py_XDECREF(cert); + Py_XDECREF(enc); + Py_XDECREF(keyusage); + Py_XDECREF(tup); + + if (!CertCloseStore(hStore, 0)) { + /* This error case might shadow another exception.*/ + Py_XDECREF(result); + return PyErr_SetFromWindowsErr(GetLastError()); + } + return result; +} + +PyDoc_STRVAR(PySSL_enum_crls_doc, +"enum_crls(store_name) -> []\n\ +\n\ +Retrieve CRLs from Windows' cert store. store_name may be one of\n\ +'CA', 'ROOT' or 'MY'. The system may provide more cert storages, too.\n\ The function returns a list of (bytes, encoding_type) tuples. 
The\n\ encoding_type flag can be interpreted with X509_ASN_ENCODING or\n\ PKCS_7_ASN_ENCODING."); static PyObject * -PySSL_enum_cert_store(PyObject *self, PyObject *args, PyObject *kwds) +PySSL_enum_crls(PyObject *self, PyObject *args, PyObject *kwds) { - char *kwlist[] = {"store_name", "cert_type", NULL}; + char *kwlist[] = {"store_name", NULL}; char *store_name; - char *cert_type = "certificate"; HCERTSTORE hStore = NULL; + PCCRL_CONTEXT pCrlCtx = NULL; + PyObject *crl = NULL, *enc = NULL, *tup = NULL; PyObject *result = NULL; - PyObject *tup = NULL, *cert = NULL, *enc = NULL; - int ok = 1; - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|s:enum_cert_store", - kwlist, &store_name, &cert_type)) { - return NULL; - } - - if ((strcmp(cert_type, "certificate") != 0) && - (strcmp(cert_type, "crl") != 0)) { - return PyErr_Format(PyExc_ValueError, - "cert_type must be 'certificate' or 'crl', " - "not %.100s", cert_type); - } - - if ((result = PyList_New(0)) == NULL) { + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "s|s:enum_crls", + kwlist, &store_name)) { return NULL; } - - if ((hStore = CertOpenSystemStore((HCRYPTPROV)NULL, store_name)) == NULL) { + result = PyList_New(0); + if (result == NULL) { + return NULL; + } + hStore = CertOpenSystemStore((HCRYPTPROV)NULL, store_name); + if (hStore == NULL) { Py_DECREF(result); return PyErr_SetFromWindowsErr(GetLastError()); } - if (strcmp(cert_type, "certificate") == 0) { - PCCERT_CONTEXT pCertCtx = NULL; - while (pCertCtx = CertEnumCertificatesInStore(hStore, pCertCtx)) { - cert = PyBytes_FromStringAndSize((const char*)pCertCtx->pbCertEncoded, - pCertCtx->cbCertEncoded); - if (!cert) { - ok = 0; - break; - } - if ((enc = PyLong_FromLong(pCertCtx->dwCertEncodingType)) == NULL) { - ok = 0; - break; - } - if ((tup = PyTuple_New(2)) == NULL) { - ok = 0; - break; - } - PyTuple_SET_ITEM(tup, 0, cert); cert = NULL; - PyTuple_SET_ITEM(tup, 1, enc); enc = NULL; - - if (PyList_Append(result, tup) < 0) { - ok = 0; - break; - } - Py_CLEAR(tup); + while (pCrlCtx = CertEnumCRLsInStore(hStore, pCrlCtx)) { + crl = PyBytes_FromStringAndSize((const char*)pCrlCtx->pbCrlEncoded, + pCrlCtx->cbCrlEncoded); + if (!crl) { + Py_CLEAR(result); + break; } - if (pCertCtx) { - /* loop ended with an error, need to clean up context manually */ - CertFreeCertificateContext(pCertCtx); + if ((enc = certEncodingType(pCrlCtx->dwCertEncodingType)) == NULL) { + Py_CLEAR(result); + break; } - } else { - PCCRL_CONTEXT pCrlCtx = NULL; - while (pCrlCtx = CertEnumCRLsInStore(hStore, pCrlCtx)) { - cert = PyBytes_FromStringAndSize((const char*)pCrlCtx->pbCrlEncoded, - pCrlCtx->cbCrlEncoded); - if (!cert) { - ok = 0; - break; - } - if ((enc = PyLong_FromLong(pCrlCtx->dwCertEncodingType)) == NULL) { - ok = 0; - break; - } - if ((tup = PyTuple_New(2)) == NULL) { - ok = 0; - break; - } - PyTuple_SET_ITEM(tup, 0, cert); cert = NULL; - PyTuple_SET_ITEM(tup, 1, enc); enc = NULL; - - if (PyList_Append(result, tup) < 0) { - ok = 0; - break; - } - Py_CLEAR(tup); + if ((tup = PyTuple_New(2)) == NULL) { + Py_CLEAR(result); + break; } - if (pCrlCtx) { - /* loop ended with an error, need to clean up context manually */ - CertFreeCRLContext(pCrlCtx); + PyTuple_SET_ITEM(tup, 0, crl); + crl = NULL; + PyTuple_SET_ITEM(tup, 1, enc); + enc = NULL; + + if (PyList_Append(result, tup) < 0) { + Py_CLEAR(result); + break; } + Py_CLEAR(tup); } + if (pCrlCtx) { + /* loop ended with an error, need to clean up context manually */ + CertFreeCRLContext(pCrlCtx); + } /* In error cases cert, enc and tup may not be NULL */ 
- Py_XDECREF(cert); + Py_XDECREF(crl); Py_XDECREF(enc); Py_XDECREF(tup); if (!CertCloseStore(hStore, 0)) { /* This error case might shadow another exception.*/ - Py_DECREF(result); + Py_XDECREF(result); return PyErr_SetFromWindowsErr(GetLastError()); } - if (ok) { - return result; - } else { - Py_DECREF(result); - return NULL; - } + return result; } -#endif + +#endif /* _MSC_VER */ /* List of functions exported by this module. */ @@ -3567,8 +3695,10 @@ {"get_default_verify_paths", (PyCFunction)PySSL_get_default_verify_paths, METH_NOARGS, PySSL_get_default_verify_paths_doc}, #ifdef _MSC_VER - {"enum_cert_store", (PyCFunction)PySSL_enum_cert_store, - METH_VARARGS | METH_KEYWORDS, PySSL_enum_cert_store_doc}, + {"enum_certificates", (PyCFunction)PySSL_enum_certificates, + METH_VARARGS | METH_KEYWORDS, PySSL_enum_certificates_doc}, + {"enum_crls", (PyCFunction)PySSL_enum_crls, + METH_VARARGS | METH_KEYWORDS, PySSL_enum_crls_doc}, #endif {"txt2obj", (PyCFunction)PySSL_txt2obj, METH_VARARGS | METH_KEYWORDS, PySSL_txt2obj_doc}, @@ -3811,12 +3941,6 @@ PyModule_AddIntConstant(m, "VERIFY_X509_STRICT", X509_V_FLAG_X509_STRICT); -#ifdef _MSC_VER - /* Windows dwCertEncodingType */ - PyModule_AddIntMacro(m, X509_ASN_ENCODING); - PyModule_AddIntMacro(m, PKCS_7_ASN_ENCODING); -#endif - /* Alert Descriptions from ssl.h */ /* note RESERVED constants no longer intended for use have been removed */ /* http://www.iana.org/assignments/tls-parameters/tls-parameters.xml#tls-parameters-6 */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 02:23:47 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 02:23:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318147=3A_Add_miss?= =?utf-8?q?ing_documentation_for_SSLContext=2Eget=5Fca=5Fcerts=28=29=2E?= Message-ID: <3dQfzl5QXDz7LkR@mail.python.org> http://hg.python.org/cpython/rev/ae0734493f6b changeset: 87331:ae0734493f6b user: Christian Heimes date: Fri Nov 22 02:22:51 2013 +0100 summary: Issue #18147: Add missing documentation for SSLContext.get_ca_certs(). Also change the argument name to the same name as getpeercert() files: Doc/library/ssl.rst | 12 ++++++++++++ Modules/_ssl.c | 10 ++++++---- 2 files changed, 18 insertions(+), 4 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -1093,6 +1093,18 @@ >>> stats['hits'], stats['misses'] (0, 0) +.. method:: SSLContext.get_ca_certs(binary_form=False) + + Returns a list of dicts with information of loaded CA certs. If the + optional argument is True, returns a DER-encoded copy of the CA + certificate. + + .. note:: + Certificates in a capath directory aren't loaded unless they have + been used at least once. + + .. versionadded:: 3.4 + .. attribute:: SSLContext.options An integer representing the set of SSL options enabled on this context. diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3023,7 +3023,7 @@ } PyDoc_STRVAR(PySSL_get_ca_certs_doc, -"get_ca_certs([der=False]) -> list of loaded certificate\n\ +"get_ca_certs(binary_form=False) -> list of loaded certificate\n\ \n\ Returns a list of dicts with information of loaded CA certs. 
If the\n\ optional argument is True, returns a DER-encoded copy of the CA certificate.\n\ @@ -3031,14 +3031,16 @@ been used at least once."); static PyObject * -get_ca_certs(PySSLContext *self, PyObject *args) +get_ca_certs(PySSLContext *self, PyObject *args, PyObject *kwds) { + char *kwlist[] = {"binary_form", NULL}; X509_STORE *store; PyObject *ci = NULL, *rlist = NULL; int i; int binary_mode = 0; - if (!PyArg_ParseTuple(args, "|p:get_ca_certs", &binary_mode)) { + if (!PyArg_ParseTupleAndKeywords(args, kwds, "|p:get_ca_certs", + kwlist, &binary_mode)) { return NULL; } @@ -3119,7 +3121,7 @@ {"cert_store_stats", (PyCFunction) cert_store_stats, METH_NOARGS, PySSL_get_stats_doc}, {"get_ca_certs", (PyCFunction) get_ca_certs, - METH_VARARGS, PySSL_get_ca_certs_doc}, + METH_VARARGS | METH_KEYWORDS, PySSL_get_ca_certs_doc}, {NULL, NULL} /* sentinel */ }; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 02:58:39 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 02:58:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Regex_ain=27t_gonna_make_it?= =?utf-8?q?=2E?= Message-ID: <3dQglz3twRz7LkL@mail.python.org> http://hg.python.org/peps/rev/bbd33490f039 changeset: 5312:bbd33490f039 user: Barry Warsaw date: Thu Nov 21 20:58:33 2013 -0500 summary: Regex ain't gonna make it. files: pep-0429.txt | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -99,7 +99,6 @@ Other proposed large-scale changes: * Introspection information for builtins -* Addition of the "regex" module * Email version 6 Deferred to post-3.4: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 22 03:11:20 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 03:11:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_PEP_456_to_3=2E4_release_?= =?utf-8?q?PEP?= Message-ID: <3dQh2c4mvFz7LkL@mail.python.org> http://hg.python.org/peps/rev/10b006feab42 changeset: 5313:10b006feab42 user: Christian Heimes date: Fri Nov 22 03:11:12 2013 +0100 summary: Add PEP 456 to 3.4 release PEP files: pep-0429.txt | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -73,6 +73,7 @@ * PEP 445, a new C API for implementing custom memory allocators * PEP 446, make newly created file descriptors not inherited by default * PEP 450, basic statistics module for the standard library +* PEP 456, secure and interchangeable hash algorithm Accepted but not yet implemented/merged: -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 22 03:36:58 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 03:36:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319664=3A_fix_anot?= =?utf-8?q?her_flake_test=5Fuserdict_test?= Message-ID: <3dQhcB596mz7M73@mail.python.org> http://hg.python.org/cpython/rev/b62eb82ca5ef changeset: 87332:b62eb82ca5ef user: Christian Heimes date: Fri Nov 22 03:36:28 2013 +0100 summary: Issue #19664: fix another flake test_userdict test files: Lib/test/test_userdict.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_userdict.py b/Lib/test/test_userdict.py --- a/Lib/test/test_userdict.py +++ b/Lib/test/test_userdict.py @@ -90,9 +90,9 @@ self.assertNotEqual(m2a, m2) # Test keys, items, values - 
self.assertEqual(u2.keys(), d2.keys()) - self.assertEqual(u2.items(), d2.items()) - self.assertEqual(list(u2.values()), list(d2.values())) + self.assertEqual(sorted(u2.keys()), sorted(d2.keys())) + self.assertEqual(sorted(u2.items()), sorted(d2.items())) + self.assertEqual(sorted(u2.values()), sorted(d2.values())) # Test "in". for i in u2.keys(): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 03:43:58 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 03:43:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_or_VERIFY=5FCRL=5FCHECK=5F?= =?utf-8?q?LEAF_to_verify=5Fflags?= Message-ID: <3dQhmG2SZHz7LkR@mail.python.org> http://hg.python.org/cpython/rev/6d3e60be0eeb changeset: 87333:6d3e60be0eeb user: Christian Heimes date: Fri Nov 22 03:43:48 2013 +0100 summary: or VERIFY_CRL_CHECK_LEAF to verify_flags files: Lib/test/test_ssl.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1816,7 +1816,7 @@ context = ssl.SSLContext(ssl.PROTOCOL_TLSv1) context.verify_mode = ssl.CERT_REQUIRED context.load_verify_locations(SIGNING_CA) - context.verify_flags = ssl.VERIFY_DEFAULT + self.assertEqual(context.verify_flags, ssl.VERIFY_DEFAULT) # VERIFY_DEFAULT should pass server = ThreadedEchoServer(context=server_context, chatty=True) @@ -1827,7 +1827,7 @@ self.assertTrue(cert, "Can't get peer certificate.") # VERIFY_CRL_CHECK_LEAF without a loaded CRL file fails - context.verify_flags = ssl.VERIFY_CRL_CHECK_LEAF + context.verify_flags |= ssl.VERIFY_CRL_CHECK_LEAF server = ThreadedEchoServer(context=server_context, chatty=True) with server: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 04:49:43 2013 From: python-checkins at python.org (ezio.melotti) Date: Fri, 22 Nov 2013 04:49:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2319688=3A_add_back_and_d?= =?utf-8?q?eprecate_the_internal_HTMLParser=2Eunescape=28=29_method=2E?= Message-ID: <3dQkD72Y5pz7Ljv@mail.python.org> http://hg.python.org/cpython/rev/0534b882f9ce changeset: 87334:0534b882f9ce user: Ezio Melotti date: Fri Nov 22 05:49:29 2013 +0200 summary: #19688: add back and deprecate the internal HTMLParser.unescape() method. 
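A quick, self-contained sketch of the old and new spellings may help readers of this change; it is illustrative only and not part of the patch, the sample string is invented for the example, and it assumes Python 3.4's html.unescape(), which the deprecated method now simply delegates to (see the diff below).

    import warnings
    from html import unescape
    from html.parser import HTMLParser

    s = '&quot;&#34;&#x27;&gt;'

    # Preferred replacement: the module-level function in the html package.
    print(unescape(s))                   # prints: ""'>

    # Deprecated spelling: still works, but now emits DeprecationWarning
    # and forwards to html.unescape().
    with warnings.catch_warnings():
        warnings.simplefilter('ignore', DeprecationWarning)
        print(HTMLParser().unescape(s))  # prints the same: ""'>
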
files: Lib/html/parser.py | 7 +++++++ Lib/test/test_htmlparser.py | 7 +++++++ 2 files changed, 14 insertions(+), 0 deletions(-) diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -513,3 +513,10 @@ def unknown_decl(self, data): if self.strict: self.error("unknown declaration: %r" % (data,)) + + # Internal -- helper to remove special character quoting + def unescape(self, s): + warnings.warn('The unescape method is deprecated and will be removed ' + 'in 3.5, use html.unescape() instead.', + DeprecationWarning, stacklevel=2) + return unescape(s) diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -569,6 +569,13 @@ for html, expected in data: self._run_check(html, expected) + def test_unescape_method(self): + from html import unescape + p = self.get_collector() + with self.assertWarns(DeprecationWarning): + s = '""""""&#bad;' + self.assertEqual(p.unescape(s), unescape(s)) + def test_broken_comments(self): html = ('' '' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 05:57:24 2013 From: python-checkins at python.org (ned.deily) Date: Fri, 22 Nov 2013 05:57:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2314455=3A_Fix_mayb?= =?utf-8?q?e=5Fopen_typo_in_Plist=2EfromFile=28=29=2E?= Message-ID: <3dQlkD4XxQz7M7s@mail.python.org> http://hg.python.org/cpython/rev/602e0a0ec67e changeset: 87335:602e0a0ec67e user: Ned Deily date: Thu Nov 21 20:56:23 2013 -0800 summary: Issue #14455: Fix maybe_open typo in Plist.fromFile(). files: Lib/plistlib.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/plistlib.py b/Lib/plistlib.py --- a/Lib/plistlib.py +++ b/Lib/plistlib.py @@ -137,7 +137,7 @@ @classmethod def fromFile(cls, pathOrFile): """Deprecated. Use the load() function instead.""" - with maybe_open(pathOrFile, 'rb') as fp: + with _maybe_open(pathOrFile, 'rb') as fp: value = load(fp) plist = cls() plist.update(value) -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Fri Nov 22 07:37:30 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 22 Nov 2013 07:37:30 +0100 Subject: [Python-checkins] Daily reference leaks (ae0734493f6b): sum=408 Message-ID: results for ae0734493f6b on branch "default" -------------------------------------------- test_site leaked [-2, 2, 0] references, sum=0 test_site leaked [-2, 2, 0] memory blocks, sum=0 test_trace leaked [136, 136, 136] references, sum=408 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/refloggwVN18', '-x'] From python-checkins at python.org Fri Nov 22 08:03:16 2013 From: python-checkins at python.org (ned.deily) Date: Fri, 22 Nov 2013 08:03:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319649=3A_On_OS_X?= =?utf-8?q?=2C_the_same_set_of_file_names_are_now_installed?= Message-ID: <3dQpWS0f5Mz7M8D@mail.python.org> http://hg.python.org/cpython/rev/44d1ac9245cf changeset: 87336:44d1ac9245cf user: Ned Deily date: Thu Nov 21 22:42:25 2013 -0800 summary: Issue #19649: On OS X, the same set of file names are now installed in bin directories for all configurations: non-framework vs framework, and single arch vs universal builds. pythonx.y-32 is now always installed for 64-bit/32-bit universal builds. The obsolete and undocumented pythonw* symlinks are no longer installed anywhere. 
files: Mac/Makefile.in | 107 +++++++++++++++++++++++------------ Makefile.pre.in | 35 ++++++++++- Misc/NEWS | 6 ++ configure | 5 + configure.ac | 4 + 5 files changed, 116 insertions(+), 41 deletions(-) diff --git a/Mac/Makefile.in b/Mac/Makefile.in --- a/Mac/Makefile.in +++ b/Mac/Makefile.in @@ -3,9 +3,12 @@ # commandline in that case. VERSION=@VERSION@ +ABIFLAGS=@ABIFLAGS@ +LDVERSION=@LDVERSION@ builddir = .. srcdir=@srcdir@ prefix=@prefix@ +exec_prefix=@exec_prefix@ LIBDEST=$(prefix)/lib/python$(VERSION) RUNSHARED=@RUNSHARED@ BUILDEXE=@BUILDEXEEXT@ @@ -23,7 +26,7 @@ # These are normally glimpsed from the previous set -bindir=$(prefix)/bin +BINDIR= @bindir@ PYTHONAPPSDIR=@FRAMEWORKINSTALLAPPSPREFIX@/$(PYTHONFRAMEWORK) $(VERSION) APPINSTALLDIR=$(prefix)/Resources/Python.app @@ -46,19 +49,7 @@ APPSUBDIRS=MacOS Resources compileall=$(srcdir)/../Lib/compileall.py -installapps: install_Python install_pythonw install_PythonLauncher install_IDLE - -install_pythonw: pythonw - $(INSTALL_PROGRAM) $(STRIPFLAG) pythonw "$(DESTDIR)$(prefix)/bin/pythonw$(VERSION)" - $(INSTALL_PROGRAM) $(STRIPFLAG) pythonw "$(DESTDIR)$(prefix)/bin/python$(VERSION)" - ln -sf python$(VERSION) "$(DESTDIR)$(prefix)/bin/python3" - ln -sf pythonw$(VERSION) "$(DESTDIR)$(prefix)/bin/pythonw3" -ifneq ($(LIPO_32BIT_FLAGS),) - lipo $(LIPO_32BIT_FLAGS) -output $(DESTDIR)$(prefix)/bin/python$(VERSION)-32 pythonw - lipo $(LIPO_32BIT_FLAGS) -output $(DESTDIR)$(prefix)/bin/pythonw$(VERSION)-32 pythonw - ln -sf pythonw$(VERSION)-32 "$(DESTDIR)$(prefix)/bin/pythonw3-32" - ln -sf python$(VERSION)-32 "$(DESTDIR)$(prefix)/bin/python3-32" -endif +installapps: install_Python install_PythonLauncher install_IDLE # # Install unix tools in /usr/local/bin. These are just aliases for the @@ -68,21 +59,39 @@ if [ ! -d "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" ]; then \ $(INSTALL) -d -m $(DIRMODE) "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" ;\ fi - for fn in python3 pythonw3 idle3 pydoc3 python3-config \ - python$(VERSION) pythonw$(VERSION) idle$(VERSION) \ - pydoc$(VERSION) python$(VERSION)-config 2to3 \ - 2to3-$(VERSION) pyvenv pyvenv-$(VERSION) ;\ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + 2to3 \ + idle3 \ + pydoc3 \ + python3 \ + python3-config \ + pyvenv \ + ; \ do \ - ln -fs "$(prefix)/bin/$${fn}" "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin/$${fn}" ;\ + rm -f $${fn} ; \ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ done -ifneq ($(LIPO_32BIT_FLAGS),) - for fn in python3-32 pythonw3-32 \ - python$(VERSION)-32 pythonw$(VERSION)-32 ;\ - do \ - ln -fs "$(prefix)/bin/$${fn}" "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin/$${fn}" ;\ - done -endif - + -if test "x$(VERSION)" != "x$(LDVERSION)"; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + python$(VERSION)-config \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi + -if test "x$(LIPO_32BIT_FLAGS)" != "x"; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + python3-32 \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi # # Like installunixtools, but only install links to the versioned binaries. @@ -91,20 +100,44 @@ if [ ! 
-d "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" ]; then \ $(INSTALL) -d -m $(DIRMODE) "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" ;\ fi - for fn in python$(VERSION) pythonw$(VERSION) idle$(VERSION) \ - pydoc$(VERSION) python$(VERSION)-config 2to3-$(VERSION) pyvenv-$(VERSION) ;\ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + 2to3-$(VERSION) \ + idle$(VERSION) \ + pydoc$(VERSION) \ + python$(VERSION) \ + python$(LDVERSION)-config \ + pyvenv-$(VERSION) \ + ; \ do \ - ln -fs "$(prefix)/bin/$${fn}" "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin/$${fn}" ;\ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ done -ifneq ($(LIPO_32BIT_FLAGS),) - for fn in python$(VERSION)-32 pythonw$(VERSION)-32 ;\ - do \ - ln -fs "$(prefix)/bin/$${fn}" "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin/$${fn}" ;\ - done -endif + -if test "x$(VERSION)" != "x$(LDVERSION)"; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + python$(LDVERSION) \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi + -if test "x$(LIPO_32BIT_FLAGS)" != "x"; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + python$(VERSION)-32 \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi pythonw: $(srcdir)/Tools/pythonw.c Makefile - $(CC) $(LDFLAGS) -DPYTHONFRAMEWORK='"$(PYTHONFRAMEWORK)"' -o $@ $(srcdir)/Tools/pythonw.c -I.. -I$(srcdir)/../Include ../$(PYTHONFRAMEWORK).framework/Versions/$(VERSION)/$(PYTHONFRAMEWORK) + $(CC) $(LDFLAGS) -DPYTHONFRAMEWORK='"$(PYTHONFRAMEWORK)"' -o $@ \ + $(srcdir)/Tools/pythonw.c -I.. -I$(srcdir)/../Include \ + ../$(PYTHONFRAMEWORK).framework/Versions/$(VERSION)/$(PYTHONFRAMEWORK) install_PythonLauncher: cd PythonLauncher && make install DESTDIR=$(DESTDIR) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -148,6 +148,12 @@ MACOSX_DEPLOYMENT_TARGET=@CONFIGURE_MACOSX_DEPLOYMENT_TARGET@ @EXPORT_MACOSX_DEPLOYMENT_TARGET at export MACOSX_DEPLOYMENT_TARGET +# Option to install to strip binaries +STRIPFLAG=-s + +# Flags to lipo to produce a 32-bit-only universal executable +LIPO_32BIT_FLAGS=@LIPO_32BIT_FLAGS@ + # Options to enable prebinding (for fast startup prior to Mac OS X 10.3) OTHER_LIBTOOL_OPT=@OTHER_LIBTOOL_OPT@ @@ -955,7 +961,7 @@ $(TESTRUNNER) $(QUICKTESTOPTS) -install: altinstall bininstall maninstall +install: @FRAMEWORKINSTALLFIRST@ altinstall bininstall maninstall @FRAMEWORKINSTALLLAST@ altinstall: @FRAMEWORKALTINSTALLFIRST@ altbininstall libinstall inclinstall libainstall \ sharedinstall oldsharedinstall altmaninstall @FRAMEWORKALTINSTALLLAST@ @@ -983,7 +989,7 @@ # Install the interpreter with $(VERSION) affixed # This goes into $(exec_prefix) -altbininstall: $(BUILDPYTHON) +altbininstall: $(BUILDPYTHON) @FRAMEWORKPYTHONW@ @for i in $(BINDIR) $(LIBDIR); \ do \ if test ! 
-d $(DESTDIR)$$i; then \ @@ -992,7 +998,11 @@ else true; \ fi; \ done - $(INSTALL_PROGRAM) $(BUILDPYTHON) $(DESTDIR)$(BINDIR)/python$(LDVERSION)$(EXE) + if test "$(PYTHONFRAMEWORKDIR)" = "no-framework" ; then \ + $(INSTALL_PROGRAM) $(BUILDPYTHON) $(DESTDIR)$(BINDIR)/python$(LDVERSION)$(EXE); \ + else \ + $(INSTALL_PROGRAM) $(STRIPFLAG) Mac/pythonw $(DESTDIR)$(BINDIR)/python$(LDVERSION)$(EXE); \ + fi -if test "$(VERSION)" != "$(LDVERSION)"; then \ if test -f $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE) -o -h $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE); \ then rm -f $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE); \ @@ -1013,6 +1023,12 @@ fi; \ else true; \ fi + if test "x$(LIPO_32BIT_FLAGS)" != "x" ; then \ + rm -f $(DESTDIR)$(BINDIR)python$(VERSION)-32$(EXE); \ + lipo $(LIPO_32BIT_FLAGS) \ + -output $(DESTDIR)$(BINDIR)/python$(VERSION)-32$(EXE) \ + $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE); \ + fi bininstall: altbininstall -if test -f $(DESTDIR)$(BINDIR)/python3$(EXE) -o -h $(DESTDIR)$(BINDIR)/python3$(EXE); \ @@ -1038,6 +1054,10 @@ (cd $(DESTDIR)$(BINDIR); $(LN) -s 2to3-$(VERSION) 2to3) -rm -f $(DESTDIR)$(BINDIR)/pyvenv (cd $(DESTDIR)$(BINDIR); $(LN) -s pyvenv-$(VERSION) pyvenv) + if test "x$(LIPO_32BIT_FLAGS)" != "x" ; then \ + rm -f $(DESTDIR)$(BINDIR)/python3-32$(EXE); \ + (cd $(DESTDIR)$(BINDIR); $(LN) -s python$(VERSION)-32$(EXE) python3-32$(EXE)) \ + fi # Install the versioned manual page altmaninstall: @@ -1364,7 +1384,14 @@ frameworkinstallapps: cd Mac && $(MAKE) installapps DESTDIR="$(DESTDIR)" -# This install the unix python and pythonw tools in /usr/local/bin +# Build the bootstrap executable that will spawn the interpreter inside +# an app bundle within the framework. This allows the interpreter to +# run OS X GUI APIs. +frameworkpythonw: + cd Mac && $(MAKE) pythonw + +# This installs the python* and other bin symlinks in $prefix/bin or in +# a bin directory relative to the framework root frameworkinstallunixtools: cd Mac && $(MAKE) installunixtools DESTDIR="$(DESTDIR)" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -357,6 +357,12 @@ - Issue #19373: Apply upstream change to Tk 8.5.15 fixing OS X 10.9 screen refresh problem for OS X installer build. +- Issue #19649: On OS X, the same set of file names are now installed + in bin directories for all configurations: non-framework vs framework, + and single arch vs universal builds. pythonx.y-32 is now always + installed for 64-bit/32-bit universal builds. The obsolete and + undocumented pythonw* symlinks are no longer installed anywhere. 
+ Tools/Demos ----------- diff --git a/configure b/configure --- a/configure +++ b/configure @@ -712,6 +712,7 @@ MACHDEP FRAMEWORKINSTALLAPPSPREFIX FRAMEWORKUNIXTOOLSPREFIX +FRAMEWORKPYTHONW FRAMEWORKALTINSTALLLAST FRAMEWORKALTINSTALLFIRST FRAMEWORKINSTALLLAST @@ -3152,6 +3153,7 @@ FRAMEWORKINSTALLLAST= FRAMEWORKALTINSTALLFIRST= FRAMEWORKALTINSTALLLAST= + FRAMEWORKPYTHONW= if test "x${prefix}" = "xNONE"; then FRAMEWORKUNIXTOOLSPREFIX="${ac_default_prefix}" else @@ -3166,6 +3168,7 @@ FRAMEWORKALTINSTALLFIRST="frameworkinstallstructure " FRAMEWORKINSTALLLAST="frameworkinstallmaclib frameworkinstallapps frameworkinstallunixtools" FRAMEWORKALTINSTALLLAST="frameworkinstallmaclib frameworkinstallapps frameworkaltinstallunixtools" + FRAMEWORKPYTHONW="frameworkpythonw" FRAMEWORKINSTALLAPPSPREFIX="/Applications" if test "x${prefix}" = "xNONE" ; then @@ -3233,6 +3236,7 @@ FRAMEWORKINSTALLLAST= FRAMEWORKALTINSTALLFIRST= FRAMEWORKALTINSTALLLAST= + FRAMEWORKPYTHONW= if test "x${prefix}" = "xNONE" ; then FRAMEWORKUNIXTOOLSPREFIX="${ac_default_prefix}" else @@ -3255,6 +3259,7 @@ + ##AC_ARG_WITH(dyld, ## AS_HELP_STRING([--with-dyld], ## [Use (OpenStep|Rhapsody) dynamic linker])) diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -250,6 +250,7 @@ FRAMEWORKINSTALLLAST= FRAMEWORKALTINSTALLFIRST= FRAMEWORKALTINSTALLLAST= + FRAMEWORKPYTHONW= if test "x${prefix}" = "xNONE"; then FRAMEWORKUNIXTOOLSPREFIX="${ac_default_prefix}" else @@ -264,6 +265,7 @@ FRAMEWORKALTINSTALLFIRST="frameworkinstallstructure " FRAMEWORKINSTALLLAST="frameworkinstallmaclib frameworkinstallapps frameworkinstallunixtools" FRAMEWORKALTINSTALLLAST="frameworkinstallmaclib frameworkinstallapps frameworkaltinstallunixtools" + FRAMEWORKPYTHONW="frameworkpythonw" FRAMEWORKINSTALLAPPSPREFIX="/Applications" if test "x${prefix}" = "xNONE" ; then @@ -325,6 +327,7 @@ FRAMEWORKINSTALLLAST= FRAMEWORKALTINSTALLFIRST= FRAMEWORKALTINSTALLLAST= + FRAMEWORKPYTHONW= if test "x${prefix}" = "xNONE" ; then FRAMEWORKUNIXTOOLSPREFIX="${ac_default_prefix}" else @@ -342,6 +345,7 @@ AC_SUBST(FRAMEWORKINSTALLLAST) AC_SUBST(FRAMEWORKALTINSTALLFIRST) AC_SUBST(FRAMEWORKALTINSTALLLAST) +AC_SUBST(FRAMEWORKPYTHONW) AC_SUBST(FRAMEWORKUNIXTOOLSPREFIX) AC_SUBST(FRAMEWORKINSTALLAPPSPREFIX) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 08:03:17 2013 From: python-checkins at python.org (ned.deily) Date: Fri, 22 Nov 2013 08:03:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319553=3A_PEP_453_?= =?utf-8?q?-_=22make_install=22_and_=22make_altinstall=22_now_install_or?= Message-ID: <3dQpWT3wLXz7M8n@mail.python.org> http://hg.python.org/cpython/rev/90d4153728f6 changeset: 87337:90d4153728f6 user: Ned Deily date: Thu Nov 21 23:01:59 2013 -0800 summary: Issue #19553: PEP 453 - "make install" and "make altinstall" now install or upgrade pip by default, using the bundled pip provided by the new ensurepip module. A new configure option, --with-ensurepip[=upgrade|install|no], is available to override the default ensurepip "--upgrade" option. The option can also be set with "make [alt]install ENSUREPIP=[upgrade|install\no]". 
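For readers who want the effect of the new option in one place, here is a rough Python rendering of what the install targets now do; it is only a sketch, the ENSUREPIP/DESTDIR/ALTINSTALL names are stand-ins for the corresponding make variables, and the real work is done by the Makefile rules in the diff that follows.

    import subprocess
    import sys

    ENSUREPIP = 'upgrade'   # configure default; may also be 'install' or 'no'
    DESTDIR = '/'           # staging root passed through by make
    ALTINSTALL = False      # True for "make altinstall"

    if ENSUREPIP != 'no':
        args = [sys.executable, '-m', 'ensurepip', '--root', DESTDIR]
        if ALTINSTALL:
            # altinstall only creates the versioned scripts (pipX.Y etc.)
            args.append('--altinstall')
        if ENSUREPIP == 'upgrade':
            # upgrade the bundled pip even if a copy is already installed
            args.append('--upgrade')
        subprocess.check_call(args)
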
files: Mac/Makefile.in | 22 ++++++++++++++++++++++ Makefile.pre.in | 31 +++++++++++++++++++++++++++---- Misc/NEWS | 6 ++++++ configure | 29 +++++++++++++++++++++++++++++ configure.ac | 15 +++++++++++++++ 5 files changed, 99 insertions(+), 4 deletions(-) diff --git a/Mac/Makefile.in b/Mac/Makefile.in --- a/Mac/Makefile.in +++ b/Mac/Makefile.in @@ -5,6 +5,7 @@ VERSION=@VERSION@ ABIFLAGS=@ABIFLAGS@ LDVERSION=@LDVERSION@ +ENSUREPIP=@ENSUREPIP@ builddir = .. srcdir=@srcdir@ prefix=@prefix@ @@ -92,6 +93,16 @@ $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ done ;\ fi + -if test "x$(ENSUREPIP)" != "xno" ; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + pip3 \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi # # Like installunixtools, but only install links to the versioned binaries. @@ -133,6 +144,17 @@ $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ done ;\ fi + -if test "x$(ENSUREPIP)" != "xno" ; then \ + cd "$(DESTDIR)$(FRAMEWORKUNIXTOOLSPREFIX)/bin" && \ + for fn in \ + easy_install-$(VERSION) \ + pip$(VERSION) \ + ; \ + do \ + rm -f $${fn} ;\ + $(LN) -s $(BINDIR)/$${fn} $${fn} ;\ + done ;\ + fi pythonw: $(srcdir)/Tools/pythonw.c Makefile $(CC) $(LDFLAGS) -DPYTHONFRAMEWORK='"$(PYTHONFRAMEWORK)"' -o $@ \ diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -160,6 +160,9 @@ # Environment to run shared python without installed libraries RUNSHARED= @RUNSHARED@ +# ensurepip options +ENSUREPIP= @ENSUREPIP@ + # Modes for directories, executables and data files created by the # install process. Default to user-only-writable for all file types. DIRMODE= 755 @@ -961,10 +964,30 @@ $(TESTRUNNER) $(QUICKTESTOPTS) -install: @FRAMEWORKINSTALLFIRST@ altinstall bininstall maninstall @FRAMEWORKINSTALLLAST@ +install: @FRAMEWORKINSTALLFIRST@ commoninstall bininstall maninstall @FRAMEWORKINSTALLLAST@ + if test "x$(ENSUREPIP)" != "xno" ; then \ + case $(ENSUREPIP) in \ + upgrade) ensurepip="--upgrade" ;; \ + install|*) ensurepip="" ;; \ + esac; \ + $(RUNSHARED) $(PYTHON_FOR_BUILD) -m ensurepip \ + $$ensurepip --root=$(DESTDIR)/ ; \ + fi -altinstall: @FRAMEWORKALTINSTALLFIRST@ altbininstall libinstall inclinstall libainstall \ - sharedinstall oldsharedinstall altmaninstall @FRAMEWORKALTINSTALLLAST@ +altinstall: commoninstall + if test "x$(ENSUREPIP)" != "xno" ; then \ + case $(ENSUREPIP) in \ + upgrade) ensurepip="--altinstall --upgrade" ;; \ + install|*) ensurepip="--altinstall" ;; \ + esac; \ + $(RUNSHARED) $(PYTHON_FOR_BUILD) -m ensurepip \ + $$ensurepip --root=$(DESTDIR)/ ; \ + fi + +commoninstall: @FRAMEWORKALTINSTALLFIRST@ \ + altbininstall libinstall inclinstall libainstall \ + sharedinstall oldsharedinstall altmaninstall \ + @FRAMEWORKALTINSTALLLAST@ # Install shared libraries enabled by Setup DESTDIRS= $(exec_prefix) $(LIBDIR) $(BINLIBDEST) $(DESTSHARED) @@ -1570,7 +1593,7 @@ .PHONY: frameworkinstall frameworkinstallframework frameworkinstallstructure .PHONY: frameworkinstallmaclib frameworkinstallapps frameworkinstallunixtools .PHONY: frameworkaltinstallunixtools recheck autoconf clean clobber distclean -.PHONY: smelly funny patchcheck touch altmaninstall +.PHONY: smelly funny patchcheck touch altmaninstall commoninstall .PHONY: gdbhooks # IF YOU PUT ANYTHING HERE IT WILL GO AWAY diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -363,6 +363,12 @@ installed for 64-bit/32-bit universal builds. The obsolete and undocumented pythonw* symlinks are no longer installed anywhere. 
+- Issue #19553: PEP 453 - "make install" and "make altinstall" now install or + upgrade pip by default, using the bundled pip provided by the new ensurepip + module. A new configure option, --with-ensurepip[=upgrade|install|no], is + available to override the default ensurepip "--upgrade" option. The option + can also be set with "make [alt]install ENSUREPIP=[upgrade|install\no]". + Tools/Demos ----------- diff --git a/configure b/configure --- a/configure +++ b/configure @@ -623,6 +623,7 @@ #endif" ac_subst_vars='LTLIBOBJS +ENSUREPIP SRCDIRS THREADHEADERS LIBPL @@ -815,6 +816,7 @@ with_libc enable_big_digits with_computed_gotos +with_ensurepip ' ac_precious_vars='build_alias host_alias @@ -1498,6 +1500,8 @@ --with(out)-computed-gotos Use computed gotos in evaluation loop (enabled by default on supported compilers) + --with(out)-ensurepip=[=upgrade] + "install" or "upgrade" using bundled pip Some influential environment variables: MACHDEP name for machine-dependent library files @@ -15344,6 +15348,31 @@ esac fi +# ensurepip option +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for ensurepip" >&5 +$as_echo_n "checking for ensurepip... " >&6; } + +# Check whether --with-ensurepip was given. +if test "${with_ensurepip+set}" = set; then : + withval=$with_ensurepip; +else + with_ensurepip=upgrade +fi + +case $with_ensurepip in #( + yes|upgrade) : + ENSUREPIP=upgrade ;; #( + install) : + ENSUREPIP=install ;; #( + no) : + ENSUREPIP=no ;; #( + *) : + as_fn_error $? "--with-ensurepip=upgrade|install|no" "$LINENO" 5 ;; +esac +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ENSUREPIP" >&5 +$as_echo "$ENSUREPIP" >&6; } + + # generate output files ac_config_files="$ac_config_files Makefile.pre Modules/Setup.config Misc/python.pc Misc/python-config.sh" diff --git a/configure.ac b/configure.ac --- a/configure.ac +++ b/configure.ac @@ -4767,6 +4767,21 @@ esac fi +# ensurepip option +AC_MSG_CHECKING(for ensurepip) +AC_ARG_WITH(ensurepip, + [AS_HELP_STRING([--with(out)-ensurepip=@<:@=upgrade@:>@], + ["install" or "upgrade" using bundled pip])], + [], + [with_ensurepip=upgrade]) +AS_CASE($with_ensurepip, + [yes|upgrade],[ENSUREPIP=upgrade], + [install],[ENSUREPIP=install], + [no],[ENSUREPIP=no], + [AC_MSG_ERROR([--with-ensurepip=upgrade|install|no])]) +AC_MSG_RESULT($ENSUREPIP) +AC_SUBST(ENSUREPIP) + # generate output files AC_CONFIG_FILES(Makefile.pre Modules/Setup.config Misc/python.pc Misc/python-config.sh) AC_CONFIG_FILES([Modules/ld_so_aix], [chmod +x Modules/ld_so_aix]) -- Repository URL: http://hg.python.org/cpython From victor.stinner at gmail.com Fri Nov 22 02:04:04 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 22 Nov 2013 02:04:04 +0100 Subject: [Python-checkins] peps: Update list of accepted but not yet implemented or merged peps. In-Reply-To: <3dQcGM6M5pz7M8F@mail.python.org> References: <3dQcGM6M5pz7M8F@mail.python.org> Message-ID: Hum, I don't think that regex module will enter Python 3.4 before this week-end, there is no PEP. For the "Introspection information for builtins", I think the PEP 436 has been accepted. The code has been merged, but the PEP status is still draft. Victor 2013/11/22 barry.warsaw : > http://hg.python.org/peps/rev/ea82fe7aa1ff > changeset: 5311:ea82fe7aa1ff > user: Barry Warsaw > date: Thu Nov 21 18:21:06 2013 -0500 > summary: > Update list of accepted but not yet implemented or merged peps. 
> > files: > pep-0429.txt | 14 +++++++++----- > 1 files changed, 9 insertions(+), 5 deletions(-) > > > diff --git a/pep-0429.txt b/pep-0429.txt > --- a/pep-0429.txt > +++ b/pep-0429.txt > @@ -74,6 +74,15 @@ > * PEP 446, make newly created file descriptors not inherited by default > * PEP 450, basic statistics module for the standard library > > +Accepted but not yet implemented/merged: > + > +* PEP 428, the pathlib module -- object-oriented filesystem paths > +* PEP 451, a ModuleSpec Type for the Import System > +* PEP 453, pip bootstrapping/bundling with CPython > +* PEP 454, the tracemalloc module for tracing Python memory allocations > +* PEP 3154, Pickle protocol revision 4 > +* PEP 3156, improved asynchronous IO support > + > Other final large-scale changes: > > * None so far > @@ -85,12 +94,7 @@ > * PEP 441, improved Python zip application support > * PEP 447, support for __locallookup__ metaclass method > * PEP 448, additional unpacking generalizations > -* PEP 451, making custom import hooks easier to implement correctly > -* PEP 453, pip bootstrapping/bundling with CPython > -* PEP 454, PEP 445 based malloc tracing support > * PEP 455, key transforming dictionary > -* PEP 3154, Pickle protocol revision 4 > -* PEP 3156, improved asynchronous IO support > > Other proposed large-scale changes: > > > -- > Repository URL: http://hg.python.org/peps > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > From python-checkins at python.org Fri Nov 22 13:39:51 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 22 Nov 2013 13:39:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319619=3A_Blacklis?= =?utf-8?q?t_non-text_codecs_in_method_API?= Message-ID: <3dQxzq4q67z7Lkg@mail.python.org> http://hg.python.org/cpython/rev/d68df99d7a57 changeset: 87338:d68df99d7a57 user: Nick Coghlan date: Fri Nov 22 22:39:36 2013 +1000 summary: Issue #19619: Blacklist non-text codecs in method API str.encode, bytes.decode and bytearray.decode now use an internal API to throw LookupError for known non-text encodings, rather than attempting the encoding or decoding operation and then throwing a TypeError for an unexpected output type. The latter mechanism remains in place for third party non-text encodings. files: Include/codecs.h | 27 +++ Lib/codecs.py | 14 +- Lib/encodings/base64_codec.py | 1 + Lib/encodings/bz2_codec.py | 1 + Lib/encodings/hex_codec.py | 1 + Lib/encodings/quopri_codec.py | 1 + Lib/encodings/rot_13.py | 1 + Lib/encodings/uu_codec.py | 1 + Lib/encodings/zlib_codec.py | 1 + Lib/test/test_codecs.py | 188 +++++++++++++-------- Misc/NEWS | 6 + Objects/unicodeobject.c | 4 +- Python/codecs.c | 138 ++++++++++++++- 13 files changed, 291 insertions(+), 93 deletions(-) diff --git a/Include/codecs.h b/Include/codecs.h --- a/Include/codecs.h +++ b/Include/codecs.h @@ -94,6 +94,33 @@ const char *errors ); +#ifndef PY_LIMITED_API +/* Text codec specific encoding and decoding API. + + Checks the encoding against a list of codecs which do not + implement a str<->bytes encoding before attempting the + operation. + + Please note that these APIs are internal and should not + be used in Python C extensions. 
+ + */ + +PyAPI_FUNC(PyObject *) _PyCodec_EncodeText( + PyObject *object, + const char *encoding, + const char *errors + ); + +PyAPI_FUNC(PyObject *) _PyCodec_DecodeText( + PyObject *object, + const char *encoding, + const char *errors + ); +#endif + + + /* --- Codec Lookup APIs -------------------------------------------------- All APIs return a codec object with incremented refcount and are diff --git a/Lib/codecs.py b/Lib/codecs.py --- a/Lib/codecs.py +++ b/Lib/codecs.py @@ -73,9 +73,19 @@ ### Codec base classes (defining the API) class CodecInfo(tuple): + """Codec details when looking up the codec registry""" + + # Private API to allow Python 3.4 to blacklist the known non-Unicode + # codecs in the standard library. A more general mechanism to + # reliably distinguish test encodings from other codecs will hopefully + # be defined for Python 3.5 + # + # See http://bugs.python.org/issue19619 + _is_text_encoding = True # Assume codecs are text encodings by default def __new__(cls, encode, decode, streamreader=None, streamwriter=None, - incrementalencoder=None, incrementaldecoder=None, name=None): + incrementalencoder=None, incrementaldecoder=None, name=None, + *, _is_text_encoding=None): self = tuple.__new__(cls, (encode, decode, streamreader, streamwriter)) self.name = name self.encode = encode @@ -84,6 +94,8 @@ self.incrementaldecoder = incrementaldecoder self.streamwriter = streamwriter self.streamreader = streamreader + if _is_text_encoding is not None: + self._is_text_encoding = _is_text_encoding return self def __repr__(self): diff --git a/Lib/encodings/base64_codec.py b/Lib/encodings/base64_codec.py --- a/Lib/encodings/base64_codec.py +++ b/Lib/encodings/base64_codec.py @@ -52,4 +52,5 @@ incrementaldecoder=IncrementalDecoder, streamwriter=StreamWriter, streamreader=StreamReader, + _is_text_encoding=False, ) diff --git a/Lib/encodings/bz2_codec.py b/Lib/encodings/bz2_codec.py --- a/Lib/encodings/bz2_codec.py +++ b/Lib/encodings/bz2_codec.py @@ -74,4 +74,5 @@ incrementaldecoder=IncrementalDecoder, streamwriter=StreamWriter, streamreader=StreamReader, + _is_text_encoding=False, ) diff --git a/Lib/encodings/hex_codec.py b/Lib/encodings/hex_codec.py --- a/Lib/encodings/hex_codec.py +++ b/Lib/encodings/hex_codec.py @@ -52,4 +52,5 @@ incrementaldecoder=IncrementalDecoder, streamwriter=StreamWriter, streamreader=StreamReader, + _is_text_encoding=False, ) diff --git a/Lib/encodings/quopri_codec.py b/Lib/encodings/quopri_codec.py --- a/Lib/encodings/quopri_codec.py +++ b/Lib/encodings/quopri_codec.py @@ -53,4 +53,5 @@ incrementaldecoder=IncrementalDecoder, streamwriter=StreamWriter, streamreader=StreamReader, + _is_text_encoding=False, ) diff --git a/Lib/encodings/rot_13.py b/Lib/encodings/rot_13.py --- a/Lib/encodings/rot_13.py +++ b/Lib/encodings/rot_13.py @@ -43,6 +43,7 @@ incrementaldecoder=IncrementalDecoder, streamwriter=StreamWriter, streamreader=StreamReader, + _is_text_encoding=False, ) ### Map diff --git a/Lib/encodings/uu_codec.py b/Lib/encodings/uu_codec.py --- a/Lib/encodings/uu_codec.py +++ b/Lib/encodings/uu_codec.py @@ -96,4 +96,5 @@ incrementaldecoder=IncrementalDecoder, streamreader=StreamReader, streamwriter=StreamWriter, + _is_text_encoding=False, ) diff --git a/Lib/encodings/zlib_codec.py b/Lib/encodings/zlib_codec.py --- a/Lib/encodings/zlib_codec.py +++ b/Lib/encodings/zlib_codec.py @@ -74,4 +74,5 @@ incrementaldecoder=IncrementalDecoder, streamreader=StreamReader, streamwriter=StreamWriter, + _is_text_encoding=False, ) diff --git a/Lib/test/test_codecs.py 
b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -6,6 +6,7 @@ import sys import unittest import warnings +import encodings from test import support @@ -2381,67 +2382,68 @@ view_decoded = codecs.decode(view, encoding) self.assertEqual(view_decoded, data) - def test_type_error_for_text_input(self): + def test_text_to_binary_blacklists_binary_transforms(self): # Check binary -> binary codecs give a good error for str input bad_input = "bad input type" for encoding in bytes_transform_encodings: with self.subTest(encoding=encoding): - msg = "^encoding with '{}' codec failed".format(encoding) - with self.assertRaisesRegex(TypeError, msg) as failure: + fmt = ( "{!r} is not a text encoding; " + "use codecs.encode\(\) to handle arbitrary codecs") + msg = fmt.format(encoding) + with self.assertRaisesRegex(LookupError, msg) as failure: bad_input.encode(encoding) - self.assertTrue(isinstance(failure.exception.__cause__, - TypeError)) + self.assertIsNone(failure.exception.__cause__) - def test_type_error_for_binary_input(self): - # Check str -> str codec gives a good error for binary input - for bad_input in (b"immutable", bytearray(b"mutable")): - with self.subTest(bad_input=bad_input): - msg = "^decoding with 'rot_13' codec failed" - with self.assertRaisesRegex(AttributeError, msg) as failure: - bad_input.decode("rot_13") - self.assertTrue(isinstance(failure.exception.__cause__, - AttributeError)) + def test_text_to_binary_blacklists_text_transforms(self): + # Check str.encode gives a good error message for str -> str codecs + msg = (r"^'rot_13' is not a text encoding; " + "use codecs.encode\(\) to handle arbitrary codecs") + with self.assertRaisesRegex(LookupError, msg): + "just an example message".encode("rot_13") - def test_custom_zlib_error_is_wrapped(self): - # Check zlib codec gives a good error for malformed input - msg = "^decoding with 'zlib_codec' codec failed" - with self.assertRaisesRegex(Exception, msg) as failure: - b"hello".decode("zlib_codec") - self.assertTrue(isinstance(failure.exception.__cause__, - type(failure.exception))) - - def test_custom_hex_error_is_wrapped(self): - # Check hex codec gives a good error for malformed input - msg = "^decoding with 'hex_codec' codec failed" - with self.assertRaisesRegex(Exception, msg) as failure: - b"hello".decode("hex_codec") - self.assertTrue(isinstance(failure.exception.__cause__, - type(failure.exception))) - - # Unfortunately, the bz2 module throws OSError, which the codec - # machinery currently can't wrap :( - - def test_bad_decoding_output_type(self): + def test_binary_to_text_blacklists_binary_transforms(self): # Check bytes.decode and bytearray.decode give a good error # message for binary -> binary codecs data = b"encode first to ensure we meet any format restrictions" for encoding in bytes_transform_encodings: with self.subTest(encoding=encoding): encoded_data = codecs.encode(data, encoding) - fmt = ("'{}' decoder returned 'bytes' instead of 'str'; " - "use codecs.decode\(\) to decode to arbitrary types") + fmt = (r"{!r} is not a text encoding; " + "use codecs.decode\(\) to handle arbitrary codecs") msg = fmt.format(encoding) - with self.assertRaisesRegex(TypeError, msg): + with self.assertRaisesRegex(LookupError, msg): encoded_data.decode(encoding) - with self.assertRaisesRegex(TypeError, msg): + with self.assertRaisesRegex(LookupError, msg): bytearray(encoded_data).decode(encoding) - def test_bad_encoding_output_type(self): - # Check str.encode gives a good error message for str -> str 
codecs - msg = ("'rot_13' encoder returned 'str' instead of 'bytes'; " - "use codecs.encode\(\) to encode to arbitrary types") - with self.assertRaisesRegex(TypeError, msg): - "just an example message".encode("rot_13") + def test_binary_to_text_blacklists_text_transforms(self): + # Check str -> str codec gives a good error for binary input + for bad_input in (b"immutable", bytearray(b"mutable")): + with self.subTest(bad_input=bad_input): + msg = (r"^'rot_13' is not a text encoding; " + "use codecs.decode\(\) to handle arbitrary codecs") + with self.assertRaisesRegex(LookupError, msg) as failure: + bad_input.decode("rot_13") + self.assertIsNone(failure.exception.__cause__) + + def test_custom_zlib_error_is_wrapped(self): + # Check zlib codec gives a good error for malformed input + msg = "^decoding with 'zlib_codec' codec failed" + with self.assertRaisesRegex(Exception, msg) as failure: + codecs.decode(b"hello", "zlib_codec") + self.assertIsInstance(failure.exception.__cause__, + type(failure.exception)) + + def test_custom_hex_error_is_wrapped(self): + # Check hex codec gives a good error for malformed input + msg = "^decoding with 'hex_codec' codec failed" + with self.assertRaisesRegex(Exception, msg) as failure: + codecs.decode(b"hello", "hex_codec") + self.assertIsInstance(failure.exception.__cause__, + type(failure.exception)) + + # Unfortunately, the bz2 module throws OSError, which the codec + # machinery currently can't wrap :( # The codec system tries to wrap exceptions in order to ensure the error @@ -2466,27 +2468,44 @@ # case finishes by using the test case repr as the codec name # The codecs module normalizes codec names, although this doesn't # appear to be formally documented... - self.codec_name = repr(self).lower().replace(" ", "-") + # We also make sure we use a truly unique id for the custom codec + # to avoid issues with the codec cache when running these tests + # multiple times (e.g. when hunting for refleaks) + unique_id = repr(self) + str(id(self)) + self.codec_name = encodings.normalize_encoding(unique_id).lower() + + # We store the object to raise on the instance because of a bad + # interaction between the codec caching (which means we can't + # recreate the codec entry) and regrtest refleak hunting (which + # runs the same test instance multiple times). 
This means we + # need to ensure the codecs call back in to the instance to find + # out which exception to raise rather than binding them in a + # closure to an object that may change on the next run + self.obj_to_raise = RuntimeError def tearDown(self): _TEST_CODECS.pop(self.codec_name, None) - def set_codec(self, obj_to_raise): - def raise_obj(*args, **kwds): - raise obj_to_raise - codec_info = codecs.CodecInfo(raise_obj, raise_obj, + def set_codec(self, encode, decode): + codec_info = codecs.CodecInfo(encode, decode, name=self.codec_name) _TEST_CODECS[self.codec_name] = codec_info @contextlib.contextmanager def assertWrapped(self, operation, exc_type, msg): - full_msg = "{} with '{}' codec failed \({}: {}\)".format( + full_msg = r"{} with {!r} codec failed \({}: {}\)".format( operation, self.codec_name, exc_type.__name__, msg) with self.assertRaisesRegex(exc_type, full_msg) as caught: yield caught + self.assertIsInstance(caught.exception.__cause__, exc_type) + + def raise_obj(self, *args, **kwds): + # Helper to dynamically change the object raised by a test codec + raise self.obj_to_raise def check_wrapped(self, obj_to_raise, msg, exc_type=RuntimeError): - self.set_codec(obj_to_raise) + self.obj_to_raise = obj_to_raise + self.set_codec(self.raise_obj, self.raise_obj) with self.assertWrapped("encoding", exc_type, msg): "str_input".encode(self.codec_name) with self.assertWrapped("encoding", exc_type, msg): @@ -2515,23 +2534,17 @@ pass self.check_wrapped(MyRuntimeError(msg), msg, MyRuntimeError) - @contextlib.contextmanager - def assertNotWrapped(self, operation, exc_type, msg_re, msg=None): - if msg is None: - msg = msg_re - with self.assertRaisesRegex(exc_type, msg) as caught: - yield caught - self.assertEqual(str(caught.exception), msg) - - def check_not_wrapped(self, obj_to_raise, msg_re, msg=None): - self.set_codec(obj_to_raise) - with self.assertNotWrapped("encoding", RuntimeError, msg_re, msg): + def check_not_wrapped(self, obj_to_raise, msg): + def raise_obj(*args, **kwds): + raise obj_to_raise + self.set_codec(raise_obj, raise_obj) + with self.assertRaisesRegex(RuntimeError, msg): "str input".encode(self.codec_name) - with self.assertNotWrapped("encoding", RuntimeError, msg_re, msg): + with self.assertRaisesRegex(RuntimeError, msg): codecs.encode("str input", self.codec_name) - with self.assertNotWrapped("decoding", RuntimeError, msg_re, msg): + with self.assertRaisesRegex(RuntimeError, msg): b"bytes input".decode(self.codec_name) - with self.assertNotWrapped("decoding", RuntimeError, msg_re, msg): + with self.assertRaisesRegex(RuntimeError, msg): codecs.decode(b"bytes input", self.codec_name) def test_init_override_is_not_wrapped(self): @@ -2550,29 +2563,56 @@ msg = "This should NOT be wrapped" exc = RuntimeError(msg) exc.attr = 1 - self.check_not_wrapped(exc, msg) + self.check_not_wrapped(exc, "^{}$".format(msg)) def test_non_str_arg_is_not_wrapped(self): self.check_not_wrapped(RuntimeError(1), "1") def test_multiple_args_is_not_wrapped(self): - msg_re = "\('a', 'b', 'c'\)" - msg = "('a', 'b', 'c')" - self.check_not_wrapped(RuntimeError('a', 'b', 'c'), msg_re, msg) + msg_re = r"^\('a', 'b', 'c'\)$" + self.check_not_wrapped(RuntimeError('a', 'b', 'c'), msg_re) # http://bugs.python.org/issue19609 def test_codec_lookup_failure_not_wrapped(self): - msg = "unknown encoding: %s" % self.codec_name + msg = "^unknown encoding: {}$".format(self.codec_name) # The initial codec lookup should not be wrapped - with self.assertNotWrapped("encoding", LookupError, msg): + with 
self.assertRaisesRegex(LookupError, msg): "str input".encode(self.codec_name) - with self.assertNotWrapped("encoding", LookupError, msg): + with self.assertRaisesRegex(LookupError, msg): codecs.encode("str input", self.codec_name) - with self.assertNotWrapped("decoding", LookupError, msg): + with self.assertRaisesRegex(LookupError, msg): b"bytes input".decode(self.codec_name) - with self.assertNotWrapped("decoding", LookupError, msg): + with self.assertRaisesRegex(LookupError, msg): codecs.decode(b"bytes input", self.codec_name) + def test_unflagged_non_text_codec_handling(self): + # The stdlib non-text codecs are now marked so they're + # pre-emptively skipped by the text model related methods + # However, third party codecs won't be flagged, so we still make + # sure the case where an inappropriate output type is produced is + # handled appropriately + def encode_to_str(*args, **kwds): + return "not bytes!", 0 + def decode_to_bytes(*args, **kwds): + return b"not str!", 0 + self.set_codec(encode_to_str, decode_to_bytes) + # No input or output type checks on the codecs module functions + encoded = codecs.encode(None, self.codec_name) + self.assertEqual(encoded, "not bytes!") + decoded = codecs.decode(None, self.codec_name) + self.assertEqual(decoded, b"not str!") + # Text model methods should complain + fmt = (r"^{!r} encoder returned 'str' instead of 'bytes'; " + "use codecs.encode\(\) to encode to arbitrary types$") + msg = fmt.format(self.codec_name) + with self.assertRaisesRegex(TypeError, msg): + "str_input".encode(self.codec_name) + fmt = (r"^{!r} decoder returned 'bytes' instead of 'str'; " + "use codecs.decode\(\) to decode to arbitrary types$") + msg = fmt.format(self.codec_name) + with self.assertRaisesRegex(TypeError, msg): + b"bytes input".decode(self.codec_name) + @unittest.skipUnless(sys.platform == 'win32', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,12 @@ Core and Builtins ----------------- +- Issue #19619: str.encode, bytes.decode and bytearray.decode now use an + internal API to throw LookupError for known non-text encodings, rather + than attempting the encoding or decoding operation and then throwing a + TypeError for an unexpected output type. (The latter mechanism remains + in place for third party non-text encodings) + - Issue #19183: Implement PEP 456 'secure and interchangeable hash algorithm'. Python now uses SipHash24 on all major platforms. diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -3044,7 +3044,7 @@ buffer = PyMemoryView_FromBuffer(&info); if (buffer == NULL) goto onError; - unicode = PyCodec_Decode(buffer, encoding, errors); + unicode = _PyCodec_DecodeText(buffer, encoding, errors); if (unicode == NULL) goto onError; if (!PyUnicode_Check(unicode)) { @@ -3410,7 +3410,7 @@ } /* Encode via the codec registry */ - v = PyCodec_Encode(unicode, encoding, errors); + v = _PyCodec_EncodeText(unicode, encoding, errors); if (v == NULL) return NULL; diff --git a/Python/codecs.c b/Python/codecs.c --- a/Python/codecs.c +++ b/Python/codecs.c @@ -353,18 +353,15 @@ errors is passed to the encoder factory as argument if non-NULL. 
*/ -PyObject *PyCodec_Encode(PyObject *object, - const char *encoding, - const char *errors) +static PyObject * +_PyCodec_EncodeInternal(PyObject *object, + PyObject *encoder, + const char *encoding, + const char *errors) { - PyObject *encoder = NULL; PyObject *args = NULL, *result = NULL; PyObject *v = NULL; - encoder = PyCodec_Encoder(encoding); - if (encoder == NULL) - goto onError; - args = args_tuple(object, errors); if (args == NULL) goto onError; @@ -402,18 +399,15 @@ errors is passed to the decoder factory as argument if non-NULL. */ -PyObject *PyCodec_Decode(PyObject *object, - const char *encoding, - const char *errors) +static PyObject * +_PyCodec_DecodeInternal(PyObject *object, + PyObject *decoder, + const char *encoding, + const char *errors) { - PyObject *decoder = NULL; PyObject *args = NULL, *result = NULL; PyObject *v; - decoder = PyCodec_Decoder(encoding); - if (decoder == NULL) - goto onError; - args = args_tuple(object, errors); if (args == NULL) goto onError; @@ -445,6 +439,118 @@ return NULL; } +/* Generic encoding/decoding API */ +PyObject *PyCodec_Encode(PyObject *object, + const char *encoding, + const char *errors) +{ + PyObject *encoder; + + encoder = PyCodec_Encoder(encoding); + if (encoder == NULL) + return NULL; + + return _PyCodec_EncodeInternal(object, encoder, encoding, errors); +} + +PyObject *PyCodec_Decode(PyObject *object, + const char *encoding, + const char *errors) +{ + PyObject *decoder; + + decoder = PyCodec_Decoder(encoding); + if (decoder == NULL) + return NULL; + + return _PyCodec_DecodeInternal(object, decoder, encoding, errors); +} + +/* Text encoding/decoding API */ +static +PyObject *codec_getitem_checked(const char *encoding, + const char *operation_name, + int index) +{ + _Py_IDENTIFIER(_is_text_encoding); + PyObject *codec; + PyObject *attr; + PyObject *v; + int is_text_codec; + + codec = _PyCodec_Lookup(encoding); + if (codec == NULL) + return NULL; + + /* Backwards compatibility: assume any raw tuple describes a text + * encoding, and the same for anything lacking the private + * attribute. 
+ */ + if (!PyTuple_CheckExact(codec)) { + attr = _PyObject_GetAttrId(codec, &PyId__is_text_encoding); + if (attr == NULL) { + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { + PyErr_Clear(); + } else { + Py_DECREF(codec); + return NULL; + } + } else { + is_text_codec = PyObject_IsTrue(attr); + Py_DECREF(attr); + if (!is_text_codec) { + Py_DECREF(codec); + PyErr_Format(PyExc_LookupError, + "'%.400s' is not a text encoding; " + "use codecs.%s() to handle arbitrary codecs", + encoding, operation_name); + return NULL; + } + } + } + + v = PyTuple_GET_ITEM(codec, index); + Py_DECREF(codec); + Py_INCREF(v); + return v; +} + +static PyObject * _PyCodec_TextEncoder(const char *encoding) +{ + return codec_getitem_checked(encoding, "encode", 0); +} + +static PyObject * _PyCodec_TextDecoder(const char *encoding) +{ + return codec_getitem_checked(encoding, "decode", 1); +} + +PyObject *_PyCodec_EncodeText(PyObject *object, + const char *encoding, + const char *errors) +{ + PyObject *encoder; + + encoder = _PyCodec_TextEncoder(encoding); + if (encoder == NULL) + return NULL; + + return _PyCodec_EncodeInternal(object, encoder, encoding, errors); +} + +PyObject *_PyCodec_DecodeText(PyObject *object, + const char *encoding, + const char *errors) +{ + PyObject *decoder; + + decoder = _PyCodec_TextDecoder(encoding); + if (decoder == NULL) + return NULL; + + return _PyCodec_DecodeInternal(object, decoder, encoding, errors); +} + /* Register the error handling callback function error under the name name. This function will be called by the codec when it encounters an unencodable characters/undecodable bytes and doesn't know the -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 14:00:54 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 22 Nov 2013 14:00:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319619=3A_Update_W?= =?utf-8?q?hat=27s_New_for_codec_blacklist?= Message-ID: <3dQyS60dN6z7LkC@mail.python.org> http://hg.python.org/cpython/rev/1e3b8601b098 changeset: 87339:1e3b8601b098 user: Nick Coghlan date: Fri Nov 22 23:00:22 2013 +1000 summary: Issue #19619: Update What's New for codec blacklist files: Doc/whatsnew/3.4.rst | 23 ++++++++++++----------- 1 files changed, 12 insertions(+), 11 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -194,37 +194,37 @@ encodings (in Python 3) or ``basestring`` <-> ``basestring`` conversions (in Python 2). 
-In Python 3.4, the errors raised by the convenience methods when a codec -produces the incorrect output type have also been updated to direct users -towards these general purpose convenience functions:: +In Python 3.4, the interpreter is able to identify the known non-text +encodings provided in the standard library and direct users towards these +general purpose convenience functions when appropriate:: >>> import codecs - >>> codecs.encode(b"hello", "bz2_codec").decode("bz2_codec") + >>> b"abcdef".decode("hex_codec") Traceback (most recent call last): File "", line 1, in - TypeError: 'bz2_codec' decoder returned 'bytes' instead of 'str'; use codecs.decode() to decode to arbitrary types + LookupError: 'hex_codec' is not a text encoding; use codecs.decode() to handle arbitrary codecs >>> "hello".encode("rot_13") Traceback (most recent call last): File "", line 1, in - TypeError: 'rot_13' encoder returned 'str' instead of 'bytes'; use codecs.encode() to encode to arbitrary types + LookupError: 'rot_13' is not a text encoding; use codecs.encode() to handle arbitrary codecs In a related change, whenever it is feasible without breaking backwards compatibility, exceptions raised during encoding and decoding operations will be wrapped in a chained exception of the same type that mentions the name of the codec responsible for producing the error:: - >>> b"hello".decode("uu_codec") - ValueError: Missing "begin" line in input data + >>> codecs.decode(b"abcdefgh", "hex_codec") + binascii.Error: Non-hexadecimal digit found The above exception was the direct cause of the following exception: Traceback (most recent call last): File "", line 1, in - ValueError: decoding with 'uu_codec' codec failed (ValueError: Missing "begin" line in input data) + binascii.Error: decoding with 'hex_codec' codec failed (Error: Non-hexadecimal digit found) - >>> "hello".encode("bz2_codec") + >>> codecs.encode("hello", "bz2_codec") TypeError: 'str' does not support the buffer interface The above exception was the direct cause of the following exception: @@ -233,7 +233,8 @@ File "", line 1, in TypeError: encoding with 'bz2_codec' codec failed (TypeError: 'str' does not support the buffer interface) -(Contributed by Nick Coghlan in :issue:`17827` and :issue:`17828`) +(Contributed by Nick Coghlan in :issue:`17827`, :issue:`17828` and +:issue:`19619`) Other Language Changes -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 14:33:00 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 22 Nov 2013 14:33:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_bundled_pip_to_1=2E?= =?utf-8?q?5rc1?= Message-ID: <3dQz981SFCz7LkF@mail.python.org> http://hg.python.org/cpython/rev/4051f2dcd99b changeset: 87340:4051f2dcd99b user: Nick Coghlan date: Fri Nov 22 23:32:24 2013 +1000 summary: Update bundled pip to 1.5rc1 files: Lib/ensurepip/__init__.py | 2 +- Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl | Bin Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl | Bin 3 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -12,7 +12,7 @@ _SETUPTOOLS_VERSION = "1.3.2" -_PIP_VERSION = "1.5.dev1" +_PIP_VERSION = "1.5.rc1" _PROJECTS = [ ("setuptools", _SETUPTOOLS_VERSION), diff --git a/Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-1.5.dev1-py2.py3-none-any.whl deleted file mode 100644 index 
65e354860e048066407bb142f03c419cfbae3d52..0000000000000000000000000000000000000000 GIT binary patch [stripped] diff --git a/Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl new file mode 100644 index 0000000000000000000000000000000000000000..6418895a582a1a9243a73d79a8b73a5140d46e78 GIT binary patch [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 15:34:28 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 22 Nov 2013 15:34:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319552=3A_venv_and?= =?utf-8?q?_pyvenv_ensurepip_integration?= Message-ID: <3dR0X45TfTz7Ll5@mail.python.org> http://hg.python.org/cpython/rev/57fbab22ab4e changeset: 87341:57fbab22ab4e user: Nick Coghlan date: Sat Nov 23 00:30:34 2013 +1000 summary: Close #19552: venv and pyvenv ensurepip integration files: Doc/library/venv.rst | 15 ++++++- Doc/using/venv-create.inc | 10 ++++- Lib/test/test_venv.py | 54 ++++++++++++++++++++++++-- Lib/venv/__init__.py | 34 ++++++++++++++-- Misc/NEWS | 5 ++ 5 files changed, 105 insertions(+), 13 deletions(-) diff --git a/Doc/library/venv.rst b/Doc/library/venv.rst --- a/Doc/library/venv.rst +++ b/Doc/library/venv.rst @@ -85,7 +85,8 @@ mechanisms for third-party virtual environment creators to customize environment creation according to their needs, the :class:`EnvBuilder` class. -.. class:: EnvBuilder(system_site_packages=False, clear=False, symlinks=False, upgrade=False) +.. class:: EnvBuilder(system_site_packages=False, clear=False, \ + symlinks=False, upgrade=False, with_pip=False) The :class:`EnvBuilder` class accepts the following keyword arguments on instantiation: @@ -105,6 +106,12 @@ environment with the running Python - for use when that Python has been upgraded in-place (defaults to ``False``). + * ``with_pip`` -- a Boolean value which, if True, ensures pip is + installed in the virtual environment + + .. versionchanged:: 3.4 + Added the ``with_pip`` parameter + Creators of third-party virtual environment tools will be free to use the provided ``EnvBuilder`` class as a base class. @@ -201,11 +208,15 @@ There is also a module-level convenience function: -.. function:: create(env_dir, system_site_packages=False, clear=False, symlinks=False) +.. function:: create(env_dir, system_site_packages=False, clear=False, \ + symlinks=False, with_pip=False) Create an :class:`EnvBuilder` with the given keyword arguments, and call its :meth:`~EnvBuilder.create` method with the *env_dir* argument. + .. versionchanged:: 3.4 + Added the ``with_pip`` parameter + An example of extending ``EnvBuilder`` -------------------------------------- diff --git a/Doc/using/venv-create.inc b/Doc/using/venv-create.inc --- a/Doc/using/venv-create.inc +++ b/Doc/using/venv-create.inc @@ -25,7 +25,7 @@ The command, if run with ``-h``, will show the available options:: usage: pyvenv [-h] [--system-site-packages] [--symlinks] [--clear] - [--upgrade] ENV_DIR [ENV_DIR ...] + [--upgrade] [--without-pip] ENV_DIR [ENV_DIR ...] Creates virtual Python environments in one or more target directories. @@ -43,6 +43,11 @@ raised. --upgrade Upgrade the environment directory to use this version of Python, assuming Python has been upgraded in-place. + --without-pip Skips installing or upgrading pip in the virtual + environment (pip is bootstrapped by default) + +.. 
versionchanged:: 3.4 + Installs pip by default, added the ``--without-pip`` option If the target directory already exists an error will be raised, unless the ``--clear`` or ``--upgrade`` option was provided. @@ -51,6 +56,9 @@ ``include-system-site-packages`` key, set to ``true`` if ``venv`` is run with the ``--system-site-packages`` option, ``false`` otherwise. +Unless the ``--without-pip`` option is given, :mod:`ensurepip` will be +invoked to bootstrap ``pip`` into the virtual environment. + Multiple paths can be given to ``pyvenv``, in which case an identical virtualenv will be created, according to the given options, at each provided path. diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -16,6 +16,10 @@ import unittest import venv +skipInVenv = unittest.skipIf(sys.prefix != sys.base_prefix, + 'Test not appropriate in a venv') + + class BaseTest(unittest.TestCase): """Base class for venv tests.""" @@ -83,8 +87,7 @@ print(' %r' % os.listdir(bd)) self.assertTrue(os.path.exists(fn), 'File %r should exist.' % fn) - @unittest.skipIf(sys.prefix != sys.base_prefix, 'Test not appropriate ' - 'in a venv') + @skipInVenv def test_prefixes(self): """ Test that the prefix values are as expected. @@ -217,8 +220,7 @@ # run the test, the pyvenv.cfg in the venv created in the test will # point to the venv being used to run the test, and we lose the link # to the source build - so Python can't initialise properly. - @unittest.skipIf(sys.prefix != sys.base_prefix, 'Test not appropriate ' - 'in a venv') + @skipInVenv def test_executable(self): """ Test that the sys.executable value is as expected. @@ -247,8 +249,50 @@ out, err = p.communicate() self.assertEqual(out.strip(), envpy.encode()) + + at skipInVenv +class EnsurePipTest(BaseTest): + """Test venv module installation of pip.""" + + def test_no_pip_by_default(self): + shutil.rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir) + envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) + try_import = 'try:\n import pip\nexcept ImportError:\n print("OK")' + cmd = [envpy, '-c', try_import] + p = subprocess.Popen(cmd, stdout=subprocess.PIPE, + stderr=subprocess.PIPE) + out, err = p.communicate() + self.assertEqual(err, b"") + self.assertEqual(out.strip(), b"OK") + + def test_explicit_no_pip(self): + shutil.rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir, with_pip=False) + envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) + try_import = 'try:\n import pip\nexcept ImportError:\n print("OK")' + cmd = [envpy, '-c', try_import] + p = subprocess.Popen(cmd, stdout=subprocess.PIPE, + stderr=subprocess.PIPE) + out, err = p.communicate() + self.assertEqual(err, b"") + self.assertEqual(out.strip(), b"OK") + + def test_with_pip(self): + shutil.rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir, with_pip=True) + envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) + cmd = [envpy, '-m', 'pip', '--version'] + p = subprocess.Popen(cmd, stdout=subprocess.PIPE, + stderr=subprocess.PIPE) + out, err = p.communicate() + self.assertEqual(err, b"") + self.assertTrue(out.startswith(b"pip")) + self.assertIn(self.env_dir.encode(), out) + + def test_main(): - run_unittest(BasicTest) + run_unittest(BasicTest, EnsurePipTest) if __name__ == "__main__": test_main() diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -24,10 +24,13 
@@ raised. --upgrade Upgrade the environment directory to use this version of Python, assuming Python has been upgraded in-place. + --without-pip Skips installing or upgrading pip in the virtual + environment (pip is bootstrapped by default) """ import logging import os import shutil +import subprocess import sys import sysconfig import types @@ -56,14 +59,17 @@ :param symlinks: If True, attempt to symlink rather than copy files into virtual environment. :param upgrade: If True, upgrade an existing virtual environment. + :param with_pip: If True, ensure pip is installed in the virtual + environment """ def __init__(self, system_site_packages=False, clear=False, - symlinks=False, upgrade=False): + symlinks=False, upgrade=False, with_pip=False): self.system_site_packages = system_site_packages self.clear = clear self.symlinks = symlinks self.upgrade = upgrade + self.with_pip = with_pip def create(self, env_dir): """ @@ -76,6 +82,8 @@ context = self.ensure_directories(env_dir) self.create_configuration(context) self.setup_python(context) + if self.with_pip: + self._setup_pip(context) if not self.upgrade: self.setup_scripts(context) self.post_setup(context) @@ -224,6 +232,12 @@ shutil.copyfile(src, dst) break + def _setup_pip(self, context): + """Installs or upgrades pip in a virtual environment""" + cmd = [context.env_exe, '-m', 'ensurepip', '--upgrade', + '--default-pip'] + subprocess.check_output(cmd) + def setup_scripts(self, context): """ Set up scripts into the created environment from a directory. @@ -317,7 +331,8 @@ shutil.copymode(srcfile, dstfile) -def create(env_dir, system_site_packages=False, clear=False, symlinks=False): +def create(env_dir, system_site_packages=False, clear=False, + symlinks=False, with_pip=False): """ Create a virtual environment in a directory. @@ -333,9 +348,11 @@ raised. :param symlinks: If True, attempt to symlink rather than copy files into virtual environment. + :param with_pip: If True, ensure pip is installed in the virtual + environment """ builder = EnvBuilder(system_site_packages=system_site_packages, - clear=clear, symlinks=symlinks) + clear=clear, symlinks=symlinks, with_pip=with_pip) builder.create(env_dir) def main(args=None): @@ -390,12 +407,19 @@ 'directory to use this version ' 'of Python, assuming Python ' 'has been upgraded in-place.') + parser.add_argument('--without-pip', dest='with_pip', + default=True, action='store_false', + help='Skips installing or upgrading pip in the ' + 'virtual environment (pip is bootstrapped ' + 'by default)') options = parser.parse_args(args) if options.upgrade and options.clear: raise ValueError('you cannot supply --upgrade and --clear together.') builder = EnvBuilder(system_site_packages=options.system_site, - clear=options.clear, symlinks=options.symlinks, - upgrade=options.upgrade) + clear=options.clear, + symlinks=options.symlinks, + upgrade=options.upgrade, + with_pip=options.with_pip) for d in options.dirs: builder.create(d) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -65,6 +65,8 @@ Library ------- +- Issue #19552: venv now supports bootstrapping pip into virtual environments + - Issue #17134: Finalize interface to Windows' certificate store. Cert and CRL enumeration are now two functions. enum_certificates() also returns purpose flags as set of OIDs. 
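A rough usage sketch of the new ``with_pip`` option described in the entry above (the target directories are made up for illustration, and bootstrapping pip relies on the wheels bundled with ``ensurepip``)::

    import venv

    # Module-level helper: create the environment and bootstrap pip into
    # it via ensurepip.
    venv.create("/tmp/demo-venv", with_pip=True)

    # The same thing through the builder class.
    builder = venv.EnvBuilder(clear=True, symlinks=False, with_pip=True)
    builder.create("/tmp/demo-venv2")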
@@ -378,6 +380,9 @@ Tools/Demos ----------- +- Issue #19552: pyvenv now bootstraps pip into virtual environments by + default (pass --without-pip to request the old behaviour) + - Issue #19390: Argument Clinic no longer accepts malformed Python and C ids. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 15:57:24 2013 From: python-checkins at python.org (nick.coghlan) Date: Fri, 22 Nov 2013 15:57:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2317916=3A_dis=2EBy?= =?utf-8?q?tecode_based_replacement_for_distb?= Message-ID: <3dR12X2Y36z7Lm2@mail.python.org> http://hg.python.org/cpython/rev/d71251d9fbbe changeset: 87342:d71251d9fbbe user: Nick Coghlan date: Sat Nov 23 00:57:00 2013 +1000 summary: Close #17916: dis.Bytecode based replacement for distb - Bytecode.from_traceback() alternate constructor - current_offset parameter and attribute Patch by Claudiu Popa files: Doc/library/dis.rst | 12 +++++- Doc/whatsnew/3.4.rst | 3 +- Lib/dis.py | 17 +++++++- Lib/test/test_dis.py | 66 ++++++++++++++++++++++++++++++++ Misc/NEWS | 4 + 5 files changed, 98 insertions(+), 4 deletions(-) diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -44,7 +44,7 @@ :class:`Bytecode` object that provides easy access to details of the compiled code. -.. class:: Bytecode(x, *, first_line=None) +.. class:: Bytecode(x, *, first_line=None, current_offset=None) Analyse the bytecode corresponding to a function, method, string of source code, or a code object (as returned by :func:`compile`). @@ -59,6 +59,16 @@ Otherwise, the source line information (if any) is taken directly from the disassembled code object. + If *current_offset* is not None, it refers to an instruction offset + in the disassembled code. Setting this means :meth:`dis` will display + a "current instruction" marker against the specified opcode. + + .. classmethod:: from_traceback(tb) + + Construct a :class:`Bytecode` instance from the given traceback, + setting *current_offset* to the instruction responsible for the + exception. + .. data:: codeobj The compiled code object. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -385,7 +385,8 @@ inspecting bytecode, both in human-readable form and for iterating over instructions. -(Contributed by Nick Coghlan, Ryan Kelly and Thomas Kluyver in :issue:`11816`) +(Contributed by Nick Coghlan, Ryan Kelly and Thomas Kluyver in :issue:`11816` +and Claudiu Popa in :issue:`17916`) doctest diff --git a/Lib/dis.py b/Lib/dis.py --- a/Lib/dis.py +++ b/Lib/dis.py @@ -406,7 +406,7 @@ Iterating over this yields the bytecode operations as Instruction instances. 
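A hedged sketch of the traceback support this changeset adds (the failing function below is invented purely for illustration)::

    import dis

    def broken():
        return 1 / 0

    try:
        broken()
    except ZeroDivisionError as exc:
        tb = exc.__traceback__

    bc = dis.Bytecode.from_traceback(tb)
    print(bc.current_offset)  # offset of the instruction that raised
    print(bc.dis())           # disassembly with a --> marker at that offset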
""" - def __init__(self, x, *, first_line=None): + def __init__(self, x, *, first_line=None, current_offset=None): self.codeobj = co = _get_code_object(x) if first_line is None: self.first_line = co.co_firstlineno @@ -417,6 +417,7 @@ self._cell_names = co.co_cellvars + co.co_freevars self._linestarts = dict(findlinestarts(co)) self._original_object = x + self.current_offset = current_offset def __iter__(self): co = self.codeobj @@ -429,6 +430,13 @@ return "{}({!r})".format(self.__class__.__name__, self._original_object) + @classmethod + def from_traceback(cls, tb): + """ Construct a Bytecode from the given traceback """ + while tb.tb_next: + tb = tb.tb_next + return cls(tb.tb_frame.f_code, current_offset=tb.tb_lasti) + def info(self): """Return formatted information about the code object.""" return _format_code_info(self.codeobj) @@ -436,13 +444,18 @@ def dis(self): """Return a formatted view of the bytecode operations.""" co = self.codeobj + if self.current_offset is not None: + offset = self.current_offset + else: + offset = -1 with io.StringIO() as output: _disassemble_bytes(co.co_code, varnames=co.co_varnames, names=co.co_names, constants=co.co_consts, cells=self._cell_names, linestarts=self._linestarts, line_offset=self._line_offset, - file=output) + file=output, + lasti=offset) return output.getvalue() diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -10,6 +10,21 @@ import types import contextlib +def get_tb(): + def _error(): + try: + 1 / 0 + except Exception as e: + tb = e.__traceback__ + return tb + + tb = _error() + while tb.tb_next: + tb = tb.tb_next + return tb + +TRACEBACK_CODE = get_tb().tb_frame.f_code + class _C: def __init__(self, x): self.x = x == 1 @@ -174,6 +189,46 @@ 25 RETURN_VALUE """ +dis_traceback = """\ + %-4d 0 SETUP_EXCEPT 12 (to 15) + + %-4d 3 LOAD_CONST 1 (1) + 6 LOAD_CONST 2 (0) + --> 9 BINARY_TRUE_DIVIDE + 10 POP_TOP + 11 POP_BLOCK + 12 JUMP_FORWARD 46 (to 61) + + %-4d >> 15 DUP_TOP + 16 LOAD_GLOBAL 0 (Exception) + 19 COMPARE_OP 10 (exception match) + 22 POP_JUMP_IF_FALSE 60 + 25 POP_TOP + 26 STORE_FAST 0 (e) + 29 POP_TOP + 30 SETUP_FINALLY 14 (to 47) + + %-4d 33 LOAD_FAST 0 (e) + 36 LOAD_ATTR 1 (__traceback__) + 39 STORE_FAST 1 (tb) + 42 POP_BLOCK + 43 POP_EXCEPT + 44 LOAD_CONST 0 (None) + >> 47 LOAD_CONST 0 (None) + 50 STORE_FAST 0 (e) + 53 DELETE_FAST 0 (e) + 56 END_FINALLY + 57 JUMP_FORWARD 1 (to 61) + >> 60 END_FINALLY + + %-4d >> 61 LOAD_FAST 1 (tb) + 64 RETURN_VALUE +""" % (TRACEBACK_CODE.co_firstlineno + 1, + TRACEBACK_CODE.co_firstlineno + 2, + TRACEBACK_CODE.co_firstlineno + 3, + TRACEBACK_CODE.co_firstlineno + 4, + TRACEBACK_CODE.co_firstlineno + 5) + class DisTests(unittest.TestCase): def get_disassembly(self, func, lasti=-1, wrapper=True): @@ -758,6 +813,17 @@ actual = dis.Bytecode(_f).dis() self.assertEqual(actual, dis_f) + def test_from_traceback(self): + tb = get_tb() + b = dis.Bytecode.from_traceback(tb) + while tb.tb_next: tb = tb.tb_next + + self.assertEqual(b.current_offset, tb.tb_lasti) + + def test_from_traceback_dis(self): + tb = get_tb() + b = dis.Bytecode.from_traceback(tb) + self.assertEqual(b.dis(), dis_traceback) def test_main(): run_unittest(DisTests, DisWithFileTests, CodeInfoTests, diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -65,6 +65,10 @@ Library ------- +- Issue #17916: Added dis.Bytecode.from_traceback() and + dis.Bytecode.current_offset to easily display "current instruction" + markers in the new disassembly API (Patch by Claudiu 
Popa). + - Issue #19552: venv now supports bootstrapping pip into virtual environments - Issue #17134: Finalize interface to Windows' certificate store. Cert and -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 16:06:40 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 16:06:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?devguide=3A_Remove_mention_of_the_old?= =?utf-8?q?_LaTeX_markup?= Message-ID: <3dR1FD4XcCz7Llq@mail.python.org> http://hg.python.org/devguide/rev/33f58b469a4d changeset: 653:33f58b469a4d user: Antoine Pitrou date: Fri Nov 22 16:06:36 2013 +0100 summary: Remove mention of the old LaTeX markup files: documenting.rst | 201 ------------------------------------ 1 files changed, 0 insertions(+), 201 deletions(-) diff --git a/documenting.rst b/documenting.rst --- a/documenting.rst +++ b/documenting.rst @@ -1437,207 +1437,6 @@ file is not found. This one only emits a warning. -Differences to the LaTeX markup -=============================== - -Though the markup language is different, most of the concepts and markup types -of the old LaTeX docs have been kept -- environments as reST directives, inline -commands as reST roles and so forth. - -However, there are some differences in the way these work, partly due to the -differences in the markup languages, partly due to improvements in Sphinx. This -section lists these differences, in order to give those familiar with the old -format a quick overview of what they might run into. - -Inline markup -------------- - -These changes have been made to inline markup: - -* **Cross-reference roles** - - Most of the following semantic roles existed previously as inline commands, - but didn't do anything except formatting the content as code. Now, they - cross-reference to known targets (some names have also been shortened): - - | *mod* (previously *refmodule* or *module*) - | *func* (previously *function*) - | *data* (new) - | *const* - | *class* - | *meth* (previously *method*) - | *attr* (previously *member*) - | *exc* (previously *exception*) - | *cdata* - | *cfunc* (previously *cfunction*) - | *cmacro* (previously *csimplemacro*) - | *ctype* - - Also different is the handling of *func* and *meth*: while previously - parentheses were added to the callable name (like ``\func{str()}``), they are - now appended by the build system -- appending them in the source will result - in double parentheses. This also means that ``:func:`str(object)``` will not - work as expected -- use ````str(object)```` instead! - -* **Inline commands implemented as directives** - - These were inline commands in LaTeX, but are now directives in reST: - - | *deprecated* - | *versionadded* - | *versionchanged* - - These are used like so:: - - .. deprecated:: 2.5 - Reason of deprecation. - - Also, no period is appended to the text for *versionadded* and - *versionchanged*. - - | *note* - | *warning* - - These are used like so:: - - .. note:: - - Content of note. - -* **Otherwise changed commands** - - The *samp* command previously formatted code and added quotation marks around - it. 
The *samp* role, however, features a new highlighting system just like - *file* does: - - ``:samp:`open({filename}, {mode})``` results in :samp:`open({filename}, {mode})` - -* **Dropped commands** - - These were commands in LaTeX, but are not available as roles: - - | *bfcode* - | *character* (use :samp:`\`\`'c'\`\``) - | *citetitle* (use ```Title `_``) - | *code* (use ````code````) - | *email* (just write the address in body text) - | *filenq* - | *filevar* (use the ``{...}`` highlighting feature of *file*) - | *programopt*, *longprogramopt* (use *option*) - | *ulink* (use ```Title `_``) - | *url* (just write the URL in body text) - | *var* (use ``*var*``) - | *infinity*, *plusminus* (use the Unicode character) - | *shortversion*, *version* (use the ``|version|`` and ``|release|`` substitutions) - | *emph*, *strong* (use the reST markup) - -* **Backslash escaping** - - In reST, a backslash must be escaped in normal text, and in the content of - roles. However, in code literals and literal blocks, it must not be escaped. - Example: ``:file:`C:\\Temp\\my.tmp``` vs. ````open("C:\Temp\my.tmp")````. - - -Information units ------------------ - -Information units (*...desc* environments) have been made reST directives. -These changes to information units should be noted: - -* **New names** - - "desc" has been removed from every name. Additionally, these directives have - new names: - - | *cfunction* (previously *cfuncdesc*) - | *cmacro* (previously *csimplemacrodesc*) - | *exception* (previously *excdesc*) - | *function* (previously *funcdesc*) - | *attribute* (previously *memberdesc*) - - The *classdesc\** and *excclassdesc* environments have been dropped, the - *class* and *exception* directives support classes documented with and without - constructor arguments. - -* **Multiple objects** - - The equivalent of the *...line* commands is:: - - .. function:: do_foo(bar) - do_bar(baz) - - Description of the functions. - - IOW, just give one signatures per line, at the same indentation level. - -* **Arguments** - - There is no *optional* command. Just give function signatures like they - should appear in the output:: - - .. function:: open(filename[, mode[, buffering]]) - - Description. - - Note: markup in the signature is not supported. - -* **Indexing** - - The *...descni* environments have been dropped. To mark an information unit - as unsuitable for index entry generation, use the *noindex* option like so:: - - .. function:: foo_* - :noindex: - - Description. - -* **New information units** - - There are new generic information units: One is called "describe" and can be - used to document things that are not covered by the other units:: - - .. describe:: a == b - - The equals operator. - - The others are:: - - .. cmdoption:: -O - - Describes a command-line option. - - .. envvar:: PYTHONINSPECT - - Describes an environment variable. - - -Structure ---------- - -The LaTeX docs were split in several toplevel manuals. Now, all files are part -of the same documentation tree, as indicated by the *toctree* directives in the -sources (though individual output formats may choose to split them up into parts -again). Every *toctree* directive embeds other files as subdocuments of the -current file (this structure is not necessarily mirrored in the filesystem -layout). The toplevel file is :file:`contents.rst`. 
- -However, most of the old directory structure has been kept, with the -directories renamed as follows: - -* :file:`api` -> :file:`c-api` -* :file:`dist` -> :file:`distutils`, with the single TeX file split up -* :file:`doc` -> :file:`documenting` -* :file:`ext` -> :file:`extending` -* :file:`inst` -> :file:`installing` -* :file:`lib` -> :file:`library` -* :file:`mac` -> merged into :file:`library`, with :file:`mac/using.tex` - moved to :file:`using/mac.rst` -* :file:`ref` -> :file:`reference` -* :file:`tut` -> :file:`tutorial`, with the single TeX file split up - - -.. XXX more (index-generating, production lists, ...) - .. _building-doc: Building the documentation -- Repository URL: http://hg.python.org/devguide From python-checkins at python.org Fri Nov 22 16:14:03 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 16:14:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317134=3A_check_ce?= =?utf-8?q?rts_of_CA_and_ROOT_system_store?= Message-ID: <3dR1Pl4VjBz7Lmm@mail.python.org> http://hg.python.org/cpython/rev/de65df13ed50 changeset: 87343:de65df13ed50 user: Christian Heimes date: Fri Nov 22 16:13:55 2013 +0100 summary: Issue #17134: check certs of CA and ROOT system store files: Lib/test/test_ssl.py | 27 ++++++++++++++------------- 1 files changed, 14 insertions(+), 13 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -536,21 +536,22 @@ self.assertRaises(TypeError, ssl.enum_certificates) self.assertRaises(WindowsError, ssl.enum_certificates, "") - names = set() - ca = ssl.enum_certificates("CA") - self.assertIsInstance(ca, list) - for element in ca: - self.assertIsInstance(element, tuple) - self.assertEqual(len(element), 3) - cert, enc, trust = element - self.assertIsInstance(cert, bytes) - self.assertIn(enc, {"x509_asn", "pkcs_7_asn"}) - self.assertIsInstance(trust, (set, bool)) - if isinstance(trust, set): - names.update(trust) + trust_oids = set() + for storename in ("CA", "ROOT"): + store = ssl.enum_certificates(storename) + self.assertIsInstance(store, list) + for element in store: + self.assertIsInstance(element, tuple) + self.assertEqual(len(element), 3) + cert, enc, trust = element + self.assertIsInstance(cert, bytes) + self.assertIn(enc, {"x509_asn", "pkcs_7_asn"}) + self.assertIsInstance(trust, (set, bool)) + if isinstance(trust, set): + trust_oids.update(trust) serverAuth = "1.3.6.1.5.5.7.3.1" - self.assertIn(serverAuth, names) + self.assertIn(serverAuth, trust_oids) @unittest.skipUnless(sys.platform == "win32", "Windows specific") def test_enum_crls(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 16:21:04 2013 From: python-checkins at python.org (christian.heimes) Date: Fri, 22 Nov 2013 16:21:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319448=3A_report_n?= =?utf-8?q?ame_/_NID_in_exception_message_of_ASN1Object?= Message-ID: <3dR1Yr3bY5z7Lml@mail.python.org> http://hg.python.org/cpython/rev/7d914d4b05fe changeset: 87344:7d914d4b05fe user: Christian Heimes date: Fri Nov 22 16:20:53 2013 +0100 summary: Issue #19448: report name / NID in exception message of ASN1Object files: Lib/test/test_ssl.py | 6 ++++-- Modules/_ssl.c | 6 +++--- 2 files changed, 7 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -585,7 +585,8 @@ self.assertEqual(val, expected) self.assertIsInstance(val, 
ssl._ASN1Object) self.assertRaises(ValueError, ssl._ASN1Object.fromnid, -1) - self.assertRaises(ValueError, ssl._ASN1Object.fromnid, 100000) + with self.assertRaisesRegex(ValueError, "unknown NID 100000"): + ssl._ASN1Object.fromnid(100000) for i in range(1000): try: obj = ssl._ASN1Object.fromnid(i) @@ -603,7 +604,8 @@ self.assertEqual(ssl._ASN1Object.fromname('serverAuth'), expected) self.assertEqual(ssl._ASN1Object.fromname('1.3.6.1.5.5.7.3.1'), expected) - self.assertRaises(ValueError, ssl._ASN1Object.fromname, 'serverauth') + with self.assertRaisesRegex(ValueError, "unknown object 'serverauth'"): + ssl._ASN1Object.fromname('serverauth') class ContextTests(unittest.TestCase): diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3387,7 +3387,7 @@ } obj = OBJ_txt2obj(txt, name ? 0 : 1); if (obj == NULL) { - PyErr_Format(PyExc_ValueError, "Unknown object"); + PyErr_Format(PyExc_ValueError, "unknown object '%.100s'", txt); return NULL; } result = asn1obj2py(obj); @@ -3411,12 +3411,12 @@ return NULL; } if (nid < NID_undef) { - PyErr_Format(PyExc_ValueError, "NID must be positive."); + PyErr_SetString(PyExc_ValueError, "NID must be positive."); return NULL; } obj = OBJ_nid2obj(nid); if (obj == NULL) { - PyErr_Format(PyExc_ValueError, "Unknown NID"); + PyErr_Format(PyExc_ValueError, "unknown NID %i", nid); return NULL; } result = asn1obj2py(obj); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:08:39 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 17:08:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_A_fix_for_issue_19555_on_W?= =?utf-8?q?indows=2E?= Message-ID: <3dR2cl14pjz7Lkp@mail.python.org> http://hg.python.org/cpython/rev/331b7a8bb830 changeset: 87345:331b7a8bb830 parent: 87341:57fbab22ab4e user: Barry Warsaw date: Fri Nov 22 11:08:05 2013 -0500 summary: A fix for issue 19555 on Windows. files: Lib/sysconfig.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py --- a/Lib/sysconfig.py +++ b/Lib/sysconfig.py @@ -409,10 +409,6 @@ # _sysconfigdata is generated at build time, see _generate_posix_vars() from _sysconfigdata import build_time_vars vars.update(build_time_vars) - # For backward compatibility, see issue19555 - SO = build_time_vars.get('EXT_SUFFIX') - if SO is not None: - vars['SO'] = SO def _init_non_posix(vars): """Initialize the module as appropriate for NT""" @@ -540,6 +536,10 @@ _init_non_posix(_CONFIG_VARS) if os.name == 'posix': _init_posix(_CONFIG_VARS) + # For backward compatibility, see issue19555 + SO = _CONFIG_VARS.get('EXT_SUFFIX') + if SO is not None: + _CONFIG_VARS['SO'] = SO # Setting 'userbase' is done below the call to the # init function to enable using 'get_config_var' in # the init-function. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:08:40 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 17:08:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?b?KTogVHJ1bmsgbWVyZ2Uu?= Message-ID: <3dR2cm4dDRz7LmM@mail.python.org> http://hg.python.org/cpython/rev/364b29d1c79d changeset: 87346:364b29d1c79d parent: 87345:331b7a8bb830 parent: 87344:7d914d4b05fe user: Barry Warsaw date: Fri Nov 22 11:08:25 2013 -0500 summary: Trunk merge. 
files: Doc/library/dis.rst | 12 +++++- Doc/whatsnew/3.4.rst | 3 +- Lib/dis.py | 17 +++++++- Lib/test/test_dis.py | 66 ++++++++++++++++++++++++++++++++ Lib/test/test_ssl.py | 33 ++++++++------- Misc/NEWS | 4 + Modules/_ssl.c | 6 +- 7 files changed, 119 insertions(+), 22 deletions(-) diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -44,7 +44,7 @@ :class:`Bytecode` object that provides easy access to details of the compiled code. -.. class:: Bytecode(x, *, first_line=None) +.. class:: Bytecode(x, *, first_line=None, current_offset=None) Analyse the bytecode corresponding to a function, method, string of source code, or a code object (as returned by :func:`compile`). @@ -59,6 +59,16 @@ Otherwise, the source line information (if any) is taken directly from the disassembled code object. + If *current_offset* is not None, it refers to an instruction offset + in the disassembled code. Setting this means :meth:`dis` will display + a "current instruction" marker against the specified opcode. + + .. classmethod:: from_traceback(tb) + + Construct a :class:`Bytecode` instance from the given traceback, + setting *current_offset* to the instruction responsible for the + exception. + .. data:: codeobj The compiled code object. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -385,7 +385,8 @@ inspecting bytecode, both in human-readable form and for iterating over instructions. -(Contributed by Nick Coghlan, Ryan Kelly and Thomas Kluyver in :issue:`11816`) +(Contributed by Nick Coghlan, Ryan Kelly and Thomas Kluyver in :issue:`11816` +and Claudiu Popa in :issue:`17916`) doctest diff --git a/Lib/dis.py b/Lib/dis.py --- a/Lib/dis.py +++ b/Lib/dis.py @@ -406,7 +406,7 @@ Iterating over this yields the bytecode operations as Instruction instances. 
""" - def __init__(self, x, *, first_line=None): + def __init__(self, x, *, first_line=None, current_offset=None): self.codeobj = co = _get_code_object(x) if first_line is None: self.first_line = co.co_firstlineno @@ -417,6 +417,7 @@ self._cell_names = co.co_cellvars + co.co_freevars self._linestarts = dict(findlinestarts(co)) self._original_object = x + self.current_offset = current_offset def __iter__(self): co = self.codeobj @@ -429,6 +430,13 @@ return "{}({!r})".format(self.__class__.__name__, self._original_object) + @classmethod + def from_traceback(cls, tb): + """ Construct a Bytecode from the given traceback """ + while tb.tb_next: + tb = tb.tb_next + return cls(tb.tb_frame.f_code, current_offset=tb.tb_lasti) + def info(self): """Return formatted information about the code object.""" return _format_code_info(self.codeobj) @@ -436,13 +444,18 @@ def dis(self): """Return a formatted view of the bytecode operations.""" co = self.codeobj + if self.current_offset is not None: + offset = self.current_offset + else: + offset = -1 with io.StringIO() as output: _disassemble_bytes(co.co_code, varnames=co.co_varnames, names=co.co_names, constants=co.co_consts, cells=self._cell_names, linestarts=self._linestarts, line_offset=self._line_offset, - file=output) + file=output, + lasti=offset) return output.getvalue() diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -10,6 +10,21 @@ import types import contextlib +def get_tb(): + def _error(): + try: + 1 / 0 + except Exception as e: + tb = e.__traceback__ + return tb + + tb = _error() + while tb.tb_next: + tb = tb.tb_next + return tb + +TRACEBACK_CODE = get_tb().tb_frame.f_code + class _C: def __init__(self, x): self.x = x == 1 @@ -174,6 +189,46 @@ 25 RETURN_VALUE """ +dis_traceback = """\ + %-4d 0 SETUP_EXCEPT 12 (to 15) + + %-4d 3 LOAD_CONST 1 (1) + 6 LOAD_CONST 2 (0) + --> 9 BINARY_TRUE_DIVIDE + 10 POP_TOP + 11 POP_BLOCK + 12 JUMP_FORWARD 46 (to 61) + + %-4d >> 15 DUP_TOP + 16 LOAD_GLOBAL 0 (Exception) + 19 COMPARE_OP 10 (exception match) + 22 POP_JUMP_IF_FALSE 60 + 25 POP_TOP + 26 STORE_FAST 0 (e) + 29 POP_TOP + 30 SETUP_FINALLY 14 (to 47) + + %-4d 33 LOAD_FAST 0 (e) + 36 LOAD_ATTR 1 (__traceback__) + 39 STORE_FAST 1 (tb) + 42 POP_BLOCK + 43 POP_EXCEPT + 44 LOAD_CONST 0 (None) + >> 47 LOAD_CONST 0 (None) + 50 STORE_FAST 0 (e) + 53 DELETE_FAST 0 (e) + 56 END_FINALLY + 57 JUMP_FORWARD 1 (to 61) + >> 60 END_FINALLY + + %-4d >> 61 LOAD_FAST 1 (tb) + 64 RETURN_VALUE +""" % (TRACEBACK_CODE.co_firstlineno + 1, + TRACEBACK_CODE.co_firstlineno + 2, + TRACEBACK_CODE.co_firstlineno + 3, + TRACEBACK_CODE.co_firstlineno + 4, + TRACEBACK_CODE.co_firstlineno + 5) + class DisTests(unittest.TestCase): def get_disassembly(self, func, lasti=-1, wrapper=True): @@ -758,6 +813,17 @@ actual = dis.Bytecode(_f).dis() self.assertEqual(actual, dis_f) + def test_from_traceback(self): + tb = get_tb() + b = dis.Bytecode.from_traceback(tb) + while tb.tb_next: tb = tb.tb_next + + self.assertEqual(b.current_offset, tb.tb_lasti) + + def test_from_traceback_dis(self): + tb = get_tb() + b = dis.Bytecode.from_traceback(tb) + self.assertEqual(b.dis(), dis_traceback) def test_main(): run_unittest(DisTests, DisWithFileTests, CodeInfoTests, diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -536,21 +536,22 @@ self.assertRaises(TypeError, ssl.enum_certificates) self.assertRaises(WindowsError, ssl.enum_certificates, "") - names = set() - ca = 
ssl.enum_certificates("CA") - self.assertIsInstance(ca, list) - for element in ca: - self.assertIsInstance(element, tuple) - self.assertEqual(len(element), 3) - cert, enc, trust = element - self.assertIsInstance(cert, bytes) - self.assertIn(enc, {"x509_asn", "pkcs_7_asn"}) - self.assertIsInstance(trust, (set, bool)) - if isinstance(trust, set): - names.update(trust) + trust_oids = set() + for storename in ("CA", "ROOT"): + store = ssl.enum_certificates(storename) + self.assertIsInstance(store, list) + for element in store: + self.assertIsInstance(element, tuple) + self.assertEqual(len(element), 3) + cert, enc, trust = element + self.assertIsInstance(cert, bytes) + self.assertIn(enc, {"x509_asn", "pkcs_7_asn"}) + self.assertIsInstance(trust, (set, bool)) + if isinstance(trust, set): + trust_oids.update(trust) serverAuth = "1.3.6.1.5.5.7.3.1" - self.assertIn(serverAuth, names) + self.assertIn(serverAuth, trust_oids) @unittest.skipUnless(sys.platform == "win32", "Windows specific") def test_enum_crls(self): @@ -584,7 +585,8 @@ self.assertEqual(val, expected) self.assertIsInstance(val, ssl._ASN1Object) self.assertRaises(ValueError, ssl._ASN1Object.fromnid, -1) - self.assertRaises(ValueError, ssl._ASN1Object.fromnid, 100000) + with self.assertRaisesRegex(ValueError, "unknown NID 100000"): + ssl._ASN1Object.fromnid(100000) for i in range(1000): try: obj = ssl._ASN1Object.fromnid(i) @@ -602,7 +604,8 @@ self.assertEqual(ssl._ASN1Object.fromname('serverAuth'), expected) self.assertEqual(ssl._ASN1Object.fromname('1.3.6.1.5.5.7.3.1'), expected) - self.assertRaises(ValueError, ssl._ASN1Object.fromname, 'serverauth') + with self.assertRaisesRegex(ValueError, "unknown object 'serverauth'"): + ssl._ASN1Object.fromname('serverauth') class ContextTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -65,6 +65,10 @@ Library ------- +- Issue #17916: Added dis.Bytecode.from_traceback() and + dis.Bytecode.current_offset to easily display "current instruction" + markers in the new disassembly API (Patch by Claudiu Popa). + - Issue #19552: venv now supports bootstrapping pip into virtual environments - Issue #17134: Finalize interface to Windows' certificate store. Cert and diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -3387,7 +3387,7 @@ } obj = OBJ_txt2obj(txt, name ? 0 : 1); if (obj == NULL) { - PyErr_Format(PyExc_ValueError, "Unknown object"); + PyErr_Format(PyExc_ValueError, "unknown object '%.100s'", txt); return NULL; } result = asn1obj2py(obj); @@ -3411,12 +3411,12 @@ return NULL; } if (nid < NID_undef) { - PyErr_Format(PyExc_ValueError, "NID must be positive."); + PyErr_SetString(PyExc_ValueError, "NID must be positive."); return NULL; } obj = OBJ_nid2obj(nid); if (obj == NULL) { - PyErr_Format(PyExc_ValueError, "Unknown NID"); + PyErr_Format(PyExc_ValueError, "unknown NID %i", nid); return NULL; } result = asn1obj2py(obj); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:17:12 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 22 Nov 2013 17:17:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Implement_PEP_451_=28Modul?= =?utf-8?q?eSpec=29=2E?= Message-ID: <3dR2pc4Twcz7Lkp@mail.python.org> http://hg.python.org/cpython/rev/07229c6104b1 changeset: 87347:07229c6104b1 user: Eric Snow date: Fri Nov 22 09:05:39 2013 -0700 summary: Implement PEP 451 (ModuleSpec). 
files: Doc/library/importlib.rst | 144 +- Doc/reference/import.rst | 429 +- Doc/whatsnew/3.4.rst | 20 + Lib/imp.py | 30 +- Lib/importlib/__init__.py | 64 +- Lib/importlib/_bootstrap.py | 1107 +- Lib/importlib/abc.py | 102 +- Lib/importlib/machinery.py | 1 + Lib/importlib/util.py | 65 +- Lib/multiprocessing/spawn.py | 11 +- Lib/pkgutil.py | 11 +- Lib/pydoc.py | 3 +- Lib/test/test_descr.py | 2 +- Lib/test/test_frozen.py | 77 - Lib/test/test_import.py | 14 +- Lib/test/test_importlib/abc.py | 5 - Lib/test/test_importlib/builtin/test_finder.py | 59 +- Lib/test/test_importlib/builtin/test_loader.py | 78 +- Lib/test/test_importlib/extension/test_case_sensitivity.py | 2 + Lib/test/test_importlib/extension/test_finder.py | 20 +- Lib/test/test_importlib/extension/test_loader.py | 81 +- Lib/test/test_importlib/frozen/test_finder.py | 45 +- Lib/test/test_importlib/frozen/test_loader.py | 79 +- Lib/test/test_importlib/import_/test_meta_path.py | 4 +- Lib/test/test_importlib/test_abc.py | 197 - Lib/test/test_importlib/test_api.py | 136 +- Lib/test/test_importlib/test_spec.py | 968 + Lib/test/test_importlib/test_util.py | 65 - Lib/test/test_module.py | 12 +- Lib/test/test_namespace_pkgs.py | 20 - Lib/test/test_pkg.py | 20 +- Lib/test/test_pkgutil.py | 16 +- Lib/test/test_reprlib.py | 1 + Lib/test/test_runpy.py | 5 +- Objects/moduleobject.c | 53 +- Python/import.c | 16 +- Python/importlib.h | 7415 +++++---- 37 files changed, 6981 insertions(+), 4396 deletions(-) diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -310,11 +310,11 @@ from the import. If the loader inserted a module and the load fails, it must be removed by the loader from :data:`sys.modules`; modules already in :data:`sys.modules` before the loader began execution should be left - alone (see :func:`importlib.util.module_to_load`). + alone (see :func:`importlib.util.module_for_loader`). The loader should set several attributes on the module. (Note that some of these attributes can change when a module is - reloaded; see :meth:`init_module_attrs`): + reloaded): - :attr:`__name__` The name of the module. @@ -357,17 +357,6 @@ .. versionchanged:: 3.4 Made optional instead of an abstractmethod. - .. method:: init_module_attrs(module) - - Set the :attr:`__loader__` attribute on the module. - - Subclasses overriding this method should set whatever appropriate - attributes it can, getting the module's name from :attr:`__name__` when - needed. All values should also be overridden so that reloading works as - expected. - - .. versionadded:: 3.4 - .. class:: ResourceLoader @@ -442,14 +431,6 @@ .. versionadded:: 3.4 - .. method:: init_module_attrs(module) - - Set the :attr:`__package__` attribute and :attr:`__path__` attribute to - the empty list if appropriate along with what - :meth:`importlib.abc.Loader.init_module_attrs` sets. - - .. versionadded:: 3.4 - .. method:: load_module(fullname) Implementation of :meth:`Loader.load_module`. @@ -474,15 +455,6 @@ .. versionchanged:: 3.4 Raises :exc:`ImportError` instead of :exc:`NotImplementedError`. - .. method:: init_module_attrs(module) - - Set :attr:`__file__` and if initializing a package then set - :attr:`__path__` to ``[os.path.dirname(__file__)]`` along with - all attributes set by - :meth:`importlib.abc.InspectLoader.init_module_attrs`. - - .. versionadded:: 3.4 - .. 
class:: FileLoader(fullname, path) @@ -599,14 +571,6 @@ ``__init__`` when the file extension is removed **and** the module name itself does not end in ``__init__``. - .. method:: init_module_attr(module) - - Set :attr:`__cached__` using :func:`imp.cache_from_source`. Other - attributes set by - :meth:`importlib.abc.ExecutionLoader.init_module_attrs`. - - .. versionadded:: 3.4 - :mod:`importlib.machinery` -- Importers and path hooks ------------------------------------------------------ @@ -882,6 +846,64 @@ .. versionadded:: 3.4 +.. class:: ModuleSpec(name, loader, *, origin=None, loader_state=None, is_package=None) + + A specification for a module's import-system-related state. + + .. versionadded:: 3.4 + + .. attribute:: name + + (``__name__``) + + A string for the fully-qualified name of the module. + + .. attribute:: loader + + (``__loader__``) + + The loader to use for loading. For namespace packages this should be + set to None. + + .. attribute:: origin + + (``__file__``) + + Name of the place from which the module is loaded, e.g. "builtin" for + built-in modules and the filename for modules loaded from source. + Normally "origin" should be set, but it may be None (the default) + which indicates it is unspecified. + + .. attribute:: submodule_search_locations + + (``__path__``) + + List of strings for where to find submodules, if a package (None + otherwise). + + .. attribute:: loader_state + + Container of extra module-specific data for use during loading (or + None). + + .. attribute:: cached + + (``__cached__``) + + String for where the compiled module should be stored (or None). + + .. attribute:: parent + + (``__package__``) + + (Read-only) Fully-qualified name of the package to which the module + belongs as a submodule (or None). + + .. attribute:: has_location + + (Read-only) Boolean indicating whether or not the module's "origin" + attribute refers to a loadable location. + :mod:`importlib.util` -- Utility code for importers --------------------------------------------------- @@ -952,20 +974,6 @@ .. versionadded:: 3.3 -.. function:: module_to_load(name, *, reset_name=True) - - Returns a :term:`context manager` which provides the module to load. The - module will either come from :attr:`sys.modules` in the case of reloading or - a fresh module if loading a new module. Proper cleanup of - :attr:`sys.modules` occurs if the module was new and an exception was - raised. - - If **reset_name** is true and the module requested is being reloaded then - the module's :attr:`__name__` attribute will - be reset to **name**, else it will be left untouched. - - .. versionadded:: 3.4 - .. decorator:: module_for_loader A :term:`decorator` for :meth:`importlib.abc.Loader.load_module` @@ -999,9 +1007,8 @@ unconditionally to support reloading. .. deprecated:: 3.4 - For the benefit of :term:`loader` subclasses, please use - :func:`module_to_load` and - :meth:`importlib.abc.Loader.init_module_attrs` instead. + The import machinery now directly performs all the functionality + provided by this function. .. decorator:: set_loader @@ -1012,11 +1019,6 @@ the wrapped method (i.e. ``self``) is what :attr:`__loader__` should be set to. - .. note:: - As this decorator sets :attr:`__loader__` after loading the module, it is - recommended to use :meth:`importlib.abc.Loader.init_module_attrs` instead - when appropriate. - .. versionchanged:: 3.4 Set ``__loader__`` if set to ``None``, as if the attribute does not exist. 
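To make the spec factory functions added just below concrete, a small sketch (the file path is hypothetical; the file does not need to exist merely to build the spec)::

    from importlib.util import spec_from_file_location

    spec = spec_from_file_location("demo", "/tmp/demo.py")
    print(spec.name)                        # 'demo'
    print(spec.origin)                      # '/tmp/demo.py'
    print(type(spec.loader).__name__)       # 'SourceFileLoader'
    print(spec.has_location)                # True
    print(spec.submodule_search_locations)  # None -- not a package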
@@ -1026,7 +1028,21 @@ A :term:`decorator` for :meth:`importlib.abc.Loader.load_module` to set the :attr:`__package__` attribute on the returned module. If :attr:`__package__` is set and has a value other than ``None`` it will not be changed. - .. note:: - As this decorator sets :attr:`__package__` after loading the module, it is - recommended to use :meth:`importlib.abc.Loader.init_module_attrs` instead - when appropriate. +.. function:: spec_from_loader(name, loader, *, origin=None, is_package=None) + + A factory function for creating a :class:`ModuleSpec` instance based + on a loader. The parameters have the same meaning as they do for + ModuleSpec. The function uses available :term:`loader` APIs, such as + :meth:`InspectLoader.is_package`, to fill in any missing + information on the spec. + + .. versionadded:: 3.4 + +.. function:: spec_from_file_location(name, location, *, loader=None, submodule_search_locations=None) + + A factory function for creating a :class:`ModuleSpec` instance based + on the path to a file. Missing information will be filled in on the + spec by making use of loader APIs and by the implication that the + module will be file-based. + + .. versionadded:: 3.4 diff --git a/Doc/reference/import.rst b/Doc/reference/import.rst --- a/Doc/reference/import.rst +++ b/Doc/reference/import.rst @@ -210,6 +210,7 @@ .. index:: single: finder single: loader + single: module spec If the named module is not found in :data:`sys.modules`, then Python's import protocol is invoked to find and load the module. This protocol consists of @@ -230,13 +231,17 @@ range and scope of module searching. Finders do not actually load modules. If they can find the named module, they -return a :term:`loader`, which the import machinery then invokes to load the -module and create the corresponding module object. +return a :term:`module spec`, an encapsulation of the module's import-related +information, which the import machinery then uses when loading the module. The following sections describe the protocol for finders and loaders in more detail, including how you can create and register new ones to extend the import machinery. +.. versionchanged:: 3.4 + In previous versions of Python, finders returned :term:`loaders ` + directly, whereas now they return module specs which *contain* loaders. + Loaders are still used during import but have fewer responsibilities. Import hooks ------------ @@ -270,24 +275,23 @@ .. index:: single: sys.meta_path - pair: finder; find_module - pair: finder; find_loader + pair: finder; find_spec When the named module is not found in :data:`sys.modules`, Python next searches :data:`sys.meta_path`, which contains a list of meta path finder objects. These finders are queried in order to see if they know how to handle the named module. Meta path finders must implement a method called -:meth:`find_module()` which takes two arguments, a name and an import path. +:meth:`find_spec()` which takes two arguments, a name and an import path. The meta path finder can use any strategy it wants to determine whether it can handle the named module or not. If the meta path finder knows how to handle the named module, it returns a -loader object. If it cannot handle the named module, it returns ``None``. If +spec object. If it cannot handle the named module, it returns ``None``. If :data:`sys.meta_path` processing reaches the end of its list without returning -a loader, then an :exc:`ImportError` is raised. Any other exceptions raised +a spec, then an :exc:`ImportError` is raised. 
Any other exceptions raised are simply propagated up, aborting the import process. -The :meth:`find_module()` method of meta path finders is called with two +The :meth:`find_spec()` method of meta path finders is called with two arguments. The first is the fully qualified name of the module being imported, for example ``foo.bar.baz``. The second argument is the path entries to use for the module search. For top-level modules, the second @@ -299,12 +303,12 @@ The meta path may be traversed multiple times for a single import request. For example, assuming none of the modules involved has already been cached, importing ``foo.bar.baz`` will first perform a top level import, calling -``mpf.find_module("foo", None)`` on each meta path finder (``mpf``). After +``mpf.find_spec("foo", None)`` on each meta path finder (``mpf``). After ``foo`` has been imported, ``foo.bar`` will be imported by traversing the meta path a second time, calling -``mpf.find_module("foo.bar", foo.__path__)``. Once ``foo.bar`` has been +``mpf.find_spec("foo.bar", foo.__path__)``. Once ``foo.bar`` has been imported, the final traversal will call -``mpf.find_module("foo.bar.baz", foo.bar.__path__)``. +``mpf.find_spec("foo.bar.baz", foo.bar.__path__)``. Some meta path finders only support top level imports. These importers will always return ``None`` when anything other than ``None`` is passed as the @@ -315,131 +319,229 @@ modules, and one that knows how to import modules from an :term:`import path` (i.e. the :term:`path based finder`). +.. versionchanged:: 3.4 + The find_spec() method of meta path finders replaced :meth:`find_module()`. + which is now deprecated. While it will continue to work without change, + the import machinery will try it only if the finder does not implement + find_spec(). -Loaders + +Loading ======= -If and when a module loader is found its -:meth:`~importlib.abc.Loader.load_module` method is called, with a single -argument, the fully qualified name of the module being imported. This method -has several responsibilities, and should return the module object it has -loaded [#fnlo]_. If it cannot load the module, it should raise an -:exc:`ImportError`, although any other exception raised during -:meth:`load_module()` will be propagated. +If and when a module spec is found, the import machinery will use it (and +the loader it contains) when loading the module. Here is an approximation +of what happens during the loading portion of import:: -In many cases, the finder and loader can be the same object; in such cases the -:meth:`finder.find_module()` would just return ``self``. 
+ module = None + if spec.loader is not None and hasattr(spec.loader, 'create_module'): + module = spec.loader.create_module(spec) + if module is None: + module = ModuleType(spec.name) + # The import-related module attributes get set here: + _init_module_attrs(spec, module) -Loaders must satisfy the following requirements: + if spec.loader is None: + if spec.submodule_search_locations is not None: + # namespace package + sys.modules[spec.name] = module + else: + # unsupported + raise ImportError + elif not hasattr(spec.loader, 'exec_module'): + module = spec.loader.load_module(spec.name) + else: + sys.modules[spec.name] = module + try: + spec.loader.exec_module(module) + except BaseException: + try: + del sys.modules[spec.name] + except KeyError: + pass + raise + module_to_return = sys.modules[spec.name] + +Note the following details: * If there is an existing module object with the given name in - :data:`sys.modules`, the loader must use that existing module. (Otherwise, - :func:`imp.reload` will not work correctly.) If the named module does - not exist in :data:`sys.modules`, the loader must create a new module - object and add it to :data:`sys.modules`. + :data:`sys.modules`, import will have already returned it. - Note that the module *must* exist in :data:`sys.modules` before the loader + * The module will exist in :data:`sys.modules` before the loader executes the module code. This is crucial because the module code may (directly or indirectly) import itself; adding it to :data:`sys.modules` beforehand prevents unbounded recursion in the worst case and multiple loading in the best. - If loading fails, the loader must remove any modules it has inserted into - :data:`sys.modules`, but it must remove **only** the failing module, and - only if the loader itself has loaded it explicitly. Any module already in - the :data:`sys.modules` cache, and any module that was successfully loaded - as a side-effect, must remain in the cache. + * If loading fails, the failing module -- and only the failing module -- + gets removed from :data:`sys.modules`. Any module already in the + :data:`sys.modules` cache, and any module that was successfully loaded + as a side-effect, must remain in the cache. This contrasts with + reloading where even the failing module is left in :data:`sys.modules`. - * The loader may set the ``__file__`` attribute of the module. If set, this - attribute's value must be a string. The loader may opt to leave - ``__file__`` unset if it has no semantic meaning (e.g. a module loaded from - a database). If ``__file__`` is set, it may also be appropriate to set the - ``__cached__`` attribute which is the path to any compiled version of the - code (e.g. byte-compiled file). The file does not need to exist to set this - attribute; the path can simply point to whether the compiled file would - exist (see :pep:`3147`). + * After the module is created but before execution, the import machinery + sets the import-related module attributes ("init_module_attrs"), as + summarized in a `later section `_. - * The loader may set the ``__name__`` attribute of the module. While not - required, setting this attribute is highly recommended so that the - :meth:`repr()` of the module is more informative. + * Module execution is the key moment of loading in which the module's + namespace gets populated. Execution is entirely delegated to the + loader, which gets to decide what gets populated and how. 
- * If the module is a package (either regular or namespace), the loader must - set the module object's ``__path__`` attribute. The value must be - iterable, but may be empty if ``__path__`` has no further significance - to the loader. If ``__path__`` is not empty, it must produce strings - when iterated over. More details on the semantics of ``__path__`` are - given :ref:`below `. + * The module created during loading and passed to exec_module() may + not be the one returned at the end of import [#fnlo]_. - * The ``__loader__`` attribute must be set to the loader object that loaded - the module. This is mostly for introspection and reloading, but can be - used for additional loader-specific functionality, for example getting - data associated with a loader. If the attribute is missing or set to ``None`` - then the import machinery will automatically set it **after** the module has - been imported. +.. versionchanged:: 3.4 + The import system has taken over the boilerplate responsibilities of + loaders. These were previously performed by the :meth:`load_module()` + method. - * The module's ``__package__`` attribute must be set. Its value must be a - string, but it can be the same value as its ``__name__``. If the attribute - is set to ``None`` or is missing, the import system will fill it in with a - more appropriate value **after** the module has been imported. - When the module is a package, its ``__package__`` value should be set to its - ``__name__``. When the module is not a package, ``__package__`` should be - set to the empty string for top-level modules, or for submodules, to the - parent package's name. See :pep:`366` for further details. +Loaders +------- - This attribute is used instead of ``__name__`` to calculate explicit - relative imports for main modules, as defined in :pep:`366`. +Module loaders provide the critical function of loading: module execution. +The import machinery calls the :meth:`~importlib.abc.Loader.exec_module()` +method with a single argument, the module object to execute. Any value +returned from exec_module() is ignored. + +Loaders must satisfy the following requirements: * If the module is a Python module (as opposed to a built-in module or a dynamically loaded extension), the loader should execute the module's code in the module's global name space (``module.__dict__``). + * If loader cannot execute the module, it should raise an + :exc:`ImportError`, although any other exception raised during + :meth:`exec_module()` will be propagated. -Module reprs ------------- +In many cases, the finder and loader can be the same object; in such cases the +:meth:`finder.find_spec()` would just return a spec with the loader set +to ``self``. -By default, all modules have a usable repr, however depending on the -attributes set above, and hooks in the loader, you can more explicitly control -the repr of module objects. +Module loaders may opt in to creating the module object during loading +by implementing a :meth:`create_module()` method. It takes one argument, +the module spec, and returns the new module object to use during loading. +create_module() does not need to set any attributes on the module object. +If the loader does not define create_module(), the import machinery will +create the new module itself. -Loaders may implement a :meth:`module_repr()` method which takes a single -argument, the module object. 
When ``repr(module)`` is called for a module -with a loader supporting this protocol, whatever is returned from -``module.__loader__.module_repr(module)`` is returned as the module's repr -without further processing. This return value must be a string. +.. versionadded:: 3.4 + The create_module() method of loaders. -If the module has no ``__loader__`` attribute, or the loader has no -:meth:`module_repr()` method, then the module object implementation itself -will craft a default repr using whatever information is available. It will -try to use the ``module.__name__``, ``module.__file__``, and -``module.__loader__`` as input into the repr, with defaults for whatever -information is missing. +.. versionchanged:: 3.4 + The load_module() method was replaced by exec_module() and the import + machinery assumed all the boilerplate responsibilities of loading. -Here are the exact rules used: + For compatibility with existing loaders, the import machinery will use + the :meth:`~importlib.abc.Loader.load_module()` method of loaders if it + exists and the loader does not also implement exec_module(). However, + load_module() has been deprecated and loaders should implement + exec_module() instead. - * If the module has a ``__loader__`` and that loader has a - :meth:`module_repr()` method, call it with a single argument, which is the - module object. The value returned is used as the module's repr. + The load_module() method must implement all the boilerplate loading + functionality described above in addition to executing the module. All + the same constraints apply, with some additional clarification: - * If an exception occurs in :meth:`module_repr()`, the exception is caught - and discarded, and the calculation of the module's repr continues as if - :meth:`module_repr()` did not exist. + * If there is an existing module object with the given name in + :data:`sys.modules`, the loader must use that existing module. + (Otherwise, :func:`imp.reload` will not work correctly.) If the + named module does not exist in :data:`sys.modules`, the loader + must create a new module object and add it to :data:`sys.modules`. - * If the module has a ``__file__`` attribute, this is used as part of the - module's repr. + * The module *must* exist in :data:`sys.modules` before the loader + executes the module code, to prevent unbounded recursion or multiple + loading. - * If the module has no ``__file__`` but does have a ``__loader__`` that is not - ``None``, then the loader's repr is used as part of the module's repr. + * If loading fails, the loader must remove any modules it has inserted + into :data:`sys.modules`, but it must remove **only** the failing + module, and only if the loader itself has loaded it explicitly. - * Otherwise, just use the module's ``__name__`` in the repr. +Module spec +----------- -This example, from :pep:`420` shows how a loader can craft its own module -repr:: +The import machinery uses a variety of information about each module +during import, especially before loading. Most of the information is +common to all modules. The purpose of a module's spec is to encapsulate +this import-related information on a per-module basis. - class NamespaceLoader: - @classmethod - def module_repr(cls, module): - return "".format(module.__name__) +Using a spec during import allows state to be transferred between import +system components, e.g. between the finder that creates the module spec +and the loader that executes it. 
Most importantly, it allows the +import machinery to perform the boilerplate operations of loading, +whereas without a module spec the loader had that responsibility. +See :class:`~importlib.machinery.ModuleSpec` for more specifics on what +information a module's spec may hold. + +.. versionadded:: 3.4 + +Import-related module attributes +-------------------------------- + +The import machinery fills in these attributes on each module object +during loading, based on the module's spec, before the loader executes +the module. + +.. attribute:: __name__ + + The ``__name__`` attribute must be set to the fully-qualified name of + the module. This name is used to uniquely identify the module in + the import system. + +.. attribute:: __loader__ + + The ``__loader__`` attribute must be set to the loader object that + the import machinery used when loading the module. This is mostly + for introspection, but can be used for additional loader-specific + functionality, for example getting data associated with a loader. + +.. attribute:: __package__ + + The module's ``__package__`` attribute must be set. Its value must + be a string, but it can be the same value as its ``__name__``. When + the module is a package, its ``__package__`` value should be set to + its ``__name__``. When the module is not a package, ``__package__`` + should be set to the empty string for top-level modules, or for + submodules, to the parent package's name. See :pep:`366` for further + details. + + This attribute is used instead of ``__name__`` to calculate explicit + relative imports for main modules, as defined in :pep:`366`. + +.. attribute:: __spec__ + + The ``__spec__`` attribute must be set to the module spec that was + used when importing the module. This is used primarily for + introspection and during reloading. + +.. attribute:: __path__ + + If the module is a package (either regular or namespace), the module + object's ``__path__`` attribute must be set. The value must be + iterable, but may be empty if ``__path__`` has no further significance. + If ``__path__`` is not empty, it must produce strings when iterated + over. More details on the semantics of ``__path__`` are given + :ref:`below `. + + Non-package modules should not have a ``__path__`` attribute. + +.. attribute:: __file__ +.. attribute:: __cached__ + + ``__file__`` is optional. If set, this attribute's value must be a + string. The import system may opt to leave ``__file__`` unset if it + has no semantic meaning (e.g. a module loaded from a database). + + If ``__file__`` is set, it may also be appropriate to set the + ``__cached__`` attribute which is the path to any compiled version of + the code (e.g. byte-compiled file). The file does not need to exist + to set this attribute; the path can simply point to where the + compiled file would exist (see :pep:`3147`). + + It is also appropriate to set ``__cached__`` when ``__file__`` is not + set. However, that scenario is quite atypical. Ultimately, the + loader is what makes use of ``__file__`` and/or ``__cached__``. So + if a loader can load from a cached module but otherwise does not load + from a file, that atypical scenario may be appropriate. .. _package-path-rules: @@ -464,9 +566,46 @@ attribute, and this was typically the way namespace packages were implemented prior to :pep:`420`. 
With the adoption of :pep:`420`, namespace packages no longer need to supply ``__init__.py`` files containing only ``__path__`` -manipulation code; the namespace loader automatically sets ``__path__`` +manipulation code; the import machinery automatically sets ``__path__`` correctly for the namespace package. +Module reprs +------------ + +By default, all modules have a usable repr, however depending on the +attributes set above, and in the module's spec, you can more explicitly +control the repr of module objects. + +If the module has a spec (``__spec__``), the import machinery will try +to generate a repr from it. If that fails or there is no spec, the import +system will craft a default repr using whatever information is available +on the module. It will try to use the ``module.__name__``, +``module.__file__``, and ``module.__loader__`` as input into the repr, +with defaults for whatever information is missing. + +Here are the exact rules used: + + * If the module has a ``__spec__`` attribute, the information in the spec + is used to generate the repr. The "name", "loader", "origin", and + "has_location" attributes are consulted. + + * If the module has a ``__file__`` attribute, this is used as part of the + module's repr. + + * If the module has no ``__file__`` but does have a ``__loader__`` that is not + ``None``, then the loader's repr is used as part of the module's repr. + + * Otherwise, just use the module's ``__name__`` in the repr. + +.. versionchanged:: 3.4 + Use of loader.module_repr() has been deprecated and the module spec + is now used by the import machinery to generate a module repr. + + For backward compatibility with Python 3.3, the module repr will be + generated by calling the loader's :meth:`module_repr()` method, if + defined, before trying either approach described above. However, the + method is deprecated. + The Path Based Finder ===================== @@ -531,7 +670,7 @@ not be limited to this. As a meta path finder, the :term:`path based finder` implements the -:meth:`find_module()` protocol previously described, however it exposes +:meth:`find_spec()` protocol previously described, however it exposes additional hooks that can be used to customize how modules are found and loaded from the :term:`import path`. @@ -553,8 +692,8 @@ The :term:`path based finder` is a :term:`meta path finder`, so the import machinery begins the :term:`import path` search by calling the path -based finder's :meth:`find_module()` method as described previously. When -the ``path`` argument to :meth:`find_module()` is given, it will be a +based finder's :meth:`find_spec()` method as described previously. When +the ``path`` argument to :meth:`find_spec()` is given, it will be a list of string paths to traverse - typically a package's ``__path__`` attribute for an import within that package. If the ``path`` argument is ``None``, this indicates a top level import and :data:`sys.path` is used. @@ -585,51 +724,70 @@ argument, it should raise :exc:`ImportError`. If :data:`sys.path_hooks` iteration ends with no :term:`path entry finder` -being returned, then the path based finder's :meth:`find_module()` method +being returned, then the path based finder's :meth:`find_spec()` method will store ``None`` in :data:`sys.path_importer_cache` (to indicate that there is no finder for this path entry) and return ``None``, indicating that this :term:`meta path finder` could not find the module. 
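
As a rough sketch (using a made-up ``.demo`` naming convention), a path
entry hook and the path entry finder it returns could look like the
following; a real finder would build a spec, for example with
:func:`importlib.util.spec_from_file_location`, rather than returning
``None``::

    import sys

    class DemoPathFinder:
        """Path entry finder for a single (hypothetical) path entry."""

        def __init__(self, path_entry):
            self.path_entry = path_entry

        def find_spec(self, fullname, target=None):
            # Returning None means "no module here"; the path based
            # finder then moves on to the next path entry.
            return None

    def demo_path_hook(path_entry):
        # Decline path entries this hook does not handle so that the
        # next hook in sys.path_hooks gets a chance.
        if not path_entry.endswith('.demo'):
            raise ImportError('not a .demo path entry')
        return DemoPathFinder(path_entry)

    sys.path_hooks.append(demo_path_hook)
    # Whatever the hooks produce for a given path entry (a finder, or
    # None if every hook declined) is cached in sys.path_importer_cache.
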
If a :term:`path entry finder` *is* returned by one of the :term:`path
entry hook` callables on :data:`sys.path_hooks`, then the following protocol
is used
-to ask the finder for a module loader, which is then used to load the module.
-
+to ask the finder for a module spec, which is then used when loading the
+module.

Path entry finder protocol
--------------------------

In order to support imports of modules and initialized packages and also to
contribute portions to namespace packages, path entry finders must implement
-the :meth:`find_loader()` method.
+the :meth:`find_spec()` method.

-:meth:`find_loader()` takes one argument, the fully qualified name of the
-module being imported. :meth:`find_loader()` returns a 2-tuple where the
-first item is the loader and the second item is a namespace :term:`portion`.
-When the first item (i.e. the loader) is ``None``, this means that while the
-path entry finder does not have a loader for the named module, it knows that the
-path entry contributes to a namespace portion for the named module. This will
-almost always be the case where Python is asked to import a namespace package
-that has no physical presence on the file system. When a path entry finder
-returns ``None`` for the loader, the second item of the 2-tuple return value
-must be a sequence, although it can be empty.
+:meth:`find_spec()` takes one argument, the fully qualified name of the
+module being imported. :meth:`find_spec()` returns a fully populated
+spec for the module. This spec will always have "loader" set (with one
+exception).

-If :meth:`find_loader()` returns a non-``None`` loader value, the portion is
-ignored and the loader is returned from the path based finder, terminating
-the search through the path entries.
+To indicate to the import machinery that the spec represents a namespace
+:term:`portion`, the path entry finder sets "loader" on the spec to
+``None`` and "submodule_search_locations" to a list containing the
+portion.

-For backwards compatibility with other implementations of the import
-protocol, many path entry finders also support the same,
-traditional :meth:`find_module()` method that meta path finders support.
-However path entry finder :meth:`find_module()` methods are never called
-with a ``path`` argument (they are expected to record the appropriate
-path information from the initial call to the path hook).
+.. versionchanged:: 3.4
+   find_spec() replaced find_loader() and find_module(), both of which
+   are now deprecated, but will be used if find_spec() is not defined.

-The :meth:`find_module()` method on path entry finders is deprecated,
-as it does not allow the path entry finder to contribute portions to
-namespace packages. Instead path entry finders should implement the
-:meth:`find_loader()` method as described above. If it exists on the path
-entry finder, the import system will always call :meth:`find_loader()`
-in preference to :meth:`find_module()`.
+   Older path entry finders may implement one of these two deprecated methods
+   instead of :meth:`find_spec()`. The methods are still respected for the
+   sake of backward compatibility. However, if find_spec() is implemented
+   on the path entry finder, the legacy methods are ignored.
+
+   :meth:`find_loader()` takes one argument, the fully qualified name of the
+   module being imported. :meth:`find_loader()` returns a 2-tuple where the
+   first item is the loader and the second item is a namespace :term:`portion`.
+   When the first item (i.e. 
the loader) is ``None``, this means that while the + path entry finder does not have a loader for the named module, it knows that + the path entry contributes to a namespace portion for the named module. + This will almost always be the case where Python is asked to import a + namespace package that has no physical presence on the file system. + When a path entry finder returns ``None`` for the loader, the second + item of the 2-tuple return value must be a sequence, although it can be + empty. + + If :meth:`find_loader()` returns a non-``None`` loader value, the portion is + ignored and the loader is returned from the path based finder, terminating + the search through the path entries. + + For backwards compatibility with other implementations of the import + protocol, many path entry finders also support the same, + traditional :meth:`find_module()` method that meta path finders support. + However path entry finder :meth:`find_module()` methods are never called + with a ``path`` argument (they are expected to record the appropriate + path information from the initial call to the path hook). + + The :meth:`find_module()` method on path entry finders is deprecated, + as it does not allow the path entry finder to contribute portions to + namespace packages. If both :meth:`find_loader()` and :meth:`find_module()` + exist on a path entry finder, the import system will always call + :meth:`find_loader()` in preference to :meth:`find_module()`. Replacing the standard import system @@ -648,7 +806,7 @@ To selectively prevent import of some modules from a hook early on the meta path (rather than disabling the standard import system entirely), it is sufficient to raise :exc:`ImportError` directly from -:meth:`find_module` instead of returning ``None``. The latter indicates +:meth:`find_spec` instead of returning ``None``. The latter indicates that the meta path search should continue. while raising an exception terminates it immediately. @@ -690,6 +848,11 @@ :pep:`338` defines executing modules as scripts. +:pep:`451` adds the encapsulation of per-module import state in spec +objects. It also off-loads most of the boilerplate responsibilities of +loaders back onto the import machinery. These changes allow the +deprecation of several APIs in the import system and also addition of new +methods to finders and loaders. .. rubric:: Footnotes diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -236,6 +236,26 @@ (Contributed by Nick Coghlan in :issue:`17827`, :issue:`17828` and :issue:`19619`) +.. _pep-451: + +PEP 451: A ModuleSpec Type for the Import System +================================================ + +:pep:`451` provides an encapsulation of the information about a module +that the import machinery will use to load it, (i.e. a module spec). +This helps simplify both the import implementation and several +import-related APIs. The change is also a stepping stone for several +future import-related improvements. + +https://mail.python.org/pipermail/python-dev/2013-November/130111.html + +The public-facing changes from the PEP are entirely backward-compatible. +Furthermore, they should be transparent to everyone but importer +authors. Key finder and loader methods have been deprecated, but they +will continue working. New importers should use the new methods +described in the PEP. Existing importers should be updated to implement +the new methods. 
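
For instance, a minimal importer written against the new interfaces might
look like the following sketch; the ``InMemoryImporter`` class and the
``demo_pep451`` module are made-up names used purely for illustration::

    import importlib.util
    import sys

    class InMemoryImporter:
        """Meta path finder/loader pair backed by a dict of source strings."""

        def __init__(self, sources):
            self._sources = sources

        # New-style finder method (PEP 451).
        def find_spec(self, fullname, path=None, target=None):
            if fullname not in self._sources:
                return None
            return importlib.util.spec_from_loader(fullname, self)

        # New-style loader method (PEP 451); create_module() is optional.
        def exec_module(self, module):
            exec(self._sources[module.__spec__.name], module.__dict__)

    sys.meta_path.append(InMemoryImporter({'demo_pep451': 'answer = 42'}))

    import demo_pep451
    print(demo_pep451.answer)   # -> 42
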
+ Other Language Changes ====================== diff --git a/Lib/imp.py b/Lib/imp.py --- a/Lib/imp.py +++ b/Lib/imp.py @@ -16,7 +16,7 @@ # Platform doesn't support dynamic loading. load_dynamic = None -from importlib._bootstrap import SourcelessFileLoader, _ERR_MSG +from importlib._bootstrap import SourcelessFileLoader, _ERR_MSG, _SpecMethods from importlib import machinery from importlib import util @@ -162,11 +162,17 @@ def load_source(name, pathname, file=None): - _LoadSourceCompatibility(name, pathname, file).load_module(name) - module = sys.modules[name] + loader = _LoadSourceCompatibility(name, pathname, file) + spec = util.spec_from_file_location(name, pathname, loader=loader) + methods = _SpecMethods(spec) + if name in sys.modules: + module = methods.exec(sys.modules[name]) + else: + module = methods.load() # To allow reloading to potentially work, use a non-hacked loader which # won't rely on a now-closed file object. module.__loader__ = machinery.SourceFileLoader(name, pathname) + module.__spec__.loader = module.__loader__ return module @@ -177,11 +183,17 @@ def load_compiled(name, pathname, file=None): """**DEPRECATED**""" - _LoadCompiledCompatibility(name, pathname, file).load_module(name) - module = sys.modules[name] + loader = _LoadCompiledCompatibility(name, pathname, file) + spec = util.spec_from_file_location(name, pathname, loader=loader) + methods = _SpecMethods(spec) + if name in sys.modules: + module = methods.exec(sys.modules[name]) + else: + module = methods.load() # To allow reloading to potentially work, use a non-hacked loader which # won't rely on a now-closed file object. module.__loader__ = SourcelessFileLoader(name, pathname) + module.__spec__.loader = module.__loader__ return module @@ -196,7 +208,13 @@ break else: raise ValueError('{!r} is not a package'.format(path)) - return machinery.SourceFileLoader(name, path).load_module(name) + spec = util.spec_from_file_location(name, path, + submodule_search_locations=[]) + methods = _SpecMethods(spec) + if name in sys.modules: + return methods.exec(sys.modules[name]) + else: + return methods.load() def load_module(name, file, filename, details): diff --git a/Lib/importlib/__init__.py b/Lib/importlib/__init__.py --- a/Lib/importlib/__init__.py +++ b/Lib/importlib/__init__.py @@ -46,19 +46,42 @@ finder.invalidate_caches() -def find_loader(name, path=None): - """Find the loader for the specified module. +def find_spec(name, path=None): + """Return the spec for the specified module. First, sys.modules is checked to see if the module was already imported. If - so, then sys.modules[name].__loader__ is returned. If that happens to be + so, then sys.modules[name].__spec__ is returned. If that happens to be set to None, then ValueError is raised. If the module is not in - sys.modules, then sys.meta_path is searched for a suitable loader with the - value of 'path' given to the finders. None is returned if no loader could + sys.modules, then sys.meta_path is searched for a suitable spec with the + value of 'path' given to the finders. None is returned if no spec could be found. Dotted names do not have their parent packages implicitly imported. You will most likely need to explicitly import all parent packages in the proper - order for a submodule to get the correct loader. + order for a submodule to get the correct spec. 
+ + """ + if name not in sys.modules: + return _bootstrap._find_spec(name, path) + else: + module = sys.modules[name] + if module is None: + return None + try: + spec = module.__spec__ + except AttributeError: + raise ValueError('{}.__spec__ is not set'.format(name)) + else: + if spec is None: + raise ValueError('{}.__spec__ is None'.format(name)) + return spec + + +# XXX Deprecate... +def find_loader(name, path=None): + """Return the loader for the specified module. + + This is a backward-compatible wrapper around find_spec(). """ try: @@ -71,7 +94,18 @@ pass except AttributeError: raise ValueError('{}.__loader__ is not set'.format(name)) - return _bootstrap._find_module(name, path) + + spec = _bootstrap._find_spec(name, path) + # We won't worry about malformed specs (missing attributes). + if spec is None: + return None + if spec.loader is None: + if spec.submodule_search_locations is None: + raise ImportError('spec for {} missing loader'.format(name), + name=name) + raise ImportError('namespace packages do not have loaders', + name=name) + return spec.loader def import_module(name, package=None): @@ -106,7 +140,11 @@ """ if not module or not isinstance(module, types.ModuleType): raise TypeError("reload() argument must be module") - name = module.__name__ + try: + name = module.__spec__.name + except AttributeError: + name = module.__name__ + if sys.modules.get(name) is not module: msg = "module {} not in sys.modules" raise ImportError(msg.format(name), name=name) @@ -118,13 +156,11 @@ if parent_name and parent_name not in sys.modules: msg = "parent {!r} not in sys.modules" raise ImportError(msg.format(parent_name), name=parent_name) - loader = _bootstrap._find_module(name, None) - if loader is None: - raise ImportError(_bootstrap._ERR_MSG.format(name), name=name) - module.__loader__ = loader - loader.load_module(name) + spec = module.__spec__ = _bootstrap._find_spec(name, None, module) + methods = _bootstrap._SpecMethods(spec) + methods.exec(module) # The module may have replaced itself in sys.modules! - return sys.modules[module.__name__] + return sys.modules[name] finally: try: del _RELOADING[name] diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -84,13 +84,11 @@ return (stat_info.st_mode & 0o170000) == mode -# XXX Could also expose Modules/getpath.c:isfile() def _path_isfile(path): """Replacement for os.path.isfile.""" return _path_is_mode_type(path, 0o100000) -# XXX Could also expose Modules/getpath.c:isdir() def _path_isdir(path): """Replacement for os.path.isdir.""" if not path: @@ -128,6 +126,10 @@ new.__dict__.update(old.__dict__) +def _new_module(name): + return type(sys)(name) + + _code_type = type(_wrap.__code__) @@ -232,6 +234,23 @@ return '_DummyModuleLock({!r}) at {}'.format(self.name, id(self)) +class _ModuleLockManager: + + def __init__(self, name): + self._name = name + self._lock = None + + def __enter__(self): + try: + self._lock = _get_module_lock(self._name) + finally: + _imp.release_lock() + self._lock.acquire() + + def __exit__(self, *args, **kwargs): + self._lock.release() + + # The following two functions are for consumption by Python/import.c. 
def _get_module_lock(name): @@ -485,124 +504,6 @@ print(message.format(*args), file=sys.stderr) -class _ManageReload: - - def __init__(self, name): - self._name = name - - def __enter__(self): - self._is_reload = self._name in sys.modules - - def __exit__(self, *args): - if any(arg is not None for arg in args) and not self._is_reload: - try: - del sys.modules[self._name] - except KeyError: - pass - - -# Written as a class only because contextlib is not available. -class _ModuleManager(_ManageReload): - - """Context manager which returns the module to be loaded. - - Does the proper unloading from sys.modules upon failure. - - """ - - def __init__(self, name, *, reset_name=True): - """Prepare the context manager. - - The reset_name argument specifies whether to unconditionally reset - the __name__ attribute if the module is found to be a reload. - """ - super().__init__(name) - self._reset_name = reset_name - - def __enter__(self): - super().__enter__() - self._module = sys.modules.get(self._name) - if not self._is_reload: - # This must be done before open() is called as the 'io' module - # implicitly imports 'locale' and would otherwise trigger an - # infinite loop. - self._module = type(_io)(self._name) - # This must be done before putting the module in sys.modules - # (otherwise an optimization shortcut in import.c becomes wrong) - self._module.__initializing__ = True - sys.modules[self._name] = self._module - elif self._reset_name: - try: - self._module.__name__ = self._name - except AttributeError: - pass - return self._module - - def __exit__(self, *args): - self._module.__initializing__ = False - del self._module - super().__exit__(*args) - - -def module_to_load(name, *, reset_name=True): - """Return a context manager which provides the module object to load. - - If reset_name is true, reset the module's __name__ to 'name'. - """ - # Hiding _ModuleManager behind a function for better naming. - return _ModuleManager(name, reset_name=reset_name) - - -def _init_package_attrs(loader, module): - """Set __package__ and __path__ based on what loader.is_package() says.""" - name = module.__name__ - try: - is_package = loader.is_package(name) - except ImportError: - pass - else: - if is_package: - module.__package__ = name - module.__path__ = [] - else: - module.__package__ = name.rpartition('.')[0] - - -def _init_file_attrs(loader, module): - """Set __file__ and __path__ based on loader.get_filename().""" - try: - module.__file__ = loader.get_filename(module.__name__) - except ImportError: - pass - else: - if module.__name__ == module.__package__: - module.__path__.append(_path_split(module.__file__)[0]) - - -def set_package(fxn): - """Set __package__ on the returned module.""" - def set_package_wrapper(*args, **kwargs): - module = fxn(*args, **kwargs) - if getattr(module, '__package__', None) is None: - module.__package__ = module.__name__ - if not hasattr(module, '__path__'): - module.__package__ = module.__package__.rpartition('.')[0] - return module - _wrap(set_package_wrapper, fxn) - return set_package_wrapper - - -def set_loader(fxn): - """Set __loader__ on the returned module.""" - def set_loader_wrapper(self, *args, **kwargs): - module = fxn(self, *args, **kwargs) - if getattr(module, '__loader__', None) is None: - module.__loader__ = self - return module - _wrap(set_loader_wrapper, fxn) - return set_loader_wrapper - - def _check_name(method): """Decorator to verify that the module being requested matches the one the loader can handle. 
@@ -656,6 +557,19 @@ return loader +def _load_module_shim(self, fullname): + """Load the specified module into sys.modules and return it.""" + # XXX Deprecation Warning here... + spec = spec_from_loader(fullname, self) + methods = _SpecMethods(spec) + if fullname in sys.modules: + module = sys.modules[fullname] + methods.exec(module) + return sys.modules[fullname] + else: + return methods.load() + + def _validate_bytecode_header(data, source_stats=None, name=None, path=None): """Validate the header of the passed-in bytecode against source_stats (if given) and returning the bytecode that can be compiled by compile(). @@ -745,6 +659,556 @@ return newline_decoder.decode(source_bytes.decode(encoding[0])) +# Module specifications ####################################################### + +def _module_repr(module): + # The implementation of ModuleType__repr__(). + loader = getattr(module, '__loader__', None) + if hasattr(loader, 'module_repr'): + # XXX Deprecation Warning here... + try: + return loader.module_repr(module) + except Exception: + pass + try: + spec = module.__spec__ + except AttributeError: + pass + else: + if spec is not None: + return _SpecMethods(spec).module_repr() + + # We could use module.__class__.__name__ instead of 'module' in the + # various repr permutations. + try: + name = module.__name__ + except AttributeError: + name = '?' + try: + filename = module.__file__ + except AttributeError: + if loader is None: + return ''.format(name) + else: + return ''.format(name, loader) + else: + return ''.format(name, filename) + + +class _installed_safely: + + def __init__(self, module): + self._module = module + self._spec = module.__spec__ + + def __enter__(self): + # This must be done before putting the module in sys.modules + # (otherwise an optimization shortcut in import.c becomes + # wrong) + self._spec._initializing = True + sys.modules[self._spec.name] = self._module + + def __exit__(self, *args): + try: + spec = self._spec + if any(arg is not None for arg in args): + try: + del sys.modules[spec.name] + except KeyError: + pass + else: + _verbose_message('import {!r} # {!r}', spec.name, spec.loader) + finally: + self._spec._initializing = False + + +class ModuleSpec: + """The specification for a module, used for loading. + + A module's spec is the source for information about the module. For + data associated with the module, including source, use the spec's + loader. + + `name` is the absolute name of the module. `loader` is the loader + to use when loading the module. `parent` is the name of the + package the module is in. The parent is derived from the name. + + `is_package` determines if the module is considered a package or + not. On modules this is reflected by the `__path__` attribute. + + `origin` is the specific location used by the loader from which to + load the module, if that information is available. When filename is + set, origin will match. + + `has_location` indicates that a spec's "origin" reflects a location. + When this is True, `__file__` attribute of the module is set. + + `cached` is the location of the cached bytecode file, if any. It + corresponds to the `__cached__` attribute. + + `submodule_search_locations` is the sequence of path entries to + search when importing submodules. If set, is_package should be + True--and False otherwise. + + Packages are simply modules that (may) have submodules. If a spec + has a non-None value in `submodule_search_locations`, the import + system will consider modules loaded from the spec as packages. 
+ + Only finders (see importlib.abc.MetaPathFinder and + importlib.abc.PathEntryFinder) should modify ModuleSpec instances. + + """ + + def __init__(self, name, loader, *, origin=None, loader_state=None, + is_package=None): + self.name = name + self.loader = loader + self.origin = origin + self.loader_state = loader_state + self.submodule_search_locations = [] if is_package else None + + # file-location attributes + self._set_fileattr = False + self._cached = None + + def __repr__(self): + args = ['name={!r}'.format(self.name), + 'loader={!r}'.format(self.loader)] + if self.origin is not None: + args.append('origin={!r}'.format(self.origin)) + if self.submodule_search_locations is not None: + args.append('submodule_search_locations={}' + .format(self.submodule_search_locations)) + return '{}({})'.format(self.__class__.__name__, ', '.join(args)) + + def __eq__(self, other): + smsl = self.submodule_search_locations + try: + return (self.name == other.name and + self.loader == other.loader and + self.origin == other.origin and + smsl == other.submodule_search_locations and + self.cached == other.cached and + self.has_location == other.has_location) + except AttributeError: + return False + + @property + def cached(self): + if self._cached is None: + if self.origin is not None and self._set_fileattr: + filename = self.origin + if filename.endswith(tuple(SOURCE_SUFFIXES)): + try: + self._cached = cache_from_source(filename) + except NotImplementedError: + pass + elif filename.endswith(tuple(BYTECODE_SUFFIXES)): + self._cached = filename + return self._cached + + @cached.setter + def cached(self, cached): + self._cached = cached + + @property + def parent(self): + """The name of the module's parent.""" + if self.submodule_search_locations is None: + return self.name.rpartition('.')[0] + else: + return self.name + + @property + def has_location(self): + return self._set_fileattr + + +def spec_from_loader(name, loader, *, origin=None, is_package=None): + """Return a module spec based on various loader methods.""" + +# if hasattr(loader, 'get_data'): + if hasattr(loader, 'get_filename'): + if is_package is None: + return spec_from_file_location(name, loader=loader) + search = [] if is_package else None + return spec_from_file_location(name, loader=loader, + submodule_search_locations=search) + + if is_package is None: + if hasattr(loader, 'is_package'): + try: + is_package = loader.is_package(name) + except ImportError: + is_package = None # aka, undefined + else: + # the default + is_package = False + + return ModuleSpec(name, loader, origin=origin, is_package=is_package) + + +_POPULATE = object() + + +def spec_from_file_location(name, location=None, *, loader=None, + submodule_search_locations=_POPULATE): + """Return a module spec based on a file location. + + To indicate that the module is a package, set + submodule_search_locations to a list of directory paths. An + empty list is sufficient, though its not otherwise useful to the + import system. + + The loader must take a spec as its only __init__() arg. + + """ + if location is None: + # The caller may simply want a partially populated location- + # oriented spec. So we set the location to a bogus value and + # fill in as much as we can. + location = '' + if hasattr(loader, 'get_filename'): + # ExecutionLoader + try: + location = loader.get_filename(name) + except ImportError: + pass + + # If the location is on the filesystem, but doesn't actually exist, + # we could return None here, indicating that the location is not + # valid. 
However, we don't have a good way of testing since an + # indirect location (e.g. a zip file or URL) will look like a + # non-existent file relative to the filesystem. + + spec = ModuleSpec(name, loader, origin=location) + spec._set_fileattr = True + + # Pick a loader if one wasn't provided. + if loader is None: + for loader_class, suffixes in _get_supported_file_loaders(): + if location.endswith(tuple(suffixes)): + loader = loader_class(name, location) + spec.loader = loader + break + else: + return None + + # Set submodule_search_paths appropriately. + if submodule_search_locations is _POPULATE: + # Check the loader. + if hasattr(loader, 'is_package'): + try: + is_package = loader.is_package(name) + except ImportError: + pass + else: + if is_package: + spec.submodule_search_locations = [] + else: + spec.submodule_search_locations = submodule_search_locations + if spec.submodule_search_locations == []: + if location: + dirname = _path_split(location)[0] + spec.submodule_search_locations.append(dirname) + + return spec + + +def _spec_from_module(module, loader=None, origin=None): + # This function is meant for use in _setup(). + try: + spec = module.__spec__ + except AttributeError: + pass + else: + if spec is not None: + return spec + + name = module.__name__ + if loader is None: + try: + loader = module.__loader__ + except AttributeError: + # loader will stay None. + pass + try: + location = module.__file__ + except AttributeError: + location = None + if origin is None: + if location is None: + try: + origin = loader._ORIGIN + except AttributeError: + origin = None + else: + origin = location + try: + cached = module.__cached__ + except AttributeError: + cached = None + try: + submodule_search_locations = list(module.__path__) + except AttributeError: + submodule_search_locations = None + + spec = ModuleSpec(name, loader, origin=origin) + spec._set_fileattr = False if location is None else True + spec.cached = cached + spec.submodule_search_locations = submodule_search_locations + return spec + + +class _SpecMethods: + + """Convenience wrapper around spec objects to provide spec-specific + methods.""" + + def __init__(self, spec): + self.spec = spec + + @classmethod + def from_module(cls, module): + """Create a spec from a module's attributes.""" + try: + spec = module.__spec__ + except AttributeError: + try: + loader = spec.__loader__ + except AttributeError: + spec = _find_spec(module.__name__) + if spec is None: + spec = spec_from_loader(module.__name__, loader) + else: + spec = spec_from_loader(module.__name__, loader) + return cls(spec) + + def module_repr(self): + """Return the repr to use for the module.""" + # We mostly replicate _module_repr() using the spec attributes. + spec = self.spec + name = '?' if spec.name is None else spec.name + if spec.origin is None: + if spec.loader is None: + return ''.format(name) + else: + return ''.format(name, spec.loader) + else: + if spec.has_location: + return ''.format(name, spec.origin) + else: + return ''.format(spec.name, spec.origin) + + def init_module_attrs(self, module, *, _override=False, _force_name=True): + """Set the module's attributes. + + All missing import-related module attributes will be set. 
Here + is how the spec attributes map onto the module: + + spec.name -> module.__name__ + spec.loader -> module.__loader__ + spec.parent -> module.__package__ + spec -> module.__spec__ + + Optional: + spec.origin -> module.__file__ (if spec.set_fileattr is true) + spec.cached -> module.__cached__ (if __file__ also set) + spec.submodule_search_locations -> module.__path__ (if set) + + """ + spec = self.spec + + # The passed in module may be not support attribute assignment, + # in which case we simply don't set the attributes. + + # __name__ + if (_override or _force_name or + getattr(module, '__name__', None) is None): + try: + module.__name__ = spec.name + except AttributeError: + pass + + # __loader__ + if _override or getattr(module, '__loader__', None) is None: + loader = spec.loader + if loader is None: + # A backward compatibility hack. + if spec.submodule_search_locations is not None: + loader = _NamespaceLoader.__new__(_NamespaceLoader) + loader._path = spec.submodule_search_locations + try: + module.__loader__ = loader + except AttributeError: + pass + + # __package__ + if _override or getattr(module, '__package__', None) is None: + try: + module.__package__ = spec.parent + except AttributeError: + pass + + # __spec__ + try: + module.__spec__ = spec + except AttributeError: + pass + + # __path__ + if _override or getattr(module, '__path__', None) is None: + if spec.submodule_search_locations is not None: + try: + module.__path__ = spec.submodule_search_locations + except AttributeError: + pass + + if spec.has_location: + # __file__ + if _override or getattr(module, '__file__', None) is None: + try: + module.__file__ = spec.origin + except AttributeError: + pass + + # __cached__ + if _override or getattr(module, '__cached__', None) is None: + if spec.cached is not None: + try: + module.__cached__ = spec.cached + except AttributeError: + pass + + def create(self): + """Return a new module to be loaded. + + The import-related module attributes are also set with the + appropriate values from the spec. + + """ + spec = self.spec + # Typically loaders will not implement create_module(). + if hasattr(spec.loader, 'create_module'): + # If create_module() returns `None` it means the default + # module creation should be used. + module = spec.loader.create_module(spec) + else: + module = None + if module is None: + # This must be done before open() is ever called as the 'io' + # module implicitly imports 'locale' and would otherwise + # trigger an infinite loop. + module = _new_module(spec.name) + self.init_module_attrs(module) + return module + + def _exec(self, module): + """Do everything necessary to execute the module. + + The namespace of `module` is used as the target of execution. + This method uses the loader's `exec_module()` method. + + """ + self.spec.loader.exec_module(module) + + # Used by importlib.reload() and _load_module_shim(). 
+ def exec(self, module): + """Execute the spec in an existing module's namespace.""" + name = self.spec.name + _imp.acquire_lock() + with _ModuleLockManager(name): + if sys.modules.get(name) is not module: + msg = 'module {} not in sys.modules'.format(name) + raise ImportError(msg, name=name) + if self.spec.loader is None: + if self.spec.submodule_search_locations is None: + raise ImportError('missing loader', name=self.spec.name) + # namespace package + self.init_module_attrs(module, _override=True) + return module + self.init_module_attrs(module, _override=True) + if not hasattr(self.spec.loader, 'exec_module'): + # XXX DeprecationWarning goes here... + self.spec.loader.load_module(name) + else: + self._exec(module) + return sys.modules[name] + + def _load_backward_compatible(self): + # XXX DeprecationWarning goes here... + spec = self.spec + # The module must be in sys.modules! + spec.loader.load_module(spec.name) + module = sys.modules[spec.name] + if getattr(module, '__loader__', None) is None: + try: + module.__loader__ = spec.loader + except AttributeError: + pass + if getattr(module, '__package__', None) is None: + try: + # Since module.__path__ may not line up with + # spec.submodule_search_paths, we can't necessarily rely + # on spec.parent here. + module.__package__ = module.__name__ + if not hasattr(module, '__path__'): + module.__package__ = spec.name.rpartition('.')[0] + except AttributeError: + pass + if getattr(module, '__spec__', None) is None: + try: + module.__spec__ = spec + except AttributeError: + pass + return module + + # XXX If we don't end up using this for pythonrun.c/runpy, we should + # get rid of it. + def _load_existing(self, module): + """Exec the spec'ed module into an existing module's namespace.""" + # For use by runpy. + with _installed_safely(module): + loaded = self.exec(module) + return loaded + + def _load_unlocked(self): + # A helper for direct use by the import system. + if self.spec.loader is not None: + # not a namespace package + if not hasattr(self.spec.loader, 'exec_module'): + return self._load_backward_compatible() + + module = self.create() + with _installed_safely(module): + if self.spec.loader is None: + if self.spec.submodule_search_locations is None: + raise ImportError('missing loader', name=self.spec.name) + # A namespace package so do nothing. + else: + self._exec(module) + + # We don't ensure that the import-related module attributes get + # set in the sys.modules replacement case. Such modules are on + # their own. + return sys.modules[self.spec.name] + + # A method used during testing of _load_unlocked() and by + # _load_module_shim(). + def load(self): + """Return a new module object, loaded by the spec's loader. + + The module is not added to its parent. + + If a module is already in sys.modules, that existing module gets + clobbered. 
+ + """ + _imp.acquire_lock() + with _ModuleLockManager(self.spec.name): + return self._load_unlocked() + + # Loaders ##################################################################### class BuiltinImporter: @@ -756,9 +1220,19 @@ """ + @staticmethod + def module_repr(module): + # XXX deprecate + return ''.format(module.__name__) + @classmethod - def module_repr(cls, module): - return ''.format(module.__name__) + def find_spec(cls, fullname, path=None, target=None): + if path is not None: + return None + if _imp.is_builtin(fullname): + return spec_from_loader(fullname, cls, origin='built-in') + else: + return None @classmethod def find_module(cls, fullname, path=None): @@ -767,18 +1241,26 @@ If 'path' is ever specified then the search is considered a failure. """ - if path is not None: - return None - return cls if _imp.is_builtin(fullname) else None + spec = cls.find_spec(fullname, path) + return spec.loader if spec is not None else None + + @staticmethod + def exec_module(module): + spec = module.__spec__ + name = spec.name + if not _imp.is_builtin(name): + raise ImportError('{} is not a built-in module'.format(name), + name=name) + _call_with_frames_removed(_imp.init_builtin, name) + # Have to manually initialize attributes since init_builtin() is not + # going to do it for us. + # XXX: Create _imp.exec_builtin(module) + _SpecMethods(spec).init_module_attrs(sys.modules[name]) @classmethod - @set_package - @set_loader - @_requires_builtin def load_module(cls, fullname): """Load a built-in module.""" - with _ManageReload(fullname): - return _call_with_frames_removed(_imp.init_builtin, fullname) + return _load_module_shim(cls, fullname) @classmethod @_requires_builtin @@ -796,6 +1278,7 @@ @_requires_builtin def is_package(cls, fullname): """Return False as built-in modules are never packages.""" + # XXX DeprecationWarning here... return False @@ -808,26 +1291,36 @@ """ + @staticmethod + def module_repr(m): + # XXX deprecate + return ''.format(m.__name__) + @classmethod - def module_repr(cls, m): - return ''.format(m.__name__) + def find_spec(cls, fullname, path=None, target=None): + if _imp.is_frozen(fullname): + return spec_from_loader(fullname, cls, origin='frozen') + else: + return None @classmethod def find_module(cls, fullname, path=None): """Find a frozen module.""" return cls if _imp.is_frozen(fullname) else None + @staticmethod + def exec_module(module): + name = module.__spec__.name + if not _imp.is_frozen(name): + raise ImportError('{} is not a frozen module'.format(name), + name=name) + code = _call_with_frames_removed(_imp.get_frozen_object, name) + exec(code, module.__dict__) + @classmethod - @set_package - @set_loader - @_requires_frozen def load_module(cls, fullname): """Load a frozen module.""" - with _ManageReload(fullname): - m = _call_with_frames_removed(_imp.init_frozen, fullname) - # Let our own module_repr() method produce a suitable repr. - del m.__file__ - return m + return _load_module_shim(cls, fullname) @classmethod @_requires_frozen @@ -850,8 +1343,7 @@ class WindowsRegistryFinder: - """Meta path finder for modules declared in the Windows registry. - """ + """Meta path finder for modules declared in the Windows registry.""" REGISTRY_KEY = ( 'Software\\Python\\PythonCore\\{sys_version}' @@ -884,8 +1376,8 @@ return filepath @classmethod - def find_module(cls, fullname, path=None): - """Find module named in the registry.""" + def find_spec(cls, fullname, path=None, target=None): + # XXX untested! 
Need a Windows person to write tests (otherwise mock out appropriately) filepath = cls._search_registry(fullname) if filepath is None: return None @@ -895,7 +1387,18 @@ return None for loader, suffixes in _get_supported_file_loaders(): if filepath.endswith(tuple(suffixes)): - return loader(fullname, filepath) + spec = spec_from_loader(fullname, loader(fullname, filepath), + origin=filepath) + return spec + + @classmethod + def find_module(cls, fullname, path=None): + """Find module named in the registry.""" + spec = self.find_spec(fullname, path) + if spec is not None: + return spec.loader + else: + return None class _LoaderBasics: @@ -903,6 +1406,7 @@ """Base class of common code needed by both SourceLoader and SourcelessFileLoader.""" + # XXX deprecate? def is_package(self, fullname): """Concrete implementation of InspectLoader.is_package by checking if the path returned by get_filename has a filename of '__init__.py'.""" @@ -911,32 +1415,15 @@ tail_name = fullname.rpartition('.')[2] return filename_base == '__init__' and tail_name != '__init__' - def init_module_attrs(self, module): - """Set various attributes on the module. + def exec_module(self, module): + """Execute the module.""" + code = self.get_code(module.__name__) + if code is None: + raise ImportError('cannot load module {!r} when get_code() ' + 'returns None'.format(module.__name__)) + _call_with_frames_removed(exec, code, module.__dict__) - ExecutionLoader.init_module_attrs() is used to set __loader__, - __package__, __file__, and optionally __path__. The __cached__ attribute - is set using imp.cache_from_source() and __file__. - """ - module.__loader__ = self # Loader - _init_package_attrs(self, module) # InspectLoader - _init_file_attrs(self, module) # ExecutionLoader - if hasattr(module, '__file__'): # SourceLoader - try: - module.__cached__ = cache_from_source(module.__file__) - except NotImplementedError: - pass - - def load_module(self, fullname): - """Load the specified module into sys.modules and return it.""" - with module_to_load(fullname) as module: - self.init_module_attrs(module) - code = self.get_code(fullname) - if code is None: - raise ImportError('cannot load module {!r} when get_code() ' - 'returns None'.format(fullname)) - _call_with_frames_removed(exec, code, module.__dict__) - return module + load_module = _load_module_shim class SourceLoader(_LoaderBasics): @@ -1063,7 +1550,9 @@ @_check_name def load_module(self, fullname): """Load a module from a file.""" - # Issue #14857: Avoid the zero-argument form so the implementation + # The only reason for this method is for the name check. 
+ + # Issue #14857: Avoid the zero-argument form of super so the implementation # of that form can be updated without breaking the frozen module return super(FileLoader, self).load_module(fullname) @@ -1125,10 +1614,6 @@ """Loader which handles sourceless file imports.""" - def init_module_attrs(self, module): - super().init_module_attrs(module) - module.__cached__ = module.__file__ - def get_code(self, fullname): path = self.get_filename(fullname) data = self.get_data(path) @@ -1156,18 +1641,22 @@ self.name = name self.path = path + def exec_module(self, module): + # XXX create _imp.exec_dynamic() + spec = module.__spec__ + fullname = spec.name + module = _call_with_frames_removed(_imp.load_dynamic, + fullname, self.path) + _verbose_message('extension module loaded from {!r}', self.path) + if self.is_package(fullname) and not hasattr(module, '__path__'): + module.__path__ = [_path_split(self.path)[0]] + _SpecMethods(spec).init_module_attrs(module) + + # XXX deprecate @_check_name - @set_package - @set_loader def load_module(self, fullname): """Load an extension module.""" - with _ManageReload(fullname): - module = _call_with_frames_removed(_imp.load_dynamic, - fullname, self.path) - _verbose_message('extension module loaded from {!r}', self.path) - if self.is_package(fullname) and not hasattr(module, '__path__'): - module.__path__ = [_path_split(self.path)[0]] - return module + return _load_module_shim(self, fullname) def is_package(self, fullname): """Return True if the extension module is a package.""" @@ -1220,11 +1709,12 @@ # If the parent's path has changed, recalculate _path parent_path = tuple(self._get_parent_path()) # Make a copy if parent_path != self._last_parent_path: - loader, new_path = self._path_finder(self._name, parent_path) + spec = self._path_finder(self._name, parent_path) # Note that no changes are made if a loader is returned, but we # do remember the new parent path - if loader is None: - self._path = new_path + if spec is not None and spec.loader is None: + if spec.submodule_search_locations: + self._path = spec.submodule_search_locations self._last_parent_path = parent_path # Save the copy return self._path @@ -1244,10 +1734,12 @@ self._path.append(item) -class NamespaceLoader: +# We use this exclusively in init_module_attrs() for backward-compatibility. 
+class _NamespaceLoader: def __init__(self, name, path, path_finder): self._path = _NamespacePath(name, path, path_finder) + # XXX Deprecate @classmethod def module_repr(cls, module): return ''.format(module.__name__) @@ -1261,17 +1753,11 @@ def get_code(self, fullname): return compile('', '', 'exec', dont_inherit=True) - def init_module_attrs(self, module): - module.__loader__ = self - module.__package__ = module.__name__ - + # XXX Deprecate def load_module(self, fullname): """Load a namespace module.""" _verbose_message('namespace module loaded with path {!r}', self._path) - with module_to_load(fullname) as module: - self.init_module_attrs(module) - module.__path__ = self._path - return module + return _load_module_shim(self, fullname) # Finders ##################################################################### @@ -1323,7 +1809,20 @@ return finder @classmethod - def _get_loader(cls, fullname, path): + def _legacy_get_spec(cls, fullname, finder): + if hasattr(finder, 'find_loader'): + loader, portions = finder.find_loader(fullname) + else: + loader = finder.find_module(fullname) + portions = None + if loader is not None: + return spec_from_loader(fullname, loader) + spec = ModuleSpec(fullname, None) + spec.submodule_search_locations = portions + return spec + + @classmethod + def _get_spec(cls, fullname, path, target=None): """Find the loader or namespace_path for this module/package name.""" # If this ends up being a namespace package, namespace_path is # the list of paths that will become its __path__ @@ -1333,38 +1832,58 @@ continue finder = cls._path_importer_cache(entry) if finder is not None: - if hasattr(finder, 'find_loader'): - loader, portions = finder.find_loader(fullname) + if hasattr(finder, 'find_spec'): + spec = finder.find_spec(fullname, target) else: - loader = finder.find_module(fullname) - portions = [] - if loader is not None: - # We found a loader: return it immediately. - return loader, namespace_path + spec = cls._legacy_get_spec(fullname, finder) + if spec is None: + continue + if spec.loader is not None: + return spec + portions = spec.submodule_search_locations + if portions is None: + raise ImportError('spec missing loader') # This is possibly part of a namespace package. # Remember these path entries (if any) for when we # create a namespace package, and continue iterating # on path. namespace_path.extend(portions) else: - return None, namespace_path + spec = ModuleSpec(fullname, None) + spec.submodule_search_locations = namespace_path + return spec + + @classmethod + def find_spec(cls, fullname, path=None, target=None): + """find the module on sys.path or 'path' based on sys.path_hooks and + sys.path_importer_cache.""" + if path is None: + path = sys.path + spec = cls._get_spec(fullname, path, target) + if spec is None: + return None + elif spec.loader is None: + namespace_path = spec.submodule_search_locations + if namespace_path: + # We found at least one namespace path. Return a + # spec which can create the namespace package. 
+ spec.origin = 'namespace' + spec.submodule_search_locations = _NamespacePath(fullname, namespace_path, cls._get_spec) + return spec + else: + return None + else: + return spec @classmethod def find_module(cls, fullname, path=None): - """Find the module on sys.path or 'path' based on sys.path_hooks and + """find the module on sys.path or 'path' based on sys.path_hooks and sys.path_importer_cache.""" - if path is None: - path = sys.path - loader, namespace_path = cls._get_loader(fullname, path) - if loader is not None: - return loader - else: - if namespace_path: - # We found at least one namespace path. Return a - # loader which can create the namespace package. - return NamespaceLoader(fullname, namespace_path, cls._get_loader) - else: - return None + # XXX Deprecation warning here. + spec = cls.find_spec(fullname, path) + if spec is None: + return None + return spec.loader class FileFinder: @@ -1399,6 +1918,24 @@ def find_loader(self, fullname): """Try to find a loader for the specified module, or the namespace package portions. Returns (loader, list-of-portions).""" + spec = self.find_spec(fullname) + if spec is None: + return None, [] + return spec.loader, spec.submodule_search_locations or [] + + def _get_spec(self, loader_class, fullname, path, submodule_search_locations, target): + loader = loader_class(fullname, path) + try: + get_spec = loader._get_spec + except AttributeError: + return spec_from_file_location(fullname, path, loader=loader, + submodule_search_locations=submodule_search_locations) + else: + return get_spec(fullname, path, submodule_search_locations, target) + + def find_spec(self, fullname, target=None): + """Try to find a loader for the specified module, or the namespace + package portions. Returns (loader, list-of-portions).""" is_namespace = False tail_module = fullname.rpartition('.')[2] try: @@ -1418,26 +1955,28 @@ # Check if the module is the name of a directory (and thus a package). if cache_module in cache: base_path = _path_join(self.path, tail_module) - for suffix, loader in self._loaders: + for suffix, loader_class in self._loaders: init_filename = '__init__' + suffix full_path = _path_join(base_path, init_filename) if _path_isfile(full_path): - return (loader(fullname, full_path), [base_path]) + return self._get_spec(loader_class, fullname, full_path, [base_path], target) else: # If a namespace package, return the path if we don't # find a module in the next section. is_namespace = _path_isdir(base_path) # Check for a file w/ a proper suffix exists. 
- for suffix, loader in self._loaders: + for suffix, loader_class in self._loaders: full_path = _path_join(self.path, tail_module + suffix) _verbose_message('trying {}'.format(full_path), verbosity=2) if cache_module + suffix in cache: if _path_isfile(full_path): - return (loader(fullname, full_path), []) + return self._get_spec(loader_class, fullname, full_path, None, target) if is_namespace: _verbose_message('possible namespace for {}'.format(base_path)) - return (None, [base_path]) - return (None, []) + spec = ModuleSpec(fullname, None) + spec.submodule_search_locations = [base_path] + return spec + return None def _fill_cache(self): """Fill the cache of potential modules and packages for this directory.""" @@ -1516,23 +2055,43 @@ return '{}.{}'.format(base, name) if name else base -def _find_module(name, path): +def _find_spec(name, path, target=None): """Find a module's loader.""" if not sys.meta_path: _warnings.warn('sys.meta_path is empty', ImportWarning) + # We check sys.modules here for the reload case. While a passed-in + # target will usually indicate a reload there is no guarantee, whereas + # sys.modules provides one. is_reload = name in sys.modules for finder in sys.meta_path: with _ImportLockContext(): - loader = finder.find_module(name, path) - if loader is not None: + try: + find_spec = finder.find_spec + except AttributeError: + loader = finder.find_module(name, path) + if loader is None: + continue + spec = spec_from_loader(name, loader) + else: + spec = find_spec(name, path, target) + if spec is not None: # The parent import may have already imported this module. - if is_reload or name not in sys.modules: - return loader + if not is_reload and name in sys.modules: + module = sys.modules[name] + try: + __spec__ = module.__spec__ + except AttributeError: + # We use the found spec since that is the one that + # we would have used if the parent module hadn't + # beaten us to the punch. + return spec + else: + if __spec__ is None: + return spec + else: + return __spec__ else: - try: - return sys.modules[name].__loader__ - except AttributeError: - return loader + return spec else: return None @@ -1566,54 +2125,28 @@ # Crazy side-effects! if name in sys.modules: return sys.modules[name] - # Backwards-compatibility; be nicer to skip the dict lookup. parent_module = sys.modules[parent] try: path = parent_module.__path__ except AttributeError: msg = (_ERR_MSG + '; {} is not a package').format(name, parent) raise ImportError(msg, name=name) - loader = _find_module(name, path) - if loader is None: + spec = _find_spec(name, path) + if spec is None: raise ImportError(_ERR_MSG.format(name), name=name) - elif name not in sys.modules: - # The parent import may have already imported this module. - loader.load_module(name) - _verbose_message('import {!r} # {!r}', name, loader) - # Backwards-compatibility; be nicer to skip the dict lookup. - module = sys.modules[name] + else: + module = _SpecMethods(spec)._load_unlocked() if parent: # Set the module as an attribute on its parent. parent_module = sys.modules[parent] setattr(parent_module, name.rpartition('.')[2], module) - # Set __package__ if the loader did not. - if getattr(module, '__package__', None) is None: - try: - module.__package__ = module.__name__ - if not hasattr(module, '__path__'): - module.__package__ = module.__package__.rpartition('.')[0] - except AttributeError: - pass - # Set loader if need be. 
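# Sketch of a PEP 451-style meta path finder that the reworked _find_spec()
# above can consume: it returns a ModuleSpec instead of a loader.  The finder,
# loader and module name are all invented for illustration.
import sys
from importlib.machinery import ModuleSpec

class AnswerLoader:
    def exec_module(self, module):
        module.answer = 42

class SpecFinder:
    @classmethod
    def find_spec(cls, fullname, path=None, target=None):
        if fullname == 'fake_mod':
            return ModuleSpec(fullname, AnswerLoader())
        return None   # let the remaining finders handle everything else

sys.meta_path.insert(0, SpecFinder)
try:
    import importlib
    spec = importlib.find_spec('fake_mod')
    print(spec.name, type(spec.loader).__name__)   # fake_mod AnswerLoader
finally:
    sys.meta_path.remove(SpecFinder)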
- if getattr(module, '__loader__', None) is None: - try: - module.__loader__ = loader - except AttributeError: - pass return module def _find_and_load(name, import_): """Find and load the module, and release the import lock.""" - try: - lock = _get_module_lock(name) - finally: - _imp.release_lock() - lock.acquire() - try: + with _ModuleLockManager(name): return _find_and_load_unlocked(name, import_) - finally: - lock.release() def _gcd_import(name, package=None, level=0): @@ -1733,6 +2266,11 @@ return _handle_fromlist(module, fromlist, _gcd_import) +def _builtin_from_name(name): + spec = BuiltinImporter.find_spec(name) + methods = _SpecMethods(spec) + return methods._load_unlocked() + def _setup(sys_module, _imp_module): """Setup importlib by importing needed built-in modules and injecting them @@ -1751,23 +2289,30 @@ else: BYTECODE_SUFFIXES = DEBUG_BYTECODE_SUFFIXES + # Set up the spec for existing builtin/frozen modules. module_type = type(sys) for name, module in sys.modules.items(): if isinstance(module, module_type): - if getattr(module, '__loader__', None) is None: - if name in sys.builtin_module_names: - module.__loader__ = BuiltinImporter - elif _imp.is_frozen(name): - module.__loader__ = FrozenImporter + if name in sys.builtin_module_names: + loader = BuiltinImporter + elif _imp.is_frozen(name): + loader = FrozenImporter + else: + continue + spec = _spec_from_module(module, loader) + methods = _SpecMethods(spec) + methods.init_module_attrs(module) + # Directly load built-in modules needed during bootstrap. self_module = sys.modules[__name__] for builtin_name in ('_io', '_warnings', 'builtins', 'marshal'): if builtin_name not in sys.modules: - builtin_module = BuiltinImporter.load_module(builtin_name) + builtin_module = _builtin_from_name(builtin_name) else: builtin_module = sys.modules[builtin_name] setattr(self_module, builtin_name, builtin_module) + # Directly load the os module (needed during bootstrap). os_details = ('posix', ['/']), ('nt', ['\\', '/']) for builtin_os, path_separators in os_details: # Assumption made in _path_join() @@ -1778,29 +2323,33 @@ break else: try: - os_module = BuiltinImporter.load_module(builtin_os) + os_module = _builtin_from_name(builtin_os) break except ImportError: continue else: raise ImportError('importlib requires posix or nt') + setattr(self_module, '_os', os_module) + setattr(self_module, 'path_sep', path_sep) + setattr(self_module, 'path_separators', ''.join(path_separators)) + # Directly load the _thread module (needed during bootstrap). try: - thread_module = BuiltinImporter.load_module('_thread') + thread_module = _builtin_from_name('_thread') except ImportError: # Python was built without threads thread_module = None - weakref_module = BuiltinImporter.load_module('_weakref') + setattr(self_module, '_thread', thread_module) + # Directly load the _weakref module (needed during bootstrap). + weakref_module = _builtin_from_name('_weakref') + setattr(self_module, '_weakref', weakref_module) + + # Directly load the winreg module (needed during bootstrap). 
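# After the _setup() changes above, modules already imported during interpreter
# startup (builtins and frozen modules) get their import-related attributes
# from a spec rather than from ad-hoc __loader__ fix-ups.  A quick way to
# observe the effect at the prompt:
import sys

print(sys.__spec__.name)                  # 'sys'
print(type(sys.__spec__).__name__)        # 'ModuleSpec'
print(sys.__spec__.loader is not None)    # True -- the BuiltinImporter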
if builtin_os == 'nt': - winreg_module = BuiltinImporter.load_module('winreg') + winreg_module = _builtin_from_name('winreg') setattr(self_module, '_winreg', winreg_module) - setattr(self_module, '_os', os_module) - setattr(self_module, '_thread', thread_module) - setattr(self_module, '_weakref', weakref_module) - setattr(self_module, 'path_sep', path_sep) - setattr(self_module, 'path_separators', ''.join(path_separators)) # Constants setattr(self_module, '_relax_case', _make_relax_case()) EXTENSION_SUFFIXES.extend(_imp.extension_suffixes()) diff --git a/Lib/importlib/abc.py b/Lib/importlib/abc.py --- a/Lib/importlib/abc.py +++ b/Lib/importlib/abc.py @@ -40,12 +40,18 @@ """Abstract base class for import finders on sys.meta_path.""" - @abc.abstractmethod + # We don't define find_spec() here since that would break + # hasattr checks we do to support backward compatibility. + + # XXX Deprecate def find_module(self, fullname, path): - """Abstract method which, when implemented, should find a module. - The fullname is a str and the path is a list of strings or None. - Returns a Loader object or None. + """Return a loader for the module. + + If no module is found, return None. The fullname is a str and + the path is a list of strings or None. + """ + return None def invalidate_caches(self): """An optional method for clearing the finder's cache, if any. @@ -60,17 +66,25 @@ """Abstract base class for path entry finders used by PathFinder.""" - @abc.abstractmethod + # We don't define find_spec() here since that would break + # hasattr checks we do to support backward compatibility. + + # XXX Deprecate. def find_loader(self, fullname): - """Abstract method which, when implemented, returns a module loader or - a possible part of a namespace. - The fullname is a str. Returns a 2-tuple of (Loader, portion) where - portion is a sequence of file system locations contributing to part of - a namespace package. The sequence may be empty and the loader may be - None. + """Return (loader, namespace portion) for the path entry. + + The fullname is a str. The namespace portion is a sequence of + path entries contributing to part of a namespace package. The + sequence may be empty. If loader is not None, the portion will + be ignored. + + The portion will be discarded if another path entry finder + locates the module as a normal module or package. + """ return None, [] + # XXX Deprecate. find_module = _bootstrap._find_module_shim def invalidate_caches(self): @@ -83,35 +97,47 @@ class Loader(metaclass=abc.ABCMeta): - """Abstract base class for import loaders. + """Abstract base class for import loaders.""" - The optional method module_repr(module) may be defined to provide a - repr for a module when appropriate (see PEP 420). The __repr__() method on - the module type will use the method as appropriate. + def create_module(self, spec): + """Return a module to initialize and into which to load. - """ + This method should raise ImportError if anything prevents it + from creating a new module. It may return None to indicate + that the spec should create the new module. - @abc.abstractmethod + create_module() is optional. + + """ + # By default, defer to _SpecMethods.create() for the new module. + return None + + # We don't define exec_module() here since that would break + # hasattr checks we do to support backward compatibility. + + # XXX Deprecate. def load_module(self, fullname): - """Abstract method which when implemented should load a module. - The fullname is a str. + """Return the loaded module. 
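# With the importlib.abc changes above, MetaPathFinder.find_module() becomes a
# concrete default returning None, so a finder that only implements the new
# find_spec() hook is still instantiable.  A minimal, hypothetical subclass:
import importlib.abc

class SpecOnlyFinder(importlib.abc.MetaPathFinder):
    def find_spec(self, fullname, path=None, target=None):
        return None          # "not handled by this finder"

finder = SpecOnlyFinder()                # no abstract-method TypeError
print(finder.find_spec('spam') is None)  # True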
+ + The module must be added to sys.modules and have import-related + attributes set properly. The fullname is a str. ImportError is raised on failure. + """ raise ImportError + # XXX Deprecate. def module_repr(self, module): """Return a module's repr. Used by the module type when the method does not raise NotImplementedError. + """ + # The exception will cause ModuleType.__repr__ to ignore this method. raise NotImplementedError - def init_module_attrs(self, module): - """Set the module's __loader__ attribute.""" - module.__loader__ = self - class ResourceLoader(Loader): @@ -138,12 +164,11 @@ """ - @abc.abstractmethod def is_package(self, fullname): - """Abstract method which when implemented should return whether the + """Optional method which when implemented should return whether the module is a package. The fullname is a str. Returns a bool. - Raises ImportError is the module cannot be found. + Raises ImportError if the module cannot be found. """ raise ImportError @@ -176,19 +201,10 @@ argument should be where the data was retrieved (when applicable).""" return compile(data, path, 'exec', dont_inherit=True) - def init_module_attrs(self, module): - """Initialize the __loader__ and __package__ attributes of the module. - - The name of the module is gleaned from module.__name__. The __package__ - attribute is set based on self.is_package(). - """ - super().init_module_attrs(module) - _bootstrap._init_package_attrs(self, module) - + exec_module = _bootstrap._LoaderBasics.exec_module load_module = _bootstrap._LoaderBasics.load_module -_register(InspectLoader, machinery.BuiltinImporter, machinery.FrozenImporter, - _bootstrap.NamespaceLoader) +_register(InspectLoader, machinery.BuiltinImporter, machinery.FrozenImporter) class ExecutionLoader(InspectLoader): @@ -225,18 +241,6 @@ else: return self.source_to_code(source, path) - def init_module_attrs(self, module): - """Initialize the module's attributes. - - It is assumed that the module's name has been set on module.__name__. - It is also assumed that any path returned by self.get_filename() uses - (one of) the operating system's path separator(s) to separate filenames - from directories in order to set __path__ intelligently. - InspectLoader.init_module_attrs() sets __loader__ and __package__. 
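# Sketch of the loader protocol the revised Loader ABC above describes:
# create_module() may return None to let the import machinery build the module
# object, and exec_module() populates it.  The loader and module name here are
# invented, and the module is created by hand purely for illustration.
import types
import importlib.abc
from importlib.machinery import ModuleSpec

class HelloLoader(importlib.abc.Loader):
    def create_module(self, spec):
        return None            # defer to the default module creation

    def exec_module(self, module):
        module.greeting = 'hello from {}'.format(module.__name__)

spec = ModuleSpec('hello_mod', HelloLoader())
module = types.ModuleType(spec.name)
module.__spec__ = spec
spec.loader.exec_module(module)
print(module.greeting)         # 'hello from hello_mod'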
- """ - super().init_module_attrs(module) - _bootstrap._init_file_attrs(self, module) - _register(ExecutionLoader, machinery.ExtensionFileLoader) diff --git a/Lib/importlib/machinery.py b/Lib/importlib/machinery.py --- a/Lib/importlib/machinery.py +++ b/Lib/importlib/machinery.py @@ -5,6 +5,7 @@ from ._bootstrap import (SOURCE_SUFFIXES, DEBUG_BYTECODE_SUFFIXES, OPTIMIZED_BYTECODE_SUFFIXES, BYTECODE_SUFFIXES, EXTENSION_SUFFIXES) +from ._bootstrap import ModuleSpec from ._bootstrap import BuiltinImporter from ._bootstrap import FrozenImporter from ._bootstrap import WindowsRegistryFinder diff --git a/Lib/importlib/util.py b/Lib/importlib/util.py --- a/Lib/importlib/util.py +++ b/Lib/importlib/util.py @@ -3,13 +3,14 @@ from ._bootstrap import MAGIC_NUMBER from ._bootstrap import cache_from_source from ._bootstrap import decode_source -from ._bootstrap import module_to_load -from ._bootstrap import set_loader -from ._bootstrap import set_package from ._bootstrap import source_from_cache +from ._bootstrap import spec_from_loader +from ._bootstrap import spec_from_file_location from ._bootstrap import _resolve_name +from contextlib import contextmanager import functools +import sys import warnings @@ -28,6 +29,58 @@ return _resolve_name(name[level:], package, level) + at contextmanager +def _module_to_load(name): + is_reload = name in sys.modules + + module = sys.modules.get(name) + if not is_reload: + # This must be done before open() is called as the 'io' module + # implicitly imports 'locale' and would otherwise trigger an + # infinite loop. + module = type(sys)(name) + # This must be done before putting the module in sys.modules + # (otherwise an optimization shortcut in import.c becomes wrong) + module.__initializing__ = True + sys.modules[name] = module + try: + yield module + except Exception: + if not is_reload: + try: + del sys.modules[name] + except KeyError: + pass + finally: + module.__initializing__ = False + + +# XXX deprecate +def set_package(fxn): + """Set __package__ on the returned module.""" + @functools.wraps(fxn) + def set_package_wrapper(*args, **kwargs): + module = fxn(*args, **kwargs) + if getattr(module, '__package__', None) is None: + module.__package__ = module.__name__ + if not hasattr(module, '__path__'): + module.__package__ = module.__package__.rpartition('.')[0] + return module + return set_package_wrapper + + +# XXX deprecate +def set_loader(fxn): + """Set __loader__ on the returned module.""" + @functools.wraps(fxn) + def set_loader_wrapper(self, *args, **kwargs): + module = fxn(self, *args, **kwargs) + if getattr(module, '__loader__', None) is None: + module.__loader__ = self + return module + return set_loader_wrapper + + def module_for_loader(fxn): """Decorator to handle selecting the proper module for loaders. @@ -46,13 +99,11 @@ the second argument. 
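# The ModuleSpec type now exported from importlib.machinery carries the
# per-module import state.  A quick look at the derived attributes; the module
# name and path are invented, and _set_fileattr is the same private flag the
# new tests flip to mark the origin as a real file location:
from importlib.machinery import ModuleSpec, SourceFileLoader
import importlib.util

loader = SourceFileLoader('pkg.mod', '/tmp/pkg/mod.py')
spec = ModuleSpec('pkg.mod', loader, origin='/tmp/pkg/mod.py')
spec._set_fileattr = True

print(spec.parent)         # 'pkg' -- the package the module lives in
print(spec.has_location)   # True -- origin is a loadable location
print(spec.cached == importlib.util.cache_from_source(spec.origin))   # True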
""" - warnings.warn('To make it easier for subclasses, please use ' - 'importlib.util.module_to_load() and ' - 'importlib.abc.Loader.init_module_attrs()', + warnings.warn('The import system now takes care of this automatically.', PendingDeprecationWarning, stacklevel=2) @functools.wraps(fxn) def module_for_loader_wrapper(self, fullname, *args, **kwargs): - with module_to_load(fullname) as module: + with _module_to_load(fullname) as module: module.__loader__ = self try: is_package = self.is_package(fullname) diff --git a/Lib/multiprocessing/spawn.py b/Lib/multiprocessing/spawn.py --- a/Lib/multiprocessing/spawn.py +++ b/Lib/multiprocessing/spawn.py @@ -245,14 +245,13 @@ # We should not try to load __main__ # since that would execute 'if __name__ == "__main__"' # clauses, potentially causing a psuedo fork bomb. - loader = importlib.find_loader(main_name, path=dirs) main_module = types.ModuleType(main_name) - try: - loader.init_module_attrs(main_module) - except AttributeError: # init_module_attrs is optional - pass + # XXX Use a target of main_module? + spec = importlib.find_spec(main_name, path=dirs) + methods = importlib._bootstrap._SpecMethods(spec) + methods.init_module_attrs(main_module) main_module.__name__ = '__mp_main__' - code = loader.get_code(main_name) + code = spec.loader.get_code(main_name) exec(code, main_module.__dict__) old_main_modules.append(sys.modules['__main__']) diff --git a/Lib/pkgutil.py b/Lib/pkgutil.py --- a/Lib/pkgutil.py +++ b/Lib/pkgutil.py @@ -430,6 +430,7 @@ for item in path: yield get_importer(item) + def get_loader(module_or_name): """Get a PEP 302 "loader" object for module_or_name @@ -570,6 +571,7 @@ return path + def get_data(package, resource): """Get a resource from a package. @@ -592,10 +594,15 @@ which does not support get_data(), then None is returned. """ - loader = get_loader(package) + spec = importlib.find_spec(package) + if spec is None: + return None + loader = spec.loader if loader is None or not hasattr(loader, 'get_data'): return None - mod = sys.modules.get(package) or loader.load_module(package) + # XXX needs test + mod = (sys.modules.get(package) or + importlib._bootstrap._SpecMethods(spec).load()) if mod is None or not hasattr(mod, '__file__'): return None diff --git a/Lib/pydoc.py b/Lib/pydoc.py --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -167,8 +167,9 @@ def visiblename(name, all=None, obj=None): """Decide whether to show documentation on a variable.""" # Certain special names are redundant or internal. + # XXX Remove __initializing__? if name in {'__author__', '__builtins__', '__cached__', '__credits__', - '__date__', '__doc__', '__file__', '__initializing__', + '__date__', '__doc__', '__file__', '__spec__', '__loader__', '__module__', '__name__', '__package__', '__path__', '__qualname__', '__slots__', '__version__'}: return 0 diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -2259,7 +2259,7 @@ minstance.b = 2 minstance.a = 1 default_attributes = ['__name__', '__doc__', '__package__', - '__loader__'] + '__loader__', '__spec__'] names = [x for x in dir(minstance) if x not in default_attributes] self.assertEqual(names, ['a', 'b']) diff --git a/Lib/test/test_frozen.py b/Lib/test/test_frozen.py deleted file mode 100644 --- a/Lib/test/test_frozen.py +++ /dev/null @@ -1,77 +0,0 @@ -# Test the frozen module defined in frozen.c. 
- -from test.support import captured_stdout, run_unittest -import unittest -import sys - -class FrozenTests(unittest.TestCase): - - module_attrs = frozenset(['__builtins__', '__cached__', '__doc__', - '__loader__', '__name__', - '__package__']) - package_attrs = frozenset(list(module_attrs) + ['__path__']) - - def test_frozen(self): - with captured_stdout() as stdout: - try: - import __hello__ - except ImportError as x: - self.fail("import __hello__ failed:" + str(x)) - self.assertEqual(__hello__.initialized, True) - expect = set(self.module_attrs) - expect.add('initialized') - self.assertEqual(set(dir(__hello__)), expect) - self.assertEqual(stdout.getvalue(), 'Hello world!\n') - - with captured_stdout() as stdout: - try: - import __phello__ - except ImportError as x: - self.fail("import __phello__ failed:" + str(x)) - self.assertEqual(__phello__.initialized, True) - expect = set(self.package_attrs) - expect.add('initialized') - if not "__phello__.spam" in sys.modules: - self.assertEqual(set(dir(__phello__)), expect) - else: - expect.add('spam') - self.assertEqual(set(dir(__phello__)), expect) - self.assertEqual(__phello__.__path__, []) - self.assertEqual(stdout.getvalue(), 'Hello world!\n') - - with captured_stdout() as stdout: - try: - import __phello__.spam - except ImportError as x: - self.fail("import __phello__.spam failed:" + str(x)) - self.assertEqual(__phello__.spam.initialized, True) - spam_expect = set(self.module_attrs) - spam_expect.add('initialized') - self.assertEqual(set(dir(__phello__.spam)), spam_expect) - phello_expect = set(self.package_attrs) - phello_expect.add('initialized') - phello_expect.add('spam') - self.assertEqual(set(dir(__phello__)), phello_expect) - self.assertEqual(stdout.getvalue(), 'Hello world!\n') - - try: - import __phello__.foo - except ImportError: - pass - else: - self.fail("import __phello__.foo should have failed") - - try: - import __phello__.foo - except ImportError: - pass - else: - self.fail("import __phello__.foo should have failed") - - del sys.modules['__hello__'] - del sys.modules['__phello__'] - del sys.modules['__phello__.spam'] - - -if __name__ == "__main__": - unittest.main() diff --git a/Lib/test/test_import.py b/Lib/test/test_import.py --- a/Lib/test/test_import.py +++ b/Lib/test/test_import.py @@ -1036,11 +1036,14 @@ # away from the traceback. self.create_module("foo", "") importlib = sys.modules['_frozen_importlib'] - old_load_module = importlib.SourceLoader.load_module + if 'load_module' in vars(importlib.SourceLoader): + old_exec_module = importlib.SourceLoader.exec_module + else: + old_exec_module = None try: - def load_module(*args): + def exec_module(*args): 1/0 - importlib.SourceLoader.load_module = load_module + importlib.SourceLoader.exec_module = exec_module try: import foo except ZeroDivisionError as e: @@ -1049,7 +1052,10 @@ self.fail("ZeroDivisionError should have been raised") self.assert_traceback(tb, [__file__, '') + self.assertIsNone(spec) + +Frozen_FindSpecTests, Source_FindSpecTests = util.test_both(FindSpecTests, + machinery=machinery) + + class FinderTests(abc.FinderTests): """Test finding frozen modules.""" @@ -27,13 +62,11 @@ loader = self.find('__phello__.spam', ['__phello__']) self.assertTrue(hasattr(loader, 'load_module')) - def test_package_in_package(self): - # No frozen package within another package to test with. - pass + # No frozen package within another package to test with. + test_package_in_package = None - def test_package_over_module(self): - # No easy way to test. 
- pass + # No easy way to test. + test_package_over_module = None def test_failure(self): loader = self.find('') diff --git a/Lib/test/test_importlib/frozen/test_loader.py b/Lib/test/test_importlib/frozen/test_loader.py --- a/Lib/test/test_importlib/frozen/test_loader.py +++ b/Lib/test/test_importlib/frozen/test_loader.py @@ -3,9 +3,81 @@ machinery = util.import_importlib('importlib.machinery') -import unittest + +import sys from test.support import captured_stdout import types +import unittest + + +class ExecModuleTests(abc.LoaderTests): + + def exec_module(self, name): + with util.uncache(name), captured_stdout() as stdout: + spec = self.machinery.ModuleSpec( + name, self.machinery.FrozenImporter, origin='frozen', + is_package=self.machinery.FrozenImporter.is_package(name)) + module = types.ModuleType(name) + module.__spec__ = spec + assert not hasattr(module, 'initialized') + self.machinery.FrozenImporter.exec_module(module) + self.assertTrue(module.initialized) + self.assertTrue(hasattr(module, '__spec__')) + self.assertEqual(module.__spec__.origin, 'frozen') + return module, stdout.getvalue() + + def test_module(self): + name = '__hello__' + module, output = self.exec_module(name) + check = {'__name__': name} + for attr, value in check.items(): + self.assertEqual(getattr(module, attr), value) + self.assertEqual(output, 'Hello world!\n') + self.assertTrue(hasattr(module, '__spec__')) + + def test_package(self): + name = '__phello__' + module, output = self.exec_module(name) + check = {'__name__': name} + for attr, value in check.items(): + attr_value = getattr(module, attr) + self.assertEqual(attr_value, value, + 'for {name}.{attr}, {given!r} != {expected!r}'.format( + name=name, attr=attr, given=attr_value, + expected=value)) + self.assertEqual(output, 'Hello world!\n') + + def test_lacking_parent(self): + name = '__phello__.spam' + with util.uncache('__phello__'): + module, output = self.exec_module(name) + check = {'__name__': name} + for attr, value in check.items(): + attr_value = getattr(module, attr) + self.assertEqual(attr_value, value, + 'for {name}.{attr}, {given} != {expected!r}'.format( + name=name, attr=attr, given=attr_value, + expected=value)) + self.assertEqual(output, 'Hello world!\n') + + + def test_module_repr(self): + name = '__hello__' + module, output = self.exec_module(name) + self.assertEqual(repr(module), + "") + + # No way to trigger an error in a frozen module. + test_state_after_failure = None + + def test_unloadable(self): + assert self.machinery.FrozenImporter.find_module('_not_real') is None + with self.assertRaises(ImportError) as cm: + self.exec_module('_not_real') + self.assertEqual(cm.exception.name, '_not_real') + +Frozen_ExecModuleTests, Source_ExecModuleTests = util.test_both(ExecModuleTests, + machinery=machinery) class LoaderTests(abc.LoaderTests): @@ -68,9 +140,8 @@ self.assertEqual(repr(module), "") - def test_state_after_failure(self): - # No way to trigger an error in a frozen module. - pass + # No way to trigger an error in a frozen module. 
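# The frozen-loader tests above exercise the new FrozenImporter.find_spec();
# for the bundled __hello__ module the spec reports a 'frozen' origin, and an
# unknown name simply yields None:
from importlib.machinery import FrozenImporter

spec = FrozenImporter.find_spec('__hello__')
print(spec.origin)                             # 'frozen'
print(spec.loader is FrozenImporter)           # True
print(FrozenImporter.find_spec('_not_real'))   # None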
+ test_state_after_failure = None def test_unloadable(self): assert self.machinery.FrozenImporter.find_module('_not_real') is None diff --git a/Lib/test/test_importlib/import_/test_meta_path.py b/Lib/test/test_importlib/import_/test_meta_path.py --- a/Lib/test/test_importlib/import_/test_meta_path.py +++ b/Lib/test/test_importlib/import_/test_meta_path.py @@ -46,8 +46,8 @@ with util.import_state(meta_path=[]): with warnings.catch_warnings(record=True) as w: warnings.simplefilter('always') - self.assertIsNone(importlib._bootstrap._find_module('nothing', - None)) + self.assertIsNone(importlib._bootstrap._find_spec('nothing', + None)) self.assertEqual(len(w), 1) self.assertTrue(issubclass(w[-1].category, ImportWarning)) diff --git a/Lib/test/test_importlib/test_abc.py b/Lib/test/test_importlib/test_abc.py --- a/Lib/test/test_importlib/test_abc.py +++ b/Lib/test/test_importlib/test_abc.py @@ -284,22 +284,6 @@ tests = make_return_value_tests(ExecutionLoader, InspectLoaderDefaultsTests) Frozen_ELDefaultTests, Source_ELDefaultsTests = tests -##### Loader concrete methods ################################################## -class LoaderConcreteMethodTests: - - def test_init_module_attrs(self): - loader = self.LoaderSubclass() - module = types.ModuleType('blah') - loader.init_module_attrs(module) - self.assertEqual(module.__loader__, loader) - - -class Frozen_LoaderConcreateMethodTests(LoaderConcreteMethodTests, unittest.TestCase): - LoaderSubclass = Frozen_L - -class Source_LoaderConcreateMethodTests(LoaderConcreteMethodTests, unittest.TestCase): - LoaderSubclass = Source_L - ##### InspectLoader concrete methods ########################################### class InspectLoaderSourceToCodeTests: @@ -385,60 +369,6 @@ InspectLoaderSubclass = Source_IL -class InspectLoaderInitModuleTests: - - def mock_is_package(self, return_value): - return mock.patch.object(self.InspectLoaderSubclass, 'is_package', - return_value=return_value) - - def init_module_attrs(self, name): - loader = self.InspectLoaderSubclass() - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertEqual(module.__loader__, loader) - return module - - def test_package(self): - # If a package, then __package__ == __name__, __path__ == [] - with self.mock_is_package(True): - name = 'blah' - module = self.init_module_attrs(name) - self.assertEqual(module.__package__, name) - self.assertEqual(module.__path__, []) - - def test_toplevel(self): - # If a module is top-level, __package__ == '' - with self.mock_is_package(False): - name = 'blah' - module = self.init_module_attrs(name) - self.assertEqual(module.__package__, '') - - def test_submodule(self): - # If a module is contained within a package then set __package__ to the - # package name. - with self.mock_is_package(False): - name = 'pkg.mod' - module = self.init_module_attrs(name) - self.assertEqual(module.__package__, 'pkg') - - def test_is_package_ImportError(self): - # If is_package() raises ImportError, __package__ should be None and - # __path__ should not be set. 
- with self.mock_is_package(False) as mocked_method: - mocked_method.side_effect = ImportError - name = 'mod' - module = self.init_module_attrs(name) - self.assertIsNone(module.__package__) - self.assertFalse(hasattr(module, '__path__')) - - -class Frozen_ILInitModuleTests(InspectLoaderInitModuleTests, unittest.TestCase): - InspectLoaderSubclass = Frozen_IL - -class Source_ILInitModuleTests(InspectLoaderInitModuleTests, unittest.TestCase): - InspectLoaderSubclass = Source_IL - - class InspectLoaderLoadModuleTests: """Test InspectLoader.load_module().""" @@ -550,80 +480,6 @@ ExecutionLoaderSubclass = Source_EL -class ExecutionLoaderInitModuleTests: - - def mock_is_package(self, return_value): - return mock.patch.object(self.ExecutionLoaderSubclass, 'is_package', - return_value=return_value) - - @contextlib.contextmanager - def mock_methods(self, is_package, filename): - is_package_manager = self.mock_is_package(is_package) - get_filename_manager = mock.patch.object(self.ExecutionLoaderSubclass, - 'get_filename', return_value=filename) - with is_package_manager as mock_is_package: - with get_filename_manager as mock_get_filename: - yield {'is_package': mock_is_package, - 'get_filename': mock_get_filename} - - def test_toplevel(self): - # Verify __loader__, __file__, and __package__; no __path__. - name = 'blah' - path = os.path.join('some', 'path', '{}.py'.format(name)) - with self.mock_methods(False, path): - loader = self.ExecutionLoaderSubclass() - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertIs(module.__loader__, loader) - self.assertEqual(module.__file__, path) - self.assertEqual(module.__package__, '') - self.assertFalse(hasattr(module, '__path__')) - - def test_package(self): - # Verify __loader__, __file__, __package__, and __path__. - name = 'pkg' - path = os.path.join('some', 'pkg', '__init__.py') - with self.mock_methods(True, path): - loader = self.ExecutionLoaderSubclass() - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertIs(module.__loader__, loader) - self.assertEqual(module.__file__, path) - self.assertEqual(module.__package__, 'pkg') - self.assertEqual(module.__path__, [os.path.dirname(path)]) - - def test_submodule(self): - # Verify __package__ and not __path__; test_toplevel() takes care of - # other attributes. - name = 'pkg.submodule' - path = os.path.join('some', 'pkg', 'submodule.py') - with self.mock_methods(False, path): - loader = self.ExecutionLoaderSubclass() - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertEqual(module.__package__, 'pkg') - self.assertEqual(module.__file__, path) - self.assertFalse(hasattr(module, '__path__')) - - def test_get_filename_ImportError(self): - # If get_filename() raises ImportError, don't set __file__. 
- name = 'blah' - path = 'blah.py' - with self.mock_methods(False, path) as mocked_methods: - mocked_methods['get_filename'].side_effect = ImportError - loader = self.ExecutionLoaderSubclass() - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertFalse(hasattr(module, '__file__')) - - -class Frozen_ELInitModuleTests(ExecutionLoaderInitModuleTests, unittest.TestCase): - ExecutionLoaderSubclass = Frozen_EL - -class Source_ELInitModuleTests(ExecutionLoaderInitModuleTests, unittest.TestCase): - ExecutionLoaderSubclass = Source_EL - - ##### SourceLoader concrete methods ############################################ class SourceLoader: @@ -952,58 +808,5 @@ SourceOnlyLoaderMock = Source_SourceOnlyL -class SourceLoaderInitModuleAttrTests: - - """Tests for importlib.abc.SourceLoader.init_module_attrs().""" - - def test_init_module_attrs(self): - # If __file__ set, __cached__ == importlib.util.cached_from_source(__file__). - name = 'blah' - path = 'blah.py' - loader = self.SourceOnlyLoaderMock(path) - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertEqual(module.__loader__, loader) - self.assertEqual(module.__package__, '') - self.assertEqual(module.__file__, path) - self.assertEqual(module.__cached__, self.util.cache_from_source(path)) - - def test_no_get_filename(self): - # No __file__, no __cached__. - with mock.patch.object(self.SourceOnlyLoaderMock, 'get_filename') as mocked: - mocked.side_effect = ImportError - name = 'blah' - loader = self.SourceOnlyLoaderMock('blah.py') - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertFalse(hasattr(module, '__file__')) - self.assertFalse(hasattr(module, '__cached__')) - - -class Frozen_SLInitModuleAttrTests(SourceLoaderInitModuleAttrTests, unittest.TestCase): - SourceOnlyLoaderMock = Frozen_SourceOnlyL - util = frozen_util - - # Difficult to test under source thanks to cross-module mocking needs. - @mock.patch('importlib._bootstrap.cache_from_source') - def test_cache_from_source_NotImplementedError(self, mock_cache_from_source): - # If importlib.util.cache_from_source() raises NotImplementedError don't set - # __cached__. - mock_cache_from_source.side_effect = NotImplementedError - name = 'blah' - path = 'blah.py' - loader = self.SourceOnlyLoaderMock(path) - module = types.ModuleType(name) - loader.init_module_attrs(module) - self.assertEqual(module.__file__, path) - self.assertFalse(hasattr(module, '__cached__')) - - -class Source_SLInitModuleAttrTests(SourceLoaderInitModuleAttrTests, unittest.TestCase): - SourceOnlyLoaderMock = Source_SourceOnlyL - util = source_util - - - if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_importlib/test_api.py b/Lib/test/test_importlib/test_api.py --- a/Lib/test/test_importlib/test_api.py +++ b/Lib/test/test_importlib/test_api.py @@ -165,6 +165,96 @@ init = source_init +class FindSpecTests: + + class FakeMetaFinder: + @staticmethod + def find_spec(name, path=None, target=None): return name, path, target + + def test_sys_modules(self): + name = 'some_mod' + with util.uncache(name): + module = types.ModuleType(name) + loader = 'a loader!' + spec = self.machinery.ModuleSpec(name, loader) + module.__loader__ = loader + module.__spec__ = spec + sys.modules[name] = module + found = self.init.find_spec(name) + self.assertEqual(found, spec) + + def test_sys_modules_without___loader__(self): + name = 'some_mod' + with util.uncache(name): + module = types.ModuleType(name) + del module.__loader__ + loader = 'a loader!' 
+ spec = self.machinery.ModuleSpec(name, loader) + module.__spec__ = spec + sys.modules[name] = module + found = self.init.find_spec(name) + self.assertEqual(found, spec) + + def test_sys_modules_spec_is_None(self): + name = 'some_mod' + with util.uncache(name): + module = types.ModuleType(name) + module.__spec__ = None + sys.modules[name] = module + with self.assertRaises(ValueError): + self.init.find_spec(name) + + def test_sys_modules_loader_is_None(self): + name = 'some_mod' + with util.uncache(name): + module = types.ModuleType(name) + spec = self.machinery.ModuleSpec(name, None) + module.__spec__ = spec + sys.modules[name] = module + found = self.init.find_spec(name) + self.assertEqual(found, spec) + + def test_sys_modules_spec_is_not_set(self): + name = 'some_mod' + with util.uncache(name): + module = types.ModuleType(name) + try: + del module.__spec__ + except AttributeError: + pass + sys.modules[name] = module + with self.assertRaises(ValueError): + self.init.find_spec(name) + + def test_success(self): + name = 'some_mod' + with util.uncache(name): + with util.import_state(meta_path=[self.FakeMetaFinder]): + self.assertEqual((name, None, None), + self.init.find_spec(name)) + + def test_success_path(self): + # Searching on a path should work. + name = 'some_mod' + path = 'path to some place' + with util.uncache(name): + with util.import_state(meta_path=[self.FakeMetaFinder]): + self.assertEqual((name, path, None), + self.init.find_spec(name, path)) + + def test_nothing(self): + # None is returned upon failure to find a loader. + self.assertIsNone(self.init.find_spec('nevergoingtofindthismodule')) + +class Frozen_FindSpecTests(FindSpecTests, unittest.TestCase): + init = frozen_init + machinery = frozen_machinery + +class Source_FindSpecTests(FindSpecTests, unittest.TestCase): + init = source_init + machinery = source_machinery + + class ReloadTests: """Test module reloading for builtin and extension modules.""" @@ -219,6 +309,7 @@ with support.temp_cwd(None) as cwd: with util.uncache('spam'): with support.DirsOnSysPath(cwd): + # Start as a plain module. self.init.invalidate_caches() path = os.path.join(cwd, name + '.py') cached = self.util.cache_from_source(path) @@ -232,11 +323,14 @@ support.create_empty_file(path) module = self.init.import_module(name) ns = vars(module) - del ns['__initializing__'] loader = ns.pop('__loader__') + spec = ns.pop('__spec__') + self.assertEqual(spec.name, name) + self.assertEqual(spec.loader, loader) self.assertEqual(loader.path, path) self.assertEqual(ns, expected) + # Change to a package. self.init.invalidate_caches() init_path = os.path.join(cwd, name, '__init__.py') cached = self.util.cache_from_source(init_path) @@ -252,18 +346,21 @@ os.rename(path, init_path) reloaded = self.init.reload(module) ns = vars(reloaded) - del ns['__initializing__'] loader = ns.pop('__loader__') + spec = ns.pop('__spec__') + self.assertEqual(spec.name, name) + self.assertEqual(spec.loader, loader) self.assertIs(reloaded, module) self.assertEqual(loader.path, init_path) + self.maxDiff = None self.assertEqual(ns, expected) def test_reload_namespace_changed(self): - self.maxDiff = None name = 'spam' with support.temp_cwd(None) as cwd: with util.uncache('spam'): with support.DirsOnSysPath(cwd): + # Start as a namespace package. 
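# The FindSpecTests above pin down how importlib.find_spec() treats modules
# that are already imported: it hands back module.__spec__, and raises
# ValueError when that attribute is missing or None.  A small demonstration
# with an invented module name:
import importlib
import sys
import types
from importlib.machinery import ModuleSpec

mod = types.ModuleType('already_there')
mod.__spec__ = ModuleSpec('already_there', None)
sys.modules['already_there'] = mod
try:
    print(importlib.find_spec('already_there') is mod.__spec__)   # True
    mod.__spec__ = None
    try:
        importlib.find_spec('already_there')
    except ValueError:
        print('ValueError, as the tests expect')
finally:
    del sys.modules['already_there']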
self.init.invalidate_caches() bad_path = os.path.join(cwd, name, '__init.py') cached = self.util.cache_from_source(bad_path) @@ -276,9 +373,12 @@ init_file.write('eggs = None') module = self.init.import_module(name) ns = vars(module) - del ns['__initializing__'] loader = ns.pop('__loader__') path = ns.pop('__path__') + spec = ns.pop('__spec__') + self.assertEqual(spec.name, name) + self.assertIs(spec.loader, None) + self.assertIsNot(loader, None) self.assertEqual(set(path), set([os.path.dirname(bad_path)])) with self.assertRaises(AttributeError): @@ -286,6 +386,7 @@ loader.path self.assertEqual(ns, expected) + # Change to a regular package. self.init.invalidate_caches() init_path = os.path.join(cwd, name, '__init__.py') cached = self.util.cache_from_source(init_path) @@ -301,8 +402,10 @@ os.rename(bad_path, init_path) reloaded = self.init.reload(module) ns = vars(reloaded) - del ns['__initializing__'] loader = ns.pop('__loader__') + spec = ns.pop('__spec__') + self.assertEqual(spec.name, name) + self.assertEqual(spec.loader, loader) self.assertIs(reloaded, module) self.assertEqual(loader.path, init_path) self.assertEqual(ns, expected) @@ -371,12 +474,23 @@ # Issue #17098: all modules should have __loader__ defined. for name, module in sys.modules.items(): if isinstance(module, types.ModuleType): - self.assertTrue(hasattr(module, '__loader__'), - '{!r} lacks a __loader__ attribute'.format(name)) - if self.machinery.BuiltinImporter.find_module(name): - self.assertIsNot(module.__loader__, None) - elif self.machinery.FrozenImporter.find_module(name): - self.assertIsNot(module.__loader__, None) + with self.subTest(name=name): + self.assertTrue(hasattr(module, '__loader__'), + '{!r} lacks a __loader__ attribute'.format(name)) + if self.machinery.BuiltinImporter.find_module(name): + self.assertIsNot(module.__loader__, None) + elif self.machinery.FrozenImporter.find_module(name): + self.assertIsNot(module.__loader__, None) + + def test_everyone_has___spec__(self): + for name, module in sys.modules.items(): + if isinstance(module, types.ModuleType): + with self.subTest(name=name): + self.assertTrue(hasattr(module, '__spec__')) + if self.machinery.BuiltinImporter.find_module(name): + self.assertIsNot(module.__spec__, None) + elif self.machinery.FrozenImporter.find_module(name): + self.assertIsNot(module.__spec__, None) class Frozen_StartupTests(StartupTests, unittest.TestCase): machinery = frozen_machinery diff --git a/Lib/test/test_importlib/test_spec.py b/Lib/test/test_importlib/test_spec.py new file mode 100644 --- /dev/null +++ b/Lib/test/test_importlib/test_spec.py @@ -0,0 +1,968 @@ +from . 
import util + +frozen_init, source_init = util.import_importlib('importlib') +frozen_bootstrap = frozen_init._bootstrap +source_bootstrap = source_init._bootstrap +frozen_machinery, source_machinery = util.import_importlib('importlib.machinery') +frozen_util, source_util = util.import_importlib('importlib.util') + +import os.path +from test.support import CleanImport +import unittest +import sys + + + +class TestLoader: + + def __init__(self, path=None, is_package=None): +# if path: +# if is_package: +# if not path.endswith('.py'): +# path = os.path.join(path, '__init__.py') +# elif is_package is None: +# is_package = path.endswith('__init__.py') + + self.path = path + self.package = is_package + + def __repr__(self): + return '' + + def __getattr__(self, name): + if name == 'get_filename' and self.path is not None: + return self._get_filename + if name == 'is_package': + return self._is_package + raise AttributeError(name) + + def _get_filename(self, name): + return self.path + + def _is_package(self, name): + return self.package + + +class NewLoader(TestLoader): + + EGGS = 1 + + def exec_module(self, module): + module.eggs = self.EGGS + + +class LegacyLoader(TestLoader): + + HAM = -1 + + @frozen_util.module_for_loader + def load_module(self, module): + module.ham = self.HAM + return module + + +class ModuleSpecTests: + + def setUp(self): + self.name = 'spam' + self.path = 'spam.py' + self.cached = self.util.cache_from_source(self.path) + self.loader = TestLoader() + self.spec = self.machinery.ModuleSpec(self.name, self.loader) + self.loc_spec = self.machinery.ModuleSpec(self.name, self.loader, + origin=self.path) + self.loc_spec._set_fileattr = True + + def test_default(self): + spec = self.machinery.ModuleSpec(self.name, self.loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_default_no_loader(self): + spec = self.machinery.ModuleSpec(self.name, None) + + self.assertEqual(spec.name, self.name) + self.assertIs(spec.loader, None) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_default_is_package_false(self): + spec = self.machinery.ModuleSpec(self.name, self.loader, + is_package=False) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_default_is_package_true(self): + spec = self.machinery.ModuleSpec(self.name, self.loader, + is_package=True) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, []) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_equality(self): + other = type(sys.implementation)(name=self.name, + loader=self.loader, + origin=None, + submodule_search_locations=None, + has_location=False, + cached=None, + ) + + self.assertTrue(self.spec == other) + + def test_equality_location(self): + other = 
type(sys.implementation)(name=self.name, + loader=self.loader, + origin=self.path, + submodule_search_locations=None, + has_location=True, + cached=self.cached, + ) + + self.assertEqual(self.loc_spec, other) + + def test_inequality(self): + other = type(sys.implementation)(name='ham', + loader=self.loader, + origin=None, + submodule_search_locations=None, + has_location=False, + cached=None, + ) + + self.assertNotEqual(self.spec, other) + + def test_inequality_incomplete(self): + other = type(sys.implementation)(name=self.name, + loader=self.loader, + ) + + self.assertNotEqual(self.spec, other) + + def test_package(self): + spec = self.machinery.ModuleSpec('spam.eggs', self.loader) + + self.assertEqual(spec.parent, 'spam') + + def test_package_is_package(self): + spec = self.machinery.ModuleSpec('spam.eggs', self.loader, + is_package=True) + + self.assertEqual(spec.parent, 'spam.eggs') + + # cached + + def test_cached_set(self): + before = self.spec.cached + self.spec.cached = 'there' + after = self.spec.cached + + self.assertIs(before, None) + self.assertEqual(after, 'there') + + def test_cached_no_origin(self): + spec = self.machinery.ModuleSpec(self.name, self.loader) + + self.assertIs(spec.cached, None) + + def test_cached_with_origin_not_location(self): + spec = self.machinery.ModuleSpec(self.name, self.loader, + origin=self.path) + + self.assertIs(spec.cached, None) + + def test_cached_source(self): + expected = self.util.cache_from_source(self.path) + + self.assertEqual(self.loc_spec.cached, expected) + + def test_cached_source_unknown_suffix(self): + self.loc_spec.origin = 'spam.spamspamspam' + + self.assertIs(self.loc_spec.cached, None) + + def test_cached_source_missing_cache_tag(self): + original = sys.implementation.cache_tag + sys.implementation.cache_tag = None + try: + cached = self.loc_spec.cached + finally: + sys.implementation.cache_tag = original + + self.assertIs(cached, None) + + def test_cached_sourceless(self): + self.loc_spec.origin = 'spam.pyc' + + self.assertEqual(self.loc_spec.cached, 'spam.pyc') + + +class Frozen_ModuleSpecTests(ModuleSpecTests, unittest.TestCase): + util = frozen_util + machinery = frozen_machinery + + +class Source_ModuleSpecTests(ModuleSpecTests, unittest.TestCase): + util = source_util + machinery = source_machinery + + +class ModuleSpecMethodsTests: + + def setUp(self): + self.name = 'spam' + self.path = 'spam.py' + self.cached = self.util.cache_from_source(self.path) + self.loader = TestLoader() + self.spec = self.machinery.ModuleSpec(self.name, self.loader) + self.loc_spec = self.machinery.ModuleSpec(self.name, self.loader, + origin=self.path) + self.loc_spec._set_fileattr = True + + # init_module_attrs + + def test_init_module_attrs(self): + module = type(sys)(self.name) + spec = self.machinery.ModuleSpec(self.name, self.loader) + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertEqual(module.__name__, spec.name) + self.assertIs(module.__loader__, spec.loader) + self.assertEqual(module.__package__, spec.parent) + self.assertIs(module.__spec__, spec) + self.assertFalse(hasattr(module, '__path__')) + self.assertFalse(hasattr(module, '__file__')) + self.assertFalse(hasattr(module, '__cached__')) + + def test_init_module_attrs_package(self): + module = type(sys)(self.name) + spec = self.machinery.ModuleSpec(self.name, self.loader) + spec.submodule_search_locations = ['spam', 'ham'] + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertEqual(module.__name__, spec.name) + 
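# The "cached" tests above spell out how ModuleSpec.cached is derived from the
# origin.  A compact illustration; the file names are invented and
# _set_fileattr is the same private flag the tests set to mark the origin as a
# file location:
from importlib.machinery import ModuleSpec
import importlib.util

spec = ModuleSpec('spam', None, origin='spam.py')
spec._set_fileattr = True
print(spec.cached == importlib.util.cache_from_source('spam.py'))   # True

spec = ModuleSpec('spam', None, origin='spam.pyc')
spec._set_fileattr = True
print(spec.cached)        # 'spam.pyc' -- sourceless files are their own cache

spec = ModuleSpec('spam', None, origin='spam.spamspamspam')
spec._set_fileattr = True
print(spec.cached)        # None -- unrecognized suffix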
self.assertIs(module.__loader__, spec.loader) + self.assertEqual(module.__package__, spec.parent) + self.assertIs(module.__spec__, spec) + self.assertIs(module.__path__, spec.submodule_search_locations) + self.assertFalse(hasattr(module, '__file__')) + self.assertFalse(hasattr(module, '__cached__')) + + def test_init_module_attrs_location(self): + module = type(sys)(self.name) + spec = self.loc_spec + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertEqual(module.__name__, spec.name) + self.assertIs(module.__loader__, spec.loader) + self.assertEqual(module.__package__, spec.parent) + self.assertIs(module.__spec__, spec) + self.assertFalse(hasattr(module, '__path__')) + self.assertEqual(module.__file__, spec.origin) + self.assertEqual(module.__cached__, + self.util.cache_from_source(spec.origin)) + + def test_init_module_attrs_different_name(self): + module = type(sys)('eggs') + spec = self.machinery.ModuleSpec(self.name, self.loader) + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertEqual(module.__name__, spec.name) + + def test_init_module_attrs_different_spec(self): + module = type(sys)(self.name) + module.__spec__ = self.machinery.ModuleSpec('eggs', object()) + spec = self.machinery.ModuleSpec(self.name, self.loader) + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertEqual(module.__name__, spec.name) + self.assertIs(module.__loader__, spec.loader) + self.assertEqual(module.__package__, spec.parent) + self.assertIs(module.__spec__, spec) + + def test_init_module_attrs_already_set(self): + module = type(sys)('ham.eggs') + module.__loader__ = object() + module.__package__ = 'ham' + module.__path__ = ['eggs'] + module.__file__ = 'ham/eggs/__init__.py' + module.__cached__ = self.util.cache_from_source(module.__file__) + original = vars(module).copy() + spec = self.loc_spec + spec.submodule_search_locations = [''] + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertIs(module.__loader__, original['__loader__']) + self.assertEqual(module.__package__, original['__package__']) + self.assertIs(module.__path__, original['__path__']) + self.assertEqual(module.__file__, original['__file__']) + self.assertEqual(module.__cached__, original['__cached__']) + + def test_init_module_attrs_immutable(self): + module = object() + spec = self.loc_spec + spec.submodule_search_locations = [''] + self.bootstrap._SpecMethods(spec).init_module_attrs(module) + + self.assertFalse(hasattr(module, '__name__')) + self.assertFalse(hasattr(module, '__loader__')) + self.assertFalse(hasattr(module, '__package__')) + self.assertFalse(hasattr(module, '__spec__')) + self.assertFalse(hasattr(module, '__path__')) + self.assertFalse(hasattr(module, '__file__')) + self.assertFalse(hasattr(module, '__cached__')) + + # create() + + def test_create(self): + created = self.bootstrap._SpecMethods(self.spec).create() + + self.assertEqual(created.__name__, self.spec.name) + self.assertIs(created.__loader__, self.spec.loader) + self.assertEqual(created.__package__, self.spec.parent) + self.assertIs(created.__spec__, self.spec) + self.assertFalse(hasattr(created, '__path__')) + self.assertFalse(hasattr(created, '__file__')) + self.assertFalse(hasattr(created, '__cached__')) + + def test_create_from_loader(self): + module = type(sys.implementation)() + class CreatingLoader(TestLoader): + def create_module(self, spec): + return module + self.spec.loader = CreatingLoader() + created = self.bootstrap._SpecMethods(self.spec).create() + + 
self.assertIs(created, module) + self.assertEqual(created.__name__, self.spec.name) + self.assertIs(created.__loader__, self.spec.loader) + self.assertEqual(created.__package__, self.spec.parent) + self.assertIs(created.__spec__, self.spec) + self.assertFalse(hasattr(created, '__path__')) + self.assertFalse(hasattr(created, '__file__')) + self.assertFalse(hasattr(created, '__cached__')) + + def test_create_from_loader_not_handled(self): + class CreatingLoader(TestLoader): + def create_module(self, spec): + return None + self.spec.loader = CreatingLoader() + created = self.bootstrap._SpecMethods(self.spec).create() + + self.assertEqual(created.__name__, self.spec.name) + self.assertIs(created.__loader__, self.spec.loader) + self.assertEqual(created.__package__, self.spec.parent) + self.assertIs(created.__spec__, self.spec) + self.assertFalse(hasattr(created, '__path__')) + self.assertFalse(hasattr(created, '__file__')) + self.assertFalse(hasattr(created, '__cached__')) + + # exec() + + def test_exec(self): + self.spec.loader = NewLoader() + module = self.bootstrap._SpecMethods(self.spec).create() + sys.modules[self.name] = module + self.assertFalse(hasattr(module, 'eggs')) + self.bootstrap._SpecMethods(self.spec).exec(module) + + self.assertEqual(module.eggs, 1) + + # load() + + def test_load(self): + self.spec.loader = NewLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + installed = sys.modules[self.spec.name] + + self.assertEqual(loaded.eggs, 1) + self.assertIs(loaded, installed) + + def test_load_replaced(self): + replacement = object() + class ReplacingLoader(TestLoader): + def exec_module(self, module): + sys.modules[module.__name__] = replacement + self.spec.loader = ReplacingLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + installed = sys.modules[self.spec.name] + + self.assertIs(loaded, replacement) + self.assertIs(installed, replacement) + + def test_load_failed(self): + class FailedLoader(TestLoader): + def exec_module(self, module): + raise RuntimeError + self.spec.loader = FailedLoader() + with CleanImport(self.spec.name): + with self.assertRaises(RuntimeError): + loaded = self.bootstrap._SpecMethods(self.spec).load() + self.assertNotIn(self.spec.name, sys.modules) + + def test_load_failed_removed(self): + class FailedLoader(TestLoader): + def exec_module(self, module): + del sys.modules[module.__name__] + raise RuntimeError + self.spec.loader = FailedLoader() + with CleanImport(self.spec.name): + with self.assertRaises(RuntimeError): + loaded = self.bootstrap._SpecMethods(self.spec).load() + self.assertNotIn(self.spec.name, sys.modules) + + def test_load_existing(self): + existing = type(sys)('ham') + existing.count = 5 + self.spec.loader = NewLoader() + with CleanImport(self.name): + sys.modules[self.name] = existing + assert self.spec.name == self.name + loaded = self.bootstrap._SpecMethods(self.spec).load() + + self.assertEqual(loaded.eggs, 1) + self.assertFalse(hasattr(loaded, 'ham')) + + def test_load_legacy(self): + self.spec.loader = LegacyLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + + self.assertEqual(loaded.ham, -1) + + def test_load_legacy_attributes(self): + self.spec.loader = LegacyLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + + self.assertIs(loaded.__loader__, self.spec.loader) + self.assertEqual(loaded.__package__, self.spec.parent) + 
self.assertIs(loaded.__spec__, self.spec) + + def test_load_legacy_attributes_immutable(self): + module = object() + class ImmutableLoader(TestLoader): + def load_module(self, name): + sys.modules[name] = module + return module + self.spec.loader = ImmutableLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + + self.assertIs(sys.modules[self.spec.name], module) + + # reload() + + def test_reload(self): + self.spec.loader = NewLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + reloaded = self.bootstrap._SpecMethods(self.spec).exec(loaded) + installed = sys.modules[self.spec.name] + + self.assertEqual(loaded.eggs, 1) + self.assertIs(reloaded, loaded) + self.assertIs(installed, loaded) + + def test_reload_modified(self): + self.spec.loader = NewLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + loaded.eggs = 2 + reloaded = self.bootstrap._SpecMethods(self.spec).exec(loaded) + + self.assertEqual(loaded.eggs, 1) + self.assertIs(reloaded, loaded) + + def test_reload_extra_attributes(self): + self.spec.loader = NewLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + loaded.available = False + reloaded = self.bootstrap._SpecMethods(self.spec).exec(loaded) + + self.assertFalse(loaded.available) + self.assertIs(reloaded, loaded) + + def test_reload_init_module_attrs(self): + self.spec.loader = NewLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + loaded.__name__ = 'ham' + del loaded.__loader__ + del loaded.__package__ + del loaded.__spec__ + self.bootstrap._SpecMethods(self.spec).exec(loaded) + + self.assertEqual(loaded.__name__, self.spec.name) + self.assertIs(loaded.__loader__, self.spec.loader) + self.assertEqual(loaded.__package__, self.spec.parent) + self.assertIs(loaded.__spec__, self.spec) + self.assertFalse(hasattr(loaded, '__path__')) + self.assertFalse(hasattr(loaded, '__file__')) + self.assertFalse(hasattr(loaded, '__cached__')) + + def test_reload_legacy(self): + self.spec.loader = LegacyLoader() + with CleanImport(self.spec.name): + loaded = self.bootstrap._SpecMethods(self.spec).load() + reloaded = self.bootstrap._SpecMethods(self.spec).exec(loaded) + installed = sys.modules[self.spec.name] + + self.assertEqual(loaded.ham, -1) + self.assertIs(reloaded, loaded) + self.assertIs(installed, loaded) + + +class Frozen_ModuleSpecMethodsTests(ModuleSpecMethodsTests, unittest.TestCase): + bootstrap = frozen_bootstrap + machinery = frozen_machinery + util = frozen_util + + +class Source_ModuleSpecMethodsTests(ModuleSpecMethodsTests, unittest.TestCase): + bootstrap = source_bootstrap + machinery = source_machinery + util = source_util + + +class ModuleReprTests: + + # XXX Add more tests for repr(module) once ModuleSpec._module_repr() + # is in place? 
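# The ModuleReprTests that follow check how a module's repr is now driven by
# its __spec__.  The effect is visible from plain Python; the module name and
# origin below are invented:
import types
from importlib.machinery import ModuleSpec

mod = types.ModuleType('spam')
print(repr(mod))     # e.g. <module 'spam'>
mod.__spec__ = ModuleSpec('spam', None, origin='in a hole, in the ground')
print(repr(mod))     # e.g. <module 'spam' (in a hole, in the ground)>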
+ + def setUp(self): + self.module = type(os)('spam') + self.spec = self.machinery.ModuleSpec('spam', TestLoader()) + + def test_module___loader___module_repr(self): + class Loader: + def module_repr(self, module): + return ''.format(module.__name__) + self.module.__loader__ = Loader() + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, '') + + def test_module___loader___module_repr_bad(self): + class Loader(TestLoader): + def module_repr(self, module): + raise Exception + self.module.__loader__ = Loader() + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, + ')>'.format('spam')) + + def test_module___spec__(self): + origin = 'in a hole, in the ground' + self.spec.origin = origin + self.module.__spec__ = self.spec + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, ''.format('spam', origin)) + + def test_module___spec___location(self): + location = 'in_a_galaxy_far_far_away.py' + self.spec.origin = location + self.spec._set_fileattr = True + self.module.__spec__ = self.spec + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, + ''.format('spam', location)) + + def test_module___spec___no_origin(self): + self.spec.loader = TestLoader() + self.module.__spec__ = self.spec + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, + ')>'.format('spam')) + + def test_module___spec___no_origin_no_loader(self): + self.spec.loader = None + self.module.__spec__ = self.spec + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, ''.format('spam')) + + def test_module_no_name(self): + del self.module.__name__ + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, ''.format('?')) + + def test_module_with_file(self): + filename = 'e/i/e/i/o/spam.py' + self.module.__file__ = filename + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, + ''.format('spam', filename)) + + def test_module_no_file(self): + self.module.__loader__ = TestLoader() + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, + ')>'.format('spam')) + + def test_module_no_file_no_loader(self): + modrepr = self.bootstrap._module_repr(self.module) + + self.assertEqual(modrepr, ''.format('spam')) + + +class Frozen_ModuleReprTests(ModuleReprTests, unittest.TestCase): + bootstrap = frozen_bootstrap + machinery = frozen_machinery + util = frozen_util + + +class Source_ModuleReprTests(ModuleReprTests, unittest.TestCase): + bootstrap = source_bootstrap + machinery = source_machinery + util = source_util + + +class FactoryTests: + + def setUp(self): + self.name = 'spam' + self.path = 'spam.py' + self.cached = self.util.cache_from_source(self.path) + self.loader = TestLoader() + self.fileloader = TestLoader(self.path) + self.pkgloader = TestLoader(self.path, True) + + # spec_from_loader() + + def test_spec_from_loader_default(self): + spec = self.util.spec_from_loader(self.name, self.loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_default_with_bad_is_package(self): + class Loader: + def is_package(self, name): + raise ImportError + loader = Loader() + spec = self.util.spec_from_loader(self.name, loader) + + 
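# [Illustrative aside, not part of the patch above: a small sketch of the public
#  importlib.util.spec_from_loader() factory that these FactoryTests cover,
#  assuming a Python with the PEP 451 additions; DummyLoader is made up.]
import importlib.util

class DummyLoader:
    def exec_module(self, module):
        module.eggs = 1

spec = importlib.util.spec_from_loader('spam', DummyLoader())
assert spec.name == 'spam'
assert spec.origin is None
assert spec.submodule_search_locations is None    # not a package by default

pkg_spec = importlib.util.spec_from_loader('spam', DummyLoader(),
                                           is_package=True)
assert pkg_spec.submodule_search_locations == []  # explicitly marked as a package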
self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_origin(self): + origin = 'somewhere over the rainbow' + spec = self.util.spec_from_loader(self.name, self.loader, + origin=origin) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, origin) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_is_package_false(self): + spec = self.util.spec_from_loader(self.name, self.loader, + is_package=False) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_is_package_true(self): + spec = self.util.spec_from_loader(self.name, self.loader, + is_package=True) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, []) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_origin_and_is_package(self): + origin = 'where the streets have no name' + spec = self.util.spec_from_loader(self.name, self.loader, + origin=origin, is_package=True) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertIs(spec.origin, origin) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, []) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_is_package_with_loader_false(self): + loader = TestLoader(is_package=False) + spec = self.util.spec_from_loader(self.name, loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_is_package_with_loader_true(self): + loader = TestLoader(is_package=True) + spec = self.util.spec_from_loader(self.name, loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertIs(spec.origin, None) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, []) + self.assertIs(spec.cached, None) + self.assertFalse(spec.has_location) + + def test_spec_from_loader_default_with_file_loader(self): + spec = self.util.spec_from_loader(self.name, self.fileloader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_loader_is_package_false_with_fileloader(self): + spec = self.util.spec_from_loader(self.name, self.fileloader, + 
is_package=False) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_loader_is_package_true_with_fileloader(self): + spec = self.util.spec_from_loader(self.name, self.fileloader, + is_package=True) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, ['']) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + # spec_from_file_location() + + def test_spec_from_file_location_default(self): + if self.machinery is source_machinery: + raise unittest.SkipTest('not sure why this is breaking...') + spec = self.util.spec_from_file_location(self.name, self.path) + + self.assertEqual(spec.name, self.name) + self.assertIsInstance(spec.loader, + self.machinery.SourceFileLoader) + self.assertEqual(spec.loader.name, self.name) + self.assertEqual(spec.loader.path, self.path) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_default_without_location(self): + spec = self.util.spec_from_file_location(self.name) + + self.assertIs(spec, None) + + def test_spec_from_file_location_default_bad_suffix(self): + spec = self.util.spec_from_file_location(self.name, 'spam.eggs') + + self.assertIs(spec, None) + + def test_spec_from_file_location_loader_no_location(self): + spec = self.util.spec_from_file_location(self.name, + loader=self.fileloader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_loader_no_location_no_get_filename(self): + spec = self.util.spec_from_file_location(self.name, + loader=self.loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.loader) + self.assertEqual(spec.origin, '') + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_loader_no_location_bad_get_filename(self): + class Loader: + def get_filename(self, name): + raise ImportError + loader = Loader() + spec = self.util.spec_from_file_location(self.name, loader=loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertEqual(spec.origin, '') + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertIs(spec.cached, None) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_none(self): + spec = self.util.spec_from_file_location(self.name, self.path, + loader=self.fileloader, + submodule_search_locations=None) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + 
self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_empty(self): + spec = self.util.spec_from_file_location(self.name, self.path, + loader=self.fileloader, + submodule_search_locations=[]) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, ['']) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_not_empty(self): + spec = self.util.spec_from_file_location(self.name, self.path, + loader=self.fileloader, + submodule_search_locations=['eggs']) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, ['eggs']) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_default(self): + spec = self.util.spec_from_file_location(self.name, self.path, + loader=self.pkgloader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.pkgloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertEqual(spec.submodule_search_locations, ['']) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_default_not_package(self): + class Loader: + def is_package(self, name): + return False + loader = Loader() + spec = self.util.spec_from_file_location(self.name, self.path, + loader=loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_default_no_is_package(self): + spec = self.util.spec_from_file_location(self.name, self.path, + loader=self.fileloader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, self.fileloader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + def test_spec_from_file_location_smsl_default_bad_is_package(self): + class Loader: + def is_package(self, name): + raise ImportError + loader = Loader() + spec = self.util.spec_from_file_location(self.name, self.path, + loader=loader) + + self.assertEqual(spec.name, self.name) + self.assertEqual(spec.loader, loader) + self.assertEqual(spec.origin, self.path) + self.assertIs(spec.loader_state, None) + self.assertIs(spec.submodule_search_locations, None) + self.assertEqual(spec.cached, self.cached) + self.assertTrue(spec.has_location) + + +class Frozen_FactoryTests(FactoryTests, unittest.TestCase): + util = frozen_util + machinery = frozen_machinery + + +class Source_FactoryTests(FactoryTests, unittest.TestCase): + util = source_util + machinery = source_machinery diff --git a/Lib/test/test_importlib/test_util.py b/Lib/test/test_importlib/test_util.py --- 
a/Lib/test/test_importlib/test_util.py +++ b/Lib/test/test_importlib/test_util.py @@ -34,71 +34,6 @@ DecodeSourceBytesTests, util=[frozen_util, source_util]) -class ModuleToLoadTests: - - module_name = 'ModuleManagerTest_module' - - def setUp(self): - support.unload(self.module_name) - self.addCleanup(support.unload, self.module_name) - - def test_new_module(self): - # Test a new module is created, inserted into sys.modules, has - # __initializing__ set to True after entering the context manager, - # and __initializing__ set to False after exiting. - with self.util.module_to_load(self.module_name) as module: - self.assertIn(self.module_name, sys.modules) - self.assertIs(sys.modules[self.module_name], module) - self.assertTrue(module.__initializing__) - self.assertFalse(module.__initializing__) - - def test_new_module_failed(self): - # Test the module is removed from sys.modules. - try: - with self.util.module_to_load(self.module_name) as module: - self.assertIn(self.module_name, sys.modules) - raise exception - except Exception: - self.assertNotIn(self.module_name, sys.modules) - else: - self.fail('importlib.util.module_to_load swallowed an exception') - - def test_reload(self): - # Test that the same module is in sys.modules. - created_module = types.ModuleType(self.module_name) - sys.modules[self.module_name] = created_module - with self.util.module_to_load(self.module_name) as module: - self.assertIs(module, created_module) - - def test_reload_failed(self): - # Test that the module was left in sys.modules. - created_module = types.ModuleType(self.module_name) - sys.modules[self.module_name] = created_module - try: - with self.util.module_to_load(self.module_name) as module: - raise Exception - except Exception: - self.assertIn(self.module_name, sys.modules) - else: - self.fail('importlib.util.module_to_load swallowed an exception') - - def test_reset_name(self): - # If reset_name is true then module.__name__ = name, else leave it be. 
- odd_name = 'not your typical name' - created_module = types.ModuleType(self.module_name) - created_module.__name__ = odd_name - sys.modules[self.module_name] = created_module - with self.util.module_to_load(self.module_name) as module: - self.assertEqual(module.__name__, self.module_name) - created_module.__name__ = odd_name - with self.util.module_to_load(self.module_name, reset_name=False) as module: - self.assertEqual(module.__name__, odd_name) - -Frozen_ModuleToLoadTests, Source_ModuleToLoadTests = test_util.test_both( - ModuleToLoadTests, - util=[frozen_util, source_util]) - - class ModuleForLoaderTests: """Tests for importlib.util.module_for_loader.""" diff --git a/Lib/test/test_module.py b/Lib/test/test_module.py --- a/Lib/test/test_module.py +++ b/Lib/test/test_module.py @@ -37,8 +37,10 @@ self.assertEqual(foo.__doc__, None) self.assertIs(foo.__loader__, None) self.assertIs(foo.__package__, None) + self.assertIs(foo.__spec__, None) self.assertEqual(foo.__dict__, {"__name__": "foo", "__doc__": None, - "__loader__": None, "__package__": None}) + "__loader__": None, "__package__": None, + "__spec__": None}) def test_ascii_docstring(self): # ASCII docstring @@ -47,7 +49,8 @@ self.assertEqual(foo.__doc__, "foodoc") self.assertEqual(foo.__dict__, {"__name__": "foo", "__doc__": "foodoc", - "__loader__": None, "__package__": None}) + "__loader__": None, "__package__": None, + "__spec__": None}) def test_unicode_docstring(self): # Unicode docstring @@ -56,7 +59,8 @@ self.assertEqual(foo.__doc__, "foodoc\u1234") self.assertEqual(foo.__dict__, {"__name__": "foo", "__doc__": "foodoc\u1234", - "__loader__": None, "__package__": None}) + "__loader__": None, "__package__": None, + "__spec__": None}) def test_reinit(self): # Reinitialization should not replace the __dict__ @@ -69,7 +73,7 @@ self.assertEqual(foo.bar, 42) self.assertEqual(foo.__dict__, {"__name__": "foo", "__doc__": "foodoc", "bar": 42, - "__loader__": None, "__package__": None}) + "__loader__": None, "__package__": None, "__spec__": None}) self.assertTrue(foo.__dict__ is d) def test_dont_clear_dict(self): diff --git a/Lib/test/test_namespace_pkgs.py b/Lib/test/test_namespace_pkgs.py --- a/Lib/test/test_namespace_pkgs.py +++ b/Lib/test/test_namespace_pkgs.py @@ -1,5 +1,4 @@ import contextlib -from importlib._bootstrap import NamespaceLoader import importlib.abc import importlib.machinery import os @@ -290,24 +289,5 @@ self.assertEqual(a_test.attr, 'in module') -class ABCTests(unittest.TestCase): - - def setUp(self): - self.loader = NamespaceLoader('foo', ['pkg'], - importlib.machinery.PathFinder) - - def test_is_package(self): - self.assertTrue(self.loader.is_package('foo')) - - def test_get_code(self): - self.assertTrue(isinstance(self.loader.get_code('foo'), types.CodeType)) - - def test_get_source(self): - self.assertEqual(self.loader.get_source('foo'), '') - - def test_abc_isinstance(self): - self.assertTrue(isinstance(self.loader, importlib.abc.InspectLoader)) - - if __name__ == "__main__": unittest.main() diff --git a/Lib/test/test_pkg.py b/Lib/test/test_pkg.py --- a/Lib/test/test_pkg.py +++ b/Lib/test/test_pkg.py @@ -199,14 +199,14 @@ import t5 self.assertEqual(fixdir(dir(t5)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', '__path__', 'foo', - 'string', 't5']) + '__name__', '__package__', '__path__', '__spec__', + 'foo', 'string', 't5']) self.assertEqual(fixdir(dir(t5.foo)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', 'string']) + '__name__', 
'__package__', '__spec__', 'string']) self.assertEqual(fixdir(dir(t5.string)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', 'spam']) + '__name__', '__package__', '__spec__', 'spam']) def test_6(self): hier = [ @@ -222,14 +222,15 @@ import t6 self.assertEqual(fixdir(dir(t6)), ['__all__', '__cached__', '__doc__', '__file__', - '__loader__', '__name__', '__package__', '__path__']) + '__loader__', '__name__', '__package__', '__path__', + '__spec__']) s = """ import t6 from t6 import * self.assertEqual(fixdir(dir(t6)), ['__all__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', - '__path__', 'eggs', 'ham', 'spam']) + '__path__', '__spec__', 'eggs', 'ham', 'spam']) self.assertEqual(dir(), ['eggs', 'ham', 'self', 'spam', 't6']) """ self.run_code(s) @@ -256,18 +257,19 @@ import t7 as tas self.assertEqual(fixdir(dir(tas)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', '__path__']) + '__name__', '__package__', '__path__', '__spec__']) self.assertFalse(t7) from t7 import sub as subpar self.assertEqual(fixdir(dir(subpar)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', '__path__']) + '__name__', '__package__', '__path__', '__spec__']) self.assertFalse(t7) self.assertFalse(sub) from t7.sub import subsub as subsubsub self.assertEqual(fixdir(dir(subsubsub)), ['__cached__', '__doc__', '__file__', '__loader__', - '__name__', '__package__', '__path__', 'spam']) + '__name__', '__package__', '__path__', '__spec__', + 'spam']) self.assertFalse(t7) self.assertFalse(sub) self.assertFalse(subsub) diff --git a/Lib/test/test_pkgutil.py b/Lib/test/test_pkgutil.py --- a/Lib/test/test_pkgutil.py +++ b/Lib/test/test_pkgutil.py @@ -208,9 +208,16 @@ importers = list(iter_importers(fullname)) expected_importer = get_importer(pathitem) for finder in importers: + loader = finder.find_module(fullname) + try: + loader = loader.loader + except AttributeError: + # For now we still allow raw loaders from + # find_module(). + pass self.assertIsInstance(finder, importlib.machinery.FileFinder) self.assertEqual(finder, expected_importer) - self.assertIsInstance(finder.find_module(fullname), + self.assertIsInstance(loader, importlib.machinery.SourceFileLoader) self.assertIsNone(finder.find_module(pkgname)) @@ -222,8 +229,11 @@ finally: shutil.rmtree(dirname) del sys.path[0] - del sys.modules['spam'] - del sys.modules['spam.eggs'] + try: + del sys.modules['spam'] + del sys.modules['spam.eggs'] + except KeyError: + pass def test_mixed_namespace(self): diff --git a/Lib/test/test_reprlib.py b/Lib/test/test_reprlib.py --- a/Lib/test/test_reprlib.py +++ b/Lib/test/test_reprlib.py @@ -253,6 +253,7 @@ print("cached_path_len =", cached_path_len) def test_module(self): + self.maxDiff = None self._check_path_limitations(self.pkgname) create_empty_file(os.path.join(self.subpkgname, self.pkgname + '.py')) importlib.invalidate_caches() diff --git a/Lib/test/test_runpy.py b/Lib/test/test_runpy.py --- a/Lib/test/test_runpy.py +++ b/Lib/test/test_runpy.py @@ -47,6 +47,7 @@ "__cached__": None, "__package__": None, "__doc__": None, +# "__spec__": None, # XXX Uncomment. 
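# [Illustrative aside, not part of the patch above: a quick check of the new
#  __spec__ attribute that the surrounding test changes are about, assuming a
#  Python with PEP 451 (3.4+).]
import os
import types

# Bare module objects start out with __spec__ set to None ...
m = types.ModuleType('foo')
assert m.__spec__ is None

# ... while imported modules carry a ModuleSpec mirroring the classic attributes.
assert os.__spec__.name == os.__name__
assert os.__spec__.parent == os.__package__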
} example_namespace = { "sys": sys, @@ -56,7 +57,7 @@ "run_name_in_sys_modules": False, "module_in_sys_modules": False, "nested": dict(implicit_namespace, - x=1, __name__="", __loader__=None), + x=1, __name__="", __loader__=None, __spec__=None), } example_namespace.update(implicit_namespace) @@ -243,6 +244,7 @@ "__name__": mod_name, "__file__": mod_fname, "__package__": mod_name.rpartition(".")[0], +# "__spec__": None, # XXX Needs to be set. }) if alter_sys: expected_ns.update({ @@ -279,6 +281,7 @@ "__name__": mod_name, "__file__": mod_fname, "__package__": pkg_name, +# "__spec__": None, # XXX Needs to be set. }) if alter_sys: expected_ns.update({ diff --git a/Objects/moduleobject.c b/Objects/moduleobject.c --- a/Objects/moduleobject.c +++ b/Objects/moduleobject.c @@ -45,6 +45,8 @@ return -1; if (PyDict_SetItemString(md_dict, "__loader__", Py_None) != 0) return -1; + if (PyDict_SetItemString(md_dict, "__spec__", Py_None) != 0) + return -1; if (PyUnicode_CheckExact(name)) { Py_INCREF(name); Py_XDECREF(mod->md_name); @@ -398,55 +400,10 @@ static PyObject * module_repr(PyModuleObject *m) { - PyObject *name, *filename, *repr, *loader = NULL; + PyThreadState *tstate = PyThreadState_GET(); + PyInterpreterState *interp = tstate->interp; - /* See if the module has an __loader__. If it does, give the loader the - * first shot at producing a repr for the module. - */ - if (m->md_dict != NULL) { - loader = PyDict_GetItemString(m->md_dict, "__loader__"); - } - if (loader != NULL && loader != Py_None) { - repr = PyObject_CallMethod(loader, "module_repr", "(O)", - (PyObject *)m, NULL); - if (repr == NULL) { - PyErr_Clear(); - } - else { - return repr; - } - } - /* __loader__.module_repr(m) did not provide us with a repr. Next, see if - * the module has an __file__. If it doesn't then use repr(__loader__) if - * it exists, otherwise, just use module.__name__. - */ - name = PyModule_GetNameObject((PyObject *)m); - if (name == NULL) { - PyErr_Clear(); - name = PyUnicode_FromStringAndSize("?", 1); - if (name == NULL) - return NULL; - } - filename = PyModule_GetFilenameObject((PyObject *)m); - if (filename == NULL) { - PyErr_Clear(); - /* There's no m.__file__, so if there was a __loader__, use that in - * the repr, otherwise, the only thing you can use is m.__name__ - */ - if (loader == NULL || loader == Py_None) { - repr = PyUnicode_FromFormat("", name); - } - else { - repr = PyUnicode_FromFormat("", name, loader); - } - } - /* Finally, use m.__file__ */ - else { - repr = PyUnicode_FromFormat("", name, filename); - Py_DECREF(filename); - } - Py_DECREF(name); - return repr; + return PyObject_CallMethod(interp->importlib, "_module_repr", "O", m); } static int diff --git a/Python/import.c b/Python/import.c --- a/Python/import.c +++ b/Python/import.c @@ -1232,7 +1232,8 @@ int level) { _Py_IDENTIFIER(__import__); - _Py_IDENTIFIER(__initializing__); + _Py_IDENTIFIER(__spec__); + _Py_IDENTIFIER(_initializing); _Py_IDENTIFIER(__package__); _Py_IDENTIFIER(__path__); _Py_IDENTIFIER(__name__); @@ -1426,16 +1427,21 @@ goto error_with_unlock; } else if (mod != NULL) { - PyObject *value; + PyObject *value = NULL; + PyObject *spec; int initializing = 0; Py_INCREF(mod); /* Optimization: only call _bootstrap._lock_unlock_module() if - __initializing__ is true. - NOTE: because of this, __initializing__ must be set *before* + __spec__._initializing is true. + NOTE: because of this, initializing must be set *before* stuffing the new module in sys.modules. 
*/ - value = _PyObject_GetAttrId(mod, &PyId___initializing__); + spec = _PyObject_GetAttrId(mod, &PyId___spec__); + if (spec != NULL) { + value = _PyObject_GetAttrId(spec, &PyId__initializing); + Py_DECREF(spec); + } if (value == NULL) PyErr_Clear(); else { diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:38:52 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 17:38:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319673=3A_Add_path?= =?utf-8?q?lib_to_the_stdlib_as_a_provisional_module_=28PEP_428=29=2E?= Message-ID: <3dR3Hc3rlBz7Lnj@mail.python.org> http://hg.python.org/cpython/rev/43377dcfb801 changeset: 87348:43377dcfb801 user: Antoine Pitrou date: Fri Nov 22 17:38:12 2013 +0100 summary: Issue #19673: Add pathlib to the stdlib as a provisional module (PEP 428). files: Doc/library/filesys.rst | 1 + Doc/library/glob.rst | 4 + Doc/library/os.path.rst | 5 + Doc/library/pathlib.rst | 874 +++++++++++++ Doc/whatsnew/3.4.rst | 18 + Lib/pathlib.py | 1287 ++++++++++++++++++++ Lib/test/test_pathlib.py | 1639 ++++++++++++++++++++++++++ Misc/NEWS | 2 + 8 files changed, 3830 insertions(+), 0 deletions(-) diff --git a/Doc/library/filesys.rst b/Doc/library/filesys.rst --- a/Doc/library/filesys.rst +++ b/Doc/library/filesys.rst @@ -12,6 +12,7 @@ .. toctree:: + pathlib.rst os.path.rst fileinput.rst stat.rst diff --git a/Doc/library/glob.rst b/Doc/library/glob.rst --- a/Doc/library/glob.rst +++ b/Doc/library/glob.rst @@ -25,6 +25,10 @@ For example, ``'[?]'`` matches the character ``'?'``. +.. seealso:: + The :mod:`pathlib` module offers high-level path objects. + + .. function:: glob(pathname) Return a possibly-empty list of path names that match *pathname*, which must be diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -22,6 +22,11 @@ explicitly when an application desires shell-like path expansion. (See also the :mod:`glob` module.) + +.. seealso:: + The :mod:`pathlib` module offers high-level path objects. + + .. note:: All of these functions accept either only bytes or only string objects as diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst new file mode 100644 --- /dev/null +++ b/Doc/library/pathlib.rst @@ -0,0 +1,874 @@ + +:mod:`pathlib` --- Object-oriented filesystem paths +=================================================== + +.. module:: pathlib + :synopsis: Object-oriented filesystem paths + +.. index:: single: path; operations + +.. versionadded:: 3.4 + + +This module offers classes representing filesystem paths with semantics +appropriate for different operating systems. Path classes are divided +between :ref:`pure paths `, which provide purely computational +operations without I/O, and :ref:`concrete paths `, which +inherit from pure paths but also provide I/O operations. + +The main point of entry is the :class:`Path` class, which will instantiate +a :ref:`concrete path ` for the current platform. + +.. note:: + This module module has been included in the standard library on a + :term:`provisional basis `. Backwards incompatible + changes (up to and including removal of the package) may occur if deemed + necessary by the core developers. + +.. seealso:: + :pep:`428`: The pathlib module -- object-oriented filesystem paths. + +.. 
seealso:: + For low-level path manipulation on strings, you can also use the + :mod:`os.path` module. + + +Basic use +--------- + +Importing the main class:: + + >>> from pathlib import Path + +Listing subdirectories:: + + >>> p = Path('.') + >>> [x for x in p.iterdir() if x.is_dir()] + [PosixPath('.hg'), PosixPath('docs'), PosixPath('dist'), + PosixPath('__pycache__'), PosixPath('build')] + +Listing Python source files in this directory tree:: + + >>> list(p.glob('**/*.py')) + [PosixPath('test_pathlib.py'), PosixPath('setup.py'), + PosixPath('pathlib.py'), PosixPath('docs/conf.py'), + PosixPath('build/lib/pathlib.py')] + +Navigating inside a directory tree:: + + >>> p = Path('/etc') + >>> q = p / 'init.d' / 'reboot' + >>> q + PosixPath('/etc/init.d/reboot') + >>> q.resolve() + PosixPath('/etc/rc.d/init.d/halt') + +Querying path properties:: + + >>> q.exists() + True + >>> q.is_dir() + False + +Opening a file:: + + >>> with q.open() as f: f.readline() + ... + '#!/bin/bash\n' + + +.. _pure-paths: + +Pure paths +---------- + +Pure path objects provide path-handling operations which don't actually +access a filesystem. There are three ways to access these classes, which +we also call *flavours*: + + +.. class:: PurePosixPath + + A subclass of :class:`PurePath`, this path flavour represents non-Windows + filesystem paths:: + + >>> PurePosixPath('/etc') + PurePosixPath('/etc') + +.. class:: PureWindowsPath + + A subclass of :class:`PurePath`, this path flavour represents Windows + filesystem paths:: + + >>> PureWindowsPath('c:/Program Files/') + PureWindowsPath('c:/Program Files') + +.. class:: PurePath + + A generic class that represents the system's path flavour (instantiating + it creates either a :class:`PurePosixPath` or a :class:`PureWindowsPath`):: + + >>> PurePath('setup.py') + PurePosixPath('setup.py') + + +Regardless of the system you're running on, you can instantiate all of +these classes, since they don't provide any operation that does system calls. + + +Constructing paths +^^^^^^^^^^^^^^^^^^ + +Path constructors accept an arbitrary number of positional arguments. 
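(An illustrative aside, not part of the quoted patch: pure paths never touch the filesystem, so the multi-segment constructors described here behave the same on every platform; the names below are made up.)

    >>> from pathlib import PureWindowsPath
    >>> p = PureWindowsPath('C:/Users', 'guido', 'hg.ini')
    >>> p.parts
    ('C:\\', 'Users', 'guido', 'hg.ini')
    >>> p.as_posix()
    'C:/Users/guido/hg.ini'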
+When called without any argument, a path object points to the current +directory:: + + >>> PurePath() + PurePosixPath('.') + +Any argument can be a string or bytes object representing an arbitrary number +of path segments, but it can also be another path object:: + + >>> PurePath('foo', 'some/path', 'bar') + PurePosixPath('foo/some/path/bar') + >>> PurePath(Path('foo'), Path('bar')) + PurePosixPath('foo/bar') + +When several absolute paths are given, the last is taken as an anchor +(mimicking :func:`os.path.join`'s behaviour):: + + >>> PurePath('/etc', '/usr', 'lib64') + PurePosixPath('/usr/lib64') + >>> PureWindowsPath('c:/Windows', 'd:bar') + PureWindowsPath('d:bar') + +However, in a Windows path, changing the local root doesn't discard the +previous drive setting:: + + >>> PureWindowsPath('c:/Windows', '/Program Files') + PureWindowsPath('c:/Program Files') + +Spurious slashes and single dots are collapsed, but double dots (``'..'``) +are not, since this would change the meaning of a path in the face of +symbolic links:: + + >>> PurePath('foo//bar') + PurePosixPath('foo/bar') + >>> PurePath('foo/./bar') + PurePosixPath('foo/bar') + >>> PurePath('foo/../bar') + PurePosixPath('foo/../bar') + +(a naïve approach would make ``PurePosixPath('foo/../bar')`` equivalent +to ``PurePosixPath('bar')``, which is wrong if ``foo`` is a symbolic link +to another directory) + + +General properties +^^^^^^^^^^^^^^^^^^ + +Paths are immutable and hashable. Paths of a same flavour are comparable +and orderable. These properties respect the flavour's case-folding +semantics:: + + >>> PurePosixPath('foo') == PurePosixPath('FOO') + False + >>> PureWindowsPath('foo') == PureWindowsPath('FOO') + True + >>> PureWindowsPath('FOO') in { PureWindowsPath('foo') } + True + >>> PureWindowsPath('C:') < PureWindowsPath('d:') + True + +Paths of a different flavour compare unequal and cannot be ordered:: + + >>> PureWindowsPath('foo') == PurePosixPath('foo') + False + >>> PureWindowsPath('foo') < PurePosixPath('foo') + Traceback (most recent call last): + File "<stdin>", line 1, in <module> + TypeError: unorderable types: PureWindowsPath() < PurePosixPath() + + +Operators +^^^^^^^^^ + +The slash operator helps create child paths, similarly to :func:`os.path.join`:: + + >>> p = PurePath('/etc') + >>> p + PurePosixPath('/etc') + >>> p / 'init.d' / 'apache2' + PurePosixPath('/etc/init.d/apache2') + >>> q = PurePath('bin') + >>> '/usr' / q + PurePosixPath('/usr/bin') + +The string representation of a path is the raw filesystem path itself +(in native form, e.g. with backslashes under Windows), which you can +pass to any function taking a file path as a string:: + + >>> p = PurePath('/etc') + >>> str(p) + '/etc' + >>> p = PureWindowsPath('c:/Program Files') + >>> str(p) + 'c:\\Program Files' + +Similarly, calling :class:`bytes` on a path gives the raw filesystem path as a +bytes object, as encoded by :func:`os.fsencode`:: + + >>> bytes(p) + b'/etc' + +.. note:: + Calling :class:`bytes` is only recommended under Unix. Under Windows, + the unicode form is the canonical representation of filesystem paths. + + +Accessing individual parts +^^^^^^^^^^^^^^^^^^^^^^^^^^ + +To access the individual "parts" (components) of a path, use the following +property: + +.. 
data:: PurePath.parts + + A tuple giving access to the path's various components:: + + >>> p = PurePath('/usr/bin/python3') + >>> p.parts + ('/', 'usr', 'bin', 'python3') + + >>> p = PureWindowsPath('c:/Program Files/PSF') + >>> p.parts + ('c:\\', 'Program Files', 'PSF') + + (note how the drive and local root are regrouped in a single part) + + +Methods and properties +^^^^^^^^^^^^^^^^^^^^^^ + +Pure paths provide the following methods and properties: + +.. data:: PurePath.drive + + A string representing the drive letter or name, if any:: + + >>> PureWindowsPath('c:/Program Files/').drive + 'c:' + >>> PureWindowsPath('/Program Files/').drive + '' + >>> PurePosixPath('/etc').drive + '' + + UNC shares are also considered drives:: + + >>> PureWindowsPath('//host/share/foo.txt').drive + '\\\\host\\share' + +.. data:: PurePath.root + + A string representing the (local or global) root, if any:: + + >>> PureWindowsPath('c:/Program Files/').root + '\\' + >>> PureWindowsPath('c:Program Files/').root + '' + >>> PurePosixPath('/etc').root + '/' + + UNC shares always have a root:: + + >>> PureWindowsPath('//host/share').root + '\\' + +.. data:: PurePath.anchor + + The concatenation of the drive and root:: + + >>> PureWindowsPath('c:/Program Files/').anchor + 'c:\\' + >>> PureWindowsPath('c:Program Files/').anchor + 'c:' + >>> PurePosixPath('/etc').anchor + '/' + >>> PureWindowsPath('//host/share').anchor + '\\\\host\\share\\' + + +.. data:: PurePath.parents + + An immutable sequence providing access to the logical ancestors of + the path:: + + >>> p = PureWindowsPath('c:/foo/bar/setup.py') + >>> p.parents[0] + PureWindowsPath('c:/foo/bar') + >>> p.parents[1] + PureWindowsPath('c:/foo') + >>> p.parents[2] + PureWindowsPath('c:/') + + +.. data:: PurePath.parent + + The logical parent of the path:: + + >>> p = PurePosixPath('/a/b/c/d') + >>> p.parent + PurePosixPath('/a/b/c') + + You cannot go past an anchor, or empty path:: + + >>> p = PurePosixPath('/') + >>> p.parent + PurePosixPath('/') + >>> p = PurePosixPath('.') + >>> p.parent + PurePosixPath('.') + + .. note:: + This is a purely lexical operation, hence the following behaviour:: + + >>> p = PurePosixPath('foo/..') + >>> p.parent + PurePosixPath('foo') + + If you want to walk an arbitrary filesystem path upwards, it is + recommended to first call :meth:`Path.resolve` so as to resolve + symlinks and eliminate `".."` components. + + +.. data:: PurePath.name + + A string representing the final path component, excluding the drive and + root, if any:: + + >>> PurePosixPath('my/library/setup.py').name + 'setup.py' + + UNC drive names are not considered:: + + >>> PureWindowsPath('//some/share/setup.py').name + 'setup.py' + >>> PureWindowsPath('//some/share').name + '' + + +.. data:: PurePath.suffix + + The file extension of the final component, if any:: + + >>> PurePosixPath('my/library/setup.py').suffix + '.py' + >>> PurePosixPath('my/library.tar.gz').suffix + '.gz' + >>> PurePosixPath('my/library').suffix + '' + + +.. data:: PurePath.suffixes + + A list of the path's file extensions:: + + >>> PurePosixPath('my/library.tar.gar').suffixes + ['.tar', '.gar'] + >>> PurePosixPath('my/library.tar.gz').suffixes + ['.tar', '.gz'] + >>> PurePosixPath('my/library').suffixes + [] + + +.. data:: PurePath.stem + + The final path component, without its suffix:: + + >>> PurePosixPath('my/library.tar.gz').stem + 'library.tar' + >>> PurePosixPath('my/library.tar').stem + 'library' + >>> PurePosixPath('my/library').stem + 'library' + + +.. 
method:: PurePath.as_posix() + + Return a string representation of the path with forward slashes (``/``):: + + >>> p = PureWindowsPath('c:\\windows') + >>> str(p) + 'c:\\windows' + >>> p.as_posix() + 'c:/windows' + + +.. method:: PurePath.as_uri() + + Represent the path as a ``file`` URI. :exc:`ValueError` is raised if + the path isn't absolute. + + >>> p = PurePosixPath('/etc/passwd') + >>> p.as_uri() + 'file:///etc/passwd' + >>> p = PureWindowsPath('c:/Windows') + >>> p.as_uri() + 'file:///c:/Windows' + + +.. method:: PurePath.is_absolute() + + Return whether the path is absolute or not. A path is considered absolute + if it has both a root and (if the flavour allows) a drive:: + + >>> PurePosixPath('/a/b').is_absolute() + True + >>> PurePosixPath('a/b').is_absolute() + False + + >>> PureWindowsPath('c:/a/b').is_absolute() + True + >>> PureWindowsPath('/a/b').is_absolute() + False + >>> PureWindowsPath('c:').is_absolute() + False + >>> PureWindowsPath('//some/share').is_absolute() + True + + +.. method:: PurePath.is_reserved() + + With :class:`PureWindowsPath`, return True if the path is considered + reserved under Windows, False otherwise. With :class:`PurePosixPath`, + False is always returned. + + >>> PureWindowsPath('nul').is_reserved() + True + >>> PurePosixPath('nul').is_reserved() + False + + File system calls on reserved paths can fail mysteriously or have + unintended effects. + + +.. method:: PurePath.joinpath(*other) + + Calling this method is equivalent to indexing the path with each of + the *other* arguments in turn:: + + >>> PurePosixPath('/etc').joinpath('passwd') + PurePosixPath('/etc/passwd') + >>> PurePosixPath('/etc').joinpath(PurePosixPath('passwd')) + PurePosixPath('/etc/passwd') + >>> PurePosixPath('/etc').joinpath('init.d', 'apache2') + PurePosixPath('/etc/init.d/apache2') + >>> PureWindowsPath('c:').joinpath('/Program Files') + PureWindowsPath('c:/Program Files') + + +.. method:: PurePath.match(pattern) + + Match this path against the provided glob-style pattern. Return True + if matching is successful, False otherwise. + + If *pattern* is relative, the path can be either relative or absolute, + and matching is done from the right:: + + >>> PurePath('a/b.py').match('*.py') + True + >>> PurePath('/a/b/c.py').match('b/*.py') + True + >>> PurePath('/a/b/c.py').match('a/*.py') + False + + If *pattern* is absolute, the path must be absolute, and the whole path + must match:: + + >>> PurePath('/a.py').match('/*.py') + True + >>> PurePath('a/b.py').match('/*.py') + False + + As with other methods, case-sensitivity is observed:: + + >>> PureWindowsPath('b.py').match('*.PY') + True + + +.. method:: PurePath.relative_to(*other) + + Compute a version of this path relative to the path represented by + *other*. If it's impossible, ValueError is raised:: + + >>> p = PurePosixPath('/etc/passwd') + >>> p.relative_to('/') + PurePosixPath('etc/passwd') + >>> p.relative_to('/etc') + PurePosixPath('passwd') + >>> p.relative_to('/usr') + Traceback (most recent call last): + File "", line 1, in + File "pathlib.py", line 694, in relative_to + .format(str(self), str(formatted))) + ValueError: '/etc/passwd' does not start with '/usr' + + +.. _concrete-paths: + + +Concrete paths +-------------- + +Concrete paths are subclasses of the pure path classes. In addition to +operations provided by the latter, they also provide methods to do system +calls on path objects. There are three ways to instantiate concrete paths: + + +.. 
class:: PosixPath + + A subclass of :class:`Path` and :class:`PurePosixPath`, this class + represents concrete non-Windows filesystem paths:: + + >>> PosixPath('/etc') + PosixPath('/etc') + +.. class:: WindowsPath + + A subclass of :class:`Path` and :class:`PureWindowsPath`, this class + represents concrete Windows filesystem paths:: + + >>> WindowsPath('c:/Program Files/') + WindowsPath('c:/Program Files') + +.. class:: Path + + A subclass of :class:`PurePath`, this class represents concrete paths of + the system's path flavour (instantiating it creates either a + :class:`PosixPath` or a :class:`WindowsPath`):: + + >>> Path('setup.py') + PosixPath('setup.py') + + +You can only instantiate the class flavour that corresponds to your system +(allowing system calls on non-compatible path flavours could lead to +bugs or failures in your application):: + + >>> import os + >>> os.name + 'posix' + >>> Path('setup.py') + PosixPath('setup.py') + >>> PosixPath('setup.py') + PosixPath('setup.py') + >>> WindowsPath('setup.py') + Traceback (most recent call last): + File "", line 1, in + File "pathlib.py", line 798, in __new__ + % (cls.__name__,)) + NotImplementedError: cannot instantiate 'WindowsPath' on your system + + +Methods +^^^^^^^ + +Concrete paths provide the following methods in addition to pure paths +methods. Many of these methods can raise an :exc:`OSError` if a system +call fails (for example because the path doesn't exist): + +.. classmethod:: Path.cwd() + + Return a new path object representing the current directory (as returned + by :func:`os.getcwd`):: + + >>> Path.cwd() + PosixPath('/home/antoine/pathlib') + + +.. method:: Path.stat() + + Return information about this path (similarly to :func:`os.stat`). + The result is looked up at each call to this method. + + >>> p = Path('setup.py') + >>> p.stat().st_size + 956 + >>> p.stat().st_mtime + 1327883547.852554 + + +.. method:: Path.chmod(mode) + + Change the file mode and permissions, like :func:`os.chmod`:: + + >>> p = Path('setup.py') + >>> p.stat().st_mode + 33277 + >>> p.chmod(0o444) + >>> p.stat().st_mode + 33060 + + +.. method:: Path.exists() + + Whether the path points to an existing file or directory:: + + >>> Path('.').exists() + True + >>> Path('setup.py').exists() + True + >>> Path('/etc').exists() + True + >>> Path('nonexistentfile').exists() + False + + .. note:: + If the path points to a symlink, :meth:`exists` returns whether the + symlink *points to* an existing file or directory. + + +.. method:: Path.glob(pattern) + + Glob the given *pattern* in the directory represented by this path, + yielding all matching files (of any kind):: + + >>> sorted(Path('.').glob('*.py')) + [PosixPath('pathlib.py'), PosixPath('setup.py'), PosixPath('test_pathlib.py')] + >>> sorted(Path('.').glob('*/*.py')) + [PosixPath('docs/conf.py')] + + The "``**``" pattern means "this directory and all subdirectories, + recursively". In other words, it enables recursive globbing:: + + >>> sorted(Path('.').glob('**/*.py')) + [PosixPath('build/lib/pathlib.py'), + PosixPath('docs/conf.py'), + PosixPath('pathlib.py'), + PosixPath('setup.py'), + PosixPath('test_pathlib.py')] + + .. note:: + Using the "``**``" pattern in large directory trees may consume + an inordinate amount of time. + + +.. method:: Path.group() + + Return the name of the group owning the file. :exc:`KeyError` is thrown + if the file's gid isn't found in the system database. + + +.. 
method:: Path.is_dir() + + Return True if the path points to a directory (or a symbolic link + pointing to a directory), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.is_file() + + Return True if the path points to a regular file (or a symbolic link + pointing to a regular file), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.is_symlink() + + Return True if the path points to a symbolic link, False otherwise. + + False is also returned if the path doesn't exist; other errors (such + as permission errors) are propagated. + + +.. method:: Path.is_socket() + + Return True if the path points to a Unix socket (or a symbolic link + pointing to a Unix socket), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.is_fifo() + + Return True if the path points to a FIFO (or a symbolic link + pointing to a FIFO), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.is_block_device() + + Return True if the path points to a block device (or a symbolic link + pointing to a block device), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.is_char_device() + + Return True if the path points to a character device (or a symbolic link + pointing to a character device), False if it points to another kind of file. + + False is also returned if the path doesn't exist or is a broken symlink; + other errors (such as permission errors) are propagated. + + +.. method:: Path.iterdir() + + When the path points to a directory, yield path objects of the directory + contents:: + + >>> p = Path('docs') + >>> for child in p.iterdir(): child + ... + PosixPath('docs/conf.py') + PosixPath('docs/_templates') + PosixPath('docs/make.bat') + PosixPath('docs/index.rst') + PosixPath('docs/_build') + PosixPath('docs/_static') + PosixPath('docs/Makefile') + +.. method:: Path.lchmod(mode) + + Like :meth:`Path.chmod` but, if the path points to a symbolic link, the + symbolic link's mode is changed rather than its target's. + + +.. method:: Path.lstat() + + Like :meth:`Path.stat` but, if the path points to a symbolic link, return + the symbolic link's information rather than its target's. + + +.. method:: Path.mkdir(mode=0o777, parents=False) + + Create a new directory at this given path. If *mode* is given, it is + combined with the process' ``umask`` value to determine the file mode + and access flags. If the path already exists, :exc:`OSError` is raised. + + If *parents* is True, any missing parents of this path are created + as needed. If *parents* is False (the default), a missing parent raises + :exc:`OSError`. + + +.. method:: Path.open(mode='r', buffering=-1, encoding=None, errors=None, newline=None) + + Open the file pointed to by the path, like the built-in :func:`open` + function does:: + + >>> p = Path('setup.py') + >>> with p.open() as f: + ... f.readline() + ... 
+ '#!/usr/bin/env python3\n' + + +.. method:: Path.owner() + + Return the name of the user owning the file. :exc:`KeyError` is thrown + if the file's uid isn't found in the system database. + + +.. method:: Path.rename(target) + + Rename this file or directory to the given *target*. *target* can be + either a string or another path object:: + + >>> p = Path('foo') + >>> p.open('w').write('some text') + 9 + >>> target = Path('bar') + >>> p.rename(target) + >>> target.open().read() + 'some text' + + +.. method:: Path.replace(target) + + Rename this file or directory to the given *target*. If *target* points + to an existing file or directory, it will be unconditionally replaced. + + +.. method:: Path.resolve() + + Make the path absolute, resolving any symlinks. A new path object is + returned:: + + >>> p = Path() + >>> p + PosixPath('.') + >>> p.resolve() + PosixPath('/home/antoine/pathlib') + + `".."` components are also eliminated (this is the only method to do so):: + + >>> p = Path('docs/../setup.py') + >>> p.resolve() + PosixPath('/home/antoine/pathlib/setup.py') + + If the path doesn't exist, :exc:`FileNotFoundError` is raised. If an + infinite loop is encountered along the resolution path, + :exc:`RuntimeError` is raised. + + +.. method:: Path.rglob(pattern) + + This is like calling :meth:`glob` with "``**``" added in front of the + given *pattern*: + + >>> sorted(Path().rglob("*.py")) + [PosixPath('build/lib/pathlib.py'), + PosixPath('docs/conf.py'), + PosixPath('pathlib.py'), + PosixPath('setup.py'), + PosixPath('test_pathlib.py')] + + +.. method:: Path.rmdir() + + Remove this directory. The directory must be empty. + + +.. method:: Path.symlink_to(target, target_is_directory=False) + + Make this path a symbolic link to *target*. Under Windows, + *target_is_directory* must be True (default False) if the link's target + is a directory. Under POSIX, *target_is_directory*'s value is ignored. + + >>> p = Path('mylink') + >>> p.symlink_to('setup.py') + >>> p.resolve() + PosixPath('/home/antoine/pathlib/setup.py') + >>> p.stat().st_size + 956 + >>> p.lstat().st_size + 8 + + .. note:: + The order of arguments (link, target) is the reverse + of :func:`os.symlink`'s. + + +.. method:: Path.touch(mode=0o777, exist_ok=True) + + Create a file at this given path. If *mode* is given, it is combined + with the process' ``umask`` value to determine the file mode and access + flags. If the file already exists, the function succeeds if *exist_ok* + is true (and its modification time is updated to the current time), + otherwise :exc:`OSError` is raised. + + +.. method:: Path.unlink() + + Remove this file or symbolic link. If the path points to a directory, + use :func:`Path.rmdir` instead. + diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -93,6 +93,7 @@ * :mod:`asyncio`: New provisonal API for asynchronous IO (:pep:`3156`). * :mod:`enum`: Support for enumeration types (:pep:`435`). * :mod:`ensurepip`: Bootstrapping the pip installer (:pep:`453`). +* :mod:`pathlib`: Object-oriented filesystem paths (:pep:`428`). * :mod:`selectors`: High-level and efficient I/O multiplexing, built upon the :mod:`select` module primitives. * :mod:`statistics`: A basic numerically stable statistics library (:pep:`450`). @@ -318,6 +319,23 @@ implemented by Ethan Furman. +pathlib +------- + +The new :mod:`pathlib` module offers classes representing filesystem paths +with semantics appropriate for different operating systems. 
Path classes are +divided between *pure paths*, which provide purely computational operations +without I/O, and *concrete paths*, which inherit from pure paths but also +provide I/O operations. + +For Python 3.4, this module is considered a :term:`provisional API`. + +.. seealso:: + + :pep:`428` - The pathlib module -- object-oriented filesystem paths + PEP written and implemented by Antoine Pitrou. + + selectors --------- diff --git a/Lib/pathlib.py b/Lib/pathlib.py new file mode 100644 --- /dev/null +++ b/Lib/pathlib.py @@ -0,0 +1,1287 @@ +import fnmatch +import functools +import io +import ntpath +import os +import posixpath +import re +import sys +import time +import weakref +try: + import threading +except ImportError: + import dummy_threading as threading + +from collections import Sequence, defaultdict +from contextlib import contextmanager +from errno import EINVAL, ENOENT +from itertools import chain, count +from operator import attrgetter +from stat import S_ISDIR, S_ISLNK, S_ISREG, S_ISSOCK, S_ISBLK, S_ISCHR, S_ISFIFO +from urllib.parse import quote as urlquote, quote_from_bytes as urlquote_from_bytes + + +supports_symlinks = True +try: + import nt +except ImportError: + nt = None +else: + if sys.getwindowsversion()[:2] >= (6, 0): + from nt import _getfinalpathname + else: + supports_symlinks = False + _getfinalpathname = None + + +__all__ = [ + "PurePath", "PurePosixPath", "PureWindowsPath", + "Path", "PosixPath", "WindowsPath", + ] + +# +# Internals +# + +def _is_wildcard_pattern(pat): + # Whether this pattern needs actual matching using fnmatch, or can + # be looked up directly as a file. + return "*" in pat or "?" in pat or "[" in pat + + +class _Flavour(object): + """A flavour implements a particular (platform-specific) set of path + semantics.""" + + def __init__(self): + self.join = self.sep.join + + def parse_parts(self, parts): + parsed = [] + sep = self.sep + altsep = self.altsep + drv = root = '' + it = reversed(parts) + for part in it: + if not part: + continue + if altsep: + part = part.replace(altsep, sep) + drv, root, rel = self.splitroot(part) + if sep in rel: + for x in reversed(rel.split(sep)): + if x and x != '.': + parsed.append(sys.intern(x)) + else: + if rel and rel != '.': + parsed.append(sys.intern(rel)) + if drv or root: + if not drv: + # If no drive is present, try to find one in the previous + # parts. This makes the result of parsing e.g. + # ("C:", "/", "a") reasonably intuitive. + for part in it: + drv = self.splitroot(part)[0] + if drv: + break + break + if drv or root: + parsed.append(drv + root) + parsed.reverse() + return drv, root, parsed + + def join_parsed_parts(self, drv, root, parts, drv2, root2, parts2): + """ + Join the two paths represented by the respective + (drive, root, parts) tuples. Return a new (drive, root, parts) tuple. + """ + if root2: + parts = parts2 + root = root2 + else: + parts = parts + parts2 + # XXX raise error if drv and drv2 are different? 
+ return drv2 or drv, root, parts + + +class _WindowsFlavour(_Flavour): + # Reference for Windows paths can be found at + # http://msdn.microsoft.com/en-us/library/aa365247%28v=vs.85%29.aspx + + sep = '\\' + altsep = '/' + has_drv = True + pathmod = ntpath + + is_supported = (nt is not None) + + drive_letters = ( + set(chr(x) for x in range(ord('a'), ord('z') + 1)) | + set(chr(x) for x in range(ord('A'), ord('Z') + 1)) + ) + ext_namespace_prefix = '\\\\?\\' + + reserved_names = ( + {'CON', 'PRN', 'AUX', 'NUL'} | + {'COM%d' % i for i in range(1, 10)} | + {'LPT%d' % i for i in range(1, 10)} + ) + + # Interesting findings about extended paths: + # - '\\?\c:\a', '//?/c:\a' and '//?/c:/a' are all supported + # but '\\?\c:/a' is not + # - extended paths are always absolute; "relative" extended paths will + # fail. + + def splitroot(self, part, sep=sep): + first = part[0:1] + second = part[1:2] + if (second == sep and first == sep): + # XXX extended paths should also disable the collapsing of "." + # components (according to MSDN docs). + prefix, part = self._split_extended_path(part) + first = part[0:1] + second = part[1:2] + else: + prefix = '' + third = part[2:3] + if (second == sep and first == sep and third != sep): + # is a UNC path: + # vvvvvvvvvvvvvvvvvvvvv root + # \\machine\mountpoint\directory\etc\... + # directory ^^^^^^^^^^^^^^ + index = part.find(sep, 2) + if index != -1: + index2 = part.find(sep, index + 1) + # a UNC path can't have two slashes in a row + # (after the initial two) + if index2 != index + 1: + if index2 == -1: + index2 = len(part) + if prefix: + return prefix + part[1:index2], sep, part[index2+1:] + else: + return part[:index2], sep, part[index2+1:] + drv = root = '' + if second == ':' and first in self.drive_letters: + drv = part[:2] + part = part[2:] + first = third + if first == sep: + root = first + part = part.lstrip(sep) + return prefix + drv, root, part + + def casefold(self, s): + return s.lower() + + def casefold_parts(self, parts): + return [p.lower() for p in parts] + + def resolve(self, path): + s = str(path) + if not s: + return os.getcwd() + if _getfinalpathname is not None: + return self._ext_to_normal(_getfinalpathname(s)) + # Means fallback on absolute + return None + + def _split_extended_path(self, s, ext_prefix=ext_namespace_prefix): + prefix = '' + if s.startswith(ext_prefix): + prefix = s[:4] + s = s[4:] + if s.startswith('UNC\\'): + prefix += s[:3] + s = '\\' + s[3:] + return prefix, s + + def _ext_to_normal(self, s): + # Turn back an extended path into a normal DOS-like path + return self._split_extended_path(s)[1] + + def is_reserved(self, parts): + # NOTE: the rules for reserved names seem somewhat complicated + # (e.g. r"..\NUL" is reserved but not r"foo\NUL"). + # We err on the side of caution and return True for paths which are + # not considered reserved by Windows. + if not parts: + return False + if parts[0].startswith('\\\\'): + # UNC paths are never reserved + return False + return parts[-1].partition('.')[0].upper() in self.reserved_names + + def make_uri(self, path): + # Under Windows, file URIs use the UTF-8 encoding. 
+ drive = path.drive + if len(drive) == 2 and drive[1] == ':': + # It's a path on a local drive => 'file:///c:/a/b' + rest = path.as_posix()[2:].lstrip('/') + return 'file:///%s/%s' % ( + drive, urlquote_from_bytes(rest.encode('utf-8'))) + else: + # It's a path on a network drive => 'file://host/share/a/b' + return 'file:' + urlquote_from_bytes(path.as_posix().encode('utf-8')) + + +class _PosixFlavour(_Flavour): + sep = '/' + altsep = '' + has_drv = False + pathmod = posixpath + + is_supported = (os.name != 'nt') + + def splitroot(self, part, sep=sep): + if part and part[0] == sep: + stripped_part = part.lstrip(sep) + # According to POSIX path resolution: + # http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap04.html#tag_04_11 + # "A pathname that begins with two successive slashes may be + # interpreted in an implementation-defined manner, although more + # than two leading slashes shall be treated as a single slash". + if len(part) - len(stripped_part) == 2: + return '', sep * 2, stripped_part + else: + return '', sep, stripped_part + else: + return '', '', part + + def casefold(self, s): + return s + + def casefold_parts(self, parts): + return parts + + def resolve(self, path): + sep = self.sep + def split(p): + return [x for x in p.split(sep) if x] + def absparts(p): + # Our own abspath(), since the posixpath one makes + # the mistake of "normalizing" the path without resolving the + # symlinks first. + if not p.startswith(sep): + return split(os.getcwd()) + split(p) + else: + return split(p) + parts = absparts(str(path))[::-1] + accessor = path._accessor + resolved = cur = "" + symlinks = {} + while parts: + part = parts.pop() + cur = resolved + sep + part + if cur in symlinks and symlinks[cur] <= len(parts): + # We've already seen the symlink and there's not less + # work to do than the last time. + raise RuntimeError("Symlink loop from %r" % cur) + try: + target = accessor.readlink(cur) + except OSError as e: + if e.errno != EINVAL: + raise + # Not a symlink + resolved = cur + else: + # Take note of remaining work from this symlink + symlinks[cur] = len(parts) + if target.startswith(sep): + # Symlink points to absolute path + resolved = "" + parts.extend(split(target)[::-1]) + return resolved or sep + + def is_reserved(self, parts): + return False + + def make_uri(self, path): + # We represent the path using the local filesystem encoding, + # for portability to other applications. 
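# Similarly for the POSIX make_uri() below: the path is encoded with the local
# filesystem encoding and percent-quoted, so e.g. a space becomes %20 (the path
# shown is a made-up example).
from pathlib import PurePosixPath

assert PurePosixPath('/tmp/some file.txt').as_uri() == 'file:///tmp/some%20file.txt'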
+ bpath = bytes(path) + return 'file://' + urlquote_from_bytes(bpath) + + +_windows_flavour = _WindowsFlavour() +_posix_flavour = _PosixFlavour() + + +class _Accessor: + """An accessor implements a particular (system-specific or not) way of + accessing paths on the filesystem.""" + + +class _NormalAccessor(_Accessor): + + def _wrap_strfunc(strfunc): + @functools.wraps(strfunc) + def wrapped(pathobj, *args): + return strfunc(str(pathobj), *args) + return staticmethod(wrapped) + + def _wrap_binary_strfunc(strfunc): + @functools.wraps(strfunc) + def wrapped(pathobjA, pathobjB, *args): + return strfunc(str(pathobjA), str(pathobjB), *args) + return staticmethod(wrapped) + + stat = _wrap_strfunc(os.stat) + + lstat = _wrap_strfunc(os.lstat) + + open = _wrap_strfunc(os.open) + + listdir = _wrap_strfunc(os.listdir) + + chmod = _wrap_strfunc(os.chmod) + + if hasattr(os, "lchmod"): + lchmod = _wrap_strfunc(os.lchmod) + else: + def lchmod(self, pathobj, mode): + raise NotImplementedError("lchmod() not available on this system") + + mkdir = _wrap_strfunc(os.mkdir) + + unlink = _wrap_strfunc(os.unlink) + + rmdir = _wrap_strfunc(os.rmdir) + + rename = _wrap_binary_strfunc(os.rename) + + replace = _wrap_binary_strfunc(os.replace) + + if nt: + if supports_symlinks: + symlink = _wrap_binary_strfunc(os.symlink) + else: + def symlink(a, b, target_is_directory): + raise NotImplementedError("symlink() not available on this system") + else: + # Under POSIX, os.symlink() takes two args + @staticmethod + def symlink(a, b, target_is_directory): + return os.symlink(str(a), str(b)) + + utime = _wrap_strfunc(os.utime) + + # Helper for resolve() + def readlink(self, path): + return os.readlink(path) + + +_normal_accessor = _NormalAccessor() + + +# +# Globbing helpers +# + + at contextmanager +def _cached(func): + try: + func.__cached__ + yield func + except AttributeError: + cache = {} + def wrapper(*args): + try: + return cache[args] + except KeyError: + value = cache[args] = func(*args) + return value + wrapper.__cached__ = True + try: + yield wrapper + finally: + cache.clear() + +def _make_selector(pattern_parts): + pat = pattern_parts[0] + child_parts = pattern_parts[1:] + if pat == '**': + cls = _RecursiveWildcardSelector + elif '**' in pat: + raise ValueError("Invalid pattern: '**' can only be an entire path component") + elif _is_wildcard_pattern(pat): + cls = _WildcardSelector + else: + cls = _PreciseSelector + return cls(pat, child_parts) + +if hasattr(functools, "lru_cache"): + _make_selector = functools.lru_cache()(_make_selector) + + +class _Selector: + """A selector matches a specific glob pattern part against the children + of a given path.""" + + def __init__(self, child_parts): + self.child_parts = child_parts + if child_parts: + self.successor = _make_selector(child_parts) + else: + self.successor = _TerminatingSelector() + + def select_from(self, parent_path): + """Iterate over all child paths of `parent_path` matched by this + selector. 
This can contain parent_path itself.""" + path_cls = type(parent_path) + is_dir = path_cls.is_dir + exists = path_cls.exists + listdir = parent_path._accessor.listdir + return self._select_from(parent_path, is_dir, exists, listdir) + + +class _TerminatingSelector: + + def _select_from(self, parent_path, is_dir, exists, listdir): + yield parent_path + + +class _PreciseSelector(_Selector): + + def __init__(self, name, child_parts): + self.name = name + _Selector.__init__(self, child_parts) + + def _select_from(self, parent_path, is_dir, exists, listdir): + if not is_dir(parent_path): + return + path = parent_path._make_child_relpath(self.name) + if exists(path): + for p in self.successor._select_from(path, is_dir, exists, listdir): + yield p + + +class _WildcardSelector(_Selector): + + def __init__(self, pat, child_parts): + self.pat = re.compile(fnmatch.translate(pat)) + _Selector.__init__(self, child_parts) + + def _select_from(self, parent_path, is_dir, exists, listdir): + if not is_dir(parent_path): + return + cf = parent_path._flavour.casefold + for name in listdir(parent_path): + casefolded = cf(name) + if self.pat.match(casefolded): + path = parent_path._make_child_relpath(name) + for p in self.successor._select_from(path, is_dir, exists, listdir): + yield p + + +class _RecursiveWildcardSelector(_Selector): + + def __init__(self, pat, child_parts): + _Selector.__init__(self, child_parts) + + def _iterate_directories(self, parent_path, is_dir, listdir): + yield parent_path + for name in listdir(parent_path): + path = parent_path._make_child_relpath(name) + if is_dir(path): + for p in self._iterate_directories(path, is_dir, listdir): + yield p + + def _select_from(self, parent_path, is_dir, exists, listdir): + if not is_dir(parent_path): + return + with _cached(listdir) as listdir: + yielded = set() + try: + successor_select = self.successor._select_from + for starting_point in self._iterate_directories(parent_path, is_dir, listdir): + for p in successor_select(starting_point, is_dir, exists, listdir): + if p not in yielded: + yield p + yielded.add(p) + finally: + yielded.clear() + + +# +# Public API +# + +class _PathParents(Sequence): + """This object provides sequence-like access to the logical ancestors + of a path. Don't try to construct it yourself.""" + __slots__ = ('_pathcls', '_drv', '_root', '_parts') + + def __init__(self, path): + # We don't store the instance to avoid reference cycles + self._pathcls = type(path) + self._drv = path._drv + self._root = path._root + self._parts = path._parts + + def __len__(self): + if self._drv or self._root: + return len(self._parts) - 1 + else: + return len(self._parts) + + def __getitem__(self, idx): + if idx < 0 or idx >= len(self): + raise IndexError(idx) + return self._pathcls._from_parsed_parts(self._drv, self._root, + self._parts[:-idx - 1]) + + def __repr__(self): + return "<{}.parents>".format(self._pathcls.__name__) + + +class PurePath(object): + """PurePath represents a filesystem path and offers operations which + don't imply any actual filesystem I/O. Depending on your system, + instantiating a PurePath will return either a PurePosixPath or a + PureWindowsPath object. You can also instantiate either of these classes + directly, regardless of your system. + """ + __slots__ = ( + '_drv', '_root', '_parts', + '_str', '_hash', '_pparts', '_cached_cparts', + ) + + def __new__(cls, *args): + """Construct a PurePath from one or several strings and or existing + PurePath objects. 
The strings and path objects are combined so as + to yield a canonicalized path, which is incorporated into the + new PurePath object. + """ + if cls is PurePath: + cls = PureWindowsPath if os.name == 'nt' else PurePosixPath + return cls._from_parts(args) + + def __reduce__(self): + # Using the parts tuple helps share interned path parts + # when pickling related paths. + return (self.__class__, tuple(self._parts)) + + @classmethod + def _parse_args(cls, args): + # This is useful when you don't want to create an instance, just + # canonicalize some constructor arguments. + parts = [] + for a in args: + if isinstance(a, PurePath): + parts += a._parts + elif isinstance(a, str): + # Assuming a str + parts.append(a) + else: + raise TypeError( + "argument should be a path or str object, not %r" + % type(a)) + return cls._flavour.parse_parts(parts) + + @classmethod + def _from_parts(cls, args, init=True): + # We need to call _parse_args on the instance, so as to get the + # right flavour. + self = object.__new__(cls) + drv, root, parts = self._parse_args(args) + self._drv = drv + self._root = root + self._parts = parts + if init: + self._init() + return self + + @classmethod + def _from_parsed_parts(cls, drv, root, parts, init=True): + self = object.__new__(cls) + self._drv = drv + self._root = root + self._parts = parts + if init: + self._init() + return self + + @classmethod + def _format_parsed_parts(cls, drv, root, parts): + if drv or root: + return drv + root + cls._flavour.join(parts[1:]) + else: + return cls._flavour.join(parts) + + def _init(self): + # Overriden in concrete Path + pass + + def _make_child(self, args): + drv, root, parts = self._parse_args(args) + drv, root, parts = self._flavour.join_parsed_parts( + self._drv, self._root, self._parts, drv, root, parts) + return self._from_parsed_parts(drv, root, parts) + + def __str__(self): + """Return the string representation of the path, suitable for + passing to system calls.""" + try: + return self._str + except AttributeError: + self._str = self._format_parsed_parts(self._drv, self._root, + self._parts) or '.' + return self._str + + def as_posix(self): + """Return the string representation of the path with forward (/) + slashes.""" + f = self._flavour + return str(self).replace(f.sep, '/') + + def __bytes__(self): + """Return the bytes representation of the path. 
This is only + recommended to use under Unix.""" + return os.fsencode(str(self)) + + def __repr__(self): + return "{}({!r})".format(self.__class__.__name__, self.as_posix()) + + def as_uri(self): + """Return the path as a 'file' URI.""" + if not self.is_absolute(): + raise ValueError("relative path can't be expressed as a file URI") + return self._flavour.make_uri(self) + + @property + def _cparts(self): + # Cached casefolded parts, for hashing and comparison + try: + return self._cached_cparts + except AttributeError: + self._cached_cparts = self._flavour.casefold_parts(self._parts) + return self._cached_cparts + + def __eq__(self, other): + if not isinstance(other, PurePath): + return NotImplemented + return self._cparts == other._cparts and self._flavour is other._flavour + + def __ne__(self, other): + return not self == other + + def __hash__(self): + try: + return self._hash + except AttributeError: + self._hash = hash(tuple(self._cparts)) + return self._hash + + def __lt__(self, other): + if not isinstance(other, PurePath) or self._flavour is not other._flavour: + return NotImplemented + return self._cparts < other._cparts + + def __le__(self, other): + if not isinstance(other, PurePath) or self._flavour is not other._flavour: + return NotImplemented + return self._cparts <= other._cparts + + def __gt__(self, other): + if not isinstance(other, PurePath) or self._flavour is not other._flavour: + return NotImplemented + return self._cparts > other._cparts + + def __ge__(self, other): + if not isinstance(other, PurePath) or self._flavour is not other._flavour: + return NotImplemented + return self._cparts >= other._cparts + + drive = property(attrgetter('_drv'), + doc="""The drive prefix (letter or UNC path), if any.""") + + root = property(attrgetter('_root'), + doc="""The root of the path, if any.""") + + @property + def anchor(self): + """The concatenation of the drive and root, or ''.""" + anchor = self._drv + self._root + return anchor + + @property + def name(self): + """The final path component, if any.""" + parts = self._parts + if len(parts) == (1 if (self._drv or self._root) else 0): + return '' + return parts[-1] + + @property + def suffix(self): + """The final component's last suffix, if any.""" + name = self.name + i = name.rfind('.') + if 0 < i < len(name) - 1: + return name[i:] + else: + return '' + + @property + def suffixes(self): + """A list of the final component's suffixes, if any.""" + name = self.name + if name.endswith('.'): + return [] + name = name.lstrip('.') + return ['.' + suffix for suffix in name.split('.')[1:]] + + @property + def stem(self): + """The final path component, minus its last suffix.""" + name = self.name + i = name.rfind('.') + if 0 < i < len(name) - 1: + return name[:i] + else: + return name + + def with_name(self, name): + """Return a new path with the file name changed.""" + if not self.name: + raise ValueError("%r has an empty name" % (self,)) + return self._from_parsed_parts(self._drv, self._root, + self._parts[:-1] + [name]) + + def with_suffix(self, suffix): + """Return a new path with the file suffix changed (or added, if none).""" + # XXX if suffix is None, should the current suffix be removed? 
+ name = self.name + if not name: + raise ValueError("%r has an empty name" % (self,)) + old_suffix = self.suffix + if not old_suffix: + name = name + suffix + else: + name = name[:-len(old_suffix)] + suffix + return self._from_parsed_parts(self._drv, self._root, + self._parts[:-1] + [name]) + + def relative_to(self, *other): + """Return the relative path to another path identified by the passed + arguments. If the operation is not possible (because this is not + a subpath of the other path), raise ValueError. + """ + # For the purpose of this method, drive and root are considered + # separate parts, i.e.: + # Path('c:/').relative_to('c:') gives Path('/') + # Path('c:/').relative_to('/') raise ValueError + if not other: + raise TypeError("need at least one argument") + parts = self._parts + drv = self._drv + root = self._root + if drv or root: + if root: + abs_parts = [drv, root] + parts[1:] + else: + abs_parts = [drv] + parts[1:] + else: + abs_parts = parts + to_drv, to_root, to_parts = self._parse_args(other) + if to_drv or to_root: + if to_root: + to_abs_parts = [to_drv, to_root] + to_parts[1:] + else: + to_abs_parts = [to_drv] + to_parts[1:] + else: + to_abs_parts = to_parts + n = len(to_abs_parts) + if n == 0 and (drv or root) or abs_parts[:n] != to_abs_parts: + formatted = self._format_parsed_parts(to_drv, to_root, to_parts) + raise ValueError("{!r} does not start with {!r}" + .format(str(self), str(formatted))) + return self._from_parsed_parts('', '', abs_parts[n:]) + + @property + def parts(self): + """An object providing sequence-like access to the + components in the filesystem path.""" + # We cache the tuple to avoid building a new one each time .parts + # is accessed. XXX is this necessary? + try: + return self._pparts + except AttributeError: + self._pparts = tuple(self._parts) + return self._pparts + + def joinpath(self, *args): + """Combine this path with one or several arguments, and return a + new path representing either a subpath (if all arguments are relative + paths) or a totally different path (if one of the arguments is + anchored). + """ + return self._make_child(args) + + def __truediv__(self, key): + return self._make_child((key,)) + + def __rtruediv__(self, key): + return self._from_parts([key] + self._parts) + + @property + def parent(self): + """The logical parent of the path.""" + drv = self._drv + root = self._root + parts = self._parts + if len(parts) == 1 and (drv or root): + return self + return self._from_parsed_parts(drv, root, parts[:-1]) + + @property + def parents(self): + """A sequence of this path's logical parents.""" + return _PathParents(self) + + def is_absolute(self): + """True if the path is absolute (has both a root and, if applicable, + a drive).""" + if not self._root: + return False + return not self._flavour.has_drv or bool(self._drv) + + def is_reserved(self): + """Return True if the path contains one of the special names reserved + by the system, if any.""" + return self._flavour.is_reserved(self._parts) + + def match(self, path_pattern): + """ + Return True if this path matches the given pattern. 
+ """ + cf = self._flavour.casefold + path_pattern = cf(path_pattern) + drv, root, pat_parts = self._flavour.parse_parts((path_pattern,)) + if not pat_parts: + raise ValueError("empty pattern") + if drv and drv != cf(self._drv): + return False + if root and root != cf(self._root): + return False + parts = self._cparts + if drv or root: + if len(pat_parts) != len(parts): + return False + pat_parts = pat_parts[1:] + elif len(pat_parts) > len(parts): + return False + for part, pat in zip(reversed(parts), reversed(pat_parts)): + if not fnmatch.fnmatchcase(part, pat): + return False + return True + + +class PurePosixPath(PurePath): + _flavour = _posix_flavour + __slots__ = () + + +class PureWindowsPath(PurePath): + _flavour = _windows_flavour + __slots__ = () + + +# Filesystem-accessing classes + + +class Path(PurePath): + __slots__ = ( + '_accessor', + '_closed', + ) + + def __new__(cls, *args, **kwargs): + if cls is Path: + cls = WindowsPath if os.name == 'nt' else PosixPath + self = cls._from_parts(args, init=False) + if not self._flavour.is_supported: + raise NotImplementedError("cannot instantiate %r on your system" + % (cls.__name__,)) + self._init() + return self + + def _init(self, + # Private non-constructor arguments + template=None, + ): + self._closed = False + if template is not None: + self._accessor = template._accessor + else: + self._accessor = _normal_accessor + + def _make_child_relpath(self, part): + # This is an optimization used for dir walking. `part` must be + # a single part relative to this path. + parts = self._parts + [part] + return self._from_parsed_parts(self._drv, self._root, parts) + + def __enter__(self): + if self._closed: + self._raise_closed() + return self + + def __exit__(self, t, v, tb): + self._closed = True + + def _raise_closed(self): + raise ValueError("I/O operation on closed path") + + def _opener(self, name, flags, mode=0o666): + # A stub for the opener argument to built-in open() + return self._accessor.open(self, flags, mode) + + # Public API + + @classmethod + def cwd(cls): + """Return a new path pointing to the current working directory + (as returned by os.getcwd()). + """ + return cls(os.getcwd()) + + def iterdir(self): + """Iterate over the files in this directory. Does not yield any + result for the special paths '.' and '..'. + """ + if self._closed: + self._raise_closed() + for name in self._accessor.listdir(self): + if name in {'.', '..'}: + # Yielding a path object for these makes little sense + continue + yield self._make_child_relpath(name) + if self._closed: + self._raise_closed() + + def glob(self, pattern): + """Iterate over this subtree and yield all existing files (of any + kind, including directories) matching the given pattern. + """ + pattern = self._flavour.casefold(pattern) + drv, root, pattern_parts = self._flavour.parse_parts((pattern,)) + if drv or root: + raise NotImplementedError("Non-relative patterns are unsupported") + selector = _make_selector(tuple(pattern_parts)) + for p in selector.select_from(self): + yield p + + def rglob(self, pattern): + """Recursively yield all existing files (of any kind, including + directories) matching the given pattern, anywhere in this subtree. 
+ """ + pattern = self._flavour.casefold(pattern) + drv, root, pattern_parts = self._flavour.parse_parts((pattern,)) + if drv or root: + raise NotImplementedError("Non-relative patterns are unsupported") + selector = _make_selector(("**",) + tuple(pattern_parts)) + for p in selector.select_from(self): + yield p + + def absolute(self): + """Return an absolute version of this path. This function works + even if the path doesn't point to anything. + + No normalization is done, i.e. all '.' and '..' will be kept along. + Use resolve() to get the canonical path to a file. + """ + # XXX untested yet! + if self._closed: + self._raise_closed() + if self.is_absolute(): + return self + # FIXME this must defer to the specific flavour (and, under Windows, + # use nt._getfullpathname()) + obj = self._from_parts([os.getcwd()] + self._parts, init=False) + obj._init(template=self) + return obj + + def resolve(self): + """ + Make the path absolute, resolving all symlinks on the way and also + normalizing it (for example turning slashes into backslashes under + Windows). + """ + if self._closed: + self._raise_closed() + s = self._flavour.resolve(self) + if s is None: + # No symlink resolution => for consistency, raise an error if + # the path doesn't exist or is forbidden + self.stat() + s = str(self.absolute()) + # Now we have no symlinks in the path, it's safe to normalize it. + normed = self._flavour.pathmod.normpath(s) + obj = self._from_parts((normed,), init=False) + obj._init(template=self) + return obj + + def stat(self): + """ + Return the result of the stat() system call on this path, like + os.stat() does. + """ + return self._accessor.stat(self) + + def owner(self): + """ + Return the login name of the file owner. + """ + import pwd + return pwd.getpwuid(self.stat().st_uid).pw_name + + def group(self): + """ + Return the group name of the file gid. + """ + import grp + return grp.getgrgid(self.stat().st_gid).gr_name + + def _raw_open(self, flags, mode=0o777): + """ + Open the file pointed by this path and return a file descriptor, + as os.open() does. + """ + if self._closed: + self._raise_closed() + return self._accessor.open(self, flags, mode) + + def open(self, mode='r', buffering=-1, encoding=None, + errors=None, newline=None): + """ + Open the file pointed by this path and return a file object, as + the built-in open() function does. + """ + if self._closed: + self._raise_closed() + return io.open(str(self), mode, buffering, encoding, errors, newline, + opener=self._opener) + + def touch(self, mode=0o666, exist_ok=True): + """ + Create this file with the given access mode, if it doesn't exist. + """ + if self._closed: + self._raise_closed() + if exist_ok: + # First try to bump modification time + # Implementation note: GNU touch uses the UTIME_NOW option of + # the utimensat() / futimens() functions. + t = time.time() + try: + self._accessor.utime(self, (t, t)) + except OSError: + # Avoid exception chaining + pass + else: + return + flags = os.O_CREAT | os.O_WRONLY + if not exist_ok: + flags |= os.O_EXCL + fd = self._raw_open(flags, mode) + os.close(fd) + + def mkdir(self, mode=0o777, parents=False): + if self._closed: + self._raise_closed() + if not parents: + self._accessor.mkdir(self, mode) + else: + try: + self._accessor.mkdir(self, mode) + except OSError as e: + if e.errno != ENOENT: + raise + self.parent.mkdir(mode, True) + self._accessor.mkdir(self, mode) + + def chmod(self, mode): + """ + Change the permissions of the path, like os.chmod(). 
+ """ + if self._closed: + self._raise_closed() + self._accessor.chmod(self, mode) + + def lchmod(self, mode): + """ + Like chmod(), except if the path points to a symlink, the symlink's + permissions are changed, rather than its target's. + """ + if self._closed: + self._raise_closed() + self._accessor.lchmod(self, mode) + + def unlink(self): + """ + Remove this file or link. + If the path is a directory, use rmdir() instead. + """ + if self._closed: + self._raise_closed() + self._accessor.unlink(self) + + def rmdir(self): + """ + Remove this directory. The directory must be empty. + """ + if self._closed: + self._raise_closed() + self._accessor.rmdir(self) + + def lstat(self): + """ + Like stat(), except if the path points to a symlink, the symlink's + status information is returned, rather than its target's. + """ + if self._closed: + self._raise_closed() + return self._accessor.lstat(self) + + def rename(self, target): + """ + Rename this path to the given path. + """ + if self._closed: + self._raise_closed() + self._accessor.rename(self, target) + + def replace(self, target): + """ + Rename this path to the given path, clobbering the existing + destination if it exists. + """ + if self._closed: + self._raise_closed() + self._accessor.replace(self, target) + + def symlink_to(self, target, target_is_directory=False): + """ + Make this path a symlink pointing to the given path. + Note the order of arguments (self, target) is the reverse of os.symlink's. + """ + if self._closed: + self._raise_closed() + self._accessor.symlink(target, self, target_is_directory) + + # Convenience functions for querying the stat results + + def exists(self): + """ + Whether this path exists. + """ + try: + self.stat() + except OSError as e: + if e.errno != ENOENT: + raise + return False + return True + + def is_dir(self): + """ + Whether this path is a directory. + """ + try: + return S_ISDIR(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + def is_file(self): + """ + Whether this path is a regular file (also True for symlinks pointing + to regular files). + """ + try: + return S_ISREG(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + def is_symlink(self): + """ + Whether this path is a symbolic link. + """ + try: + return S_ISLNK(self.lstat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist + return False + + def is_block_device(self): + """ + Whether this path is a block device. + """ + try: + return S_ISBLK(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + def is_char_device(self): + """ + Whether this path is a character device. + """ + try: + return S_ISCHR(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + def is_fifo(self): + """ + Whether this path is a FIFO. 
+ """ + try: + return S_ISFIFO(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + def is_socket(self): + """ + Whether this path is a socket. + """ + try: + return S_ISSOCK(self.stat().st_mode) + except OSError as e: + if e.errno != ENOENT: + raise + # Path doesn't exist or is a broken symlink + # (see https://bitbucket.org/pitrou/pathlib/issue/12/) + return False + + +class PosixPath(Path, PurePosixPath): + __slots__ = () + +class WindowsPath(Path, PureWindowsPath): + __slots__ = () + diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py new file mode 100755 --- /dev/null +++ b/Lib/test/test_pathlib.py @@ -0,0 +1,1639 @@ +import collections +import io +import os +import errno +import pathlib +import pickle +import shutil +import socket +import stat +import sys +import tempfile +import unittest +from contextlib import contextmanager + +from test import support +TESTFN = support.TESTFN + +try: + import grp, pwd +except ImportError: + grp = pwd = None + + +class _BaseFlavourTest(object): + + def _check_parse_parts(self, arg, expected): + f = self.flavour.parse_parts + sep = self.flavour.sep + altsep = self.flavour.altsep + actual = f([x.replace('/', sep) for x in arg]) + self.assertEqual(actual, expected) + if altsep: + actual = f([x.replace('/', altsep) for x in arg]) + self.assertEqual(actual, expected) + + def test_parse_parts_common(self): + check = self._check_parse_parts + sep = self.flavour.sep + # Unanchored parts + check([], ('', '', [])) + check(['a'], ('', '', ['a'])) + check(['a/'], ('', '', ['a'])) + check(['a', 'b'], ('', '', ['a', 'b'])) + # Expansion + check(['a/b'], ('', '', ['a', 'b'])) + check(['a/b/'], ('', '', ['a', 'b'])) + check(['a', 'b/c', 'd'], ('', '', ['a', 'b', 'c', 'd'])) + # Collapsing and stripping excess slashes + check(['a', 'b//c', 'd'], ('', '', ['a', 'b', 'c', 'd'])) + check(['a', 'b/c/', 'd'], ('', '', ['a', 'b', 'c', 'd'])) + # Eliminating standalone dots + check(['.'], ('', '', [])) + check(['.', '.', 'b'], ('', '', ['b'])) + check(['a', '.', 'b'], ('', '', ['a', 'b'])) + check(['a', '.', '.'], ('', '', ['a'])) + # The first part is anchored + check(['/a/b'], ('', sep, [sep, 'a', 'b'])) + check(['/a', 'b'], ('', sep, [sep, 'a', 'b'])) + check(['/a/', 'b'], ('', sep, [sep, 'a', 'b'])) + # Ignoring parts before an anchored part + check(['a', '/b', 'c'], ('', sep, [sep, 'b', 'c'])) + check(['a', '/b', '/c'], ('', sep, [sep, 'c'])) + + +class PosixFlavourTest(_BaseFlavourTest, unittest.TestCase): + flavour = pathlib._posix_flavour + + def test_parse_parts(self): + check = self._check_parse_parts + # Collapsing of excess leading slashes, except for the double-slash + # special case. 
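# The double-slash special case checked below is also visible through the
# public API; a rough sketch:
from pathlib import PurePosixPath

assert str(PurePosixPath('//a/b')) == '//a/b'    # exactly two leading slashes are kept
assert PurePosixPath('//a/b').root == '//'
assert str(PurePosixPath('///a/b')) == '/a/b'    # three or more collapse to one
assert PurePosixPath('///a/b').root == '/'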
+ check(['//a', 'b'], ('', '//', ['//', 'a', 'b'])) + check(['///a', 'b'], ('', '/', ['/', 'a', 'b'])) + check(['////a', 'b'], ('', '/', ['/', 'a', 'b'])) + # Paths which look like NT paths aren't treated specially + check(['c:a'], ('', '', ['c:a'])) + check(['c:\\a'], ('', '', ['c:\\a'])) + check(['\\a'], ('', '', ['\\a'])) + + def test_splitroot(self): + f = self.flavour.splitroot + self.assertEqual(f(''), ('', '', '')) + self.assertEqual(f('a'), ('', '', 'a')) + self.assertEqual(f('a/b'), ('', '', 'a/b')) + self.assertEqual(f('a/b/'), ('', '', 'a/b/')) + self.assertEqual(f('/a'), ('', '/', 'a')) + self.assertEqual(f('/a/b'), ('', '/', 'a/b')) + self.assertEqual(f('/a/b/'), ('', '/', 'a/b/')) + # The root is collapsed when there are redundant slashes + # except when there are exactly two leading slashes, which + # is a special case in POSIX. + self.assertEqual(f('//a'), ('', '//', 'a')) + self.assertEqual(f('///a'), ('', '/', 'a')) + self.assertEqual(f('///a/b'), ('', '/', 'a/b')) + # Paths which look like NT paths aren't treated specially + self.assertEqual(f('c:/a/b'), ('', '', 'c:/a/b')) + self.assertEqual(f('\\/a/b'), ('', '', '\\/a/b')) + self.assertEqual(f('\\a\\b'), ('', '', '\\a\\b')) + + +class NTFlavourTest(_BaseFlavourTest, unittest.TestCase): + flavour = pathlib._windows_flavour + + def test_parse_parts(self): + check = self._check_parse_parts + # First part is anchored + check(['c:'], ('c:', '', ['c:'])) + check(['c:\\'], ('c:', '\\', ['c:\\'])) + check(['\\'], ('', '\\', ['\\'])) + check(['c:a'], ('c:', '', ['c:', 'a'])) + check(['c:\\a'], ('c:', '\\', ['c:\\', 'a'])) + check(['\\a'], ('', '\\', ['\\', 'a'])) + # UNC paths + check(['\\\\a\\b'], ('\\\\a\\b', '\\', ['\\\\a\\b\\'])) + check(['\\\\a\\b\\'], ('\\\\a\\b', '\\', ['\\\\a\\b\\'])) + check(['\\\\a\\b\\c'], ('\\\\a\\b', '\\', ['\\\\a\\b\\', 'c'])) + # Second part is anchored, so that the first part is ignored + check(['a', 'Z:b', 'c'], ('Z:', '', ['Z:', 'b', 'c'])) + check(['a', 'Z:\\b', 'c'], ('Z:', '\\', ['Z:\\', 'b', 'c'])) + check(['a', '\\b', 'c'], ('', '\\', ['\\', 'b', 'c'])) + # UNC paths + check(['a', '\\\\b\\c', 'd'], ('\\\\b\\c', '\\', ['\\\\b\\c\\', 'd'])) + # Collapsing and stripping excess slashes + check(['a', 'Z:\\\\b\\\\c\\', 'd\\'], ('Z:', '\\', ['Z:\\', 'b', 'c', 'd'])) + # UNC paths + check(['a', '\\\\b\\c\\\\', 'd'], ('\\\\b\\c', '\\', ['\\\\b\\c\\', 'd'])) + # Extended paths + check(['\\\\?\\c:\\'], ('\\\\?\\c:', '\\', ['\\\\?\\c:\\'])) + check(['\\\\?\\c:\\a'], ('\\\\?\\c:', '\\', ['\\\\?\\c:\\', 'a'])) + # Extended UNC paths (format is "\\?\UNC\server\share") + check(['\\\\?\\UNC\\b\\c'], ('\\\\?\\UNC\\b\\c', '\\', ['\\\\?\\UNC\\b\\c\\'])) + check(['\\\\?\\UNC\\b\\c\\d'], ('\\\\?\\UNC\\b\\c', '\\', ['\\\\?\\UNC\\b\\c\\', 'd'])) + + def test_splitroot(self): + f = self.flavour.splitroot + self.assertEqual(f(''), ('', '', '')) + self.assertEqual(f('a'), ('', '', 'a')) + self.assertEqual(f('a\\b'), ('', '', 'a\\b')) + self.assertEqual(f('\\a'), ('', '\\', 'a')) + self.assertEqual(f('\\a\\b'), ('', '\\', 'a\\b')) + self.assertEqual(f('c:a\\b'), ('c:', '', 'a\\b')) + self.assertEqual(f('c:\\a\\b'), ('c:', '\\', 'a\\b')) + # Redundant slashes in the root are collapsed + self.assertEqual(f('\\\\a'), ('', '\\', 'a')) + self.assertEqual(f('\\\\\\a/b'), ('', '\\', 'a/b')) + self.assertEqual(f('c:\\\\a'), ('c:', '\\', 'a')) + self.assertEqual(f('c:\\\\\\a/b'), ('c:', '\\', 'a/b')) + # Valid UNC paths + self.assertEqual(f('\\\\a\\b'), ('\\\\a\\b', '\\', '')) + self.assertEqual(f('\\\\a\\b\\'), ('\\\\a\\b', 
'\\', '')) + self.assertEqual(f('\\\\a\\b\\c\\d'), ('\\\\a\\b', '\\', 'c\\d')) + # These are non-UNC paths (according to ntpath.py and test_ntpath) + # However, command.com says such paths are invalid, so it's + # difficult to know what the right semantics are + self.assertEqual(f('\\\\\\a\\b'), ('', '\\', 'a\\b')) + self.assertEqual(f('\\\\a'), ('', '\\', 'a')) + + +# +# Tests for the pure classes +# + +class _BasePurePathTest(object): + + # keys are canonical paths, values are list of tuples of arguments + # supposed to produce equal paths + equivalences = { + 'a/b': [ + ('a', 'b'), ('a/', 'b'), ('a', 'b/'), ('a/', 'b/'), + ('a/b/',), ('a//b',), ('a//b//',), + # empty components get removed + ('', 'a', 'b'), ('a', '', 'b'), ('a', 'b', ''), + ], + '/b/c/d': [ + ('a', '/b/c', 'd'), ('a', '///b//c', 'd/'), + ('/a', '/b/c', 'd'), + # empty components get removed + ('/', 'b', '', 'c/d'), ('/', '', 'b/c/d'), ('', '/b/c/d'), + ], + } + + def setUp(self): + p = self.cls('a') + self.flavour = p._flavour + self.sep = self.flavour.sep + self.altsep = self.flavour.altsep + + def test_constructor_common(self): + P = self.cls + p = P('a') + self.assertIsInstance(p, P) + P('a', 'b', 'c') + P('/a', 'b', 'c') + P('a/b/c') + P('/a/b/c') + self.assertEqual(P(P('a')), P('a')) + self.assertEqual(P(P('a'), 'b'), P('a/b')) + self.assertEqual(P(P('a'), P('b')), P('a/b')) + + def test_join_common(self): + P = self.cls + p = P('a/b') + pp = p.joinpath('c') + self.assertEqual(pp, P('a/b/c')) + self.assertIs(type(pp), type(p)) + pp = p.joinpath('c', 'd') + self.assertEqual(pp, P('a/b/c/d')) + pp = p.joinpath(P('c')) + self.assertEqual(pp, P('a/b/c')) + pp = p.joinpath('/c') + self.assertEqual(pp, P('/c')) + + def test_div_common(self): + # Basically the same as joinpath() + P = self.cls + p = P('a/b') + pp = p / 'c' + self.assertEqual(pp, P('a/b/c')) + self.assertIs(type(pp), type(p)) + pp = p / 'c/d' + self.assertEqual(pp, P('a/b/c/d')) + pp = p / 'c' / 'd' + self.assertEqual(pp, P('a/b/c/d')) + pp = 'c' / p / 'd' + self.assertEqual(pp, P('c/a/b/d')) + pp = p / P('c') + self.assertEqual(pp, P('a/b/c')) + pp = p/ '/c' + self.assertEqual(pp, P('/c')) + + def _check_str(self, expected, args): + p = self.cls(*args) + self.assertEqual(str(p), expected.replace('/', self.sep)) + + def test_str_common(self): + # Canonicalized paths roundtrip + for pathstr in ('a', 'a/b', 'a/b/c', '/', '/a/b', '/a/b/c'): + self._check_str(pathstr, (pathstr,)) + # Special case for the empty path + self._check_str('.', ('',)) + # Other tests for str() are in test_equivalences() + + def test_as_posix_common(self): + P = self.cls + for pathstr in ('a', 'a/b', 'a/b/c', '/', '/a/b', '/a/b/c'): + self.assertEqual(P(pathstr).as_posix(), pathstr) + # Other tests for as_posix() are in test_equivalences() + + def test_as_bytes_common(self): + sep = os.fsencode(self.sep) + P = self.cls + self.assertEqual(bytes(P('a/b')), b'a' + sep + b'b') + + def test_as_uri_common(self): + P = self.cls + with self.assertRaises(ValueError): + P('a').as_uri() + with self.assertRaises(ValueError): + P().as_uri() + + def test_repr_common(self): + for pathstr in ('a', 'a/b', 'a/b/c', '/', '/a/b', '/a/b/c'): + p = self.cls(pathstr) + clsname = p.__class__.__name__ + r = repr(p) + # The repr() is in the form ClassName("forward-slashes path") + self.assertTrue(r.startswith(clsname + '('), r) + self.assertTrue(r.endswith(')'), r) + inner = r[len(clsname) + 1 : -1] + self.assertEqual(eval(inner), p.as_posix()) + # The repr() roundtrips + q = eval(r, pathlib.__dict__) + 
self.assertIs(q.__class__, p.__class__) + self.assertEqual(q, p) + self.assertEqual(repr(q), r) + + def test_eq_common(self): + P = self.cls + self.assertEqual(P('a/b'), P('a/b')) + self.assertEqual(P('a/b'), P('a', 'b')) + self.assertNotEqual(P('a/b'), P('a')) + self.assertNotEqual(P('a/b'), P('/a/b')) + self.assertNotEqual(P('a/b'), P()) + self.assertNotEqual(P('/a/b'), P('/')) + self.assertNotEqual(P(), P('/')) + self.assertNotEqual(P(), "") + self.assertNotEqual(P(), {}) + self.assertNotEqual(P(), int) + + def test_match_common(self): + P = self.cls + self.assertRaises(ValueError, P('a').match, '') + self.assertRaises(ValueError, P('a').match, '.') + # Simple relative pattern + self.assertTrue(P('b.py').match('b.py')) + self.assertTrue(P('a/b.py').match('b.py')) + self.assertTrue(P('/a/b.py').match('b.py')) + self.assertFalse(P('a.py').match('b.py')) + self.assertFalse(P('b/py').match('b.py')) + self.assertFalse(P('/a.py').match('b.py')) + self.assertFalse(P('b.py/c').match('b.py')) + # Wilcard relative pattern + self.assertTrue(P('b.py').match('*.py')) + self.assertTrue(P('a/b.py').match('*.py')) + self.assertTrue(P('/a/b.py').match('*.py')) + self.assertFalse(P('b.pyc').match('*.py')) + self.assertFalse(P('b./py').match('*.py')) + self.assertFalse(P('b.py/c').match('*.py')) + # Multi-part relative pattern + self.assertTrue(P('ab/c.py').match('a*/*.py')) + self.assertTrue(P('/d/ab/c.py').match('a*/*.py')) + self.assertFalse(P('a.py').match('a*/*.py')) + self.assertFalse(P('/dab/c.py').match('a*/*.py')) + self.assertFalse(P('ab/c.py/d').match('a*/*.py')) + # Absolute pattern + self.assertTrue(P('/b.py').match('/*.py')) + self.assertFalse(P('b.py').match('/*.py')) + self.assertFalse(P('a/b.py').match('/*.py')) + self.assertFalse(P('/a/b.py').match('/*.py')) + # Multi-part absolute pattern + self.assertTrue(P('/a/b.py').match('/a/*.py')) + self.assertFalse(P('/ab.py').match('/a/*.py')) + self.assertFalse(P('/a/b/c.py').match('/a/*.py')) + + def test_ordering_common(self): + # Ordering is tuple-alike + def assertLess(a, b): + self.assertLess(a, b) + self.assertGreater(b, a) + P = self.cls + a = P('a') + b = P('a/b') + c = P('abc') + d = P('b') + assertLess(a, b) + assertLess(a, c) + assertLess(a, d) + assertLess(b, c) + assertLess(c, d) + P = self.cls + a = P('/a') + b = P('/a/b') + c = P('/abc') + d = P('/b') + assertLess(a, b) + assertLess(a, c) + assertLess(a, d) + assertLess(b, c) + assertLess(c, d) + with self.assertRaises(TypeError): + P() < {} + + def test_parts_common(self): + # `parts` returns a tuple + sep = self.sep + P = self.cls + p = P('a/b') + parts = p.parts + self.assertEqual(parts, ('a', 'b')) + # The object gets reused + self.assertIs(parts, p.parts) + # When the path is absolute, the anchor is a separate part + p = P('/a/b') + parts = p.parts + self.assertEqual(parts, (sep, 'a', 'b')) + + def test_equivalences(self): + for k, tuples in self.equivalences.items(): + canon = k.replace('/', self.sep) + posix = k.replace(self.sep, '/') + if canon != posix: + tuples = tuples + [ + tuple(part.replace('/', self.sep) for part in t) + for t in tuples + ] + tuples.append((posix, )) + pcanon = self.cls(canon) + for t in tuples: + p = self.cls(*t) + self.assertEqual(p, pcanon, "failed with args {}".format(t)) + self.assertEqual(hash(p), hash(pcanon)) + self.assertEqual(str(p), canon) + self.assertEqual(p.as_posix(), posix) + + def test_parent_common(self): + # Relative + P = self.cls + p = P('a/b/c') + self.assertEqual(p.parent, P('a/b')) + self.assertEqual(p.parent.parent, P('a')) 
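# The parent/parents behaviour exercised by this test, as a short sketch:
from pathlib import PurePosixPath

p = PurePosixPath('/a/b/c')
assert p.parent == PurePosixPath('/a/b')
assert list(p.parents) == [PurePosixPath('/a/b'), PurePosixPath('/a'), PurePosixPath('/')]
assert PurePosixPath('/').parent == PurePosixPath('/')   # a root is its own parent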
+ self.assertEqual(p.parent.parent.parent, P()) + self.assertEqual(p.parent.parent.parent.parent, P()) + # Anchored + p = P('/a/b/c') + self.assertEqual(p.parent, P('/a/b')) + self.assertEqual(p.parent.parent, P('/a')) + self.assertEqual(p.parent.parent.parent, P('/')) + self.assertEqual(p.parent.parent.parent.parent, P('/')) + + def test_parents_common(self): + # Relative + P = self.cls + p = P('a/b/c') + par = p.parents + self.assertEqual(len(par), 3) + self.assertEqual(par[0], P('a/b')) + self.assertEqual(par[1], P('a')) + self.assertEqual(par[2], P('.')) + self.assertEqual(list(par), [P('a/b'), P('a'), P('.')]) + with self.assertRaises(IndexError): + par[-1] + with self.assertRaises(IndexError): + par[3] + with self.assertRaises(TypeError): + par[0] = p + # Anchored + p = P('/a/b/c') + par = p.parents + self.assertEqual(len(par), 3) + self.assertEqual(par[0], P('/a/b')) + self.assertEqual(par[1], P('/a')) + self.assertEqual(par[2], P('/')) + self.assertEqual(list(par), [P('/a/b'), P('/a'), P('/')]) + with self.assertRaises(IndexError): + par[3] + + def test_drive_common(self): + P = self.cls + self.assertEqual(P('a/b').drive, '') + self.assertEqual(P('/a/b').drive, '') + self.assertEqual(P('').drive, '') + + def test_root_common(self): + P = self.cls + sep = self.sep + self.assertEqual(P('').root, '') + self.assertEqual(P('a/b').root, '') + self.assertEqual(P('/').root, sep) + self.assertEqual(P('/a/b').root, sep) + + def test_anchor_common(self): + P = self.cls + sep = self.sep + self.assertEqual(P('').anchor, '') + self.assertEqual(P('a/b').anchor, '') + self.assertEqual(P('/').anchor, sep) + self.assertEqual(P('/a/b').anchor, sep) + + def test_name_common(self): + P = self.cls + self.assertEqual(P('').name, '') + self.assertEqual(P('.').name, '') + self.assertEqual(P('/').name, '') + self.assertEqual(P('a/b').name, 'b') + self.assertEqual(P('/a/b').name, 'b') + self.assertEqual(P('/a/b/.').name, 'b') + self.assertEqual(P('a/b.py').name, 'b.py') + self.assertEqual(P('/a/b.py').name, 'b.py') + + def test_suffix_common(self): + P = self.cls + self.assertEqual(P('').suffix, '') + self.assertEqual(P('.').suffix, '') + self.assertEqual(P('..').suffix, '') + self.assertEqual(P('/').suffix, '') + self.assertEqual(P('a/b').suffix, '') + self.assertEqual(P('/a/b').suffix, '') + self.assertEqual(P('/a/b/.').suffix, '') + self.assertEqual(P('a/b.py').suffix, '.py') + self.assertEqual(P('/a/b.py').suffix, '.py') + self.assertEqual(P('a/.hgrc').suffix, '') + self.assertEqual(P('/a/.hgrc').suffix, '') + self.assertEqual(P('a/.hg.rc').suffix, '.rc') + self.assertEqual(P('/a/.hg.rc').suffix, '.rc') + self.assertEqual(P('a/b.tar.gz').suffix, '.gz') + self.assertEqual(P('/a/b.tar.gz').suffix, '.gz') + self.assertEqual(P('a/Some name. Ending with a dot.').suffix, '') + self.assertEqual(P('/a/Some name. 
Ending with a dot.').suffix, '') + + def test_suffixes_common(self): + P = self.cls + self.assertEqual(P('').suffixes, []) + self.assertEqual(P('.').suffixes, []) + self.assertEqual(P('/').suffixes, []) + self.assertEqual(P('a/b').suffixes, []) + self.assertEqual(P('/a/b').suffixes, []) + self.assertEqual(P('/a/b/.').suffixes, []) + self.assertEqual(P('a/b.py').suffixes, ['.py']) + self.assertEqual(P('/a/b.py').suffixes, ['.py']) + self.assertEqual(P('a/.hgrc').suffixes, []) + self.assertEqual(P('/a/.hgrc').suffixes, []) + self.assertEqual(P('a/.hg.rc').suffixes, ['.rc']) + self.assertEqual(P('/a/.hg.rc').suffixes, ['.rc']) + self.assertEqual(P('a/b.tar.gz').suffixes, ['.tar', '.gz']) + self.assertEqual(P('/a/b.tar.gz').suffixes, ['.tar', '.gz']) + self.assertEqual(P('a/Some name. Ending with a dot.').suffixes, []) + self.assertEqual(P('/a/Some name. Ending with a dot.').suffixes, []) + + def test_stem_common(self): + P = self.cls + self.assertEqual(P('').stem, '') + self.assertEqual(P('.').stem, '') + self.assertEqual(P('..').stem, '..') + self.assertEqual(P('/').stem, '') + self.assertEqual(P('a/b').stem, 'b') + self.assertEqual(P('a/b.py').stem, 'b') + self.assertEqual(P('a/.hgrc').stem, '.hgrc') + self.assertEqual(P('a/.hg.rc').stem, '.hg') + self.assertEqual(P('a/b.tar.gz').stem, 'b.tar') + self.assertEqual(P('a/Some name. Ending with a dot.').stem, + 'Some name. Ending with a dot.') + + def test_with_name_common(self): + P = self.cls + self.assertEqual(P('a/b').with_name('d.xml'), P('a/d.xml')) + self.assertEqual(P('/a/b').with_name('d.xml'), P('/a/d.xml')) + self.assertEqual(P('a/b.py').with_name('d.xml'), P('a/d.xml')) + self.assertEqual(P('/a/b.py').with_name('d.xml'), P('/a/d.xml')) + self.assertEqual(P('a/Dot ending.').with_name('d.xml'), P('a/d.xml')) + self.assertEqual(P('/a/Dot ending.').with_name('d.xml'), P('/a/d.xml')) + self.assertRaises(ValueError, P('').with_name, 'd.xml') + self.assertRaises(ValueError, P('.').with_name, 'd.xml') + self.assertRaises(ValueError, P('/').with_name, 'd.xml') + + def test_with_suffix_common(self): + P = self.cls + self.assertEqual(P('a/b').with_suffix('.gz'), P('a/b.gz')) + self.assertEqual(P('/a/b').with_suffix('.gz'), P('/a/b.gz')) + self.assertEqual(P('a/b.py').with_suffix('.gz'), P('a/b.gz')) + self.assertEqual(P('/a/b.py').with_suffix('.gz'), P('/a/b.gz')) + self.assertRaises(ValueError, P('').with_suffix, '.gz') + self.assertRaises(ValueError, P('.').with_suffix, '.gz') + self.assertRaises(ValueError, P('/').with_suffix, '.gz') + + def test_relative_to_common(self): + P = self.cls + p = P('a/b') + self.assertRaises(TypeError, p.relative_to) + self.assertEqual(p.relative_to(P()), P('a/b')) + self.assertEqual(p.relative_to(P('a')), P('b')) + self.assertEqual(p.relative_to(P('a/b')), P()) + # With several args + self.assertEqual(p.relative_to('a', 'b'), P()) + # Unrelated paths + self.assertRaises(ValueError, p.relative_to, P('c')) + self.assertRaises(ValueError, p.relative_to, P('a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('a/c')) + self.assertRaises(ValueError, p.relative_to, P('/a')) + p = P('/a/b') + self.assertEqual(p.relative_to(P('/')), P('a/b')) + self.assertEqual(p.relative_to(P('/a')), P('b')) + self.assertEqual(p.relative_to(P('/a/b')), P()) + # Unrelated paths + self.assertRaises(ValueError, p.relative_to, P('/c')) + self.assertRaises(ValueError, p.relative_to, P('/a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('/a/c')) + self.assertRaises(ValueError, p.relative_to, P()) + 
self.assertRaises(ValueError, p.relative_to, P('a')) + + def test_pickling_common(self): + P = self.cls + p = P('/a/b') + for proto in range(0, pickle.HIGHEST_PROTOCOL + 1): + dumped = pickle.dumps(p, proto) + pp = pickle.loads(dumped) + self.assertIs(pp.__class__, p.__class__) + self.assertEqual(pp, p) + self.assertEqual(hash(pp), hash(p)) + self.assertEqual(str(pp), str(p)) + + +class PurePosixPathTest(_BasePurePathTest, unittest.TestCase): + cls = pathlib.PurePosixPath + + def test_root(self): + P = self.cls + self.assertEqual(P('/a/b').root, '/') + self.assertEqual(P('///a/b').root, '/') + # POSIX special case for two leading slashes + self.assertEqual(P('//a/b').root, '//') + + def test_eq(self): + P = self.cls + self.assertNotEqual(P('a/b'), P('A/b')) + self.assertEqual(P('/a'), P('///a')) + self.assertNotEqual(P('/a'), P('//a')) + + def test_as_uri(self): + from urllib.parse import quote_from_bytes + P = self.cls + self.assertEqual(P('/').as_uri(), 'file:///') + self.assertEqual(P('/a/b.c').as_uri(), 'file:///a/b.c') + self.assertEqual(P('/a/b%#c').as_uri(), 'file:///a/b%25%23c') + self.assertEqual(P('/a/b\xe9').as_uri(), + 'file:///a/b' + quote_from_bytes(os.fsencode('\xe9'))) + + def test_match(self): + P = self.cls + self.assertFalse(P('A.py').match('a.PY')) + + def test_is_absolute(self): + P = self.cls + self.assertFalse(P().is_absolute()) + self.assertFalse(P('a').is_absolute()) + self.assertFalse(P('a/b/').is_absolute()) + self.assertTrue(P('/').is_absolute()) + self.assertTrue(P('/a').is_absolute()) + self.assertTrue(P('/a/b/').is_absolute()) + self.assertTrue(P('//a').is_absolute()) + self.assertTrue(P('//a/b').is_absolute()) + + def test_is_reserved(self): + P = self.cls + self.assertIs(False, P('').is_reserved()) + self.assertIs(False, P('/').is_reserved()) + self.assertIs(False, P('/foo/bar').is_reserved()) + self.assertIs(False, P('/dev/con/PRN/NUL').is_reserved()) + + def test_join(self): + P = self.cls + p = P('//a') + pp = p.joinpath('b') + self.assertEqual(pp, P('//a/b')) + pp = P('/a').joinpath('//c') + self.assertEqual(pp, P('//c')) + pp = P('//a').joinpath('/c') + self.assertEqual(pp, P('/c')) + + def test_div(self): + # Basically the same as joinpath() + P = self.cls + p = P('//a') + pp = p / 'b' + self.assertEqual(pp, P('//a/b')) + pp = P('/a') / '//c' + self.assertEqual(pp, P('//c')) + pp = P('//a') / '/c' + self.assertEqual(pp, P('/c')) + + +class PureWindowsPathTest(_BasePurePathTest, unittest.TestCase): + cls = pathlib.PureWindowsPath + + equivalences = _BasePurePathTest.equivalences.copy() + equivalences.update({ + 'c:a': [ ('c:', 'a'), ('c:', 'a/'), ('/', 'c:', 'a') ], + 'c:/a': [ + ('c:/', 'a'), ('c:', '/', 'a'), ('c:', '/a'), + ('/z', 'c:/', 'a'), ('//x/y', 'c:/', 'a'), + ], + '//a/b/': [ ('//a/b',) ], + '//a/b/c': [ + ('//a/b', 'c'), ('//a/b/', 'c'), + ], + }) + + def test_str(self): + p = self.cls('a/b/c') + self.assertEqual(str(p), 'a\\b\\c') + p = self.cls('c:/a/b/c') + self.assertEqual(str(p), 'c:\\a\\b\\c') + p = self.cls('//a/b') + self.assertEqual(str(p), '\\\\a\\b\\') + p = self.cls('//a/b/c') + self.assertEqual(str(p), '\\\\a\\b\\c') + p = self.cls('//a/b/c/d') + self.assertEqual(str(p), '\\\\a\\b\\c\\d') + + def test_eq(self): + P = self.cls + self.assertEqual(P('c:a/b'), P('c:a/b')) + self.assertEqual(P('c:a/b'), P('c:', 'a', 'b')) + self.assertNotEqual(P('c:a/b'), P('d:a/b')) + self.assertNotEqual(P('c:a/b'), P('c:/a/b')) + self.assertNotEqual(P('/a/b'), P('c:/a/b')) + # Case-insensitivity + self.assertEqual(P('a/B'), P('A/b')) + 
self.assertEqual(P('C:a/B'), P('c:A/b')) + self.assertEqual(P('//Some/SHARE/a/B'), P('//somE/share/A/b')) + + def test_as_uri(self): + from urllib.parse import quote_from_bytes + P = self.cls + with self.assertRaises(ValueError): + P('/a/b').as_uri() + with self.assertRaises(ValueError): + P('c:a/b').as_uri() + self.assertEqual(P('c:/').as_uri(), 'file:///c:/') + self.assertEqual(P('c:/a/b.c').as_uri(), 'file:///c:/a/b.c') + self.assertEqual(P('c:/a/b%#c').as_uri(), 'file:///c:/a/b%25%23c') + self.assertEqual(P('c:/a/b\xe9').as_uri(), 'file:///c:/a/b%C3%A9') + self.assertEqual(P('//some/share/').as_uri(), 'file://some/share/') + self.assertEqual(P('//some/share/a/b.c').as_uri(), + 'file://some/share/a/b.c') + self.assertEqual(P('//some/share/a/b%#c\xe9').as_uri(), + 'file://some/share/a/b%25%23c%C3%A9') + + def test_match_common(self): + P = self.cls + # Absolute patterns + self.assertTrue(P('c:/b.py').match('/*.py')) + self.assertTrue(P('c:/b.py').match('c:*.py')) + self.assertTrue(P('c:/b.py').match('c:/*.py')) + self.assertFalse(P('d:/b.py').match('c:/*.py')) # wrong drive + self.assertFalse(P('b.py').match('/*.py')) + self.assertFalse(P('b.py').match('c:*.py')) + self.assertFalse(P('b.py').match('c:/*.py')) + self.assertFalse(P('c:b.py').match('/*.py')) + self.assertFalse(P('c:b.py').match('c:/*.py')) + self.assertFalse(P('/b.py').match('c:*.py')) + self.assertFalse(P('/b.py').match('c:/*.py')) + # UNC patterns + self.assertTrue(P('//some/share/a.py').match('/*.py')) + self.assertTrue(P('//some/share/a.py').match('//some/share/*.py')) + self.assertFalse(P('//other/share/a.py').match('//some/share/*.py')) + self.assertFalse(P('//some/share/a/b.py').match('//some/share/*.py')) + # Case-insensitivity + self.assertTrue(P('B.py').match('b.PY')) + self.assertTrue(P('c:/a/B.Py').match('C:/A/*.pY')) + self.assertTrue(P('//Some/Share/B.Py').match('//somE/sharE/*.pY')) + + def test_ordering_common(self): + # Case-insensitivity + def assertOrderedEqual(a, b): + self.assertLessEqual(a, b) + self.assertGreaterEqual(b, a) + P = self.cls + p = P('c:A/b') + q = P('C:a/B') + assertOrderedEqual(p, q) + self.assertFalse(p < q) + self.assertFalse(p > q) + p = P('//some/Share/A/b') + q = P('//Some/SHARE/a/B') + assertOrderedEqual(p, q) + self.assertFalse(p < q) + self.assertFalse(p > q) + + def test_parts(self): + P = self.cls + p = P('c:a/b') + parts = p.parts + self.assertEqual(parts, ('c:', 'a', 'b')) + p = P('c:/a/b') + parts = p.parts + self.assertEqual(parts, ('c:\\', 'a', 'b')) + p = P('//a/b/c/d') + parts = p.parts + self.assertEqual(parts, ('\\\\a\\b\\', 'c', 'd')) + + def test_parent(self): + # Anchored + P = self.cls + p = P('z:a/b/c') + self.assertEqual(p.parent, P('z:a/b')) + self.assertEqual(p.parent.parent, P('z:a')) + self.assertEqual(p.parent.parent.parent, P('z:')) + self.assertEqual(p.parent.parent.parent.parent, P('z:')) + p = P('z:/a/b/c') + self.assertEqual(p.parent, P('z:/a/b')) + self.assertEqual(p.parent.parent, P('z:/a')) + self.assertEqual(p.parent.parent.parent, P('z:/')) + self.assertEqual(p.parent.parent.parent.parent, P('z:/')) + p = P('//a/b/c/d') + self.assertEqual(p.parent, P('//a/b/c')) + self.assertEqual(p.parent.parent, P('//a/b')) + self.assertEqual(p.parent.parent.parent, P('//a/b')) + + def test_parents(self): + # Anchored + P = self.cls + p = P('z:a/b/') + par = p.parents + self.assertEqual(len(par), 2) + self.assertEqual(par[0], P('z:a')) + self.assertEqual(par[1], P('z:')) + self.assertEqual(list(par), [P('z:a'), P('z:')]) + with self.assertRaises(IndexError): + 
par[2] + p = P('z:/a/b/') + par = p.parents + self.assertEqual(len(par), 2) + self.assertEqual(par[0], P('z:/a')) + self.assertEqual(par[1], P('z:/')) + self.assertEqual(list(par), [P('z:/a'), P('z:/')]) + with self.assertRaises(IndexError): + par[2] + p = P('//a/b/c/d') + par = p.parents + self.assertEqual(len(par), 2) + self.assertEqual(par[0], P('//a/b/c')) + self.assertEqual(par[1], P('//a/b')) + self.assertEqual(list(par), [P('//a/b/c'), P('//a/b')]) + with self.assertRaises(IndexError): + par[2] + + def test_drive(self): + P = self.cls + self.assertEqual(P('c:').drive, 'c:') + self.assertEqual(P('c:a/b').drive, 'c:') + self.assertEqual(P('c:/').drive, 'c:') + self.assertEqual(P('c:/a/b/').drive, 'c:') + self.assertEqual(P('//a/b').drive, '\\\\a\\b') + self.assertEqual(P('//a/b/').drive, '\\\\a\\b') + self.assertEqual(P('//a/b/c/d').drive, '\\\\a\\b') + + def test_root(self): + P = self.cls + self.assertEqual(P('c:').root, '') + self.assertEqual(P('c:a/b').root, '') + self.assertEqual(P('c:/').root, '\\') + self.assertEqual(P('c:/a/b/').root, '\\') + self.assertEqual(P('//a/b').root, '\\') + self.assertEqual(P('//a/b/').root, '\\') + self.assertEqual(P('//a/b/c/d').root, '\\') + + def test_anchor(self): + P = self.cls + self.assertEqual(P('c:').anchor, 'c:') + self.assertEqual(P('c:a/b').anchor, 'c:') + self.assertEqual(P('c:/').anchor, 'c:\\') + self.assertEqual(P('c:/a/b/').anchor, 'c:\\') + self.assertEqual(P('//a/b').anchor, '\\\\a\\b\\') + self.assertEqual(P('//a/b/').anchor, '\\\\a\\b\\') + self.assertEqual(P('//a/b/c/d').anchor, '\\\\a\\b\\') + + def test_name(self): + P = self.cls + self.assertEqual(P('c:').name, '') + self.assertEqual(P('c:/').name, '') + self.assertEqual(P('c:a/b').name, 'b') + self.assertEqual(P('c:/a/b').name, 'b') + self.assertEqual(P('c:a/b.py').name, 'b.py') + self.assertEqual(P('c:/a/b.py').name, 'b.py') + self.assertEqual(P('//My.py/Share.php').name, '') + self.assertEqual(P('//My.py/Share.php/a/b').name, 'b') + + def test_suffix(self): + P = self.cls + self.assertEqual(P('c:').suffix, '') + self.assertEqual(P('c:/').suffix, '') + self.assertEqual(P('c:a/b').suffix, '') + self.assertEqual(P('c:/a/b').suffix, '') + self.assertEqual(P('c:a/b.py').suffix, '.py') + self.assertEqual(P('c:/a/b.py').suffix, '.py') + self.assertEqual(P('c:a/.hgrc').suffix, '') + self.assertEqual(P('c:/a/.hgrc').suffix, '') + self.assertEqual(P('c:a/.hg.rc').suffix, '.rc') + self.assertEqual(P('c:/a/.hg.rc').suffix, '.rc') + self.assertEqual(P('c:a/b.tar.gz').suffix, '.gz') + self.assertEqual(P('c:/a/b.tar.gz').suffix, '.gz') + self.assertEqual(P('c:a/Some name. Ending with a dot.').suffix, '') + self.assertEqual(P('c:/a/Some name. 
Ending with a dot.').suffix, '') + self.assertEqual(P('//My.py/Share.php').suffix, '') + self.assertEqual(P('//My.py/Share.php/a/b').suffix, '') + + def test_suffixes(self): + P = self.cls + self.assertEqual(P('c:').suffixes, []) + self.assertEqual(P('c:/').suffixes, []) + self.assertEqual(P('c:a/b').suffixes, []) + self.assertEqual(P('c:/a/b').suffixes, []) + self.assertEqual(P('c:a/b.py').suffixes, ['.py']) + self.assertEqual(P('c:/a/b.py').suffixes, ['.py']) + self.assertEqual(P('c:a/.hgrc').suffixes, []) + self.assertEqual(P('c:/a/.hgrc').suffixes, []) + self.assertEqual(P('c:a/.hg.rc').suffixes, ['.rc']) + self.assertEqual(P('c:/a/.hg.rc').suffixes, ['.rc']) + self.assertEqual(P('c:a/b.tar.gz').suffixes, ['.tar', '.gz']) + self.assertEqual(P('c:/a/b.tar.gz').suffixes, ['.tar', '.gz']) + self.assertEqual(P('//My.py/Share.php').suffixes, []) + self.assertEqual(P('//My.py/Share.php/a/b').suffixes, []) + self.assertEqual(P('c:a/Some name. Ending with a dot.').suffixes, []) + self.assertEqual(P('c:/a/Some name. Ending with a dot.').suffixes, []) + + def test_stem(self): + P = self.cls + self.assertEqual(P('c:').stem, '') + self.assertEqual(P('c:.').stem, '') + self.assertEqual(P('c:..').stem, '..') + self.assertEqual(P('c:/').stem, '') + self.assertEqual(P('c:a/b').stem, 'b') + self.assertEqual(P('c:a/b.py').stem, 'b') + self.assertEqual(P('c:a/.hgrc').stem, '.hgrc') + self.assertEqual(P('c:a/.hg.rc').stem, '.hg') + self.assertEqual(P('c:a/b.tar.gz').stem, 'b.tar') + self.assertEqual(P('c:a/Some name. Ending with a dot.').stem, + 'Some name. Ending with a dot.') + + def test_with_name(self): + P = self.cls + self.assertEqual(P('c:a/b').with_name('d.xml'), P('c:a/d.xml')) + self.assertEqual(P('c:/a/b').with_name('d.xml'), P('c:/a/d.xml')) + self.assertEqual(P('c:a/Dot ending.').with_name('d.xml'), P('c:a/d.xml')) + self.assertEqual(P('c:/a/Dot ending.').with_name('d.xml'), P('c:/a/d.xml')) + self.assertRaises(ValueError, P('c:').with_name, 'd.xml') + self.assertRaises(ValueError, P('c:/').with_name, 'd.xml') + self.assertRaises(ValueError, P('//My/Share').with_name, 'd.xml') + + def test_with_suffix(self): + P = self.cls + self.assertEqual(P('c:a/b').with_suffix('.gz'), P('c:a/b.gz')) + self.assertEqual(P('c:/a/b').with_suffix('.gz'), P('c:/a/b.gz')) + self.assertEqual(P('c:a/b.py').with_suffix('.gz'), P('c:a/b.gz')) + self.assertEqual(P('c:/a/b.py').with_suffix('.gz'), P('c:/a/b.gz')) + self.assertRaises(ValueError, P('').with_suffix, '.gz') + self.assertRaises(ValueError, P('.').with_suffix, '.gz') + self.assertRaises(ValueError, P('/').with_suffix, '.gz') + self.assertRaises(ValueError, P('//My/Share').with_suffix, '.gz') + + def test_relative_to(self): + P = self.cls + p = P('c:a/b') + self.assertEqual(p.relative_to(P('c:')), P('a/b')) + self.assertEqual(p.relative_to(P('c:a')), P('b')) + self.assertEqual(p.relative_to(P('c:a/b')), P()) + # Unrelated paths + self.assertRaises(ValueError, p.relative_to, P()) + self.assertRaises(ValueError, p.relative_to, P('d:')) + self.assertRaises(ValueError, p.relative_to, P('a')) + self.assertRaises(ValueError, p.relative_to, P('/a')) + self.assertRaises(ValueError, p.relative_to, P('c:a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('c:a/c')) + self.assertRaises(ValueError, p.relative_to, P('c:/a')) + p = P('c:/a/b') + self.assertEqual(p.relative_to(P('c:')), P('/a/b')) + self.assertEqual(p.relative_to(P('c:/')), P('a/b')) + self.assertEqual(p.relative_to(P('c:/a')), P('b')) + self.assertEqual(p.relative_to(P('c:/a/b')), P()) + # Unrelated 
paths + self.assertRaises(ValueError, p.relative_to, P('c:/c')) + self.assertRaises(ValueError, p.relative_to, P('c:/a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('c:/a/c')) + self.assertRaises(ValueError, p.relative_to, P('c:a')) + self.assertRaises(ValueError, p.relative_to, P('d:')) + self.assertRaises(ValueError, p.relative_to, P('d:/')) + self.assertRaises(ValueError, p.relative_to, P('/a')) + self.assertRaises(ValueError, p.relative_to, P('//c/a')) + # UNC paths + p = P('//a/b/c/d') + self.assertEqual(p.relative_to(P('//a/b')), P('c/d')) + self.assertEqual(p.relative_to(P('//a/b/c')), P('d')) + self.assertEqual(p.relative_to(P('//a/b/c/d')), P()) + # Unrelated paths + self.assertRaises(ValueError, p.relative_to, P('/a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('c:/a/b/c')) + self.assertRaises(ValueError, p.relative_to, P('//z/b/c')) + self.assertRaises(ValueError, p.relative_to, P('//a/z/c')) + + def test_is_absolute(self): + P = self.cls + # Under NT, only paths with both a drive and a root are absolute + self.assertFalse(P().is_absolute()) + self.assertFalse(P('a').is_absolute()) + self.assertFalse(P('a/b/').is_absolute()) + self.assertFalse(P('/').is_absolute()) + self.assertFalse(P('/a').is_absolute()) + self.assertFalse(P('/a/b/').is_absolute()) + self.assertFalse(P('c:').is_absolute()) + self.assertFalse(P('c:a').is_absolute()) + self.assertFalse(P('c:a/b/').is_absolute()) + self.assertTrue(P('c:/').is_absolute()) + self.assertTrue(P('c:/a').is_absolute()) + self.assertTrue(P('c:/a/b/').is_absolute()) + # UNC paths are absolute by definition + self.assertTrue(P('//a/b').is_absolute()) + self.assertTrue(P('//a/b/').is_absolute()) + self.assertTrue(P('//a/b/c').is_absolute()) + self.assertTrue(P('//a/b/c/d').is_absolute()) + + def test_is_reserved(self): + P = self.cls + self.assertIs(False, P('').is_reserved()) + self.assertIs(False, P('/').is_reserved()) + self.assertIs(False, P('/foo/bar').is_reserved()) + self.assertIs(True, P('con').is_reserved()) + self.assertIs(True, P('NUL').is_reserved()) + self.assertIs(True, P('NUL.txt').is_reserved()) + self.assertIs(True, P('com1').is_reserved()) + self.assertIs(True, P('com9.bar').is_reserved()) + self.assertIs(False, P('bar.com9').is_reserved()) + self.assertIs(True, P('lpt1').is_reserved()) + self.assertIs(True, P('lpt9.bar').is_reserved()) + self.assertIs(False, P('bar.lpt9').is_reserved()) + # Only the last component matters + self.assertIs(False, P('c:/NUL/con/baz').is_reserved()) + # UNC paths are never reserved + self.assertIs(False, P('//my/share/nul/con/aux').is_reserved()) + + +class PurePathTest(_BasePurePathTest, unittest.TestCase): + cls = pathlib.PurePath + + def test_concrete_class(self): + p = self.cls('a') + self.assertIs(type(p), + pathlib.PureWindowsPath if os.name == 'nt' else pathlib.PurePosixPath) + + def test_different_flavours_unequal(self): + p = pathlib.PurePosixPath('a') + q = pathlib.PureWindowsPath('a') + self.assertNotEqual(p, q) + + def test_different_flavours_unordered(self): + p = pathlib.PurePosixPath('a') + q = pathlib.PureWindowsPath('a') + with self.assertRaises(TypeError): + p < q + with self.assertRaises(TypeError): + p <= q + with self.assertRaises(TypeError): + p > q + with self.assertRaises(TypeError): + p >= q + + +# +# Tests for the concrete classes +# + +# Make sure any symbolic links in the base test path are resolved +BASE = os.path.realpath(TESTFN) +join = lambda *x: os.path.join(BASE, *x) +rel_join = lambda *x: os.path.join(TESTFN, *x) + +def 
symlink_skip_reason(): + if not pathlib.supports_symlinks: + return "no system support for symlinks" + try: + os.symlink(__file__, BASE) + except OSError as e: + return str(e) + else: + support.unlink(BASE) + return None + +symlink_skip_reason = symlink_skip_reason() + +only_nt = unittest.skipIf(os.name != 'nt', + 'test requires a Windows-compatible system') +only_posix = unittest.skipIf(os.name == 'nt', + 'test requires a POSIX-compatible system') +with_symlinks = unittest.skipIf(symlink_skip_reason, symlink_skip_reason) + + + at only_posix +class PosixPathAsPureTest(PurePosixPathTest): + cls = pathlib.PosixPath + + at only_nt +class WindowsPathAsPureTest(PureWindowsPathTest): + cls = pathlib.WindowsPath + + +class _BasePathTest(object): + """Tests for the FS-accessing functionalities of the Path classes.""" + + # (BASE) + # | + # |-- dirA/ + # |-- linkC -> "../dirB" + # |-- dirB/ + # | |-- fileB + # |-- linkD -> "../dirB" + # |-- dirC/ + # | |-- fileC + # | |-- fileD + # |-- fileA + # |-- linkA -> "fileA" + # |-- linkB -> "dirB" + # + + def setUp(self): + os.mkdir(BASE) + self.addCleanup(shutil.rmtree, BASE) + os.mkdir(join('dirA')) + os.mkdir(join('dirB')) + os.mkdir(join('dirC')) + os.mkdir(join('dirC', 'dirD')) + with open(join('fileA'), 'wb') as f: + f.write(b"this is file A\n") + with open(join('dirB', 'fileB'), 'wb') as f: + f.write(b"this is file B\n") + with open(join('dirC', 'fileC'), 'wb') as f: + f.write(b"this is file C\n") + with open(join('dirC', 'dirD', 'fileD'), 'wb') as f: + f.write(b"this is file D\n") + if not symlink_skip_reason: + if os.name == 'nt': + # Workaround for http://bugs.python.org/issue13772 + def dirlink(src, dest): + os.symlink(src, dest, target_is_directory=True) + else: + def dirlink(src, dest): + os.symlink(src, dest) + # Relative symlinks + os.symlink('fileA', join('linkA')) + os.symlink('non-existing', join('brokenLink')) + dirlink('dirB', join('linkB')) + dirlink(os.path.join('..', 'dirB'), join('dirA', 'linkC')) + # This one goes upwards but doesn't create a loop + dirlink(os.path.join('..', 'dirB'), join('dirB', 'linkD')) + + def assertSame(self, path_a, path_b): + self.assertTrue(os.path.samefile(str(path_a), str(path_b)), + "%r and %r don't point to the same file" % + (path_a, path_b)) + + def assertFileNotFound(self, func, *args, **kwargs): + with self.assertRaises(FileNotFoundError) as cm: + func(*args, **kwargs) + self.assertEqual(cm.exception.errno, errno.ENOENT) + + def _test_cwd(self, p): + q = self.cls(os.getcwd()) + self.assertEqual(p, q) + self.assertEqual(str(p), str(q)) + self.assertIs(type(p), type(q)) + self.assertTrue(p.is_absolute()) + + def test_cwd(self): + p = self.cls.cwd() + self._test_cwd(p) + + def test_empty_path(self): + # The empty path points to '.' 
+ p = self.cls('') + self.assertEqual(p.stat(), os.stat('.')) + + def test_exists(self): + P = self.cls + p = P(BASE) + self.assertIs(True, p.exists()) + self.assertIs(True, (p / 'dirA').exists()) + self.assertIs(True, (p / 'fileA').exists()) + if not symlink_skip_reason: + self.assertIs(True, (p / 'linkA').exists()) + self.assertIs(True, (p / 'linkB').exists()) + self.assertIs(False, (p / 'foo').exists()) + self.assertIs(False, P('/xyzzy').exists()) + + def test_open_common(self): + p = self.cls(BASE) + with (p / 'fileA').open('r') as f: + self.assertIsInstance(f, io.TextIOBase) + self.assertEqual(f.read(), "this is file A\n") + with (p / 'fileA').open('rb') as f: + self.assertIsInstance(f, io.BufferedIOBase) + self.assertEqual(f.read().strip(), b"this is file A") + with (p / 'fileA').open('rb', buffering=0) as f: + self.assertIsInstance(f, io.RawIOBase) + self.assertEqual(f.read().strip(), b"this is file A") + + def test_iterdir(self): + P = self.cls + p = P(BASE) + it = p.iterdir() + paths = set(it) + expected = ['dirA', 'dirB', 'dirC', 'fileA'] + if not symlink_skip_reason: + expected += ['linkA', 'linkB', 'brokenLink'] + self.assertEqual(paths, { P(BASE, q) for q in expected }) + + @with_symlinks + def test_iterdir_symlink(self): + # __iter__ on a symlink to a directory + P = self.cls + p = P(BASE, 'linkB') + paths = set(p.iterdir()) + expected = { P(BASE, 'linkB', q) for q in ['fileB', 'linkD'] } + self.assertEqual(paths, expected) + + def test_iterdir_nodir(self): + # __iter__ on something that is not a directory + p = self.cls(BASE, 'fileA') + with self.assertRaises(OSError) as cm: + next(p.iterdir()) + # ENOENT or EINVAL under Windows, ENOTDIR otherwise + # (see issue #12802) + self.assertIn(cm.exception.errno, (errno.ENOTDIR, + errno.ENOENT, errno.EINVAL)) + + def test_glob_common(self): + def _check(glob, expected): + self.assertEqual(set(glob), { P(BASE, q) for q in expected }) + P = self.cls + p = P(BASE) + it = p.glob("fileA") + self.assertIsInstance(it, collections.Iterator) + _check(it, ["fileA"]) + _check(p.glob("fileB"), []) + _check(p.glob("dir*/file*"), ["dirB/fileB", "dirC/fileC"]) + if symlink_skip_reason: + _check(p.glob("*A"), ['dirA', 'fileA']) + else: + _check(p.glob("*A"), ['dirA', 'fileA', 'linkA']) + if symlink_skip_reason: + _check(p.glob("*B/*"), ['dirB/fileB']) + else: + _check(p.glob("*B/*"), ['dirB/fileB', 'dirB/linkD', + 'linkB/fileB', 'linkB/linkD']) + if symlink_skip_reason: + _check(p.glob("*/fileB"), ['dirB/fileB']) + else: + _check(p.glob("*/fileB"), ['dirB/fileB', 'linkB/fileB']) + + def test_rglob_common(self): + def _check(glob, expected): + self.assertEqual(set(glob), { P(BASE, q) for q in expected }) + P = self.cls + p = P(BASE) + it = p.rglob("fileA") + self.assertIsInstance(it, collections.Iterator) + # XXX cannot test because of symlink loops in the test setup + #_check(it, ["fileA"]) + #_check(p.rglob("fileB"), ["dirB/fileB"]) + #_check(p.rglob("*/fileA"), [""]) + #_check(p.rglob("*/fileB"), ["dirB/fileB"]) + #_check(p.rglob("file*"), ["fileA", "dirB/fileB"]) + # No symlink loops here + p = P(BASE, "dirC") + _check(p.rglob("file*"), ["dirC/fileC", "dirC/dirD/fileD"]) + _check(p.rglob("*/*"), ["dirC/dirD/fileD"]) + + def test_glob_dotdot(self): + # ".." 
is not special in globs + P = self.cls + p = P(BASE) + self.assertEqual(set(p.glob("..")), { P(BASE, "..") }) + self.assertEqual(set(p.glob("dirA/../file*")), { P(BASE, "dirA/../fileA") }) + self.assertEqual(set(p.glob("../xyzzy")), set()) + + def _check_resolve_relative(self, p, expected): + q = p.resolve() + self.assertEqual(q, expected) + + def _check_resolve_absolute(self, p, expected): + q = p.resolve() + self.assertEqual(q, expected) + + @with_symlinks + def test_resolve_common(self): + P = self.cls + p = P(BASE, 'foo') + with self.assertRaises(OSError) as cm: + p.resolve() + self.assertEqual(cm.exception.errno, errno.ENOENT) + # These are all relative symlinks + p = P(BASE, 'dirB', 'fileB') + self._check_resolve_relative(p, p) + p = P(BASE, 'linkA') + self._check_resolve_relative(p, P(BASE, 'fileA')) + p = P(BASE, 'dirA', 'linkC', 'fileB') + self._check_resolve_relative(p, P(BASE, 'dirB', 'fileB')) + p = P(BASE, 'dirB', 'linkD', 'fileB') + self._check_resolve_relative(p, P(BASE, 'dirB', 'fileB')) + # Now create absolute symlinks + d = tempfile.mkdtemp(suffix='-dirD') + self.addCleanup(shutil.rmtree, d) + os.symlink(os.path.join(d), join('dirA', 'linkX')) + os.symlink(join('dirB'), os.path.join(d, 'linkY')) + p = P(BASE, 'dirA', 'linkX', 'linkY', 'fileB') + self._check_resolve_absolute(p, P(BASE, 'dirB', 'fileB')) + + def test_with(self): + p = self.cls(BASE) + it = p.iterdir() + it2 = p.iterdir() + next(it2) + with p: + pass + # I/O operation on closed path + self.assertRaises(ValueError, next, it) + self.assertRaises(ValueError, next, it2) + self.assertRaises(ValueError, p.open) + self.assertRaises(ValueError, p.resolve) + self.assertRaises(ValueError, p.absolute) + self.assertRaises(ValueError, p.__enter__) + + def test_chmod(self): + p = self.cls(BASE) / 'fileA' + mode = p.stat().st_mode + # Clear writable bit + new_mode = mode & ~0o222 + p.chmod(new_mode) + self.assertEqual(p.stat().st_mode, new_mode) + # Set writable bit + new_mode = mode | 0o222 + p.chmod(new_mode) + self.assertEqual(p.stat().st_mode, new_mode) + + # XXX also need a test for lchmod + + def test_stat(self): + p = self.cls(BASE) / 'fileA' + st = p.stat() + self.assertEqual(p.stat(), st) + # Change file mode by flipping write bit + p.chmod(st.st_mode ^ 0o222) + self.addCleanup(p.chmod, st.st_mode) + self.assertNotEqual(p.stat(), st) + + @with_symlinks + def test_lstat(self): + p = self.cls(BASE)/ 'linkA' + st = p.stat() + self.assertNotEqual(st, p.lstat()) + + def test_lstat_nosymlink(self): + p = self.cls(BASE) / 'fileA' + st = p.stat() + self.assertEqual(st, p.lstat()) + + @unittest.skipUnless(pwd, "the pwd module is needed for this test") + def test_owner(self): + p = self.cls(BASE) / 'fileA' + uid = p.stat().st_uid + name = pwd.getpwuid(uid).pw_name + self.assertEqual(name, p.owner()) + + @unittest.skipUnless(grp, "the grp module is needed for this test") + def test_group(self): + p = self.cls(BASE) / 'fileA' + gid = p.stat().st_gid + name = grp.getgrgid(gid).gr_name + self.assertEqual(name, p.group()) + + def test_unlink(self): + p = self.cls(BASE) / 'fileA' + p.unlink() + self.assertFileNotFound(p.stat) + self.assertFileNotFound(p.unlink) + + def test_rmdir(self): + p = self.cls(BASE) / 'dirA' + for q in p.iterdir(): + q.unlink() + p.rmdir() + self.assertFileNotFound(p.stat) + self.assertFileNotFound(p.unlink) + + def test_rename(self): + P = self.cls(BASE) + p = P / 'fileA' + size = p.stat().st_size + # Renaming to another path + q = P / 'dirA' / 'fileAA' + p.rename(q) + self.assertEqual(q.stat().st_size, 
size) + self.assertFileNotFound(p.stat) + # Renaming to a str of a relative path + r = rel_join('fileAAA') + q.rename(r) + self.assertEqual(os.stat(r).st_size, size) + self.assertFileNotFound(q.stat) + + def test_replace(self): + P = self.cls(BASE) + p = P / 'fileA' + size = p.stat().st_size + # Replacing a non-existing path + q = P / 'dirA' / 'fileAA' + p.replace(q) + self.assertEqual(q.stat().st_size, size) + self.assertFileNotFound(p.stat) + # Replacing another (existing) path + r = rel_join('dirB', 'fileB') + q.replace(r) + self.assertEqual(os.stat(r).st_size, size) + self.assertFileNotFound(q.stat) + + def test_touch_common(self): + P = self.cls(BASE) + p = P / 'newfileA' + self.assertFalse(p.exists()) + p.touch() + self.assertTrue(p.exists()) + old_mtime = p.stat().st_mtime + # Rewind the mtime sufficiently far in the past to work around + # filesystem-specific timestamp granularity. + os.utime(str(p), (old_mtime - 10, old_mtime - 10)) + # The file mtime is refreshed by calling touch() again + p.touch() + self.assertGreaterEqual(p.stat().st_mtime, old_mtime) + p = P / 'newfileB' + self.assertFalse(p.exists()) + p.touch(mode=0o700, exist_ok=False) + self.assertTrue(p.exists()) + self.assertRaises(OSError, p.touch, exist_ok=False) + + def test_mkdir(self): + P = self.cls(BASE) + p = P / 'newdirA' + self.assertFalse(p.exists()) + p.mkdir() + self.assertTrue(p.exists()) + self.assertTrue(p.is_dir()) + with self.assertRaises(OSError) as cm: + p.mkdir() + self.assertEqual(cm.exception.errno, errno.EEXIST) + # XXX test `mode` arg + + def test_mkdir_parents(self): + # Creating a chain of directories + p = self.cls(BASE, 'newdirB', 'newdirC') + self.assertFalse(p.exists()) + with self.assertRaises(OSError) as cm: + p.mkdir() + self.assertEqual(cm.exception.errno, errno.ENOENT) + p.mkdir(parents=True) + self.assertTrue(p.exists()) + self.assertTrue(p.is_dir()) + with self.assertRaises(OSError) as cm: + p.mkdir(parents=True) + self.assertEqual(cm.exception.errno, errno.EEXIST) + # XXX test `mode` arg + + @with_symlinks + def test_symlink_to(self): + P = self.cls(BASE) + target = P / 'fileA' + # Symlinking a path target + link = P / 'dirA' / 'linkAA' + link.symlink_to(target) + self.assertEqual(link.stat(), target.stat()) + self.assertNotEqual(link.lstat(), target.stat()) + # Symlinking a str target + link = P / 'dirA' / 'linkAAA' + link.symlink_to(str(target)) + self.assertEqual(link.stat(), target.stat()) + self.assertNotEqual(link.lstat(), target.stat()) + self.assertFalse(link.is_dir()) + # Symlinking to a directory + target = P / 'dirB' + link = P / 'dirA' / 'linkAAAA' + link.symlink_to(target, target_is_directory=True) + self.assertEqual(link.stat(), target.stat()) + self.assertNotEqual(link.lstat(), target.stat()) + self.assertTrue(link.is_dir()) + self.assertTrue(list(link.iterdir())) + + def test_is_dir(self): + P = self.cls(BASE) + self.assertTrue((P / 'dirA').is_dir()) + self.assertFalse((P / 'fileA').is_dir()) + self.assertFalse((P / 'non-existing').is_dir()) + if not symlink_skip_reason: + self.assertFalse((P / 'linkA').is_dir()) + self.assertTrue((P / 'linkB').is_dir()) + self.assertFalse((P/ 'brokenLink').is_dir()) + + def test_is_file(self): + P = self.cls(BASE) + self.assertTrue((P / 'fileA').is_file()) + self.assertFalse((P / 'dirA').is_file()) + self.assertFalse((P / 'non-existing').is_file()) + if not symlink_skip_reason: + self.assertTrue((P / 'linkA').is_file()) + self.assertFalse((P / 'linkB').is_file()) + self.assertFalse((P/ 'brokenLink').is_file()) + + def 
test_is_symlink(self): + P = self.cls(BASE) + self.assertFalse((P / 'fileA').is_symlink()) + self.assertFalse((P / 'dirA').is_symlink()) + self.assertFalse((P / 'non-existing').is_symlink()) + if not symlink_skip_reason: + self.assertTrue((P / 'linkA').is_symlink()) + self.assertTrue((P / 'linkB').is_symlink()) + self.assertTrue((P/ 'brokenLink').is_symlink()) + + def test_is_fifo_false(self): + P = self.cls(BASE) + self.assertFalse((P / 'fileA').is_fifo()) + self.assertFalse((P / 'dirA').is_fifo()) + self.assertFalse((P / 'non-existing').is_fifo()) + + @unittest.skipUnless(hasattr(os, "mkfifo"), "os.mkfifo() required") + def test_is_fifo_true(self): + P = self.cls(BASE, 'myfifo') + os.mkfifo(str(P)) + self.assertTrue(P.is_fifo()) + self.assertFalse(P.is_socket()) + self.assertFalse(P.is_file()) + + def test_is_socket_false(self): + P = self.cls(BASE) + self.assertFalse((P / 'fileA').is_socket()) + self.assertFalse((P / 'dirA').is_socket()) + self.assertFalse((P / 'non-existing').is_socket()) + + @unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") + def test_is_socket_true(self): + P = self.cls(BASE, 'mysock') + sock = socket.socket(socket.SOCK_STREAM, socket.AF_UNIX) + self.addCleanup(sock.close) + sock.bind(str(P)) + self.assertTrue(P.is_socket()) + self.assertFalse(P.is_fifo()) + self.assertFalse(P.is_file()) + + def test_is_block_device_false(self): + P = self.cls(BASE) + self.assertFalse((P / 'fileA').is_block_device()) + self.assertFalse((P / 'dirA').is_block_device()) + self.assertFalse((P / 'non-existing').is_block_device()) + + def test_is_char_device_false(self): + P = self.cls(BASE) + self.assertFalse((P / 'fileA').is_char_device()) + self.assertFalse((P / 'dirA').is_char_device()) + self.assertFalse((P / 'non-existing').is_char_device()) + + def test_is_char_device_true(self): + # Under Unix, /dev/null should generally be a char device + P = self.cls('/dev/null') + if not P.exists(): + self.skipTest("/dev/null required") + self.assertTrue(P.is_char_device()) + self.assertFalse(P.is_block_device()) + self.assertFalse(P.is_file()) + + def test_pickling_common(self): + p = self.cls(BASE, 'fileA') + for proto in range(0, pickle.HIGHEST_PROTOCOL + 1): + dumped = pickle.dumps(p, proto) + pp = pickle.loads(dumped) + self.assertEqual(pp.stat(), p.stat()) + + def test_parts_interning(self): + P = self.cls + p = P('/usr/bin/foo') + q = P('/usr/local/bin') + # 'usr' + self.assertIs(p.parts[1], q.parts[1]) + # 'bin' + self.assertIs(p.parts[2], q.parts[3]) + + +class PathTest(_BasePathTest, unittest.TestCase): + cls = pathlib.Path + + def test_concrete_class(self): + p = self.cls('a') + self.assertIs(type(p), + pathlib.WindowsPath if os.name == 'nt' else pathlib.PosixPath) + + def test_unsupported_flavour(self): + if os.name == 'nt': + self.assertRaises(NotImplementedError, pathlib.PosixPath) + else: + self.assertRaises(NotImplementedError, pathlib.WindowsPath) + + + at only_posix +class PosixPathTest(_BasePathTest, unittest.TestCase): + cls = pathlib.PosixPath + + def _check_symlink_loop(self, *args): + path = self.cls(*args) + with self.assertRaises(RuntimeError): + print(path.resolve()) + + def test_open_mode(self): + old_mask = os.umask(0) + self.addCleanup(os.umask, old_mask) + p = self.cls(BASE) + with (p / 'new_file').open('wb'): + pass + st = os.stat(join('new_file')) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o666) + os.umask(0o022) + with (p / 'other_new_file').open('wb'): + pass + st = os.stat(join('other_new_file')) + 
self.assertEqual(stat.S_IMODE(st.st_mode), 0o644) + + def test_touch_mode(self): + old_mask = os.umask(0) + self.addCleanup(os.umask, old_mask) + p = self.cls(BASE) + (p / 'new_file').touch() + st = os.stat(join('new_file')) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o666) + os.umask(0o022) + (p / 'other_new_file').touch() + st = os.stat(join('other_new_file')) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o644) + (p / 'masked_new_file').touch(mode=0o750) + st = os.stat(join('masked_new_file')) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o750) + + @with_symlinks + def test_resolve_loop(self): + # Loop detection for broken symlinks under POSIX + P = self.cls + # Loops with relative symlinks + os.symlink('linkX/inside', join('linkX')) + self._check_symlink_loop(BASE, 'linkX') + os.symlink('linkY', join('linkY')) + self._check_symlink_loop(BASE, 'linkY') + os.symlink('linkZ/../linkZ', join('linkZ')) + self._check_symlink_loop(BASE, 'linkZ') + # Loops with absolute symlinks + os.symlink(join('linkU/inside'), join('linkU')) + self._check_symlink_loop(BASE, 'linkU') + os.symlink(join('linkV'), join('linkV')) + self._check_symlink_loop(BASE, 'linkV') + os.symlink(join('linkW/../linkW'), join('linkW')) + self._check_symlink_loop(BASE, 'linkW') + + def test_glob(self): + P = self.cls + p = P(BASE) + self.assertEqual(set(p.glob("FILEa")), set()) + + def test_rglob(self): + P = self.cls + p = P(BASE, "dirC") + self.assertEqual(set(p.rglob("FILEd")), set()) + + + at only_nt +class WindowsPathTest(_BasePathTest, unittest.TestCase): + cls = pathlib.WindowsPath + + def test_glob(self): + P = self.cls + p = P(BASE) + self.assertEqual(set(p.glob("FILEa")), { P(BASE, "fileA") }) + + def test_rglob(self): + P = self.cls + p = P(BASE, "dirC") + self.assertEqual(set(p.rglob("FILEd")), { P(BASE, "dirC/dirD/fileD") }) + + +if __name__ == "__main__": + unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -65,6 +65,8 @@ Library ------- +- Issue #19673: Add pathlib to the stdlib as a provisional module (PEP 428). + - Issue #17916: Added dis.Bytecode.from_traceback() and dis.Bytecode.current_offset to easily display "current instruction" markers in the new disassembly API (Patch by Claudiu Popa). 
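As a quick illustration of the pure-path semantics the Windows test cases above exercise (case-insensitive comparison, drive/root/anchor splitting, file URIs), here is a minimal sketch; the paths are made-up examples, not taken from the patch, and since pure paths never touch the filesystem it runs on any platform:

    # Illustrative sketch only; the paths below are arbitrary examples.
    from pathlib import PureWindowsPath

    p = PureWindowsPath('c:/Users/guido/python.EXE')
    assert p.drive == 'c:'                  # drive letter, no trailing separator
    assert p.root == '\\'                   # root is the separator alone
    assert p.anchor == 'c:\\'               # anchor = drive + root
    assert p.suffix == '.EXE' and p.stem == 'python'
    assert p == PureWindowsPath('C:/USERS/GUIDO/python.exe')   # comparison ignores case
    assert p.as_uri() == 'file:///c:/Users/guido/python.EXE'

    share = PureWindowsPath('//some/share/b.py')
    assert share.drive == '\\\\some\\share'     # the UNC share acts as the drive
    assert share.match('//SOME/SHARE/*.py')     # matching ignores case as well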
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:38:53 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 17:38:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_whitespace?= Message-ID: <3dR3Hd68Ncz7Lp3@mail.python.org> http://hg.python.org/cpython/rev/8877dd155a13 changeset: 87349:8877dd155a13 user: Antoine Pitrou date: Fri Nov 22 17:38:41 2013 +0100 summary: Fix whitespace files: Lib/pathlib.py | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -1284,4 +1284,3 @@ class WindowsPath(Path, PureWindowsPath): __slots__ = () - -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:57:09 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 17:57:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_test_failure_under_sys?= =?utf-8?q?tems_with_an_incompatible_locale?= Message-ID: <3dR3hj02v2z7Lp2@mail.python.org> http://hg.python.org/cpython/rev/88b634b67e4d changeset: 87350:88b634b67e4d user: Antoine Pitrou date: Fri Nov 22 17:57:03 2013 +0100 summary: Fix test failure under systems with an incompatible locale files: Lib/test/test_pathlib.py | 9 ++++++++- 1 files changed, 8 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -586,11 +586,18 @@ self.assertNotEqual(P('/a'), P('//a')) def test_as_uri(self): - from urllib.parse import quote_from_bytes P = self.cls self.assertEqual(P('/').as_uri(), 'file:///') self.assertEqual(P('/a/b.c').as_uri(), 'file:///a/b.c') self.assertEqual(P('/a/b%#c').as_uri(), 'file:///a/b%25%23c') + + def test_as_uri_non_ascii(self): + from urllib.parse import quote_from_bytes + P = self.cls + try: + os.fsencode('\xe9') + except UnicodeEncodeError: + self.skipTest("\\xe9 cannot be encoded to the filesystem encoding") self.assertEqual(P('/a/b\xe9').as_uri(), 'file:///a/b' + quote_from_bytes(os.fsencode('\xe9'))) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 17:58:21 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 17:58:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Don=27t_assume_trying_to_f?= =?utf-8?q?ind_a_builtin_will_succeed_=28e=2Eg=2E_posix_isn=27t_on_Windows?= =?utf-8?q?=29?= Message-ID: <3dR3k50qxqz7LpZ@mail.python.org> http://hg.python.org/cpython/rev/2bea86df60b9 changeset: 87351:2bea86df60b9 user: Brett Cannon date: Fri Nov 22 11:58:17 2013 -0500 summary: Don't assume trying to find a builtin will succeed (e.g. 
posix isn't on Windows) files: Lib/importlib/_bootstrap.py | 2 + Python/importlib.h | 392 ++++++++++++----------- 2 files changed, 200 insertions(+), 194 deletions(-) diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -2268,6 +2268,8 @@ def _builtin_from_name(name): spec = BuiltinImporter.find_spec(name) + if spec is None: + raise ImportError('no built-in module named ' + name) methods = _SpecMethods(spec) return methods._load_unlocked() diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 18:05:26 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 18:05:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Hopefully_fix_test=5Fis=5F?= =?utf-8?q?socket=5Ftrue?= Message-ID: <3dR3tG5KNXz7LjM@mail.python.org> http://hg.python.org/cpython/rev/b97310457be0 changeset: 87352:b97310457be0 user: Antoine Pitrou date: Fri Nov 22 18:05:06 2013 +0100 summary: Hopefully fix test_is_socket_true files: Lib/test/test_pathlib.py | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1499,9 +1499,13 @@ @unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") def test_is_socket_true(self): P = self.cls(BASE, 'mysock') - sock = socket.socket(socket.SOCK_STREAM, socket.AF_UNIX) + sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) self.addCleanup(sock.close) - sock.bind(str(P)) + try: + sock.bind(str(P)) + except OSError as e: + if "AF_UNIX path too long" in str(e): + self.skipTest("cannot bind Unix socket: " + str(e)) self.assertTrue(P.is_socket()) self.assertFalse(P.is_fifo()) self.assertFalse(P.is_file()) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 18:07:51 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 18:07:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_a_commented-out_lin?= =?utf-8?q?e?= Message-ID: <3dR3x33ZJyz7LkK@mail.python.org> http://hg.python.org/cpython/rev/c8a84eed9155 changeset: 87353:c8a84eed9155 user: Brett Cannon date: Fri Nov 22 12:07:43 2013 -0500 summary: Remove a commented-out line files: Lib/importlib/_bootstrap.py | 2 - Python/importlib.h | 250 ++++++++++++------------ 2 files changed, 125 insertions(+), 127 deletions(-) diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -826,8 +826,6 @@ def spec_from_loader(name, loader, *, origin=None, is_package=None): """Return a module spec based on various loader methods.""" - -# if hasattr(loader, 'get_data'): if hasattr(loader, 'get_filename'): if is_package is None: return spec_from_file_location(name, loader=loader) diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 18:30:36 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 18:30:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_More_PEP_updates=2E?= Message-ID: <3dR4RJ3gnJz7Ljh@mail.python.org> http://hg.python.org/peps/rev/d93a7320b0f5 changeset: 5314:d93a7320b0f5 user: 
Barry Warsaw date: Fri Nov 22 12:30:31 2013 -0500 summary: More PEP updates. files: pep-0428.txt | 2 +- pep-0429.txt | 6 +++--- pep-0451.txt | 2 +- pep-0453.txt | 2 +- 4 files changed, 6 insertions(+), 6 deletions(-) diff --git a/pep-0428.txt b/pep-0428.txt --- a/pep-0428.txt +++ b/pep-0428.txt @@ -3,7 +3,7 @@ Version: $Revision$ Last-Modified: $Date$ Author: Antoine Pitrou -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/x-rst Created: 30-July-2012 diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -67,19 +67,19 @@ Implemented / Final PEPs: +* PEP 428, the pathlib module -- object-oriented filesystem paths * PEP 435, a standardized "enum" module * PEP 442, improved semantics for object finalization * PEP 443, adding single-dispatch generic functions to the standard library * PEP 445, a new C API for implementing custom memory allocators * PEP 446, make newly created file descriptors not inherited by default * PEP 450, basic statistics module for the standard library +* PEP 451, a ModuleSpec Type for the Import System +* PEP 453, pip bootstrapping/bundling with CPython * PEP 456, secure and interchangeable hash algorithm Accepted but not yet implemented/merged: -* PEP 428, the pathlib module -- object-oriented filesystem paths -* PEP 451, a ModuleSpec Type for the Import System -* PEP 453, pip bootstrapping/bundling with CPython * PEP 454, the tracemalloc module for tracing Python memory allocations * PEP 3154, Pickle protocol revision 4 * PEP 3156, improved asynchronous IO support diff --git a/pep-0451.txt b/pep-0451.txt --- a/pep-0451.txt +++ b/pep-0451.txt @@ -5,7 +5,7 @@ Author: Eric Snow BDFL-Delegate: Brett Cannon , Nick Coghlan Discussions-To: import-sig at python.org -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/x-rst Created: 8-Aug-2013 diff --git a/pep-0453.txt b/pep-0453.txt --- a/pep-0453.txt +++ b/pep-0453.txt @@ -5,7 +5,7 @@ Author: Donald Stufft , Nick Coghlan BDFL-Delegate: Martin von L?wis -Status: Accepted +Status: Final Type: Process Content-Type: text/x-rst Created: 10-Aug-2013 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 22 18:35:20 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 18:35:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_3156_has_also_been_committed?= =?utf-8?q?=2E?= Message-ID: <3dR4Xm4QfLz7LmQ@mail.python.org> http://hg.python.org/peps/rev/706fe4b8b148 changeset: 5315:706fe4b8b148 user: Barry Warsaw date: Fri Nov 22 12:35:10 2013 -0500 summary: 3156 has also been committed. 
files: pep-0429.txt | 2 +- pep-3156.txt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -77,12 +77,12 @@ * PEP 451, a ModuleSpec Type for the Import System * PEP 453, pip bootstrapping/bundling with CPython * PEP 456, secure and interchangeable hash algorithm +* PEP 3156, improved asynchronous IO support Accepted but not yet implemented/merged: * PEP 454, the tracemalloc module for tracing Python memory allocations * PEP 3154, Pickle protocol revision 4 -* PEP 3156, improved asynchronous IO support Other final large-scale changes: diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -5,7 +5,7 @@ Author: Guido van Rossum BDFL-Delegate: Antoine Pitrou Discussions-To: -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/x-rst Created: 12-Dec-2012 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 22 18:36:41 2013 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 22 Nov 2013 18:36:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_SQLite_to_3=2E8=2E1?= =?utf-8?q?_on_Windows=2E?= Message-ID: <3dR4ZK42vKz7LjM@mail.python.org> http://hg.python.org/cpython/rev/96860eee3bf3 changeset: 87354:96860eee3bf3 user: Martin v. L?wis date: Fri Nov 22 18:36:28 2013 +0100 summary: Update SQLite to 3.8.1 on Windows. files: Misc/NEWS | 2 ++ PC/VS9.0/pyproject.vsprops | 2 +- PCbuild/pyproject.props | 2 +- PCbuild/readme.txt | 2 +- Tools/buildbot/external-common.bat | 6 +++--- 5 files changed, 8 insertions(+), 6 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -352,6 +352,8 @@ Build ----- +- Update SQLite to 3.8.1 on Windows. + - Issue #16632: Enable DEP and ASLR on Windows. - Issue #17791: Drop PREFIX and EXEC_PREFIX definitions from PC/pyconfig.h diff --git a/PC/VS9.0/pyproject.vsprops b/PC/VS9.0/pyproject.vsprops --- a/PC/VS9.0/pyproject.vsprops +++ b/PC/VS9.0/pyproject.vsprops @@ -50,7 +50,7 @@ /> $(OutDir)python$(PyDebugExt).exe $(OutDir)kill_python$(PyDebugExt).exe ..\.. - $(externalsDir)\sqlite-3.7.12 + $(externalsDir)\sqlite-3.8.1 $(externalsDir)\bzip2-1.0.6 $(externalsDir)\xz-5.0.3 $(externalsDir)\openssl-1.0.1e diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -207,7 +207,7 @@ The ssl sub-project does not have the ability to clean the OpenSSL build; if you need to rebuild, you'll have to clean it by hand. 
_sqlite3 - Wraps SQLite 3.7.12, which is itself built by sqlite3.vcxproj + Wraps SQLite 3.8.1, which is itself built by sqlite3.vcxproj Homepage: http://www.sqlite.org/ _tkinter diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -37,9 +37,9 @@ if not exist tk-8.5.11.0 svn export http://svn.python.org/projects/external/tk-8.5.11.0 @rem sqlite3 -if not exist sqlite-3.7.12 ( - rd /s/q sqlite-source-3.7.4 - svn export http://svn.python.org/projects/external/sqlite-3.7.12 +if not exist sqlite-3.8.1 ( + rd /s/q sqlite-source-3.7.12 + svn export http://svn.python.org/projects/external/sqlite-3.8.1 ) @rem lzma -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 19:14:13 2013 From: python-checkins at python.org (martin.v.loewis) Date: Fri, 22 Nov 2013 19:14:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_xz_to_5=2E0=2E5=2E?= Message-ID: <3dR5Pd5gJHz7LkJ@mail.python.org> http://hg.python.org/cpython/rev/c5e7c83641cb changeset: 87355:c5e7c83641cb user: Martin v. L?wis date: Fri Nov 22 19:13:51 2013 +0100 summary: Update xz to 5.0.5. files: Misc/NEWS | 2 +- PC/VS9.0/pyproject.vsprops | 2 +- PCbuild/pyproject.props | 2 +- PCbuild/readme.txt | 6 +++--- Tools/buildbot/external-common.bat | 5 +++-- 5 files changed, 9 insertions(+), 8 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -352,7 +352,7 @@ Build ----- -- Update SQLite to 3.8.1 on Windows. +- Update SQLite to 3.8.1 and xz to 5.0.5 on Windows. - Issue #16632: Enable DEP and ASLR on Windows. diff --git a/PC/VS9.0/pyproject.vsprops b/PC/VS9.0/pyproject.vsprops --- a/PC/VS9.0/pyproject.vsprops +++ b/PC/VS9.0/pyproject.vsprops @@ -58,7 +58,7 @@ /> ..\.. $(externalsDir)\sqlite-3.8.1 $(externalsDir)\bzip2-1.0.6 - $(externalsDir)\xz-5.0.3 + $(externalsDir)\xz-5.0.5 $(externalsDir)\openssl-1.0.1e $(externalsDir)\tcltk $(externalsDir)\tcltk64 diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -162,7 +162,7 @@ http://www.bzip.org/ _lzma Python wrapper for the liblzma compression library, using pre-built - binaries of XZ Utils version 5.0.3 + binaries of XZ Utils version 5.0.5 Homepage: http://tukaani.org/xz/ _ssl @@ -243,8 +243,8 @@ It is also possible to download sources from each project's homepage, though you may have to change the names of some folders in order to make -things work. For instance, if you were to download a version 5.0.5 of -XZ Utils, you would need to extract the archive into ..\..\xz-5.0.3 +things work. For instance, if you were to download a version 5.0.7 of +XZ Utils, you would need to extract the archive into ..\..\xz-5.0.5 anyway, since that is where the solution is set to look for xz. The same is true for all other external projects. 
diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -43,6 +43,7 @@ ) @rem lzma -if not exist xz-5.0.3 ( - svn export http://svn.python.org/projects/external/xz-5.0.3 +if not exist xz-5.0.5 ( + rd /s/q xz-5.0.3 + svn export http://svn.python.org/projects/external/xz-5.0.5 ) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 19:22:24 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 19:22:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Make_some_tests_more_verbo?= =?utf-8?q?se_in_the_face_of_failure?= Message-ID: <3dR5b46LgWz7LkJ@mail.python.org> http://hg.python.org/cpython/rev/9621a37f069f changeset: 87356:9621a37f069f user: Brett Cannon date: Fri Nov 22 13:22:22 2013 -0500 summary: Make some tests more verbose in the face of failure files: Lib/test/test_module.py | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_module.py b/Lib/test/test_module.py --- a/Lib/test/test_module.py +++ b/Lib/test/test_module.py @@ -191,8 +191,12 @@ def test_module_repr_source(self): r = repr(unittest) - self.assertEqual(r[:25], "") + starts_with = " http://hg.python.org/cpython/rev/98bab4a03120 changeset: 87357:98bab4a03120 user: Brett Cannon date: Fri Nov 22 14:38:09 2013 -0500 summary: Issue #18864: Don't try and use unittest as a testing module for built-in loading; leads to a reload scenario where attributes get set which are wrong after the test. files: Lib/test/test_importlib/builtin/test_loader.py | 24 +++++++-- 1 files changed, 18 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_importlib/builtin/test_loader.py b/Lib/test/test_importlib/builtin/test_loader.py --- a/Lib/test/test_importlib/builtin/test_loader.py +++ b/Lib/test/test_importlib/builtin/test_loader.py @@ -63,10 +63,18 @@ def test_already_imported(self): # Using the name of a module already imported but not a built-in should # still fail. - assert hasattr(unittest, '__file__') # Not a built-in. + module_name = 'builtin_reload_test' + assert module_name not in sys.builtin_module_names + with util.uncache(module_name): + module = types.ModuleType(module_name) + spec = self.machinery.ModuleSpec(module_name, + self.machinery.BuiltinImporter, + origin='built-in') + module.__spec__ = spec + sys.modules[module_name] = module with self.assertRaises(ImportError) as cm: - self.load_spec('unittest') - self.assertEqual(cm.exception.name, 'unittest') + self.machinery.BuiltinImporter.exec_module(module) + self.assertEqual(cm.exception.name, module_name) Frozen_ExecModTests, Source_ExecModTests = util.test_both(ExecModTests, @@ -120,10 +128,14 @@ def test_already_imported(self): # Using the name of a module already imported but not a built-in should # still fail. - assert hasattr(unittest, '__file__') # Not a built-in. 
+ module_name = 'builtin_reload_test' + assert module_name not in sys.builtin_module_names + with util.uncache(module_name): + module = types.ModuleType(module_name) + sys.modules[module_name] = module with self.assertRaises(ImportError) as cm: - self.load_module('unittest') - self.assertEqual(cm.exception.name, 'unittest') + self.load_module(module_name) + self.assertEqual(cm.exception.name, module_name) Frozen_LoaderTests, Source_LoaderTests = util.test_both(LoaderTests, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 20:47:10 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 20:47:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Make_test=5Fimportlib_outp?= =?utf-8?q?ut_easier_to_trace_back_to_the_failing_test?= Message-ID: <3dR7St3mG3z7Ljq@mail.python.org> http://hg.python.org/cpython/rev/3736d01e1f53 changeset: 87358:3736d01e1f53 user: Brett Cannon date: Fri Nov 22 14:47:09 2013 -0500 summary: Make test_importlib output easier to trace back to the failing test class. files: Lib/test/test_importlib/util.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_importlib/util.py b/Lib/test/test_importlib/util.py --- a/Lib/test/test_importlib/util.py +++ b/Lib/test/test_importlib/util.py @@ -20,6 +20,7 @@ (test_class, unittest.TestCase)) source_tests = types.new_class('Source_'+test_class.__name__, (test_class, unittest.TestCase)) + frozen_tests.__module__ = source_tests.__module__ = test_class.__module__ for attr, (frozen_value, source_value) in kwargs.items(): setattr(frozen_tests, attr, frozen_value) setattr(source_tests, attr, source_value) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 20:49:15 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 22 Nov 2013 20:49:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Pass_cancellati?= =?utf-8?q?on_from_wrapping_Future_to_wrapped_Future=2E_By_Sa=C3=BAl?= Message-ID: <3dR7WH3Vlgz7LmT@mail.python.org> http://hg.python.org/cpython/rev/47f5c86d32ea changeset: 87359:47f5c86d32ea user: Guido van Rossum date: Fri Nov 22 11:47:22 2013 -0800 summary: asyncio: Pass cancellation from wrapping Future to wrapped Future. By Sa?l Ibarra Corretg? (mostly). files: Doc/library/asyncio.rst | 20 +++++++++++++++ Doc/library/concurrency.rst | 1 + Lib/asyncio/futures.py | 11 ++++++-- Lib/test/test_asyncio/test_futures.py | 18 +++++++++++++ 4 files changed, 47 insertions(+), 3 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst new file mode 100644 --- /dev/null +++ b/Doc/library/asyncio.rst @@ -0,0 +1,20 @@ +:mod:`asyncio` -- Asynchronous I/O, event loop, coroutines and tasks +==================================================================== + +.. module:: asyncio + :synopsis: Asynchronous I/O, event loop, coroutines and tasks. + +.. versionadded:: 3.4 + + +Introduction +------------ + +This package includes a pluggable event loop, transport and protocol +abstractions similar to those in Twisted, and a higher-level scheduler +for coroutines and tasks based on ``yield from`` (:PEP:`380`). + +Full documentation is not yet ready; we hope to have it written +before Python 3.4 leaves beta. Until then, the best reference is +:PEP:`3156`. For a motivational primer on transports and protocols, +see :PEP:`3153`. 
diff --git a/Doc/library/concurrency.rst b/Doc/library/concurrency.rst --- a/Doc/library/concurrency.rst +++ b/Doc/library/concurrency.rst @@ -22,6 +22,7 @@ queue.rst select.rst selectors.rst + asyncio.rst The following are support modules for some of the above services: diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -301,6 +301,8 @@ The other Future may be a concurrent.futures.Future. """ assert other.done() + if self.cancelled(): + return assert not self.done() if other.cancelled(): self.cancel() @@ -324,14 +326,17 @@ """Wrap concurrent.futures.Future object.""" if isinstance(fut, Future): return fut - assert isinstance(fut, concurrent.futures.Future), \ 'concurrent.futures.Future is expected, got {!r}'.format(fut) - if loop is None: loop = events.get_event_loop() + new_future = Future(loop=loop) - new_future = Future(loop=loop) + def _check_cancel_other(f): + if f.cancelled(): + fut.cancel() + + new_future.add_done_callback(_check_cancel_other) fut.add_done_callback( lambda future: loop.call_soon_threadsafe( new_future._copy_state, fut)) diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -241,6 +241,24 @@ f2 = futures.wrap_future(f1) self.assertIs(m_events.get_event_loop.return_value, f2._loop) + def test_wrap_future_cancel(self): + f1 = concurrent.futures.Future() + f2 = futures.wrap_future(f1, loop=self.loop) + f2.cancel() + test_utils.run_briefly(self.loop) + self.assertTrue(f1.cancelled()) + self.assertTrue(f2.cancelled()) + + def test_wrap_future_cancel2(self): + f1 = concurrent.futures.Future() + f2 = futures.wrap_future(f1, loop=self.loop) + f1.set_result(42) + f2.cancel() + test_utils.run_briefly(self.loop) + self.assertFalse(f1.cancelled()) + self.assertEqual(f1.result(), 42) + self.assertTrue(f2.cancelled()) + class FutureDoneCallbackTests(unittest.TestCase): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 20:53:07 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 20:53:07 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_User_the_repr_for_a_module?= =?utf-8?q?_name_in_more_places?= Message-ID: <3dR7bl6zkCz7LjM@mail.python.org> http://hg.python.org/cpython/rev/2e4169aac667 changeset: 87360:2e4169aac667 parent: 87358:3736d01e1f53 user: Brett Cannon date: Fri Nov 22 14:52:36 2013 -0500 summary: User the repr for a module name in more places files: Lib/importlib/_bootstrap.py | 12 +- Python/importlib.h | 6587 +++++++++++----------- 2 files changed, 3300 insertions(+), 3299 deletions(-) diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -526,7 +526,7 @@ """Decorator to verify the named module is built-in.""" def _requires_builtin_wrapper(self, fullname): if fullname not in sys.builtin_module_names: - raise ImportError('{} is not a built-in module'.format(fullname), + raise ImportError('{!r} is not a built-in module'.format(fullname), name=fullname) return fxn(self, fullname) _wrap(_requires_builtin_wrapper, fxn) @@ -537,7 +537,7 @@ """Decorator to verify the named module is frozen.""" def _requires_frozen_wrapper(self, fullname): if not _imp.is_frozen(fullname): - raise ImportError('{} is not a frozen module'.format(fullname), + raise ImportError('{!r} is not a frozen module'.format(fullname), 
name=fullname) return fxn(self, fullname) _wrap(_requires_frozen_wrapper, fxn) @@ -1117,7 +1117,7 @@ _imp.acquire_lock() with _ModuleLockManager(name): if sys.modules.get(name) is not module: - msg = 'module {} not in sys.modules'.format(name) + msg = 'module {!r} not in sys.modules'.format(name) raise ImportError(msg, name=name) if self.spec.loader is None: if self.spec.submodule_search_locations is None: @@ -1247,7 +1247,7 @@ spec = module.__spec__ name = spec.name if not _imp.is_builtin(name): - raise ImportError('{} is not a built-in module'.format(name), + raise ImportError('{!r} is not a built-in module'.format(name), name=name) _call_with_frames_removed(_imp.init_builtin, name) # Have to manually initialize attributes since init_builtin() is not @@ -1310,7 +1310,7 @@ def exec_module(module): name = module.__spec__.name if not _imp.is_frozen(name): - raise ImportError('{} is not a frozen module'.format(name), + raise ImportError('{!r} is not a frozen module'.format(name), name=name) code = _call_with_frames_removed(_imp.get_frozen_object, name) exec(code, module.__dict__) @@ -2127,7 +2127,7 @@ try: path = parent_module.__path__ except AttributeError: - msg = (_ERR_MSG + '; {} is not a package').format(name, parent) + msg = (_ERR_MSG + '; {!r} is not a package').format(name, parent) raise ImportError(msg, name=name) spec = _find_spec(name, path) if spec is None: diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 20:53:09 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 20:53:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dR7bn1j1Bz7LkJ@mail.python.org> http://hg.python.org/cpython/rev/f199dffa957e changeset: 87361:f199dffa957e parent: 87360:2e4169aac667 parent: 87359:47f5c86d32ea user: Brett Cannon date: Fri Nov 22 14:53:07 2013 -0500 summary: merge files: Doc/library/asyncio.rst | 20 +++++++++++++++ Doc/library/concurrency.rst | 1 + Lib/asyncio/futures.py | 11 ++++++-- Lib/test/test_asyncio/test_futures.py | 18 +++++++++++++ 4 files changed, 47 insertions(+), 3 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst new file mode 100644 --- /dev/null +++ b/Doc/library/asyncio.rst @@ -0,0 +1,20 @@ +:mod:`asyncio` -- Asynchronous I/O, event loop, coroutines and tasks +==================================================================== + +.. module:: asyncio + :synopsis: Asynchronous I/O, event loop, coroutines and tasks. + +.. versionadded:: 3.4 + + +Introduction +------------ + +This package includes a pluggable event loop, transport and protocol +abstractions similar to those in Twisted, and a higher-level scheduler +for coroutines and tasks based on ``yield from`` (:PEP:`380`). + +Full documentation is not yet ready; we hope to have it written +before Python 3.4 leaves beta. Until then, the best reference is +:PEP:`3156`. For a motivational primer on transports and protocols, +see :PEP:`3153`. 
diff --git a/Doc/library/concurrency.rst b/Doc/library/concurrency.rst --- a/Doc/library/concurrency.rst +++ b/Doc/library/concurrency.rst @@ -22,6 +22,7 @@ queue.rst select.rst selectors.rst + asyncio.rst The following are support modules for some of the above services: diff --git a/Lib/asyncio/futures.py b/Lib/asyncio/futures.py --- a/Lib/asyncio/futures.py +++ b/Lib/asyncio/futures.py @@ -301,6 +301,8 @@ The other Future may be a concurrent.futures.Future. """ assert other.done() + if self.cancelled(): + return assert not self.done() if other.cancelled(): self.cancel() @@ -324,14 +326,17 @@ """Wrap concurrent.futures.Future object.""" if isinstance(fut, Future): return fut - assert isinstance(fut, concurrent.futures.Future), \ 'concurrent.futures.Future is expected, got {!r}'.format(fut) - if loop is None: loop = events.get_event_loop() + new_future = Future(loop=loop) - new_future = Future(loop=loop) + def _check_cancel_other(f): + if f.cancelled(): + fut.cancel() + + new_future.add_done_callback(_check_cancel_other) fut.add_done_callback( lambda future: loop.call_soon_threadsafe( new_future._copy_state, fut)) diff --git a/Lib/test/test_asyncio/test_futures.py b/Lib/test/test_asyncio/test_futures.py --- a/Lib/test/test_asyncio/test_futures.py +++ b/Lib/test/test_asyncio/test_futures.py @@ -241,6 +241,24 @@ f2 = futures.wrap_future(f1) self.assertIs(m_events.get_event_loop.return_value, f2._loop) + def test_wrap_future_cancel(self): + f1 = concurrent.futures.Future() + f2 = futures.wrap_future(f1, loop=self.loop) + f2.cancel() + test_utils.run_briefly(self.loop) + self.assertTrue(f1.cancelled()) + self.assertTrue(f2.cancelled()) + + def test_wrap_future_cancel2(self): + f1 = concurrent.futures.Future() + f2 = futures.wrap_future(f1, loop=self.loop) + f1.set_result(42) + f2.cancel() + test_utils.run_briefly(self.loop) + self.assertFalse(f1.cancelled()) + self.assertEqual(f1.result(), 42) + self.assertTrue(f2.cancelled()) + class FutureDoneCallbackTests(unittest.TestCase): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 20:54:12 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 20:54:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_NEWS_entry_for_module_name?= =?utf-8?q?_repr_commit?= Message-ID: <3dR7d05zP7z7LjM@mail.python.org> http://hg.python.org/cpython/rev/781a61297877 changeset: 87362:781a61297877 user: Brett Cannon date: Fri Nov 22 14:54:13 2013 -0500 summary: NEWS entry for module name repr commit files: Misc/NEWS | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Use the repr of a module name in more places in import, especially exceptions. 
+ - Issue #19619: str.encode, bytes.decode and bytearray.decode now use an internal API to throw LookupError for known non-text encodings, rather than attempting the encoding or decoding operation and then throwing a -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:02:13 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 22 Nov 2013 21:02:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Move_select=2C_selectors?= =?utf-8?q?=2C_asyncio_to_section_18_=28IPC=29=2E?= Message-ID: <3dR7pF0pq0z7LkT@mail.python.org> http://hg.python.org/cpython/rev/14204cb465ce changeset: 87363:14204cb465ce user: Guido van Rossum date: Fri Nov 22 11:56:46 2013 -0800 summary: Move select, selectors, asyncio to section 18 (IPC). files: Doc/library/concurrency.rst | 3 --- Doc/library/ipc.rst | 3 +++ 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/concurrency.rst b/Doc/library/concurrency.rst --- a/Doc/library/concurrency.rst +++ b/Doc/library/concurrency.rst @@ -20,9 +20,6 @@ subprocess.rst sched.rst queue.rst - select.rst - selectors.rst - asyncio.rst The following are support modules for some of the above services: diff --git a/Doc/library/ipc.rst b/Doc/library/ipc.rst --- a/Doc/library/ipc.rst +++ b/Doc/library/ipc.rst @@ -18,6 +18,9 @@ socket.rst ssl.rst + select.rst + selectors.rst + asyncio.rst asyncore.rst asynchat.rst signal.rst -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:02:14 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 22 Nov 2013 21:02:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_note_to_asyncore/async?= =?utf-8?q?hat_recommending_asyncio_for_new_code=2E?= Message-ID: <3dR7pG39RKz7LkT@mail.python.org> http://hg.python.org/cpython/rev/db6ae01a5f7f changeset: 87364:db6ae01a5f7f user: Guido van Rossum date: Fri Nov 22 11:57:35 2013 -0800 summary: Add note to asyncore/asynchat recommending asyncio for new code. files: Doc/library/asynchat.rst | 3 +++ Doc/library/asyncore.rst | 3 +++ 2 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,6 +10,9 @@ -------------- +Note: This module exists for backwards compatibility only. For new code we +recommend using :module:`asyncio`. + This module builds on the :mod:`asyncore` infrastructure, simplifying asynchronous clients and servers and making it easier to handle protocols whose elements are terminated by arbitrary strings, or are of variable length. diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,6 +13,9 @@ -------------- +Note: This module exists for backwards compatibility only. For new code we +recommend using :module:`asyncio`. + This module provides the basic infrastructure for writing asynchronous socket service clients and servers. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:04:35 2013 From: python-checkins at python.org (zach.ware) Date: Fri, 22 Nov 2013 21:04:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE4MzI2?= =?utf-8?q?=3A_Clarify_that_list=2Esort=27s_arguments_are_keyword-only=2E?= Message-ID: <3dR7rz0qjZz7LkJ@mail.python.org> http://hg.python.org/cpython/rev/9192c0798a90 changeset: 87365:9192c0798a90 branch: 3.3 parent: 87314:cfbd894f1df1 user: Zachary Ware date: Fri Nov 22 13:58:34 2013 -0600 summary: Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, attempt to reduce confusion in the glossary by not saying there are different "types" of arguments and parameters. files: Doc/glossary.rst | 6 ++++-- Doc/library/stdtypes.rst | 3 +++ Misc/NEWS | 7 +++++++ 3 files changed, 14 insertions(+), 2 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -41,7 +41,7 @@ argument A value passed to a :term:`function` (or :term:`method`) when calling the - function. There are two types of arguments: + function. There are two kinds of argument: * :dfn:`keyword argument`: an argument preceded by an identifier (e.g. ``name=``) in a function call or passed as a value in a dictionary @@ -592,7 +592,7 @@ parameter A named entity in a :term:`function` (or method) definition that specifies an :term:`argument` (or in some cases, arguments) that the - function can accept. There are five types of parameters: + function can accept. There are five kinds of parameter: * :dfn:`positional-or-keyword`: specifies an argument that can be passed either :term:`positionally ` or as a :term:`keyword argument @@ -606,6 +606,8 @@ parameters. However, some built-in functions have positional-only parameters (e.g. :func:`abs`). + .. _keyword-only_parameter: + * :dfn:`keyword-only`: specifies an argument that can be supplied only by keyword. Keyword-only parameters can be defined by including a single var-positional parameter or bare ``*`` in the parameter list diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1149,6 +1149,9 @@ fail, the entire sort operation will fail (and the list will likely be left in a partially modified state). + :meth:`sort` accepts two arguments that can only be passed by keyword + (:ref:`keyword-only arguments `): + *key* specifies a function of one argument that is used to extract a comparison key from each list element (for example, ``key=str.lower``). The key corresponding to each item in the list is calculated once and diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,13 @@ - Issue #19085: Added basic tests for all tkinter widget options. +Documentation +------------- + +- Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, + attempt to reduce confusion in the glossary by not saying there are + different "types" of arguments and parameters. 
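To make the documented restriction concrete: *key* and *reverse* must be spelled out by name, and passing them positionally (for example ``scores.sort(None, True)``) raises ``TypeError``. An illustrative session::

    >>> scores = [('eve', 12), ('bob', 7), ('alice', 9)]
    >>> scores.sort(key=lambda pair: pair[1], reverse=True)   # keyword-only arguments
    >>> scores
    [('eve', 12), ('alice', 9), ('bob', 7)]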
+ Build ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:04:36 2013 From: python-checkins at python.org (zach.ware) Date: Fri, 22 Nov 2013 21:04:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2318326=3A_merge_with_3=2E3?= Message-ID: <3dR7s02ty7z7LkL@mail.python.org> http://hg.python.org/cpython/rev/3f1c332c5e2e changeset: 87366:3f1c332c5e2e parent: 87362:781a61297877 parent: 87365:9192c0798a90 user: Zachary Ware date: Fri Nov 22 14:03:10 2013 -0600 summary: Issue #18326: merge with 3.3 files: Doc/glossary.rst | 6 ++++-- Doc/library/stdtypes.rst | 3 +++ Misc/NEWS | 7 +++++++ 3 files changed, 14 insertions(+), 2 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -41,7 +41,7 @@ argument A value passed to a :term:`function` (or :term:`method`) when calling the - function. There are two types of arguments: + function. There are two kinds of argument: * :dfn:`keyword argument`: an argument preceded by an identifier (e.g. ``name=``) in a function call or passed as a value in a dictionary @@ -601,7 +601,7 @@ parameter A named entity in a :term:`function` (or method) definition that specifies an :term:`argument` (or in some cases, arguments) that the - function can accept. There are five types of parameters: + function can accept. There are five kinds of parameter: * :dfn:`positional-or-keyword`: specifies an argument that can be passed either :term:`positionally ` or as a :term:`keyword argument @@ -615,6 +615,8 @@ parameters. However, some built-in functions have positional-only parameters (e.g. :func:`abs`). + .. _keyword-only_parameter: + * :dfn:`keyword-only`: specifies an argument that can be supplied only by keyword. Keyword-only parameters can be defined by including a single var-positional parameter or bare ``*`` in the parameter list diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1149,6 +1149,9 @@ fail, the entire sort operation will fail (and the list will likely be left in a partially modified state). + :meth:`sort` accepts two arguments that can only be passed by keyword + (:ref:`keyword-only arguments `): + *key* specifies a function of one argument that is used to extract a comparison key from each list element (for example, ``key=str.lower``). The key corresponding to each item in the list is calculated once and diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -351,6 +351,13 @@ - Issue 19384: Fix test_py_compile for root user, patch by Claudiu Popa. +Documentation +------------- + +- Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, + attempt to reduce confusion in the glossary by not saying there are + different "types" of arguments and parameters. 
+ Build ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:04:37 2013 From: python-checkins at python.org (zach.ware) Date: Fri, 22 Nov 2013 21:04:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dR7s14b1lz7LmH@mail.python.org> http://hg.python.org/cpython/rev/e8333e91e517 changeset: 87367:e8333e91e517 parent: 87366:3f1c332c5e2e parent: 87364:db6ae01a5f7f user: Zachary Ware date: Fri Nov 22 14:04:01 2013 -0600 summary: Merge heads files: Doc/library/asynchat.rst | 3 +++ Doc/library/asyncore.rst | 3 +++ Doc/library/concurrency.rst | 3 --- Doc/library/ipc.rst | 3 +++ 4 files changed, 9 insertions(+), 3 deletions(-) diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,6 +10,9 @@ -------------- +Note: This module exists for backwards compatibility only. For new code we +recommend using :module:`asyncio`. + This module builds on the :mod:`asyncore` infrastructure, simplifying asynchronous clients and servers and making it easier to handle protocols whose elements are terminated by arbitrary strings, or are of variable length. diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,6 +13,9 @@ -------------- +Note: This module exists for backwards compatibility only. For new code we +recommend using :module:`asyncio`. + This module provides the basic infrastructure for writing asynchronous socket service clients and servers. diff --git a/Doc/library/concurrency.rst b/Doc/library/concurrency.rst --- a/Doc/library/concurrency.rst +++ b/Doc/library/concurrency.rst @@ -20,9 +20,6 @@ subprocess.rst sched.rst queue.rst - select.rst - selectors.rst - asyncio.rst The following are support modules for some of the above services: diff --git a/Doc/library/ipc.rst b/Doc/library/ipc.rst --- a/Doc/library/ipc.rst +++ b/Doc/library/ipc.rst @@ -18,6 +18,9 @@ socket.rst ssl.rst + select.rst + selectors.rst + asyncio.rst asyncore.rst asynchat.rst signal.rst -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:27:52 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 22 Nov 2013 21:27:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_markup_of_notes_recomm?= =?utf-8?q?ending_asyncio=2E?= Message-ID: <3dR8Mr6Dmjz7Lkm@mail.python.org> http://hg.python.org/cpython/rev/f3eb4a1b27c9 changeset: 87368:f3eb4a1b27c9 user: Guido van Rossum date: Fri Nov 22 12:27:45 2013 -0800 summary: Fix markup of notes recommending asyncio. files: Doc/library/asynchat.rst | 6 ++++-- Doc/library/asyncore.rst | 6 ++++-- 2 files changed, 8 insertions(+), 4 deletions(-) diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,8 +10,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. 
This module builds on the :mod:`asyncore` infrastructure, simplifying asynchronous clients and servers and making it easier to handle protocols diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,8 +13,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. This module provides the basic infrastructure for writing asynchronous socket service clients and servers. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:31:57 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 21:31:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_19555_for_distutils?= =?utf-8?q?=2C_plus_a_little_clean_up_=28pyflakes=2C_line_lengths=29=2E?= Message-ID: <3dR8SY42D1z7LkG@mail.python.org> http://hg.python.org/cpython/rev/8a130fd92255 changeset: 87369:8a130fd92255 parent: 87364:db6ae01a5f7f user: Barry Warsaw date: Fri Nov 22 15:31:35 2013 -0500 summary: Issue 19555 for distutils, plus a little clean up (pyflakes, line lengths). files: Lib/distutils/sysconfig.py | 8 ++ Lib/distutils/tests/test_sysconfig.py | 43 +++++++++++--- Misc/NEWS | 7 +- 3 files changed, 44 insertions(+), 14 deletions(-) diff --git a/Lib/distutils/sysconfig.py b/Lib/distutils/sysconfig.py --- a/Lib/distutils/sysconfig.py +++ b/Lib/distutils/sysconfig.py @@ -518,6 +518,11 @@ _config_vars['prefix'] = PREFIX _config_vars['exec_prefix'] = EXEC_PREFIX + # For backward compatibility, see issue19555 + SO = _config_vars.get('EXT_SUFFIX') + if SO is not None: + _config_vars['SO'] = SO + # Always convert srcdir to an absolute path srcdir = _config_vars.get('srcdir', project_base) if os.name == 'posix': @@ -568,4 +573,7 @@ returned by 'get_config_vars()'. Equivalent to get_config_vars().get(name) """ + if name == 'SO': + import warnings + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) return get_config_vars().get(name) diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -1,7 +1,6 @@ """Tests for distutils.sysconfig.""" import os import shutil -import test import unittest from distutils import sysconfig @@ -9,8 +8,7 @@ from distutils.tests import support from test.support import TESTFN, run_unittest -class SysconfigTestCase(support.EnvironGuard, - unittest.TestCase): +class SysconfigTestCase(support.EnvironGuard, unittest.TestCase): def setUp(self): super(SysconfigTestCase, self).setUp() self.makefile = None @@ -32,7 +30,6 @@ self.assertTrue(os.path.isfile(config_h), config_h) def test_get_python_lib(self): - lib_dir = sysconfig.get_python_lib() # XXX doesn't work on Linux when Python was never installed before #self.assertTrue(os.path.isdir(lib_dir), lib_dir) # test for pythonxx.lib? 
@@ -67,8 +64,9 @@ self.assertTrue(os.path.exists(Python_h), Python_h) self.assertTrue(sysconfig._is_python_source_dir(srcdir)) elif os.name == 'posix': - self.assertEqual(os.path.dirname(sysconfig.get_makefile_filename()), - srcdir) + self.assertEqual( + os.path.dirname(sysconfig.get_makefile_filename()), + srcdir) def test_srcdir_independent_of_cwd(self): # srcdir should be independent of the current working directory @@ -129,10 +127,13 @@ def test_sysconfig_module(self): import sysconfig as global_sysconfig - self.assertEqual(global_sysconfig.get_config_var('CFLAGS'), sysconfig.get_config_var('CFLAGS')) - self.assertEqual(global_sysconfig.get_config_var('LDFLAGS'), sysconfig.get_config_var('LDFLAGS')) + self.assertEqual(global_sysconfig.get_config_var('CFLAGS'), + sysconfig.get_config_var('CFLAGS')) + self.assertEqual(global_sysconfig.get_config_var('LDFLAGS'), + sysconfig.get_config_var('LDFLAGS')) - @unittest.skipIf(sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'),'compiler flags customized') + @unittest.skipIf(sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'), + 'compiler flags customized') def test_sysconfig_compiler_vars(self): # On OS X, binary installers support extension module building on # various levels of the operating system with differing Xcode @@ -151,9 +152,29 @@ import sysconfig as global_sysconfig if sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'): return - self.assertEqual(global_sysconfig.get_config_var('LDSHARED'), sysconfig.get_config_var('LDSHARED')) - self.assertEqual(global_sysconfig.get_config_var('CC'), sysconfig.get_config_var('CC')) + self.assertEqual(global_sysconfig.get_config_var('LDSHARED'), + sysconfig.get_config_var('LDSHARED')) + self.assertEqual(global_sysconfig.get_config_var('CC'), + sysconfig.get_config_var('CC')) + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_deprecation(self): + self.assertWarns(DeprecationWarning, + sysconfig.get_config_var, 'SO') + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_value(self): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_in_vars(self): + vars = sysconfig.get_config_vars() + self.assertIsNotNone(vars['SO']) + self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) def test_suite(): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,7 +10,8 @@ Core and Builtins ----------------- -- Use the repr of a module name in more places in import, especially exceptions. +- Use the repr of a module name in more places in import, especially + exceptions. - Issue #19619: str.encode, bytes.decode and bytearray.decode now use an internal API to throw LookupError for known non-text encodings, rather @@ -79,8 +80,8 @@ CRL enumeration are now two functions. enum_certificates() also returns purpose flags as set of OIDs. -- Issue #19555: Restore sysconfig.get_config_var('SO'), with a - DeprecationWarning pointing people at $EXT_SUFFIX. +- Issue #19555: Restore sysconfig.get_config_var('SO'), (and the distutils + equivalent) with a DeprecationWarning pointing people at $EXT_SUFFIX. 
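The restored behaviour can be exercised directly; a small sketch, assuming a build where ``EXT_SUFFIX`` is defined (the deprecated ``'SO'`` spelling warns but still returns the same value)::

    import warnings
    from distutils import sysconfig

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter('always')
        so = sysconfig.get_config_var('SO')            # deprecated spelling

    assert so == sysconfig.get_config_var('EXT_SUFFIX')
    assert any(issubclass(w.category, DeprecationWarning) for w in caught)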
- Issue #8813: Add SSLContext.verify_flags to change the verification flags of the context in order to enable certification revocation list (CRL) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:31:58 2013 From: python-checkins at python.org (barry.warsaw) Date: Fri, 22 Nov 2013 21:31:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_trunk_merge?= Message-ID: <3dR8SZ73Cyz7LmR@mail.python.org> http://hg.python.org/cpython/rev/307122160def changeset: 87370:307122160def parent: 87369:8a130fd92255 parent: 87368:f3eb4a1b27c9 user: Barry Warsaw date: Fri Nov 22 15:31:49 2013 -0500 summary: trunk merge files: Doc/glossary.rst | 6 ++++-- Doc/library/asynchat.rst | 6 ++++-- Doc/library/asyncore.rst | 6 ++++-- Doc/library/stdtypes.rst | 3 +++ Misc/NEWS | 7 +++++++ 5 files changed, 22 insertions(+), 6 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -41,7 +41,7 @@ argument A value passed to a :term:`function` (or :term:`method`) when calling the - function. There are two types of arguments: + function. There are two kinds of argument: * :dfn:`keyword argument`: an argument preceded by an identifier (e.g. ``name=``) in a function call or passed as a value in a dictionary @@ -601,7 +601,7 @@ parameter A named entity in a :term:`function` (or method) definition that specifies an :term:`argument` (or in some cases, arguments) that the - function can accept. There are five types of parameters: + function can accept. There are five kinds of parameter: * :dfn:`positional-or-keyword`: specifies an argument that can be passed either :term:`positionally ` or as a :term:`keyword argument @@ -615,6 +615,8 @@ parameters. However, some built-in functions have positional-only parameters (e.g. :func:`abs`). + .. _keyword-only_parameter: + * :dfn:`keyword-only`: specifies an argument that can be supplied only by keyword. Keyword-only parameters can be defined by including a single var-positional parameter or bare ``*`` in the parameter list diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,8 +10,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. This module builds on the :mod:`asyncore` infrastructure, simplifying asynchronous clients and servers and making it easier to handle protocols diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,8 +13,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. This module provides the basic infrastructure for writing asynchronous socket service clients and servers. diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1149,6 +1149,9 @@ fail, the entire sort operation will fail (and the list will likely be left in a partially modified state). 
+ :meth:`sort` accepts two arguments that can only be passed by keyword + (:ref:`keyword-only arguments `): + *key* specifies a function of one argument that is used to extract a comparison key from each list element (for example, ``key=str.lower``). The key corresponding to each item in the list is calculated once and diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -352,6 +352,13 @@ - Issue 19384: Fix test_py_compile for root user, patch by Claudiu Popa. +Documentation +------------- + +- Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, + attempt to reduce confusion in the glossary by not saying there are + different "types" of arguments and parameters. + Build ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 21:45:17 2013 From: python-checkins at python.org (andrew.kuchling) Date: Fri, 22 Nov 2013 21:45:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Wording_changes_to_pathlib?= =?utf-8?q?_docs=2E?= Message-ID: <3dR8lx362Kz7LjM@mail.python.org> http://hg.python.org/cpython/rev/cce14bc9b675 changeset: 87371:cce14bc9b675 user: Andrew Kuchling date: Fri Nov 22 15:45:02 2013 -0500 summary: Wording changes to pathlib docs. Only possibly-controversial change: joinpath() was described as: "Calling this method is equivalent to indexing the path with each of the *other* arguments in turn." 'Indexing' is an odd word to use, because you can't subscript Path or PurePath objects, so I changed it to "combining". files: Doc/library/pathlib.rst | 7 +++---- 1 files changed, 3 insertions(+), 4 deletions(-) diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -20,7 +20,7 @@ a :ref:`concrete path ` for the current platform. .. note:: - This module module has been included in the standard library on a + This module has been included in the standard library on a :term:`provisional basis `. Backwards incompatible changes (up to and including removal of the package) may occur if deemed necessary by the core developers. @@ -250,7 +250,7 @@ Methods and properties ^^^^^^^^^^^^^^^^^^^^^^ -Pure paths provide the following methods an properties: +Pure paths provide the following methods and properties: .. data:: PurePath.drive @@ -454,7 +454,7 @@ .. method:: PurePath.joinpath(*other) - Calling this method is equivalent to indexing the path with each of + Calling this method is equivalent to combining the path with each of the *other* arguments in turn:: >>> PurePosixPath('/etc').joinpath('passwd') @@ -871,4 +871,3 @@ Remove this file or symbolic link. If the path points to a directory, use :func:`Path.rmdir` instead. - -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:01:11 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 22 Nov 2013 22:01:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319724=3A_clear_ou?= =?utf-8?q?t_colliding_temp_module=2E?= Message-ID: <3dR96H5Pr1z7LlL@mail.python.org> http://hg.python.org/cpython/rev/59f3e50061bd changeset: 87372:59f3e50061bd parent: 87367:e8333e91e517 user: Eric Snow date: Fri Nov 22 13:55:23 2013 -0700 summary: Issue #19724: clear out colliding temp module. 
files: Lib/test/test_pkgutil.py | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_pkgutil.py b/Lib/test/test_pkgutil.py --- a/Lib/test/test_pkgutil.py +++ b/Lib/test/test_pkgutil.py @@ -200,6 +200,8 @@ dirname = self.create_init(pkgname) pathitem = os.path.join(dirname, pkgname) fullname = '{}.{}'.format(pkgname, modname) + sys.modules.pop(fullname, None) + sys.modules.pop(pkgname, None) try: self.create_submodule(dirname, pkgname, modname, 0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:01:13 2013 From: python-checkins at python.org (eric.snow) Date: Fri, 22 Nov 2013 22:01:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?b?KTogTWVyZ2UgaGVhZHMu?= Message-ID: <3dR96K0xpTz7LlR@mail.python.org> http://hg.python.org/cpython/rev/64ff3457010c changeset: 87373:64ff3457010c parent: 87371:cce14bc9b675 parent: 87372:59f3e50061bd user: Eric Snow date: Fri Nov 22 13:55:59 2013 -0700 summary: Merge heads. files: Lib/test/test_pkgutil.py | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_pkgutil.py b/Lib/test/test_pkgutil.py --- a/Lib/test/test_pkgutil.py +++ b/Lib/test/test_pkgutil.py @@ -200,6 +200,8 @@ dirname = self.create_init(pkgname) pathitem = os.path.join(dirname, pkgname) fullname = '{}.{}'.format(pkgname, modname) + sys.modules.pop(fullname, None) + sys.modules.pop(pkgname, None) try: self.create_submodule(dirname, pkgname, modname, 0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:14:24 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 22:14:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319718=3A_Add_a_ca?= =?utf-8?q?se-insensitive_FS_check_to_test=2Esupport_to_use?= Message-ID: <3dR9PX6Ghrz7LjW@mail.python.org> http://hg.python.org/cpython/rev/cd09766bb18f changeset: 87374:cd09766bb18f parent: 87364:db6ae01a5f7f user: Brett Cannon date: Fri Nov 22 16:14:10 2013 -0500 summary: Issue #19718: Add a case-insensitive FS check to test.support to use in test_pathlib. Purposefully designed to work from a specified directory in case multiple file systems are used on the system. files: Lib/test/support/__init__.py | 16 +++++++++++++++- Lib/test/test_pathlib.py | 8 ++++++-- 2 files changed, 21 insertions(+), 3 deletions(-) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -76,7 +76,7 @@ "captured_stdin", "captured_stderr", # filesystem "TESTFN", "SAVEDCWD", "unlink", "rmtree", "temp_cwd", "findfile", - "create_empty_file", "can_symlink", + "create_empty_file", "can_symlink", "fs_is_case_insensitive", # unittest "is_resource_enabled", "requires", "requires_freebsd_version", "requires_linux_version", "requires_mac_ver", "check_syntax_error", @@ -2045,6 +2045,20 @@ return test if ok else unittest.skip(msg)(test) +def fs_is_case_insensitive(directory): + """Detects if the file system for the specified directory is case-insensitive.""" + base_fp, base_path = tempfile.mkstemp(dir=directory) + case_path = base_path.upper() + if case_path == base_path: + case_path = base_path.lower() + try: + return os.path.samefile(base_path, case_path) + except FileNotFoundError: + return False + finally: + os.unlink(base_path) + + class SuppressCrashReport: """Try to prevent a crash report from popping up. 
diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1623,12 +1623,16 @@ def test_glob(self): P = self.cls p = P(BASE) - self.assertEqual(set(p.glob("FILEa")), set()) + given = set(p.glob("FILEa")) + expect = set() if not support.fs_is_case_insensitive(BASE) else given + self.assertEqual(given, expect) def test_rglob(self): P = self.cls p = P(BASE, "dirC") - self.assertEqual(set(p.rglob("FILEd")), set()) + given = set(p.rglob("FILEd")) + expect = set() if not support.fs_is_case_insensitive(BASE) else given + self.assertEqual(given, expect) @only_nt -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:14:26 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 22 Nov 2013 22:14:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dR9PZ3RgJz7LnV@mail.python.org> http://hg.python.org/cpython/rev/830bb51eb31c changeset: 87375:830bb51eb31c parent: 87374:cd09766bb18f parent: 87373:64ff3457010c user: Brett Cannon date: Fri Nov 22 16:14:24 2013 -0500 summary: merge files: Doc/glossary.rst | 6 +- Doc/library/asynchat.rst | 6 +- Doc/library/asyncore.rst | 6 +- Doc/library/pathlib.rst | 7 +- Doc/library/stdtypes.rst | 3 + Lib/distutils/sysconfig.py | 8 ++ Lib/distutils/tests/test_sysconfig.py | 43 +++++++++++--- Lib/test/test_pkgutil.py | 2 + Misc/NEWS | 14 +++- 9 files changed, 71 insertions(+), 24 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -41,7 +41,7 @@ argument A value passed to a :term:`function` (or :term:`method`) when calling the - function. There are two types of arguments: + function. There are two kinds of argument: * :dfn:`keyword argument`: an argument preceded by an identifier (e.g. ``name=``) in a function call or passed as a value in a dictionary @@ -601,7 +601,7 @@ parameter A named entity in a :term:`function` (or method) definition that specifies an :term:`argument` (or in some cases, arguments) that the - function can accept. There are five types of parameters: + function can accept. There are five kinds of parameter: * :dfn:`positional-or-keyword`: specifies an argument that can be passed either :term:`positionally ` or as a :term:`keyword argument @@ -615,6 +615,8 @@ parameters. However, some built-in functions have positional-only parameters (e.g. :func:`abs`). + .. _keyword-only_parameter: + * :dfn:`keyword-only`: specifies an argument that can be supplied only by keyword. Keyword-only parameters can be defined by including a single var-positional parameter or bare ``*`` in the parameter list diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,8 +10,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. This module builds on the :mod:`asyncore` infrastructure, simplifying asynchronous clients and servers and making it easier to handle protocols diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,8 +13,10 @@ -------------- -Note: This module exists for backwards compatibility only. For new code we -recommend using :module:`asyncio`. +.. 
note:: + + This module exists for backwards compatibility only. For new code we + recommend using :mod:`asyncio`. This module provides the basic infrastructure for writing asynchronous socket service clients and servers. diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -20,7 +20,7 @@ a :ref:`concrete path ` for the current platform. .. note:: - This module module has been included in the standard library on a + This module has been included in the standard library on a :term:`provisional basis `. Backwards incompatible changes (up to and including removal of the package) may occur if deemed necessary by the core developers. @@ -250,7 +250,7 @@ Methods and properties ^^^^^^^^^^^^^^^^^^^^^^ -Pure paths provide the following methods an properties: +Pure paths provide the following methods and properties: .. data:: PurePath.drive @@ -454,7 +454,7 @@ .. method:: PurePath.joinpath(*other) - Calling this method is equivalent to indexing the path with each of + Calling this method is equivalent to combining the path with each of the *other* arguments in turn:: >>> PurePosixPath('/etc').joinpath('passwd') @@ -871,4 +871,3 @@ Remove this file or symbolic link. If the path points to a directory, use :func:`Path.rmdir` instead. - diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1149,6 +1149,9 @@ fail, the entire sort operation will fail (and the list will likely be left in a partially modified state). + :meth:`sort` accepts two arguments that can only be passed by keyword + (:ref:`keyword-only arguments `): + *key* specifies a function of one argument that is used to extract a comparison key from each list element (for example, ``key=str.lower``). The key corresponding to each item in the list is calculated once and diff --git a/Lib/distutils/sysconfig.py b/Lib/distutils/sysconfig.py --- a/Lib/distutils/sysconfig.py +++ b/Lib/distutils/sysconfig.py @@ -518,6 +518,11 @@ _config_vars['prefix'] = PREFIX _config_vars['exec_prefix'] = EXEC_PREFIX + # For backward compatibility, see issue19555 + SO = _config_vars.get('EXT_SUFFIX') + if SO is not None: + _config_vars['SO'] = SO + # Always convert srcdir to an absolute path srcdir = _config_vars.get('srcdir', project_base) if os.name == 'posix': @@ -568,4 +573,7 @@ returned by 'get_config_vars()'. Equivalent to get_config_vars().get(name) """ + if name == 'SO': + import warnings + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) return get_config_vars().get(name) diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -1,7 +1,6 @@ """Tests for distutils.sysconfig.""" import os import shutil -import test import unittest from distutils import sysconfig @@ -9,8 +8,7 @@ from distutils.tests import support from test.support import TESTFN, run_unittest -class SysconfigTestCase(support.EnvironGuard, - unittest.TestCase): +class SysconfigTestCase(support.EnvironGuard, unittest.TestCase): def setUp(self): super(SysconfigTestCase, self).setUp() self.makefile = None @@ -32,7 +30,6 @@ self.assertTrue(os.path.isfile(config_h), config_h) def test_get_python_lib(self): - lib_dir = sysconfig.get_python_lib() # XXX doesn't work on Linux when Python was never installed before #self.assertTrue(os.path.isdir(lib_dir), lib_dir) # test for pythonxx.lib? 
@@ -67,8 +64,9 @@ self.assertTrue(os.path.exists(Python_h), Python_h) self.assertTrue(sysconfig._is_python_source_dir(srcdir)) elif os.name == 'posix': - self.assertEqual(os.path.dirname(sysconfig.get_makefile_filename()), - srcdir) + self.assertEqual( + os.path.dirname(sysconfig.get_makefile_filename()), + srcdir) def test_srcdir_independent_of_cwd(self): # srcdir should be independent of the current working directory @@ -129,10 +127,13 @@ def test_sysconfig_module(self): import sysconfig as global_sysconfig - self.assertEqual(global_sysconfig.get_config_var('CFLAGS'), sysconfig.get_config_var('CFLAGS')) - self.assertEqual(global_sysconfig.get_config_var('LDFLAGS'), sysconfig.get_config_var('LDFLAGS')) + self.assertEqual(global_sysconfig.get_config_var('CFLAGS'), + sysconfig.get_config_var('CFLAGS')) + self.assertEqual(global_sysconfig.get_config_var('LDFLAGS'), + sysconfig.get_config_var('LDFLAGS')) - @unittest.skipIf(sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'),'compiler flags customized') + @unittest.skipIf(sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'), + 'compiler flags customized') def test_sysconfig_compiler_vars(self): # On OS X, binary installers support extension module building on # various levels of the operating system with differing Xcode @@ -151,9 +152,29 @@ import sysconfig as global_sysconfig if sysconfig.get_config_var('CUSTOMIZED_OSX_COMPILER'): return - self.assertEqual(global_sysconfig.get_config_var('LDSHARED'), sysconfig.get_config_var('LDSHARED')) - self.assertEqual(global_sysconfig.get_config_var('CC'), sysconfig.get_config_var('CC')) + self.assertEqual(global_sysconfig.get_config_var('LDSHARED'), + sysconfig.get_config_var('LDSHARED')) + self.assertEqual(global_sysconfig.get_config_var('CC'), + sysconfig.get_config_var('CC')) + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_deprecation(self): + self.assertWarns(DeprecationWarning, + sysconfig.get_config_var, 'SO') + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_value(self): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) + + @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, + 'EXT_SUFFIX required for this test') + def test_SO_in_vars(self): + vars = sysconfig.get_config_vars() + self.assertIsNotNone(vars['SO']) + self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) def test_suite(): diff --git a/Lib/test/test_pkgutil.py b/Lib/test/test_pkgutil.py --- a/Lib/test/test_pkgutil.py +++ b/Lib/test/test_pkgutil.py @@ -200,6 +200,8 @@ dirname = self.create_init(pkgname) pathitem = os.path.join(dirname, pkgname) fullname = '{}.{}'.format(pkgname, modname) + sys.modules.pop(fullname, None) + sys.modules.pop(pkgname, None) try: self.create_submodule(dirname, pkgname, modname, 0) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,7 +10,8 @@ Core and Builtins ----------------- -- Use the repr of a module name in more places in import, especially exceptions. +- Use the repr of a module name in more places in import, especially + exceptions. - Issue #19619: str.encode, bytes.decode and bytearray.decode now use an internal API to throw LookupError for known non-text encodings, rather @@ -79,8 +80,8 @@ CRL enumeration are now two functions. enum_certificates() also returns purpose flags as set of OIDs. 
-- Issue #19555: Restore sysconfig.get_config_var('SO'), with a - DeprecationWarning pointing people at $EXT_SUFFIX. +- Issue #19555: Restore sysconfig.get_config_var('SO'), (and the distutils + equivalent) with a DeprecationWarning pointing people at $EXT_SUFFIX. - Issue #8813: Add SSLContext.verify_flags to change the verification flags of the context in order to enable certification revocation list (CRL) @@ -351,6 +352,13 @@ - Issue 19384: Fix test_py_compile for root user, patch by Claudiu Popa. +Documentation +------------- + +- Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, + attempt to reduce confusion in the glossary by not saying there are + different "types" of arguments and parameters. + Build ----- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:15:50 2013 From: python-checkins at python.org (andrew.kuchling) Date: Fri, 22 Nov 2013 22:15:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Wording_changes?= Message-ID: <3dR9RB0jF0z7Ln7@mail.python.org> http://hg.python.org/cpython/rev/f7e9e9932a14 changeset: 87376:f7e9e9932a14 user: Andrew Kuchling date: Fri Nov 22 16:15:28 2013 -0500 summary: Wording changes files: Doc/library/selectors.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/library/selectors.rst b/Doc/library/selectors.rst --- a/Doc/library/selectors.rst +++ b/Doc/library/selectors.rst @@ -78,12 +78,12 @@ .. attribute:: events - Events that must be waited for this file object. + Events that must be waited for on this file object. .. attribute:: data Optional opaque data associated to this file object: for example, this - could be used to store per-client session. + could be used to store a per-client session ID. .. class:: BaseSelector @@ -122,7 +122,7 @@ .. method:: modify(fileobj, events, data=None) - Change a registered file object monitored events or attached data. + Change a registered file object's monitored events or attached data. This is equivalent to :meth:`BaseSelector.unregister(fileobj)` followed by :meth:`BaseSelector.register(fileobj, events, data)`, except that it @@ -143,7 +143,7 @@ If *timeout* is ``None``, the call will block until a monitored file object becomes ready. - This returns a list of ``(key, events)`` tuple, one for each ready file + This returns a list of ``(key, events)`` tuples, one for each ready file object. *key* is the :class:`SelectorKey` instance corresponding to a ready file @@ -159,7 +159,7 @@ .. method:: get_key(fileobj) - Return the key associated to a registered file object. + Return the key associated with a registered file object. This returns the :class:`SelectorKey` instance associated to this file object, or raises :exc:`KeyError` if the file object is not registered. 
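As a concrete illustration of the API being reworded above, a minimal sketch that registers a non-blocking listening socket and polls it once (the address and the ``data`` payload are arbitrary)::

    import selectors
    import socket

    sel = selectors.DefaultSelector()
    server = socket.socket()
    server.bind(('127.0.0.1', 0))       # any free port
    server.listen(5)
    server.setblocking(False)

    # 'data' can carry anything, e.g. a per-client session object
    skey = sel.register(server, selectors.EVENT_READ, data='accept')
    assert sel.get_key(server) == skey

    for key, events in sel.select(timeout=0.1):   # list of (key, events) tuples
        if key.data == 'accept':
            conn, addr = key.fileobj.accept()
            conn.close()

    sel.unregister(server)
    sel.close()
    server.close()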
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 22:26:12 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 22:26:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319718=3A_add_one_?= =?utf-8?q?more_globbing_test_under_POSIX?= Message-ID: <3dR9g816N4zQyZ@mail.python.org> http://hg.python.org/cpython/rev/4734f6e2fd2f changeset: 87377:4734f6e2fd2f user: Antoine Pitrou date: Fri Nov 22 22:26:01 2013 +0100 summary: Issue #19718: add one more globbing test under POSIX files: Lib/test/test_pathlib.py | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1626,6 +1626,7 @@ given = set(p.glob("FILEa")) expect = set() if not support.fs_is_case_insensitive(BASE) else given self.assertEqual(given, expect) + self.assertEqual(set(p.glob("FILEa*")), set()) def test_rglob(self): P = self.cls @@ -1633,6 +1634,7 @@ given = set(p.rglob("FILEd")) expect = set() if not support.fs_is_case_insensitive(BASE) else given self.assertEqual(given, expect) + self.assertEqual(set(p.rglob("FILEd*")), set()) @only_nt -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 22 23:20:16 2013 From: python-checkins at python.org (antoine.pitrou) Date: Fri, 22 Nov 2013 23:20:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Try_to_debug_issue_=231971?= =?utf-8?q?5?= Message-ID: <3dRBsX37Rjz7LmW@mail.python.org> http://hg.python.org/cpython/rev/4101bfaa76fe changeset: 87378:4101bfaa76fe user: Antoine Pitrou date: Fri Nov 22 23:20:08 2013 +0100 summary: Try to debug issue #19715 files: Lib/test/test_pathlib.py | 8 ++++++-- 1 files changed, 6 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1382,13 +1382,17 @@ self.assertFalse(p.exists()) p.touch() self.assertTrue(p.exists()) - old_mtime = p.stat().st_mtime + st = p.stat() + old_mtime = st.st_mtime + old_mtime_ns = st.st_mtime_ns # Rewind the mtime sufficiently far in the past to work around # filesystem-specific timestamp granularity. 
os.utime(str(p), (old_mtime - 10, old_mtime - 10)) # The file mtime is refreshed by calling touch() again p.touch() - self.assertGreaterEqual(p.stat().st_mtime, old_mtime) + st = p.stat() + self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) + self.assertGreaterEqual(st.st_mtime, old_mtime) p = P / 'newfileB' self.assertFalse(p.exists()) p.touch(mode=0o700, exist_ok=False) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 00:18:14 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 23 Nov 2013 00:18:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319715=3A_Ensure_t?= =?utf-8?q?hat_consecutive_calls_to_monotonic=28=29_are_monotonic?= Message-ID: <3dRD8Q3wrZz7LlL@mail.python.org> http://hg.python.org/cpython/rev/716e41100553 changeset: 87379:716e41100553 user: Victor Stinner date: Sat Nov 23 00:15:27 2013 +0100 summary: Issue #19715: Ensure that consecutive calls to monotonic() are monotonic files: Lib/test/test_time.py | 9 +++++++++ 1 files changed, 9 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -367,6 +367,14 @@ @unittest.skipUnless(hasattr(time, 'monotonic'), 'need time.monotonic') def test_monotonic(self): + # monotonic() should not go backward + times = [time.monotonic() for n in range(100)] + t1 = times[0] + for t2 in times[1:]: + self.assertGreaterEqual(t2, t1, "times=%s" % times) + t1 = t2 + + # monotonic() includes time elapsed during a sleep t1 = time.monotonic() time.sleep(0.5) t2 = time.monotonic() @@ -374,6 +382,7 @@ self.assertGreater(t2, t1) self.assertAlmostEqual(dt, 0.5, delta=0.2) + # monotonic() is a monotonic but non adjustable clock info = time.get_clock_info('monotonic') self.assertTrue(info.monotonic) self.assertFalse(info.adjustable) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 00:34:33 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 00:34:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319291=3A_add_crud?= =?utf-8?q?e_stubs_to_the_asyncio_docs?= Message-ID: <3dRDWF2QjKz7LlL@mail.python.org> http://hg.python.org/cpython/rev/e4b7377a690a changeset: 87380:e4b7377a690a user: Antoine Pitrou date: Sat Nov 23 00:34:26 2013 +0100 summary: Issue #19291: add crude stubs to the asyncio docs files: Doc/library/asyncio.rst | 99 +++++++++++++++++++++++++++- 1 files changed, 94 insertions(+), 5 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -7,14 +7,103 @@ .. versionadded:: 3.4 -Introduction ------------- +This module provides infrastructure for writing single-threaded concurrent +code using coroutines, multiplexing I/O access over sockets and other +resources, running network clients and servers, and other related primitives. -This package includes a pluggable event loop, transport and protocol -abstractions similar to those in Twisted, and a higher-level scheduler -for coroutines and tasks based on ``yield from`` (:PEP:`380`). 
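The guarantees exercised by the new ``test_monotonic`` above boil down to a few lines; a quick illustration::

    import time

    t1 = time.monotonic()
    time.sleep(0.5)
    t2 = time.monotonic()
    assert t2 >= t1                  # never goes backwards, even if the wall clock is adjusted
    print('elapsed:', t2 - t1)       # roughly 0.5 seconds

    info = time.get_clock_info('monotonic')
    print(info.monotonic, info.adjustable)   # True False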
+Here is a more detailed list of the package contents: + +* a pluggable :ref:`event loop ` with various system-specific + implementations; + +* :ref:`transport ` and :ref:`protocol ` abstractions + (similar to those in `Twisted `_); + +* concrete support for TCP, UDP, SSL, subprocess pipes, delayed calls, and + others (some may be system-dependent); + +* a Future class that mimicks the one in the :mod:`concurrent.futures` module, + but adapted for use with the event loop; + +* coroutines and tasks based on ``yield from`` (:PEP:`380`), to help write + concurrent code in a sequential fashion; + +* cancellation support for Futures and coroutines; + +* :ref:`synchronization primitives ` for use between coroutines in + a single thread, mimicking those in the :mod:`threading` module; + + +Disclaimer +---------- Full documentation is not yet ready; we hope to have it written before Python 3.4 leaves beta. Until then, the best reference is :PEP:`3156`. For a motivational primer on transports and protocols, see :PEP:`3153`. + + +.. XXX should the asyncio documentation come in several pages, as for logging? + + +.. _event-loop: + +Event loops +----------- + + +.. _protocol: + +Protocols +--------- + + +.. _transport: + +Transports +---------- + + +.. _sync: + +Synchronization primitives +-------------------------- + + +Examples +-------- + +A :class:`Protocol` implementing an echo server:: + + class EchoServer(asyncio.Protocol): + + TIMEOUT = 5.0 + + def timeout(self): + print('connection timeout, closing.') + self.transport.close() + + def connection_made(self, transport): + print('connection made') + self.transport = transport + + # start 5 seconds timeout timer + self.h_timeout = asyncio.get_event_loop().call_later( + self.TIMEOUT, self.timeout) + + def data_received(self, data): + print('data received: ', data.decode()) + self.transport.write(b'Re: ' + data) + + # restart timeout timer + self.h_timeout.cancel() + self.h_timeout = asyncio.get_event_loop().call_later( + self.TIMEOUT, self.timeout) + + def eof_received(self): + pass + + def connection_lost(self, exc): + print('connection lost:', exc) + self.h_timeout.cancel() + -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 00:45:12 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 00:45:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Mention_threadpool_interfa?= =?utf-8?q?ce_in_asyncio_overview=2E?= Message-ID: <3dRDlX6hHMz7Lkw@mail.python.org> http://hg.python.org/cpython/rev/ba1996038d0c changeset: 87381:ba1996038d0c user: Guido van Rossum date: Fri Nov 22 15:45:02 2013 -0800 summary: Mention threadpool interface in asyncio overview. files: Doc/library/asyncio.rst | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -33,6 +33,10 @@ * :ref:`synchronization primitives ` for use between coroutines in a single thread, mimicking those in the :mod:`threading` module; +* an interface for passing work off to a threadpool, for times when + you absolutely, positively have to use a library that makes blocking + I/O calls. 
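The threadpool hand-off mentioned in the new bullet goes through the event loop; a short sketch in which ``time.sleep`` stands in for a blocking library call::

    import asyncio
    import time

    def blocking_io():
        time.sleep(1)                # pretend this is a blocking library call
        return 'done'

    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(
        loop.run_in_executor(None, blocking_io))   # None selects the default thread pool
    print(result)                    # 'done'
    loop.close()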
+ Disclaimer ---------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 01:08:49 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 01:08:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Start_documenting_protocol?= =?utf-8?q?s?= Message-ID: <3dRFGn2WJrz7Ll7@mail.python.org> http://hg.python.org/cpython/rev/f0bf7a5adbfc changeset: 87382:f0bf7a5adbfc user: Antoine Pitrou date: Sat Nov 23 01:08:43 2013 +0100 summary: Start documenting protocols files: Doc/library/asyncio.rst | 130 ++++++++++++++++++++++++++++ 1 files changed, 130 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -61,6 +61,136 @@ Protocols --------- +:mod:`asyncio` provides base classes that you can subclass to implement +your network protocols. Those classes are used in conjunction with +:ref:`transports ` (see below): the protocol parses incoming +data and asks for the writing of outgoing data, while the transport is +responsible for the actual I/O and buffering. + +When subclassing a protocol class, it is recommended you override certain +methods. Those methods are callbacks: they will be called by the transport +on certain events (for example when some data is received); you shouldn't +call them yourself, unless you are implementing a transport. + + +Protocol classes +^^^^^^^^^^^^^^^^ + +.. class:: Protocol + + The base class for implementing streaming protocols (for use with + e.g. TCP and SSL transports). + +.. class:: DatagramProtocol + + The base class for implementing datagram protocols (for use with + e.g. UDP transports). + +.. class:: SubprocessProtocol + + The base class for implementing protocols representing communication + channels with subprocesses (i.e., the set of pipes allowing bidirectional + data exchange between this process and the child process). + + +Connection callbacks +^^^^^^^^^^^^^^^^^^^^ + +These callbacks may be called on :class:`Protocol` and +:class:`SubprocessProtocol` instances. The default implementations are +empty. + +.. method:: connection_made(transport) + + Called when a connection is made. + + The *transport* argument is the transport representing the + connection. You are responsible for storing it somewhere + (e.g. as an attribute) if you need to. + +.. method:: connection_lost(exc) + + Called when the connection is lost or closed. + + The argument is either an exception object or :const:`None`. + The latter means a regular EOF is received, or the connection was + aborted or closed by this side of the connection. + +:meth:`connection_made` and :meth:`connection_lost` are called exactly once +per successful connection. All other callbacks will be called between those +two methods, which allows for easier resource management in your protocol +implementation. + + +Data reception callbacks +^^^^^^^^^^^^^^^^^^^^^^^^ + +The following callbacks are called on :class:`Protocol` instances. +The default implementations are empty. + +.. method:: data_received(data) + + Called when some data is received. *data* is a non-empty bytes object + containing the incoming data. + + .. note:: + Whether the data is buffered, chunked or reassembled depends on + the transport. In general, you shouldn't rely on specific semantics + and instead make your parsing generic and flexible enough. + + However, data always comes in the correct order. + +.. 
method:: eof_received() + + Calls when the other end signals it won't send any more data + (for example by calling :meth:`write_eof`, if the other end also uses + asyncio). + + This method may return a false value (including None), in which case + the transport will close itself. Conversely, if this method returns a + true value, closing the transport is up to the protocol. Since the + default implementation returns None, it implicitly closes the connection. + + .. note:: + Some transports such as SSL don't support half-closed connections, + in which case returning true from this method will not prevent closing + the connection. + + +:meth:`data_received` can be called an arbitrary number of times during +a connection. However, :meth:`eof_received` is called at most once +and, if called, :meth:`data_received` won't be called after it. + + +Flow control callbacks +^^^^^^^^^^^^^^^^^^^^^^ + +These callbacks may be called on :class:`Protocol` and +:class:`SubprocessProtocol`. The default implementations are empty. + +.. method:: pause_writing() + + Called when the transport's buffer goes over the high-water mark. + +.. method:: resume_writing() + + Called when the transport's buffer drains below the low-water mark. + + +:meth:`pause_writing` and :meth:`resume_writing` calls are paired -- +:meth:`pause_writing` is called once when the buffer goes strictly over +the high-water mark (even if subsequent writes increases the buffer size +even more), and eventually :meth:`resume_writing` is called once when the +buffer size reaches the low-water mark. + +.. note:: + If the buffer size equals the high-water mark, + :meth:`pause_writing` is not called -- it must go strictly over. + Conversely, :meth:`resume_writing` is called when the buffer size is + equal or lower than the low-water mark. These end conditions + are important to ensure that things go as expected when either + mark is zero. + .. _transport: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 01:21:25 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 01:21:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Finish_protocol_documentat?= =?utf-8?q?ion?= Message-ID: <3dRFYK6Q4wz7Ll3@mail.python.org> http://hg.python.org/cpython/rev/6bb7ff403360 changeset: 87383:6bb7ff403360 user: Antoine Pitrou date: Sat Nov 23 01:21:11 2013 +0100 summary: Finish protocol documentation files: Doc/library/asyncio.rst | 66 +++++++++++++++++++++++----- 1 files changed, 54 insertions(+), 12 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -72,6 +72,11 @@ on certain events (for example when some data is received); you shouldn't call them yourself, unless you are implementing a transport. +.. note:: + All callbacks have default implementations, which are empty. Therefore, + you only need to implement the callbacks for the events in which you + are interested. + Protocol classes ^^^^^^^^^^^^^^^^ @@ -88,17 +93,15 @@ .. class:: SubprocessProtocol - The base class for implementing protocols representing communication - channels with subprocesses (i.e., the set of pipes allowing bidirectional - data exchange between this process and the child process). + The base class for implementing protocols communicating with child + processes (through a set of unidirectional pipes). 
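To complement the class descriptions above, a minimal sketch of a datagram echo protocol wired up with ``create_datagram_endpoint()`` (the address and port are placeholders)::

    import asyncio

    class EchoDatagram(asyncio.DatagramProtocol):
        def connection_made(self, transport):
            self.transport = transport           # transport used for sendto()
        def datagram_received(self, data, addr):
            self.transport.sendto(data, addr)    # echo back to the sender
        def error_received(self, exc):
            print('send or receive failed:', exc)

    loop = asyncio.get_event_loop()
    transport, protocol = loop.run_until_complete(
        loop.create_datagram_endpoint(EchoDatagram, local_addr=('127.0.0.1', 9999)))
    try:
        loop.run_forever()
    finally:
        transport.close()
        loop.close()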
Connection callbacks ^^^^^^^^^^^^^^^^^^^^ These callbacks may be called on :class:`Protocol` and -:class:`SubprocessProtocol` instances. The default implementations are -empty. +:class:`SubprocessProtocol` instances: .. method:: connection_made(transport) @@ -121,12 +124,32 @@ two methods, which allows for easier resource management in your protocol implementation. +The following callbacks may be called only on :class:`SubprocessProtocol` +instances: + +.. method:: pipe_data_received(fd, data) + + Called when the child process writes data into its stdout or stderr pipe. + *fd* is the integer file descriptor of the pipe. *data* is a non-empty + bytes object containing the data. + +.. method:: pipe_connection_lost(fd, exc) + + Called when one of the pipes communicating with the child process + is closed. *fd* is the integer file descriptor that was closed. + +.. method:: process_exited() + + Called when the child process has exited. + Data reception callbacks ^^^^^^^^^^^^^^^^^^^^^^^^ -The following callbacks are called on :class:`Protocol` instances. -The default implementations are empty. +Streaming protocols +""""""""""""""""""" + +The following callbacks are called on :class:`Protocol` instances: .. method:: data_received(data) @@ -136,9 +159,8 @@ .. note:: Whether the data is buffered, chunked or reassembled depends on the transport. In general, you shouldn't rely on specific semantics - and instead make your parsing generic and flexible enough. - - However, data always comes in the correct order. + and instead make your parsing generic and flexible enough. However, + data is always received in the correct order. .. method:: eof_received() @@ -156,17 +178,37 @@ in which case returning true from this method will not prevent closing the connection. - :meth:`data_received` can be called an arbitrary number of times during a connection. However, :meth:`eof_received` is called at most once and, if called, :meth:`data_received` won't be called after it. +Datagram protocols +"""""""""""""""""" + +The following callbacks are called on :class:`DatagramProtocol` instances. + +.. method:: datagram_received(data, addr) + + Called when a datagram is received. *data* is a bytes object containing + the incoming data. *addr* is the address of the peer sending the data; + the exact format depends on the transport. + +.. method:: error_received(exc) + + Called when a previous send or receive operation raises an + :class:`OSError`. *exc* is the :class:`OSError` instance. + + This method is called in rare conditions, when the transport (e.g. UDP) + detects that a datagram couldn't be delivered to its recipient. + In many conditions though, undeliverable datagrams will be silently + dropped. + Flow control callbacks ^^^^^^^^^^^^^^^^^^^^^^ These callbacks may be called on :class:`Protocol` and -:class:`SubprocessProtocol`. The default implementations are empty. +:class:`SubprocessProtocol` instances: .. 
method:: pause_writing() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 01:33:04 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 01:33:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Try_to_fix_issue_=2319715_?= =?utf-8?q?=28timestamp_rounding_inconsistencies_under_Windows=3F=29?= Message-ID: <3dRFpm6PmLz7Llf@mail.python.org> http://hg.python.org/cpython/rev/602062d2a008 changeset: 87384:602062d2a008 user: Antoine Pitrou date: Sat Nov 23 01:32:53 2013 +0100 summary: Try to fix issue #19715 (timestamp rounding inconsistencies under Windows?) files: Lib/test/test_pathlib.py | 10 +++++++--- 1 files changed, 7 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1388,11 +1388,15 @@ # Rewind the mtime sufficiently far in the past to work around # filesystem-specific timestamp granularity. os.utime(str(p), (old_mtime - 10, old_mtime - 10)) - # The file mtime is refreshed by calling touch() again + # The file mtime should be refreshed by calling touch() again p.touch() st = p.stat() - self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) - self.assertGreaterEqual(st.st_mtime, old_mtime) + # Issue #19715: there can be an inconsistency under Windows between + # the timestamp rounding when creating a file, and the timestamp + # rounding done when calling utime(). `delta` makes up for this. + delta = 1e-6 if os.name == 'nt' else 0 + self.assertGreaterEqual(st.st_mtime, old_mtime - delta) + # Now with exist_ok=False p = P / 'newfileB' self.assertFalse(p.exists()) p.touch(mode=0o700, exist_ok=False) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 01:53:34 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 01:53:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_source_link_for_asynci?= =?utf-8?q?o=2E?= Message-ID: <3dRGGQ1dbQz7LkB@mail.python.org> http://hg.python.org/cpython/rev/6f8b779f6556 changeset: 87385:6f8b779f6556 user: Guido van Rossum date: Fri Nov 22 16:53:25 2013 -0800 summary: Add source link for asyncio. files: Doc/library/asyncio.rst | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -6,6 +6,9 @@ .. 
versionadded:: 3.4 +**Source code:** :source:`Lib/asyncio/` + +-------------- This module provides infrastructure for writing single-threaded concurrent code using coroutines, multiplexing I/O access over sockets and other -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 01:55:03 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 01:55:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Trying_other_strategy_for_?= =?utf-8?b?IzE5NzE1OiB1c2UgdXRpbWUoLi4uLCBOb25lKQ==?= Message-ID: <3dRGJ74Mjkz7LlL@mail.python.org> http://hg.python.org/cpython/rev/d647a4a8505e changeset: 87386:d647a4a8505e user: Antoine Pitrou date: Sat Nov 23 01:54:27 2013 +0100 summary: Trying other strategy for #19715: use utime(..., None) files: Lib/pathlib.py | 4 +--- Lib/test/test_pathlib.py | 7 ++----- 2 files changed, 3 insertions(+), 8 deletions(-) diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -6,7 +6,6 @@ import posixpath import re import sys -import time import weakref try: import threading @@ -1076,9 +1075,8 @@ # First try to bump modification time # Implementation note: GNU touch uses the UTIME_NOW option of # the utimensat() / futimens() functions. - t = time.time() try: - self._accessor.utime(self, (t, t)) + self._accessor.utime(self, None) except OSError: # Avoid exception chaining pass diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1391,11 +1391,8 @@ # The file mtime should be refreshed by calling touch() again p.touch() st = p.stat() - # Issue #19715: there can be an inconsistency under Windows between - # the timestamp rounding when creating a file, and the timestamp - # rounding done when calling utime(). `delta` makes up for this. - delta = 1e-6 if os.name == 'nt' else 0 - self.assertGreaterEqual(st.st_mtime, old_mtime - delta) + self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) + self.assertGreaterEqual(st.st_mtime, old_mtime) # Now with exist_ok=False p = P / 'newfileB' self.assertFalse(p.exists()) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 02:11:10 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 02:11:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogUmV2ZXJ0IHV0aW1lKC4uLiwg?= =?utf-8?q?None=29_strategy_=28it_has_too_poor_resolution_under_Windows=29?= =?utf-8?q?_and?= Message-ID: <3dRGfk0B6Qz7Ll4@mail.python.org> http://hg.python.org/cpython/rev/a273f99159e0 changeset: 87387:a273f99159e0 user: Antoine Pitrou date: Sat Nov 23 02:11:02 2013 +0100 summary: Revert utime(..., None) strategy (it has too poor resolution under Windows) and restore the previous test workaround (issue #19715) files: Lib/pathlib.py | 4 +++- Lib/test/test_pathlib.py | 7 +++++-- 2 files changed, 8 insertions(+), 3 deletions(-) diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -6,6 +6,7 @@ import posixpath import re import sys +import time import weakref try: import threading @@ -1075,8 +1076,9 @@ # First try to bump modification time # Implementation note: GNU touch uses the UTIME_NOW option of # the utimensat() / futimens() functions. 
+ t = time.time() try: - self._accessor.utime(self, None) + self._accessor.utime(self, (t, t)) except OSError: # Avoid exception chaining pass diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1391,8 +1391,11 @@ # The file mtime should be refreshed by calling touch() again p.touch() st = p.stat() - self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) - self.assertGreaterEqual(st.st_mtime, old_mtime) + # Issue #19715: there can be an inconsistency under Windows between + # the timestamp rounding when creating a file, and the timestamp + # rounding done when calling utime(). `delta` makes up for this. + delta = 1e-6 if os.name == 'nt' else 0 + self.assertGreaterEqual(st.st_mtime, old_mtime - delta) # Now with exist_ok=False p = P / 'newfileB' self.assertFalse(p.exists()) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 02:14:37 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 23 Nov 2013 02:14:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=237475=3A_Restore_b?= =?utf-8?q?inary_=26_text_transform_codecs?= Message-ID: <3dRGkj27Yxz7LkB@mail.python.org> http://hg.python.org/cpython/rev/5e960d2c2156 changeset: 87388:5e960d2c2156 user: Nick Coghlan date: Sat Nov 23 11:13:36 2013 +1000 summary: Close #7475: Restore binary & text transform codecs The codecs themselves were restored in Python 3.2, this completes the restoration by adding back the convenience aliases. These aliases were originally left out due to confusing errors when attempting to use them with the text encoding specific convenience methods. Python 3.4 includes several improvements to those errors, thus permitting the aliases to be restored as well. files: Doc/library/codecs.rst | 110 ++++++++++++++++---------- Doc/whatsnew/3.4.rst | 50 ++++++++--- Lib/encodings/aliases.py | 36 ++++---- Lib/test/test_codecs.py | 20 ++++ 4 files changed, 139 insertions(+), 77 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -1188,6 +1188,9 @@ arbitrary data transforms rather than just text encodings). For asymmetric codecs, the stated purpose describes the encoding direction. +Text Encodings +^^^^^^^^^^^^^^ + The following codecs provide :class:`str` to :class:`bytes` encoding and :term:`bytes-like object` to :class:`str` decoding, similar to the Unicode text encodings. @@ -1234,62 +1237,83 @@ | | | .. deprecated:: 3.3 | +--------------------+---------+---------------------------+ -The following codecs provide :term:`bytes-like object` to :class:`bytes` -mappings. +.. _binary-transforms: +Binary Transforms +^^^^^^^^^^^^^^^^^ -.. tabularcolumns:: |l|L|L| +The following codecs provide binary transforms: :term:`bytes-like object` +to :class:`bytes` mappings. -+----------------------+------------------------------+------------------------------+ -| Codec | Purpose | Encoder / decoder | -+======================+==============================+==============================+ -| base64_codec [#b64]_ | Convert operand to MIME | :meth:`base64.b64encode` / | -| | base64 (the result always | :meth:`base64.b64decode` | -| | includes a trailing | | -| | ``'\n'``) | | -| | | | -| | .. 
versionchanged:: 3.4 | | -| | accepts any | | -| | :term:`bytes-like object` | | -| | as input for encoding and | | -| | decoding | | -+----------------------+------------------------------+------------------------------+ -| bz2_codec | Compress the operand | :meth:`bz2.compress` / | -| | using bz2 | :meth:`bz2.decompress` | -+----------------------+------------------------------+------------------------------+ -| hex_codec | Convert operand to | :meth:`base64.b16encode` / | -| | hexadecimal | :meth:`base64.b16decode` | -| | representation, with two | | -| | digits per byte | | -+----------------------+------------------------------+------------------------------+ -| quopri_codec | Convert operand to MIME | :meth:`quopri.encodestring` /| -| | quoted printable | :meth:`quopri.decodestring` | -+----------------------+------------------------------+------------------------------+ -| uu_codec | Convert the operand using | :meth:`uu.encode` / | -| | uuencode | :meth:`uu.decode` | -+----------------------+------------------------------+------------------------------+ -| zlib_codec | Compress the operand | :meth:`zlib.compress` / | -| | using gzip | :meth:`zlib.decompress` | -+----------------------+------------------------------+------------------------------+ + +.. tabularcolumns:: |l|L|L|L| + ++----------------------+------------------+------------------------------+------------------------------+ +| Codec | Aliases | Purpose | Encoder / decoder | ++======================+==================+==============================+==============================+ +| base64_codec [#b64]_ | base64, base_64 | Convert operand to MIME | :meth:`base64.b64encode` / | +| | | base64 (the result always | :meth:`base64.b64decode` | +| | | includes a trailing | | +| | | ``'\n'``) | | +| | | | | +| | | .. versionchanged:: 3.4 | | +| | | accepts any | | +| | | :term:`bytes-like object` | | +| | | as input for encoding and | | +| | | decoding | | ++----------------------+------------------+------------------------------+------------------------------+ +| bz2_codec | bz2 | Compress the operand | :meth:`bz2.compress` / | +| | | using bz2 | :meth:`bz2.decompress` | ++----------------------+------------------+------------------------------+------------------------------+ +| hex_codec | hex | Convert operand to | :meth:`base64.b16encode` / | +| | | hexadecimal | :meth:`base64.b16decode` | +| | | representation, with two | | +| | | digits per byte | | ++----------------------+------------------+------------------------------+------------------------------+ +| quopri_codec | quopri, | Convert operand to MIME | :meth:`quopri.encodestring` /| +| | quotedprintable, | quoted printable | :meth:`quopri.decodestring` | +| | quoted_printable | | | ++----------------------+------------------+------------------------------+------------------------------+ +| uu_codec | uu | Convert the operand using | :meth:`uu.encode` / | +| | | uuencode | :meth:`uu.decode` | ++----------------------+------------------+------------------------------+------------------------------+ +| zlib_codec | zip, zlib | Compress the operand | :meth:`zlib.compress` / | +| | | using gzip | :meth:`zlib.decompress` | ++----------------------+------------------+------------------------------+------------------------------+ .. [#b64] In addition to :term:`bytes-like objects `, ``'base64_codec'`` also accepts ASCII-only instances of :class:`str` for decoding +.. versionadded:: 3.2 + Restoration of the binary transforms. 
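For a quick feel for the binary transforms tabulated above, here is a small round trip through the module level :func:`codecs.encode` and :func:`codecs.decode` functions; the sample bytes are arbitrary and the output shown is what Python 3.4 is expected to print (with this patch applied, the short aliases such as ``base64`` and ``hex`` work as well)::

    >>> import codecs
    >>> codecs.encode(b"spam", "base64_codec")
    b'c3BhbQ==\n'
    >>> codecs.decode(b"c3BhbQ==\n", "base64_codec")
    b'spam'
    >>> codecs.encode(b"spam", "hex_codec")
    b'7370616d'
    >>> codecs.decode(b"7370616d", "hex_codec")
    b'spam'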
-The following codecs provide :class:`str` to :class:`str` mappings. +.. versionchanged:: 3.4 + Restoration of the aliases for the binary transforms. -.. tabularcolumns:: |l|L| -+--------------------+---------------------------+ -| Codec | Purpose | -+====================+===========================+ -| rot_13 | Returns the Caesar-cypher | -| | encryption of the operand | -+--------------------+---------------------------+ +.. _text-transforms: + +Text Transforms +^^^^^^^^^^^^^^^ + +The following codec provides a text transform: a :class:`str` to :class:`str` +mapping. + +.. tabularcolumns:: |l|l|L| + ++--------------------+---------+---------------------------+ +| Codec | Aliases | Purpose | ++====================+=========+===========================+ +| rot_13 | rot13 | Returns the Caesar-cypher | +| | | encryption of the operand | ++--------------------+---------+---------------------------+ .. versionadded:: 3.2 - bytes-to-bytes and str-to-str codecs. + Restoration of the ``rot_13`` text transform. + +.. versionchanged:: 3.4 + Restoration of the ``rot13`` alias. :mod:`encodings.idna` --- Internationalized Domain Names in Applications diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -103,7 +103,8 @@ * :ref:`PEP 446: Make newly created file descriptors non-inheritable `. * command line option for :ref:`isolated mode `, (:issue:`16499`). -* improvements to handling of non-Unicode codecs +* :ref:`improvements ` in the handling of + codecs that are not text encodings Significantly Improved Library Modules: @@ -173,8 +174,10 @@ PEP written and implemented by Victor Stinner. -Improvements to handling of non-Unicode codecs -============================================== +.. _codec-handling-improvements: + +Improvements to codec handling +============================== Since it was first introduced, the :mod:`codecs` module has always been intended to operate as a type-neutral dynamic encoding and decoding @@ -186,7 +189,7 @@ As a key step in clarifying the situation, the :meth:`codecs.encode` and :meth:`codecs.decode` convenience functions are now properly documented in Python 2.7, 3.3 and 3.4. These functions have existed in the :mod:`codecs` -module and have been covered by the regression test suite since Python 2.4, +module (and have been covered by the regression test suite) since Python 2.4, but were previously only discoverable through runtime introspection. 
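The new test added further down in this patch checks that every restored alias resolves to the same codec as its canonical name; interactively the same check is a one-liner (alias names taken from the tables above)::

    >>> import codecs
    >>> codecs.lookup("base64").name == codecs.lookup("base64_codec").name
    True
    >>> codecs.lookup("rot13").name == codecs.lookup("rot_13").name
    True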
Unlike the convenience methods on :class:`str`, :class:`bytes` and @@ -199,43 +202,58 @@ encodings provided in the standard library and direct users towards these general purpose convenience functions when appropriate:: - >>> import codecs - - >>> b"abcdef".decode("hex_codec") + >>> b"abcdef".decode("hex") Traceback (most recent call last): File "", line 1, in - LookupError: 'hex_codec' is not a text encoding; use codecs.decode() to handle arbitrary codecs + LookupError: 'hex' is not a text encoding; use codecs.decode() to handle arbitrary codecs - >>> "hello".encode("rot_13") + >>> "hello".encode("rot13") Traceback (most recent call last): File "", line 1, in - LookupError: 'rot_13' is not a text encoding; use codecs.encode() to handle arbitrary codecs + LookupError: 'rot13' is not a text encoding; use codecs.encode() to handle arbitrary codecs In a related change, whenever it is feasible without breaking backwards compatibility, exceptions raised during encoding and decoding operations will be wrapped in a chained exception of the same type that mentions the name of the codec responsible for producing the error:: - >>> codecs.decode(b"abcdefgh", "hex_codec") + >>> import codecs + + >>> codecs.decode(b"abcdefgh", "hex") binascii.Error: Non-hexadecimal digit found The above exception was the direct cause of the following exception: Traceback (most recent call last): File "", line 1, in - binascii.Error: decoding with 'hex_codec' codec failed (Error: Non-hexadecimal digit found) + binascii.Error: decoding with 'hex' codec failed (Error: Non-hexadecimal digit found) - >>> codecs.encode("hello", "bz2_codec") + >>> codecs.encode("hello", "bz2") TypeError: 'str' does not support the buffer interface The above exception was the direct cause of the following exception: Traceback (most recent call last): File "", line 1, in - TypeError: encoding with 'bz2_codec' codec failed (TypeError: 'str' does not support the buffer interface) + TypeError: encoding with 'bz2' codec failed (TypeError: 'str' does not support the buffer interface) -(Contributed by Nick Coghlan in :issue:`17827`, :issue:`17828` and -:issue:`19619`) +Finally, as the examples above show, these improvements have permitted +the restoration of the convenience aliases for the non-Unicode codecs that +were themselves restored in Python 3.2. This means that encoding binary data +to and from its hexadecimal representation (for example) can now be written +as:: + + >>> from codecs import encode, decode + >>> encode(b"hello", "hex") + b'68656c6c6f' + >>> decode(b"68656c6c6f", "hex") + b'hello' + +The binary and text transforms provided in the standard library are detailed +in :ref:`binary-transforms` and :ref:`text-transforms`. + +(Contributed by Nick Coghlan in :issue:`7475`, , :issue:`17827`, +:issue:`17828` and :issue:`19619`) .. 
_pep-451: diff --git a/Lib/encodings/aliases.py b/Lib/encodings/aliases.py --- a/Lib/encodings/aliases.py +++ b/Lib/encodings/aliases.py @@ -33,9 +33,9 @@ 'us' : 'ascii', 'us_ascii' : 'ascii', - ## base64_codec codec - #'base64' : 'base64_codec', - #'base_64' : 'base64_codec', + # base64_codec codec + 'base64' : 'base64_codec', + 'base_64' : 'base64_codec', # big5 codec 'big5_tw' : 'big5', @@ -45,8 +45,8 @@ 'big5_hkscs' : 'big5hkscs', 'hkscs' : 'big5hkscs', - ## bz2_codec codec - #'bz2' : 'bz2_codec', + # bz2_codec codec + 'bz2' : 'bz2_codec', # cp037 codec '037' : 'cp037', @@ -248,8 +248,8 @@ 'cp936' : 'gbk', 'ms936' : 'gbk', - ## hex_codec codec - #'hex' : 'hex_codec', + # hex_codec codec + 'hex' : 'hex_codec', # hp_roman8 codec 'roman8' : 'hp_roman8', @@ -450,13 +450,13 @@ 'cp154' : 'ptcp154', 'cyrillic_asian' : 'ptcp154', - ## quopri_codec codec - #'quopri' : 'quopri_codec', - #'quoted_printable' : 'quopri_codec', - #'quotedprintable' : 'quopri_codec', + # quopri_codec codec + 'quopri' : 'quopri_codec', + 'quoted_printable' : 'quopri_codec', + 'quotedprintable' : 'quopri_codec', - ## rot_13 codec - #'rot13' : 'rot_13', + # rot_13 codec + 'rot13' : 'rot_13', # shift_jis codec 'csshiftjis' : 'shift_jis', @@ -518,12 +518,12 @@ 'utf8_ucs2' : 'utf_8', 'utf8_ucs4' : 'utf_8', - ## uu_codec codec - #'uu' : 'uu_codec', + # uu_codec codec + 'uu' : 'uu_codec', - ## zlib_codec codec - #'zip' : 'zlib_codec', - #'zlib' : 'zlib_codec', + # zlib_codec codec + 'zip' : 'zlib_codec', + 'zlib' : 'zlib_codec', # temporary mac CJK aliases, will be replaced by proper codecs in 3.1 'x_mac_japanese' : 'shift_jis', diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -2320,18 +2320,29 @@ "quopri_codec", "hex_codec", ] + +transform_aliases = { + "base64_codec": ["base64", "base_64"], + "uu_codec": ["uu"], + "quopri_codec": ["quopri", "quoted_printable", "quotedprintable"], + "hex_codec": ["hex"], + "rot_13": ["rot13"], +} + try: import zlib except ImportError: pass else: bytes_transform_encodings.append("zlib_codec") + transform_aliases["zlib_codec"] = ["zip", "zlib"] try: import bz2 except ImportError: pass else: bytes_transform_encodings.append("bz2_codec") + transform_aliases["bz2_codec"] = ["bz2"] class TransformCodecTest(unittest.TestCase): @@ -2445,6 +2456,15 @@ # Unfortunately, the bz2 module throws OSError, which the codec # machinery currently can't wrap :( + # Ensure codec aliases from http://bugs.python.org/issue7475 work + def test_aliases(self): + for codec_name, aliases in transform_aliases.items(): + expected_name = codecs.lookup(codec_name).name + for alias in aliases: + with self.subTest(alias=alias): + info = codecs.lookup(alias) + self.assertEqual(info.name, expected_name) + # The codec system tries to wrap exceptions in order to ensure the error # mentions the operation being performed and the codec involved. 
We -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 02:17:31 2013 From: python-checkins at python.org (andrew.kuchling) Date: Sat, 23 Nov 2013 02:17:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Update_Itamar?= =?utf-8?q?=27s_name?= Message-ID: <3dRGp32G1wzQr9@mail.python.org> http://hg.python.org/cpython/rev/03a55e207720 changeset: 87389:03a55e207720 branch: 3.3 parent: 87365:9192c0798a90 user: Andrew Kuchling date: Fri Nov 22 20:17:24 2013 -0500 summary: Update Itamar's name files: Misc/ACKS | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1138,7 +1138,6 @@ Michael Shiplett John W. Shipman Joel Shprentz -Itamar Shtull-Trauring Yue Shuaijie Terrel Shumway Eric Siegerman @@ -1263,6 +1262,7 @@ Erno Tukia David Turner Stephen Turner +Itamar Turner-Trauring Theodore Turocy Bill Tutt Fraser Tweedale -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 02:18:33 2013 From: python-checkins at python.org (andrew.kuchling) Date: Sat, 23 Nov 2013 02:18:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_from_3=2E3?= Message-ID: <3dRGqF6DfJzQr9@mail.python.org> http://hg.python.org/cpython/rev/403c612cfffd changeset: 87390:403c612cfffd parent: 87388:5e960d2c2156 parent: 87389:03a55e207720 user: Andrew Kuchling date: Fri Nov 22 20:18:26 2013 -0500 summary: Merge from 3.3 files: Misc/ACKS | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1186,7 +1186,6 @@ Michael Shiplett John W. Shipman Joel Shprentz -Itamar Shtull-Trauring Yue Shuaijie Terrel Shumway Eric Siegerman @@ -1316,6 +1315,7 @@ Erno Tukia David Turner Stephen Turner +Itamar Turner-Trauring Theodore Turocy Bill Tutt Fraser Tweedale -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 02:38:25 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 23 Nov 2013 02:38:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319694=3A_venv_now?= =?utf-8?q?_runs_ensurepip_in_isolated_mode?= Message-ID: <3dRHG94dspzPK0@mail.python.org> http://hg.python.org/cpython/rev/0ce8d68181a2 changeset: 87391:0ce8d68181a2 user: Nick Coghlan date: Sat Nov 23 11:37:28 2013 +1000 summary: Close #19694: venv now runs ensurepip in isolated mode files: Lib/test/test_venv.py | 9 +++++++-- Lib/venv/__init__.py | 7 +++++-- 2 files changed, 12 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -12,7 +12,7 @@ import sys import tempfile from test.support import (captured_stdout, captured_stderr, run_unittest, - can_symlink) + can_symlink, EnvironmentVarGuard) import unittest import venv @@ -280,7 +280,12 @@ def test_with_pip(self): shutil.rmtree(self.env_dir) - self.run_with_capture(venv.create, self.env_dir, with_pip=True) + with EnvironmentVarGuard() as envvars: + # pip's cross-version compatibility may trigger deprecation + # warnings in current versions of Python. Ensure related + # environment settings don't cause venv to fail. 
+ envvars["PYTHONWARNINGS"] = "e" + self.run_with_capture(venv.create, self.env_dir, with_pip=True) envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) cmd = [envpy, '-m', 'pip', '--version'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -234,8 +234,11 @@ def _setup_pip(self, context): """Installs or upgrades pip in a virtual environment""" - cmd = [context.env_exe, '-m', 'ensurepip', '--upgrade', - '--default-pip'] + # We run ensurepip in isolated mode to avoid side effects from + # environment vars, the current directory and anything else + # intended for the global Python environment + cmd = [context.env_exe, '-Im', 'ensurepip', '--upgrade', + '--default-pip'] subprocess.check_output(cmd) def setup_scripts(self, context): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 03:00:05 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 23 Nov 2013 03:00:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_What=27s_New_with_P?= =?utf-8?q?EP_453_progress?= Message-ID: <3dRHl91My3z7LpK@mail.python.org> http://hg.python.org/cpython/rev/59309ff09762 changeset: 87392:59309ff09762 user: Nick Coghlan date: Sat Nov 23 11:59:40 2013 +1000 summary: Update What's New with PEP 453 progress files: Doc/whatsnew/3.4.rst | 28 +++++++++++++++++++--------- 1 files changed, 19 insertions(+), 9 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -136,24 +136,34 @@ cross-platform mechanism to boostrap the pip installer into Python installations and virtual environments. +The :mod:`venv` module and the :command:`pyvenv` utility make use of this +module to make ``pip`` readily available in virtual environments. When +using the command line interface, ``pip`` is installed by default, while +for the module API installation of ``pip`` must be requested explicitly. + +For CPython source builds on POSIX systems, the ``make install`` and +``make altinstall`` commands bootstrap ``pip`` by default. This behaviour +can be controlled through configure options, and overridden through +Makefile options. + +On Windows, the CPython installer now offers the option to install ``pip`` +along with CPython itself. + .. note:: - Only the first phase of PEP 453 has been implemented at this point. - This section will be fleshed out with additional details once those - other changes are implemented. + The implementation of PEP 453 is still a work in progress. Refer to + :issue:`19347` for the progress on additional steps: - Refer to :issue:`19347` for the progress on additional steps: - - * ``make install`` and ``make altinstall`` integration - * Windows installer integration * Mac OS X installer integration - * :mod:`venv` module and :command:`pyvenv` integration + * Having the binary installers install ``pip`` by default + * Recommending the use of ``pip`` in the "Installing Python Module" + documentation. .. seealso:: :pep:`453` - Explicit bootstrapping of pip in Python installations PEP written by Donald Stufft and Nick Coghlan, implemented by - Donald Stufft, Nick Coghlan (and ...). + Donald Stufft, Nick Coghlan, Martin von L?wis and Ned Deily. .. 
_pep-446: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 07:26:33 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 07:26:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319551=3A_PEP_453_?= =?utf-8?q?-_OS_X_installer_now_installs_or_upgrades_pip_by_default=2E?= Message-ID: <3dRPfd08htz7Ll0@mail.python.org> http://hg.python.org/cpython/rev/4e12f145f18f changeset: 87393:4e12f145f18f user: Ned Deily date: Fri Nov 22 22:25:43 2013 -0800 summary: Issue #19551: PEP 453 - OS X installer now installs or upgrades pip by default. files: Mac/BuildScript/build-installer.py | 23 +++- Mac/BuildScript/resources/ReadMe.txt | 31 ++++ Mac/BuildScript/scripts/postflight.ensurepip | 65 ++++++++++ Misc/NEWS | 2 + 4 files changed, 120 insertions(+), 1 deletions(-) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -364,6 +364,7 @@ # Instructions for building packages inside the .mpkg. def pkg_recipes(): unselected_for_python3 = ('selected', 'unselected')[PYTHON_3] + unselected_for_lt_python34 = ('selected', 'unselected')[getVersionTuple() < (3, 4)] result = [ dict( name="PythonFramework", @@ -432,10 +433,27 @@ topdir="/Library/Frameworks/Python.framework", source="/empty-dir", required=False, - selected=unselected_for_python3, + selected=unselected_for_lt_python34, ), ] + if getVersionTuple() >= (3, 4): + result.append( + dict( + name="PythonInstallPip", + long_name="Install or upgrade pip", + readme="""\ + This package installs (or upgrades from an earlier version) + pip, a tool for installing and managing Python packages. + """, + postflight="scripts/postflight.ensurepip", + topdir="/Library/Frameworks/Python.framework", + source="/empty-dir", + required=False, + selected='selected', + ) + ) + if DEPTARGET < '10.4' and not PYTHON_3: result.append( dict( @@ -453,6 +471,7 @@ selected=unselected_for_python3, ) ) + return result def fatal(msg): @@ -955,11 +974,13 @@ runCommand("%s -C --enable-framework --enable-universalsdk=%s " "--with-universal-archs=%s " "%s " + "%s " "LDFLAGS='-g -L%s/libraries/usr/local/lib' " "CFLAGS='-g -I%s/libraries/usr/local/include' 2>&1"%( shellQuote(os.path.join(SRCDIR, 'configure')), shellQuote(SDKPATH), UNIVERSALARCHS, (' ', '--with-computed-gotos ')[PYTHON_3], + (' ', '--without-ensurepip ')[getVersionTuple() >= (3, 4)], shellQuote(WORKDIR)[1:-1], shellQuote(WORKDIR)[1:-1])) diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,6 +17,37 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. + **NEW* As of Python 3.4.0b1: + +New Installation Options and Defaults +===================================== + +The Python installer now includes an option to automatically install +or upgrade pip, a tool for installing and managing Python packages. +This option is enabled by default and no Internet access is required. +If you do want the installer to do this, select the "Customize" option +at the "Installation Type" step and uncheck the "Install or ugprade +pip" option. + +To make it easier to use scripts installed by third-party Python +packages, with pip or by other means, the "Shell profile updater" +option is now enabled by default, as has been the case with Python +2.7.x installers. 
You can also turn this option off by selecting +"Customize" and unchecking the "Shell profile updater" option. You can +also update your shell profile later by launching the "Update Shell +Profile" command found in the /Applications/Python $VERSION folder. You may +need to start a new terminal window for the changes to take effect. + +Python.org Python $VERSION and 2.7.x versions can both be installed and +will not conflict. Command names for Python 3 contain a 3 in them, +python3 (or python$VERSION), idle3 (or idle$VERSION), pip3 (or pip$VERSION), etc. +Python 2.7 command names contain a 2 or no digit: python2 (or +python2.7 or python), idle2 (or idle2.7 or idle), etc. If you want to +use pip with Python 2.7.x, you will need to download and install a +separate copy of it from the Python Package Index +(https://pypi.python.org/pypi). + + **** IMPORTANT changes if you use IDLE and Tkinter **** Installing a third-party version of Tcl/Tk is no longer required diff --git a/Mac/BuildScript/scripts/postflight.ensurepip b/Mac/BuildScript/scripts/postflight.ensurepip new file mode 100755 --- /dev/null +++ b/Mac/BuildScript/scripts/postflight.ensurepip @@ -0,0 +1,65 @@ +#!/bin/sh +# +# Install/upgrade pip. +# + +PYVER="@PYVER@" +PYMAJOR="3" +FWK="/Library/Frameworks/Python.framework/Versions/${PYVER}" +RELFWKBIN="../../..${FWK}/bin" + +umask 022 + +"${FWK}/bin/python${PYVER}" -m ensurepip --upgrade + +"${FWK}/bin/python${PYVER}" -Wi \ + "${FWK}/lib/python${PYVER}/compileall.py" \ + -f -x badsyntax \ + "${FWK}/lib/python${PYVER}/site-packages" + +"${FWK}/bin/python${PYVER}" -Wi -O \ + "${FWK}/lib/python${PYVER}/compileall.py" \ + -f -x badsyntax \ + "${FWK}/lib/python${PYVER}/site-packages" + +chgrp -R admin "${FWK}/lib/python${PYVER}/site-packages" "${FWK}/bin" +chmod -R g+w "${FWK}/lib/python${PYVER}/site-packages" "${FWK}/bin" + +# We do not know if the user selected the Python command-line tools +# package that installs symlinks to /usr/local/bin. So we assume +# that the command-line tools package has already completed or was +# not selected and we will only install /usr/local/bin symlinks for +# pip et al if there are /usr/local/bin/python* symlinks to our +# framework bin directory. + +if [ -d /usr/local/bin ] ; then + ( + cd /usr/local/bin + # Create pipx.y and easy_install-x.y links if /usr/local/bin/pythonx.y + # is linked to this framework version + if [ "$(readlink -n ./python${PYVER})" = "${RELFWKBIN}/python${PYVER}" ] ; then + for fn in "pip${PYVER}" "easy_install-${PYVER}" ; + do + if [ -e "${RELFWKBIN}/${fn}" ] ; then + rm -f ./${fn} + ln -s "${RELFWKBIN}/${fn}" "./${fn}" + chgrp -h admin "./${fn}" + chmod -h g+w "./${fn}" + fi + done + fi + # Create pipx link if /usr/local/bin/pythonx is linked to this version + if [ "$(readlink -n ./python${PYMAJOR})" = "${RELFWKBIN}/python${PYMAJOR}" ] ; then + for fn in "pip${PYMAJOR}" ; + do + if [ -e "${RELFWKBIN}/${fn}" ] ; then + rm -f ./${fn} + ln -s "${RELFWKBIN}/${fn}" "./${fn}" + chgrp -h admin "./${fn}" + chmod -h g+w "./${fn}" + fi + done + fi + ) +fi +exit 0 diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -395,6 +395,8 @@ available to override the default ensurepip "--upgrade" option. The option can also be set with "make [alt]install ENSUREPIP=[upgrade|install\no]". +- Issue #19551: PEP 453 - the OS X installer now installs pip by default. 
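Tying the venv and installer changes above together: the module level API only bootstraps ``pip`` when asked, so a script that wants a pip-enabled environment has to request it explicitly. A minimal sketch, with the target directory as a placeholder::

    import venv

    # Create a virtual environment and bootstrap pip into it; internally this
    # runs "python -Im ensurepip --upgrade --default-pip", as shown in the
    # Lib/venv/__init__.py hunk earlier in this digest.
    venv.create('/tmp/demo-env', with_pip=True)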
+ Tools/Demos ----------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 07:39:56 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 07:39:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319551=3A_Update_w?= =?utf-8?q?hatsnew=2E?= Message-ID: <3dRPy45Ctvz7Lp4@mail.python.org> http://hg.python.org/cpython/rev/257fda20a6cf changeset: 87394:257fda20a6cf user: Ned Deily date: Fri Nov 22 22:39:09 2013 -0800 summary: Issue #19551: Update whatsnew. files: Doc/whatsnew/3.4.rst | 5 ++--- 1 files changed, 2 insertions(+), 3 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -146,15 +146,14 @@ can be controlled through configure options, and overridden through Makefile options. -On Windows, the CPython installer now offers the option to install ``pip`` -along with CPython itself. +On Windows and Mac OS X, the CPython installers now offer the option to +install ``pip`` along with CPython itself. .. note:: The implementation of PEP 453 is still a work in progress. Refer to :issue:`19347` for the progress on additional steps: - * Mac OS X installer integration * Having the binary installers install ``pip`` by default * Recommending the use of ``pip`` in the "Installing Python Module" documentation. -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sat Nov 23 07:42:11 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 23 Nov 2013 07:42:11 +0100 Subject: [Python-checkins] Daily reference leaks (59309ff09762): sum=412 Message-ID: results for 59309ff09762 on branch "default" -------------------------------------------- test_site leaked [2, -2, 2] references, sum=2 test_site leaked [2, -2, 2] memory blocks, sum=2 test_trace leaked [136, 136, 136] references, sum=408 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogQZqBMc', '-x'] From python-checkins at python.org Sat Nov 23 07:54:51 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 07:54:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_third-party_librari?= =?utf-8?q?es_for_OS_X_installers=3A?= Message-ID: <3dRQHH26mWz7LjR@mail.python.org> http://hg.python.org/cpython/rev/d60dfbef7bce changeset: 87395:d60dfbef7bce user: Ned Deily date: Fri Nov 22 22:54:02 2013 -0800 summary: Update third-party libraries for OS X installers: XZ 5.0.3 -> 5.0.5 SQLite 3.7.13 -> 3.8.1 files: Mac/BuildScript/README.txt | 10 +++++----- Mac/BuildScript/build-installer.py | 12 ++++++------ Misc/NEWS | 4 ++++ 3 files changed, 15 insertions(+), 11 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -8,7 +8,7 @@ an Installer package from the installation plus other files in ``resources`` and ``scripts`` and placed that on a ``.dmg`` disk image. -As of Python 3.3.0, PSF practice is to build two installer variants +For Python 3.4.0, PSF practice is to build two installer variants for each release. 1. 
32-bit-only, i386 and PPC universal, capable on running on all machines @@ -22,8 +22,8 @@ - builds the following third-party libraries * NCurses 5.9 (http://bugs.python.org/issue15037) - * SQLite 3.7.13 - * XZ 5.0.3 + * SQLite 3.8.1 + * XZ 5.0.5 - uses system-supplied versions of third-party libraries @@ -56,10 +56,10 @@ - builds the following third-party libraries * NCurses 5.9 (http://bugs.python.org/issue15037) - * SQLite 3.7.13 + * SQLite 3.8.1 * Tcl 8.5.15 * Tk 8.5.15 - * XZ 5.0.3 + * XZ 5.0.5 - uses system-supplied versions of third-party libraries diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -237,9 +237,9 @@ if getVersionTuple() >= (3, 3): result.extend([ dict( - name="XZ 5.0.3", - url="http://tukaani.org/xz/xz-5.0.3.tar.gz", - checksum='fefe52f9ecd521de2a8ce38c21a27574', + name="XZ 5.0.5", + url="http://tukaani.org/xz/xz-5.0.5.tar.gz", + checksum='19d924e066b6fff0bc9d1981b4e53196', configure_pre=[ '--disable-dependency-tracking', ] @@ -282,9 +282,9 @@ ), ), dict( - name="SQLite 3.7.13", - url="http://www.sqlite.org/sqlite-autoconf-3071300.tar.gz", - checksum='c97df403e8a3d5b67bb408fcd6aabd8e', + name="SQLite 3.8.1", + url="http://www.sqlite.org/2013/sqlite-autoconf-3080100.tar.gz", + checksum='8b5a0a02dfcb0c7daf90856a5cfd485a', extra_cflags=('-Os ' '-DSQLITE_ENABLE_FTS4 ' '-DSQLITE_ENABLE_FTS3_PARENTHESIS ' diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -397,6 +397,10 @@ - Issue #19551: PEP 453 - the OS X installer now installs pip by default. +- Update third-party libraries for OS X installers: + xz 5.0.3 -> 5.0.5 + SQLite 3.7.13 -> 3.8.1 + Tools/Demos ----------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 09:24:54 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 09:24:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319551=3A_Update_i?= =?utf-8?q?nstaller_Welcome_file=2E?= Message-ID: <3dRSHB3bCQz7Ll0@mail.python.org> http://hg.python.org/cpython/rev/7c4d1daa0bc1 changeset: 87396:7c4d1daa0bc1 user: Ned Deily date: Sat Nov 23 00:24:15 2013 -0800 summary: Issue #19551: Update installer Welcome file. files: Mac/BuildScript/resources/Welcome.rtf | 20 +++++++++------ 1 files changed, 12 insertions(+), 8 deletions(-) diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -1,5 +1,5 @@ -{\rtf1\ansi\ansicpg1252\cocoartf1187\cocoasubrtf390 -\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;} +{\rtf1\ansi\ansicpg1252\cocoartf1187\cocoasubrtf400 +\cocoascreenfonts1{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fmodern\fcharset0 CourierNewPSMT;} {\colortbl;\red255\green255\blue255;} \paperw11905\paperh16837\margl1440\margr1440\vieww9640\viewh10620\viewkind0 \pard\tx720\tx1440\tx2160\tx2880\tx3600\tx4320\tx5040\tx5760\tx6480\tx7200\tx7920\tx8640 @@ -16,13 +16,17 @@ \b IDLE \b0 and a set of pre-built extension modules that open up specific Macintosh technologies to Python programs.\ \ -See the ReadMe file and the Python documentation for more information.\ + +\b NEW for Python 3.4: +\b0 This package now updates your shell profile by default to make $FULL_VERSION the default Python 3 version. This version can co-exist with other installed versions of Python 3 and Python 2. 
This package also installs a version of +\f1 pip +\f0 , the recommended tool for installing and managing Python packages. Type\ \ - -\b NOTE: -\b0 This package will not update your shell profile by default. Double-click -\b Update Shell Profile -\b0 at any time to make $FULL_VERSION the default Python 3 version. This version can co-exist with other installed versions of Python 3 and Python 2.\ + +\f1 pip3.4 --help +\f0 \ +\ +for an overview. See the ReadMe file and the Python documentation for more information.\ \ \b IMPORTANT for users of IDLE and tkinter: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 11:24:42 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 11:24:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=238813=3A_X509=5FVE?= =?utf-8?q?RIFY=5FPARAM_is_only_available_on_OpenSSL_0=2E9=2E8+?= Message-ID: <3dRVxQ2SByz7Lkw@mail.python.org> http://hg.python.org/cpython/rev/40d4be2b7258 changeset: 87397:40d4be2b7258 user: Christian Heimes date: Sat Nov 23 11:24:32 2013 +0100 summary: Issue #8813: X509_VERIFY_PARAM is only available on OpenSSL 0.9.8+ The patch removes the verify_flags feature on Mac OS X 10.4 with OpenSSL 0.9.7l 28 Sep 2006. files: Doc/library/ssl.rst | 1 + Lib/test/test_ssl.py | 8 ++++++++ Modules/_ssl.c | 9 +++++++++ 3 files changed, 18 insertions(+), 0 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -1126,6 +1126,7 @@ The flags for certificate verification operations. You can set flags like :data:`VERIFY_CRL_CHECK_LEAF` by ORing them together. By default OpenSSL does neither require nor verify certificate revocation lists (CRLs). + Available only with openssl version 0.9.8+. .. 
versionadded:: 3.4 diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -82,6 +82,10 @@ # 0.9.7h or higher return ssl.OPENSSL_VERSION_INFO >= (0, 9, 7, 8, 15) +def have_verify_flags(): + # 0.9.8 or higher + return ssl.OPENSSL_VERSION_INFO >= (0, 9, 8, 0, 15) + def asn1time(cert_time): # Some versions of OpenSSL ignore seconds, see #18207 # 0.9.8.i @@ -667,6 +671,8 @@ with self.assertRaises(ValueError): ctx.verify_mode = 42 + @unittest.skipUnless(have_verify_flags(), + "verify_flags need OpenSSL > 0.9.8") def test_verify_flags(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) # default value by OpenSSL @@ -1809,6 +1815,8 @@ self.assertLess(before, after) s.close() + @unittest.skipUnless(have_verify_flags(), + "verify_flags need OpenSSL > 0.9.8") def test_crl_check(self): if support.verbose: sys.stdout.write("\n") diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -198,6 +198,11 @@ # define OPENSSL_NO_COMP #endif +/* X509_VERIFY_PARAM got added to OpenSSL in 0.9.8 */ +#if OPENSSL_VERSION_NUMBER >= 0x0090800fL +# define HAVE_OPENSSL_VERIFY_PARAM +#endif + typedef struct { PyObject_HEAD @@ -2230,6 +2235,7 @@ return 0; } +#ifdef HAVE_OPENSSL_VERIFY_PARAM static PyObject * get_verify_flags(PySSLContext *self, void *c) { @@ -2267,6 +2273,7 @@ } return 0; } +#endif static PyObject * get_options(PySSLContext *self, void *c) @@ -3088,8 +3095,10 @@ static PyGetSetDef context_getsetlist[] = { {"options", (getter) get_options, (setter) set_options, NULL}, +#ifdef HAVE_OPENSSL_VERIFY_PARAM {"verify_flags", (getter) get_verify_flags, (setter) set_verify_flags, NULL}, +#endif {"verify_mode", (getter) get_verify_mode, (setter) set_verify_mode, NULL}, {NULL}, /* sentinel */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 12:30:06 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 12:30:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_transport_docstrings?= Message-ID: <3dRXNt4YS8z7LlL@mail.python.org> http://hg.python.org/cpython/rev/4a6c15cf9516 changeset: 87398:4a6c15cf9516 user: Antoine Pitrou date: Sat Nov 23 12:30:00 2013 +0100 summary: Fix transport docstrings files: Lib/asyncio/transports.py | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Lib/asyncio/transports.py b/Lib/asyncio/transports.py --- a/Lib/asyncio/transports.py +++ b/Lib/asyncio/transports.py @@ -16,7 +16,7 @@ return self._extra.get(name, default) def close(self): - """Closes the transport. + """Close the transport. Buffered data will be flushed asynchronously. No more data will be received. After all buffered data is flushed, the @@ -92,7 +92,7 @@ self.write(data) def write_eof(self): - """Closes the write end after flushing buffered data. + """Close the write end after flushing buffered data. (This is like typing ^D into a UNIX program reading from stdin.) @@ -101,11 +101,11 @@ raise NotImplementedError def can_write_eof(self): - """Return True if this protocol supports write_eof(), False if not.""" + """Return True if this transport supports write_eof(), False if not.""" raise NotImplementedError def abort(self): - """Closes the transport immediately. + """Closs the transport immediately. Buffered data will be lost. No more data will be received. The protocol's connection_lost() method will (eventually) be @@ -150,7 +150,7 @@ raise NotImplementedError def abort(self): - """Closes the transport immediately. 
+ """Close the transport immediately. Buffered data will be lost. No more data will be received. The protocol's connection_lost() method will (eventually) be -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 12:33:37 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 12:33:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2315663=3A_Revert_O?= =?utf-8?q?S_X_installer_built-in_Tcl/Tk_support_for_3=2E4=2E0b1=2E?= Message-ID: <3dRXSx3qnlz7LpG@mail.python.org> http://hg.python.org/cpython/rev/d666e8ee687d changeset: 87399:d666e8ee687d parent: 87397:40d4be2b7258 user: Ned Deily date: Sat Nov 23 03:30:11 2013 -0800 summary: Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.4.0b1. files: Mac/BuildScript/README.txt | 29 ------ Mac/BuildScript/build-installer.py | 66 +-------------- Mac/BuildScript/resources/ReadMe.txt | 37 ++----- Mac/BuildScript/resources/Welcome.rtf | 10 +- Misc/NEWS | 5 + 5 files changed, 25 insertions(+), 122 deletions(-) diff --git a/Mac/BuildScript/README.txt b/Mac/BuildScript/README.txt --- a/Mac/BuildScript/README.txt +++ b/Mac/BuildScript/README.txt @@ -57,8 +57,6 @@ * NCurses 5.9 (http://bugs.python.org/issue15037) * SQLite 3.8.1 - * Tcl 8.5.15 - * Tk 8.5.15 * XZ 5.0.5 - uses system-supplied versions of third-party libraries @@ -67,33 +65,6 @@ - requires ActiveState Tcl/Tk 8.5.15 (or later) to be installed for building - * Beginning with Python 3.4 alpha2, this installer now includes its own - builtin copy of Tcl and Tk 8.5.15 libraries and thus is no longer - dependent on the buggy releases of Aqua Cocoa Tk 8.5 shipped with - OS X 10.6 or on installing a newer third-party version of Tcl/Tk - in /Library/Frameworks, such as from ActiveState. Because this - is a new feature, it should be considered somewhat experimental and - subject to change prior to the final release of Python 3.4. If it - is necessary to fallback to using a third-party Tcl/Tk because of - a problem with the builtin Tcl/Tk, there is a backup version of - the _tkinter extension included which will dynamically link to - Tcl and Tk frameworks in /Library/Frameworks as in previous releases. 
- To enable (for all users of this Python 3.4):: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.4 - cd ./lib/python3.4 - cp -p ./lib-tkinter/library/_tkinter.so ./lib-dynload - exit - - To restore using Python's builtin versions of Tcl and Tk:: - - sudo bash - cd /Library/Frameworks/Python.framework/Versions/3.4 - cd ./lib/python3.4 - cp -p ./lib-tkinter/builtin/_tkinter.so ./lib-dynload - exit - - recommended build environment: * Mac OS X 10.6.8 (or later) diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -193,7 +193,8 @@ LT_10_5 = bool(DEPTARGET < '10.5') - if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): +# Disable for now + if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 5)): result.extend([ dict( name="Tcl 8.5.15", @@ -586,20 +587,6 @@ % frameworks['Tk'], ] - # For 10.6+ builds, we build two versions of _tkinter: - # - the traditional version (renamed to _tkinter_library.so) linked - # with /Library/Frameworks/{Tcl,Tk}.framework - # - the default version linked with our builtin copies of Tcl and Tk - if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): - EXPECTED_SHARED_LIBS['_tkinter_library.so'] = \ - EXPECTED_SHARED_LIBS['_tkinter.so'] - EXPECTED_SHARED_LIBS['_tkinter.so'] = [ - "/Library/Frameworks/Python.framework/Versions/%s/lib/libtcl%s.dylib" - % (getVersion(), frameworks['Tcl']), - "/Library/Frameworks/Python.framework/Versions/%s/lib/libtk%s.dylib" - % (getVersion(), frameworks['Tk']), - ] - # Remove inherited environment variables which might influence build environ_var_prefixes = ['CPATH', 'C_INCLUDE_', 'DYLD_', 'LANG', 'LC_', 'LD_', 'LIBRARY_', 'PATH', 'PYTHON'] @@ -987,23 +974,6 @@ print("Running make") runCommand("make") - # For deployment targets of 10.6 and higher, we build our own version - # of Tcl and Cocoa Aqua Tk libs because the Apple-supplied Tk 8.5 is - # out-of-date and has critical bugs. Save the _tkinter.so that was - # linked with /Library/Frameworks/{Tck,Tk}.framework and build - # another _tkinter.so linked with our builtin Tcl and Tk libs. - if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): - runCommand("find build -name '_tkinter.so' " - " -execdir mv '{}' _tkinter_library.so \;") - print("Running make to build builtin _tkinter") - runCommand("make TCLTK_INCLUDES='-I%s/libraries/usr/local/include' " - "TCLTK_LIBS='-L%s/libraries/usr/local/lib -ltcl8.5 -ltk8.5'"%( - shellQuote(WORKDIR)[1:-1], - shellQuote(WORKDIR)[1:-1])) - # make a copy which will be moved to lib-tkinter later - runCommand("find build -name '_tkinter.so' " - " -execdir cp -p '{}' _tkinter_builtin.so \;") - print("Running make install") runCommand("make install DESTDIR=%s"%( shellQuote(rootDir))) @@ -1028,27 +998,11 @@ 'Python.framework', 'Versions', version, 'lib', 'python%s'%(version,)) - # If we made multiple versions of _tkinter, move them to - # their own directories under python lib. This allows - # users to select which to import by manipulating sys.path - # directly or with PYTHONPATH. - - if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): - TKINTERS = ['builtin', 'library'] - tkinter_moves = [('_tkinter_' + tkn + '.so', - os.path.join(path_to_lib, 'lib-tkinter', tkn)) - for tkn in TKINTERS] - # Create the destination directories under lib-tkinter. - # The permissions and uid/gid will be fixed up next. 
- for tkm in tkinter_moves: - os.makedirs(tkm[1]) - print("Fix file modes") frmDir = os.path.join(rootDir, 'Library', 'Frameworks', 'Python.framework') gid = grp.getgrnam('admin').gr_gid shared_lib_error = False - moves_list = [] for dirpath, dirnames, filenames in os.walk(frmDir): for dn in dirnames: os.chmod(os.path.join(dirpath, dn), STAT_0o775) @@ -1074,25 +1028,9 @@ % (sl, p)) shared_lib_error = True - # If this is a _tkinter variant, move it to its own directory - # now that we have fixed its permissions and checked that it - # was linked properly. The directory was created earlier. - # The files are moved after the entire tree has been walked - # since the shared library checking depends on the files - # having unique names. - if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 4)): - for tkm in tkinter_moves: - if fn == tkm[0]: - moves_list.append( - (p, os.path.join(tkm[1], '_tkinter.so'))) - if shared_lib_error: fatal("Unexpected shared library errors.") - # Now do the moves. - for ml in moves_list: - shutil.move(ml[0], ml[1]) - if PYTHON_3: LDVERSION=None VERSION=None diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -17,6 +17,17 @@ installer package icon. Then select "Open using ... Installer" from the contextual menu that appears. + **** IMPORTANT **** + +Update your version of Tcl/Tk to use IDLE or other Tk applications +================================================================== + +To use IDLE or other programs that use the Tkinter graphical user +interface toolkit, you may need to install a newer third-party version +of the Tcl/Tk frameworks. Visit http://www.python.org/download/mac/tcltk/ +for current information about supported and recommended versions of +Tcl/Tk for this version of Python and of Mac OS X. + **NEW* As of Python 3.4.0b1: New Installation Options and Defaults @@ -47,32 +58,6 @@ separate copy of it from the Python Package Index (https://pypi.python.org/pypi). - - **** IMPORTANT changes if you use IDLE and Tkinter **** - -Installing a third-party version of Tcl/Tk is no longer required -================================================================ - -Beginning with Python 3.4 alpha2, the 10.6+ 64-bit installer now -comes with its own private copy of Tcl and Tk 8.5 libraries. For -this version of Python, it is no longer necessary to install -a third-party version of Tcl/Tk 8.5, such as those from ActiveState, -to work around the problematic versions of Tcl/Tk 8.5 shipped by -Apple in OS X 10.6 and later. (This does not change the requirements -for older versions of Python installed from python.org.) By default, -this version of Python will always use its own private version, -regardless of whether a third-party Tcl/Tk is installed. -The 10.5+ 32-bit-only installer continues to use Tcl/Tk 8.4, -either a third-party or system-supplied version. -Since this is a new feature, it should be considered somewhat -experimental and subject to change prior to the final release of -Python 3.4. Please report any problems found to the Python bug -tracker at http://bugs.python.org. - -Visit http://www.python.org/download/mac/tcltk/ -for current information about supported and recommended versions of -Tcl/Tk for this version of Python and of Mac OS X. 
- Using this version of Python on OS X ==================================== diff --git a/Mac/BuildScript/resources/Welcome.rtf b/Mac/BuildScript/resources/Welcome.rtf --- a/Mac/BuildScript/resources/Welcome.rtf +++ b/Mac/BuildScript/resources/Welcome.rtf @@ -29,7 +29,11 @@ for an overview. See the ReadMe file and the Python documentation for more information.\ \ -\b IMPORTANT for users of IDLE and tkinter: -\b0 Beginning with Python 3.4 alpha 2, it is no longer necessary to install third-party versions of the +\b IMPORTANT: +\b0 +\b IDLE +\b0 and other programs using the +\b tkinter +\b0 graphical user interface toolkit require specific versions of the \b Tcl/Tk -\b0 platform independent windowing toolkit. Please read the ReadMe file and visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for more information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file +\b0 platform independent windowing toolkit. Visit {\field{\*\fldinst{HYPERLINK "http://www.python.org/download/mac/tcltk/"}}{\fldrslt http://www.python.org/download/mac/tcltk/}} for current information on supported and recommended versions of Tcl/Tk for this version of Python and Mac OS X.} \ No newline at end of file diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -401,6 +401,11 @@ xz 5.0.3 -> 5.0.5 SQLite 3.7.13 -> 3.8.1 +- Issue #15663: Revert OS X installer built-in Tcl/Tk support for 3.4.0b1. + Some third-party projects, such as Matplotlib and PIL/Pillow, + depended on being able to build with Tcl and Tk frameworks in + /Library/Frameworks. + Tools/Demos ----------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 12:33:38 2013 From: python-checkins at python.org (ned.deily) Date: Sat, 23 Nov 2013 12:33:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dRXSy5T2xz7Lpx@mail.python.org> http://hg.python.org/cpython/rev/9e4d8552cee7 changeset: 87400:9e4d8552cee7 parent: 87399:d666e8ee687d parent: 87398:4a6c15cf9516 user: Ned Deily date: Sat Nov 23 03:33:00 2013 -0800 summary: merge files: Lib/asyncio/transports.py | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Lib/asyncio/transports.py b/Lib/asyncio/transports.py --- a/Lib/asyncio/transports.py +++ b/Lib/asyncio/transports.py @@ -16,7 +16,7 @@ return self._extra.get(name, default) def close(self): - """Closes the transport. + """Close the transport. Buffered data will be flushed asynchronously. No more data will be received. After all buffered data is flushed, the @@ -92,7 +92,7 @@ self.write(data) def write_eof(self): - """Closes the write end after flushing buffered data. + """Close the write end after flushing buffered data. (This is like typing ^D into a UNIX program reading from stdin.) @@ -101,11 +101,11 @@ raise NotImplementedError def can_write_eof(self): - """Return True if this protocol supports write_eof(), False if not.""" + """Return True if this transport supports write_eof(), False if not.""" raise NotImplementedError def abort(self): - """Closes the transport immediately. + """Closs the transport immediately. Buffered data will be lost. No more data will be received. The protocol's connection_lost() method will (eventually) be @@ -150,7 +150,7 @@ raise NotImplementedError def abort(self): - """Closes the transport immediately. 
+ """Close the transport immediately. Buffered data will be lost. No more data will be received. The protocol's connection_lost() method will (eventually) be -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 12:35:45 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 23 Nov 2013 12:35:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_Implemen?= =?utf-8?q?t_the_PEP_454_=28tracemalloc=29?= Message-ID: <3dRXWP0Fwyz7Lpm@mail.python.org> http://hg.python.org/cpython/rev/6e2089dbc5ad changeset: 87401:6e2089dbc5ad user: Victor Stinner date: Sat Nov 23 12:27:24 2013 +0100 summary: Issue #18874: Implement the PEP 454 (tracemalloc) files: Doc/library/debug.rst | 1 + Doc/library/tracemalloc.rst | 608 +++++++++ Doc/license.rst | 41 + Doc/using/cmdline.rst | 18 +- Lib/test/support/__init__.py | 19 + Lib/test/test_atexit.py | 4 +- Lib/test/test_capi.py | 2 +- Lib/test/test_threading.py | 4 +- Lib/test/test_tracemalloc.py | 797 ++++++++++++ Lib/tracemalloc.py | 464 +++++++ Modules/Setup.dist | 17 +- Modules/_tracemalloc.c | 1407 ++++++++++++++++++++++ Modules/hashtable.c | 518 ++++++++ Modules/hashtable.h | 128 ++ PC/config.c | 2 + PCbuild/pythoncore.vcxproj | 5 +- Python/pythonrun.c | 4 + 17 files changed, 4024 insertions(+), 15 deletions(-) diff --git a/Doc/library/debug.rst b/Doc/library/debug.rst --- a/Doc/library/debug.rst +++ b/Doc/library/debug.rst @@ -15,3 +15,4 @@ profile.rst timeit.rst trace.rst + tracemalloc.rst diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst new file mode 100644 --- /dev/null +++ b/Doc/library/tracemalloc.rst @@ -0,0 +1,608 @@ +:mod:`tracemalloc` --- Trace memory allocations +=============================================== + +.. module:: tracemalloc + :synopsis: Trace memory allocations. + +The tracemalloc module is a debug tool to trace memory blocks allocated by +Python. It provides the following information: + +* Traceback where an object was allocated +* Statistics on allocated memory blocks per filename and per line number: + total size, number and average size of allocated memory blocks +* Compute the differences between two snapshots to detect memory leaks + +To trace most memory blocks allocated by Python, the module should be started +as early as possible by setting the :envvar:`PYTHONTRACEMALLOC` environment +variable to ``1``, or by using :option:`-X` ``tracemalloc`` command line +option. The :func:`tracemalloc.start` function can be called at runtime to +start tracing Python memory allocations. + +By default, a trace of an allocated memory block only stores the most recent +frame (1 frame). To store 25 frames at startup: set the +:envvar:`PYTHONTRACEMALLOC` environment variable to ``25``, or use the +:option:`-X` ``tracemalloc=25`` command line option. The +:func:`set_traceback_limit` function can be used at runtime to set the limit. + +.. versionadded:: 3.4 + + +Examples +======== + +Display the top 10 +------------------ + +Display the 10 files allocating the most memory:: + + import tracemalloc + + tracemalloc.start() + + # ... run your application ... 
+ + snapshot = tracemalloc.take_snapshot() + top_stats = snapshot.statistics('lineno') + + print("[ Top 10 ]") + for stat in top_stats[:10]: + print(stat) + + +Example of output of the Python test suite:: + + [ Top 10 ] + :716: size=4855 KiB, count=39328, average=126 B + :284: size=521 KiB, count=3199, average=167 B + /usr/lib/python3.4/collections/__init__.py:368: size=244 KiB, count=2315, average=108 B + /usr/lib/python3.4/unittest/case.py:381: size=185 KiB, count=779, average=243 B + /usr/lib/python3.4/unittest/case.py:402: size=154 KiB, count=378, average=416 B + /usr/lib/python3.4/abc.py:133: size=88.7 KiB, count=347, average=262 B + :1446: size=70.4 KiB, count=911, average=79 B + :1454: size=52.0 KiB, count=25, average=2131 B + :5: size=49.7 KiB, count=148, average=344 B + /usr/lib/python3.4/sysconfig.py:411: size=48.0 KiB, count=1, average=48.0 KiB + +We can see that Python loaded ``4.8 MiB`` data (bytecode and constants) from +modules and that the :mod:`collections` module allocated ``244 KiB`` to build +:class:`~collections.namedtuple` types. + +See :meth:`Snapshot.statistics` for more options. + + +Compute differences +------------------- + +Take two snapshots and display the differences:: + + import tracemalloc + tracemalloc.start() + # ... start your application ... + + snapshot1 = tracemalloc.take_snapshot() + # ... call the function leaking memory ... + snapshot2 = tracemalloc.take_snapshot() + + top_stats = snapshot2.compare_to(snapshot1, 'lineno') + + print("[ Top 10 differences ]") + for stat in top_stats[:10]: + print(stat) + +Example of output before/after running some tests of the Python test suite:: + + [ Top 10 differences ] + :716: size=8173 KiB (+4428 KiB), count=71332 (+39369), average=117 B + /usr/lib/python3.4/linecache.py:127: size=940 KiB (+940 KiB), count=8106 (+8106), average=119 B + /usr/lib/python3.4/unittest/case.py:571: size=298 KiB (+298 KiB), count=589 (+589), average=519 B + :284: size=1005 KiB (+166 KiB), count=7423 (+1526), average=139 B + /usr/lib/python3.4/mimetypes.py:217: size=112 KiB (+112 KiB), count=1334 (+1334), average=86 B + /usr/lib/python3.4/http/server.py:848: size=96.0 KiB (+96.0 KiB), count=1 (+1), average=96.0 KiB + /usr/lib/python3.4/inspect.py:1465: size=83.5 KiB (+83.5 KiB), count=109 (+109), average=784 B + /usr/lib/python3.4/unittest/mock.py:491: size=77.7 KiB (+77.7 KiB), count=143 (+143), average=557 B + /usr/lib/python3.4/urllib/parse.py:476: size=71.8 KiB (+71.8 KiB), count=969 (+969), average=76 B + /usr/lib/python3.4/contextlib.py:38: size=67.2 KiB (+67.2 KiB), count=126 (+126), average=546 B + +We can see that Python loaded ``4.4 MiB`` of new data (bytecode and constants) +from modules (on of total of ``8.2 MiB``) and that the :mod:`linecache` module +cached ``940 KiB`` of Python source code to format tracebacks. + +If the system has little free memory, snapshots can be written on disk using +the :meth:`Snapshot.dump` method to analyze the snapshot offline. Then use the +:meth:`Snapshot.load` method reload the snapshot. + + +Get the traceback of a memory block +----------------------------------- + +Code to display the traceback of the biggest memory block:: + + import linecache + import tracemalloc + + tracemalloc.set_traceback_limit(25) + tracemalloc.start() + + # ... run your application ... 
+ + snapshot = tracemalloc.take_snapshot() + top_stats = snapshot.statistics('traceback') + + # pick the biggest memory block + stat = top_stats[0] + print("%s memory blocks: %.1f KiB" % (stat.count, stat.size / 1024)) + for frame in stat.traceback: + print(' File "%s", line %s' % (frame.filename, frame.lineno)) + line = linecache.getline(frame.filename, frame.lineno) + line = line.strip() + if line: + print(' ' + line) + +Example of output of the Python test suite (traceback limited to 25 frames):: + + 903 memory blocks: 870.1 KiB + File "", line 716 + File "", line 1036 + File "", line 934 + File "", line 1068 + File "", line 619 + File "", line 1581 + File "", line 1614 + File "/usr/lib/python3.4/doctest.py", line 101 + import pdb + File "", line 284 + File "", line 938 + File "", line 1068 + File "", line 619 + File "", line 1581 + File "", line 1614 + File "/usr/lib/python3.4/test/support/__init__.py", line 1728 + import doctest + File "/usr/lib/python3.4/test/test_pickletools.py", line 21 + support.run_doctest(pickletools) + File "/usr/lib/python3.4/test/regrtest.py", line 1276 + test_runner() + File "/usr/lib/python3.4/test/regrtest.py", line 976 + display_failure=not verbose) + File "/usr/lib/python3.4/test/regrtest.py", line 761 + match_tests=ns.match_tests) + File "/usr/lib/python3.4/test/regrtest.py", line 1563 + main() + File "/usr/lib/python3.4/test/__main__.py", line 3 + regrtest.main_in_temp_cwd() + File "/usr/lib/python3.4/runpy.py", line 73 + exec(code, run_globals) + File "/usr/lib/python3.4/runpy.py", line 160 + "__main__", fname, loader, pkg_name) + +We can see that most memory was allocated in the :mod:`importlib` module to +load data (bytecode and constants) from modules: ``870 KiB``. The traceback is +where the :mod:`importlib` loaded data for the the last time: on the ``import +pdb`` line of the :mod:`doctest` module. The traceback may change if a new +module is loaded. + + +Pretty top +---------- + +Code to display the 10 lines allocating the most memory with a pretty output, +ignoring ```` and ```` files:: + + import os + import tracemalloc + + def display_top(snapshot, group_by='lineno', limit=10): + snapshot = snapshot.filter_traces(( + tracemalloc.Filter(False, ""), + tracemalloc.Filter(False, ""), + )) + top_stats = snapshot.statistics(group_by) + + print("Top %s lines" % limit) + for index, stat in enumerate(top_stats[:limit], 1): + frame = stat.traceback[0] + # replace "/path/to/module/file.py" with "module/file.py" + filename = os.sep.join(frame.filename.split(os.sep)[-2:]) + print("#%s: %s:%s: %.1f KiB" + % (index, filename, frame.lineno, + stat.size / 1024)) + + other = top_stats[limit:] + if other: + size = sum(stat.size for stat in other) + print("%s other: %.1f KiB" % (len(other), size / 1024)) + total = sum(stat.size for stat in top_stats) + print("Total allocated size: %.1f KiB" % (total / 1024)) + + tracemalloc.start() + + # ... run your application ... 
+ + snapshot = tracemalloc.take_snapshot() + display_top(snapshot, 10) + +Example of output of the Python test suite:: + + 2013-11-08 14:16:58.149320: Top 10 lines + #1: collections/__init__.py:368: 291.9 KiB + #2: Lib/doctest.py:1291: 200.2 KiB + #3: unittest/case.py:571: 160.3 KiB + #4: Lib/abc.py:133: 99.8 KiB + #5: urllib/parse.py:476: 71.8 KiB + #6: :5: 62.7 KiB + #7: Lib/base64.py:140: 59.8 KiB + #8: Lib/_weakrefset.py:37: 51.8 KiB + #9: collections/__init__.py:362: 50.6 KiB + #10: test/test_site.py:56: 48.0 KiB + 7496 other: 4161.9 KiB + Total allocated size: 5258.8 KiB + +See :meth:`Snapshot.statistics` for more options. + + +API +=== + +Functions +--------- + +.. function:: clear_traces() + + Clear traces of memory blocks allocated by Python. + + See also :func:`stop`. + + +.. function:: get_object_traceback(obj) + + Get the traceback where the Python object *obj* was allocated. + Return a :class:`Traceback` instance, or ``None`` if the :mod:`tracemalloc` + module is not tracing memory allocations or did not trace the allocation of + the object. + + See also :func:`gc.get_referrers` and :func:`sys.getsizeof` functions. + + +.. function:: get_traceback_limit() + + Get the maximum number of frames stored in the traceback of a trace. + + By default, a trace of a memory block only stores the most recent + frame: the limit is ``1``. + + Use the :func:`set_traceback_limit` function to change the limit. + + +.. function:: get_traced_memory() + + Get the current size and maximum size of memory blocks traced by the + :mod:`tracemalloc` module as a tuple: ``(size: int, max_size: int)``. + + +.. function:: get_tracemalloc_memory() + + Get the memory usage in bytes of the :mod:`tracemalloc` module used to store + traces of memory blocks. + Return an :class:`int`. + + +.. function:: is_tracing() + + ``True`` if the :mod:`tracemalloc` module is tracing Python memory + allocations, ``False`` otherwise. + + See also :func:`start` and :func:`stop` functions. + + +.. function:: set_traceback_limit(nframe: int) + + Set the maximum number of frames stored in the traceback of a trace. + *nframe* must be greater or equal to ``1``. + + Storing more than ``1`` frame is only useful to compute statistics grouped + by ``'traceback'`` or to compute cumulative statistics: see the + :meth:`Snapshot.compare_to` and :meth:`Snapshot.statistics` methods. + + Storing more frames increases the memory and CPU overhead of the + :mod:`tracemalloc` module. Use the :func:`get_tracemalloc_memory` function + to measure how much memory is used by the :mod:`tracemalloc` module. + + The :envvar:`PYTHONTRACEMALLOC` environment variable + (``PYTHONTRACEMALLOC=NFRAME``) and the :option:`-X` ``tracemalloc=NFRAME`` + command line option can be used to set the limit at startup. + + Use the :func:`get_traceback_limit` function to get the current limit. + + +.. function:: start() + + Start tracing Python memory allocations: install hooks on Python memory + allocators. + + See also :func:`stop` and :func:`is_tracing` functions. + + +.. function:: stop() + + Stop tracing Python memory allocations: uninstall hooks on Python memory + allocators. Clear also traces of memory blocks allocated by Python + + Call :func:`take_snapshot` function to take a snapshot of traces before + clearing them. + + See also :func:`start` and :func:`is_tracing` functions. + + +.. function:: take_snapshot() + + Take a snapshot of traces of memory blocks allocated by Python. Return a new + :class:`Snapshot` instance. 
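(Illustrative aside, not from the patch above: a minimal sketch of the ``get_object_traceback()`` / ``Traceback`` / ``Frame`` API documented in this section; ``build_cache()`` is a hypothetical stand-in for application code.)::

    import tracemalloc

    tracemalloc.set_traceback_limit(10)    # keep up to 10 frames per trace
    tracemalloc.start()

    cache = build_cache()                  # hypothetical application call
    tb = tracemalloc.get_object_traceback(cache)
    if tb is not None:                     # None if the allocation was not traced
        for frame in tb:                   # most recent frame first
            print("%s:%s" % (frame.filename, frame.lineno))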
+ + The snapshot does not include memory blocks allocated before the + :mod:`tracemalloc` module started to trace memory allocations. + + Tracebacks of traces are limited to :func:`get_traceback_limit` frames. Use + :func:`set_traceback_limit` to store more frames. + + The :mod:`tracemalloc` module must be tracing memory allocations to take a + snapshot, see the the :func:`start` function. + + See also the :func:`get_object_traceback` function. + + +Filter +------ + +.. class:: Filter(inclusive: bool, filename_pattern: str, lineno: int=None, all_frames: bool=False) + + Filter on traces of memory blocks. + + See the :func:`fnmatch.fnmatch` function for the syntax of + *filename_pattern*. The ``'.pyc'`` and ``'.pyo'`` file extensions are + replaced with ``'.py'``. + + Examples: + + * ``Filter(True, subprocess.__file__)`` only includes traces of the + :mod:`subprocess` module + * ``Filter(False, tracemalloc.__file__)`` excludes traces of the + :mod:`tracemalloc` module + * ``Filter(False, "")`` excludes empty tracebacks + + .. attribute:: inclusive + + If *inclusive* is ``True`` (include), only trace memory blocks allocated + in a file with a name matching :attr:`filename_pattern` at line number + :attr:`lineno`. + + If *inclusive* is ``False`` (exclude), ignore memory blocks allocated in + a file with a name matching :attr:`filename_pattern` at line number + :attr:`lineno`. + + .. attribute:: lineno + + Line number (``int``) of the filter. If *lineno* is ``None``, the filter + matches any line number. + + .. attribute:: filename_pattern + + Filename pattern of the filter (``str``). + + .. attribute:: all_frames + + If *all_frames* is ``True``, all frames of the traceback are checked. If + *all_frames* is ``False``, only the most recent frame is checked. + + This attribute is ignored if the traceback limit is less than ``2``. See + the :func:`get_traceback_limit` function and + :attr:`Snapshot.traceback_limit` attribute. + + +Frame +----- + +.. class:: Frame + + Frame of a traceback. + + The :class:`Traceback` class is a sequence of :class:`Frame` instances. + + .. attribute:: filename + + Filename (``str``). + + .. attribute:: lineno + + Line number (``int``). + + +Snapshot +-------- + +.. class:: Snapshot + + Snapshot of traces of memory blocks allocated by Python. + + The :func:`take_snapshot` function creates a snapshot instance. + + .. method:: compare_to(old_snapshot: Snapshot, group_by: str, cumulative: bool=False) + + Compute the differences with an old snapshot. Get statistics as a sorted + list of :class:`StatisticDiff` instances grouped by *group_by*. + + See the :meth:`statistics` method for *group_by* and *cumulative* + parameters. + + The result is sorted from the biggest to the smallest by: absolute value + of :attr:`StatisticDiff.size_diff`, :attr:`StatisticDiff.size`, absolute + value of :attr:`StatisticDiff.count_diff`, :attr:`Statistic.count` and + then by :attr:`StatisticDiff.traceback`. + + + .. method:: dump(filename) + + Write the snapshot into a file. + + Use :meth:`load` to reload the snapshot. + + + .. method:: filter_traces(filters) + + Create a new :class:`Snapshot` instance with a filtered :attr:`traces` + sequence, *filters* is a list of :class:`Filter` instances. If *filters* + is an empty list, return a new :class:`Snapshot` instance with a copy of + the traces. + + All inclusive filters are applied at once, a trace is ignored if no + inclusive filters match it. A trace is ignored if at least one exclusive + filter matchs it. + + + .. 
classmethod:: load(filename) + + Load a snapshot from a file. + + See also :meth:`dump`. + + + .. method:: statistics(group_by: str, cumulative: bool=False) + + Get statistics as a sorted list of :class:`Statistic` instances grouped + by *group_by*: + + ===================== ======================== + group_by description + ===================== ======================== + ``'filename'`` filename + ``'lineno'`` filename and line number + ``'traceback'`` traceback + ===================== ======================== + + If *cumulative* is ``True``, cumulate size and count of memory blocks of + all frames of the traceback of a trace, not only the most recent frame. + The cumulative mode can only be used with *group_by* equals to + ``'filename'`` and ``'lineno'`` and :attr:`traceback_limit` greater than + ``1``. + + The result is sorted from the biggest to the smallest by: + :attr:`Statistic.size`, :attr:`Statistic.count` and then by + :attr:`Statistic.traceback`. + + + .. attribute:: traceback_limit + + Maximum number of frames stored in the traceback of :attr:`traces`: + result of the :func:`get_traceback_limit` when the snapshot was taken. + + .. attribute:: traces + + Traces of all memory blocks allocated by Python: sequence of + :class:`Trace` instances. + + The sequence has an undefined order. Use the :meth:`Snapshot.statistics` + method to get a sorted list of statistics. + + +Statistic +--------- + +.. class:: Statistic + + Statistic on memory allocations. + + :func:`Snapshot.statistics` returns a list of :class:`Statistic` instances. + + See also the :class:`StatisticDiff` class. + + .. attribute:: count + + Number of memory blocks (``int``). + + .. attribute:: size + + Total size of memory blocks in bytes (``int``). + + .. attribute:: traceback + + Traceback where the memory block was allocated, :class:`Traceback` + instance. + + +StatisticDiff +------------- + +.. class:: StatisticDiff + + Statistic difference on memory allocations between an old and a new + :class:`Snapshot` instance. + + :func:`Snapshot.compare_to` returns a list of :class:`StatisticDiff` + instances. See also the :class:`Statistic` class. + + .. attribute:: count + + Number of memory blocks in the new snapshot (``int``): ``0`` if + the memory blocks have been released in the new snapshot. + + .. attribute:: count_diff + + Difference of number of memory blocks between the old and the new + snapshots (``int``): ``0`` if the memory blocks have been allocated in + the new snapshot. + + .. attribute:: size + + Total size of memory blocks in bytes in the new snapshot (``int``): + ``0`` if the memory blocks have been released in the new snapshot. + + .. attribute:: size_diff + + Difference of total size of memory blocks in bytes between the old and + the new snapshots (``int``): ``0`` if the memory blocks have been + allocated in the new snapshot. + + .. attribute:: traceback + + Traceback where the memory blocks were allocated, :class:`Traceback` + instance. + + +Trace +----- + +.. class:: Trace + + Trace of a memory block. + + The :attr:`Snapshot.traces` attribute is a sequence of :class:`Trace` + instances. + + .. attribute:: size + + Size of the memory block in bytes (``int``). + + .. attribute:: traceback + + Traceback where the memory block was allocated, :class:`Traceback` + instance. + + +Traceback +--------- + +.. class:: Traceback + + Sequence of :class:`Frame` instances sorted from the most recent frame to + the oldest frame. + + A traceback contains at least ``1`` frame. 
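(Illustrative aside, not from the patch above: a short sketch of the cumulative mode of ``Snapshot.statistics()`` described here; it assumes tracing was started with a traceback limit greater than ``1``.)::

    import tracemalloc

    tracemalloc.set_traceback_limit(25)    # cumulative mode needs at least 2 frames
    tracemalloc.start()

    # ... run the code being measured ...

    snapshot = tracemalloc.take_snapshot()
    # cumulative=True charges each block to every frame of its traceback,
    # not only to the most recent frame
    for stat in snapshot.statistics('filename', cumulative=True)[:5]:
        print(stat)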
If the ``tracemalloc`` module + failed to get a frame, the filename ``""`` at line number ``0`` is + used. + + When a snapshot is taken, tracebacks of traces are limited to + :func:`get_traceback_limit` frames. See the :func:`take_snapshot` function. + + The :attr:`Trace.traceback` attribute is an instance of :class:`Traceback` + instance. + + diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -893,3 +893,44 @@ Jean-loup Gailly Mark Adler jloup at gzip.org madler at alumni.caltech.edu + +cfuhash +------- + +The implementtation of the hash table used by the :mod:`tracemalloc` is based +on the cfuhash project:: + + Copyright (c) 2005 Don Owens + All rights reserved. + + This code is released under the BSD license: + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + * Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + * Redistributions in binary form must reproduce the above + copyright notice, this list of conditions and the following + disclaimer in the documentation and/or other materials provided + with the distribution. + + * Neither the name of the author nor the names of its + contributors may be used to endorse or promote products derived + from this software without specific prior written permission. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES + (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR + SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) + HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, + STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) + ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED + OF THE POSSIBILITY OF SUCH DAMAGE. + diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst --- a/Doc/using/cmdline.rst +++ b/Doc/using/cmdline.rst @@ -376,11 +376,15 @@ .. cmdoption:: -X Reserved for various implementation-specific options. CPython currently - defines two possible values: + defines the following possible values: * ``-X faulthandler`` to enable :mod:`faulthandler`; * ``-X showrefcount`` to enable the output of the total reference count and memory blocks (only works on debug builds); + * ``-X tracemalloc`` to enable :mod:`tracemalloc`. + * ``-X tracemalloc=NFRAME`` to enable :mod:`tracemalloc`, *NFRAME* is the + maximum number of frames stored in a trace: see the + :func:`tracemalloc.set_traceback_limit` function. It also allows to pass arbitrary values and retrieve them through the :data:`sys._xoptions` dictionary. @@ -392,7 +396,7 @@ The ``-X faulthandler`` option. .. versionadded:: 3.4 - The ``-X showrefcount`` option. + The ``-X showrefcount`` and ``-X tracemalloc`` options. Options you shouldn't use @@ -594,6 +598,16 @@ .. versionadded:: 3.3 +.. envvar:: PYTHONTRACEMALLOC + + If this environment variable is set to a non-empty string, all memory + allocations made by Python are traced by the :mod:`tracemalloc` module. 
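(Illustrative aside, not from the patch above: checking at runtime that tracing was enabled at startup, e.g. after ``PYTHONTRACEMALLOC=25 python app.py`` or ``python -X tracemalloc=25 app.py``, where ``app.py`` is a hypothetical script.)::

    import tracemalloc

    print(tracemalloc.is_tracing())           # True when enabled at startup
    print(tracemalloc.get_traceback_limit())  # 25 for the commands above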
+ The value of the variable is the maximum number of frames stored in a trace: + see the :func:`tracemalloc.set_traceback_limit` function. + + .. versionadded:: 3.4 + + Debug-mode variables ~~~~~~~~~~~~~~~~~~~~ diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -2154,3 +2154,22 @@ # actually override the attribute setattr(object_to_patch, attr_name, new_value) + + +def run_in_subinterp(code): + """ + Run code in a subinterpreter. Raise unittest.SkipTest if the tracemalloc + module is enabled. + """ + # Issue #10915, #15751: PyGILState_*() functions don't work with + # sub-interpreters, the tracemalloc module uses these functions internally + try: + import tracemalloc + except ImportError: + pass + else: + if tracemalloc.is_tracing(): + raise unittest.SkipTest("run_in_subinterp() cannot be used " + "if tracemalloc module is tracing " + "memory allocations") + return _testcapi.run_in_subinterp(code) diff --git a/Lib/test/test_atexit.py b/Lib/test/test_atexit.py --- a/Lib/test/test_atexit.py +++ b/Lib/test/test_atexit.py @@ -158,7 +158,7 @@ atexit.register(f) del atexit """ - ret = _testcapi.run_in_subinterp(code) + ret = support.run_in_subinterp(code) self.assertEqual(ret, 0) self.assertEqual(atexit._ncallbacks(), n) @@ -173,7 +173,7 @@ atexit.register(f) atexit.__atexit = atexit """ - ret = _testcapi.run_in_subinterp(code) + ret = support.run_in_subinterp(code) self.assertEqual(ret, 0) self.assertEqual(atexit._ncallbacks(), n) diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -205,7 +205,7 @@ pickle.dump(id(builtins), f) """.format(w) with open(r, "rb") as f: - ret = _testcapi.run_in_subinterp(code) + ret = support.run_in_subinterp(code) self.assertEqual(ret, 0) self.assertNotEqual(pickle.load(f), id(sys.modules)) self.assertNotEqual(pickle.load(f), id(builtins)) diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -853,7 +853,7 @@ os.write(%d, b"x") threading.Thread(target=f).start() """ % (w,) - ret = _testcapi.run_in_subinterp(code) + ret = test.support.run_in_subinterp(code) self.assertEqual(ret, 0) # The thread was joined properly. self.assertEqual(os.read(r, 1), b"x") @@ -885,7 +885,7 @@ os.write(%d, b"x") threading.Thread(target=f).start() """ % (w,) - ret = _testcapi.run_in_subinterp(code) + ret = test.support.run_in_subinterp(code) self.assertEqual(ret, 0) # The thread was joined properly. 
self.assertEqual(os.read(r, 1), b"x") diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py new file mode 100644 --- /dev/null +++ b/Lib/test/test_tracemalloc.py @@ -0,0 +1,797 @@ +import _tracemalloc +import contextlib +import datetime +import os +import sys +import tracemalloc +import unittest +from unittest.mock import patch +from test.script_helper import assert_python_ok, assert_python_failure +from test import support +try: + import threading +except ImportError: + threading = None + +EMPTY_STRING_SIZE = sys.getsizeof(b'') + +def get_frames(nframe, lineno_delta): + frames = [] + frame = sys._getframe(1) + for index in range(nframe): + code = frame.f_code + lineno = frame.f_lineno + lineno_delta + frames.append((code.co_filename, lineno)) + lineno_delta = 0 + frame = frame.f_back + if frame is None: + break + return tuple(frames) + +def allocate_bytes(size): + nframe = tracemalloc.get_traceback_limit() + bytes_len = (size - EMPTY_STRING_SIZE) + frames = get_frames(nframe, 1) + data = b'x' * bytes_len + return data, tracemalloc.Traceback(frames) + +def create_snapshots(): + traceback_limit = 2 + + raw_traces = [ + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + + (2, (('a.py', 5), ('b.py', 4))), + + (66, (('b.py', 1),)), + + (7, (('', 0),)), + ] + snapshot = tracemalloc.Snapshot(raw_traces, traceback_limit) + + raw_traces2 = [ + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + + (2, (('a.py', 5), ('b.py', 4))), + (5000, (('a.py', 5), ('b.py', 4))), + + (400, (('c.py', 578),)), + ] + snapshot2 = tracemalloc.Snapshot(raw_traces2, traceback_limit) + + return (snapshot, snapshot2) + +def frame(filename, lineno): + return tracemalloc._Frame((filename, lineno)) + +def traceback(*frames): + return tracemalloc.Traceback(frames) + +def traceback_lineno(filename, lineno): + return traceback((filename, lineno)) + +def traceback_filename(filename): + return traceback_lineno(filename, 0) + + +class TestTracemallocEnabled(unittest.TestCase): + def setUp(self): + if tracemalloc.is_tracing(): + self.skipTest("tracemalloc must be stopped before the test") + + tracemalloc.set_traceback_limit(1) + tracemalloc.start() + + def tearDown(self): + tracemalloc.stop() + + def test_get_tracemalloc_memory(self): + data = [allocate_bytes(123) for count in range(1000)] + size = tracemalloc.get_tracemalloc_memory() + self.assertGreaterEqual(size, 0) + + tracemalloc.clear_traces() + size2 = tracemalloc.get_tracemalloc_memory() + self.assertGreaterEqual(size2, 0) + self.assertLessEqual(size2, size) + + def test_get_object_traceback(self): + tracemalloc.clear_traces() + obj_size = 12345 + obj, obj_traceback = allocate_bytes(obj_size) + traceback = tracemalloc.get_object_traceback(obj) + self.assertEqual(traceback, obj_traceback) + + def test_set_traceback_limit(self): + obj_size = 10 + + nframe = tracemalloc.get_traceback_limit() + self.addCleanup(tracemalloc.set_traceback_limit, nframe) + + self.assertRaises(ValueError, tracemalloc.set_traceback_limit, -1) + + tracemalloc.clear_traces() + tracemalloc.set_traceback_limit(10) + obj2, obj2_traceback = allocate_bytes(obj_size) + traceback = tracemalloc.get_object_traceback(obj2) + self.assertEqual(len(traceback), 10) + self.assertEqual(traceback, obj2_traceback) + + tracemalloc.clear_traces() + tracemalloc.set_traceback_limit(1) + obj, obj_traceback = allocate_bytes(obj_size) + traceback = tracemalloc.get_object_traceback(obj) + 
self.assertEqual(len(traceback), 1) + self.assertEqual(traceback, obj_traceback) + + + def find_trace(self, traces, traceback): + for trace in traces: + if trace[1] == traceback._frames: + return trace + + self.fail("trace not found") + + def test_get_traces(self): + tracemalloc.clear_traces() + obj_size = 12345 + obj, obj_traceback = allocate_bytes(obj_size) + + traces = tracemalloc._get_traces() + trace = self.find_trace(traces, obj_traceback) + + self.assertIsInstance(trace, tuple) + size, traceback = trace + self.assertEqual(size, obj_size) + self.assertEqual(traceback, obj_traceback._frames) + + tracemalloc.stop() + self.assertEqual(tracemalloc._get_traces(), []) + + + def test_get_traces_intern_traceback(self): + # dummy wrappers to get more useful and identical frames in the traceback + def allocate_bytes2(size): + return allocate_bytes(size) + def allocate_bytes3(size): + return allocate_bytes2(size) + def allocate_bytes4(size): + return allocate_bytes3(size) + + # Ensure that two identical tracebacks are not duplicated + tracemalloc.clear_traces() + tracemalloc.set_traceback_limit(4) + obj_size = 123 + obj1, obj1_traceback = allocate_bytes4(obj_size) + obj2, obj2_traceback = allocate_bytes4(obj_size) + + traces = tracemalloc._get_traces() + + trace1 = self.find_trace(traces, obj1_traceback) + trace2 = self.find_trace(traces, obj2_traceback) + size1, traceback1 = trace1 + size2, traceback2 = trace2 + self.assertEqual(traceback2, traceback1) + self.assertIs(traceback2, traceback1) + + def test_get_traced_memory(self): + # Python allocates some internals objects, so the test must tolerate + # a small difference between the expected size and the real usage + max_error = 2048 + + # allocate one object + obj_size = 1024 * 1024 + tracemalloc.clear_traces() + obj, obj_traceback = allocate_bytes(obj_size) + size, max_size = tracemalloc.get_traced_memory() + self.assertGreaterEqual(size, obj_size) + self.assertGreaterEqual(max_size, size) + + self.assertLessEqual(size - obj_size, max_error) + self.assertLessEqual(max_size - size, max_error) + + # destroy the object + obj = None + size2, max_size2 = tracemalloc.get_traced_memory() + self.assertLess(size2, size) + self.assertGreaterEqual(size - size2, obj_size - max_error) + self.assertGreaterEqual(max_size2, max_size) + + # clear_traces() must reset traced memory counters + tracemalloc.clear_traces() + self.assertEqual(tracemalloc.get_traced_memory(), (0, 0)) + + # allocate another object + obj, obj_traceback = allocate_bytes(obj_size) + size, max_size = tracemalloc.get_traced_memory() + self.assertGreater(size, 0) + + # stop() rests also traced memory counters + tracemalloc.stop() + self.assertEqual(tracemalloc.get_traced_memory(), (0, 0)) + + def test_clear_traces(self): + obj, obj_traceback = allocate_bytes(123) + traceback = tracemalloc.get_object_traceback(obj) + self.assertIsNotNone(traceback) + + tracemalloc.clear_traces() + traceback2 = tracemalloc.get_object_traceback(obj) + self.assertIsNone(traceback2) + + def test_is_tracing(self): + tracemalloc.stop() + self.assertFalse(tracemalloc.is_tracing()) + + tracemalloc.start() + self.assertTrue(tracemalloc.is_tracing()) + + def test_snapshot(self): + obj, source = allocate_bytes(123) + + # take a snapshot + snapshot = tracemalloc.take_snapshot() + + # write on disk + snapshot.dump(support.TESTFN) + self.addCleanup(support.unlink, support.TESTFN) + + # load from disk + snapshot2 = tracemalloc.Snapshot.load(support.TESTFN) + self.assertEqual(snapshot2.traces, snapshot.traces) + + # 
tracemalloc must be tracing memory allocations to take a snapshot + tracemalloc.stop() + with self.assertRaises(RuntimeError) as cm: + tracemalloc.take_snapshot() + self.assertEqual(str(cm.exception), + "the tracemalloc module must be tracing memory " + "allocations to take a snapshot") + + def test_snapshot_save_attr(self): + # take a snapshot with a new attribute + snapshot = tracemalloc.take_snapshot() + snapshot.test_attr = "new" + snapshot.dump(support.TESTFN) + self.addCleanup(support.unlink, support.TESTFN) + + # load() should recreates the attribute + snapshot2 = tracemalloc.Snapshot.load(support.TESTFN) + self.assertEqual(snapshot2.test_attr, "new") + + def fork_child(self): + if not tracemalloc.is_tracing(): + return 2 + + obj_size = 12345 + obj, obj_traceback = allocate_bytes(obj_size) + traceback = tracemalloc.get_object_traceback(obj) + if traceback is None: + return 3 + + # everything is fine + return 0 + + @unittest.skipUnless(hasattr(os, 'fork'), 'need os.fork()') + def test_fork(self): + # check that tracemalloc is still working after fork + pid = os.fork() + if not pid: + # child + exitcode = 1 + try: + exitcode = self.fork_child() + finally: + os._exit(exitcode) + else: + pid2, status = os.waitpid(pid, 0) + self.assertTrue(os.WIFEXITED(status)) + exitcode = os.WEXITSTATUS(status) + self.assertEqual(exitcode, 0) + + +class TestSnapshot(unittest.TestCase): + maxDiff = 4000 + + def test_create_snapshot(self): + raw_traces = [(5, (('a.py', 2),))] + + with contextlib.ExitStack() as stack: + stack.enter_context(patch.object(tracemalloc, 'is_tracing', + return_value=True)) + stack.enter_context(patch.object(tracemalloc, 'get_traceback_limit', + return_value=5)) + stack.enter_context(patch.object(tracemalloc, '_get_traces', + return_value=raw_traces)) + + snapshot = tracemalloc.take_snapshot() + self.assertEqual(snapshot.traceback_limit, 5) + self.assertEqual(len(snapshot.traces), 1) + trace = snapshot.traces[0] + self.assertEqual(trace.size, 5) + self.assertEqual(len(trace.traceback), 1) + self.assertEqual(trace.traceback[0].filename, 'a.py') + self.assertEqual(trace.traceback[0].lineno, 2) + + def test_filter_traces(self): + snapshot, snapshot2 = create_snapshots() + filter1 = tracemalloc.Filter(False, "b.py") + filter2 = tracemalloc.Filter(True, "a.py", 2) + filter3 = tracemalloc.Filter(True, "a.py", 5) + + original_traces = list(snapshot.traces._traces) + + # exclude b.py + snapshot3 = snapshot.filter_traces((filter1,)) + self.assertEqual(snapshot3.traces._traces, [ + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (2, (('a.py', 5), ('b.py', 4))), + (7, (('', 0),)), + ]) + + # filter_traces() must not touch the original snapshot + self.assertEqual(snapshot.traces._traces, original_traces) + + # only include two lines of a.py + snapshot4 = snapshot3.filter_traces((filter2, filter3)) + self.assertEqual(snapshot4.traces._traces, [ + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (10, (('a.py', 2), ('b.py', 4))), + (2, (('a.py', 5), ('b.py', 4))), + ]) + + # No filter: just duplicate the snapshot + snapshot5 = snapshot.filter_traces(()) + self.assertIsNot(snapshot5, snapshot) + self.assertIsNot(snapshot5.traces, snapshot.traces) + self.assertEqual(snapshot5.traces, snapshot.traces) + + def test_snapshot_group_by_line(self): + snapshot, snapshot2 = create_snapshots() + tb_0 = traceback_lineno('', 0) + tb_a_2 = traceback_lineno('a.py', 2) + tb_a_5 = traceback_lineno('a.py', 5) + tb_b_1 = 
traceback_lineno('b.py', 1) + tb_c_578 = traceback_lineno('c.py', 578) + + # stats per file and line + stats1 = snapshot.statistics('lineno') + self.assertEqual(stats1, [ + tracemalloc.Statistic(tb_b_1, 66, 1), + tracemalloc.Statistic(tb_a_2, 30, 3), + tracemalloc.Statistic(tb_0, 7, 1), + tracemalloc.Statistic(tb_a_5, 2, 1), + ]) + + # stats per file and line (2) + stats2 = snapshot2.statistics('lineno') + self.assertEqual(stats2, [ + tracemalloc.Statistic(tb_a_5, 5002, 2), + tracemalloc.Statistic(tb_c_578, 400, 1), + tracemalloc.Statistic(tb_a_2, 30, 3), + ]) + + # stats diff per file and line + statistics = snapshot2.compare_to(snapshot, 'lineno') + self.assertEqual(statistics, [ + tracemalloc.StatisticDiff(tb_a_5, 5002, 5000, 2, 1), + tracemalloc.StatisticDiff(tb_c_578, 400, 400, 1, 1), + tracemalloc.StatisticDiff(tb_b_1, 0, -66, 0, -1), + tracemalloc.StatisticDiff(tb_0, 0, -7, 0, -1), + tracemalloc.StatisticDiff(tb_a_2, 30, 0, 3, 0), + ]) + + def test_snapshot_group_by_file(self): + snapshot, snapshot2 = create_snapshots() + tb_0 = traceback_filename('') + tb_a = traceback_filename('a.py') + tb_b = traceback_filename('b.py') + tb_c = traceback_filename('c.py') + + # stats per file + stats1 = snapshot.statistics('filename') + self.assertEqual(stats1, [ + tracemalloc.Statistic(tb_b, 66, 1), + tracemalloc.Statistic(tb_a, 32, 4), + tracemalloc.Statistic(tb_0, 7, 1), + ]) + + # stats per file (2) + stats2 = snapshot2.statistics('filename') + self.assertEqual(stats2, [ + tracemalloc.Statistic(tb_a, 5032, 5), + tracemalloc.Statistic(tb_c, 400, 1), + ]) + + # stats diff per file + diff = snapshot2.compare_to(snapshot, 'filename') + self.assertEqual(diff, [ + tracemalloc.StatisticDiff(tb_a, 5032, 5000, 5, 1), + tracemalloc.StatisticDiff(tb_c, 400, 400, 1, 1), + tracemalloc.StatisticDiff(tb_b, 0, -66, 0, -1), + tracemalloc.StatisticDiff(tb_0, 0, -7, 0, -1), + ]) + + def test_snapshot_group_by_traceback(self): + snapshot, snapshot2 = create_snapshots() + + # stats per file + tb1 = traceback(('a.py', 2), ('b.py', 4)) + tb2 = traceback(('a.py', 5), ('b.py', 4)) + tb3 = traceback(('b.py', 1)) + tb4 = traceback(('', 0)) + stats1 = snapshot.statistics('traceback') + self.assertEqual(stats1, [ + tracemalloc.Statistic(tb3, 66, 1), + tracemalloc.Statistic(tb1, 30, 3), + tracemalloc.Statistic(tb4, 7, 1), + tracemalloc.Statistic(tb2, 2, 1), + ]) + + # stats per file (2) + tb5 = traceback(('c.py', 578)) + stats2 = snapshot2.statistics('traceback') + self.assertEqual(stats2, [ + tracemalloc.Statistic(tb2, 5002, 2), + tracemalloc.Statistic(tb5, 400, 1), + tracemalloc.Statistic(tb1, 30, 3), + ]) + + # stats diff per file + diff = snapshot2.compare_to(snapshot, 'traceback') + self.assertEqual(diff, [ + tracemalloc.StatisticDiff(tb2, 5002, 5000, 2, 1), + tracemalloc.StatisticDiff(tb5, 400, 400, 1, 1), + tracemalloc.StatisticDiff(tb3, 0, -66, 0, -1), + tracemalloc.StatisticDiff(tb4, 0, -7, 0, -1), + tracemalloc.StatisticDiff(tb1, 30, 0, 3, 0), + ]) + + self.assertRaises(ValueError, + snapshot.statistics, 'traceback', cumulative=True) + + def test_snapshot_group_by_cumulative(self): + snapshot, snapshot2 = create_snapshots() + tb_0 = traceback_filename('') + tb_a = traceback_filename('a.py') + tb_b = traceback_filename('b.py') + tb_a_2 = traceback_lineno('a.py', 2) + tb_a_5 = traceback_lineno('a.py', 5) + tb_b_1 = traceback_lineno('b.py', 1) + tb_b_4 = traceback_lineno('b.py', 4) + + # per file + stats = snapshot.statistics('filename', True) + self.assertEqual(stats, [ + tracemalloc.Statistic(tb_b, 98, 5), + 
tracemalloc.Statistic(tb_a, 32, 4), + tracemalloc.Statistic(tb_0, 7, 1), + ]) + + # per line + stats = snapshot.statistics('lineno', True) + self.assertEqual(stats, [ + tracemalloc.Statistic(tb_b_1, 66, 1), + tracemalloc.Statistic(tb_b_4, 32, 4), + tracemalloc.Statistic(tb_a_2, 30, 3), + tracemalloc.Statistic(tb_0, 7, 1), + tracemalloc.Statistic(tb_a_5, 2, 1), + ]) + + def test_trace_format(self): + snapshot, snapshot2 = create_snapshots() + trace = snapshot.traces[0] + self.assertEqual(str(trace), 'a.py:2: 10 B') + traceback = trace.traceback + self.assertEqual(str(traceback), 'a.py:2') + frame = traceback[0] + self.assertEqual(str(frame), 'a.py:2') + + def test_statistic_format(self): + snapshot, snapshot2 = create_snapshots() + stats = snapshot.statistics('lineno') + stat = stats[0] + self.assertEqual(str(stat), + 'b.py:1: size=66 B, count=1, average=66 B') + + def test_statistic_diff_format(self): + snapshot, snapshot2 = create_snapshots() + stats = snapshot2.compare_to(snapshot, 'lineno') + stat = stats[0] + self.assertEqual(str(stat), + 'a.py:5: size=5002 B (+5000 B), count=2 (+1), average=2501 B') + + + +class TestFilters(unittest.TestCase): + maxDiff = 2048 + + def test_filter_attributes(self): + # test default values + f = tracemalloc.Filter(True, "abc") + self.assertEqual(f.inclusive, True) + self.assertEqual(f.filename_pattern, "abc") + self.assertIsNone(f.lineno) + self.assertEqual(f.all_frames, False) + + # test custom values + f = tracemalloc.Filter(False, "test.py", 123, True) + self.assertEqual(f.inclusive, False) + self.assertEqual(f.filename_pattern, "test.py") + self.assertEqual(f.lineno, 123) + self.assertEqual(f.all_frames, True) + + # parameters passed by keyword + f = tracemalloc.Filter(inclusive=False, filename_pattern="test.py", lineno=123, all_frames=True) + self.assertEqual(f.inclusive, False) + self.assertEqual(f.filename_pattern, "test.py") + self.assertEqual(f.lineno, 123) + self.assertEqual(f.all_frames, True) + + # read-only attribute + self.assertRaises(AttributeError, setattr, f, "filename_pattern", "abc") + + def test_filter_match(self): + # filter without line number + f = tracemalloc.Filter(True, "abc") + self.assertTrue(f._match_frame("abc", 0)) + self.assertTrue(f._match_frame("abc", 5)) + self.assertTrue(f._match_frame("abc", 10)) + self.assertFalse(f._match_frame("12356", 0)) + self.assertFalse(f._match_frame("12356", 5)) + self.assertFalse(f._match_frame("12356", 10)) + + f = tracemalloc.Filter(False, "abc") + self.assertFalse(f._match_frame("abc", 0)) + self.assertFalse(f._match_frame("abc", 5)) + self.assertFalse(f._match_frame("abc", 10)) + self.assertTrue(f._match_frame("12356", 0)) + self.assertTrue(f._match_frame("12356", 5)) + self.assertTrue(f._match_frame("12356", 10)) + + # filter with line number > 0 + f = tracemalloc.Filter(True, "abc", 5) + self.assertFalse(f._match_frame("abc", 0)) + self.assertTrue(f._match_frame("abc", 5)) + self.assertFalse(f._match_frame("abc", 10)) + self.assertFalse(f._match_frame("12356", 0)) + self.assertFalse(f._match_frame("12356", 5)) + self.assertFalse(f._match_frame("12356", 10)) + + f = tracemalloc.Filter(False, "abc", 5) + self.assertTrue(f._match_frame("abc", 0)) + self.assertFalse(f._match_frame("abc", 5)) + self.assertTrue(f._match_frame("abc", 10)) + self.assertTrue(f._match_frame("12356", 0)) + self.assertTrue(f._match_frame("12356", 5)) + self.assertTrue(f._match_frame("12356", 10)) + + # filter with line number 0 + f = tracemalloc.Filter(True, "abc", 0) + self.assertTrue(f._match_frame("abc", 0)) 
+ self.assertFalse(f._match_frame("abc", 5)) + self.assertFalse(f._match_frame("abc", 10)) + self.assertFalse(f._match_frame("12356", 0)) + self.assertFalse(f._match_frame("12356", 5)) + self.assertFalse(f._match_frame("12356", 10)) + + f = tracemalloc.Filter(False, "abc", 0) + self.assertFalse(f._match_frame("abc", 0)) + self.assertTrue(f._match_frame("abc", 5)) + self.assertTrue(f._match_frame("abc", 10)) + self.assertTrue(f._match_frame("12356", 0)) + self.assertTrue(f._match_frame("12356", 5)) + self.assertTrue(f._match_frame("12356", 10)) + + def test_filter_match_filename(self): + def fnmatch(inclusive, filename, pattern): + f = tracemalloc.Filter(inclusive, pattern) + return f._match_frame(filename, 0) + + self.assertTrue(fnmatch(True, "abc", "abc")) + self.assertFalse(fnmatch(True, "12356", "abc")) + self.assertFalse(fnmatch(True, "", "abc")) + + self.assertFalse(fnmatch(False, "abc", "abc")) + self.assertTrue(fnmatch(False, "12356", "abc")) + self.assertTrue(fnmatch(False, "", "abc")) + + def test_filter_match_filename_joker(self): + def fnmatch(filename, pattern): + filter = tracemalloc.Filter(True, pattern) + return filter._match_frame(filename, 0) + + # empty string + self.assertFalse(fnmatch('abc', '')) + self.assertFalse(fnmatch('', 'abc')) + self.assertTrue(fnmatch('', '')) + self.assertTrue(fnmatch('', '*')) + + # no * + self.assertTrue(fnmatch('abc', 'abc')) + self.assertFalse(fnmatch('abc', 'abcd')) + self.assertFalse(fnmatch('abc', 'def')) + + # a* + self.assertTrue(fnmatch('abc', 'a*')) + self.assertTrue(fnmatch('abc', 'abc*')) + self.assertFalse(fnmatch('abc', 'b*')) + self.assertFalse(fnmatch('abc', 'abcd*')) + + # a*b + self.assertTrue(fnmatch('abc', 'a*c')) + self.assertTrue(fnmatch('abcdcx', 'a*cx')) + self.assertFalse(fnmatch('abb', 'a*c')) + self.assertFalse(fnmatch('abcdce', 'a*cx')) + + # a*b*c + self.assertTrue(fnmatch('abcde', 'a*c*e')) + self.assertTrue(fnmatch('abcbdefeg', 'a*bd*eg')) + self.assertFalse(fnmatch('abcdd', 'a*c*e')) + self.assertFalse(fnmatch('abcbdefef', 'a*bd*eg')) + + # replace .pyc and .pyo suffix with .py + self.assertTrue(fnmatch('a.pyc', 'a.py')) + self.assertTrue(fnmatch('a.pyo', 'a.py')) + self.assertTrue(fnmatch('a.py', 'a.pyc')) + self.assertTrue(fnmatch('a.py', 'a.pyo')) + + if os.name == 'nt': + # case insensitive + self.assertTrue(fnmatch('aBC', 'ABc')) + self.assertTrue(fnmatch('aBcDe', 'Ab*dE')) + + self.assertTrue(fnmatch('a.pyc', 'a.PY')) + self.assertTrue(fnmatch('a.PYO', 'a.py')) + self.assertTrue(fnmatch('a.py', 'a.PYC')) + self.assertTrue(fnmatch('a.PY', 'a.pyo')) + else: + # case sensitive + self.assertFalse(fnmatch('aBC', 'ABc')) + self.assertFalse(fnmatch('aBcDe', 'Ab*dE')) + + self.assertFalse(fnmatch('a.pyc', 'a.PY')) + self.assertFalse(fnmatch('a.PYO', 'a.py')) + self.assertFalse(fnmatch('a.py', 'a.PYC')) + self.assertFalse(fnmatch('a.PY', 'a.pyo')) + + if os.name == 'nt': + # normalize alternate separator "/" to the standard separator "\" + self.assertTrue(fnmatch(r'a/b', r'a\b')) + self.assertTrue(fnmatch(r'a\b', r'a/b')) + self.assertTrue(fnmatch(r'a/b\c', r'a\b/c')) + self.assertTrue(fnmatch(r'a/b/c', r'a\b\c')) + else: + # there is no alternate separator + self.assertFalse(fnmatch(r'a/b', r'a\b')) + self.assertFalse(fnmatch(r'a\b', r'a/b')) + self.assertFalse(fnmatch(r'a/b\c', r'a\b/c')) + self.assertFalse(fnmatch(r'a/b/c', r'a\b\c')) + + def test_filter_match_trace(self): + t1 = (("a.py", 2), ("b.py", 3)) + t2 = (("b.py", 4), ("b.py", 5)) + t3 = (("c.py", 5), ('', 0)) + unknown = (('', 0),) + + f = 
tracemalloc.Filter(True, "b.py", all_frames=True) + self.assertTrue(f._match_traceback(t1)) + self.assertTrue(f._match_traceback(t2)) + self.assertFalse(f._match_traceback(t3)) + self.assertFalse(f._match_traceback(unknown)) + + f = tracemalloc.Filter(True, "b.py", all_frames=False) + self.assertFalse(f._match_traceback(t1)) + self.assertTrue(f._match_traceback(t2)) + self.assertFalse(f._match_traceback(t3)) + self.assertFalse(f._match_traceback(unknown)) + + f = tracemalloc.Filter(False, "b.py", all_frames=True) + self.assertFalse(f._match_traceback(t1)) + self.assertFalse(f._match_traceback(t2)) + self.assertTrue(f._match_traceback(t3)) + self.assertTrue(f._match_traceback(unknown)) + + f = tracemalloc.Filter(False, "b.py", all_frames=False) + self.assertTrue(f._match_traceback(t1)) + self.assertFalse(f._match_traceback(t2)) + self.assertTrue(f._match_traceback(t3)) + self.assertTrue(f._match_traceback(unknown)) + + f = tracemalloc.Filter(False, "", all_frames=False) + self.assertTrue(f._match_traceback(t1)) + self.assertTrue(f._match_traceback(t2)) + self.assertTrue(f._match_traceback(t3)) + self.assertFalse(f._match_traceback(unknown)) + + f = tracemalloc.Filter(True, "", all_frames=True) + self.assertFalse(f._match_traceback(t1)) + self.assertFalse(f._match_traceback(t2)) + self.assertTrue(f._match_traceback(t3)) + self.assertTrue(f._match_traceback(unknown)) + + f = tracemalloc.Filter(False, "", all_frames=True) + self.assertTrue(f._match_traceback(t1)) + self.assertTrue(f._match_traceback(t2)) + self.assertFalse(f._match_traceback(t3)) + self.assertFalse(f._match_traceback(unknown)) + + +class TestCommandLine(unittest.TestCase): + def test_env_var(self): + # not tracing by default + code = 'import tracemalloc; print(tracemalloc.is_tracing())' + ok, stdout, stderr = assert_python_ok('-c', code) + stdout = stdout.rstrip() + self.assertEqual(stdout, b'False') + + # PYTHON* environment varibles must be ignored when -E option is + # present + code = 'import tracemalloc; print(tracemalloc.is_tracing())' + ok, stdout, stderr = assert_python_ok('-E', '-c', code, PYTHONTRACEMALLOC='1') + stdout = stdout.rstrip() + self.assertEqual(stdout, b'False') + + # tracing at startup + code = 'import tracemalloc; print(tracemalloc.is_tracing())' + ok, stdout, stderr = assert_python_ok('-c', code, PYTHONTRACEMALLOC='1') + stdout = stdout.rstrip() + self.assertEqual(stdout, b'True') + + # start and set the number of frames + code = 'import tracemalloc; print(tracemalloc.get_traceback_limit())' + ok, stdout, stderr = assert_python_ok('-c', code, PYTHONTRACEMALLOC='10') + stdout = stdout.rstrip() + self.assertEqual(stdout, b'10') + + def test_env_var_invalid(self): + for nframe in (-1, 0, 5000): + with self.subTest(nframe=nframe): + with support.SuppressCrashReport(): + ok, stdout, stderr = assert_python_failure( + '-c', 'pass', + PYTHONTRACEMALLOC=str(nframe)) + self.assertIn(b'PYTHONTRACEMALLOC must be an integer ' + b'in range [1; 100]', + stderr) + + def test_sys_xoptions(self): + for xoptions, nframe in ( + ('tracemalloc', 1), + ('tracemalloc=1', 1), + ('tracemalloc=15', 15), + ): + with self.subTest(xoptions=xoptions, nframe=nframe): + code = 'import tracemalloc; print(tracemalloc.get_traceback_limit())' + ok, stdout, stderr = assert_python_ok('-X', xoptions, '-c', code) + stdout = stdout.rstrip() + self.assertEqual(stdout, str(nframe).encode('ascii')) + + def test_sys_xoptions_invalid(self): + for nframe in (-1, 0, 5000): + with self.subTest(nframe=nframe): + with support.SuppressCrashReport(): + 
args = ('-X', 'tracemalloc=%s' % nframe, '-c', 'pass') + ok, stdout, stderr = assert_python_failure(*args) + self.assertIn(b'-X tracemalloc=NFRAME: number of frame must ' + b'be an integer in range [1; 100]', + stderr) + + +def test_main(): + support.run_unittest( + TestTracemallocEnabled, + TestSnapshot, + TestFilters, + TestCommandLine, + ) + +if __name__ == "__main__": + test_main() diff --git a/Lib/tracemalloc.py b/Lib/tracemalloc.py new file mode 100644 --- /dev/null +++ b/Lib/tracemalloc.py @@ -0,0 +1,464 @@ +from collections import Sequence +from functools import total_ordering +import fnmatch +import os.path +import pickle + +# Import types and functions implemented in C +from _tracemalloc import * +from _tracemalloc import _get_object_traceback, _get_traces + + +def _format_size(size, sign): + for unit in ('B', 'KiB', 'MiB', 'GiB', 'TiB'): + if abs(size) < 100 and unit != 'B': + # 3 digits (xx.x UNIT) + if sign: + return "%+.1f %s" % (size, unit) + else: + return "%.1f %s" % (size, unit) + if abs(size) < 10 * 1024 or unit == 'TiB': + # 4 or 5 digits (xxxx UNIT) + if sign: + return "%+.0f %s" % (size, unit) + else: + return "%.0f %s" % (size, unit) + size /= 1024 + + +class Statistic: + """ + Statistic difference on memory allocations between two Snapshot instance. + """ + + __slots__ = ('traceback', 'size', 'count') + + def __init__(self, traceback, size, count): + self.traceback = traceback + self.size = size + self.count = count + + def __hash__(self): + return (self.traceback, self.size, self.count) + + def __eq__(self, other): + return (self.traceback == other.traceback + and self.size == other.size + and self.count == other.count) + + def __str__(self): + text = ("%s: size=%s, count=%i" + % (self.traceback, + _format_size(self.size, False), + self.count)) + if self.count: + average = self.size / self.count + text += ", average=%s" % _format_size(average, False) + return text + + def __repr__(self): + return ('' + % (self.traceback, self.size, self.count)) + + def _sort_key(self): + return (self.size, self.count, self.traceback) + + +class StatisticDiff: + """ + Statistic difference on memory allocations between an old and a new + Snapshot instance. 
+ """ + __slots__ = ('traceback', 'size', 'size_diff', 'count', 'count_diff') + + def __init__(self, traceback, size, size_diff, count, count_diff): + self.traceback = traceback + self.size = size + self.size_diff = size_diff + self.count = count + self.count_diff = count_diff + + def __hash__(self): + return (self.traceback, self.size, self.size_diff, + self.count, self.count_diff) + + def __eq__(self, other): + return (self.traceback == other.traceback + and self.size == other.size + and self.size_diff == other.size_diff + and self.count == other.count + and self.count_diff == other.count_diff) + + def __str__(self): + text = ("%s: size=%s (%s), count=%i (%+i)" + % (self.traceback, + _format_size(self.size, False), + _format_size(self.size_diff, True), + self.count, + self.count_diff)) + if self.count: + average = self.size / self.count + text += ", average=%s" % _format_size(average, False) + return text + + def __repr__(self): + return ('' + % (self.traceback, self.size, self.size_diff, + + self.count, self.count_diff)) + + def _sort_key(self): + return (abs(self.size_diff), self.size, + abs(self.count_diff), self.count, + self.traceback) + + +def _compare_grouped_stats(old_group, new_group): + statistics = [] + for traceback, stat in new_group.items(): + previous = old_group.pop(traceback, None) + if previous is not None: + stat = StatisticDiff(traceback, + stat.size, stat.size - previous.size, + stat.count, stat.count - previous.count) + else: + stat = StatisticDiff(traceback, + stat.size, stat.size, + stat.count, stat.count) + statistics.append(stat) + + for traceback, stat in old_group.items(): + stat = StatisticDiff(traceback, 0, -stat.size, 0, -stat.count) + statistics.append(stat) + return statistics + + + at total_ordering +class Frame: + """ + Frame of a traceback. + """ + __slots__ = ("_frame",) + + def __init__(self, frame): + self._frame = frame + + @property + def filename(self): + return self._frame[0] + + @property + def lineno(self): + return self._frame[1] + + def __eq__(self, other): + return (self._frame == other._frame) + + def __lt__(self, other): + return (self._frame < other._frame) + + def __hash__(self): + return hash(self._frame) + + def __str__(self): + return "%s:%s" % (self.filename, self.lineno) + + def __repr__(self): + return "" % (self.filename, self.lineno) + + + at total_ordering +class Traceback(Sequence): + """ + Sequence of Frame instances sorted from the most recent frame + to the oldest frame. + """ + __slots__ = ("_frames",) + + def __init__(self, frames): + Sequence.__init__(self) + self._frames = frames + + def __len__(self): + return len(self._frames) + + def __getitem__(self, index): + trace = self._frames[index] + return Frame(trace) + + def __contains__(self, frame): + return frame._frame in self._frames + + def __hash__(self): + return hash(self._frames) + + def __eq__(self, other): + return (self._frames == other._frames) + + def __lt__(self, other): + return (self._frames < other._frames) + + def __str__(self): + return str(self[0]) + + def __repr__(self): + return "" % (tuple(self),) + + +def get_object_traceback(obj): + """ + Get the traceback where the Python object *obj* was allocated. + Return a Traceback instance. + + Return None if the tracemalloc module is not tracing memory allocations or + did not trace the allocation of the object. + """ + frames = _get_object_traceback(obj) + if frames is not None: + return Traceback(frames) + else: + return None + + +class Trace: + """ + Trace of a memory block. 
+ """ + __slots__ = ("_trace",) + + def __init__(self, trace): + self._trace = trace + + @property + def size(self): + return self._trace[0] + + @property + def traceback(self): + return Traceback(self._trace[1]) + + def __eq__(self, other): + return (self._trace == other._trace) + + def __hash__(self): + return hash(self._trace) + + def __str__(self): + return "%s: %s" % (self.traceback, _format_size(self.size, False)) + + def __repr__(self): + return ("" + % (_format_size(self.size, False), self.traceback)) + + +class _Traces(Sequence): + def __init__(self, traces): + Sequence.__init__(self) + self._traces = traces + + def __len__(self): + return len(self._traces) + + def __getitem__(self, index): + trace = self._traces[index] + return Trace(trace) + + def __contains__(self, trace): + return trace._trace in self._traces + + def __eq__(self, other): + return (self._traces == other._traces) + + def __repr__(self): + return "" % len(self) + + +def _normalize_filename(filename): + filename = os.path.normcase(filename) + if filename.endswith(('.pyc', '.pyo')): + filename = filename[:-1] + return filename + + +class Filter: + def __init__(self, inclusive, filename_pattern, + lineno=None, all_frames=False): + self.inclusive = inclusive + self._filename_pattern = _normalize_filename(filename_pattern) + self.lineno = lineno + self.all_frames = all_frames + + @property + def filename_pattern(self): + return self._filename_pattern + + def __match_frame(self, filename, lineno): + filename = _normalize_filename(filename) + if not fnmatch.fnmatch(filename, self._filename_pattern): + return False + if self.lineno is None: + return True + else: + return (lineno == self.lineno) + + def _match_frame(self, filename, lineno): + return self.__match_frame(filename, lineno) ^ (not self.inclusive) + + def _match_traceback(self, traceback): + if self.all_frames: + if any(self.__match_frame(filename, lineno) + for filename, lineno in traceback): + return self.inclusive + else: + return (not self.inclusive) + else: + filename, lineno = traceback[0] + return self._match_frame(filename, lineno) + + +class Snapshot: + """ + Snapshot of traces of memory blocks allocated by Python. + """ + + def __init__(self, traces, traceback_limit): + self.traces = _Traces(traces) + self.traceback_limit = traceback_limit + + def dump(self, filename): + """ + Write the snapshot into a file. + """ + with open(filename, "wb") as fp: + pickle.dump(self, fp, pickle.HIGHEST_PROTOCOL) + + @staticmethod + def load(filename): + """ + Load a snapshot from a file. + """ + with open(filename, "rb") as fp: + return pickle.load(fp) + + def _filter_trace(self, include_filters, exclude_filters, trace): + traceback = trace[1] + if include_filters: + if not any(trace_filter._match_traceback(traceback) + for trace_filter in include_filters): + return False + if exclude_filters: + if any(not trace_filter._match_traceback(traceback) + for trace_filter in exclude_filters): + return False + return True + + def filter_traces(self, filters): + """ + Create a new Snapshot instance with a filtered traces sequence, filters + is a list of Filter instances. If filters is an empty list, return a + new Snapshot instance with a copy of the traces. 
+ """ + if filters: + include_filters = [] + exclude_filters = [] + for trace_filter in filters: + if trace_filter.inclusive: + include_filters.append(trace_filter) + else: + exclude_filters.append(trace_filter) + new_traces = [trace for trace in self.traces._traces + if self._filter_trace(include_filters, + exclude_filters, + trace)] + else: + new_traces = self.traces._traces.copy() + return Snapshot(new_traces, self.traceback_limit) + + def _group_by(self, key_type, cumulative): + if key_type not in ('traceback', 'filename', 'lineno'): + raise ValueError("unknown key_type: %r" % (key_type,)) + if cumulative and key_type not in ('lineno', 'filename'): + raise ValueError("cumulative mode cannot by used " + "with key type %r" % key_type) + if cumulative and self.traceback_limit < 2: + raise ValueError("cumulative mode needs tracebacks with at least " + "2 frames, traceback limit is %s" + % self.traceback_limit) + + stats = {} + tracebacks = {} + if not cumulative: + for trace in self.traces._traces: + size, trace_traceback = trace + try: + traceback = tracebacks[trace_traceback] + except KeyError: + if key_type == 'traceback': + frames = trace_traceback + elif key_type == 'lineno': + frames = trace_traceback[:1] + else: # key_type == 'filename': + frames = ((trace_traceback[0][0], 0),) + traceback = Traceback(frames) + tracebacks[trace_traceback] = traceback + try: + stat = stats[traceback] + stat.size += size + stat.count += 1 + except KeyError: + stats[traceback] = Statistic(traceback, size, 1) + else: + # cumulative statistics + for trace in self.traces._traces: + size, trace_traceback = trace + for frame in trace_traceback: + try: + traceback = tracebacks[frame] + except KeyError: + if key_type == 'lineno': + frames = (frame,) + else: # key_type == 'filename': + frames = ((frame[0], 0),) + traceback = Traceback(frames) + tracebacks[frame] = traceback + try: + stat = stats[traceback] + stat.size += size + stat.count += 1 + except KeyError: + stats[traceback] = Statistic(traceback, size, 1) + return stats + + def statistics(self, key_type, cumulative=False): + """ + Group statistics by key_type. Return a sorted list of Statistic + instances. + """ + grouped = self._group_by(key_type, cumulative) + statistics = list(grouped.values()) + statistics.sort(reverse=True, key=Statistic._sort_key) + return statistics + + def compare_to(self, old_snapshot, key_type, cumulative=False): + """ + Compute the differences with an old snapshot old_snapshot. Get + statistics as a sorted list of StatisticDiff instances, grouped by + group_by. + """ + new_group = self._group_by(key_type, cumulative) + old_group = old_snapshot._group_by(key_type, cumulative) + statistics = _compare_grouped_stats(old_group, new_group) + statistics.sort(reverse=True, key=StatisticDiff._sort_key) + return statistics + + +def take_snapshot(): + """ + Take a snapshot of traces of memory blocks allocated by Python. + """ + if not is_tracing(): + raise RuntimeError("the tracemalloc module must be tracing memory " + "allocations to take a snapshot") + traces = _get_traces() + traceback_limit = get_traceback_limit() + return Snapshot(traces, traceback_limit) diff --git a/Modules/Setup.dist b/Modules/Setup.dist --- a/Modules/Setup.dist +++ b/Modules/Setup.dist @@ -102,7 +102,7 @@ # various reasons; therefore they are listed here instead of in the # normal order. 
-# This only contains the minimal set of modules required to run the +# This only contains the minimal set of modules required to run the # setup.py script in the root of the Python source tree. posix posixmodule.c # posix (UNIX) system calls @@ -115,7 +115,7 @@ _functools _functoolsmodule.c # Tools for working with functions and callable objects _operator _operator.c # operator.add() and similar goodies _collections _collectionsmodule.c # Container types -itertools itertoolsmodule.c # Functions creating iterators for efficient looping +itertools itertoolsmodule.c # Functions creating iterators for efficient looping atexit atexitmodule.c # Register functions to be run at interpreter-shutdown _stat _stat.c # stat.h interface @@ -132,12 +132,15 @@ # faulthandler module faulthandler faulthandler.c +# debug tool to trace memory blocks allocated by Python +_tracemalloc _tracemalloc.c hashtable.c + # The rest of the modules listed in this file are all commented out by # default. Usually they can be detected and built as dynamically # loaded modules by the new setup.py script added in Python 2.1. If -# you're on a platform that doesn't support dynamic loading, want to -# compile modules statically into the Python binary, or need to -# specify some odd set of compiler switches, you can uncomment the +# you're on a platform that doesn't support dynamic loading, want to +# compile modules statically into the Python binary, or need to +# specify some odd set of compiler switches, you can uncomment the # appropriate lines below. # ====================================================================== @@ -186,7 +189,7 @@ # supported...) #fcntl fcntlmodule.c # fcntl(2) and ioctl(2) -#spwd spwdmodule.c # spwd(3) +#spwd spwdmodule.c # spwd(3) #grp grpmodule.c # grp(3) #select selectmodule.c # select(2); not on ancient System V @@ -302,7 +305,7 @@ #_curses _cursesmodule.c -lcurses -ltermcap # Wrapper for the panel library that's part of ncurses and SYSV curses. -#_curses_panel _curses_panel.c -lpanel -lncurses +#_curses_panel _curses_panel.c -lpanel -lncurses # Modules that provide persistent dictionary-like semantics. You will diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c new file mode 100644 --- /dev/null +++ b/Modules/_tracemalloc.c @@ -0,0 +1,1407 @@ +#include "Python.h" +#include "hashtable.h" +#include "frameobject.h" +#include "pythread.h" +#include "osdefs.h" + +/* Trace memory blocks allocated by PyMem_RawMalloc() */ +#define TRACE_RAW_MALLOC + +/* Forward declaration */ +static void tracemalloc_stop(void); +static int tracemalloc_atexit_register(void); +static void* raw_malloc(size_t size); +static void raw_free(void *ptr); + +#ifdef Py_DEBUG +# define TRACE_DEBUG +#endif + +#define _STR(VAL) #VAL +#define STR(VAL) _STR(VAL) + +/* Protected by the GIL */ +static struct { + PyMemAllocator mem; + PyMemAllocator raw; + PyMemAllocator obj; +} allocators; + +/* Arbitrary limit of the number of frames in a traceback. The value was chosen + to not allocate too much memory on the stack (see TRACEBACK_STACK_SIZE + below). */ +#define MAX_NFRAME 100 + +static struct { + /* Module initialized? + Variable protected by the GIL */ + enum { + TRACEMALLOC_NOT_INITIALIZED, + TRACEMALLOC_INITIALIZED, + TRACEMALLOC_FINALIZED + } initialized; + + /* atexit handler registered? */ + int atexit_registered; + + /* Is tracemalloc tracing memory allocations? + Variable protected by the GIL */ + int tracing; + + /* limit of the number of frames in a traceback, 1 by default. + Variable protected by the GIL. 
*/ + int max_nframe; +} tracemalloc_config = {TRACEMALLOC_NOT_INITIALIZED, 0, 0, 1}; + +#if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) +/* This lock is needed because tracemalloc_free() is called without + the GIL held from PyMem_RawFree(). It cannot acquire the lock because it + would introduce a deadlock in PyThreadState_DeleteCurrent(). */ +static PyThread_type_lock tables_lock; +# define TABLES_LOCK() PyThread_acquire_lock(tables_lock, 1) +# define TABLES_UNLOCK() PyThread_release_lock(tables_lock) +#else + /* variables are protected by the GIL */ +# define TABLES_LOCK() +# define TABLES_UNLOCK() +#endif + +/* Pack the frame_t structure to reduce the memory footprint on 64-bit + architectures: 12 bytes instead of 16. This optimization might produce + SIGBUS on architectures not supporting unaligned memory accesses (64-bit + IPS CPU?): on such architecture, the structure must not be packed. */ +#pragma pack(4) +typedef struct +#ifdef __GNUC__ +__attribute__((packed)) +#endif +{ + PyObject *filename; + int lineno; +} frame_t; + +typedef struct { + Py_uhash_t hash; + int nframe; + frame_t frames[1]; +} traceback_t; + +#define TRACEBACK_SIZE(NFRAME) \ + (sizeof(traceback_t) + sizeof(frame_t) * (NFRAME - 1)) +#define TRACEBACK_STACK_SIZE TRACEBACK_SIZE(MAX_NFRAME) + +static PyObject *unknown_filename = NULL; +static traceback_t tracemalloc_empty_traceback; + +typedef struct { + size_t size; + traceback_t *traceback; +} trace_t; + +/* Size in bytes of currently traced memory. + Protected by TABLES_LOCK(). */ +static size_t tracemalloc_traced_memory = 0; + +/* Maximum size in bytes of traced memory. + Protected by TABLES_LOCK(). */ +static size_t tracemalloc_max_traced_memory = 0; + +/* Hash table used as a set to to intern filenames: + PyObject* => PyObject*. + Protected by the GIL */ +static _Py_hashtable_t *tracemalloc_filenames = NULL; + +/* Hash table used as a set to intern tracebacks: + traceback_t* => traceback_t* + Protected by the GIL */ +static _Py_hashtable_t *tracemalloc_tracebacks = NULL; + +/* pointer (void*) => trace (trace_t). + Protected by TABLES_LOCK(). */ +static _Py_hashtable_t *tracemalloc_traces = NULL; + +#ifdef TRACE_DEBUG +static void +tracemalloc_error(const char *format, ...) +{ + va_list ap; + fprintf(stderr, "tracemalloc: "); + va_start(ap, format); + vfprintf(stderr, format, ap); + va_end(ap); + fprintf(stderr, "\n"); + fflush(stderr); +} +#endif + +#if defined(WITH_THREAD) && defined(TRACE_RAW_MALLOC) +#define REENTRANT_THREADLOCAL + +/* If your OS does not provide native thread local storage, you can implement + it manually using a lock. Functions of thread.c cannot be used because + they use PyMem_RawMalloc() which leads to a reentrant call. */ +#if !(defined(_POSIX_THREADS) || defined(NT_THREADS)) +# error "need native thread local storage (TLS)" +#endif + +static int tracemalloc_reentrant_key; + +/* Any non-NULL pointer can be used */ +#define REENTRANT Py_True + +static int +get_reentrant(void) +{ + void *ptr = PyThread_get_key_value(tracemalloc_reentrant_key); + if (ptr != NULL) { + assert(ptr == REENTRANT); + return 1; + } + else + return 0; +} + +static void +set_reentrant(int reentrant) +{ + if (reentrant) { + assert(PyThread_get_key_value(tracemalloc_reentrant_key) == NULL); + PyThread_set_key_value(tracemalloc_reentrant_key, + REENTRANT); + } + else { + /* FIXME: PyThread_set_key_value() cannot be used to set the flag + to zero, because it does nothing if the variable has already + a value set. 
*/ + PyThread_delete_key_value(tracemalloc_reentrant_key); + } +} + +#else + +/* WITH_THREAD not defined: Python compiled without threads, + or TRACE_RAW_MALLOC not defined: variable protected by the GIL */ +static int tracemalloc_reentrant = 0; + +static int +get_reentrant(void) +{ + return tracemalloc_reentrant; +} + +static void +set_reentrant(int reentrant) +{ + assert(!reentrant || !get_reentrant()); + tracemalloc_reentrant = reentrant; +} +#endif + +static int +hashtable_compare_unicode(const void *key, const _Py_hashtable_entry_t *entry) +{ + if (key != NULL && entry->key != NULL) + return (PyUnicode_Compare((PyObject *)key, (PyObject *)entry->key) == 0); + else + return key == entry->key; +} + +static _Py_hashtable_allocator_t hashtable_alloc = {malloc, free}; + +static _Py_hashtable_t * +hashtable_new(size_t data_size, + _Py_hashtable_hash_func hash_func, + _Py_hashtable_compare_func compare_func) +{ + return _Py_hashtable_new_full(data_size, 0, + hash_func, compare_func, + NULL, NULL, NULL, &hashtable_alloc); +} + +static void* +raw_malloc(size_t size) +{ + return allocators.raw.malloc(allocators.raw.ctx, size); +} + +static void +raw_free(void *ptr) +{ + allocators.raw.free(allocators.raw.ctx, ptr); +} + +static Py_uhash_t +hashtable_hash_traceback(const void *key) +{ + const traceback_t *traceback = key; + return traceback->hash; +} + +static int +hashtable_compare_traceback(const traceback_t *traceback1, + const _Py_hashtable_entry_t *he) +{ + const traceback_t *traceback2 = he->key; + const frame_t *frame1, *frame2; + int i; + + if (traceback1->nframe != traceback2->nframe) + return 0; + + for (i=0; i < traceback1->nframe; i++) { + frame1 = &traceback1->frames[i]; + frame2 = &traceback2->frames[i]; + + if (frame1->lineno != frame2->lineno) + return 0; + + if (frame1->filename != frame2->filename) { + assert(PyUnicode_Compare(frame1->filename, frame2->filename) != 0); + return 0; + } + } + return 1; +} + +static void +tracemalloc_get_frame(PyFrameObject *pyframe, frame_t *frame) +{ + PyCodeObject *code; + PyObject *filename; + _Py_hashtable_entry_t *entry; + + frame->filename = unknown_filename; + frame->lineno = PyFrame_GetLineNumber(pyframe); + assert(frame->lineno >= 0); + if (frame->lineno < 0) + frame->lineno = 0; + + code = pyframe->f_code; + if (code == NULL) { +#ifdef TRACE_DEBUG + tracemalloc_error("failed to get the code object of the a frame"); +#endif + return; + } + + if (code->co_filename == NULL) { +#ifdef TRACE_DEBUG + tracemalloc_error("failed to get the filename of the code object"); +#endif + return; + } + + filename = code->co_filename; + assert(filename != NULL); + if (filename == NULL) + return; + + if (!PyUnicode_Check(filename)) { +#ifdef TRACE_DEBUG + tracemalloc_error("filename is not an unicode string"); +#endif + return; + } + if (!PyUnicode_IS_READY(filename)) { + /* Don't make a Unicode string ready to avoid reentrant calls + to tracemalloc_malloc() or tracemalloc_realloc() */ +#ifdef TRACE_DEBUG + tracemalloc_error("filename is not a ready unicode string"); +#endif + return; + } + + /* intern the filename */ + entry = _Py_hashtable_get_entry(tracemalloc_filenames, filename); + if (entry != NULL) { + filename = (PyObject *)entry->key; + } + else { + /* tracemalloc_filenames is responsible to keep a reference + to the filename */ + Py_INCREF(filename); + if (_Py_hashtable_set(tracemalloc_filenames, filename, NULL, 0) < 0) { + Py_DECREF(filename); +#ifdef TRACE_DEBUG + tracemalloc_error("failed to intern the filename"); +#endif + return; + } + } + + 
/* the tracemalloc_filenames table keeps a reference to the filename */ + frame->filename = filename; +} + +static Py_uhash_t +traceback_hash(traceback_t *traceback) +{ + /* code based on tuplehash() of Objects/tupleobject.c */ + Py_uhash_t x; /* Unsigned for defined overflow behavior. */ + Py_hash_t y; + int len = traceback->nframe; + Py_uhash_t mult = _PyHASH_MULTIPLIER; + frame_t *frame; + + x = 0x345678UL; + frame = traceback->frames; + while (--len >= 0) { + y = PyObject_Hash(frame->filename); + y ^= frame->lineno; + frame++; + + x = (x ^ y) * mult; + /* the cast might truncate len; that doesn't change hash stability */ + mult += (Py_hash_t)(82520UL + len + len); + } + x += 97531UL; + return x; +} + +static void +traceback_get_frames(traceback_t *traceback) +{ + PyThreadState *tstate; + PyFrameObject *pyframe; + +#ifdef WITH_THREAD + tstate = PyGILState_GetThisThreadState(); +#else + tstate = PyThreadState_Get(); +#endif + if (tstate == NULL) { +#ifdef TRACE_DEBUG + tracemalloc_error("failed to get the current thread state"); +#endif + return; + } + + for (pyframe = tstate->frame; pyframe != NULL; pyframe = pyframe->f_back) { + tracemalloc_get_frame(pyframe, &traceback->frames[traceback->nframe]); + assert(traceback->frames[traceback->nframe].filename != NULL); + assert(traceback->frames[traceback->nframe].lineno >= 0); + traceback->nframe++; + if (traceback->nframe == tracemalloc_config.max_nframe) + break; + } +} + +static traceback_t * +traceback_new(void) +{ + char stack_buffer[TRACEBACK_STACK_SIZE]; + traceback_t *traceback = (traceback_t *)stack_buffer; + _Py_hashtable_entry_t *entry; + +#ifdef WITH_THREAD + assert(PyGILState_Check()); +#endif + + /* get frames */ + traceback->nframe = 0; + traceback_get_frames(traceback); + if (traceback->nframe == 0) + return &tracemalloc_empty_traceback; + traceback->hash = traceback_hash(traceback); + + /* intern the traceback */ + entry = _Py_hashtable_get_entry(tracemalloc_tracebacks, traceback); + if (entry != NULL) { + traceback = (traceback_t *)entry->key; + } + else { + traceback_t *copy; + size_t traceback_size; + + traceback_size = TRACEBACK_SIZE(traceback->nframe); + + copy = raw_malloc(traceback_size); + if (copy == NULL) { +#ifdef TRACE_DEBUG + tracemalloc_error("failed to intern the traceback: malloc failed"); +#endif + return NULL; + } + memcpy(copy, traceback, traceback_size); + + if (_Py_hashtable_set(tracemalloc_tracebacks, copy, NULL, 0) < 0) { + raw_free(copy); +#ifdef TRACE_DEBUG + tracemalloc_error("failed to intern the traceback: putdata failed"); +#endif + return NULL; + } + traceback = copy; + } + return traceback; +} + +static void +tracemalloc_log_alloc(void *ptr, size_t size) +{ + traceback_t *traceback; + trace_t trace; + +#ifdef WITH_THREAD + assert(PyGILState_Check()); +#endif + + traceback = traceback_new(); + if (traceback == NULL) + return; + + trace.size = size; + trace.traceback = traceback; + + TABLES_LOCK(); + assert(tracemalloc_traced_memory <= PY_SIZE_MAX - size); + tracemalloc_traced_memory += size; + if (tracemalloc_traced_memory > tracemalloc_max_traced_memory) + tracemalloc_max_traced_memory = tracemalloc_traced_memory; + + _Py_HASHTABLE_SET(tracemalloc_traces, ptr, trace); + TABLES_UNLOCK(); +} + +static void +tracemalloc_log_free(void *ptr) +{ + trace_t trace; + + TABLES_LOCK(); + if (_Py_hashtable_pop(tracemalloc_traces, ptr, &trace, sizeof(trace))) { + assert(tracemalloc_traced_memory >= trace.size); + tracemalloc_traced_memory -= trace.size; + } + TABLES_UNLOCK(); +} + +static void* 
+tracemalloc_malloc(void *ctx, size_t size, int gil_held) +{ + PyMemAllocator *alloc = (PyMemAllocator *)ctx; +#if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) + PyGILState_STATE gil_state; +#endif + void *ptr; + + if (get_reentrant()) { + return alloc->malloc(alloc->ctx, size); + } + + /* Ignore reentrant call. PyObjet_Malloc() calls PyMem_Malloc() + for allocations larger than 512 bytes. PyGILState_Ensure() may call + PyMem_RawMalloc() indirectly which would call PyGILState_Ensure() if + reentrant are not disabled. */ + set_reentrant(1); +#ifdef WITH_THREAD +#ifdef TRACE_RAW_MALLOC + if (!gil_held) + gil_state = PyGILState_Ensure(); +#else + assert(gil_held); +#endif +#endif + ptr = alloc->malloc(alloc->ctx, size); + set_reentrant(0); + + if (ptr != NULL) + tracemalloc_log_alloc(ptr, size); + +#if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) + if (!gil_held) + PyGILState_Release(gil_state); +#endif + + return ptr; +} + +static void* +tracemalloc_realloc(void *ctx, void *ptr, size_t new_size, int gil_held) +{ + PyMemAllocator *alloc = (PyMemAllocator *)ctx; +#if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) + PyGILState_STATE gil_state; +#endif + void *ptr2; + + if (get_reentrant()) { + /* Reentrant call to PyMem_Realloc() and PyMem_RawRealloc(). + Example: PyMem_RawRealloc() is called internally by pymalloc + (_PyObject_Malloc() and _PyObject_Realloc()) to allocate a new + arena (new_arena()). */ + ptr2 = alloc->realloc(alloc->ctx, ptr, new_size); + + if (ptr2 != NULL && ptr != NULL) + tracemalloc_log_free(ptr); + + return ptr2; + } + + /* Ignore reentrant call. PyObjet_Realloc() calls PyMem_Realloc() for + allocations larger than 512 bytes. PyGILState_Ensure() may call + PyMem_RawMalloc() indirectly which would call PyGILState_Ensure() if + reentrant are not disabled. */ + set_reentrant(1); +#ifdef WITH_THREAD +#ifdef TRACE_RAW_MALLOC + if (!gil_held) + gil_state = PyGILState_Ensure(); +#else + assert(gil_held); +#endif +#endif + ptr2 = alloc->realloc(alloc->ctx, ptr, new_size); + set_reentrant(0); + + if (ptr2 != NULL) { + if (ptr != NULL) + tracemalloc_log_free(ptr); + + tracemalloc_log_alloc(ptr2, new_size); + } + +#if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) + if (!gil_held) + PyGILState_Release(gil_state); +#endif + + return ptr2; +} + +static void +tracemalloc_free(void *ctx, void *ptr) +{ + PyMemAllocator *alloc = (PyMemAllocator *)ctx; + + if (ptr == NULL) + return; + + /* GIL cannot be locked in PyMem_RawFree() because it would introduce + a deadlock in PyThreadState_DeleteCurrent(). 
*/ + + alloc->free(alloc->ctx, ptr); + tracemalloc_log_free(ptr); +} + +static void* +tracemalloc_malloc_gil(void *ctx, size_t size) +{ + return tracemalloc_malloc(ctx, size, 1); +} + +static void* +tracemalloc_realloc_gil(void *ctx, void *ptr, size_t new_size) +{ + return tracemalloc_realloc(ctx, ptr, new_size, 1); +} + +#ifdef TRACE_RAW_MALLOC +static void* +tracemalloc_raw_malloc(void *ctx, size_t size) +{ + return tracemalloc_malloc(ctx, size, 0); +} + +static void* +tracemalloc_raw_realloc(void *ctx, void *ptr, size_t new_size) +{ + return tracemalloc_realloc(ctx, ptr, new_size, 0); +} +#endif + +static int +tracemalloc_clear_filename(_Py_hashtable_entry_t *entry, void *user_data) +{ + PyObject *filename = (PyObject *)entry->key; + Py_DECREF(filename); + return 0; +} + +static int +traceback_free_traceback(_Py_hashtable_entry_t *entry, void *user_data) +{ + traceback_t *traceback = (traceback_t *)entry->key; + raw_free(traceback); + return 0; +} + +/* reentrant flag must be set to call this function and GIL must be held */ +static void +tracemalloc_clear_traces(void) +{ +#ifdef WITH_THREAD + /* The GIL protects variables againt concurrent access */ + assert(PyGILState_Check()); +#endif + + /* Disable also reentrant calls to tracemalloc_malloc() to not add a new + trace while we are clearing traces */ + assert(get_reentrant()); + + TABLES_LOCK(); + _Py_hashtable_clear(tracemalloc_traces); + tracemalloc_traced_memory = 0; + tracemalloc_max_traced_memory = 0; + TABLES_UNLOCK(); + + _Py_hashtable_foreach(tracemalloc_tracebacks, traceback_free_traceback, NULL); + _Py_hashtable_clear(tracemalloc_tracebacks); + + _Py_hashtable_foreach(tracemalloc_filenames, tracemalloc_clear_filename, NULL); + _Py_hashtable_clear(tracemalloc_filenames); +} + +static int +tracemalloc_init(void) +{ + if (tracemalloc_config.initialized == TRACEMALLOC_FINALIZED) { + PyErr_SetString(PyExc_RuntimeError, + "the tracemalloc module has been unloaded"); + return -1; + } + + if (tracemalloc_config.initialized == TRACEMALLOC_INITIALIZED) + return 0; + + PyMem_GetAllocator(PYMEM_DOMAIN_RAW, &allocators.raw); + +#ifdef REENTRANT_THREADLOCAL + tracemalloc_reentrant_key = PyThread_create_key(); + if (tracemalloc_reentrant_key == -1) { +#ifdef MS_WINDOWS + PyErr_SetFromWindowsErr(0); +#else + PyErr_SetFromErrno(PyExc_OSError); +#endif + return -1; + } +#endif + +#if defined(WITH_THREAD) && defined(TRACE_RAW_MALLOC) + if (tables_lock == NULL) { + tables_lock = PyThread_allocate_lock(); + if (tables_lock == NULL) { + PyErr_SetString(PyExc_RuntimeError, "cannot allocate lock"); + return -1; + } + } +#endif + + tracemalloc_filenames = hashtable_new(0, + (_Py_hashtable_hash_func)PyObject_Hash, + hashtable_compare_unicode); + + tracemalloc_tracebacks = hashtable_new(0, + (_Py_hashtable_hash_func)hashtable_hash_traceback, + (_Py_hashtable_compare_func)hashtable_compare_traceback); + + tracemalloc_traces = hashtable_new(sizeof(trace_t), + _Py_hashtable_hash_ptr, + _Py_hashtable_compare_direct); + + if (tracemalloc_filenames == NULL || tracemalloc_tracebacks == NULL + || tracemalloc_traces == NULL) + { + PyErr_NoMemory(); + return -1; + } + + unknown_filename = PyUnicode_FromString(""); + if (unknown_filename == NULL) + return -1; + PyUnicode_InternInPlace(&unknown_filename); + + tracemalloc_empty_traceback.nframe = 1; + /* borrowed reference */ + tracemalloc_empty_traceback.frames[0].filename = unknown_filename; + tracemalloc_empty_traceback.frames[0].lineno = 0; + tracemalloc_empty_traceback.hash = 
traceback_hash(&tracemalloc_empty_traceback); + + /* Disable tracing allocations until hooks are installed. Set + also the reentrant flag to detect bugs: fail with an assertion error + if set_reentrant(1) is called while tracing is disabled. */ + set_reentrant(1); + + tracemalloc_config.initialized = TRACEMALLOC_INITIALIZED; + return 0; +} + +static void +tracemalloc_deinit(void) +{ + if (tracemalloc_config.initialized != TRACEMALLOC_INITIALIZED) + return; + tracemalloc_config.initialized = TRACEMALLOC_FINALIZED; + + tracemalloc_stop(); + + /* destroy hash tables */ + _Py_hashtable_destroy(tracemalloc_traces); + _Py_hashtable_destroy(tracemalloc_tracebacks); + _Py_hashtable_destroy(tracemalloc_filenames); + +#if defined(WITH_THREAD) && defined(TRACE_RAW_MALLOC) + if (tables_lock != NULL) { + PyThread_free_lock(tables_lock); + tables_lock = NULL; + } +#endif + +#ifdef REENTRANT_THREADLOCAL + PyThread_delete_key(tracemalloc_reentrant_key); +#endif + + Py_XDECREF(unknown_filename); +} + +static int +tracemalloc_start(void) +{ + PyMemAllocator alloc; + + if (tracemalloc_init() < 0) + return -1; + + if (tracemalloc_config.tracing) { + /* hook already installed: do nothing */ + return 0; + } + + if (tracemalloc_atexit_register() < 0) + return -1; + +#ifdef TRACE_RAW_MALLOC + alloc.malloc = tracemalloc_raw_malloc; + alloc.realloc = tracemalloc_raw_realloc; + alloc.free = tracemalloc_free; + + alloc.ctx = &allocators.raw; + PyMem_GetAllocator(PYMEM_DOMAIN_RAW, &allocators.raw); + PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &alloc); +#endif + + alloc.malloc = tracemalloc_malloc_gil; + alloc.realloc = tracemalloc_realloc_gil; + alloc.free = tracemalloc_free; + + alloc.ctx = &allocators.mem; + PyMem_GetAllocator(PYMEM_DOMAIN_MEM, &allocators.mem); + PyMem_SetAllocator(PYMEM_DOMAIN_MEM, &alloc); + + alloc.ctx = &allocators.obj; + PyMem_GetAllocator(PYMEM_DOMAIN_OBJ, &allocators.obj); + PyMem_SetAllocator(PYMEM_DOMAIN_OBJ, &alloc); + + /* everything is ready: start tracing Python memory allocations */ + tracemalloc_config.tracing = 1; + set_reentrant(0); + + return 0; +} + +static void +tracemalloc_stop(void) +{ + if (!tracemalloc_config.tracing) + return; + + /* stop tracing Python memory allocations */ + tracemalloc_config.tracing = 0; + + /* set the reentrant flag to detect bugs: fail with an assertion error if + set_reentrant(1) is called while tracing is disabled. 
*/ + set_reentrant(1); + + /* unregister the hook on memory allocators */ +#ifdef TRACE_RAW_MALLOC + PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &allocators.raw); +#endif + PyMem_SetAllocator(PYMEM_DOMAIN_MEM, &allocators.mem); + PyMem_SetAllocator(PYMEM_DOMAIN_OBJ, &allocators.obj); + + /* release memory */ + tracemalloc_clear_traces(); +} + + +static PyObject* +lineno_as_obj(int lineno) +{ + if (lineno >= 0) + return PyLong_FromLong(lineno); + else + Py_RETURN_NONE; +} + +PyDoc_STRVAR(tracemalloc_is_tracing_doc, + "is_tracing()->bool\n" + "\n" + "True if the tracemalloc module is tracing Python memory allocations,\n" + "False otherwise."); + +static PyObject* +py_tracemalloc_is_tracing(PyObject *self) +{ + return PyBool_FromLong(tracemalloc_config.tracing); +} + +PyDoc_STRVAR(tracemalloc_clear_traces_doc, + "clear_traces()\n" + "\n" + "Clear traces of memory blocks allocated by Python."); + +static PyObject* +py_tracemalloc_clear_traces(PyObject *self) +{ + if (!tracemalloc_config.tracing) + Py_RETURN_NONE; + + set_reentrant(1); + tracemalloc_clear_traces(); + set_reentrant(0); + + Py_RETURN_NONE; +} + +static PyObject* +frame_to_pyobject(frame_t *frame) +{ + PyObject *frame_obj, *lineno_obj; + + frame_obj = PyTuple_New(2); + if (frame_obj == NULL) + return NULL; + + if (frame->filename == NULL) + frame->filename = Py_None; + Py_INCREF(frame->filename); + PyTuple_SET_ITEM(frame_obj, 0, frame->filename); + + assert(frame->lineno >= 0); + lineno_obj = lineno_as_obj(frame->lineno); + if (lineno_obj == NULL) { + Py_DECREF(frame_obj); + return NULL; + } + PyTuple_SET_ITEM(frame_obj, 1, lineno_obj); + + return frame_obj; +} + +static PyObject* +traceback_to_pyobject(traceback_t *traceback, _Py_hashtable_t *intern_table) +{ + int i; + PyObject *frames, *frame; + + if (intern_table != NULL) { + if (_Py_HASHTABLE_GET(intern_table, traceback, frames)) { + Py_INCREF(frames); + return frames; + } + } + + frames = PyTuple_New(traceback->nframe); + if (frames == NULL) + return NULL; + + for (i=0; i < traceback->nframe; i++) { + frame = frame_to_pyobject(&traceback->frames[i]); + if (frame == NULL) { + Py_DECREF(frames); + return NULL; + } + PyTuple_SET_ITEM(frames, i, frame); + } + + if (intern_table != NULL) { + if (_Py_HASHTABLE_SET(intern_table, traceback, frames) < 0) { + Py_DECREF(frames); + PyErr_NoMemory(); + return NULL; + } + /* intern_table keeps a new reference to frames */ + Py_INCREF(frames); + } + return frames; +} + +static PyObject* +trace_to_pyobject(trace_t *trace, _Py_hashtable_t *intern_tracebacks) +{ + PyObject *trace_obj = NULL; + PyObject *size, *traceback; + + trace_obj = PyTuple_New(2); + if (trace_obj == NULL) + return NULL; + + size = PyLong_FromSize_t(trace->size); + if (size == NULL) { + Py_DECREF(trace_obj); + return NULL; + } + PyTuple_SET_ITEM(trace_obj, 0, size); + + traceback = traceback_to_pyobject(trace->traceback, intern_tracebacks); + if (traceback == NULL) { + Py_DECREF(trace_obj); + return NULL; + } + PyTuple_SET_ITEM(trace_obj, 1, traceback); + + return trace_obj; +} + +typedef struct { + _Py_hashtable_t *traces; + _Py_hashtable_t *tracebacks; + PyObject *list; +} get_traces_t; + +static int +tracemalloc_get_traces_fill(_Py_hashtable_entry_t *entry, void *user_data) +{ + get_traces_t *get_traces = user_data; + trace_t *trace; + PyObject *tracemalloc_obj; + int res; + + trace = (trace_t *)_PY_HASHTABLE_ENTRY_DATA(entry); + + tracemalloc_obj = trace_to_pyobject(trace, get_traces->tracebacks); + if (tracemalloc_obj == NULL) + return 1; + + res = 
PyList_Append(get_traces->list, tracemalloc_obj); + Py_DECREF(tracemalloc_obj); + if (res < 0) + return 1; + + return 0; +} + +static int +tracemalloc_pyobject_decref_cb(_Py_hashtable_entry_t *entry, void *user_data) +{ + PyObject *obj = (PyObject *)_Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(entry); + Py_DECREF(obj); + return 0; +} + +PyDoc_STRVAR(tracemalloc_get_traces_doc, + "get_traces() -> list\n" + "\n" + "Get traces of all memory blocks allocated by Python.\n" + "Return a list of (size: int, traceback: tuple) tuples.\n" + "traceback is a tuple of (filename: str, lineno: int) tuples.\n" + "\n" + "Return an empty list if the tracemalloc module is disabled."); + +static PyObject* +py_tracemalloc_get_traces(PyObject *self, PyObject *obj) +{ + get_traces_t get_traces; + int err; + + get_traces.traces = NULL; + get_traces.tracebacks = NULL; + get_traces.list = PyList_New(0); + if (get_traces.list == NULL) + goto error; + + if (!tracemalloc_config.tracing) + return get_traces.list; + + get_traces.tracebacks = hashtable_new(sizeof(PyObject *), + _Py_hashtable_hash_ptr, + _Py_hashtable_compare_direct); + if (get_traces.tracebacks == NULL) { + PyErr_NoMemory(); + goto error; + } + + TABLES_LOCK(); + get_traces.traces = _Py_hashtable_copy(tracemalloc_traces); + TABLES_UNLOCK(); + + if (get_traces.traces == NULL) { + PyErr_NoMemory(); + goto error; + } + + set_reentrant(1); + err = _Py_hashtable_foreach(get_traces.traces, + tracemalloc_get_traces_fill, &get_traces); + set_reentrant(0); + if (err) + goto error; + + goto finally; + +error: + Py_CLEAR(get_traces.list); + +finally: + if (get_traces.tracebacks != NULL) { + _Py_hashtable_foreach(get_traces.tracebacks, + tracemalloc_pyobject_decref_cb, NULL); + _Py_hashtable_destroy(get_traces.tracebacks); + } + if (get_traces.traces != NULL) + _Py_hashtable_destroy(get_traces.traces); + + return get_traces.list; +} + +PyDoc_STRVAR(tracemalloc_get_object_traceback_doc, + "get_object_traceback(obj)\n" + "\n" + "Get the traceback where the Python object obj was allocated.\n" + "Return a tuple of (filename: str, lineno: int) tuples.\n" + "\n" + "Return None if the tracemalloc module is disabled or did not\n" + "trace the allocation of the object."); + +static PyObject* +py_tracemalloc_get_object_traceback(PyObject *self, PyObject *obj) +{ + PyTypeObject *type; + void *ptr; + trace_t trace; + int found; + + if (!tracemalloc_config.tracing) + Py_RETURN_NONE; + + type = Py_TYPE(obj); + if (PyType_IS_GC(type)) + ptr = (void *)((char *)obj - sizeof(PyGC_Head)); + else + ptr = (void *)obj; + + TABLES_LOCK(); + found = _Py_HASHTABLE_GET(tracemalloc_traces, ptr, trace); + TABLES_UNLOCK(); + + if (!found) + Py_RETURN_NONE; + + return traceback_to_pyobject(trace.traceback, NULL); +} + +static PyObject* +tracemalloc_atexit(PyObject *self) +{ +#ifdef WITH_THREAD + assert(PyGILState_Check()); +#endif + tracemalloc_deinit(); + Py_RETURN_NONE; +} + +static PyMethodDef atexit_method = { + "_atexit", (PyCFunction)tracemalloc_atexit, METH_NOARGS, NULL}; + +static int +tracemalloc_atexit_register(void) +{ + PyObject *method = NULL, *atexit = NULL, *func = NULL; + PyObject *result; + int ret = -1; + + if (tracemalloc_config.atexit_registered) + return 0; + tracemalloc_config.atexit_registered = 1; + + /* private functions */ + method = PyCFunction_New(&atexit_method, NULL); + if (method == NULL) + goto done; + + atexit = PyImport_ImportModule("atexit"); + if (atexit == NULL) { + if (!PyErr_Warn(PyExc_ImportWarning, + "atexit module is missing: " + "cannot automatically disable 
tracemalloc at exit")) + { + PyErr_Clear(); + return 0; + } + goto done; + } + + func = PyObject_GetAttrString(atexit, "register"); + if (func == NULL) + goto done; + + result = PyObject_CallFunction(func, "O", method); + if (result == NULL) + goto done; + Py_DECREF(result); + + ret = 0; + +done: + Py_XDECREF(method); + Py_XDECREF(func); + Py_XDECREF(atexit); + return ret; +} + +PyDoc_STRVAR(tracemalloc_start_doc, + "start()\n" + "\n" + "Start tracing Python memory allocations."); + +static PyObject* +py_tracemalloc_start(PyObject *self) +{ + if (tracemalloc_start() < 0) + return NULL; + + Py_RETURN_NONE; +} + +PyDoc_STRVAR(tracemalloc_stop_doc, + "stop()\n" + "\n" + "Stop tracing Python memory allocations and clear traces\n" + "of memory blocks allocated by Python."); + +static PyObject* +py_tracemalloc_stop(PyObject *self) +{ + tracemalloc_stop(); + Py_RETURN_NONE; +} + +PyDoc_STRVAR(tracemalloc_get_traceback_limit_doc, + "get_traceback_limit() -> int\n" + "\n" + "Get the maximum number of frames stored in the traceback\n" + "of a trace.\n" + "\n" + "By default, a trace of an allocated memory block only stores\n" + "the most recent frame: the limit is 1."); + +static PyObject* +py_tracemalloc_get_traceback_limit(PyObject *self) +{ + return PyLong_FromLong(tracemalloc_config.max_nframe); +} + +PyDoc_STRVAR(tracemalloc_set_traceback_limit_doc, + "set_traceback_limit(nframe: int)\n" + "\n" + "Set the maximum number of frames stored in the traceback of a trace."); + +static PyObject* +tracemalloc_set_traceback_limit(PyObject *self, PyObject *args) +{ + Py_ssize_t nframe; + + if (!PyArg_ParseTuple(args, "n:set_traceback_limit", + &nframe)) + return NULL; + + if (nframe < 1 || nframe > MAX_NFRAME) { + PyErr_Format(PyExc_ValueError, + "the number of frames must be in range [1; %i]", + MAX_NFRAME); + return NULL; + } + tracemalloc_config.max_nframe = Py_SAFE_DOWNCAST(nframe, Py_ssize_t, int); + + Py_RETURN_NONE; +} + +PyDoc_STRVAR(tracemalloc_get_tracemalloc_memory_doc, + "get_tracemalloc_memory() -> int\n" + "\n" + "Get the memory usage in bytes of the tracemalloc module\n" + "used internally to trace memory allocations."); + +static PyObject* +tracemalloc_get_tracemalloc_memory(PyObject *self) +{ + size_t size; + PyObject *size_obj; + + size = _Py_hashtable_size(tracemalloc_tracebacks); + size += _Py_hashtable_size(tracemalloc_filenames); + + TABLES_LOCK(); + size += _Py_hashtable_size(tracemalloc_traces); + TABLES_UNLOCK(); + + size_obj = PyLong_FromSize_t(size); + return Py_BuildValue("N", size_obj); +} + +PyDoc_STRVAR(tracemalloc_get_traced_memory_doc, + "get_traced_memory() -> int\n" + "\n" + "Get the current size and maximum size of memory blocks traced\n" + "by the tracemalloc module as a tuple: (size: int, max_size: int)."); + +static PyObject* +tracemalloc_get_traced_memory(PyObject *self) +{ + Py_ssize_t size, max_size; + PyObject *size_obj, *max_size_obj; + + if (!tracemalloc_config.tracing) + return Py_BuildValue("ii", 0, 0); + + TABLES_LOCK(); + size = tracemalloc_traced_memory; + max_size = tracemalloc_max_traced_memory; + TABLES_UNLOCK(); + + size_obj = PyLong_FromSize_t(size); + max_size_obj = PyLong_FromSize_t(max_size); + return Py_BuildValue("NN", size_obj, max_size_obj); +} + +static PyMethodDef module_methods[] = { + {"is_tracing", (PyCFunction)py_tracemalloc_is_tracing, + METH_NOARGS, tracemalloc_is_tracing_doc}, + {"clear_traces", (PyCFunction)py_tracemalloc_clear_traces, + METH_NOARGS, tracemalloc_clear_traces_doc}, + {"_get_traces", 
(PyCFunction)py_tracemalloc_get_traces, + METH_NOARGS, tracemalloc_get_traces_doc}, + {"_get_object_traceback", (PyCFunction)py_tracemalloc_get_object_traceback, + METH_O, tracemalloc_get_object_traceback_doc}, + {"start", (PyCFunction)py_tracemalloc_start, + METH_NOARGS, tracemalloc_start_doc}, + {"stop", (PyCFunction)py_tracemalloc_stop, + METH_NOARGS, tracemalloc_stop_doc}, + {"get_traceback_limit", (PyCFunction)py_tracemalloc_get_traceback_limit, + METH_NOARGS, tracemalloc_get_traceback_limit_doc}, + {"set_traceback_limit", (PyCFunction)tracemalloc_set_traceback_limit, + METH_VARARGS, tracemalloc_set_traceback_limit_doc}, + {"get_tracemalloc_memory", (PyCFunction)tracemalloc_get_tracemalloc_memory, + METH_NOARGS, tracemalloc_get_tracemalloc_memory_doc}, + {"get_traced_memory", (PyCFunction)tracemalloc_get_traced_memory, + METH_NOARGS, tracemalloc_get_traced_memory_doc}, + + /* sentinel */ + {NULL, NULL} +}; + +PyDoc_STRVAR(module_doc, +"Debug module to trace memory blocks allocated by Python."); + +static struct PyModuleDef module_def = { + PyModuleDef_HEAD_INIT, + "_tracemalloc", + module_doc, + 0, /* non-negative size to be able to unload the module */ + module_methods, + NULL, +}; + +PyMODINIT_FUNC +PyInit__tracemalloc(void) +{ + PyObject *m; + m = PyModule_Create(&module_def); + if (m == NULL) + return NULL; + + if (tracemalloc_init() < 0) + return NULL; + + return m; +} + +static int +parse_sys_xoptions(PyObject *value) +{ + PyObject *valuelong; + long nframe; + + if (value == Py_True) + return 1; + + assert(PyUnicode_Check(value)); + if (PyUnicode_GetLength(value) == 0) + return -1; + + valuelong = PyLong_FromUnicodeObject(value, 10); + if (valuelong == NULL) + return -1; + + nframe = PyLong_AsLong(valuelong); + Py_DECREF(valuelong); + if (nframe == -1 && PyErr_Occurred()) + return -1; + + if (nframe < 1 || nframe > MAX_NFRAME) + return -1; + + return Py_SAFE_DOWNCAST(nframe, long, int); +} + +int +_PyTraceMalloc_Init(void) +{ + char *p; + int nframe; + +#ifdef WITH_THREAD + assert(PyGILState_Check()); +#endif + + if ((p = Py_GETENV("PYTHONTRACEMALLOC")) && *p != '\0') { + char *endptr = p; + unsigned long value; + + value = strtoul(p, &endptr, 10); + if (*endptr != '\0' + || value < 1 + || value > MAX_NFRAME + || (errno == ERANGE && value == ULONG_MAX)) + { + Py_FatalError("PYTHONTRACEMALLOC must be an integer " + "in range [1; " STR(MAX_NFRAME) "]"); + return -1; + } + + nframe = (int)value; + } + else { + PyObject *xoptions, *key, *value; + + xoptions = PySys_GetXOptions(); + if (xoptions == NULL) + return -1; + + key = PyUnicode_FromString("tracemalloc"); + if (key == NULL) + return -1; + + value = PyDict_GetItemWithError(xoptions, key); + Py_DECREF(key); + if (value == NULL) { + if (PyErr_Occurred()) + return -1; + + /* -X tracemalloc is not used */ + return 0; + } + + nframe = parse_sys_xoptions(value); + Py_DECREF(value); + if (nframe < 0) { + Py_FatalError("-X tracemalloc=NFRAME: number of frame must be " + "an integer in range [1; " STR(MAX_NFRAME) "]"); + } + } + + tracemalloc_config.max_nframe = nframe; + return tracemalloc_start(); +} + diff --git a/Modules/hashtable.c b/Modules/hashtable.c new file mode 100644 --- /dev/null +++ b/Modules/hashtable.c @@ -0,0 +1,518 @@ +/* The implementation of the hash table (_Py_hashtable_t) is based on the cfuhash + project: + http://sourceforge.net/projects/libcfu/ + + Copyright of cfuhash: + ---------------------------------- + Creation date: 2005-06-24 21:22:40 + Authors: Don + Change log: + + Copyright (c) 2005 Don Owens + All 
rights reserved. + + This code is released under the BSD license: + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + * Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + * Redistributions in binary form must reproduce the above + copyright notice, this list of conditions and the following + disclaimer in the documentation and/or other materials provided + with the distribution. + + * Neither the name of the author nor the names of its + contributors may be used to endorse or promote products derived + from this software without specific prior written permission. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS + FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE + COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, + INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES + (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR + SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) + HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, + STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) + ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED + OF THE POSSIBILITY OF SUCH DAMAGE. + ---------------------------------- +*/ + +#include "Python.h" +#include "hashtable.h" + +#define HASHTABLE_MIN_SIZE 16 +#define HASHTABLE_HIGH 0.50 +#define HASHTABLE_LOW 0.10 +#define HASHTABLE_REHASH_FACTOR 2.0 / (HASHTABLE_LOW + HASHTABLE_HIGH) + +#define BUCKETS_HEAD(SLIST) \ + ((_Py_hashtable_entry_t *)_Py_SLIST_HEAD(&(SLIST))) +#define TABLE_HEAD(HT, BUCKET) \ + ((_Py_hashtable_entry_t *)_Py_SLIST_HEAD(&(HT)->buckets[BUCKET])) +#define ENTRY_NEXT(ENTRY) \ + ((_Py_hashtable_entry_t *)_Py_SLIST_ITEM_NEXT(ENTRY)) +#define HASHTABLE_ITEM_SIZE(HT) \ + (sizeof(_Py_hashtable_entry_t) + (HT)->data_size) + +/* Forward declaration */ +static void hashtable_rehash(_Py_hashtable_t *ht); + +static void +_Py_slist_init(_Py_slist_t *list) +{ + list->head = NULL; +} + +static void +_Py_slist_prepend(_Py_slist_t *list, _Py_slist_item_t *item) +{ + item->next = list->head; + list->head = item; +} + +static void +_Py_slist_remove(_Py_slist_t *list, _Py_slist_item_t *previous, + _Py_slist_item_t *item) +{ + if (previous != NULL) + previous->next = item->next; + else + list->head = item->next; +} + +Py_uhash_t +_Py_hashtable_hash_int(const void *key) +{ + return (Py_uhash_t)key; +} + +Py_uhash_t +_Py_hashtable_hash_ptr(const void *key) +{ + return (Py_uhash_t)_Py_HashPointer((void *)key); +} + +int +_Py_hashtable_compare_direct(const void *key, const _Py_hashtable_entry_t *entry) +{ + return entry->key == key; +} + +/* makes sure the real size of the buckets array is a power of 2 */ +static size_t +round_size(size_t s) +{ + size_t i; + if (s < HASHTABLE_MIN_SIZE) + return HASHTABLE_MIN_SIZE; + i = 1; + while (i < s) + i <<= 1; + return i; +} + +_Py_hashtable_t * +_Py_hashtable_new_full(size_t data_size, size_t init_size, + _Py_hashtable_hash_func hash_func, + _Py_hashtable_compare_func compare_func, + _Py_hashtable_copy_data_func copy_data_func, + _Py_hashtable_free_data_func free_data_func, + _Py_hashtable_get_data_size_func get_data_size_func, + _Py_hashtable_allocator_t *allocator) +{ + 
_Py_hashtable_t *ht; + size_t buckets_size; + _Py_hashtable_allocator_t alloc; + + if (allocator == NULL) { + alloc.malloc = PyMem_RawMalloc; + alloc.free = PyMem_RawFree; + } + else + alloc = *allocator; + + ht = (_Py_hashtable_t *)alloc.malloc(sizeof(_Py_hashtable_t)); + if (ht == NULL) + return ht; + + ht->num_buckets = round_size(init_size); + ht->entries = 0; + ht->data_size = data_size; + + buckets_size = ht->num_buckets * sizeof(ht->buckets[0]); + ht->buckets = alloc.malloc(buckets_size); + if (ht->buckets == NULL) { + alloc.free(ht); + return NULL; + } + memset(ht->buckets, 0, buckets_size); + + ht->hash_func = hash_func; + ht->compare_func = compare_func; + ht->copy_data_func = copy_data_func; + ht->free_data_func = free_data_func; + ht->get_data_size_func = get_data_size_func; + ht->alloc = alloc; + return ht; +} + +_Py_hashtable_t * +_Py_hashtable_new(size_t data_size, + _Py_hashtable_hash_func hash_func, + _Py_hashtable_compare_func compare_func) +{ + return _Py_hashtable_new_full(data_size, HASHTABLE_MIN_SIZE, + hash_func, compare_func, + NULL, NULL, NULL, NULL); +} + +size_t +_Py_hashtable_size(_Py_hashtable_t *ht) +{ + size_t size; + size_t hv; + + size = sizeof(_Py_hashtable_t); + + /* buckets */ + size += ht->num_buckets * sizeof(_Py_hashtable_entry_t *); + + /* entries */ + size += ht->entries * HASHTABLE_ITEM_SIZE(ht); + + /* data linked from entries */ + if (ht->get_data_size_func) { + for (hv = 0; hv < ht->num_buckets; hv++) { + _Py_hashtable_entry_t *entry; + + for (entry = TABLE_HEAD(ht, hv); entry; entry = ENTRY_NEXT(entry)) { + void *data; + + data = _Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(entry); + size += ht->get_data_size_func(data); + } + } + } + return size; +} + +#ifdef Py_DEBUG +void +_Py_hashtable_print_stats(_Py_hashtable_t *ht) +{ + size_t size; + size_t chain_len, max_chain_len, total_chain_len, nchains; + _Py_hashtable_entry_t *entry; + size_t hv; + double load; + + size = _Py_hashtable_size(ht); + + load = (double)ht->entries / ht->num_buckets; + + max_chain_len = 0; + total_chain_len = 0; + nchains = 0; + for (hv = 0; hv < ht->num_buckets; hv++) { + entry = TABLE_HEAD(ht, hv); + if (entry != NULL) { + chain_len = 0; + for (; entry; entry = ENTRY_NEXT(entry)) { + chain_len++; + } + if (chain_len > max_chain_len) + max_chain_len = chain_len; + total_chain_len += chain_len; + nchains++; + } + } + printf("hash table %p: entries=%zu/%zu (%.0f%%), ", + ht, ht->entries, ht->num_buckets, load * 100.0); + if (nchains) + printf("avg_chain_len=%.1f, ", (double)total_chain_len / nchains); + printf("max_chain_len=%zu, %zu kB\n", + max_chain_len, size / 1024); +} +#endif + +/* Get an entry. Return NULL if the key does not exist. 
*/ +_Py_hashtable_entry_t * +_Py_hashtable_get_entry(_Py_hashtable_t *ht, const void *key) +{ + Py_uhash_t key_hash; + size_t index; + _Py_hashtable_entry_t *entry; + + key_hash = ht->hash_func(key); + index = key_hash & (ht->num_buckets - 1); + + for (entry = TABLE_HEAD(ht, index); entry != NULL; entry = ENTRY_NEXT(entry)) { + if (entry->key_hash == key_hash && ht->compare_func(key, entry)) + break; + } + + return entry; +} + +static int +_hashtable_pop_entry(_Py_hashtable_t *ht, const void *key, void *data, size_t data_size) +{ + Py_uhash_t key_hash; + size_t index; + _Py_hashtable_entry_t *entry, *previous; + + key_hash = ht->hash_func(key); + index = key_hash & (ht->num_buckets - 1); + + previous = NULL; + for (entry = TABLE_HEAD(ht, index); entry != NULL; entry = ENTRY_NEXT(entry)) { + if (entry->key_hash == key_hash && ht->compare_func(key, entry)) + break; + previous = entry; + } + + if (entry == NULL) + return 0; + + _Py_slist_remove(&ht->buckets[index], (_Py_slist_item_t *)previous, + (_Py_slist_item_t *)entry); + ht->entries--; + + if (data != NULL) + _Py_HASHTABLE_ENTRY_READ_DATA(ht, data, data_size, entry); + ht->alloc.free(entry); + + if ((float)ht->entries / (float)ht->num_buckets < HASHTABLE_LOW) + hashtable_rehash(ht); + return 1; +} + +/* Add a new entry to the hash. The key must not be present in the hash table. + Return 0 on success, -1 on memory error. */ +int +_Py_hashtable_set(_Py_hashtable_t *ht, const void *key, + void *data, size_t data_size) +{ + Py_uhash_t key_hash; + size_t index; + _Py_hashtable_entry_t *entry; + + assert(data != NULL || data_size == 0); +#ifndef NDEBUG + /* Don't write the assertion on a single line because it is interesting + to know the duplicated entry if the assertion failed. The entry can + be read using a debugger. */ + entry = _Py_hashtable_get_entry(ht, key); + assert(entry == NULL); +#endif + + key_hash = ht->hash_func(key); + index = key_hash & (ht->num_buckets - 1); + + entry = ht->alloc.malloc(HASHTABLE_ITEM_SIZE(ht)); + if (entry == NULL) { + /* memory allocation failed */ + return -1; + } + + entry->key = (void *)key; + entry->key_hash = key_hash; + + assert(data_size == ht->data_size); + memcpy(_PY_HASHTABLE_ENTRY_DATA(entry), data, data_size); + + _Py_slist_prepend(&ht->buckets[index], (_Py_slist_item_t*)entry); + ht->entries++; + + if ((float)ht->entries / (float)ht->num_buckets > HASHTABLE_HIGH) + hashtable_rehash(ht); + return 0; +} + +/* Get data from an entry. Copy entry data into data and return 1 if the entry + exists, return 0 if the entry does not exist. */ +int +_Py_hashtable_get(_Py_hashtable_t *ht, const void *key, void *data, size_t data_size) +{ + _Py_hashtable_entry_t *entry; + + assert(data != NULL); + + entry = _Py_hashtable_get_entry(ht, key); + if (entry == NULL) + return 0; + _Py_HASHTABLE_ENTRY_READ_DATA(ht, data, data_size, entry); + return 1; +} + +int +_Py_hashtable_pop(_Py_hashtable_t *ht, const void *key, void *data, size_t data_size) +{ + assert(data != NULL); + assert(ht->free_data_func == NULL); + return _hashtable_pop_entry(ht, key, data, data_size); +} + +/* Delete an entry. The entry must exist. */ +void +_Py_hashtable_delete(_Py_hashtable_t *ht, const void *key) +{ +#ifndef NDEBUG + int found = _hashtable_pop_entry(ht, key, NULL, 0); + assert(found); +#else + (void)_hashtable_pop_entry(ht, key, NULL, 0); +#endif +} + +/* Prototype for a pointer to a function to be called foreach + key/value pair in the hash by hashtable_foreach(). Iteration + stops if a non-zero value is returned. 
*/ +int +_Py_hashtable_foreach(_Py_hashtable_t *ht, + int (*func) (_Py_hashtable_entry_t *entry, void *arg), + void *arg) +{ + _Py_hashtable_entry_t *entry; + size_t hv; + + for (hv = 0; hv < ht->num_buckets; hv++) { + for (entry = TABLE_HEAD(ht, hv); entry; entry = ENTRY_NEXT(entry)) { + int res = func(entry, arg); + if (res) + return res; + } + } + return 0; +} + +static void +hashtable_rehash(_Py_hashtable_t *ht) +{ + size_t buckets_size, new_size, bucket; + _Py_slist_t *old_buckets = NULL; + size_t old_num_buckets; + + new_size = round_size((size_t)(ht->entries * HASHTABLE_REHASH_FACTOR)); + if (new_size == ht->num_buckets) + return; + + old_num_buckets = ht->num_buckets; + + buckets_size = new_size * sizeof(ht->buckets[0]); + old_buckets = ht->buckets; + ht->buckets = ht->alloc.malloc(buckets_size); + if (ht->buckets == NULL) { + /* cancel rehash on memory allocation failure */ + ht->buckets = old_buckets ; + /* memory allocation failed */ + return; + } + memset(ht->buckets, 0, buckets_size); + + ht->num_buckets = new_size; + + for (bucket = 0; bucket < old_num_buckets; bucket++) { + _Py_hashtable_entry_t *entry, *next; + for (entry = BUCKETS_HEAD(old_buckets[bucket]); entry != NULL; entry = next) { + size_t entry_index; + + assert(ht->hash_func(entry->key) == entry->key_hash); + next = ENTRY_NEXT(entry); + entry_index = entry->key_hash & (new_size - 1); + + _Py_slist_prepend(&ht->buckets[entry_index], (_Py_slist_item_t*)entry); + } + } + + ht->alloc.free(old_buckets); +} + +void +_Py_hashtable_clear(_Py_hashtable_t *ht) +{ + _Py_hashtable_entry_t *entry, *next; + size_t i; + + for (i=0; i < ht->num_buckets; i++) { + for (entry = TABLE_HEAD(ht, i); entry != NULL; entry = next) { + next = ENTRY_NEXT(entry); + if (ht->free_data_func) + ht->free_data_func(_Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(entry)); + ht->alloc.free(entry); + } + _Py_slist_init(&ht->buckets[i]); + } + ht->entries = 0; + hashtable_rehash(ht); +} + +void +_Py_hashtable_destroy(_Py_hashtable_t *ht) +{ + size_t i; + + for (i = 0; i < ht->num_buckets; i++) { + _Py_slist_item_t *entry = ht->buckets[i].head; + while (entry) { + _Py_slist_item_t *entry_next = entry->next; + if (ht->free_data_func) + ht->free_data_func(_Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(entry)); + ht->alloc.free(entry); + entry = entry_next; + } + } + + ht->alloc.free(ht->buckets); + ht->alloc.free(ht); +} + +/* Return a copy of the hash table */ +_Py_hashtable_t * +_Py_hashtable_copy(_Py_hashtable_t *src) +{ + _Py_hashtable_t *dst; + _Py_hashtable_entry_t *entry; + size_t bucket; + int err; + void *data, *new_data; + + dst = _Py_hashtable_new_full(src->data_size, src->num_buckets, + src->hash_func, src->compare_func, + src->copy_data_func, src->free_data_func, + src->get_data_size_func, &src->alloc); + if (dst == NULL) + return NULL; + + for (bucket=0; bucket < src->num_buckets; bucket++) { + entry = TABLE_HEAD(src, bucket); + for (; entry; entry = ENTRY_NEXT(entry)) { + if (src->copy_data_func) { + data = _Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(entry); + new_data = src->copy_data_func(data); + if (new_data != NULL) + err = _Py_hashtable_set(dst, entry->key, + &new_data, src->data_size); + else + err = 1; + } + else { + data = _PY_HASHTABLE_ENTRY_DATA(entry); + err = _Py_hashtable_set(dst, entry->key, data, src->data_size); + } + if (err) { + _Py_hashtable_destroy(dst); + return NULL; + } + } + } + return dst; +} + diff --git a/Modules/hashtable.h b/Modules/hashtable.h new file mode 100644 --- /dev/null +++ b/Modules/hashtable.h @@ -0,0 +1,128 @@ +#ifndef 
Py_HASHTABLE_H +#define Py_HASHTABLE_H + +/* The whole API is private */ +#ifndef Py_LIMITED_API + +typedef struct _Py_slist_item_s { + struct _Py_slist_item_s *next; +} _Py_slist_item_t; + +typedef struct { + _Py_slist_item_t *head; +} _Py_slist_t; + +#define _Py_SLIST_ITEM_NEXT(ITEM) (((_Py_slist_item_t *)ITEM)->next) + +#define _Py_SLIST_HEAD(SLIST) (((_Py_slist_t *)SLIST)->head) + +typedef struct { + /* used by _Py_hashtable_t.buckets to link entries */ + _Py_slist_item_t _Py_slist_item; + + const void *key; + Py_uhash_t key_hash; + + /* data follows */ +} _Py_hashtable_entry_t; + +#define _PY_HASHTABLE_ENTRY_DATA(ENTRY) \ + ((char *)(ENTRY) + sizeof(_Py_hashtable_entry_t)) + +#define _Py_HASHTABLE_ENTRY_DATA_AS_VOID_P(ENTRY) \ + (*(void **)_PY_HASHTABLE_ENTRY_DATA(ENTRY)) + +#define _Py_HASHTABLE_ENTRY_READ_DATA(TABLE, DATA, DATA_SIZE, ENTRY) \ + do { \ + assert((DATA_SIZE) == (TABLE)->data_size); \ + memcpy(DATA, _PY_HASHTABLE_ENTRY_DATA(ENTRY), DATA_SIZE); \ + } while (0) + +typedef Py_uhash_t (*_Py_hashtable_hash_func) (const void *key); +typedef int (*_Py_hashtable_compare_func) (const void *key, const _Py_hashtable_entry_t *he); +typedef void* (*_Py_hashtable_copy_data_func)(void *data); +typedef void (*_Py_hashtable_free_data_func)(void *data); +typedef size_t (*_Py_hashtable_get_data_size_func)(void *data); + +typedef struct { + /* allocate a memory block */ + void* (*malloc) (size_t size); + + /* release a memory block */ + void (*free) (void *ptr); +} _Py_hashtable_allocator_t; + +typedef struct { + size_t num_buckets; + size_t entries; /* Total number of entries in the table. */ + _Py_slist_t *buckets; + size_t data_size; + + _Py_hashtable_hash_func hash_func; + _Py_hashtable_compare_func compare_func; + _Py_hashtable_copy_data_func copy_data_func; + _Py_hashtable_free_data_func free_data_func; + _Py_hashtable_get_data_size_func get_data_size_func; + _Py_hashtable_allocator_t alloc; +} _Py_hashtable_t; + +/* hash and compare functions for integers and pointers */ +PyAPI_FUNC(Py_uhash_t) _Py_hashtable_hash_ptr(const void *key); +PyAPI_FUNC(Py_uhash_t) _Py_hashtable_hash_int(const void *key); +PyAPI_FUNC(int) _Py_hashtable_compare_direct(const void *key, const _Py_hashtable_entry_t *entry); + +PyAPI_FUNC(_Py_hashtable_t *) _Py_hashtable_new( + size_t data_size, + _Py_hashtable_hash_func hash_func, + _Py_hashtable_compare_func compare_func); +PyAPI_FUNC(_Py_hashtable_t *) _Py_hashtable_new_full( + size_t data_size, + size_t init_size, + _Py_hashtable_hash_func hash_func, + _Py_hashtable_compare_func compare_func, + _Py_hashtable_copy_data_func copy_data_func, + _Py_hashtable_free_data_func free_data_func, + _Py_hashtable_get_data_size_func get_data_size_func, + _Py_hashtable_allocator_t *allocator); +PyAPI_FUNC(_Py_hashtable_t *) _Py_hashtable_copy(_Py_hashtable_t *src); +PyAPI_FUNC(void) _Py_hashtable_clear(_Py_hashtable_t *ht); +PyAPI_FUNC(void) _Py_hashtable_destroy(_Py_hashtable_t *ht); + +typedef int (*_Py_hashtable_foreach_func) (_Py_hashtable_entry_t *entry, void *arg); + +PyAPI_FUNC(int) _Py_hashtable_foreach( + _Py_hashtable_t *ht, + _Py_hashtable_foreach_func func, void *arg); +PyAPI_FUNC(size_t) _Py_hashtable_size(_Py_hashtable_t *ht); + +PyAPI_FUNC(_Py_hashtable_entry_t*) _Py_hashtable_get_entry( + _Py_hashtable_t *ht, + const void *key); +PyAPI_FUNC(int) _Py_hashtable_set( + _Py_hashtable_t *ht, + const void *key, + void *data, + size_t data_size); +PyAPI_FUNC(int) _Py_hashtable_get( + _Py_hashtable_t *ht, + const void *key, + void *data, + size_t data_size); 
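The hash table declared here copies a fixed-size data block into each entry, keeps a power-of-two number of buckets, and rehashes when the load factor leaves the [HASHTABLE_LOW; HASHTABLE_HIGH] range (see hashtable.c above). A rough pure-Python model of that behaviour, simplified and not part of the patch: entries hold an arbitrary Python object instead of a fixed-size memory block, and there is no custom allocator.

    class HashTable:
        MIN_SIZE = 16           # HASHTABLE_MIN_SIZE
        LOW, HIGH = 0.10, 0.50  # HASHTABLE_LOW, HASHTABLE_HIGH

        def __init__(self):
            self.buckets = [[] for _ in range(self.MIN_SIZE)]
            self.entries = 0

        def _index(self, key, nbuckets=None):
            # power-of-two table: mask the hash instead of taking a modulo
            return hash(key) & ((nbuckets or len(self.buckets)) - 1)

        def _rehash(self):
            # mirrors round_size(entries * 2.0 / (LOW + HIGH)) in hashtable.c
            new_size = self.MIN_SIZE
            while new_size < self.entries * 2.0 / (self.LOW + self.HIGH):
                new_size <<= 1
            items = [item for bucket in self.buckets for item in bucket]
            self.buckets = [[] for _ in range(new_size)]
            for key, data in items:
                self.buckets[self._index(key, new_size)].append((key, data))

        def set(self, key, data):
            # the key must not be present yet, as in _Py_hashtable_set()
            self.buckets[self._index(key)].append((key, data))
            self.entries += 1
            if self.entries / len(self.buckets) > self.HIGH:
                self._rehash()

        def pop(self, key, default=None):
            bucket = self.buckets[self._index(key)]
            for i, (k, data) in enumerate(bucket):
                if k == key:
                    del bucket[i]
                    self.entries -= 1
                    if self.entries / len(self.buckets) < self.LOW:
                        self._rehash()
                    return data
            return default
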
+PyAPI_FUNC(int) _Py_hashtable_pop( + _Py_hashtable_t *ht, + const void *key, + void *data, + size_t data_size); +PyAPI_FUNC(void) _Py_hashtable_delete( + _Py_hashtable_t *ht, + const void *key); + +#define _Py_HASHTABLE_SET(TABLE, KEY, DATA) \ + _Py_hashtable_set(TABLE, KEY, &(DATA), sizeof(DATA)) + +#define _Py_HASHTABLE_GET(TABLE, KEY, DATA) \ + _Py_hashtable_get(TABLE, KEY, &(DATA), sizeof(DATA)) + +#endif /* Py_LIMITED_API */ + +#endif diff --git a/PC/config.c b/PC/config.c --- a/PC/config.c +++ b/PC/config.c @@ -13,6 +13,7 @@ extern PyObject* PyInit_cmath(void); extern PyObject* PyInit_errno(void); extern PyObject* PyInit_faulthandler(void); +extern PyObject* PyInit__tracemalloc(void); extern PyObject* PyInit_gc(void); extern PyObject* PyInit_math(void); extern PyObject* PyInit__md5(void); @@ -102,6 +103,7 @@ {"msvcrt", PyInit_msvcrt}, {"_locale", PyInit__locale}, #endif + {"_tracemalloc", PyInit__tracemalloc}, /* XXX Should _winapi go in a WIN32 block? not WIN64? */ {"_winapi", PyInit__winapi}, diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -449,6 +449,7 @@ + @@ -517,6 +518,7 @@ + @@ -532,6 +534,7 @@ + @@ -684,4 +687,4 @@ - \ No newline at end of file + diff --git a/Python/pythonrun.c b/Python/pythonrun.c --- a/Python/pythonrun.c +++ b/Python/pythonrun.c @@ -105,6 +105,7 @@ extern int _PyFaulthandler_Init(void); extern void _PyFaulthandler_Fini(void); extern void _PyHash_Fini(void); +extern int _PyTraceMalloc_Init(void); #ifdef WITH_THREAD extern void _PyGILState_Init(PyInterpreterState *, PyThreadState *); @@ -454,6 +455,9 @@ if (install_sigs) initsigs(); /* Signal handling stuff, including initintr() */ + if (_PyTraceMalloc_Init() < 0) + Py_FatalError("Py_Initialize: can't initialize tracemalloc"); + initmain(interp); /* Module __main__ */ if (initstdio() < 0) Py_FatalError( -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 12:50:23 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 23 Nov 2013 12:50:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_Remove_t?= =?utf-8?q?racemalloc=2Eset=5Ftraceback=5Flimit=28=29?= Message-ID: <3dRXrH1mGXz7Lqg@mail.python.org> http://hg.python.org/cpython/rev/66db0c66a6ee changeset: 87402:66db0c66a6ee user: Victor Stinner date: Sat Nov 23 12:37:20 2013 +0100 summary: Issue #18874: Remove tracemalloc.set_traceback_limit() tracemalloc.start() now has an option nframe parameter files: Doc/library/tracemalloc.rst | 36 +++++++---------- Doc/using/cmdline.rst | 18 ++++--- Lib/test/test_tracemalloc.py | 21 ++++----- Modules/_tracemalloc.c | 49 ++++++++--------------- 4 files changed, 52 insertions(+), 72 deletions(-) diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst --- a/Doc/library/tracemalloc.rst +++ b/Doc/library/tracemalloc.rst @@ -21,8 +21,7 @@ By default, a trace of an allocated memory block only stores the most recent frame (1 frame). To store 25 frames at startup: set the :envvar:`PYTHONTRACEMALLOC` environment variable to ``25``, or use the -:option:`-X` ``tracemalloc=25`` command line option. The -:func:`set_traceback_limit` function can be used at runtime to set the limit. +:option:`-X` ``tracemalloc=25`` command line option. .. versionadded:: 3.4 @@ -120,8 +119,8 @@ import linecache import tracemalloc - tracemalloc.set_traceback_limit(25) - tracemalloc.start() + # Store 25 frames + tracemalloc.start(25) # ... run your application ... 
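For illustration, a minimal sketch of the API as it stands after this patch: start() now takes the traceback depth directly, and a snapshot can then be grouped by traceback. The profiled code is elided, and the depth of 25 frames simply mirrors the example above:

    import tracemalloc

    tracemalloc.start(25)              # store up to 25 frames per trace

    # ... run the code being profiled ...

    snapshot = tracemalloc.take_snapshot()
    for stat in snapshot.statistics('traceback')[:3]:
        print(stat)
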
@@ -267,10 +266,10 @@ Get the maximum number of frames stored in the traceback of a trace. - By default, a trace of a memory block only stores the most recent - frame: the limit is ``1``. + The :mod:`tracemalloc` module must be tracing memory allocations to + get the limit, otherwise an exception is raised. - Use the :func:`set_traceback_limit` function to change the limit. + The limit is set by the :func:`start` function. .. function:: get_traced_memory() @@ -294,10 +293,12 @@ See also :func:`start` and :func:`stop` functions. -.. function:: set_traceback_limit(nframe: int) +.. function:: start(nframe: int=1) - Set the maximum number of frames stored in the traceback of a trace. - *nframe* must be greater or equal to ``1``. + Start tracing Python memory allocations: install hooks on Python memory + allocators. Collected tracebacks of traces will be limited to *nframe* + frames. By default, a trace of a memory block only stores the most recent + frame: the limit is ``1``. *nframe* must be greater or equal to ``1``. Storing more than ``1`` frame is only useful to compute statistics grouped by ``'traceback'`` or to compute cumulative statistics: see the @@ -309,17 +310,10 @@ The :envvar:`PYTHONTRACEMALLOC` environment variable (``PYTHONTRACEMALLOC=NFRAME``) and the :option:`-X` ``tracemalloc=NFRAME`` - command line option can be used to set the limit at startup. + command line option can be used to start tracing at startup. - Use the :func:`get_traceback_limit` function to get the current limit. - - -.. function:: start() - - Start tracing Python memory allocations: install hooks on Python memory - allocators. - - See also :func:`stop` and :func:`is_tracing` functions. + See also :func:`stop`, :func:`is_tracing` and :func:`get_traceback_limit` + functions. .. function:: stop() @@ -342,7 +336,7 @@ :mod:`tracemalloc` module started to trace memory allocations. Tracebacks of traces are limited to :func:`get_traceback_limit` frames. Use - :func:`set_traceback_limit` to store more frames. + the *nframe* parameter of the :func:`start` function to store more frames. The :mod:`tracemalloc` module must be tracing memory allocations to take a snapshot, see the the :func:`start` function. diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst --- a/Doc/using/cmdline.rst +++ b/Doc/using/cmdline.rst @@ -381,10 +381,11 @@ * ``-X faulthandler`` to enable :mod:`faulthandler`; * ``-X showrefcount`` to enable the output of the total reference count and memory blocks (only works on debug builds); - * ``-X tracemalloc`` to enable :mod:`tracemalloc`. - * ``-X tracemalloc=NFRAME`` to enable :mod:`tracemalloc`, *NFRAME* is the - maximum number of frames stored in a trace: see the - :func:`tracemalloc.set_traceback_limit` function. + * ``-X tracemalloc`` to start tracing Python memory allocations using the + :mod:`tracemalloc` module. By default, only the most recent frame is + stored in a traceback of a trace. Use ``-X tracemalloc=NFRAME`` to start + tracing with a traceback limit of *NFRAME* frames. See the + :func:`tracemalloc.start` for more information. It also allows to pass arbitrary values and retrieve them through the :data:`sys._xoptions` dictionary. @@ -600,10 +601,11 @@ .. envvar:: PYTHONTRACEMALLOC - If this environment variable is set to a non-empty string, all memory - allocations made by Python are traced by the :mod:`tracemalloc` module. - The value of the variable is the maximum number of frames stored in a trace: - see the :func:`tracemalloc.set_traceback_limit` function. 
+ If this environment variable is set to a non-empty string, start tracing + Python memory allocations using the :mod:`tracemalloc` module. The value of + the variable is the maximum number of frames stored in a traceback of a + trace. For example, ``PYTHONTRACEMALLOC=1`` stores only the most recent + frame. See the :func:`tracemalloc.start` for more information. .. versionadded:: 3.4 diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -83,8 +83,7 @@ if tracemalloc.is_tracing(): self.skipTest("tracemalloc must be stopped before the test") - tracemalloc.set_traceback_limit(1) - tracemalloc.start() + tracemalloc.start(1) def tearDown(self): tracemalloc.stop() @@ -109,20 +108,18 @@ def test_set_traceback_limit(self): obj_size = 10 - nframe = tracemalloc.get_traceback_limit() - self.addCleanup(tracemalloc.set_traceback_limit, nframe) + tracemalloc.stop() + self.assertRaises(ValueError, tracemalloc.start, -1) - self.assertRaises(ValueError, tracemalloc.set_traceback_limit, -1) - - tracemalloc.clear_traces() - tracemalloc.set_traceback_limit(10) + tracemalloc.stop() + tracemalloc.start(10) obj2, obj2_traceback = allocate_bytes(obj_size) traceback = tracemalloc.get_object_traceback(obj2) self.assertEqual(len(traceback), 10) self.assertEqual(traceback, obj2_traceback) - tracemalloc.clear_traces() - tracemalloc.set_traceback_limit(1) + tracemalloc.stop() + tracemalloc.start(1) obj, obj_traceback = allocate_bytes(obj_size) traceback = tracemalloc.get_object_traceback(obj) self.assertEqual(len(traceback), 1) @@ -163,8 +160,8 @@ return allocate_bytes3(size) # Ensure that two identical tracebacks are not duplicated - tracemalloc.clear_traces() - tracemalloc.set_traceback_limit(4) + tracemalloc.stop() + tracemalloc.start(4) obj_size = 123 obj1, obj1_traceback = allocate_bytes4(obj_size) obj2, obj2_traceback = allocate_bytes4(obj_size) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1151,13 +1151,27 @@ } PyDoc_STRVAR(tracemalloc_start_doc, - "start()\n" + "start(nframe: int=1)\n" "\n" - "Start tracing Python memory allocations."); + "Start tracing Python memory allocations. 
Set also the maximum number \n" + "of frames stored in the traceback of a trace to nframe."); static PyObject* -py_tracemalloc_start(PyObject *self) +py_tracemalloc_start(PyObject *self, PyObject *args) { + Py_ssize_t nframe = 1; + + if (!PyArg_ParseTuple(args, "|n:start", &nframe)) + return NULL; + + if (nframe < 1 || nframe > MAX_NFRAME) { + PyErr_Format(PyExc_ValueError, + "the number of frames must be in range [1; %i]", + MAX_NFRAME); + return NULL; + } + tracemalloc_config.max_nframe = Py_SAFE_DOWNCAST(nframe, Py_ssize_t, int); + if (tracemalloc_start() < 0) return NULL; @@ -1192,31 +1206,6 @@ return PyLong_FromLong(tracemalloc_config.max_nframe); } -PyDoc_STRVAR(tracemalloc_set_traceback_limit_doc, - "set_traceback_limit(nframe: int)\n" - "\n" - "Set the maximum number of frames stored in the traceback of a trace."); - -static PyObject* -tracemalloc_set_traceback_limit(PyObject *self, PyObject *args) -{ - Py_ssize_t nframe; - - if (!PyArg_ParseTuple(args, "n:set_traceback_limit", - &nframe)) - return NULL; - - if (nframe < 1 || nframe > MAX_NFRAME) { - PyErr_Format(PyExc_ValueError, - "the number of frames must be in range [1; %i]", - MAX_NFRAME); - return NULL; - } - tracemalloc_config.max_nframe = Py_SAFE_DOWNCAST(nframe, Py_ssize_t, int); - - Py_RETURN_NONE; -} - PyDoc_STRVAR(tracemalloc_get_tracemalloc_memory_doc, "get_tracemalloc_memory() -> int\n" "\n" @@ -1275,13 +1264,11 @@ {"_get_object_traceback", (PyCFunction)py_tracemalloc_get_object_traceback, METH_O, tracemalloc_get_object_traceback_doc}, {"start", (PyCFunction)py_tracemalloc_start, - METH_NOARGS, tracemalloc_start_doc}, + METH_VARARGS, tracemalloc_start_doc}, {"stop", (PyCFunction)py_tracemalloc_stop, METH_NOARGS, tracemalloc_stop_doc}, {"get_traceback_limit", (PyCFunction)py_tracemalloc_get_traceback_limit, METH_NOARGS, tracemalloc_get_traceback_limit_doc}, - {"set_traceback_limit", (PyCFunction)tracemalloc_set_traceback_limit, - METH_VARARGS, tracemalloc_set_traceback_limit_doc}, {"get_tracemalloc_memory", (PyCFunction)tracemalloc_get_tracemalloc_memory, METH_NOARGS, tracemalloc_get_tracemalloc_memory_doc}, {"get_traced_memory", (PyCFunction)tracemalloc_get_traced_memory, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 13:10:14 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 13:10:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Document_asyncio_transport?= =?utf-8?q?_APIs?= Message-ID: <3dRYHB6SGVz7Ll0@mail.python.org> http://hg.python.org/cpython/rev/2e3bee6e682b changeset: 87403:2e3bee6e682b user: Antoine Pitrou date: Sat Nov 23 12:50:52 2013 +0100 summary: Document asyncio transport APIs files: Doc/library/asyncio.rst | 170 ++++++++++++++++++++++++++++ 1 files changed, 170 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -242,6 +242,176 @@ Transports ---------- +Transports are classed provided by :mod:`asyncio` in order to abstract +various kinds of communication channels. You generally won't instantiate +a transport yourself; instead, you will call a :class:`EventLoop` method +which will create the transport and try to initiate the underlying +communication channel, calling you back when it succeeds. + +Once the communication channel is established, a transport is always +paired with a :ref:`protocol ` instance. The protocol can +then call the transport's methods for various purposes. 
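As a hedged sketch of that transport/protocol pairing (the echo protocol, host and port below are invented for illustration): the protocol receives its transport in connection_made() and then drives it through the transport methods listed next.

    import asyncio

    class EchoClient(asyncio.Protocol):
        def connection_made(self, transport):
            print('peer:', transport.get_extra_info('peername'))
            transport.write(b'ping')       # buffered, sent asynchronously

        def data_received(self, data):
            print('received', data)

        def connection_lost(self, exc):
            print('connection closed')

    loop = asyncio.get_event_loop()
    coro = loop.create_connection(EchoClient, 'example.com', 8888)
    transport, protocol = loop.run_until_complete(coro)
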
+ +:mod:`asyncio` currently implements transports for TCP, UDP, SSL, and +subprocess pipes. The methods available on a transport depend on +the transport's kind. + +Methods common to all transports +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. method:: close(self) + + Close the transport. If the transport has a buffer for outgoing + data, buffered data will be flushed asynchronously. No more data + will be received. After all buffered data is flushed, the + protocol's :meth:`connection_lost` method will be called with + :const:`None` as its argument. + + +.. method:: get_extra_info(name, default=None) + + Return optional transport information. *name* is a string representing + the piece of transport-specific information to get, *default* is the + value to return if the information doesn't exist. + + This method allows transport implementations to easily expose + channel-specific information. + +Methods of readable streaming transports +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. method:: pause_reading() + + Pause the receiving end of the transport. No data will be passed to + the protocol's :meth:`data_received` method until meth:`resume_reading` + is called. + +.. method:: resume_reading() + + Resume the receiving end. The protocol's :meth:`data_received` method + will be called once again if some data is available for reading. + +Methods of writable streaming transports +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. method:: write(data) + + Write some *data* bytes to the transport. + + This method does not block; it buffers the data and arranges for it + to be sent out asynchronously. + +.. method:: writelines(list_of_data) + + Write a list (or any iterable) of data bytes to the transport. + This is functionally equivalent to calling :meth:`write` on each + element yielded by the iterable, but may be implemented more efficiently. + +.. method:: write_eof() + + Close the write end of the transport after flushing buffered data. + Data may still be received. + + This method can raise :exc:`NotImplementedError` if the transport + (e.g. SSL) doesn't support half-closes. + +.. method:: can_write_eof() + + Return :const:`True` if the transport supports :meth:`write_eof`, + :const:`False` if not. + +.. method:: abort() + + Close the transport immediately, without waiting for pending operations + to complete. Buffered data will be lost. No more data will be received. + The protocol's :meth:`connection_lost` method will eventually be + called with :const:`None` as its argument. + +.. method:: set_write_buffer_limits(high=None, low=None) + + Set the *high*- and *low*-water limits for write flow control. + + These two values control when call the protocol's + :meth:`pause_writing` and :meth:`resume_writing` methods are called. + If specified, the low-water limit must be less than or equal to the + high-water limit. Neither *high* nor *low* can be negative. + + The defaults are implementation-specific. If only the + high-water limit is given, the low-water limit defaults to a + implementation-specific value less than or equal to the + high-water limit. Setting *high* to zero forces *low* to zero as + well, and causes :meth:`pause_writing` to be called whenever the + buffer becomes non-empty. Setting *low* to zero causes + :meth:`resume_writing` to be called only once the buffer is empty. + Use of zero for either limit is generally sub-optimal as it + reduces opportunities for doing I/O and computation + concurrently. + +.. 
method:: get_write_buffer_size() + + Return the current size of the output buffer used by the transport. + +Methods of datagram transports +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. method:: sendto(data, addr=None) + + Send the *data* bytes to the remote peer given by *addr* (a + transport-dependent target address). If *addr* is :const:`None`, the + data is sent to the target address given on transport creation. + + This method does not block; it buffers the data and arranges for it + to be sent out asynchronously. + +.. method:: abort() + + Close the transport immediately, without waiting for pending operations + to complete. Buffered data will be lost. No more data will be received. + The protocol's :meth:`connection_lost` method will eventually be + called with :const:`None` as its argument. + +Methods of subprocess transports +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. method:: get_pid() + + Return the subprocess process id as an integer. + +.. method:: get_returncode() + + Return the subprocess returncode as an integer or :const:`None` + if it hasn't returned, similarly to the + :attr:`subprocess.Popen.returncode` attribute. + +.. method:: get_pipe_transport(fd) + + Return the transport for the communication pipe correspondong to the + integer file descriptor *fd*. The return value can be a readable or + writable streaming transport, depending on the *fd*. If *fd* doesn't + correspond to a pipe belonging to this transport, :const:`None` is + returned. + +.. method:: send_signal(signal) + + Send the *signal* number to the subprocess, as in + :meth:`subprocess.Popen.send_signal`. + +.. method:: terminate() + + Ask the subprocess to stop, as in :meth:`subprocess.Popen.terminate`. + This method is an alias for the :meth:`close` method. + + On POSIX systems, this method sends SIGTERM to the subprocess. + On Windows, the Windows API function TerminateProcess() is called to + stop the subprocess. + +.. method:: kill(self) + + Kill the subprocess, as in :meth:`subprocess.Popen.kill` + + On POSIX systems, the function sends SIGKILL to the subprocess. + On Windows, this method is an alias for :meth:`terminate`. + .. _sync: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 13:10:16 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 13:10:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Start_documenting_the_even?= =?utf-8?q?t_loop?= Message-ID: <3dRYHD13tqz7Ll0@mail.python.org> http://hg.python.org/cpython/rev/b309f396ad26 changeset: 87404:b309f396ad26 user: Antoine Pitrou date: Sat Nov 23 13:10:08 2013 +0100 summary: Start documenting the event loop files: Doc/library/asyncio.rst | 57 +++++++++++++++++++++++++++++ 1 files changed, 57 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -58,6 +58,63 @@ Event loops ----------- +The event loop is the central execution device provided by :mod:`asyncio`. +It provides multiple facilities, amongst which: + +* Registering, executing and cancelling delayed calls (timeouts) + +* Creating client and server :ref:`transports ` for various + kinds of communication + +* Launching subprocesses and the associated :ref:`transports ` + for communication with an external program + +* Delegating costly function calls to a pool of threads + +Getting an event loop +^^^^^^^^^^^^^^^^^^^^^ + +The easiest way to get an event loop is to call the :func:`get_event_loop` +function. + +.. 
XXX more docs + +Delayed calls +^^^^^^^^^^^^^ + +The event loop has its own internal clock for computing timeouts. +Which clock is used depends on the (platform-specific) event loop +implementation; ideally it is a monotonic clock. This will generally be +a different clock than :func:`time.time`. + +.. method:: time() + + Return the current time, as a :class:`float` value, according to the + event loop's internal clock. + +.. method:: call_later(delay, callback, *args) + + Arrange for the *callback* to be called after the given *delay* + seconds (either an int or float). + + A "handle" is returned: an opaque object with a :meth:`cancel` method + that can be used to cancel the call. + + *callback* will be called exactly once per call to :meth:`call_later`. + If two callbacks are scheduled for exactly the same time, it is + undefined which will be called first. + + The optional positional *args* will be passed to the callback when it + is called. If you want the callback to be called with some named + arguments, use a closure or :func:`functools.partial`. + +.. method:: call_at(when, callback, *args) + + Arrange for the *callback* to be called at the given absolute timestamp + *when* (an int or float), using the same time reference as :meth:`time`. + + This method's behavior is the same as :meth:`call_later`. + .. _protocol: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 13:55:41 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 13:55:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Document_create=5Fconnecti?= =?utf-8?q?on?= Message-ID: <3dRZHd2w2Qz7Lnl@mail.python.org> http://hg.python.org/cpython/rev/566eb4dfb2f1 changeset: 87405:566eb4dfb2f1 user: Antoine Pitrou date: Sat Nov 23 13:55:35 2013 +0100 summary: Document create_connection files: Doc/library/asyncio.rst | 71 +++++++++++++++++++++++++++++ 1 files changed, 71 insertions(+), 0 deletions(-) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -115,6 +115,71 @@ This method's behavior is the same as :meth:`call_later`. +Creating connections +^^^^^^^^^^^^^^^^^^^^ + +.. method:: create_connection(protocol_factory, host=None, port=None, **options) + + Create a streaming transport connection to a given Internet *host* and + *port*. *protocol_factory* must be a callable returning a + :ref:`protocol ` instance. + + This method returns a :ref:`coroutine ` which will try to + establish the connection in the background. When successful, the + coroutine returns a ``(transport, protocol)`` pair. + + The chronological synopsis of the underlying operation is as follows: + + #. The connection is established, and a :ref:`transport ` + is created to represent it. + + #. *protocol_factory* is called without arguments and must return a + :ref:`protocol ` instance. + + #. The protocol instance is tied to the transport, and its + :meth:`connection_made` method is called. + + #. The coroutine returns successfully with the ``(transport, protocol)`` + pair. + + The created transport is an implementation-dependent bidirectional stream. + + .. note:: + *protocol_factory* can be any kind of callable, not necessarily + a class. For example, if you want to use a pre-created + protocol instance, you can pass ``lambda: my_protocol``. 
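A rough sketch of that note (MyProtocol, the host and the port are placeholders): passing ``lambda: my_protocol`` makes create_connection() hand back a pre-created protocol instance instead of instantiating a new one.

    import asyncio

    class MyProtocol(asyncio.Protocol):
        def data_received(self, data):
            print(data)

    my_protocol = MyProtocol()             # pre-created instance
    loop = asyncio.get_event_loop()
    coro = loop.create_connection(lambda: my_protocol, '127.0.0.1', 8888)
    transport, protocol = loop.run_until_complete(coro)
    assert protocol is my_protocol
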
+ + *options* are optional named arguments allowing to change how the + connection is created: + + * *ssl*: if given and not false, a SSL/TLS transport is created + (by default a plain TCP transport is created). If *ssl* is + a :class:`ssl.SSLContext` object, this context is used to create + the transport; if *ssl* is :const:`True`, a context with some + unspecified default settings is used. + + * *server_hostname*, is only for use together with *ssl*, + and sets or overrides the hostname that the target server's certificate + will be matched against. By default the value of the *host* argument + is used. If *host* is empty, there is no default and you must pass a + value for *server_hostname*. If *server_hostname* is an empty + string, hostname matching is disabled (which is a serious security + risk, allowing for man-in-the-middle-attacks). + + * *family*, *proto*, *flags* are the optional address family, protocol + and flags to be passed through to getaddrinfo() for *host* resolution. + If given, these should all be integers from the corresponding + :mod:`socket` module constants. + + * *sock*, if given, should be an existing, already connected + :class:`socket.socket` object to be used by the transport. + If *sock* is given, none of *host*, *port*, *family*, *proto*, *flags* + and *local_addr* should be specified. + + * *local_addr*, if given, is a ``(local_host, local_port)`` tuple used + to bind the socket to locally. The *local_host* and *local_port* + are looked up using getaddrinfo(), similarly to *host* and *port*. + .. _protocol: @@ -470,6 +535,12 @@ On Windows, this method is an alias for :meth:`terminate`. +.. _coroutine: + +Coroutines +---------- + + .. _sync: Synchronization primitives -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 13:57:08 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 13:57:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319292=3A_Add_SSLC?= =?utf-8?q?ontext=2Eload=5Fdefault=5Fcerts=28=29_to_load_default_root_CA?= Message-ID: <3dRZKJ5wwZz7Lnl@mail.python.org> http://hg.python.org/cpython/rev/dfd33140a2b5 changeset: 87406:dfd33140a2b5 user: Christian Heimes date: Sat Nov 23 13:56:58 2013 +0100 summary: Issue #19292: Add SSLContext.load_default_certs() to load default root CA certificates from default stores or system stores. By default the method loads CA certs for authentication of server certs. files: Doc/library/ssl.rst | 31 ++++++++++++++++++++++++++++++- Lib/ssl.py | 28 ++++++++++++++++++++++++++++ Lib/test/test_ssl.py | 32 ++++++++++++++++++++++++++++++++ Misc/NEWS | 4 ++++ 4 files changed, 94 insertions(+), 1 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -681,6 +681,20 @@ .. versionadded:: 3.4 +.. data:: Purpose.SERVER_AUTH + + Option for :meth:`SSLContext.load_default_certs` to load CA certificates + for TLS web server authentication (client side socket). + + .. versionadded:: 3.4 + +.. data:: Purpose.clientAuth + + Option for :meth:`SSLContext.load_default_certs` to load CA certificates + for TLS web client authentication (server side socket). + + .. versionadded:: 3.4 + SSL Sockets ----------- @@ -903,6 +917,22 @@ .. versionchanged:: 3.3 New optional argument *password*. +.. method:: SSLContext.load_default_certs(purpose=Purpose.SERVER_AUTH) + + Load a set of default "certification authority" (CA) certificates from + default locations. 
On Windows it loads CA certs from the ``CA`` and + ``ROOT`` system stores. On other systems it calls + :meth:`SSLContext.set_default_verify_paths`. In the future the method may + load CA certificates from other locations, too. + + The *purpose* flag specifies what kind of CA certificates are loaded. The + default settings :data:`Purpose.SERVER_AUTH` loads certificates, that are + flagged and trusted for TLS web server authentication (client side + sockets). :data:`Purpose.clientAuth` loads CA certificates for client + certificate verification on the server side. + + .. versionadded:: 3.4 + .. method:: SSLContext.load_verify_locations(cafile=None, capath=None, cadata=None) Load a set of "certification authority" (CA) certificates used to validate @@ -931,7 +961,6 @@ .. versionchanged:: 3.4 New optional argument *cadata* - .. method:: SSLContext.get_ca_certs(binary_form=False) Get a list of loaded "certification authority" (CA) certificates. If the diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -92,6 +92,7 @@ import sys import os from collections import namedtuple +from enum import Enum as _Enum import _ssl # if we can't import it, let the error propagate @@ -298,11 +299,19 @@ return super().__new__(cls, *_txt2obj(name, name=True)) +class Purpose(_ASN1Object, _Enum): + """SSLContext purpose flags with X509v3 Extended Key Usage objects + """ + SERVER_AUTH = '1.3.6.1.5.5.7.3.1' + CLIENT_AUTH = '1.3.6.1.5.5.7.3.2' + + class SSLContext(_SSLContext): """An SSLContext holds various SSL-related configuration options and data, such as certificates and possibly a private key.""" __slots__ = ('protocol', '__weakref__') + _windows_cert_stores = ("CA", "ROOT") def __new__(cls, protocol, *args, **kwargs): self = _SSLContext.__new__(cls, protocol) @@ -334,6 +343,25 @@ self._set_npn_protocols(protos) + def _load_windows_store_certs(self, storename, purpose): + certs = bytearray() + for cert, encoding, trust in enum_certificates(storename): + # CA certs are never PKCS#7 encoded + if encoding == "x509_asn": + if trust is True or purpose.oid in trust: + certs.extend(cert) + self.load_verify_locations(cadata=certs) + return certs + + def load_default_certs(self, purpose=Purpose.SERVER_AUTH): + if not isinstance(purpose, _ASN1Object): + raise TypeError(purpose) + if sys.platform == "win32": + for storename in self._windows_cert_stores: + self._load_windows_store_certs(storename, purpose) + else: + self.set_default_verify_paths() + class SSLSocket(socket): """This class implements a subtype of socket.socket that wraps diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -611,6 +611,23 @@ with self.assertRaisesRegex(ValueError, "unknown object 'serverauth'"): ssl._ASN1Object.fromname('serverauth') + def test_purpose_enum(self): + val = ssl._ASN1Object('1.3.6.1.5.5.7.3.1') + self.assertIsInstance(ssl.Purpose.SERVER_AUTH, ssl._ASN1Object) + self.assertEqual(ssl.Purpose.SERVER_AUTH, val) + self.assertEqual(ssl.Purpose.SERVER_AUTH.nid, 129) + self.assertEqual(ssl.Purpose.SERVER_AUTH.shortname, 'serverAuth') + self.assertEqual(ssl.Purpose.SERVER_AUTH.oid, + '1.3.6.1.5.5.7.3.1') + + val = ssl._ASN1Object('1.3.6.1.5.5.7.3.2') + self.assertIsInstance(ssl.Purpose.CLIENT_AUTH, ssl._ASN1Object) + self.assertEqual(ssl.Purpose.CLIENT_AUTH, val) + self.assertEqual(ssl.Purpose.CLIENT_AUTH.nid, 130) + self.assertEqual(ssl.Purpose.CLIENT_AUTH.shortname, 'clientAuth') + self.assertEqual(ssl.Purpose.CLIENT_AUTH.oid, + '1.3.6.1.5.5.7.3.2') + 
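A minimal sketch of the new method from the client side, matching what the tests above exercise (the context settings are illustrative, not a recommendation):

    import ssl

    # Client-side context: trust the platform's root CAs for server auth.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1)
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.load_default_certs(ssl.Purpose.SERVER_AUTH)

    # A server-side context wanting client certificates would instead use:
    # ctx.load_default_certs(ssl.Purpose.CLIENT_AUTH)
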
class ContextTests(unittest.TestCase): @@ -967,6 +984,21 @@ der = ssl.PEM_cert_to_DER_cert(pem) self.assertEqual(ctx.get_ca_certs(True), [der]) + def test_load_default_certs(self): + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.load_default_certs() + + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.load_default_certs(ssl.Purpose.SERVER_AUTH) + ctx.load_default_certs() + + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + ctx.load_default_certs(ssl.Purpose.CLIENT_AUTH) + + ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1) + self.assertRaises(TypeError, ctx.load_default_certs, None) + self.assertRaises(TypeError, ctx.load_default_certs, 'SERVER_AUTH') + class SSLErrorTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,10 @@ Library ------- +- Issue #19292: Add SSLContext.load_default_certs() to load default root CA + certificates from default stores or system stores. By default the method + loads CA certs for authentication of server certs. + - Issue #19673: Add pathlib to the stdlib as a provisional module (PEP 428). - Issue #17916: Added dis.Bytecode.from_traceback() and -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:13:10 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 14:13:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_refleak_introduced_by_?= =?utf-8?q?4f730c045f5f_=28issue_=2318408=29_and_unveiled_by?= Message-ID: <3dRZgp1DR7z7LpS@mail.python.org> http://hg.python.org/cpython/rev/8f556ee0f6ba changeset: 87407:8f556ee0f6ba user: Antoine Pitrou date: Sat Nov 23 14:05:23 2013 +0100 summary: Fix refleak introduced by 4f730c045f5f (issue #18408) and unveiled by 95eea8624d05 (issue #16596). files: Python/ceval.c | 12 ++++-------- 1 files changed, 4 insertions(+), 8 deletions(-) diff --git a/Python/ceval.c b/Python/ceval.c --- a/Python/ceval.c +++ b/Python/ceval.c @@ -3850,20 +3850,16 @@ { PyObject *type, *value, *traceback, *orig_traceback, *arg; int err; - PyErr_Fetch(&type, &value, &traceback); + PyErr_Fetch(&type, &value, &orig_traceback); if (value == NULL) { value = Py_None; Py_INCREF(value); } - PyErr_NormalizeException(&type, &value, &traceback); - orig_traceback = traceback; - if (traceback == NULL) { - Py_INCREF(Py_None); - traceback = Py_None; - } + PyErr_NormalizeException(&type, &value, &orig_traceback); + traceback = (orig_traceback != NULL) ? 
orig_traceback : Py_None; arg = PyTuple_Pack(3, type, value, traceback); if (arg == NULL) { - PyErr_Restore(type, value, traceback); + PyErr_Restore(type, value, orig_traceback); return; } err = call_trace(func, self, f, PyTrace_EXCEPTION, arg); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:30:20 2013 From: python-checkins at python.org (michael.foord) Date: Sat, 23 Nov 2013 14:30:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Adding_Wing_5_project_file?= Message-ID: <3dRb3c0xb7z7Lnl@mail.python.org> http://hg.python.org/cpython/rev/eb4b6709d092 changeset: 87408:eb4b6709d092 parent: 85607:36f3f58fddce user: Michael Foord date: Sun Sep 15 20:02:18 2013 +1200 summary: Adding Wing 5 project file files: Misc/python-wing5.wpr | 18 ++++++++++++++++++ 1 files changed, 18 insertions(+), 0 deletions(-) diff --git a/Misc/python-wing5.wpr b/Misc/python-wing5.wpr new file mode 100644 --- /dev/null +++ b/Misc/python-wing5.wpr @@ -0,0 +1,18 @@ +#!wing +#!version=5.0 +################################################################## +# Wing IDE project file # +################################################################## +[project attributes] +proj.directory-list = [{'dirloc': loc('..'), + 'excludes': [u'.hg', + u'Lib/unittest/__pycache__', + u'Lib/unittest/test/__pycache__', + u'Lib/__pycache__', + u'build', + u'Doc/build'], + 'filter': '*', + 'include_hidden': False, + 'recursive': True, + 'watch_for_changes': True}] +proj.file-type = 'shared' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:30:21 2013 From: python-checkins at python.org (michael.foord) Date: Sat, 23 Nov 2013 14:30:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_shadowed_test?= Message-ID: <3dRb3d2f1Fz7Lp5@mail.python.org> http://hg.python.org/cpython/rev/c8c11082bd0c changeset: 87409:c8c11082bd0c user: Michael Foord date: Sun Sep 15 20:05:19 2013 +1200 summary: Remove shadowed test files: Lib/unittest/test/testmock/testmock.py | 17 +------------ 1 files changed, 2 insertions(+), 15 deletions(-) diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1302,20 +1302,6 @@ self.assertEqual(m.method_calls, []) - def test_attribute_deletion(self): - # this behaviour isn't *useful*, but at least it's now tested... 
- for Klass in Mock, MagicMock, NonCallableMagicMock, NonCallableMock: - m = Klass() - original = m.foo - m.foo = 3 - del m.foo - self.assertEqual(m.foo, original) - - new = m.foo = Mock() - del m.foo - self.assertEqual(m.foo, new) - - def test_mock_parents(self): for Klass in Mock, MagicMock: m = Klass() @@ -1379,7 +1365,8 @@ def test_attribute_deletion(self): - for mock in Mock(), MagicMock(): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): self.assertTrue(hasattr(mock, 'm')) del mock.m -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:30:22 2013 From: python-checkins at python.org (michael.foord) Date: Sat, 23 Nov 2013 14:30:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_17457=3A_extend_test?= =?utf-8?q?_discovery_to_support_namespace_packages?= Message-ID: <3dRb3f5fQyz7LqN@mail.python.org> http://hg.python.org/cpython/rev/d2e5b74e2d18 changeset: 87410:d2e5b74e2d18 parent: 87407:8f556ee0f6ba user: Michael Foord date: Sat Nov 23 13:29:23 2013 +0000 summary: Issue 17457: extend test discovery to support namespace packages files: Lib/unittest/loader.py | 60 ++++++++++- Lib/unittest/test/test_discovery.py | 80 ++++++++++++++++- Misc/NEWS | 3 + Misc/python-wing5.wpr | 18 +++ 4 files changed, 150 insertions(+), 11 deletions(-) diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -61,8 +61,9 @@ def loadTestsFromTestCase(self, testCaseClass): """Return a suite of all tests cases contained in testCaseClass""" if issubclass(testCaseClass, suite.TestSuite): - raise TypeError("Test cases should not be derived from TestSuite." \ - " Maybe you meant to derive from TestCase?") + raise TypeError("Test cases should not be derived from " + "TestSuite. 
Maybe you meant to derive from " + "TestCase?") testCaseNames = self.getTestCaseNames(testCaseClass) if not testCaseNames and hasattr(testCaseClass, 'runTest'): testCaseNames = ['runTest'] @@ -200,6 +201,8 @@ self._top_level_dir = top_level_dir is_not_importable = False + is_namespace = False + tests = [] if os.path.isdir(os.path.abspath(start_dir)): start_dir = os.path.abspath(start_dir) if start_dir != top_level_dir: @@ -213,15 +216,52 @@ else: the_module = sys.modules[start_dir] top_part = start_dir.split('.')[0] - start_dir = os.path.abspath(os.path.dirname((the_module.__file__))) + try: + start_dir = os.path.abspath( + os.path.dirname((the_module.__file__))) + except AttributeError: + # look for namespace packages + try: + spec = the_module.__spec__ + except AttributeError: + spec = None + + if spec and spec.loader is None: + if spec.submodule_search_locations is not None: + is_namespace = True + + for path in the_module.__path__: + if (not set_implicit_top and + not path.startswith(top_level_dir)): + continue + self._top_level_dir = \ + (path.split(the_module.__name__ + .replace(".", os.path.sep))[0]) + tests.extend(self._find_tests(path, + pattern, + namespace=True)) + elif the_module.__name__ in sys.builtin_module_names: + # builtin module + raise TypeError('Can not use builtin modules ' + 'as dotted module names') from None + else: + raise TypeError( + 'don\'t know how to discover from {!r}' + .format(the_module)) from None + if set_implicit_top: - self._top_level_dir = self._get_directory_containing_module(top_part) - sys.path.remove(top_level_dir) + if not is_namespace: + self._top_level_dir = \ + self._get_directory_containing_module(top_part) + sys.path.remove(top_level_dir) + else: + sys.path.remove(top_level_dir) if is_not_importable: raise ImportError('Start directory is not importable: %r' % start_dir) - tests = list(self._find_tests(start_dir, pattern)) + if not is_namespace: + tests = list(self._find_tests(start_dir, pattern)) return self.suiteClass(tests) def _get_directory_containing_module(self, module_name): @@ -254,7 +294,7 @@ # override this method to use alternative matching strategy return fnmatch(path, pattern) - def _find_tests(self, start_dir, pattern): + def _find_tests(self, start_dir, pattern, namespace=False): """Used by discovery. 
Yields test suites it loads.""" paths = sorted(os.listdir(start_dir)) @@ -287,7 +327,8 @@ raise ImportError(msg % (mod_name, module_dir, expected_dir)) yield self.loadTestsFromModule(module) elif os.path.isdir(full_path): - if not os.path.isfile(os.path.join(full_path, '__init__.py')): + if (not namespace and + not os.path.isfile(os.path.join(full_path, '__init__.py'))): continue load_tests = None @@ -304,7 +345,8 @@ # tests loaded from package file yield tests # recurse into the package - yield from self._find_tests(full_path, pattern) + yield from self._find_tests(full_path, pattern, + namespace=namespace) else: try: yield load_tests(self, tests, pattern) diff --git a/Lib/unittest/test/test_discovery.py b/Lib/unittest/test/test_discovery.py --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/unittest/test/test_discovery.py @@ -1,6 +1,8 @@ import os import re import sys +import types +import builtins from test import support import unittest @@ -173,7 +175,7 @@ self.addCleanup(restore_isdir) _find_tests_args = [] - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): _find_tests_args.append((start_dir, pattern)) return ['tests'] loader._find_tests = _find_tests @@ -436,7 +438,7 @@ expectedPath = os.path.abspath(os.path.dirname(unittest.test.__file__)) self.wasRun = False - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): self.wasRun = True self.assertEqual(start_dir, expectedPath) return tests @@ -446,5 +448,79 @@ self.assertEqual(suite._tests, tests) + def test_discovery_from_dotted_path_builtin_modules(self): + + loader = unittest.TestLoader() + + listdir = os.listdir + os.listdir = lambda _: ['test_this_does_not_exist.py'] + isfile = os.path.isfile + isdir = os.path.isdir + os.path.isdir = lambda _: False + orig_sys_path = sys.path[:] + def restore(): + os.path.isfile = isfile + os.path.isdir = isdir + os.listdir = listdir + sys.path[:] = orig_sys_path + self.addCleanup(restore) + + with self.assertRaises(TypeError) as cm: + loader.discover('sys') + self.assertEqual(str(cm.exception), + 'Can not use builtin modules ' + 'as dotted module names') + + def test_discovery_from_dotted_namespace_packages(self): + loader = unittest.TestLoader() + + orig_import = __import__ + package = types.ModuleType('package') + package.__path__ = ['/a', '/b'] + package.__spec__ = types.SimpleNamespace( + loader=None, + submodule_search_locations=['/a', '/b'] + ) + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + _find_tests_args = [] + def _find_tests(start_dir, pattern, namespace=None): + _find_tests_args.append((start_dir, pattern)) + return ['%s/tests' % start_dir] + + loader._find_tests = _find_tests + loader.suiteClass = list + suite = loader.discover('package') + self.assertEqual(suite, ['/a/tests', '/b/tests']) + + def test_discovery_failed_discovery(self): + loader = unittest.TestLoader() + package = types.ModuleType('package') + orig_import = __import__ + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + with self.assertRaises(TypeError) as cm: + loader.discover('package') + self.assertEqual(str(cm.exception), + 'don\'t know how to discover from {!r}' + .format(package)) + + if __name__ == 
'__main__': unittest.main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -479,6 +479,9 @@ Library ------- +- Issue #17457: unittest test discovery now works with namespace packages. + Patch by Claudiu Popa. + - Issue #18235: Fix the sysconfig variables LDSHARED and BLDSHARED under AIX. Patch by David Edelsohn. diff --git a/Misc/python-wing5.wpr b/Misc/python-wing5.wpr new file mode 100644 --- /dev/null +++ b/Misc/python-wing5.wpr @@ -0,0 +1,18 @@ +#!wing +#!version=5.0 +################################################################## +# Wing IDE project file # +################################################################## +[project attributes] +proj.directory-list = [{'dirloc': loc('..'), + 'excludes': [u'.hg', + u'Lib/unittest/__pycache__', + u'Lib/unittest/test/__pycache__', + u'Lib/__pycache__', + u'build', + u'Doc/build'], + 'filter': '*', + 'include_hidden': False, + 'recursive': True, + 'watch_for_changes': True}] +proj.file-type = 'shared' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:30:24 2013 From: python-checkins at python.org (michael.foord) Date: Sat, 23 Nov 2013 14:30:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge?= Message-ID: <3dRb3h0KQSz7Lqg@mail.python.org> http://hg.python.org/cpython/rev/261429c07a4a changeset: 87411:261429c07a4a parent: 87410:d2e5b74e2d18 parent: 87409:c8c11082bd0c user: Michael Foord date: Sat Nov 23 13:30:03 2013 +0000 summary: Merge files: Lib/unittest/test/testmock/testmock.py | 17 +------------ 1 files changed, 2 insertions(+), 15 deletions(-) diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1302,20 +1302,6 @@ self.assertEqual(m.method_calls, []) - def test_attribute_deletion(self): - # this behaviour isn't *useful*, but at least it's now tested... - for Klass in Mock, MagicMock, NonCallableMagicMock, NonCallableMock: - m = Klass() - original = m.foo - m.foo = 3 - del m.foo - self.assertEqual(m.foo, original) - - new = m.foo = Mock() - del m.foo - self.assertEqual(m.foo, new) - - def test_mock_parents(self): for Klass in Mock, MagicMock: m = Klass() @@ -1379,7 +1365,8 @@ def test_attribute_deletion(self): - for mock in Mock(), MagicMock(): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): self.assertTrue(hasattr(mock, 'm')) del mock.m -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:34:54 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 14:34:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NzMz?= =?utf-8?q?=3A_Temporary_disable_test=5Fimage_on_MacOSX=2E?= Message-ID: <3dRb8t5shszQBj@mail.python.org> http://hg.python.org/cpython/rev/9d0a76349eda changeset: 87412:9d0a76349eda branch: 3.3 parent: 87389:03a55e207720 user: Serhiy Storchaka date: Sat Nov 23 15:21:33 2013 +0200 summary: Issue #19733: Temporary disable test_image on MacOSX. 
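The change uses the standard unittest skip decorator; roughly (the test body is elided):

    import sys
    import unittest

    class WidgetTest(unittest.TestCase):
        @unittest.skipIf(sys.platform == 'darwin',
                         'crashes with Cocoa Tk (issue19733)')
        def test_image(self):
            ...
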
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 3 +++ Lib/tkinter/test/widget_tests.py | 4 ++++ 2 files changed, 7 insertions(+), 0 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -1,6 +1,7 @@ import unittest import tkinter import os +import sys from test.support import requires from tkinter.test.support import (tcl_version, requires_tcl, @@ -262,6 +263,8 @@ test_highlightthickness = StandardOptionsTests.test_highlightthickness + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() image = tkinter.PhotoImage('image1') diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -1,5 +1,7 @@ # Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py +import unittest +import sys import tkinter from tkinter.ttk import setup_master, Scale from tkinter.test.support import (tcl_version, requires_tcl, get_tk_patchlevel, @@ -289,6 +291,8 @@ self.checkParam(widget, 'highlightthickness', -2, expected=0, conv=self._conv_pixels) + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() self.checkImageParam(widget, 'image') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:34:56 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 14:34:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319733=3A_Temporary_disable_test=5Fimage_on_MacO?= =?utf-8?b?U1gu?= Message-ID: <3dRb8w0dMPz7LqB@mail.python.org> http://hg.python.org/cpython/rev/71e091ed2588 changeset: 87413:71e091ed2588 parent: 87406:dfd33140a2b5 parent: 87412:9d0a76349eda user: Serhiy Storchaka date: Sat Nov 23 15:22:10 2013 +0200 summary: Issue #19733: Temporary disable test_image on MacOSX. 
files: Lib/tkinter/test/test_tkinter/test_widgets.py | 3 +++ Lib/tkinter/test/widget_tests.py | 4 ++++ 2 files changed, 7 insertions(+), 0 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -1,6 +1,7 @@ import unittest import tkinter import os +import sys from test.support import requires from tkinter.test.support import (tcl_version, requires_tcl, @@ -262,6 +263,8 @@ test_highlightthickness = StandardOptionsTests.test_highlightthickness + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() image = tkinter.PhotoImage('image1') diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -1,5 +1,7 @@ # Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py +import unittest +import sys import tkinter from tkinter.ttk import setup_master, Scale from tkinter.test.support import (tcl_version, requires_tcl, get_tk_patchlevel, @@ -289,6 +291,8 @@ self.checkParam(widget, 'highlightthickness', -2, expected=0, conv=self._conv_pixels) + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() self.checkImageParam(widget, 'image') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:34:57 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 14:34:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NzMz?= =?utf-8?q?=3A_Temporary_disable_test=5Fimage_on_MacOSX=2E?= Message-ID: <3dRb8x2ScZz7LqH@mail.python.org> http://hg.python.org/cpython/rev/3912934e99ba changeset: 87414:3912934e99ba branch: 2.7 parent: 87316:08f282c96fd1 user: Serhiy Storchaka date: Sat Nov 23 15:22:20 2013 +0200 summary: Issue #19733: Temporary disable test_image on MacOSX. 
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 8 ++++++++ Lib/lib-tk/test/widget_tests.py | 4 ++++ 2 files changed, 12 insertions(+), 0 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -1,6 +1,7 @@ import unittest import Tkinter import os +import sys from test.test_support import requires, run_unittest from test_ttk.support import (tcl_version, requires_tcl, get_tk_patchlevel, @@ -259,6 +260,13 @@ test_highlightthickness = StandardOptionsTests.test_highlightthickness.im_func +<<<<<<< +======= + test_highlightthickness = StandardOptionsTests.test_highlightthickness + + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') +>>>>>>> def test_image(self): widget = self.create() image = Tkinter.PhotoImage('image1') diff --git a/Lib/lib-tk/test/widget_tests.py b/Lib/lib-tk/test/widget_tests.py --- a/Lib/lib-tk/test/widget_tests.py +++ b/Lib/lib-tk/test/widget_tests.py @@ -1,5 +1,7 @@ # Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py +import unittest +import sys import Tkinter from ttk import setup_master, Scale from test_ttk.support import (tcl_version, requires_tcl, get_tk_patchlevel, @@ -308,6 +310,8 @@ self.checkParam(widget, 'highlightthickness', -2, expected=0, conv=self._conv_pixels) + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() self.checkImageParam(widget, 'image') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:34:58 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 14:34:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dRb8y5W8Kz7LqH@mail.python.org> http://hg.python.org/cpython/rev/98f275b2598f changeset: 87415:98f275b2598f parent: 87413:71e091ed2588 parent: 87411:261429c07a4a user: Serhiy Storchaka date: Sat Nov 23 15:34:05 2013 +0200 summary: Merge heads files: Lib/unittest/loader.py | 60 ++++++++- Lib/unittest/test/test_discovery.py | 80 +++++++++++++- Lib/unittest/test/testmock/testmock.py | 17 +-- Misc/NEWS | 3 + Misc/python-wing5.wpr | 18 +++ Python/ceval.c | 12 +- 6 files changed, 156 insertions(+), 34 deletions(-) diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -61,8 +61,9 @@ def loadTestsFromTestCase(self, testCaseClass): """Return a suite of all tests cases contained in testCaseClass""" if issubclass(testCaseClass, suite.TestSuite): - raise TypeError("Test cases should not be derived from TestSuite." \ - " Maybe you meant to derive from TestCase?") + raise TypeError("Test cases should not be derived from " + "TestSuite. 
Maybe you meant to derive from " + "TestCase?") testCaseNames = self.getTestCaseNames(testCaseClass) if not testCaseNames and hasattr(testCaseClass, 'runTest'): testCaseNames = ['runTest'] @@ -200,6 +201,8 @@ self._top_level_dir = top_level_dir is_not_importable = False + is_namespace = False + tests = [] if os.path.isdir(os.path.abspath(start_dir)): start_dir = os.path.abspath(start_dir) if start_dir != top_level_dir: @@ -213,15 +216,52 @@ else: the_module = sys.modules[start_dir] top_part = start_dir.split('.')[0] - start_dir = os.path.abspath(os.path.dirname((the_module.__file__))) + try: + start_dir = os.path.abspath( + os.path.dirname((the_module.__file__))) + except AttributeError: + # look for namespace packages + try: + spec = the_module.__spec__ + except AttributeError: + spec = None + + if spec and spec.loader is None: + if spec.submodule_search_locations is not None: + is_namespace = True + + for path in the_module.__path__: + if (not set_implicit_top and + not path.startswith(top_level_dir)): + continue + self._top_level_dir = \ + (path.split(the_module.__name__ + .replace(".", os.path.sep))[0]) + tests.extend(self._find_tests(path, + pattern, + namespace=True)) + elif the_module.__name__ in sys.builtin_module_names: + # builtin module + raise TypeError('Can not use builtin modules ' + 'as dotted module names') from None + else: + raise TypeError( + 'don\'t know how to discover from {!r}' + .format(the_module)) from None + if set_implicit_top: - self._top_level_dir = self._get_directory_containing_module(top_part) - sys.path.remove(top_level_dir) + if not is_namespace: + self._top_level_dir = \ + self._get_directory_containing_module(top_part) + sys.path.remove(top_level_dir) + else: + sys.path.remove(top_level_dir) if is_not_importable: raise ImportError('Start directory is not importable: %r' % start_dir) - tests = list(self._find_tests(start_dir, pattern)) + if not is_namespace: + tests = list(self._find_tests(start_dir, pattern)) return self.suiteClass(tests) def _get_directory_containing_module(self, module_name): @@ -254,7 +294,7 @@ # override this method to use alternative matching strategy return fnmatch(path, pattern) - def _find_tests(self, start_dir, pattern): + def _find_tests(self, start_dir, pattern, namespace=False): """Used by discovery. 
Yields test suites it loads.""" paths = sorted(os.listdir(start_dir)) @@ -287,7 +327,8 @@ raise ImportError(msg % (mod_name, module_dir, expected_dir)) yield self.loadTestsFromModule(module) elif os.path.isdir(full_path): - if not os.path.isfile(os.path.join(full_path, '__init__.py')): + if (not namespace and + not os.path.isfile(os.path.join(full_path, '__init__.py'))): continue load_tests = None @@ -304,7 +345,8 @@ # tests loaded from package file yield tests # recurse into the package - yield from self._find_tests(full_path, pattern) + yield from self._find_tests(full_path, pattern, + namespace=namespace) else: try: yield load_tests(self, tests, pattern) diff --git a/Lib/unittest/test/test_discovery.py b/Lib/unittest/test/test_discovery.py --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/unittest/test/test_discovery.py @@ -1,6 +1,8 @@ import os import re import sys +import types +import builtins from test import support import unittest @@ -173,7 +175,7 @@ self.addCleanup(restore_isdir) _find_tests_args = [] - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): _find_tests_args.append((start_dir, pattern)) return ['tests'] loader._find_tests = _find_tests @@ -436,7 +438,7 @@ expectedPath = os.path.abspath(os.path.dirname(unittest.test.__file__)) self.wasRun = False - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): self.wasRun = True self.assertEqual(start_dir, expectedPath) return tests @@ -446,5 +448,79 @@ self.assertEqual(suite._tests, tests) + def test_discovery_from_dotted_path_builtin_modules(self): + + loader = unittest.TestLoader() + + listdir = os.listdir + os.listdir = lambda _: ['test_this_does_not_exist.py'] + isfile = os.path.isfile + isdir = os.path.isdir + os.path.isdir = lambda _: False + orig_sys_path = sys.path[:] + def restore(): + os.path.isfile = isfile + os.path.isdir = isdir + os.listdir = listdir + sys.path[:] = orig_sys_path + self.addCleanup(restore) + + with self.assertRaises(TypeError) as cm: + loader.discover('sys') + self.assertEqual(str(cm.exception), + 'Can not use builtin modules ' + 'as dotted module names') + + def test_discovery_from_dotted_namespace_packages(self): + loader = unittest.TestLoader() + + orig_import = __import__ + package = types.ModuleType('package') + package.__path__ = ['/a', '/b'] + package.__spec__ = types.SimpleNamespace( + loader=None, + submodule_search_locations=['/a', '/b'] + ) + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + _find_tests_args = [] + def _find_tests(start_dir, pattern, namespace=None): + _find_tests_args.append((start_dir, pattern)) + return ['%s/tests' % start_dir] + + loader._find_tests = _find_tests + loader.suiteClass = list + suite = loader.discover('package') + self.assertEqual(suite, ['/a/tests', '/b/tests']) + + def test_discovery_failed_discovery(self): + loader = unittest.TestLoader() + package = types.ModuleType('package') + orig_import = __import__ + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + with self.assertRaises(TypeError) as cm: + loader.discover('package') + self.assertEqual(str(cm.exception), + 'don\'t know how to discover from {!r}' + .format(package)) + + if __name__ == 
'__main__': unittest.main() diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1302,20 +1302,6 @@ self.assertEqual(m.method_calls, []) - def test_attribute_deletion(self): - # this behaviour isn't *useful*, but at least it's now tested... - for Klass in Mock, MagicMock, NonCallableMagicMock, NonCallableMock: - m = Klass() - original = m.foo - m.foo = 3 - del m.foo - self.assertEqual(m.foo, original) - - new = m.foo = Mock() - del m.foo - self.assertEqual(m.foo, new) - - def test_mock_parents(self): for Klass in Mock, MagicMock: m = Klass() @@ -1379,7 +1365,8 @@ def test_attribute_deletion(self): - for mock in Mock(), MagicMock(): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): self.assertTrue(hasattr(mock, 'm')) del mock.m diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -479,6 +479,9 @@ Library ------- +- Issue #17457: unittest test discovery now works with namespace packages. + Patch by Claudiu Popa. + - Issue #18235: Fix the sysconfig variables LDSHARED and BLDSHARED under AIX. Patch by David Edelsohn. diff --git a/Misc/python-wing5.wpr b/Misc/python-wing5.wpr new file mode 100644 --- /dev/null +++ b/Misc/python-wing5.wpr @@ -0,0 +1,18 @@ +#!wing +#!version=5.0 +################################################################## +# Wing IDE project file # +################################################################## +[project attributes] +proj.directory-list = [{'dirloc': loc('..'), + 'excludes': [u'.hg', + u'Lib/unittest/__pycache__', + u'Lib/unittest/test/__pycache__', + u'Lib/__pycache__', + u'build', + u'Doc/build'], + 'filter': '*', + 'include_hidden': False, + 'recursive': True, + 'watch_for_changes': True}] +proj.file-type = 'shared' diff --git a/Python/ceval.c b/Python/ceval.c --- a/Python/ceval.c +++ b/Python/ceval.c @@ -3850,20 +3850,16 @@ { PyObject *type, *value, *traceback, *orig_traceback, *arg; int err; - PyErr_Fetch(&type, &value, &traceback); + PyErr_Fetch(&type, &value, &orig_traceback); if (value == NULL) { value = Py_None; Py_INCREF(value); } - PyErr_NormalizeException(&type, &value, &traceback); - orig_traceback = traceback; - if (traceback == NULL) { - Py_INCREF(Py_None); - traceback = Py_None; - } + PyErr_NormalizeException(&type, &value, &orig_traceback); + traceback = (orig_traceback != NULL) ? orig_traceback : Py_None; arg = PyTuple_Pack(3, type, value, traceback); if (arg == NULL) { - PyErr_Restore(type, value, traceback); + PyErr_Restore(type, value, orig_traceback); return; } err = call_trace(func, self, f, PyTrace_EXCEPTION, arg); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:53:12 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 14:53:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319716=3A_add_a_te?= =?utf-8?q?st_that_Path=2Etouch=28=29_doesn=27t_change_a_file=27s_contents?= =?utf-8?q?=2E?= Message-ID: <3dRbZ02q7Lz7Lln@mail.python.org> http://hg.python.org/cpython/rev/11a200202d7a changeset: 87416:11a200202d7a user: Antoine Pitrou date: Sat Nov 23 14:52:39 2013 +0100 summary: Issue #19716: add a test that Path.touch() doesn't change a file's contents. Patch by Kushal Das. 
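The behaviour being pinned down by this test can be sketched as follows; the snippet is illustrative only (not part of the patch) and the temporary file name is arbitrary:

    import os
    import pathlib
    import tempfile

    # touch() on an existing file should refresh its timestamps
    # without altering the file's contents.
    fd, name = tempfile.mkstemp()
    os.write(fd, b"this is file A")
    os.close(fd)

    p = pathlib.Path(name)
    p.touch()
    with p.open('rb') as f:
        assert f.read() == b"this is file A"
    os.unlink(name)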
files: Lib/test/test_pathlib.py | 7 +++++++ 1 files changed, 7 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1403,6 +1403,13 @@ self.assertTrue(p.exists()) self.assertRaises(OSError, p.touch, exist_ok=False) + def test_touch_nochange(self): + P = self.cls(BASE) + p = P / 'fileA' + p.touch() + with p.open('rb') as f: + self.assertEqual(f.read().strip(), b"this is file A") + def test_mkdir(self): P = self.cls(BASE) p = P / 'newdirA' -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 14:56:25 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 14:56:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317201=3A_ZIP64_ex?= =?utf-8?q?tensions_now_are_enabled_by_default=2E?= Message-ID: <3dRbdj0PsDzSZ1@mail.python.org> http://hg.python.org/cpython/rev/271cc3660445 changeset: 87417:271cc3660445 user: Serhiy Storchaka date: Sat Nov 23 15:55:38 2013 +0200 summary: Issue #17201: ZIP64 extensions now are enabled by default. Patch by William Mallard. files: Doc/library/zipfile.rst | 17 ++++++++++------- Lib/test/test_zipfile.py | 4 ++-- Lib/test/test_zipfile64.py | 4 ++-- Lib/zipfile.py | 8 ++++---- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 6 files changed, 22 insertions(+), 15 deletions(-) diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -130,7 +130,7 @@ --------------- -.. class:: ZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=False) +.. class:: ZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=True) Open a ZIP file, where *file* can be either a path to a file (a string) or a file-like object. The *mode* parameter should be ``'r'`` to read an existing @@ -147,12 +147,9 @@ :const:`ZIP_BZIP2` or :const:`ZIP_LZMA` is specified but the corresponding module (:mod:`zlib`, :mod:`bz2` or :mod:`lzma`) is not available, :exc:`RuntimeError` is also raised. The default is :const:`ZIP_STORED`. If *allowZip64* is - ``True`` zipfile will create ZIP files that use the ZIP64 extensions when - the zipfile is larger than 2 GiB. If it is false (the default) :mod:`zipfile` + ``True`` (the default) zipfile will create ZIP files that use the ZIP64 + extensions when the zipfile is larger than 2 GiB. If it is false :mod:`zipfile` will raise an exception when the ZIP file would require ZIP64 extensions. - ZIP64 extensions are disabled by default because the default :program:`zip` - and :program:`unzip` commands on Unix (the InfoZIP utilities) don't support - these extensions. If the file is created with mode ``'a'`` or ``'w'`` and then :meth:`closed ` without adding any files to the archive, the appropriate @@ -171,6 +168,9 @@ .. versionchanged:: 3.3 Added support for :mod:`bzip2 ` and :mod:`lzma` compression. + .. versionchanged:: 3.4 + ZIP64 extensions are enabled by default. + .. method:: ZipFile.close() @@ -374,12 +374,15 @@ The :class:`PyZipFile` constructor takes the same parameters as the :class:`ZipFile` constructor, and one additional parameter, *optimize*. -.. class:: PyZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=False, \ +.. class:: PyZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=True, \ optimize=-1) .. versionadded:: 3.2 The *optimize* parameter. + .. versionchanged:: 3.4 + ZIP64 extensions are enabled by default. + Instances have one method in addition to those of :class:`ZipFile` objects: .. 
method:: PyZipFile.writepy(pathname, basename='', filterfunc=None) diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -506,12 +506,12 @@ compression = zipfile.ZIP_STORED def large_file_exception_test(self, f, compression): - with zipfile.ZipFile(f, "w", compression) as zipfp: + with zipfile.ZipFile(f, "w", compression, allowZip64=False) as zipfp: self.assertRaises(zipfile.LargeZipFile, zipfp.write, TESTFN, "another.name") def large_file_exception_test2(self, f, compression): - with zipfile.ZipFile(f, "w", compression) as zipfp: + with zipfile.ZipFile(f, "w", compression, allowZip64=False) as zipfp: self.assertRaises(zipfile.LargeZipFile, zipfp.writestr, "another.name", self.data) diff --git a/Lib/test/test_zipfile64.py b/Lib/test/test_zipfile64.py --- a/Lib/test/test_zipfile64.py +++ b/Lib/test/test_zipfile64.py @@ -38,7 +38,7 @@ def zipTest(self, f, compression): # Create the ZIP archive. - zipfp = zipfile.ZipFile(f, "w", compression, allowZip64=True) + zipfp = zipfile.ZipFile(f, "w", compression) # It will contain enough copies of self.data to reach about 6GB of # raw data to store. @@ -92,7 +92,7 @@ def testMoreThan64kFiles(self): # This test checks that more than 64k files can be added to an archive, # and that the resulting archive can be read properly by ZipFile - zipf = zipfile.ZipFile(TESTFN, mode="w") + zipf = zipfile.ZipFile(TESTFN, mode="w", allowZip64=False) zipf.debug = 100 numfiles = (1 << 16) * 3//2 for i in range(numfiles): diff --git a/Lib/zipfile.py b/Lib/zipfile.py --- a/Lib/zipfile.py +++ b/Lib/zipfile.py @@ -876,7 +876,7 @@ class ZipFile: """ Class with methods to open, read, write, close, list zip files. - z = ZipFile(file, mode="r", compression=ZIP_STORED, allowZip64=False) + z = ZipFile(file, mode="r", compression=ZIP_STORED, allowZip64=True) file: Either the path to the file, or a file-like object. If it is a path, the file will be opened and closed by ZipFile. @@ -892,7 +892,7 @@ fp = None # Set here since __del__ checks it _windows_illegal_name_trans_table = None - def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=False): + def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=True): """Open the ZIP file with mode read "r", write "w" or append "a".""" if mode not in ("r", "w", "a"): raise RuntimeError('ZipFile() requires mode "r", "w", or "a"') @@ -1561,7 +1561,7 @@ """Class to create ZIP archives with Python library files and packages.""" def __init__(self, file, mode="r", compression=ZIP_STORED, - allowZip64=False, optimize=-1): + allowZip64=True, optimize=-1): ZipFile.__init__(self, file, mode=mode, compression=compression, allowZip64=allowZip64) self._optimize = optimize @@ -1783,7 +1783,7 @@ os.path.join(path, nm), os.path.join(zippath, nm)) # else: ignore - with ZipFile(args[1], 'w', allowZip64=True) as zf: + with ZipFile(args[1], 'w') as zf: for src in args[2:]: addToZip(zf, src, os.path.basename(src)) diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -806,6 +806,7 @@ Grzegorz Makarewicz David Malcolm Greg Malcolm +William Mallard Ken Manheimer Vladimir Marangozov Colin Marc diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #17201: ZIP64 extensions now are enabled by default. Patch by + William Mallard. + - Issue #19292: Add SSLContext.load_default_certs() to load default root CA certificates from default stores or system stores. 
By default the method loads CA certs for authentication of server certs. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:00:17 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 23 Nov 2013 15:00:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319634=3A_time=2Es?= =?utf-8?q?trftime=28=22=25y=22=29_now_raises_a_ValueError_on_Solaris_when?= =?utf-8?q?_given?= Message-ID: <3dRbk90qV2z7Lln@mail.python.org> http://hg.python.org/cpython/rev/b7fd5d8e9968 changeset: 87418:b7fd5d8e9968 user: Victor Stinner date: Sat Nov 23 14:59:33 2013 +0100 summary: Issue #19634: time.strftime("%y") now raises a ValueError on Solaris when given a year before 1900. files: Modules/timemodule.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -650,7 +650,7 @@ return NULL; } } -#elif defined(_AIX) && defined(HAVE_WCSFTIME) +#elif (defined(_AIX) || defined(sun)) && defined(HAVE_WCSFTIME) for(outbuf = wcschr(fmt, '%'); outbuf != NULL; outbuf = wcschr(outbuf+2, '%')) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:23:34 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 15:23:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319727=3A_os=2Euti?= =?utf-8?q?me=28=2E=2E=2E=2C_None=29_is_now_potentially_more_precise_under?= =?utf-8?q?_Windows=2E?= Message-ID: <3dRcF23KsRz7Lln@mail.python.org> http://hg.python.org/cpython/rev/6a9e262c5423 changeset: 87419:6a9e262c5423 user: Antoine Pitrou date: Sat Nov 23 15:23:26 2013 +0100 summary: Issue #19727: os.utime(..., None) is now potentially more precise under Windows. files: Misc/NEWS | 3 +++ Modules/posixmodule.c | 9 ++------- 2 files changed, 5 insertions(+), 7 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #19727: os.utime(..., None) is now potentially more precise + under Windows. + - Issue #17201: ZIP64 extensions now are enabled by default. Patch by William Mallard. 
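In Python terms, the code path made more precise here is the times=None form of os.utime(); a minimal sketch, using an arbitrary temporary file that is not part of the patch:

    import os
    import tempfile

    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name

    before = os.stat(path).st_mtime_ns
    # "Set atime and mtime to now"; on Windows this now goes through
    # GetSystemTimeAsFileTime(), as shown in the C diff below.
    os.utime(path, None)
    after = os.stat(path).st_mtime_ns
    assert after >= before
    os.unlink(path)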
diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -4953,13 +4953,8 @@ } if (utime.now) { - SYSTEMTIME now; - GetSystemTime(&now); - if (!SystemTimeToFileTime(&now, &mtime) || - !SystemTimeToFileTime(&now, &atime)) { - PyErr_SetFromWindowsErr(0); - goto exit; - } + GetSystemTimeAsFileTime(&mtime); + atime = mtime; } else { time_t_to_FILE_TIME(utime.atime_s, utime.atime_ns, &atime); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:26:06 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 15:26:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319715=3A_try_the_?= =?utf-8?q?utime=28=2E=2E=2E=2C_None=29_approach_again=2C_now_that_it_shou?= =?utf-8?q?ld_be?= Message-ID: <3dRcHy3yQdz7LqX@mail.python.org> http://hg.python.org/cpython/rev/b5e9d61f6987 changeset: 87420:b5e9d61f6987 user: Antoine Pitrou date: Sat Nov 23 15:25:59 2013 +0100 summary: Issue #19715: try the utime(..., None) approach again, now that it should be more precise under Windows files: Lib/pathlib.py | 4 +--- Lib/test/test_pathlib.py | 7 ++----- 2 files changed, 3 insertions(+), 8 deletions(-) diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -6,7 +6,6 @@ import posixpath import re import sys -import time import weakref try: import threading @@ -1076,9 +1075,8 @@ # First try to bump modification time # Implementation note: GNU touch uses the UTIME_NOW option of # the utimensat() / futimens() functions. - t = time.time() try: - self._accessor.utime(self, (t, t)) + self._accessor.utime(self, None) except OSError: # Avoid exception chaining pass diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1391,11 +1391,8 @@ # The file mtime should be refreshed by calling touch() again p.touch() st = p.stat() - # Issue #19715: there can be an inconsistency under Windows between - # the timestamp rounding when creating a file, and the timestamp - # rounding done when calling utime(). `delta` makes up for this. - delta = 1e-6 if os.name == 'nt' else 0 - self.assertGreaterEqual(st.st_mtime, old_mtime - delta) + self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) + self.assertGreaterEqual(st.st_mtime, old_mtime) # Now with exist_ok=False p = P / 'newfileB' self.assertFalse(p.exists()) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:59:17 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 15:59:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Correct_documentation_clie?= =?utf-8?q?ntAuth_-=3E_CLIENT=5FAUTH?= Message-ID: <3dRd2F1r1Zz7Llr@mail.python.org> http://hg.python.org/cpython/rev/9ee40eec0180 changeset: 87421:9ee40eec0180 parent: 87407:8f556ee0f6ba user: Christian Heimes date: Sat Nov 23 14:42:01 2013 +0100 summary: Correct documentation clientAuth -> CLIENT_AUTH files: Doc/library/ssl.rst | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -688,7 +688,7 @@ .. versionadded:: 3.4 -.. data:: Purpose.clientAuth +.. data:: Purpose.CLIENT_AUTH Option for :meth:`SSLContext.load_default_certs` to load CA certificates for TLS web client authentication (server side socket). 
@@ -928,7 +928,7 @@ The *purpose* flag specifies what kind of CA certificates are loaded. The default settings :data:`Purpose.SERVER_AUTH` loads certificates, that are flagged and trusted for TLS web server authentication (client side - sockets). :data:`Purpose.clientAuth` loads CA certificates for client + sockets). :data:`Purpose.CLIENT_AUTH` loads CA certificates for client certificate verification on the server side. .. versionadded:: 3.4 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:59:18 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 15:59:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319689=3A_Add_ssl?= =?utf-8?q?=2Ecreate=5Fdefault=5Fcontext=28=29_factory_function=2E_It_crea?= =?utf-8?q?tes?= Message-ID: <3dRd2G4vrCz7Lq2@mail.python.org> http://hg.python.org/cpython/rev/63df21e74c65 changeset: 87422:63df21e74c65 user: Christian Heimes date: Sat Nov 23 15:58:30 2013 +0100 summary: Issue #19689: Add ssl.create_default_context() factory function. It creates a new SSLContext object with secure default settings. files: Doc/library/ssl.rst | 18 ++++++++++++++++ Lib/ssl.py | 35 ++++++++++++++++++++++++++++++++ Lib/test/test_ssl.py | 20 ++++++++++++++++++ Misc/NEWS | 3 ++ 4 files changed, 76 insertions(+), 0 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -346,6 +346,24 @@ .. versionchanged:: 3.3 This function is now IPv6-compatible. +.. function:: create_default_context(purpose=Purpose.SERVER_AUTH, cafile=None, capath=None, cadata=None) + + Create a :class:`SSLContext` with default settings. + + The current settings are: :data:`PROTOCOL_TLSv1` with high encryption + cipher suites without RC4 and without unauthenticated cipher suites. The + *purpose* :data:`Purpose.SERVER_AUTH` sets verify_mode to + :data:`CERT_REQUIRED` and either loads CA certs (when at least one of + *cafile*, *capath* or *cadata* is given) or uses + :meth:`SSLContext.load_default_certs` to load default CA certs. + + .. note:: + The protocol, options, cipher and other settings may change to more + restrictive values anytime without prior deprecation. The values + represent a fair balance between maximum compatibility and security. + + .. versionadded:: 3.4 + .. function:: DER_cert_to_PEM_cert(DER_cert_bytes) Given a certificate as a DER-encoded blob of bytes, returns a PEM-encoded diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -165,6 +165,13 @@ # (OpenSSL's default setting is 'DEFAULT:!aNULL:!eNULL') _DEFAULT_CIPHERS = 'DEFAULT:!aNULL:!eNULL:!LOW:!EXPORT:!SSLv2' +# restricted and more secure ciphers +# HIGH: high encryption cipher suites with key length >= 128 bits (no MD5) +# !aNULL: only authenticated cipher suites (no anonymous DH) +# !RC4: no RC4 streaming cipher, RC4 is broken +# !DSS: RSA is preferred over DSA +_RESTRICTED_CIPHERS = 'HIGH:!aNULL:!RC4:!DSS' + class CertificateError(ValueError): pass @@ -363,6 +370,34 @@ self.set_default_verify_paths() +def create_default_context(purpose=Purpose.SERVER_AUTH, *, cafile=None, + capath=None, cadata=None): + """Create a SSLContext object with default settings. + + NOTE: The protocol and settings may change anytime without prior + deprecation. The values represent a fair balance between maximum + compatibility and security. 
+ """ + if not isinstance(purpose, _ASN1Object): + raise TypeError(purpose) + context = SSLContext(PROTOCOL_TLSv1) + # SSLv2 considered harmful. + context.options |= OP_NO_SSLv2 + # disallow ciphers with known vulnerabilities + context.set_ciphers(_RESTRICTED_CIPHERS) + # verify certs in client mode + if purpose == Purpose.SERVER_AUTH: + context.verify_mode = CERT_REQUIRED + if cafile or capath or cadata: + context.load_verify_locations(cafile, capath, cadata) + elif context.verify_mode != CERT_NONE: + # no explicit cafile, capath or cadata but the verify mode is + # CERT_OPTIONAL or CERT_REQUIRED. Let's try to load default system + # root CA certificates for the given purpose. This may fail silently. + context.load_default_certs(purpose) + return context + + class SSLSocket(socket): """This class implements a subtype of socket.socket that wraps the underlying OS socket in an SSL context when necessary, and diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -999,6 +999,26 @@ self.assertRaises(TypeError, ctx.load_default_certs, None) self.assertRaises(TypeError, ctx.load_default_certs, 'SERVER_AUTH') + def test_create_default_context(self): + ctx = ssl.create_default_context() + self.assertEqual(ctx.protocol, ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.verify_mode, ssl.CERT_REQUIRED) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + with open(SIGNING_CA) as f: + cadata = f.read() + ctx = ssl.create_default_context(cafile=SIGNING_CA, capath=CAPATH, + cadata=cadata) + self.assertEqual(ctx.protocol, ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.verify_mode, ssl.CERT_REQUIRED) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH) + self.assertEqual(ctx.protocol, ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + class SSLErrorTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #19689: Add ssl.create_default_context() factory function. It creates + a new SSLContext object with secure default settings. + - Issue #19292: Add SSLContext.load_default_certs() to load default root CA certificates from default stores or system stores. By default the method loads CA certs for authentication of server certs. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 15:59:20 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 15:59:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dRd2J2BClz7Lqd@mail.python.org> http://hg.python.org/cpython/rev/16dd19aa64c8 changeset: 87423:16dd19aa64c8 parent: 87422:63df21e74c65 parent: 87420:b5e9d61f6987 user: Christian Heimes date: Sat Nov 23 15:59:07 2013 +0100 summary: merge files: Doc/library/zipfile.rst | 17 +- Lib/pathlib.py | 4 +- Lib/test/test_pathlib.py | 14 +- Lib/test/test_zipfile.py | 4 +- Lib/test/test_zipfile64.py | 4 +- Lib/tkinter/test/test_tkinter/test_widgets.py | 3 + Lib/tkinter/test/widget_tests.py | 4 + Lib/unittest/loader.py | 60 ++++++- Lib/unittest/test/test_discovery.py | 80 +++++++++- Lib/unittest/test/testmock/testmock.py | 17 +- Lib/zipfile.py | 8 +- Misc/ACKS | 1 + Misc/NEWS | 9 + Misc/python-wing5.wpr | 18 ++ Modules/posixmodule.c | 9 +- Modules/timemodule.c | 2 +- 16 files changed, 197 insertions(+), 57 deletions(-) diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -130,7 +130,7 @@ --------------- -.. class:: ZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=False) +.. class:: ZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=True) Open a ZIP file, where *file* can be either a path to a file (a string) or a file-like object. The *mode* parameter should be ``'r'`` to read an existing @@ -147,12 +147,9 @@ :const:`ZIP_BZIP2` or :const:`ZIP_LZMA` is specified but the corresponding module (:mod:`zlib`, :mod:`bz2` or :mod:`lzma`) is not available, :exc:`RuntimeError` is also raised. The default is :const:`ZIP_STORED`. If *allowZip64* is - ``True`` zipfile will create ZIP files that use the ZIP64 extensions when - the zipfile is larger than 2 GiB. If it is false (the default) :mod:`zipfile` + ``True`` (the default) zipfile will create ZIP files that use the ZIP64 + extensions when the zipfile is larger than 2 GiB. If it is false :mod:`zipfile` will raise an exception when the ZIP file would require ZIP64 extensions. - ZIP64 extensions are disabled by default because the default :program:`zip` - and :program:`unzip` commands on Unix (the InfoZIP utilities) don't support - these extensions. If the file is created with mode ``'a'`` or ``'w'`` and then :meth:`closed ` without adding any files to the archive, the appropriate @@ -171,6 +168,9 @@ .. versionchanged:: 3.3 Added support for :mod:`bzip2 ` and :mod:`lzma` compression. + .. versionchanged:: 3.4 + ZIP64 extensions are enabled by default. + .. method:: ZipFile.close() @@ -374,12 +374,15 @@ The :class:`PyZipFile` constructor takes the same parameters as the :class:`ZipFile` constructor, and one additional parameter, *optimize*. -.. class:: PyZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=False, \ +.. class:: PyZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=True, \ optimize=-1) .. versionadded:: 3.2 The *optimize* parameter. + .. versionchanged:: 3.4 + ZIP64 extensions are enabled by default. + Instances have one method in addition to those of :class:`ZipFile` objects: .. 
method:: PyZipFile.writepy(pathname, basename='', filterfunc=None) diff --git a/Lib/pathlib.py b/Lib/pathlib.py --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -6,7 +6,6 @@ import posixpath import re import sys -import time import weakref try: import threading @@ -1076,9 +1075,8 @@ # First try to bump modification time # Implementation note: GNU touch uses the UTIME_NOW option of # the utimensat() / futimens() functions. - t = time.time() try: - self._accessor.utime(self, (t, t)) + self._accessor.utime(self, None) except OSError: # Avoid exception chaining pass diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1391,11 +1391,8 @@ # The file mtime should be refreshed by calling touch() again p.touch() st = p.stat() - # Issue #19715: there can be an inconsistency under Windows between - # the timestamp rounding when creating a file, and the timestamp - # rounding done when calling utime(). `delta` makes up for this. - delta = 1e-6 if os.name == 'nt' else 0 - self.assertGreaterEqual(st.st_mtime, old_mtime - delta) + self.assertGreaterEqual(st.st_mtime_ns, old_mtime_ns) + self.assertGreaterEqual(st.st_mtime, old_mtime) # Now with exist_ok=False p = P / 'newfileB' self.assertFalse(p.exists()) @@ -1403,6 +1400,13 @@ self.assertTrue(p.exists()) self.assertRaises(OSError, p.touch, exist_ok=False) + def test_touch_nochange(self): + P = self.cls(BASE) + p = P / 'fileA' + p.touch() + with p.open('rb') as f: + self.assertEqual(f.read().strip(), b"this is file A") + def test_mkdir(self): P = self.cls(BASE) p = P / 'newdirA' diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -506,12 +506,12 @@ compression = zipfile.ZIP_STORED def large_file_exception_test(self, f, compression): - with zipfile.ZipFile(f, "w", compression) as zipfp: + with zipfile.ZipFile(f, "w", compression, allowZip64=False) as zipfp: self.assertRaises(zipfile.LargeZipFile, zipfp.write, TESTFN, "another.name") def large_file_exception_test2(self, f, compression): - with zipfile.ZipFile(f, "w", compression) as zipfp: + with zipfile.ZipFile(f, "w", compression, allowZip64=False) as zipfp: self.assertRaises(zipfile.LargeZipFile, zipfp.writestr, "another.name", self.data) diff --git a/Lib/test/test_zipfile64.py b/Lib/test/test_zipfile64.py --- a/Lib/test/test_zipfile64.py +++ b/Lib/test/test_zipfile64.py @@ -38,7 +38,7 @@ def zipTest(self, f, compression): # Create the ZIP archive. - zipfp = zipfile.ZipFile(f, "w", compression, allowZip64=True) + zipfp = zipfile.ZipFile(f, "w", compression) # It will contain enough copies of self.data to reach about 6GB of # raw data to store. 
@@ -92,7 +92,7 @@ def testMoreThan64kFiles(self): # This test checks that more than 64k files can be added to an archive, # and that the resulting archive can be read properly by ZipFile - zipf = zipfile.ZipFile(TESTFN, mode="w") + zipf = zipfile.ZipFile(TESTFN, mode="w", allowZip64=False) zipf.debug = 100 numfiles = (1 << 16) * 3//2 for i in range(numfiles): diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/tkinter/test/test_tkinter/test_widgets.py --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/tkinter/test/test_tkinter/test_widgets.py @@ -1,6 +1,7 @@ import unittest import tkinter import os +import sys from test.support import requires from tkinter.test.support import (tcl_version, requires_tcl, @@ -262,6 +263,8 @@ test_highlightthickness = StandardOptionsTests.test_highlightthickness + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() image = tkinter.PhotoImage('image1') diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/tkinter/test/widget_tests.py --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/tkinter/test/widget_tests.py @@ -1,5 +1,7 @@ # Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py +import unittest +import sys import tkinter from tkinter.ttk import setup_master, Scale from tkinter.test.support import (tcl_version, requires_tcl, get_tk_patchlevel, @@ -289,6 +291,8 @@ self.checkParam(widget, 'highlightthickness', -2, expected=0, conv=self._conv_pixels) + @unittest.skipIf(sys.platform == 'darwin', + 'crashes with Cocoa Tk (issue19733)') def test_image(self): widget = self.create() self.checkImageParam(widget, 'image') diff --git a/Lib/unittest/loader.py b/Lib/unittest/loader.py --- a/Lib/unittest/loader.py +++ b/Lib/unittest/loader.py @@ -61,8 +61,9 @@ def loadTestsFromTestCase(self, testCaseClass): """Return a suite of all tests cases contained in testCaseClass""" if issubclass(testCaseClass, suite.TestSuite): - raise TypeError("Test cases should not be derived from TestSuite." \ - " Maybe you meant to derive from TestCase?") + raise TypeError("Test cases should not be derived from " + "TestSuite. 
Maybe you meant to derive from " + "TestCase?") testCaseNames = self.getTestCaseNames(testCaseClass) if not testCaseNames and hasattr(testCaseClass, 'runTest'): testCaseNames = ['runTest'] @@ -200,6 +201,8 @@ self._top_level_dir = top_level_dir is_not_importable = False + is_namespace = False + tests = [] if os.path.isdir(os.path.abspath(start_dir)): start_dir = os.path.abspath(start_dir) if start_dir != top_level_dir: @@ -213,15 +216,52 @@ else: the_module = sys.modules[start_dir] top_part = start_dir.split('.')[0] - start_dir = os.path.abspath(os.path.dirname((the_module.__file__))) + try: + start_dir = os.path.abspath( + os.path.dirname((the_module.__file__))) + except AttributeError: + # look for namespace packages + try: + spec = the_module.__spec__ + except AttributeError: + spec = None + + if spec and spec.loader is None: + if spec.submodule_search_locations is not None: + is_namespace = True + + for path in the_module.__path__: + if (not set_implicit_top and + not path.startswith(top_level_dir)): + continue + self._top_level_dir = \ + (path.split(the_module.__name__ + .replace(".", os.path.sep))[0]) + tests.extend(self._find_tests(path, + pattern, + namespace=True)) + elif the_module.__name__ in sys.builtin_module_names: + # builtin module + raise TypeError('Can not use builtin modules ' + 'as dotted module names') from None + else: + raise TypeError( + 'don\'t know how to discover from {!r}' + .format(the_module)) from None + if set_implicit_top: - self._top_level_dir = self._get_directory_containing_module(top_part) - sys.path.remove(top_level_dir) + if not is_namespace: + self._top_level_dir = \ + self._get_directory_containing_module(top_part) + sys.path.remove(top_level_dir) + else: + sys.path.remove(top_level_dir) if is_not_importable: raise ImportError('Start directory is not importable: %r' % start_dir) - tests = list(self._find_tests(start_dir, pattern)) + if not is_namespace: + tests = list(self._find_tests(start_dir, pattern)) return self.suiteClass(tests) def _get_directory_containing_module(self, module_name): @@ -254,7 +294,7 @@ # override this method to use alternative matching strategy return fnmatch(path, pattern) - def _find_tests(self, start_dir, pattern): + def _find_tests(self, start_dir, pattern, namespace=False): """Used by discovery. 
Yields test suites it loads.""" paths = sorted(os.listdir(start_dir)) @@ -287,7 +327,8 @@ raise ImportError(msg % (mod_name, module_dir, expected_dir)) yield self.loadTestsFromModule(module) elif os.path.isdir(full_path): - if not os.path.isfile(os.path.join(full_path, '__init__.py')): + if (not namespace and + not os.path.isfile(os.path.join(full_path, '__init__.py'))): continue load_tests = None @@ -304,7 +345,8 @@ # tests loaded from package file yield tests # recurse into the package - yield from self._find_tests(full_path, pattern) + yield from self._find_tests(full_path, pattern, + namespace=namespace) else: try: yield load_tests(self, tests, pattern) diff --git a/Lib/unittest/test/test_discovery.py b/Lib/unittest/test/test_discovery.py --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/unittest/test/test_discovery.py @@ -1,6 +1,8 @@ import os import re import sys +import types +import builtins from test import support import unittest @@ -173,7 +175,7 @@ self.addCleanup(restore_isdir) _find_tests_args = [] - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): _find_tests_args.append((start_dir, pattern)) return ['tests'] loader._find_tests = _find_tests @@ -436,7 +438,7 @@ expectedPath = os.path.abspath(os.path.dirname(unittest.test.__file__)) self.wasRun = False - def _find_tests(start_dir, pattern): + def _find_tests(start_dir, pattern, namespace=None): self.wasRun = True self.assertEqual(start_dir, expectedPath) return tests @@ -446,5 +448,79 @@ self.assertEqual(suite._tests, tests) + def test_discovery_from_dotted_path_builtin_modules(self): + + loader = unittest.TestLoader() + + listdir = os.listdir + os.listdir = lambda _: ['test_this_does_not_exist.py'] + isfile = os.path.isfile + isdir = os.path.isdir + os.path.isdir = lambda _: False + orig_sys_path = sys.path[:] + def restore(): + os.path.isfile = isfile + os.path.isdir = isdir + os.listdir = listdir + sys.path[:] = orig_sys_path + self.addCleanup(restore) + + with self.assertRaises(TypeError) as cm: + loader.discover('sys') + self.assertEqual(str(cm.exception), + 'Can not use builtin modules ' + 'as dotted module names') + + def test_discovery_from_dotted_namespace_packages(self): + loader = unittest.TestLoader() + + orig_import = __import__ + package = types.ModuleType('package') + package.__path__ = ['/a', '/b'] + package.__spec__ = types.SimpleNamespace( + loader=None, + submodule_search_locations=['/a', '/b'] + ) + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + _find_tests_args = [] + def _find_tests(start_dir, pattern, namespace=None): + _find_tests_args.append((start_dir, pattern)) + return ['%s/tests' % start_dir] + + loader._find_tests = _find_tests + loader.suiteClass = list + suite = loader.discover('package') + self.assertEqual(suite, ['/a/tests', '/b/tests']) + + def test_discovery_failed_discovery(self): + loader = unittest.TestLoader() + package = types.ModuleType('package') + orig_import = __import__ + + def _import(packagename, *args, **kwargs): + sys.modules[packagename] = package + return package + + def cleanup(): + builtins.__import__ = orig_import + self.addCleanup(cleanup) + builtins.__import__ = _import + + with self.assertRaises(TypeError) as cm: + loader.discover('package') + self.assertEqual(str(cm.exception), + 'don\'t know how to discover from {!r}' + .format(package)) + + if __name__ == 
'__main__': unittest.main() diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1302,20 +1302,6 @@ self.assertEqual(m.method_calls, []) - def test_attribute_deletion(self): - # this behaviour isn't *useful*, but at least it's now tested... - for Klass in Mock, MagicMock, NonCallableMagicMock, NonCallableMock: - m = Klass() - original = m.foo - m.foo = 3 - del m.foo - self.assertEqual(m.foo, original) - - new = m.foo = Mock() - del m.foo - self.assertEqual(m.foo, new) - - def test_mock_parents(self): for Klass in Mock, MagicMock: m = Klass() @@ -1379,7 +1365,8 @@ def test_attribute_deletion(self): - for mock in Mock(), MagicMock(): + for mock in (Mock(), MagicMock(), NonCallableMagicMock(), + NonCallableMock()): self.assertTrue(hasattr(mock, 'm')) del mock.m diff --git a/Lib/zipfile.py b/Lib/zipfile.py --- a/Lib/zipfile.py +++ b/Lib/zipfile.py @@ -876,7 +876,7 @@ class ZipFile: """ Class with methods to open, read, write, close, list zip files. - z = ZipFile(file, mode="r", compression=ZIP_STORED, allowZip64=False) + z = ZipFile(file, mode="r", compression=ZIP_STORED, allowZip64=True) file: Either the path to the file, or a file-like object. If it is a path, the file will be opened and closed by ZipFile. @@ -892,7 +892,7 @@ fp = None # Set here since __del__ checks it _windows_illegal_name_trans_table = None - def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=False): + def __init__(self, file, mode="r", compression=ZIP_STORED, allowZip64=True): """Open the ZIP file with mode read "r", write "w" or append "a".""" if mode not in ("r", "w", "a"): raise RuntimeError('ZipFile() requires mode "r", "w", or "a"') @@ -1561,7 +1561,7 @@ """Class to create ZIP archives with Python library files and packages.""" def __init__(self, file, mode="r", compression=ZIP_STORED, - allowZip64=False, optimize=-1): + allowZip64=True, optimize=-1): ZipFile.__init__(self, file, mode=mode, compression=compression, allowZip64=allowZip64) self._optimize = optimize @@ -1783,7 +1783,7 @@ os.path.join(path, nm), os.path.join(zippath, nm)) # else: ignore - with ZipFile(args[1], 'w', allowZip64=True) as zf: + with ZipFile(args[1], 'w') as zf: for src in args[2:]: addToZip(zf, src, os.path.basename(src)) diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -806,6 +806,7 @@ Grzegorz Makarewicz David Malcolm Greg Malcolm +William Mallard Ken Manheimer Vladimir Marangozov Colin Marc diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -71,6 +71,12 @@ - Issue #19689: Add ssl.create_default_context() factory function. It creates a new SSLContext object with secure default settings. +- Issue #19727: os.utime(..., None) is now potentially more precise + under Windows. + +- Issue #17201: ZIP64 extensions now are enabled by default. Patch by + William Mallard. + - Issue #19292: Add SSLContext.load_default_certs() to load default root CA certificates from default stores or system stores. By default the method loads CA certs for authentication of server certs. @@ -482,6 +488,9 @@ Library ------- +- Issue #17457: unittest test discovery now works with namespace packages. + Patch by Claudiu Popa. + - Issue #18235: Fix the sysconfig variables LDSHARED and BLDSHARED under AIX. Patch by David Edelsohn. 
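The namespace-package discovery added by Issue #17457 (merged above) can be exercised roughly as follows; the package name and directory layout are purely illustrative:

    import sys
    import unittest

    # Assumed layout for this sketch (no __init__.py anywhere):
    #   /a/nspkg/test_one.py
    #   /b/nspkg/test_two.py
    sys.path[:0] = ['/a', '/b']          # both portions of the namespace package
    loader = unittest.TestLoader()
    suite = loader.discover('nspkg')     # dotted name, resolved via __path__
    unittest.TextTestRunner(verbosity=2).run(suite)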
diff --git a/Misc/python-wing5.wpr b/Misc/python-wing5.wpr new file mode 100644 --- /dev/null +++ b/Misc/python-wing5.wpr @@ -0,0 +1,18 @@ +#!wing +#!version=5.0 +################################################################## +# Wing IDE project file # +################################################################## +[project attributes] +proj.directory-list = [{'dirloc': loc('..'), + 'excludes': [u'.hg', + u'Lib/unittest/__pycache__', + u'Lib/unittest/test/__pycache__', + u'Lib/__pycache__', + u'build', + u'Doc/build'], + 'filter': '*', + 'include_hidden': False, + 'recursive': True, + 'watch_for_changes': True}] +proj.file-type = 'shared' diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -4953,13 +4953,8 @@ } if (utime.now) { - SYSTEMTIME now; - GetSystemTime(&now); - if (!SystemTimeToFileTime(&now, &mtime) || - !SystemTimeToFileTime(&now, &atime)) { - PyErr_SetFromWindowsErr(0); - goto exit; - } + GetSystemTimeAsFileTime(&mtime); + atime = mtime; } else { time_t_to_FILE_TIME(utime.atime_s, utime.atime_ns, &atime); diff --git a/Modules/timemodule.c b/Modules/timemodule.c --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -650,7 +650,7 @@ return NULL; } } -#elif defined(_AIX) && defined(HAVE_WCSFTIME) +#elif (defined(_AIX) || defined(sun)) && defined(HAVE_WCSFTIME) for(outbuf = wcschr(fmt, '%'); outbuf != NULL; outbuf = wcschr(outbuf+2, '%')) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 16:08:09 2013 From: python-checkins at python.org (matthias.klose) Date: Sat, 23 Nov 2013 16:08:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A__-_Modules/=5Fstruct=2Ec_?= =?utf-8?q?=28unpackiter=5Ftype=29=3A_Define_static=2E?= Message-ID: <3dRdDT5nR4zQtj@mail.python.org> http://hg.python.org/cpython/rev/37e4d14a8bad changeset: 87424:37e4d14a8bad user: doko at ubuntu.com date: Sat Nov 23 16:07:55 2013 +0100 summary: - Modules/_struct.c (unpackiter_type): Define static. files: Modules/_struct.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_struct.c b/Modules/_struct.c --- a/Modules/_struct.c +++ b/Modules/_struct.c @@ -1626,7 +1626,7 @@ return result; } -PyTypeObject unpackiter_type = { +static PyTypeObject unpackiter_type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "unpack_iterator", /* tp_name */ sizeof(unpackiterobject), /* tp_basicsize */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 16:16:35 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 16:16:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Tweak_ssl_docs?= Message-ID: <3dRdQC6k2wz7Lr7@mail.python.org> http://hg.python.org/cpython/rev/ebbd2ec5c7cb changeset: 87425:ebbd2ec5c7cb user: Antoine Pitrou date: Sat Nov 23 16:16:29 2013 +0100 summary: Tweak ssl docs files: Doc/library/ssl.rst | 76 +++++++++++++++++++++++--------- 1 files changed, 53 insertions(+), 23 deletions(-) diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -227,6 +227,45 @@ .. versionchanged:: 3.2 New optional argument *ciphers*. + +Context creation +^^^^^^^^^^^^^^^^ + +A convenience function helps create :class:`SSLContext` objects for common +purposes. + +.. function:: create_default_context(purpose=Purpose.SERVER_AUTH, cafile=None, capath=None, cadata=None) + + Return a new :class:`SSLContext` object with default settings for + the given *purpose*. 
The settings are chosen by the :mod:`ssl` module, + and usually represent a higher security level than when calling the + :class:`SSLContext` constructor directly. + + *cafile*, *capath*, *cadata* represent optional CA certificates to + trust for certificate verification, as in + :meth:`SSLContext.load_verify_locations`. If all three are + :const:`None`, this function can choose to trust the system's default + CA certificates instead. + + The settings in Python 3.4 are: :data:`PROTOCOL_TLSv1` with high encryption + cipher suites without RC4 and without unauthenticated cipher suites. + Passing :data:`~Purpose.SERVER_AUTH` as *purpose* sets + :data:`~SSLContext.verify_mode` to :data:`CERT_REQUIRED` and either + loads CA certificates (when at least one of *cafile*, *capath* or *cadata* + is given) or uses :meth:`SSLContext.load_default_certs` to load default + CA certificates. + + .. note:: + The protocol, options, cipher and other settings may change to more + restrictive values anytime without prior deprecation. The values + represent a fair balance between compatibility and security. + + If your application needs specific settings, you should create a + :class:`SSLContext` and apply the settings yourself. + + .. versionadded:: 3.4 + + Random generation ^^^^^^^^^^^^^^^^^ @@ -346,24 +385,6 @@ .. versionchanged:: 3.3 This function is now IPv6-compatible. -.. function:: create_default_context(purpose=Purpose.SERVER_AUTH, cafile=None, capath=None, cadata=None) - - Create a :class:`SSLContext` with default settings. - - The current settings are: :data:`PROTOCOL_TLSv1` with high encryption - cipher suites without RC4 and without unauthenticated cipher suites. The - *purpose* :data:`Purpose.SERVER_AUTH` sets verify_mode to - :data:`CERT_REQUIRED` and either loads CA certs (when at least one of - *cafile*, *capath* or *cadata* is given) or uses - :meth:`SSLContext.load_default_certs` to load default CA certs. - - .. note:: - The protocol, options, cipher and other settings may change to more - restrictive values anytime without prior deprecation. The values - represent a fair balance between maximum compatibility and security. - - .. versionadded:: 3.4 - .. function:: DER_cert_to_PEM_cert(DER_cert_bytes) Given a certificate as a DER-encoded blob of bytes, returns a PEM-encoded @@ -701,15 +722,19 @@ .. data:: Purpose.SERVER_AUTH - Option for :meth:`SSLContext.load_default_certs` to load CA certificates - for TLS web server authentication (client side socket). + Option for :func:`create_default_context` and + :meth:`SSLContext.load_default_certs`. This value indicates that the + context may be used to authenticate Web servers (therefore, it will + be used to create client-side sockets). .. versionadded:: 3.4 .. data:: Purpose.CLIENT_AUTH - Option for :meth:`SSLContext.load_default_certs` to load CA certificates - for TLS web client authentication (server side socket). + Option for :func:`create_default_context` and + :meth:`SSLContext.load_default_certs`. This value indicates that the + context may be used to authenticate Web clients (therefore, it will + be used to create server-side sockets). .. versionadded:: 3.4 @@ -886,7 +911,12 @@ Create a new SSL context. You must pass *protocol* which must be one of the ``PROTOCOL_*`` constants defined in this module. - :data:`PROTOCOL_SSLv23` is recommended for maximum interoperability. + :data:`PROTOCOL_SSLv23` is currently recommended for maximum + interoperability. + + .. 
seealso:: + :func:`create_default_context` lets the :mod:`ssl` module choose + security settings for a given purpose. :class:`SSLContext` objects have the following methods and attributes: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 17:40:46 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 17:40:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319308=3A_fix_the_?= =?utf-8?q?gdb_plugin_on_gdbs_linked_with_Python_3?= Message-ID: <3dRgHL2SWpz7LpG@mail.python.org> http://hg.python.org/cpython/rev/9f2a0043396b changeset: 87426:9f2a0043396b user: Antoine Pitrou date: Sat Nov 23 17:40:36 2013 +0100 summary: Issue #19308: fix the gdb plugin on gdbs linked with Python 3 files: Lib/test/test_gdb.py | 2 +- Tools/gdb/libpython.py | 117 ++++++++++++++++------------ 2 files changed, 68 insertions(+), 51 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -63,7 +63,7 @@ return out.decode('utf-8', 'replace'), err.decode('utf-8', 'replace') # Verify that "gdb" was built with the embedded python support enabled: -gdbpy_version, _ = run_gdb("--eval-command=python import sys; print sys.version_info") +gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") if not gdbpy_version: raise unittest.SkipTest("gdb not built with embedded python support") diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -40,11 +40,21 @@ The module also extends gdb with some python-specific commands. ''' -from __future__ import with_statement + +# NOTE: some gdbs are linked with Python 3, so this file should be dual-syntax +# compatible (2.6+ and 3.0+). See #19308. + +from __future__ import print_function, with_statement import gdb +import os import locale import sys +if sys.version_info[0] >= 3: + unichr = chr + xrange = range + long = int + # Look up the gdb.Type for some standard types: _type_char_ptr = gdb.lookup_type('char').pointer() # char* _type_unsigned_char_ptr = gdb.lookup_type('unsigned char').pointer() # unsigned char* @@ -58,16 +68,16 @@ SIZEOF_VOID_P = _type_void_ptr.sizeof -Py_TPFLAGS_HEAPTYPE = (1L << 9) +Py_TPFLAGS_HEAPTYPE = (1 << 9) -Py_TPFLAGS_LONG_SUBCLASS = (1L << 24) -Py_TPFLAGS_LIST_SUBCLASS = (1L << 25) -Py_TPFLAGS_TUPLE_SUBCLASS = (1L << 26) -Py_TPFLAGS_BYTES_SUBCLASS = (1L << 27) -Py_TPFLAGS_UNICODE_SUBCLASS = (1L << 28) -Py_TPFLAGS_DICT_SUBCLASS = (1L << 29) -Py_TPFLAGS_BASE_EXC_SUBCLASS = (1L << 30) -Py_TPFLAGS_TYPE_SUBCLASS = (1L << 31) +Py_TPFLAGS_LONG_SUBCLASS = (1 << 24) +Py_TPFLAGS_LIST_SUBCLASS = (1 << 25) +Py_TPFLAGS_TUPLE_SUBCLASS = (1 << 26) +Py_TPFLAGS_BYTES_SUBCLASS = (1 << 27) +Py_TPFLAGS_UNICODE_SUBCLASS = (1 << 28) +Py_TPFLAGS_DICT_SUBCLASS = (1 << 29) +Py_TPFLAGS_BASE_EXC_SUBCLASS = (1 << 30) +Py_TPFLAGS_TYPE_SUBCLASS = (1 << 31) MAX_OUTPUT_LEN=1024 @@ -90,32 +100,39 @@ def safe_range(val): # As per range, but don't trust the value too much: cap it to a safety # threshold in case the data was corrupted - return xrange(safety_limit(val)) + return xrange(safety_limit(int(val))) -def write_unicode(file, text): - # Write a byte or unicode string to file. Unicode strings are encoded to - # ENCODING encoding with 'backslashreplace' error handler to avoid - # UnicodeEncodeError. 
- if isinstance(text, unicode): - text = text.encode(ENCODING, 'backslashreplace') - file.write(text) +if sys.version_info[0] >= 3: + def write_unicode(file, text): + file.write(text) +else: + def write_unicode(file, text): + # Write a byte or unicode string to file. Unicode strings are encoded to + # ENCODING encoding with 'backslashreplace' error handler to avoid + # UnicodeEncodeError. + if isinstance(text, unicode): + text = text.encode(ENCODING, 'backslashreplace') + file.write(text) -def os_fsencode(filename): - if not isinstance(filename, unicode): - return filename - encoding = sys.getfilesystemencoding() - if encoding == 'mbcs': - # mbcs doesn't support surrogateescape - return filename.encode(encoding) - encoded = [] - for char in filename: - # surrogateescape error handler - if 0xDC80 <= ord(char) <= 0xDCFF: - byte = chr(ord(char) - 0xDC00) - else: - byte = char.encode(encoding) - encoded.append(byte) - return ''.join(encoded) +try: + os_fsencode = os.fsencode +except ImportError: + def os_fsencode(filename): + if not isinstance(filename, unicode): + return filename + encoding = sys.getfilesystemencoding() + if encoding == 'mbcs': + # mbcs doesn't support surrogateescape + return filename.encode(encoding) + encoded = [] + for char in filename: + # surrogateescape error handler + if 0xDC80 <= ord(char) <= 0xDCFF: + byte = chr(ord(char) - 0xDC00) + else: + byte = char.encode(encoding) + encoded.append(byte) + return ''.join(encoded) class StringTruncated(RuntimeError): pass @@ -322,8 +339,8 @@ # class return cls - #print 'tp_flags = 0x%08x' % tp_flags - #print 'tp_name = %r' % tp_name + #print('tp_flags = 0x%08x' % tp_flags) + #print('tp_name = %r' % tp_name) name_map = {'bool': PyBoolObjectPtr, 'classobj': PyClassObjectPtr, @@ -733,14 +750,14 @@ ''' ob_size = long(self.field('ob_size')) if ob_size == 0: - return 0L + return 0 ob_digit = self.field('ob_digit') if gdb.lookup_type('digit').sizeof == 2: - SHIFT = 15L + SHIFT = 15 else: - SHIFT = 30L + SHIFT = 30 digits = [long(ob_digit[i]) * 2**(SHIFT*i) for i in safe_range(abs(ob_size))] @@ -1595,12 +1612,12 @@ # py-list requires an actual PyEval_EvalFrameEx frame: frame = Frame.get_selected_bytecode_frame() if not frame: - print 'Unable to locate gdb frame for python bytecode interpreter' + print('Unable to locate gdb frame for python bytecode interpreter') return pyop = frame.get_pyop() if not pyop or pyop.is_optimized_out(): - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return filename = pyop.filename() @@ -1656,9 +1673,9 @@ frame = iter_frame if move_up: - print 'Unable to find an older python frame' + print('Unable to find an older python frame') else: - print 'Unable to find a newer python frame' + print('Unable to find a newer python frame') class PyUp(gdb.Command): 'Select and print the python stack frame that called this one (if any)' @@ -1740,23 +1757,23 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return pyop_var, scope = pyop_frame.get_var_by_name(name) if pyop_var: - print ('%s %r = %s' + print('%s %r = %s' % (scope, name, pyop_var.get_truncated_repr(MAX_OUTPUT_LEN))) else: - print '%r not found' % name + print('%r not found' % name) PyPrint() @@ -1774,16 +1791,16 @@ frame = 
Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return for pyop_name, pyop_value in pyop_frame.iter_locals(): - print ('%s = %s' + print('%s = %s' % (pyop_name.proxyval(set()), pyop_value.get_truncated_repr(MAX_OUTPUT_LEN))) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 17:44:15 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 17:44:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MzA4?= =?utf-8?q?=3A_fix_the_gdb_plugin_on_gdbs_linked_with_Python_3?= Message-ID: <3dRgMM0Rd1z7Llf@mail.python.org> http://hg.python.org/cpython/rev/752db82b7933 changeset: 87427:752db82b7933 branch: 3.3 parent: 87412:9d0a76349eda user: Antoine Pitrou date: Sat Nov 23 17:40:36 2013 +0100 summary: Issue #19308: fix the gdb plugin on gdbs linked with Python 3 files: Lib/test/test_gdb.py | 2 +- Tools/gdb/libpython.py | 117 ++++++++++++++++------------ 2 files changed, 68 insertions(+), 51 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -59,7 +59,7 @@ return out.decode('utf-8', 'replace'), err.decode('utf-8', 'replace') # Verify that "gdb" was built with the embedded python support enabled: -gdbpy_version, _ = run_gdb("--eval-command=python import sys; print sys.version_info") +gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") if not gdbpy_version: raise unittest.SkipTest("gdb not built with embedded python support") diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -40,11 +40,21 @@ The module also extends gdb with some python-specific commands. ''' -from __future__ import with_statement + +# NOTE: some gdbs are linked with Python 3, so this file should be dual-syntax +# compatible (2.6+ and 3.0+). See #19308. 
+ +from __future__ import print_function, with_statement import gdb +import os import locale import sys +if sys.version_info[0] >= 3: + unichr = chr + xrange = range + long = int + # Look up the gdb.Type for some standard types: _type_char_ptr = gdb.lookup_type('char').pointer() # char* _type_unsigned_char_ptr = gdb.lookup_type('unsigned char').pointer() # unsigned char* @@ -58,16 +68,16 @@ SIZEOF_VOID_P = _type_void_ptr.sizeof -Py_TPFLAGS_HEAPTYPE = (1L << 9) +Py_TPFLAGS_HEAPTYPE = (1 << 9) -Py_TPFLAGS_LONG_SUBCLASS = (1L << 24) -Py_TPFLAGS_LIST_SUBCLASS = (1L << 25) -Py_TPFLAGS_TUPLE_SUBCLASS = (1L << 26) -Py_TPFLAGS_BYTES_SUBCLASS = (1L << 27) -Py_TPFLAGS_UNICODE_SUBCLASS = (1L << 28) -Py_TPFLAGS_DICT_SUBCLASS = (1L << 29) -Py_TPFLAGS_BASE_EXC_SUBCLASS = (1L << 30) -Py_TPFLAGS_TYPE_SUBCLASS = (1L << 31) +Py_TPFLAGS_LONG_SUBCLASS = (1 << 24) +Py_TPFLAGS_LIST_SUBCLASS = (1 << 25) +Py_TPFLAGS_TUPLE_SUBCLASS = (1 << 26) +Py_TPFLAGS_BYTES_SUBCLASS = (1 << 27) +Py_TPFLAGS_UNICODE_SUBCLASS = (1 << 28) +Py_TPFLAGS_DICT_SUBCLASS = (1 << 29) +Py_TPFLAGS_BASE_EXC_SUBCLASS = (1 << 30) +Py_TPFLAGS_TYPE_SUBCLASS = (1 << 31) MAX_OUTPUT_LEN=1024 @@ -90,32 +100,39 @@ def safe_range(val): # As per range, but don't trust the value too much: cap it to a safety # threshold in case the data was corrupted - return xrange(safety_limit(val)) + return xrange(safety_limit(int(val))) -def write_unicode(file, text): - # Write a byte or unicode string to file. Unicode strings are encoded to - # ENCODING encoding with 'backslashreplace' error handler to avoid - # UnicodeEncodeError. - if isinstance(text, unicode): - text = text.encode(ENCODING, 'backslashreplace') - file.write(text) +if sys.version_info[0] >= 3: + def write_unicode(file, text): + file.write(text) +else: + def write_unicode(file, text): + # Write a byte or unicode string to file. Unicode strings are encoded to + # ENCODING encoding with 'backslashreplace' error handler to avoid + # UnicodeEncodeError. 
+ if isinstance(text, unicode): + text = text.encode(ENCODING, 'backslashreplace') + file.write(text) -def os_fsencode(filename): - if not isinstance(filename, unicode): - return filename - encoding = sys.getfilesystemencoding() - if encoding == 'mbcs': - # mbcs doesn't support surrogateescape - return filename.encode(encoding) - encoded = [] - for char in filename: - # surrogateescape error handler - if 0xDC80 <= ord(char) <= 0xDCFF: - byte = chr(ord(char) - 0xDC00) - else: - byte = char.encode(encoding) - encoded.append(byte) - return ''.join(encoded) +try: + os_fsencode = os.fsencode +except ImportError: + def os_fsencode(filename): + if not isinstance(filename, unicode): + return filename + encoding = sys.getfilesystemencoding() + if encoding == 'mbcs': + # mbcs doesn't support surrogateescape + return filename.encode(encoding) + encoded = [] + for char in filename: + # surrogateescape error handler + if 0xDC80 <= ord(char) <= 0xDCFF: + byte = chr(ord(char) - 0xDC00) + else: + byte = char.encode(encoding) + encoded.append(byte) + return ''.join(encoded) class StringTruncated(RuntimeError): pass @@ -322,8 +339,8 @@ # class return cls - #print 'tp_flags = 0x%08x' % tp_flags - #print 'tp_name = %r' % tp_name + #print('tp_flags = 0x%08x' % tp_flags) + #print('tp_name = %r' % tp_name) name_map = {'bool': PyBoolObjectPtr, 'classobj': PyClassObjectPtr, @@ -733,14 +750,14 @@ ''' ob_size = long(self.field('ob_size')) if ob_size == 0: - return 0L + return 0 ob_digit = self.field('ob_digit') if gdb.lookup_type('digit').sizeof == 2: - SHIFT = 15L + SHIFT = 15 else: - SHIFT = 30L + SHIFT = 30 digits = [long(ob_digit[i]) * 2**(SHIFT*i) for i in safe_range(abs(ob_size))] @@ -1595,12 +1612,12 @@ # py-list requires an actual PyEval_EvalFrameEx frame: frame = Frame.get_selected_bytecode_frame() if not frame: - print 'Unable to locate gdb frame for python bytecode interpreter' + print('Unable to locate gdb frame for python bytecode interpreter') return pyop = frame.get_pyop() if not pyop or pyop.is_optimized_out(): - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return filename = pyop.filename() @@ -1656,9 +1673,9 @@ frame = iter_frame if move_up: - print 'Unable to find an older python frame' + print('Unable to find an older python frame') else: - print 'Unable to find a newer python frame' + print('Unable to find a newer python frame') class PyUp(gdb.Command): 'Select and print the python stack frame that called this one (if any)' @@ -1740,23 +1757,23 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return pyop_var, scope = pyop_frame.get_var_by_name(name) if pyop_var: - print ('%s %r = %s' + print('%s %r = %s' % (scope, name, pyop_var.get_truncated_repr(MAX_OUTPUT_LEN))) else: - print '%r not found' % name + print('%r not found' % name) PyPrint() @@ -1774,16 +1791,16 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return for pyop_name, pyop_value in pyop_frame.iter_locals(): - print ('%s = %s' + print('%s = %s' % (pyop_name.proxyval(set()), 
pyop_value.get_truncated_repr(MAX_OUTPUT_LEN))) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 17:46:21 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 17:46:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_news_about_pdb_fix_for?= =?utf-8?q?_yield=5Bfrom=5D=2E?= Message-ID: <3dRgPn4cpFz7Lln@mail.python.org> http://hg.python.org/cpython/rev/dc3c30dac5e3 changeset: 87428:dc3c30dac5e3 parent: 87426:9f2a0043396b user: Guido van Rossum date: Sat Nov 23 08:46:14 2013 -0800 summary: Add news about pdb fix for yield[from]. files: Misc/NEWS | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -83,6 +83,10 @@ - Issue #19673: Add pathlib to the stdlib as a provisional module (PEP 428). +- Issue #16596: pdb in a generator now properly skips over yield and + yield from rather than stepping out of the generator into its + caller. (This is essential for stepping through asyncio coroutines.) + - Issue #17916: Added dis.Bytecode.from_traceback() and dis.Bytecode.current_offset to easily display "current instruction" markers in the new disassembly API (Patch by Claudiu Popa). -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 17:59:01 2013 From: python-checkins at python.org (victor.stinner) Date: Sat, 23 Nov 2013 17:59:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Isue_=2319634=3A_test=5Fy?= =?utf-8?q?=5Fbefore=5F1900=28=29_is_expected_to_fail_on_Solaris?= Message-ID: <3dRghP0GMhz7LjR@mail.python.org> http://hg.python.org/cpython/rev/e73683514b4d changeset: 87429:e73683514b4d user: Victor Stinner date: Sat Nov 23 17:58:26 2013 +0100 summary: Isue #19634: test_y_before_1900() is expected to fail on Solaris files: Lib/test/test_strftime.py | 4 +++- 1 files changed, 3 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -183,8 +183,10 @@ """ def test_y_before_1900(self): + # Issue #13674, #19634 t = (1899, 1, 1, 0, 0, 0, 0, 0, 0) - if sys.platform == "win32" or sys.platform.startswith("aix"): + if (sys.platform == "win32" + or sys.platform.startswith(("aix", "sunos", "solaris"))): with self.assertRaises(ValueError): time.strftime("%y", t) else: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:09:34 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 18:09:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319668=3A_Added_su?= =?utf-8?q?pport_for_the_cp1125_encoding=2E?= Message-ID: <3dRgwZ1sSvz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/d0fd68ef1aa9 changeset: 87430:d0fd68ef1aa9 parent: 87426:9f2a0043396b user: Serhiy Storchaka date: Sat Nov 23 18:52:23 2013 +0200 summary: Issue #19668: Added support for the cp1125 encoding. 
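For context, a minimal usage sketch of the new codec (illustrative only, not part of the patch; it assumes a Python 3.4+ build where cp1125 and the aliases added below are installed, and the byte values follow the mapping tables in this diff):

    import codecs

    # The aliases registered in Lib/encodings/aliases.py below all resolve
    # to the new codec.
    for alias in ("1125", "ibm1125", "cp866u", "ruscii"):
        assert codecs.lookup(alias).name == "cp1125"

    # U+0490 (GHE WITH UPTURN) is representable in cp1125 but not in cp866;
    # the same byte 0xF2 decodes to a different letter in each code page.
    assert "\u0490".encode("cp1125") == b"\xf2"
    assert b"\xf2".decode("cp866") == "\u0404"   # UKRAINIAN IE in cp866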
files: Doc/library/codecs.rst | 4 ++ Lib/encodings/aliases.py | 6 +++ Lib/encodings/cp866.py | 52 +++++++++++++------------- Lib/test/test_codecs.py | 1 + Lib/test/test_unicode.py | 4 +- Lib/test/test_xml_etree.py | 6 +- Misc/NEWS | 2 + 7 files changed, 44 insertions(+), 31 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -1027,6 +1027,10 @@ +-----------------+--------------------------------+--------------------------------+ | cp1026 | ibm1026 | Turkish | +-----------------+--------------------------------+--------------------------------+ +| cp1125 | 1125, ibm1125, cp866u, ruscii | Ukrainian | +| | | | +| | | .. versionadded:: 3.4 | ++-----------------+--------------------------------+--------------------------------+ | cp1140 | ibm1140 | Western Europe | +-----------------+--------------------------------+--------------------------------+ | cp1250 | windows-1250 | Central and Eastern Europe | diff --git a/Lib/encodings/aliases.py b/Lib/encodings/aliases.py --- a/Lib/encodings/aliases.py +++ b/Lib/encodings/aliases.py @@ -63,6 +63,12 @@ 'csibm1026' : 'cp1026', 'ibm1026' : 'cp1026', + # cp1125 codec + '1125' : 'cp1125', + 'ibm1125' : 'cp1125', + 'cp866u' : 'cp1125', + 'ruscii' : 'cp1125', + # cp1140 codec '1140' : 'cp1140', 'ibm1140' : 'cp1140', diff --git a/Lib/encodings/cp866.py b/Lib/encodings/cp866.py --- a/Lib/encodings/cp866.py +++ b/Lib/encodings/cp866.py @@ -1,4 +1,4 @@ -""" Python Character Mapping Codec generated from 'VENDORS/MICSFT/PC/CP866.TXT' with gencodec.py. +""" Python Character Mapping Codec for CP1125 """#" @@ -32,7 +32,7 @@ def getregentry(): return codecs.CodecInfo( - name='cp866', + name='cp1125', encode=Codec().encode, decode=Codec().decode, incrementalencoder=IncrementalEncoder, @@ -159,14 +159,14 @@ 0x00ef: 0x044f, # CYRILLIC SMALL LETTER YA 0x00f0: 0x0401, # CYRILLIC CAPITAL LETTER IO 0x00f1: 0x0451, # CYRILLIC SMALL LETTER IO - 0x00f2: 0x0404, # CYRILLIC CAPITAL LETTER UKRAINIAN IE - 0x00f3: 0x0454, # CYRILLIC SMALL LETTER UKRAINIAN IE - 0x00f4: 0x0407, # CYRILLIC CAPITAL LETTER YI - 0x00f5: 0x0457, # CYRILLIC SMALL LETTER YI - 0x00f6: 0x040e, # CYRILLIC CAPITAL LETTER SHORT U - 0x00f7: 0x045e, # CYRILLIC SMALL LETTER SHORT U - 0x00f8: 0x00b0, # DEGREE SIGN - 0x00f9: 0x2219, # BULLET OPERATOR + 0x00f2: 0x0490, # CYRILLIC CAPITAL LETTER GHE WITH UPTURN + 0x00f3: 0x0491, # CYRILLIC SMALL LETTER GHE WITH UPTURN + 0x00f4: 0x0404, # CYRILLIC CAPITAL LETTER UKRAINIAN IE + 0x00f5: 0x0454, # CYRILLIC SMALL LETTER UKRAINIAN IE + 0x00f6: 0x0406, # CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I + 0x00f7: 0x0456, # CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I + 0x00f8: 0x0407, # CYRILLIC CAPITAL LETTER YI + 0x00f9: 0x0457, # CYRILLIC SMALL LETTER YI 0x00fa: 0x00b7, # MIDDLE DOT 0x00fb: 0x221a, # SQUARE ROOT 0x00fc: 0x2116, # NUMERO SIGN @@ -420,14 +420,14 @@ '\u044f' # 0x00ef -> CYRILLIC SMALL LETTER YA '\u0401' # 0x00f0 -> CYRILLIC CAPITAL LETTER IO '\u0451' # 0x00f1 -> CYRILLIC SMALL LETTER IO - '\u0404' # 0x00f2 -> CYRILLIC CAPITAL LETTER UKRAINIAN IE - '\u0454' # 0x00f3 -> CYRILLIC SMALL LETTER UKRAINIAN IE - '\u0407' # 0x00f4 -> CYRILLIC CAPITAL LETTER YI - '\u0457' # 0x00f5 -> CYRILLIC SMALL LETTER YI - '\u040e' # 0x00f6 -> CYRILLIC CAPITAL LETTER SHORT U - '\u045e' # 0x00f7 -> CYRILLIC SMALL LETTER SHORT U - '\xb0' # 0x00f8 -> DEGREE SIGN - '\u2219' # 0x00f9 -> BULLET OPERATOR + '\u0490' # 0x00f2 -> CYRILLIC CAPITAL LETTER GHE WITH UPTURN + '\u0491' # 0x00f3 -> CYRILLIC SMALL 
LETTER GHE WITH UPTURN + '\u0404' # 0x00f4 -> CYRILLIC CAPITAL LETTER UKRAINIAN IE + '\u0454' # 0x00f5 -> CYRILLIC SMALL LETTER UKRAINIAN IE + '\u0406' # 0x00f6 -> CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I + '\u0456' # 0x00f7 -> CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I + '\u0407' # 0x00f8 -> CYRILLIC CAPITAL LETTER YI + '\u0457' # 0x00f9 -> CYRILLIC SMALL LETTER YI '\xb7' # 0x00fa -> MIDDLE DOT '\u221a' # 0x00fb -> SQUARE ROOT '\u2116' # 0x00fc -> NUMERO SIGN @@ -569,12 +569,11 @@ 0x007f: 0x007f, # DELETE 0x00a0: 0x00ff, # NO-BREAK SPACE 0x00a4: 0x00fd, # CURRENCY SIGN - 0x00b0: 0x00f8, # DEGREE SIGN 0x00b7: 0x00fa, # MIDDLE DOT 0x0401: 0x00f0, # CYRILLIC CAPITAL LETTER IO - 0x0404: 0x00f2, # CYRILLIC CAPITAL LETTER UKRAINIAN IE - 0x0407: 0x00f4, # CYRILLIC CAPITAL LETTER YI - 0x040e: 0x00f6, # CYRILLIC CAPITAL LETTER SHORT U + 0x0404: 0x00f4, # CYRILLIC CAPITAL LETTER UKRAINIAN IE + 0x0406: 0x00f6, # CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I + 0x0407: 0x00f8, # CYRILLIC CAPITAL LETTER YI 0x0410: 0x0080, # CYRILLIC CAPITAL LETTER A 0x0411: 0x0081, # CYRILLIC CAPITAL LETTER BE 0x0412: 0x0082, # CYRILLIC CAPITAL LETTER VE @@ -640,11 +639,12 @@ 0x044e: 0x00ee, # CYRILLIC SMALL LETTER YU 0x044f: 0x00ef, # CYRILLIC SMALL LETTER YA 0x0451: 0x00f1, # CYRILLIC SMALL LETTER IO - 0x0454: 0x00f3, # CYRILLIC SMALL LETTER UKRAINIAN IE - 0x0457: 0x00f5, # CYRILLIC SMALL LETTER YI - 0x045e: 0x00f7, # CYRILLIC SMALL LETTER SHORT U + 0x0454: 0x00f5, # CYRILLIC SMALL LETTER UKRAINIAN IE + 0x0456: 0x00f7, # CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I + 0x0457: 0x00f9, # CYRILLIC SMALL LETTER YI + 0x0490: 0x00f2, # CYRILLIC CAPITAL LETTER GHE WITH UPTURN + 0x0491: 0x00f3, # CYRILLIC SMALL LETTER GHE WITH UPTURN 0x2116: 0x00fc, # NUMERO SIGN - 0x2219: 0x00f9, # BULLET OPERATOR 0x221a: 0x00fb, # SQUARE ROOT 0x2500: 0x00c4, # BOX DRAWINGS LIGHT HORIZONTAL 0x2502: 0x00b3, # BOX DRAWINGS LIGHT VERTICAL diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -1602,6 +1602,7 @@ "cp037", "cp1006", "cp1026", + "cp1125", "cp1140", "cp1250", "cp1251", diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -1834,7 +1834,7 @@ 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', - 'cp863', 'cp865', 'cp866', + 'cp863', 'cp865', 'cp866', 'cp1125', 'iso8859_10', 'iso8859_13', 'iso8859_14', 'iso8859_15', 'iso8859_2', 'iso8859_3', 'iso8859_4', 'iso8859_5', 'iso8859_6', 'iso8859_7', 'iso8859_9', 'koi8_r', 'latin_1', @@ -1862,7 +1862,7 @@ 'cp037', 'cp1026', 'cp273', 'cp437', 'cp500', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp858', 'cp860', 'cp861', 'cp862', - 'cp863', 'cp865', 'cp866', + 'cp863', 'cp865', 'cp866', 'cp1125', 'iso8859_10', 'iso8859_13', 'iso8859_14', 'iso8859_15', 'iso8859_2', 'iso8859_4', 'iso8859_5', 'iso8859_9', 'koi8_r', 'latin_1', diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -692,9 +692,9 @@ 'iso8859-13', 'iso8859-14', 'iso8859-15', 'iso8859-16', 'cp437', 'cp720', 'cp737', 'cp775', 'cp850', 'cp852', 'cp855', 'cp856', 'cp857', 'cp858', 'cp860', 'cp861', 'cp862', - 'cp863', 'cp865', 'cp866', 'cp869', 'cp874', 'cp1006', 'cp1250', - 'cp1251', 'cp1252', 'cp1253', 'cp1254', 'cp1255', 'cp1256', - 'cp1257', 'cp1258', + 'cp863', 'cp865', 'cp866', 'cp869', 'cp874', 
'cp1006', 'cp1125', + 'cp1250', 'cp1251', 'cp1252', 'cp1253', 'cp1254', 'cp1255', + 'cp1256', 'cp1257', 'cp1258', 'mac-cyrillic', 'mac-greek', 'mac-iceland', 'mac-latin2', 'mac-roman', 'mac-turkish', 'iso2022-jp', 'iso2022-jp-1', 'iso2022-jp-2', 'iso2022-jp-2004', diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,8 @@ Library ------- +- Issue #19668: Added support for the cp1125 encoding. + - Issue #19689: Add ssl.create_default_context() factory function. It creates a new SSLContext object with secure default settings. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:09:35 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 18:09:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dRgwb3czVz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/8344e273c862 changeset: 87431:8344e273c862 parent: 87430:d0fd68ef1aa9 parent: 87429:e73683514b4d user: Serhiy Storchaka date: Sat Nov 23 19:08:38 2013 +0200 summary: Merge heads files: Lib/test/test_strftime.py | 4 +++- Misc/NEWS | 4 ++++ 2 files changed, 7 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_strftime.py b/Lib/test/test_strftime.py --- a/Lib/test/test_strftime.py +++ b/Lib/test/test_strftime.py @@ -183,8 +183,10 @@ """ def test_y_before_1900(self): + # Issue #13674, #19634 t = (1899, 1, 1, 0, 0, 0, 0, 0, 0) - if sys.platform == "win32" or sys.platform.startswith("aix"): + if (sys.platform == "win32" + or sys.platform.startswith(("aix", "sunos", "solaris"))): with self.assertRaises(ValueError): time.strftime("%y", t) else: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -85,6 +85,10 @@ - Issue #19673: Add pathlib to the stdlib as a provisional module (PEP 428). +- Issue #16596: pdb in a generator now properly skips over yield and + yield from rather than stepping out of the generator into its + caller. (This is essential for stepping through asyncio coroutines.) + - Issue #17916: Added dis.Bytecode.from_traceback() and dis.Bytecode.current_offset to easily display "current instruction" markers in the new disassembly API (Patch by Claudiu Popa). 
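As a side note on the strftime hunk merged above, the platform split it encodes can be summarised in a short sketch (illustrative only; which branch runs depends on the platform's C library):

    import sys
    import time

    t = (1899, 1, 1, 0, 0, 0, 0, 0, 0)
    if (sys.platform == "win32"
            or sys.platform.startswith(("aix", "sunos", "solaris"))):
        # These platforms' C libraries reject %y for years before 1900.
        try:
            time.strftime("%y", t)
        except ValueError:
            pass
    else:
        print(time.strftime("%y", t))   # the two-digit year, typically '99'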
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:19:44 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 18:19:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5MzA4?= =?utf-8?q?=3A_fix_the_gdb_plugin_on_gdbs_linked_with_Python_3?= Message-ID: <3dRh8J07GFz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/ef4636faf8bd changeset: 87432:ef4636faf8bd branch: 2.7 parent: 87414:3912934e99ba user: Antoine Pitrou date: Sat Nov 23 17:40:36 2013 +0100 summary: Issue #19308: fix the gdb plugin on gdbs linked with Python 3 files: Lib/test/test_gdb.py | 4 +- Tools/gdb/libpython.py | 89 ++++++++++++++++++++--------- 2 files changed, 63 insertions(+), 30 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -49,7 +49,7 @@ return out, err # Verify that "gdb" was built with the embedded python support enabled: -gdbpy_version, _ = run_gdb("--eval-command=python import sys; print sys.version_info") +gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") if not gdbpy_version: raise unittest.SkipTest("gdb not built with embedded python support") @@ -214,7 +214,7 @@ # matches repr(value) in this process: gdb_repr, gdb_output = self.get_gdb_repr('print ' + repr(val), cmds_after_breakpoint) - self.assertEqual(gdb_repr, repr(val), gdb_output) + self.assertEqual(gdb_repr, repr(val)) def test_int(self): 'Verify the pretty-printing of various "int" values' diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -39,10 +39,20 @@ The module also extends gdb with some python-specific commands. ''' -from __future__ import with_statement + +# NOTE: some gdbs are linked with Python 3, so this file should be dual-syntax +# compatible (2.6+ and 3.0+). See #19308. 
+ +from __future__ import print_function, with_statement import gdb +import os import sys +if sys.version_info[0] >= 3: + unichr = chr + xrange = range + long = int + # Look up the gdb.Type for some standard types: _type_char_ptr = gdb.lookup_type('char').pointer() # char* _type_unsigned_char_ptr = gdb.lookup_type('unsigned char').pointer() # unsigned char* @@ -51,17 +61,17 @@ SIZEOF_VOID_P = _type_void_ptr.sizeof -Py_TPFLAGS_HEAPTYPE = (1L << 9) +Py_TPFLAGS_HEAPTYPE = (1 << 9) -Py_TPFLAGS_INT_SUBCLASS = (1L << 23) -Py_TPFLAGS_LONG_SUBCLASS = (1L << 24) -Py_TPFLAGS_LIST_SUBCLASS = (1L << 25) -Py_TPFLAGS_TUPLE_SUBCLASS = (1L << 26) -Py_TPFLAGS_STRING_SUBCLASS = (1L << 27) -Py_TPFLAGS_UNICODE_SUBCLASS = (1L << 28) -Py_TPFLAGS_DICT_SUBCLASS = (1L << 29) -Py_TPFLAGS_BASE_EXC_SUBCLASS = (1L << 30) -Py_TPFLAGS_TYPE_SUBCLASS = (1L << 31) +Py_TPFLAGS_INT_SUBCLASS = (1 << 23) +Py_TPFLAGS_LONG_SUBCLASS = (1 << 24) +Py_TPFLAGS_LIST_SUBCLASS = (1 << 25) +Py_TPFLAGS_TUPLE_SUBCLASS = (1 << 26) +Py_TPFLAGS_STRING_SUBCLASS = (1 << 27) +Py_TPFLAGS_UNICODE_SUBCLASS = (1 << 28) +Py_TPFLAGS_DICT_SUBCLASS = (1 << 29) +Py_TPFLAGS_BASE_EXC_SUBCLASS = (1 << 30) +Py_TPFLAGS_TYPE_SUBCLASS = (1 << 31) MAX_OUTPUT_LEN=1024 @@ -80,7 +90,7 @@ def safe_range(val): # As per range, but don't trust the value too much: cap it to a safety # threshold in case the data was corrupted - return xrange(safety_limit(val)) + return xrange(safety_limit(int(val))) class StringTruncated(RuntimeError): @@ -292,8 +302,8 @@ # class return cls - #print 'tp_flags = 0x%08x' % tp_flags - #print 'tp_name = %r' % tp_name + #print('tp_flags = 0x%08x' % tp_flags) + #print('tp_name = %r' % tp_name) name_map = {'bool': PyBoolObjectPtr, 'classobj': PyClassObjectPtr, @@ -758,14 +768,14 @@ ''' ob_size = long(self.field('ob_size')) if ob_size == 0: - return 0L + return 0 ob_digit = self.field('ob_digit') if gdb.lookup_type('digit').sizeof == 2: - SHIFT = 15L + SHIFT = 15 else: - SHIFT = 30L + SHIFT = 30 digits = [long(ob_digit[i]) * 2**(SHIFT*i) for i in safe_range(abs(ob_size))] @@ -774,6 +784,12 @@ result = -result return result + def write_repr(self, out, visited): + # This ensures the trailing 'L' is printed when gdb is linked + # with a Python 3 interpreter. + out.write(repr(self.proxyval(visited)).rstrip('L')) + out.write('L') + class PyNoneStructPtr(PyObjectPtr): """ @@ -969,11 +985,19 @@ field_ob_size = self.field('ob_size') field_ob_sval = self.field('ob_sval') char_ptr = field_ob_sval.address.cast(_type_unsigned_char_ptr) + # When gdb is linked with a Python 3 interpreter, this is really + # a latin-1 mojibake decoding of the original string... return ''.join([chr(char_ptr[i]) for i in safe_range(field_ob_size)]) def proxyval(self, visited): return str(self) + def write_repr(self, out, visited): + val = repr(self.proxyval(visited)) + if sys.version_info[0] >= 3: + val = val.encode('ascii', 'backslashreplace').decode('ascii') + out.write(val) + class PyTupleObjectPtr(PyObjectPtr): _typename = 'PyTupleObject' @@ -1072,6 +1096,15 @@ result = u''.join([_unichr(ucs) for ucs in Py_UNICODEs]) return result + def write_repr(self, out, visited): + val = repr(self.proxyval(visited)) + if sys.version_info[0] >= 3: + val = val.encode('ascii', 'backslashreplace').decode('ascii') + # This ensures the 'u' prefix is printed when gdb is linked + # with a Python 3 interpreter. 
+ out.write('u') + out.write(val.lstrip('u')) + def int_from_int(gdbval): return int(str(gdbval)) @@ -1295,12 +1328,12 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop = frame.get_pyop() if not pyop or pyop.is_optimized_out(): - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return filename = pyop.filename() @@ -1350,9 +1383,9 @@ frame = iter_frame if move_up: - print 'Unable to find an older python frame' + print('Unable to find an older python frame') else: - print 'Unable to find a newer python frame' + print('Unable to find a newer python frame') class PyUp(gdb.Command): 'Select and print the python stack frame that called this one (if any)' @@ -1415,23 +1448,23 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return pyop_var, scope = pyop_frame.get_var_by_name(name) if pyop_var: - print ('%s %r = %s' + print('%s %r = %s' % (scope, name, pyop_var.get_truncated_repr(MAX_OUTPUT_LEN))) else: - print '%r not found' % name + print('%r not found' % name) PyPrint() @@ -1449,16 +1482,16 @@ frame = Frame.get_selected_python_frame() if not frame: - print 'Unable to locate python frame' + print('Unable to locate python frame') return pyop_frame = frame.get_pyop() if not pyop_frame: - print 'Unable to read information on python frame' + print('Unable to read information on python frame') return for pyop_name, pyop_value in pyop_frame.iter_locals(): - print ('%s = %s' + print('%s = %s' % (pyop_name.proxyval(set()), pyop_value.get_truncated_repr(MAX_OUTPUT_LEN))) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:20:23 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 18:20:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Relax_timing_even_more=2C_?= =?utf-8?q?hopefully_again_fixes_issue_19579=2E?= Message-ID: <3dRh9342d5z7LjR@mail.python.org> http://hg.python.org/cpython/rev/085091cb6e4c changeset: 87433:085091cb6e4c parent: 87431:8344e273c862 user: Guido van Rossum date: Sat Nov 23 09:20:12 2013 -0800 summary: Relax timing even more, hopefully again fixes issue 19579. 
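One plausible reading of why the bounds need to be this loose: the timeout that reaches selector.select() is recomputed from the clock when the loop iterates, so any overhead between scheduling the callback and running that iteration eats into it. A rough sketch of the effect (the names here are illustrative, not the test's own):

    import time

    delay = 10.0
    scheduled_at = time.monotonic()      # when the callback was scheduled
    # ... test machinery runs for an unpredictable amount of time ...
    timeout = (scheduled_at + delay) - time.monotonic()
    # On a slow buildbot that overhead can be large, hence the generous
    # 9.5 < t < 10.5 window in the assertion below instead of +/- 0.1.
    assert 9.5 < timeout <= 10.0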
files: Lib/test/test_asyncio/test_base_events.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -181,7 +181,7 @@ self.loop._run_once() t = self.loop._selector.select.call_args[0][0] - self.assertTrue(9.9 < t < 10.1, t) + self.assertTrue(9.5 < t < 10.5, t) self.assertEqual([h2], self.loop._scheduled) self.assertTrue(self.loop._process_events.called) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:22:41 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 18:22:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogVW5kbyAoaG9wZWZ1?= =?utf-8?q?lly=29_buildbot_failures?= Message-ID: <3dRhCj5jz4z7Llf@mail.python.org> http://hg.python.org/cpython/rev/f5a626f762f6 changeset: 87434:f5a626f762f6 branch: 3.3 parent: 87427:752db82b7933 user: Antoine Pitrou date: Sat Nov 23 18:20:42 2013 +0100 summary: Undo (hopefully) buildbot failures files: Tools/gdb/libpython.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -116,7 +116,7 @@ try: os_fsencode = os.fsencode -except ImportError: +except AttributeError: def os_fsencode(filename): if not isinstance(filename, unicode): return filename -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:22:46 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 18:22:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Undo_=28hopefully=29_buildbot_failures?= Message-ID: <3dRhCp0LXPz7Lqg@mail.python.org> http://hg.python.org/cpython/rev/2f2dfad79e33 changeset: 87435:2f2dfad79e33 parent: 87433:085091cb6e4c parent: 87434:f5a626f762f6 user: Antoine Pitrou date: Sat Nov 23 18:22:02 2013 +0100 summary: Undo (hopefully) buildbot failures files: Tools/gdb/libpython.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Tools/gdb/libpython.py b/Tools/gdb/libpython.py --- a/Tools/gdb/libpython.py +++ b/Tools/gdb/libpython.py @@ -116,7 +116,7 @@ try: os_fsencode = os.fsencode -except ImportError: +except AttributeError: def os_fsencode(filename): if not isinstance(filename, unicode): return filename -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:51:16 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 18:51:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fixed_incorrectly_applying?= =?utf-8?q?_a_patch_for_issue19668=2E?= Message-ID: <3dRhrh2qZ7z7Lkq@mail.python.org> http://hg.python.org/cpython/rev/355d8950f574 changeset: 87436:355d8950f574 user: Serhiy Storchaka date: Sat Nov 23 19:50:47 2013 +0200 summary: Fixed incorrectly applying a patch for issue19668. 
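In other words, the cp1125 table had been written into cp866.py by mistake; this changeset copies it out to a new cp1125.py and restores the original cp866 mapping. The practical difference between the two code pages is confined to bytes 0xF2-0xF9, as a short sketch shows (values read off the tables in the diff below; assumes the final 3.4 codecs):

    # The two OEM code pages agree everywhere except 0xF2-0xF9.
    diverging = bytes(range(0xF2, 0xFA))
    print(diverging.decode("cp866"))     # 'ЄєЇїЎў°∙'
    print(diverging.decode("cp1125"))    # 'ҐґЄєІіЇї'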
files: Lib/encodings/cp866.py | 0 Lib/encodings/cp866.py | 52 +++++++++++++++--------------- 2 files changed, 26 insertions(+), 26 deletions(-) diff --git a/Lib/encodings/cp866.py b/Lib/encodings/cp1125.py copy from Lib/encodings/cp866.py copy to Lib/encodings/cp1125.py diff --git a/Lib/encodings/cp866.py b/Lib/encodings/cp866.py --- a/Lib/encodings/cp866.py +++ b/Lib/encodings/cp866.py @@ -1,4 +1,4 @@ -""" Python Character Mapping Codec for CP1125 +""" Python Character Mapping Codec generated from 'VENDORS/MICSFT/PC/CP866.TXT' with gencodec.py. """#" @@ -32,7 +32,7 @@ def getregentry(): return codecs.CodecInfo( - name='cp1125', + name='cp866', encode=Codec().encode, decode=Codec().decode, incrementalencoder=IncrementalEncoder, @@ -159,14 +159,14 @@ 0x00ef: 0x044f, # CYRILLIC SMALL LETTER YA 0x00f0: 0x0401, # CYRILLIC CAPITAL LETTER IO 0x00f1: 0x0451, # CYRILLIC SMALL LETTER IO - 0x00f2: 0x0490, # CYRILLIC CAPITAL LETTER GHE WITH UPTURN - 0x00f3: 0x0491, # CYRILLIC SMALL LETTER GHE WITH UPTURN - 0x00f4: 0x0404, # CYRILLIC CAPITAL LETTER UKRAINIAN IE - 0x00f5: 0x0454, # CYRILLIC SMALL LETTER UKRAINIAN IE - 0x00f6: 0x0406, # CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I - 0x00f7: 0x0456, # CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I - 0x00f8: 0x0407, # CYRILLIC CAPITAL LETTER YI - 0x00f9: 0x0457, # CYRILLIC SMALL LETTER YI + 0x00f2: 0x0404, # CYRILLIC CAPITAL LETTER UKRAINIAN IE + 0x00f3: 0x0454, # CYRILLIC SMALL LETTER UKRAINIAN IE + 0x00f4: 0x0407, # CYRILLIC CAPITAL LETTER YI + 0x00f5: 0x0457, # CYRILLIC SMALL LETTER YI + 0x00f6: 0x040e, # CYRILLIC CAPITAL LETTER SHORT U + 0x00f7: 0x045e, # CYRILLIC SMALL LETTER SHORT U + 0x00f8: 0x00b0, # DEGREE SIGN + 0x00f9: 0x2219, # BULLET OPERATOR 0x00fa: 0x00b7, # MIDDLE DOT 0x00fb: 0x221a, # SQUARE ROOT 0x00fc: 0x2116, # NUMERO SIGN @@ -420,14 +420,14 @@ '\u044f' # 0x00ef -> CYRILLIC SMALL LETTER YA '\u0401' # 0x00f0 -> CYRILLIC CAPITAL LETTER IO '\u0451' # 0x00f1 -> CYRILLIC SMALL LETTER IO - '\u0490' # 0x00f2 -> CYRILLIC CAPITAL LETTER GHE WITH UPTURN - '\u0491' # 0x00f3 -> CYRILLIC SMALL LETTER GHE WITH UPTURN - '\u0404' # 0x00f4 -> CYRILLIC CAPITAL LETTER UKRAINIAN IE - '\u0454' # 0x00f5 -> CYRILLIC SMALL LETTER UKRAINIAN IE - '\u0406' # 0x00f6 -> CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I - '\u0456' # 0x00f7 -> CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I - '\u0407' # 0x00f8 -> CYRILLIC CAPITAL LETTER YI - '\u0457' # 0x00f9 -> CYRILLIC SMALL LETTER YI + '\u0404' # 0x00f2 -> CYRILLIC CAPITAL LETTER UKRAINIAN IE + '\u0454' # 0x00f3 -> CYRILLIC SMALL LETTER UKRAINIAN IE + '\u0407' # 0x00f4 -> CYRILLIC CAPITAL LETTER YI + '\u0457' # 0x00f5 -> CYRILLIC SMALL LETTER YI + '\u040e' # 0x00f6 -> CYRILLIC CAPITAL LETTER SHORT U + '\u045e' # 0x00f7 -> CYRILLIC SMALL LETTER SHORT U + '\xb0' # 0x00f8 -> DEGREE SIGN + '\u2219' # 0x00f9 -> BULLET OPERATOR '\xb7' # 0x00fa -> MIDDLE DOT '\u221a' # 0x00fb -> SQUARE ROOT '\u2116' # 0x00fc -> NUMERO SIGN @@ -569,11 +569,12 @@ 0x007f: 0x007f, # DELETE 0x00a0: 0x00ff, # NO-BREAK SPACE 0x00a4: 0x00fd, # CURRENCY SIGN + 0x00b0: 0x00f8, # DEGREE SIGN 0x00b7: 0x00fa, # MIDDLE DOT 0x0401: 0x00f0, # CYRILLIC CAPITAL LETTER IO - 0x0404: 0x00f4, # CYRILLIC CAPITAL LETTER UKRAINIAN IE - 0x0406: 0x00f6, # CYRILLIC CAPITAL LETTER BYELORUSSIAN-UKRAINIAN I - 0x0407: 0x00f8, # CYRILLIC CAPITAL LETTER YI + 0x0404: 0x00f2, # CYRILLIC CAPITAL LETTER UKRAINIAN IE + 0x0407: 0x00f4, # CYRILLIC CAPITAL LETTER YI + 0x040e: 0x00f6, # CYRILLIC CAPITAL LETTER SHORT U 0x0410: 0x0080, # CYRILLIC CAPITAL LETTER A 0x0411: 0x0081, 
# CYRILLIC CAPITAL LETTER BE 0x0412: 0x0082, # CYRILLIC CAPITAL LETTER VE @@ -639,12 +640,11 @@ 0x044e: 0x00ee, # CYRILLIC SMALL LETTER YU 0x044f: 0x00ef, # CYRILLIC SMALL LETTER YA 0x0451: 0x00f1, # CYRILLIC SMALL LETTER IO - 0x0454: 0x00f5, # CYRILLIC SMALL LETTER UKRAINIAN IE - 0x0456: 0x00f7, # CYRILLIC SMALL LETTER BYELORUSSIAN-UKRAINIAN I - 0x0457: 0x00f9, # CYRILLIC SMALL LETTER YI - 0x0490: 0x00f2, # CYRILLIC CAPITAL LETTER GHE WITH UPTURN - 0x0491: 0x00f3, # CYRILLIC SMALL LETTER GHE WITH UPTURN + 0x0454: 0x00f3, # CYRILLIC SMALL LETTER UKRAINIAN IE + 0x0457: 0x00f5, # CYRILLIC SMALL LETTER YI + 0x045e: 0x00f7, # CYRILLIC SMALL LETTER SHORT U 0x2116: 0x00fc, # NUMERO SIGN + 0x2219: 0x00f9, # BULLET OPERATOR 0x221a: 0x00fb, # SQUARE ROOT 0x2500: 0x00c4, # BOX DRAWINGS LIGHT HORIZONTAL 0x2502: 0x00b3, # BOX DRAWINGS LIGHT VERTICAL -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 18:52:20 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 23 Nov 2013 18:52:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2313633=3A_Added_a_new_co?= =?utf-8?q?nvert=5Fcharrefs_keyword_arg_to_HTMLParser_that=2C_when_True=2C?= Message-ID: <3dRhsw5QwCzQX4@mail.python.org> http://hg.python.org/cpython/rev/1575f2dd08c4 changeset: 87437:1575f2dd08c4 user: Ezio Melotti date: Sat Nov 23 19:52:05 2013 +0200 summary: #13633: Added a new convert_charrefs keyword arg to HTMLParser that, when True, automatically converts all character references. files: Doc/library/html.parser.rst | 35 ++++++++--- Lib/html/parser.py | 62 ++++++++++++++++------ Lib/test/test_htmlparser.py | 70 ++++++++++++++++++++++-- Misc/NEWS | 3 + 4 files changed, 134 insertions(+), 36 deletions(-) diff --git a/Doc/library/html.parser.rst b/Doc/library/html.parser.rst --- a/Doc/library/html.parser.rst +++ b/Doc/library/html.parser.rst @@ -16,14 +16,21 @@ This module defines a class :class:`HTMLParser` which serves as the basis for parsing text files formatted in HTML (HyperText Mark-up Language) and XHTML. -.. class:: HTMLParser(strict=False) +.. class:: HTMLParser(strict=False, *, convert_charrefs=False) - Create a parser instance. If *strict* is ``False`` (the default), the parser - will accept and parse invalid markup. If *strict* is ``True`` the parser - will raise an :exc:`~html.parser.HTMLParseError` exception instead [#]_ when - it's not able to parse the markup. - The use of ``strict=True`` is discouraged and the *strict* argument is - deprecated. + Create a parser instance. + + If *convert_charrefs* is ``True`` (default: ``False``), all character + references (except the ones in ``script``/``style`` elements) are + automatically converted to the corresponding Unicode characters. + The use of ``convert_charrefs=True`` is encouraged and will become + the default in Python 3.5. + + If *strict* is ``False`` (the default), the parser will accept and parse + invalid markup. If *strict* is ``True`` the parser will raise an + :exc:`~html.parser.HTMLParseError` exception instead [#]_ when it's not + able to parse the markup. The use of ``strict=True`` is discouraged and + the *strict* argument is deprecated. An :class:`.HTMLParser` instance is fed HTML data and calls handler methods when start tags, end tags, text, comments, and other markup elements are @@ -34,12 +41,15 @@ handler for elements which are closed implicitly by closing an outer element. .. versionchanged:: 3.2 - *strict* keyword added. + *strict* argument added. .. 
deprecated-removed:: 3.3 3.5 The *strict* argument and the strict mode have been deprecated. The parser is now able to accept and parse invalid markup too. + .. versionchanged:: 3.4 + *convert_charrefs* keyword argument added. + An exception is defined as well: @@ -181,7 +191,8 @@ This method is called to process a named character reference of the form ``&name;`` (e.g. ``>``), where *name* is a general entity reference - (e.g. ``'gt'``). + (e.g. ``'gt'``). This method is never called if *convert_charrefs* is + ``True``. .. method:: HTMLParser.handle_charref(name) @@ -189,7 +200,8 @@ This method is called to process decimal and hexadecimal numeric character references of the form ``&#NNN;`` and ``&#xNNN;``. For example, the decimal equivalent for ``>`` is ``>``, whereas the hexadecimal is ``>``; - in this case the method will receive ``'62'`` or ``'x3E'``. + in this case the method will receive ``'62'`` or ``'x3E'``. This method + is never called if *convert_charrefs* is ``True``. .. method:: HTMLParser.handle_comment(data) @@ -324,7 +336,8 @@ Num ent : > Feeding incomplete chunks to :meth:`~HTMLParser.feed` works, but -:meth:`~HTMLParser.handle_data` might be called more than once:: +:meth:`~HTMLParser.handle_data` might be called more than once +(unless *convert_charrefs* is set to ``True``):: >>> for chunk in ['buff', 'ered ', 'text']: ... parser.feed(chunk) diff --git a/Lib/html/parser.py b/Lib/html/parser.py --- a/Lib/html/parser.py +++ b/Lib/html/parser.py @@ -97,7 +97,7 @@ return result -_strict_sentinel = object() +_default_sentinel = object() class HTMLParser(_markupbase.ParserBase): """Find tags and other markup and call handler functions. @@ -112,28 +112,39 @@ self.handle_startendtag(); end tags by self.handle_endtag(). The data between tags is passed from the parser to the derived class by calling self.handle_data() with the data as argument (the data - may be split up in arbitrary chunks). Entity references are - passed by calling self.handle_entityref() with the entity - reference as the argument. Numeric character references are - passed to self.handle_charref() with the string containing the - reference as the argument. + may be split up in arbitrary chunks). If convert_charrefs is + True the character references are converted automatically to the + corresponding Unicode character (and self.handle_data() is no + longer split in chunks), otherwise they are passed by calling + self.handle_entityref() or self.handle_charref() with the string + containing respectively the named or numeric reference as the + argument. """ CDATA_CONTENT_ELEMENTS = ("script", "style") - def __init__(self, strict=_strict_sentinel): + def __init__(self, strict=_default_sentinel, *, + convert_charrefs=_default_sentinel): """Initialize and reset this instance. + If convert_charrefs is True (default: False), all character references + are automatically converted to the corresponding Unicode characters. If strict is set to False (the default) the parser will parse invalid markup, otherwise it will raise an error. Note that the strict mode and argument are deprecated. """ - if strict is not _strict_sentinel: + if strict is not _default_sentinel: warnings.warn("The strict argument and mode are deprecated.", DeprecationWarning, stacklevel=2) else: strict = False # default self.strict = strict + if convert_charrefs is _default_sentinel: + convert_charrefs = False # default + warnings.warn("The value of convert_charrefs will become True in " + "3.5. 
You are encouraged to set the value explicitly.", + DeprecationWarning, stacklevel=2) + self.convert_charrefs = convert_charrefs self.reset() def reset(self): @@ -184,14 +195,25 @@ i = 0 n = len(rawdata) while i < n: - match = self.interesting.search(rawdata, i) # < or & - if match: - j = match.start() + if self.convert_charrefs and not self.cdata_elem: + j = rawdata.find('<', i) + if j < 0: + if not end: + break # wait till we get all the text + j = n else: - if self.cdata_elem: - break - j = n - if i < j: self.handle_data(rawdata[i:j]) + match = self.interesting.search(rawdata, i) # < or & + if match: + j = match.start() + else: + if self.cdata_elem: + break + j = n + if i < j: + if self.convert_charrefs and not self.cdata_elem: + self.handle_data(unescape(rawdata[i:j])) + else: + self.handle_data(rawdata[i:j]) i = self.updatepos(i, j) if i == n: break startswith = rawdata.startswith @@ -226,7 +248,10 @@ k = i + 1 else: k += 1 - self.handle_data(rawdata[i:k]) + if self.convert_charrefs and not self.cdata_elem: + self.handle_data(unescape(rawdata[i:k])) + else: + self.handle_data(rawdata[i:k]) i = self.updatepos(i, k) elif startswith("&#", i): match = charref.match(rawdata, i) @@ -277,7 +302,10 @@ assert 0, "interesting.search() lied" # end while if end and i < n and not self.cdata_elem: - self.handle_data(rawdata[i:n]) + if self.convert_charrefs and not self.cdata_elem: + self.handle_data(unescape(rawdata[i:n])) + else: + self.handle_data(rawdata[i:n]) i = self.updatepos(i, n) self.rawdata = rawdata[i:] diff --git a/Lib/test/test_htmlparser.py b/Lib/test/test_htmlparser.py --- a/Lib/test/test_htmlparser.py +++ b/Lib/test/test_htmlparser.py @@ -70,6 +70,18 @@ self.append(("starttag_text", self.get_starttag_text())) +class EventCollectorCharrefs(EventCollector): + + def get_events(self): + return self.events + + def handle_charref(self, data): + self.fail('This should never be called with convert_charrefs=True') + + def handle_entityref(self, data): + self.fail('This should never be called with convert_charrefs=True') + + class TestCaseBase(unittest.TestCase): def get_collector(self): @@ -84,12 +96,14 @@ parser.close() events = parser.get_events() if events != expected_events: - self.fail("received events did not match expected events\n" - "Expected:\n" + pprint.pformat(expected_events) + + self.fail("received events did not match expected events" + + "\nSource:\n" + repr(source) + + "\nExpected:\n" + pprint.pformat(expected_events) + "\nReceived:\n" + pprint.pformat(events)) def _run_check_extra(self, source, events): - self._run_check(source, events, EventCollectorExtra()) + self._run_check(source, events, + EventCollectorExtra(convert_charrefs=False)) def _parse_error(self, source): def parse(source=source): @@ -105,7 +119,7 @@ def get_collector(self): with support.check_warnings(("", DeprecationWarning), quite=False): - return EventCollector(strict=True) + return EventCollector(strict=True, convert_charrefs=False) def test_processing_instruction_only(self): self._run_check("", [ @@ -335,7 +349,7 @@ self._run_check(s, [("starttag", element_lower, []), ("data", content), ("endtag", element_lower)], - collector=Collector()) + collector=Collector(convert_charrefs=False)) def test_comments(self): html = ("" @@ -363,14 +377,54 @@ ('comment', '[if lte IE 7]>pretty?a{0}z'.format(charref), + expected, collector=collector()) + # check charrefs at the beginning/end of the text/attributes + expected = [('data', '"'), + ('starttag', 'a', [('x', '"'), ('y', '"X'), ('z', 'X"')]), + ('data', '"'), 
('endtag', 'a'), ('data', '"')] + for charref in charrefs: + self._run_check('{0}' + '{0}{0}'.format(charref), + expected, collector=collector()) + # check charrefs in {1}' + '{1}'.format(text, charref), + expected, collector=collector()) + # check truncated charrefs at the end of the file + html = '&quo &# &#x' + for x in range(1, len(html)): + self._run_check(html[:x], [('data', html[:x])], + collector=collector()) + # check a string with no charrefs + self._run_check('no charrefs here', [('data', 'no charrefs here')], + collector=collector()) + class HTMLParserTolerantTestCase(HTMLParserStrictTestCase): def get_collector(self): - return EventCollector() + return EventCollector(convert_charrefs=False) def test_deprecation_warnings(self): with self.assertWarns(DeprecationWarning): + EventCollector() # convert_charrefs not passed explicitly + with self.assertWarns(DeprecationWarning): EventCollector(strict=True) with self.assertWarns(DeprecationWarning): EventCollector(strict=False) @@ -630,7 +684,7 @@ def get_collector(self): with support.check_warnings(("", DeprecationWarning), quite=False): - return EventCollector(strict=True) + return EventCollector(strict=True, convert_charrefs=False) def test_attr_syntax(self): output = [ @@ -691,7 +745,7 @@ class AttributesTolerantTestCase(AttributesStrictTestCase): def get_collector(self): - return EventCollector() + return EventCollector(convert_charrefs=False) def test_attr_funky_names2(self): self._run_check( diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -132,6 +132,9 @@ - Issue #19449: in csv's writerow, handle non-string keys when generating the error message that certain keys are not in the 'fieldnames' list. +- Issue #13633: Added a new convert_charrefs keyword arg to HTMLParser that, + when True, automatically converts all character references. + - Issue #2927: Added the unescape() function to the html module. - Issue #8402: Added the escape() function to the glob module. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 19:01:47 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 19:01:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317810=3A_Implemen?= =?utf-8?q?t_PEP_3154=2C_pickle_protocol_4=2E?= Message-ID: <3dRj4q0S14z7Lp9@mail.python.org> http://hg.python.org/cpython/rev/992ef855b3ed changeset: 87438:992ef855b3ed user: Antoine Pitrou date: Sat Nov 23 18:59:12 2013 +0100 summary: Issue #17810: Implement PEP 3154, pickle protocol 4. Most of the work is by Alexandre. files: Doc/library/pickle.rst | 35 +- Doc/whatsnew/3.4.rst | 15 + Lib/copyreg.py | 6 + Lib/pickle.py | 582 +++++++-- Lib/pickletools.py | 471 +++++++- Lib/test/pickletester.py | 485 ++++++-- Lib/test/test_descr.py | 605 +++++++--- Misc/NEWS | 2 + Modules/_pickle.c | 1454 ++++++++++++++++++------- Objects/classobject.c | 26 +- Objects/descrobject.c | 45 +- Objects/typeobject.c | 510 +++++++-- 12 files changed, 3181 insertions(+), 1055 deletions(-) diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -459,12 +459,29 @@ Classes can alter the default behaviour by providing one or several special methods: +.. method:: object.__getnewargs_ex__() + + In protocols 4 and newer, classes that implements the + :meth:`__getnewargs_ex__` method can dictate the values passed to the + :meth:`__new__` method upon unpickling. 
The method must return a pair + ``(args, kwargs)`` where *args* is a tuple of positional arguments + and *kwargs* a dictionary of named arguments for constructing the + object. Those will be passed to the :meth:`__new__` method upon + unpickling. + + You should implement this method if the :meth:`__new__` method of your + class requires keyword-only arguments. Otherwise, it is recommended for + compatibility to implement :meth:`__getnewargs__`. + + .. method:: object.__getnewargs__() - In protocol 2 and newer, classes that implements the :meth:`__getnewargs__` - method can dictate the values passed to the :meth:`__new__` method upon - unpickling. This is often needed for classes whose :meth:`__new__` method - requires arguments. + This method serve a similar purpose as :meth:`__getnewargs_ex__` but + for protocols 2 and newer. It must return a tuple of arguments `args` + which will be passed to the :meth:`__new__` method upon unpickling. + + In protocols 4 and newer, :meth:`__getnewargs__` will not be called if + :meth:`__getnewargs_ex__` is defined. .. method:: object.__getstate__() @@ -496,10 +513,10 @@ At unpickling time, some methods like :meth:`__getattr__`, :meth:`__getattribute__`, or :meth:`__setattr__` may be called upon the - instance. In case those methods rely on some internal invariant being true, - the type should implement :meth:`__getnewargs__` to establish such an - invariant; otherwise, neither :meth:`__new__` nor :meth:`__init__` will be - called. + instance. In case those methods rely on some internal invariant being + true, the type should implement :meth:`__getnewargs__` or + :meth:`__getnewargs_ex__` to establish such an invariant; otherwise, + neither :meth:`__new__` nor :meth:`__init__` will be called. .. index:: pair: copy; protocol @@ -511,7 +528,7 @@ Although powerful, implementing :meth:`__reduce__` directly in your classes is error prone. For this reason, class designers should use the high-level -interface (i.e., :meth:`__getnewargs__`, :meth:`__getstate__` and +interface (i.e., :meth:`__getnewargs_ex__`, :meth:`__getstate__` and :meth:`__setstate__`) whenever possible. We will show, however, cases where using :meth:`__reduce__` is the only option or leads to more efficient pickling or both. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -109,6 +109,7 @@ Significantly Improved Library Modules: * Single-dispatch generic functions in :mod:`functoools` (:pep:`443`) +* New :mod:`pickle` protocol 4 (:pep:`3154`) * SHA-3 (Keccak) support for :mod:`hashlib`. * TLSv1.1 and TLSv1.2 support for :mod:`ssl`. * :mod:`multiprocessing` now has option to avoid using :func:`os.fork` @@ -285,6 +286,20 @@ the new methods. +Pickle protocol 4 +================= + +The new :mod:`pickle` protocol addresses a number of issues that were present +in previous protocols, such as the serialization of nested classes, very +large strings and containers, or classes whose :meth:`__new__` method takes +keyword-only arguments. It also brings a couple efficiency improvements. + +.. seealso:: + + :pep:`3154` - Pickle protocol 4 + PEP written by Antoine Pitrou and implemented by Alexandre Vassalotti. 
+ + Other Language Changes ====================== diff --git a/Lib/copyreg.py b/Lib/copyreg.py --- a/Lib/copyreg.py +++ b/Lib/copyreg.py @@ -87,6 +87,12 @@ def __newobj__(cls, *args): return cls.__new__(cls, *args) +def __newobj_ex__(cls, args, kwargs): + """Used by pickle protocol 4, instead of __newobj__ to allow classes with + keyword-only arguments to be pickled correctly. + """ + return cls.__new__(cls, *args, **kwargs) + def _slotnames(cls): """Return a list of slot names for a given class. diff --git a/Lib/pickle.py b/Lib/pickle.py --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -23,7 +23,7 @@ """ -from types import FunctionType, BuiltinFunctionType +from types import FunctionType, BuiltinFunctionType, ModuleType from copyreg import dispatch_table from copyreg import _extension_registry, _inverted_registry, _extension_cache from itertools import islice @@ -42,17 +42,18 @@ bytes_types = (bytes, bytearray) # These are purely informational; no code uses these. -format_version = "3.0" # File format version we write +format_version = "4.0" # File format version we write compatible_formats = ["1.0", # Original protocol 0 "1.1", # Protocol 0 with INST added "1.2", # Original protocol 1 "1.3", # Protocol 1 with BINFLOAT added "2.0", # Protocol 2 "3.0", # Protocol 3 + "4.0", # Protocol 4 ] # Old format versions we can read # This is the highest protocol number we know how to read. -HIGHEST_PROTOCOL = 3 +HIGHEST_PROTOCOL = 4 # The protocol we write by default. May be less than HIGHEST_PROTOCOL. # We intentionally write a protocol that Python 2.x cannot read; @@ -164,7 +165,196 @@ BINBYTES = b'B' # push bytes; counted binary string argument SHORT_BINBYTES = b'C' # " " ; " " " " < 256 bytes -__all__.extend([x for x in dir() if re.match("[A-Z][A-Z0-9_]+$",x)]) +# Protocol 4 +SHORT_BINUNICODE = b'\x8c' # push short string; UTF-8 length < 256 bytes +BINUNICODE8 = b'\x8d' # push very long string +BINBYTES8 = b'\x8e' # push very long bytes string +EMPTY_SET = b'\x8f' # push empty set on the stack +ADDITEMS = b'\x90' # modify set by adding topmost stack items +FROZENSET = b'\x91' # build frozenset from topmost stack items +NEWOBJ_EX = b'\x92' # like NEWOBJ but work with keyword only arguments +STACK_GLOBAL = b'\x93' # same as GLOBAL but using names on the stacks +MEMOIZE = b'\x94' # store top of the stack in memo +FRAME = b'\x95' # indicate the beginning of a new frame + +__all__.extend([x for x in dir() if re.match("[A-Z][A-Z0-9_]+$", x)]) + + +class _Framer: + + _FRAME_SIZE_TARGET = 64 * 1024 + + def __init__(self, file_write): + self.file_write = file_write + self.current_frame = None + + def _commit_frame(self): + f = self.current_frame + with f.getbuffer() as data: + n = len(data) + write = self.file_write + write(FRAME) + write(pack("= self._FRAME_SIZE_TARGET: + self._commit_frame() + return f.write(data) + +class _Unframer: + + def __init__(self, file_read, file_readline, file_tell=None): + self.file_read = file_read + self.file_readline = file_readline + self.file_tell = file_tell + self.framing_enabled = False + self.current_frame = None + self.frame_start = None + + def read(self, n): + if n == 0: + return b'' + _file_read = self.file_read + if not self.framing_enabled: + return _file_read(n) + f = self.current_frame + if f is not None: + data = f.read(n) + if data: + if len(data) < n: + raise UnpicklingError( + "pickle exhausted before end of frame") + return data + frame_opcode = _file_read(1) + if frame_opcode != FRAME: + raise UnpicklingError( + "expected a FRAME opcode, got {} 
instead".format(frame_opcode)) + frame_size, = unpack(" sys.maxsize: + raise ValueError("frame size > sys.maxsize: %d" % frame_size) + if self.file_tell is not None: + self.frame_start = self.file_tell() + f = self.current_frame = io.BytesIO(_file_read(frame_size)) + self.readline = f.readline + data = f.read(n) + assert len(data) == n, (len(data), n) + return data + + def readline(self): + if not self.framing_enabled: + return self.file_readline() + else: + return self.current_frame.readline() + + def tell(self): + if self.file_tell is None: + return None + elif self.current_frame is None: + return self.file_tell() + else: + return self.frame_start + self.current_frame.tell() + + +# Tools used for pickling. + +def _getattribute(obj, name, allow_qualname=False): + dotted_path = name.split(".") + if not allow_qualname and len(dotted_path) > 1: + raise AttributeError("Can't get qualified attribute {!r} on {!r}; " + + "use protocols >= 4 to enable support" + .format(name, obj)) + for subpath in dotted_path: + if subpath == '': + raise AttributeError("Can't get local attribute {!r} on {!r}" + .format(name, obj)) + try: + obj = getattr(obj, subpath) + except AttributeError: + raise AttributeError("Can't get attribute {!r} on {!r}" + .format(name, obj)) + return obj + +def whichmodule(obj, name, allow_qualname=False): + """Find the module an object belong to.""" + module_name = getattr(obj, '__module__', None) + if module_name is not None: + return module_name + for module_name, module in sys.modules.items(): + if module_name == '__main__' or module is None: + continue + try: + if _getattribute(module, name, allow_qualname) is obj: + return module_name + except AttributeError: + pass + return '__main__' + +def encode_long(x): + r"""Encode a long to a two's complement little-endian binary string. + Note that 0 is a special case, returning an empty string, to save a + byte in the LONG1 pickling context. + + >>> encode_long(0) + b'' + >>> encode_long(255) + b'\xff\x00' + >>> encode_long(32767) + b'\xff\x7f' + >>> encode_long(-256) + b'\x00\xff' + >>> encode_long(-32768) + b'\x00\x80' + >>> encode_long(-128) + b'\x80' + >>> encode_long(127) + b'\x7f' + >>> + """ + if x == 0: + return b'' + nbytes = (x.bit_length() >> 3) + 1 + result = x.to_bytes(nbytes, byteorder='little', signed=True) + if x < 0 and nbytes > 1: + if result[-1] == 0xff and (result[-2] & 0x80) != 0: + result = result[:-1] + return result + +def decode_long(data): + r"""Decode a long from a two's complement little-endian binary string. + + >>> decode_long(b'') + 0 + >>> decode_long(b"\xff\x00") + 255 + >>> decode_long(b"\xff\x7f") + 32767 + >>> decode_long(b"\x00\xff") + -256 + >>> decode_long(b"\x00\x80") + -32768 + >>> decode_long(b"\x80") + -128 + >>> decode_long(b"\x7f") + 127 + """ + return int.from_bytes(data, byteorder='little', signed=True) + # Pickling machinery @@ -174,9 +364,9 @@ """This takes a binary file for writing a pickle data stream. The optional protocol argument tells the pickler to use the - given protocol; supported protocols are 0, 1, 2, 3. The default - protocol is 3; a backward-incompatible protocol designed for - Python 3.0. + given protocol; supported protocols are 0, 1, 2, 3 and 4. The + default protocol is 3; a backward-incompatible protocol designed for + Python 3. Specifying a negative protocol version selects the highest protocol version supported. The higher the protocol used, the @@ -189,8 +379,8 @@ meets this interface. 
If fix_imports is True and protocol is less than 3, pickle will try to - map the new Python 3.x names to the old module names used in Python - 2.x, so that the pickle data stream is readable with Python 2.x. + map the new Python 3 names to the old module names used in Python 2, + so that the pickle data stream is readable with Python 2. """ if protocol is None: protocol = DEFAULT_PROTOCOL @@ -199,7 +389,7 @@ elif not 0 <= protocol <= HIGHEST_PROTOCOL: raise ValueError("pickle protocol must be <= %d" % HIGHEST_PROTOCOL) try: - self.write = file.write + self._file_write = file.write except AttributeError: raise TypeError("file must have a 'write' attribute") self.memo = {} @@ -223,13 +413,22 @@ """Write a pickled representation of obj to the open file.""" # Check whether Pickler was initialized correctly. This is # only needed to mimic the behavior of _pickle.Pickler.dump(). - if not hasattr(self, "write"): + if not hasattr(self, "_file_write"): raise PicklingError("Pickler.__init__() was not called by " "%s.__init__()" % (self.__class__.__name__,)) if self.proto >= 2: - self.write(PROTO + pack("= 4: + framer = _Framer(self._file_write) + framer.start_framing() + self.write = framer.write + else: + framer = None + self.write = self._file_write self.save(obj) self.write(STOP) + if framer is not None: + framer.end_framing() def memoize(self, obj): """Store an object in the memo.""" @@ -249,19 +448,21 @@ if self.fast: return assert id(obj) not in self.memo - memo_len = len(self.memo) - self.write(self.put(memo_len)) - self.memo[id(obj)] = memo_len, obj + idx = len(self.memo) + self.write(self.put(idx)) + self.memo[id(obj)] = idx, obj # Return a PUT (BINPUT, LONG_BINPUT) opcode string, with argument i. - def put(self, i): - if self.bin: - if i < 256: - return BINPUT + pack("= 4: + return MEMOIZE + elif self.bin: + if idx < 256: + return BINPUT + pack("= 2 and getattr(func, "__name__", "") == "__newobj__": - # A __reduce__ implementation can direct protocol 2 to + func_name = getattr(func, "__name__", "") + if self.proto >= 4 and func_name == "__newobj_ex__": + cls, args, kwargs = args + if not hasattr(cls, "__new__"): + raise PicklingError("args[0] from {} args has no __new__" + .format(func_name)) + if obj is not None and cls is not obj.__class__: + raise PicklingError("args[0] from {} args has the wrong class" + .format(func_name)) + save(cls) + save(args) + save(kwargs) + write(NEWOBJ_EX) + elif self.proto >= 2 and func_name == "__newobj__": + # A __reduce__ implementation can direct protocol 2 or newer to # use the more efficient NEWOBJ opcode, while still # allowing protocol 0 and 1 to work normally. For this to # work, the function returned by __reduce__ should be @@ -409,7 +619,13 @@ write(REDUCE) if obj is not None: - self.memoize(obj) + # If the object is already in the memo, this means it is + # recursive. In this case, throw away everything we put on the + # stack, and fetch the object back from the memo. 
+ if id(obj) in self.memo: + write(POP + self.get(self.memo[id(obj)][0])) + else: + self.memoize(obj) # More new special cases (that work with older protocols as # well): when __reduce__ returns a tuple with 4 or 5 items, @@ -493,8 +709,10 @@ (str(obj, 'latin1'), 'latin1'), obj=obj) return n = len(obj) - if n < 256: + if n <= 0xff: self.write(SHORT_BINBYTES + pack(" 0xffffffff and self.proto >= 4: + self.write(BINBYTES8 + pack("= 4: + self.write(SHORT_BINUNICODE + pack(" 0xffffffff and self.proto >= 4: + self.write(BINUNICODE8 + pack(" 0: + write(MARK) + for item in batch: + save(item) + write(ADDITEMS) + if n < self._BATCHSIZE: + return + dispatch[set] = save_set + + def save_frozenset(self, obj): + save = self.save + write = self.write + + if self.proto < 4: + self.save_reduce(frozenset, (list(obj),), obj=obj) + return + + write(MARK) + for item in obj: + save(item) + + if id(obj) in self.memo: + # If the object is already in the memo, this means it is + # recursive. In this case, throw away everything we put on the + # stack, and fetch the object back from the memo. + write(POP_MARK + self.get(self.memo[id(obj)][0])) + return + + write(FROZENSET) + self.memoize(obj) + dispatch[frozenset] = save_frozenset + def save_global(self, obj, name=None): write = self.write memo = self.memo + if name is None and self.proto >= 4: + name = getattr(obj, '__qualname__', None) if name is None: name = obj.__name__ - module = getattr(obj, "__module__", None) - if module is None: - module = whichmodule(obj, name) - + module_name = whichmodule(obj, name, allow_qualname=self.proto >= 4) try: - __import__(module, level=0) - mod = sys.modules[module] - klass = getattr(mod, name) + __import__(module_name, level=0) + module = sys.modules[module_name] + obj2 = _getattribute(module, name, allow_qualname=self.proto >= 4) except (ImportError, KeyError, AttributeError): raise PicklingError( "Can't pickle %r: it's not found as %s.%s" % - (obj, module, name)) + (obj, module_name, name)) else: - if klass is not obj: + if obj2 is not obj: raise PicklingError( "Can't pickle %r: it's not the same object as %s.%s" % - (obj, module, name)) + (obj, module_name, name)) if self.proto >= 2: - code = _extension_registry.get((module, name)) + code = _extension_registry.get((module_name, name)) if code: assert code > 0 if code <= 0xff: @@ -684,17 +954,23 @@ write(EXT4 + pack("= 3. 
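[Illustrative note, not part of the changeset] The save_set/save_frozenset paths above only take effect from protocol 4; older protocols still pickle sets through a reduce call. A quick way to see the difference with pickletools, assuming a pickle module that has protocol 4:

    import pickle
    import pickletools

    for proto in (3, 4):
        payload = pickle.dumps({1, 2, 3}, protocol=proto)
        names = {op.name for op, arg, pos in pickletools.genops(payload)}
        print(proto, 'ADDITEMS' in names, 'REDUCE' in names)
    # Expected: protocol 3 uses REDUCE and no ADDITEMS;
    # protocol 4 uses EMPTY_SET/ADDITEMS and no REDUCE.
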
- if self.proto >= 3: - write(GLOBAL + bytes(module, "utf-8") + b'\n' + + if self.proto >= 4: + self.save(module_name) + self.save(name) + write(STACK_GLOBAL) + elif self.proto >= 3: + write(GLOBAL + bytes(module_name, "utf-8") + b'\n' + bytes(name, "utf-8") + b'\n') else: if self.fix_imports: - if (module, name) in _compat_pickle.REVERSE_NAME_MAPPING: - module, name = _compat_pickle.REVERSE_NAME_MAPPING[(module, name)] - if module in _compat_pickle.REVERSE_IMPORT_MAPPING: - module = _compat_pickle.REVERSE_IMPORT_MAPPING[module] + r_name_mapping = _compat_pickle.REVERSE_NAME_MAPPING + r_import_mapping = _compat_pickle.REVERSE_IMPORT_MAPPING + if (module_name, name) in r_name_mapping: + module_name, name = r_name_mapping[(module_name, name)] + if module_name in r_import_mapping: + module_name = r_import_mapping[module_name] try: - write(GLOBAL + bytes(module, "ascii") + b'\n' + + write(GLOBAL + bytes(module_name, "ascii") + b'\n' + bytes(name, "ascii") + b'\n') except UnicodeEncodeError: raise PicklingError( @@ -703,40 +979,16 @@ self.memoize(obj) + def save_method(self, obj): + if obj.__self__ is None or type(obj.__self__) is ModuleType: + self.save_global(obj) + else: + self.save_reduce(getattr, (obj.__self__, obj.__name__), obj=obj) + dispatch[FunctionType] = save_global - dispatch[BuiltinFunctionType] = save_global + dispatch[BuiltinFunctionType] = save_method dispatch[type] = save_global -# A cache for whichmodule(), mapping a function object to the name of -# the module in which the function was found. - -classmap = {} # called classmap for backwards compatibility - -def whichmodule(func, funcname): - """Figure out the module in which a function occurs. - - Search sys.modules for the module. - Cache in classmap. - Return a module name. - If the function cannot be found, return "__main__". - """ - # Python functions should always get an __module__ from their globals. - mod = getattr(func, "__module__", None) - if mod is not None: - return mod - if func in classmap: - return classmap[func] - - for name, module in list(sys.modules.items()): - if module is None: - continue # skip dummy package entries - if name != '__main__' and getattr(module, funcname, None) is func: - break - else: - name = '__main__' - classmap[func] = name - return name - # Unpickling machinery @@ -764,8 +1016,8 @@ instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. """ - self.readline = file.readline - self.read = file.read + self._file_readline = file.readline + self._file_read = file.read self.memo = {} self.encoding = encoding self.errors = errors @@ -779,12 +1031,16 @@ """ # Check whether Unpickler was initialized correctly. This is # only needed to mimic the behavior of _pickle.Unpickler.dump(). 
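[Illustrative note, not part of the changeset] The STACK_GLOBAL path above, combined with the __qualname__ lookup, is what lets protocol 4 name objects that are not plain module-level attributes, such as nested classes. A small hedged check with made-up classes:

    import pickle
    import pickletools

    class Outer:
        class Inner:
            pass

    payload = pickle.dumps(Outer.Inner, protocol=4)
    print(any(op.name == 'STACK_GLOBAL'
              for op, arg, pos in pickletools.genops(payload)))   # True
    print(pickle.loads(payload) is Outer.Inner)                    # True
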
- if not hasattr(self, "read"): + if not hasattr(self, "_file_read"): raise UnpicklingError("Unpickler.__init__() was not called by " "%s.__init__()" % (self.__class__.__name__,)) + self._unframer = _Unframer(self._file_read, self._file_readline) + self.read = self._unframer.read + self.readline = self._unframer.readline self.mark = object() # any new unique object self.stack = [] self.append = self.stack.append + self.proto = 0 read = self.read dispatch = self.dispatch try: @@ -822,6 +1078,8 @@ if not 0 <= proto <= HIGHEST_PROTOCOL: raise ValueError("unsupported pickle protocol: %d" % proto) self.proto = proto + if proto >= 4: + self._unframer.framing_enabled = True dispatch[PROTO[0]] = load_proto def load_persid(self): @@ -940,6 +1198,14 @@ self.append(str(self.read(len), 'utf-8', 'surrogatepass')) dispatch[BINUNICODE[0]] = load_binunicode + def load_binunicode8(self): + len, = unpack(' maxsize: + raise UnpicklingError("BINUNICODE8 exceeds system's maximum size " + "of %d bytes" % maxsize) + self.append(str(self.read(len), 'utf-8', 'surrogatepass')) + dispatch[BINUNICODE8[0]] = load_binunicode8 + def load_short_binstring(self): len = self.read(1)[0] data = self.read(len) @@ -952,6 +1218,11 @@ self.append(self.read(len)) dispatch[SHORT_BINBYTES[0]] = load_short_binbytes + def load_short_binunicode(self): + len = self.read(1)[0] + self.append(str(self.read(len), 'utf-8', 'surrogatepass')) + dispatch[SHORT_BINUNICODE[0]] = load_short_binunicode + def load_tuple(self): k = self.marker() self.stack[k:] = [tuple(self.stack[k+1:])] @@ -981,6 +1252,15 @@ self.append({}) dispatch[EMPTY_DICT[0]] = load_empty_dictionary + def load_empty_set(self): + self.append(set()) + dispatch[EMPTY_SET[0]] = load_empty_set + + def load_frozenset(self): + k = self.marker() + self.stack[k:] = [frozenset(self.stack[k+1:])] + dispatch[FROZENSET[0]] = load_frozenset + def load_list(self): k = self.marker() self.stack[k:] = [self.stack[k+1:]] @@ -1029,11 +1309,19 @@ def load_newobj(self): args = self.stack.pop() - cls = self.stack[-1] + cls = self.stack.pop() obj = cls.__new__(cls, *args) - self.stack[-1] = obj + self.append(obj) dispatch[NEWOBJ[0]] = load_newobj + def load_newobj_ex(self): + kwargs = self.stack.pop() + args = self.stack.pop() + cls = self.stack.pop() + obj = cls.__new__(cls, *args, **kwargs) + self.append(obj) + dispatch[NEWOBJ_EX[0]] = load_newobj_ex + def load_global(self): module = self.readline()[:-1].decode("utf-8") name = self.readline()[:-1].decode("utf-8") @@ -1041,6 +1329,14 @@ self.append(klass) dispatch[GLOBAL[0]] = load_global + def load_stack_global(self): + name = self.stack.pop() + module = self.stack.pop() + if type(name) is not str or type(module) is not str: + raise UnpicklingError("STACK_GLOBAL requires str") + self.append(self.find_class(module, name)) + dispatch[STACK_GLOBAL[0]] = load_stack_global + def load_ext1(self): code = self.read(1)[0] self.get_extension(code) @@ -1080,9 +1376,8 @@ if module in _compat_pickle.IMPORT_MAPPING: module = _compat_pickle.IMPORT_MAPPING[module] __import__(module, level=0) - mod = sys.modules[module] - klass = getattr(mod, name) - return klass + return _getattribute(sys.modules[module], name, + allow_qualname=self.proto >= 4) def load_reduce(self): stack = self.stack @@ -1146,6 +1441,11 @@ self.memo[i] = self.stack[-1] dispatch[LONG_BINPUT[0]] = load_long_binput + def load_memoize(self): + memo = self.memo + memo[len(memo)] = self.stack[-1] + dispatch[MEMOIZE[0]] = load_memoize + def load_append(self): stack = self.stack value = stack.pop() @@ 
-1185,6 +1485,20 @@ del stack[mark:] dispatch[SETITEMS[0]] = load_setitems + def load_additems(self): + stack = self.stack + mark = self.marker() + set_obj = stack[mark - 1] + items = stack[mark + 1:] + if isinstance(set_obj, set): + set_obj.update(items) + else: + add = set_obj.add + for item in items: + add(item) + del stack[mark:] + dispatch[ADDITEMS[0]] = load_additems + def load_build(self): stack = self.stack state = stack.pop() @@ -1218,86 +1532,46 @@ raise _Stop(value) dispatch[STOP[0]] = load_stop -# Encode/decode ints. - -def encode_long(x): - r"""Encode a long to a two's complement little-endian binary string. - Note that 0 is a special case, returning an empty string, to save a - byte in the LONG1 pickling context. - - >>> encode_long(0) - b'' - >>> encode_long(255) - b'\xff\x00' - >>> encode_long(32767) - b'\xff\x7f' - >>> encode_long(-256) - b'\x00\xff' - >>> encode_long(-32768) - b'\x00\x80' - >>> encode_long(-128) - b'\x80' - >>> encode_long(127) - b'\x7f' - >>> - """ - if x == 0: - return b'' - nbytes = (x.bit_length() >> 3) + 1 - result = x.to_bytes(nbytes, byteorder='little', signed=True) - if x < 0 and nbytes > 1: - if result[-1] == 0xff and (result[-2] & 0x80) != 0: - result = result[:-1] - return result - -def decode_long(data): - r"""Decode an int from a two's complement little-endian binary string. - - >>> decode_long(b'') - 0 - >>> decode_long(b"\xff\x00") - 255 - >>> decode_long(b"\xff\x7f") - 32767 - >>> decode_long(b"\x00\xff") - -256 - >>> decode_long(b"\x00\x80") - -32768 - >>> decode_long(b"\x80") - -128 - >>> decode_long(b"\x7f") - 127 - """ - return int.from_bytes(data, byteorder='little', signed=True) # Shorthands -def dump(obj, file, protocol=None, *, fix_imports=True): - Pickler(file, protocol, fix_imports=fix_imports).dump(obj) +def _dump(obj, file, protocol=None, *, fix_imports=True): + _Pickler(file, protocol, fix_imports=fix_imports).dump(obj) -def dumps(obj, protocol=None, *, fix_imports=True): +def _dumps(obj, protocol=None, *, fix_imports=True): f = io.BytesIO() - Pickler(f, protocol, fix_imports=fix_imports).dump(obj) + _Pickler(f, protocol, fix_imports=fix_imports).dump(obj) res = f.getvalue() assert isinstance(res, bytes_types) return res -def load(file, *, fix_imports=True, encoding="ASCII", errors="strict"): - return Unpickler(file, fix_imports=fix_imports, +def _load(file, *, fix_imports=True, encoding="ASCII", errors="strict"): + return _Unpickler(file, fix_imports=fix_imports, encoding=encoding, errors=errors).load() -def loads(s, *, fix_imports=True, encoding="ASCII", errors="strict"): +def _loads(s, *, fix_imports=True, encoding="ASCII", errors="strict"): if isinstance(s, str): raise TypeError("Can't load pickle from unicode string") file = io.BytesIO(s) - return Unpickler(file, fix_imports=fix_imports, - encoding=encoding, errors=errors).load() + return _Unpickler(file, fix_imports=fix_imports, + encoding=encoding, errors=errors).load() # Use the faster _pickle if possible try: - from _pickle import * + from _pickle import ( + PickleError, + PicklingError, + UnpicklingError, + Pickler, + Unpickler, + dump, + dumps, + load, + loads + ) except ImportError: Pickler, Unpickler = _Pickler, _Unpickler + dump, dumps, load, loads = _dump, _dumps, _load, _loads # Doctest def _test(): diff --git a/Lib/pickletools.py b/Lib/pickletools.py --- a/Lib/pickletools.py +++ b/Lib/pickletools.py @@ -11,6 +11,7 @@ ''' import codecs +import io import pickle import re import sys @@ -168,6 +169,7 @@ TAKEN_FROM_ARGUMENT1 = -2 # num bytes is 1-byte unsigned int 
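[Illustrative note, not part of the changeset] With the renamed fallbacks above, both implementations stay importable side by side, which is convenient when debugging opcode emission: pickle.dumps/loads prefer the _pickle accelerator when it is available, while pickle._dumps/_loads always use the pure-Python classes.

    import pickle

    obj = {'spam': [1, 2, 3]}
    c_payload = pickle.dumps(obj, protocol=4)    # C accelerator, if present
    py_payload = pickle._dumps(obj, protocol=4)  # pure-Python _Pickler
    # Byte-for-byte output may differ, but both round-trip the same value.
    print(pickle.loads(py_payload) == pickle._loads(c_payload) == obj)   # True
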
TAKEN_FROM_ARGUMENT4 = -3 # num bytes is 4-byte signed little-endian int TAKEN_FROM_ARGUMENT4U = -4 # num bytes is 4-byte unsigned little-endian int +TAKEN_FROM_ARGUMENT8U = -5 # num bytes is 8-byte unsigned little-endian int class ArgumentDescriptor(object): __slots__ = ( @@ -175,7 +177,7 @@ 'name', # length of argument, in bytes; an int; UP_TO_NEWLINE and - # TAKEN_FROM_ARGUMENT{1,4} are negative values for variable-length + # TAKEN_FROM_ARGUMENT{1,4,8} are negative values for variable-length # cases 'n', @@ -196,7 +198,8 @@ n in (UP_TO_NEWLINE, TAKEN_FROM_ARGUMENT1, TAKEN_FROM_ARGUMENT4, - TAKEN_FROM_ARGUMENT4U)) + TAKEN_FROM_ARGUMENT4U, + TAKEN_FROM_ARGUMENT8U)) self.n = n self.reader = reader @@ -288,6 +291,27 @@ doc="Four-byte unsigned integer, little-endian.") +def read_uint8(f): + r""" + >>> import io + >>> read_uint8(io.BytesIO(b'\xff\x00\x00\x00\x00\x00\x00\x00')) + 255 + >>> read_uint8(io.BytesIO(b'\xff' * 8)) == 2**64-1 + True + """ + + data = f.read(8) + if len(data) == 8: + return _unpack(">> import io @@ -381,6 +405,36 @@ a single blank separating the two strings. """) + +def read_string1(f): + r""" + >>> import io + >>> read_string1(io.BytesIO(b"\x00")) + '' + >>> read_string1(io.BytesIO(b"\x03abcdef")) + 'abc' + """ + + n = read_uint1(f) + assert n >= 0 + data = f.read(n) + if len(data) == n: + return data.decode("latin-1") + raise ValueError("expected %d bytes in a string1, but only %d remain" % + (n, len(data))) + +string1 = ArgumentDescriptor( + name="string1", + n=TAKEN_FROM_ARGUMENT1, + reader=read_string1, + doc="""A counted string. + + The first argument is a 1-byte unsigned int giving the number + of bytes in the string, and the second argument is that many + bytes. + """) + + def read_string4(f): r""" >>> import io @@ -415,28 +469,28 @@ """) -def read_string1(f): +def read_bytes1(f): r""" >>> import io - >>> read_string1(io.BytesIO(b"\x00")) - '' - >>> read_string1(io.BytesIO(b"\x03abcdef")) - 'abc' + >>> read_bytes1(io.BytesIO(b"\x00")) + b'' + >>> read_bytes1(io.BytesIO(b"\x03abcdef")) + b'abc' """ n = read_uint1(f) assert n >= 0 data = f.read(n) if len(data) == n: - return data.decode("latin-1") - raise ValueError("expected %d bytes in a string1, but only %d remain" % + return data + raise ValueError("expected %d bytes in a bytes1, but only %d remain" % (n, len(data))) -string1 = ArgumentDescriptor( - name="string1", +bytes1 = ArgumentDescriptor( + name="bytes1", n=TAKEN_FROM_ARGUMENT1, - reader=read_string1, - doc="""A counted string. + reader=read_bytes1, + doc="""A counted bytes string. The first argument is a 1-byte unsigned int giving the number of bytes in the string, and the second argument is that many @@ -486,6 +540,7 @@ """ n = read_uint4(f) + assert n >= 0 if n > sys.maxsize: raise ValueError("bytes4 byte count > sys.maxsize: %d" % n) data = f.read(n) @@ -505,6 +560,39 @@ """) +def read_bytes8(f): + r""" + >>> import io + >>> read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x00\x00abc")) + b'' + >>> read_bytes8(io.BytesIO(b"\x03\x00\x00\x00\x00\x00\x00\x00abcdef")) + b'abc' + >>> read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x03\x00abcdef")) + Traceback (most recent call last): + ... 
+ ValueError: expected 844424930131968 bytes in a bytes8, but only 6 remain + """ + + n = read_uint8(f) + assert n >= 0 + if n > sys.maxsize: + raise ValueError("bytes8 byte count > sys.maxsize: %d" % n) + data = f.read(n) + if len(data) == n: + return data + raise ValueError("expected %d bytes in a bytes8, but only %d remain" % + (n, len(data))) + +bytes8 = ArgumentDescriptor( + name="bytes8", + n=TAKEN_FROM_ARGUMENT8U, + reader=read_bytes8, + doc="""A counted bytes string. + + The first argument is a 8-byte little-endian unsigned int giving + the number of bytes, and the second argument is that many bytes. + """) + def read_unicodestringnl(f): r""" >>> import io @@ -530,6 +618,46 @@ escape sequences. """) + +def read_unicodestring1(f): + r""" + >>> import io + >>> s = 'abcd\uabcd' + >>> enc = s.encode('utf-8') + >>> enc + b'abcd\xea\xaf\x8d' + >>> n = bytes([len(enc)]) # little-endian 1-byte length + >>> t = read_unicodestring1(io.BytesIO(n + enc + b'junk')) + >>> s == t + True + + >>> read_unicodestring1(io.BytesIO(n + enc[:-1])) + Traceback (most recent call last): + ... + ValueError: expected 7 bytes in a unicodestring1, but only 6 remain + """ + + n = read_uint1(f) + assert n >= 0 + data = f.read(n) + if len(data) == n: + return str(data, 'utf-8', 'surrogatepass') + raise ValueError("expected %d bytes in a unicodestring1, but only %d " + "remain" % (n, len(data))) + +unicodestring1 = ArgumentDescriptor( + name="unicodestring1", + n=TAKEN_FROM_ARGUMENT1, + reader=read_unicodestring1, + doc="""A counted Unicode string. + + The first argument is a 1-byte little-endian signed int + giving the number of bytes in the string, and the second + argument-- the UTF-8 encoding of the Unicode string -- + contains that many bytes. + """) + + def read_unicodestring4(f): r""" >>> import io @@ -549,6 +677,7 @@ """ n = read_uint4(f) + assert n >= 0 if n > sys.maxsize: raise ValueError("unicodestring4 byte count > sys.maxsize: %d" % n) data = f.read(n) @@ -570,6 +699,47 @@ """) +def read_unicodestring8(f): + r""" + >>> import io + >>> s = 'abcd\uabcd' + >>> enc = s.encode('utf-8') + >>> enc + b'abcd\xea\xaf\x8d' + >>> n = bytes([len(enc)]) + bytes(7) # little-endian 8-byte length + >>> t = read_unicodestring8(io.BytesIO(n + enc + b'junk')) + >>> s == t + True + + >>> read_unicodestring8(io.BytesIO(n + enc[:-1])) + Traceback (most recent call last): + ... + ValueError: expected 7 bytes in a unicodestring8, but only 6 remain + """ + + n = read_uint8(f) + assert n >= 0 + if n > sys.maxsize: + raise ValueError("unicodestring8 byte count > sys.maxsize: %d" % n) + data = f.read(n) + if len(data) == n: + return str(data, 'utf-8', 'surrogatepass') + raise ValueError("expected %d bytes in a unicodestring8, but only %d " + "remain" % (n, len(data))) + +unicodestring8 = ArgumentDescriptor( + name="unicodestring8", + n=TAKEN_FROM_ARGUMENT8U, + reader=read_unicodestring8, + doc="""A counted Unicode string. + + The first argument is a 8-byte little-endian signed int + giving the number of bytes in the string, and the second + argument-- the UTF-8 encoding of the Unicode string -- + contains that many bytes. + """) + + def read_decimalnl_short(f): r""" >>> import io @@ -859,6 +1029,16 @@ obtype=dict, doc="A Python dict object.") +pyset = StackObject( + name="set", + obtype=set, + doc="A Python set object.") + +pyfrozenset = StackObject( + name="frozenset", + obtype=set, + doc="A Python frozenset object.") + anyobject = StackObject( name='any', obtype=object, @@ -1142,6 +1322,19 @@ literally as the string content. 
"""), + I(name='BINBYTES8', + code='\x8e', + arg=bytes8, + stack_before=[], + stack_after=[pybytes], + proto=4, + doc="""Push a Python bytes object. + + There are two arguments: the first is a 8-byte unsigned int giving + the number of bytes in the string, and the second is that many bytes, + which are taken literally as the string content. + """), + # Ways to spell None. I(name='NONE', @@ -1190,6 +1383,19 @@ until the next newline character. """), + I(name='SHORT_BINUNICODE', + code='\x8c', + arg=unicodestring1, + stack_before=[], + stack_after=[pyunicode], + proto=4, + doc="""Push a Python Unicode string object. + + There are two arguments: the first is a 1-byte little-endian signed int + giving the number of bytes in the string. The second is that many + bytes, and is the UTF-8 encoding of the Unicode string. + """), + I(name='BINUNICODE', code='X', arg=unicodestring4, @@ -1203,6 +1409,19 @@ bytes, and is the UTF-8 encoding of the Unicode string. """), + I(name='BINUNICODE8', + code='\x8d', + arg=unicodestring8, + stack_before=[], + stack_after=[pyunicode], + proto=4, + doc="""Push a Python Unicode string object. + + There are two arguments: the first is a 8-byte little-endian signed int + giving the number of bytes in the string. The second is that many + bytes, and is the UTF-8 encoding of the Unicode string. + """), + # Ways to spell floats. I(name='FLOAT', @@ -1428,6 +1647,54 @@ 1, 2, ..., n, and in that order. """), + # Ways to build sets + + I(name='EMPTY_SET', + code='\x8f', + arg=None, + stack_before=[], + stack_after=[pyset], + proto=4, + doc="Push an empty set."), + + I(name='ADDITEMS', + code='\x90', + arg=None, + stack_before=[pyset, markobject, stackslice], + stack_after=[pyset], + proto=4, + doc="""Add an arbitrary number of items to an existing set. + + The slice of the stack following the topmost markobject is taken as + a sequence of items, added to the set immediately under the topmost + markobject. Everything at and after the topmost markobject is popped, + leaving the mutated set at the top of the stack. + + Stack before: ... pyset markobject item_1 ... item_n + Stack after: ... pyset + + where pyset has been modified via pyset.add(item_i) = item_i for i in + 1, 2, ..., n, and in that order. + """), + + # Way to build frozensets + + I(name='FROZENSET', + code='\x91', + arg=None, + stack_before=[markobject, stackslice], + stack_after=[pyfrozenset], + proto=4, + doc="""Build a frozenset out of the topmost slice, after markobject. + + All the stack entries following the topmost markobject are placed into + a single Python frozenset, which single frozenset object replaces all + of the stack from the topmost markobject onward. For example, + + Stack before: ... markobject 1 2 3 + Stack after: ... frozenset({1, 2, 3}) + """), + # Stack manipulation. I(name='POP', @@ -1549,6 +1816,18 @@ unsigned little-endian integer following. """), + I(name='MEMOIZE', + code='\x94', + arg=None, + stack_before=[anyobject], + stack_after=[anyobject], + proto=4, + doc="""Store the stack top into the memo. The stack is not popped. + + The index of the memo location to write is the number of + elements currently present in the memo. + """), + # Access the extension registry (predefined objects). Akin to the GET # family. @@ -1614,6 +1893,15 @@ stack, so unpickling subclasses can override this form of lookup. 
"""), + I(name='STACK_GLOBAL', + code='\x93', + arg=None, + stack_before=[pyunicode, pyunicode], + stack_after=[anyobject], + proto=0, + doc="""Push a global object (module.attr) on the stack. + """), + # Ways to build objects of classes pickle doesn't know about directly # (user-defined classes). I despair of documenting this accurately # and comprehensibly -- you really have to read the pickle code to @@ -1770,6 +2058,21 @@ onto the stack. """), + I(name='NEWOBJ_EX', + code='\x92', + arg=None, + stack_before=[anyobject, anyobject, anyobject], + stack_after=[anyobject], + proto=4, + doc="""Build an object instance. + + The stack before should be thought of as containing a class + object followed by an argument tuple and by a keyword argument dict + (the dict being the stack top). Call these cls and args. They are + popped off the stack, and the value returned by + cls.__new__(cls, *args, *kwargs) is pushed back onto the stack. + """), + # Machine control. I(name='PROTO', @@ -1797,6 +2100,20 @@ empty then. """), + # Framing support. + + I(name='FRAME', + code='\x95', + arg=uint8, + stack_before=[], + stack_after=[], + proto=4, + doc="""Indicate the beginning of a new frame. + + The unpickler may use this opcode to safely prefetch data from its + underlying stream. + """), + # Ways to deal with persistent IDs. I(name='PERSID', @@ -1903,6 +2220,38 @@ ############################################################################## # A pickle opcode generator. +def _genops(data, yield_end_pos=False): + if isinstance(data, bytes_types): + data = io.BytesIO(data) + + if hasattr(data, "tell"): + getpos = data.tell + else: + getpos = lambda: None + + while True: + pos = getpos() + code = data.read(1) + opcode = code2op.get(code.decode("latin-1")) + if opcode is None: + if code == b"": + raise ValueError("pickle exhausted before seeing STOP") + else: + raise ValueError("at position %s, opcode %r unknown" % ( + "" if pos is None else pos, + code)) + if opcode.arg is None: + arg = None + else: + arg = opcode.arg.reader(data) + if yield_end_pos: + yield opcode, arg, pos, getpos() + else: + yield opcode, arg, pos + if code == b'.': + assert opcode.name == 'STOP' + break + def genops(pickle): """Generate all the opcodes in a pickle. @@ -1926,62 +2275,47 @@ used. Else (the pickle doesn't have a tell(), and it's not obvious how to query its current position) pos is None. """ - - if isinstance(pickle, bytes_types): - import io - pickle = io.BytesIO(pickle) - - if hasattr(pickle, "tell"): - getpos = pickle.tell - else: - getpos = lambda: None - - while True: - pos = getpos() - code = pickle.read(1) - opcode = code2op.get(code.decode("latin-1")) - if opcode is None: - if code == b"": - raise ValueError("pickle exhausted before seeing STOP") - else: - raise ValueError("at position %s, opcode %r unknown" % ( - pos is None and "" or pos, - code)) - if opcode.arg is None: - arg = None - else: - arg = opcode.arg.reader(pickle) - yield opcode, arg, pos - if code == b'.': - assert opcode.name == 'STOP' - break + return _genops(pickle) ############################################################################## # A pickle optimizer. 
def optimize(p): 'Optimize a pickle string by removing unused PUT opcodes' - gets = set() # set of args used by a GET opcode - puts = [] # (arg, startpos, stoppos) for the PUT opcodes - prevpos = None # set to pos if previous opcode was a PUT - for opcode, arg, pos in genops(p): - if prevpos is not None: - puts.append((prevarg, prevpos, pos)) - prevpos = None + not_a_put = object() + gets = { not_a_put } # set of args used by a GET opcode + opcodes = [] # (startpos, stoppos, putid) + proto = 0 + for opcode, arg, pos, end_pos in _genops(p, yield_end_pos=True): if 'PUT' in opcode.name: - prevarg, prevpos = arg, pos - elif 'GET' in opcode.name: - gets.add(arg) - - # Copy the pickle string except for PUTS without a corresponding GET - s = [] - i = 0 - for arg, start, stop in puts: - j = stop if (arg in gets) else start - s.append(p[i:j]) - i = stop - s.append(p[i:]) - return b''.join(s) + opcodes.append((pos, end_pos, arg)) + elif 'FRAME' in opcode.name: + pass + else: + if 'GET' in opcode.name: + gets.add(arg) + elif opcode.name == 'PROTO': + assert pos == 0, pos + proto = arg + opcodes.append((pos, end_pos, not_a_put)) + prevpos, prevarg = pos, None + + # Copy the opcodes except for PUTS without a corresponding GET + out = io.BytesIO() + opcodes = iter(opcodes) + if proto >= 2: + # Write the PROTO header before any framing + start, stop, _ = next(opcodes) + out.write(p[start:stop]) + buf = pickle._Framer(out.write) + if proto >= 4: + buf.start_framing() + for start, stop, putid in opcodes: + if putid in gets: + buf.write(p[start:stop]) + if proto >= 4: + buf.end_framing() + return out.getvalue() ############################################################################## # A symbolic pickle disassembler. @@ -2081,17 +2415,20 @@ errormsg = markmsg = "no MARK exists on stack" # Check for correct memo usage. - if opcode.name in ("PUT", "BINPUT", "LONG_BINPUT"): - assert arg is not None - if arg in memo: + if opcode.name in ("PUT", "BINPUT", "LONG_BINPUT", "MEMOIZE"): + if opcode.name == "MEMOIZE": + memo_idx = len(memo) + else: + assert arg is not None + memo_idx = arg + if memo_idx in memo: errormsg = "memo key %r already defined" % arg elif not stack: errormsg = "stack is empty -- can't store into memo" elif stack[-1] is markobject: errormsg = "can't store markobject in the memo" else: - memo[arg] = stack[-1] - + memo[memo_idx] = stack[-1] elif opcode.name in ("GET", "BINGET", "LONG_BINGET"): if arg in memo: assert len(after) == 1 diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1,9 +1,10 @@ +import copyreg import io -import unittest import pickle import pickletools +import random import sys -import copyreg +import unittest import weakref from http.cookies import SimpleCookie @@ -95,6 +96,9 @@ def __getinitargs__(self): return () +class H(object): + pass + import __main__ __main__.C = C C.__module__ = "__main__" @@ -102,6 +106,8 @@ D.__module__ = "__main__" __main__.E = E E.__module__ = "__main__" +__main__.H = H +H.__module__ = "__main__" class myint(int): def __init__(self, x): @@ -428,6 +434,7 @@ x.append(5) return x + class AbstractPickleTests(unittest.TestCase): # Subclass must define self.dumps, self.loads. @@ -436,23 +443,41 @@ def setUp(self): pass + def assert_is_copy(self, obj, objcopy, msg=None): + """Utility method to verify if two objects are copies of each others. 
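[Illustrative note, not part of the changeset] The reworked optimize() above has to cope with framing: it drops PUT opcodes whose memo slots are never fetched and rebuilds the frames when the input is a protocol-4 pickle. A hedged usage example; the exact savings depend on the pickler in use, but the result must stay loadable:

    import pickle
    import pickletools

    row = ('alpha', 'beta')
    payload = pickle.dumps([row, row, row], protocol=4)
    slim = pickletools.optimize(payload)
    print(len(slim) <= len(payload))                      # True
    print(pickle.loads(slim) == pickle.loads(payload))    # True
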
+ """ + if msg is None: + msg = "{!r} is not a copy of {!r}".format(obj, objcopy) + self.assertEqual(obj, objcopy, msg=msg) + self.assertIs(type(obj), type(objcopy), msg=msg) + if hasattr(obj, '__dict__'): + self.assertDictEqual(obj.__dict__, objcopy.__dict__, msg=msg) + self.assertIsNot(obj.__dict__, objcopy.__dict__, msg=msg) + if hasattr(obj, '__slots__'): + self.assertListEqual(obj.__slots__, objcopy.__slots__, msg=msg) + for slot in obj.__slots__: + self.assertEqual( + hasattr(obj, slot), hasattr(objcopy, slot), msg=msg) + self.assertEqual(getattr(obj, slot, None), + getattr(objcopy, slot, None), msg=msg) + def test_misc(self): # test various datatypes not tested by testdata for proto in protocols: x = myint(4) s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) x = (1, ()) s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) x = initarg(1, x) s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) # XXX test __reduce__ protocol? @@ -461,16 +486,16 @@ for proto in protocols: s = self.dumps(expected, proto) got = self.loads(s) - self.assertEqual(expected, got) + self.assert_is_copy(expected, got) def test_load_from_data0(self): - self.assertEqual(self._testdata, self.loads(DATA0)) + self.assert_is_copy(self._testdata, self.loads(DATA0)) def test_load_from_data1(self): - self.assertEqual(self._testdata, self.loads(DATA1)) + self.assert_is_copy(self._testdata, self.loads(DATA1)) def test_load_from_data2(self): - self.assertEqual(self._testdata, self.loads(DATA2)) + self.assert_is_copy(self._testdata, self.loads(DATA2)) def test_load_classic_instance(self): # See issue5180. Test loading 2.x pickles that @@ -492,7 +517,7 @@ b"X\n" b"p0\n" b"(dp1\nb.").replace(b'X', xname) - self.assertEqual(X(*args), self.loads(pickle0)) + self.assert_is_copy(X(*args), self.loads(pickle0)) # Protocol 1 (binary mode pickle) """ @@ -509,7 +534,7 @@ pickle1 = (b'(c__main__\n' b'X\n' b'q\x00oq\x01}q\x02b.').replace(b'X', xname) - self.assertEqual(X(*args), self.loads(pickle1)) + self.assert_is_copy(X(*args), self.loads(pickle1)) # Protocol 2 (pickle2 = b'\x80\x02' + pickle1) """ @@ -527,7 +552,7 @@ pickle2 = (b'\x80\x02(c__main__\n' b'X\n' b'q\x00oq\x01}q\x02b.').replace(b'X', xname) - self.assertEqual(X(*args), self.loads(pickle2)) + self.assert_is_copy(X(*args), self.loads(pickle2)) # There are gratuitous differences between pickles produced by # pickle and cPickle, largely because cPickle starts PUT indices at @@ -552,6 +577,7 @@ for proto in protocols: s = self.dumps(l, proto) x = self.loads(s) + self.assertIsInstance(x, list) self.assertEqual(len(x), 1) self.assertTrue(x is x[0]) @@ -561,6 +587,7 @@ for proto in protocols: s = self.dumps(t, proto) x = self.loads(s) + self.assertIsInstance(x, tuple) self.assertEqual(len(x), 1) self.assertEqual(len(x[0]), 1) self.assertTrue(x is x[0][0]) @@ -571,15 +598,39 @@ for proto in protocols: s = self.dumps(d, proto) x = self.loads(s) + self.assertIsInstance(x, dict) self.assertEqual(list(x.keys()), [1]) self.assertTrue(x[1] is x) + def test_recursive_set(self): + h = H() + y = set({h}) + h.attr = y + for proto in protocols: + s = self.dumps(y, proto) + x = self.loads(s) + self.assertIsInstance(x, set) + self.assertIs(list(x)[0].attr, x) + self.assertEqual(len(x), 1) + + def test_recursive_frozenset(self): + h = H() + y = frozenset({h}) + h.attr = y + for proto in protocols: + s = self.dumps(y, proto) + x = self.loads(s) + 
self.assertIsInstance(x, frozenset) + self.assertIs(list(x)[0].attr, x) + self.assertEqual(len(x), 1) + def test_recursive_inst(self): i = C() i.attr = i for proto in protocols: s = self.dumps(i, proto) x = self.loads(s) + self.assertIsInstance(x, C) self.assertEqual(dir(x), dir(i)) self.assertIs(x.attr, x) @@ -592,6 +643,7 @@ for proto in protocols: s = self.dumps(l, proto) x = self.loads(s) + self.assertIsInstance(x, list) self.assertEqual(len(x), 1) self.assertEqual(dir(x[0]), dir(i)) self.assertEqual(list(x[0].attr.keys()), [1]) @@ -599,7 +651,8 @@ def test_get(self): self.assertRaises(KeyError, self.loads, b'g0\np0') - self.assertEqual(self.loads(b'((Kdtp0\nh\x00l.))'), [(100,), (100,)]) + self.assert_is_copy([(100,), (100,)], + self.loads(b'((Kdtp0\nh\x00l.))')) def test_unicode(self): endcases = ['', '<\\u>', '<\\\u1234>', '<\n>', @@ -610,26 +663,26 @@ for u in endcases: p = self.dumps(u, proto) u2 = self.loads(p) - self.assertEqual(u2, u) + self.assert_is_copy(u, u2) def test_unicode_high_plane(self): t = '\U00012345' for proto in protocols: p = self.dumps(t, proto) t2 = self.loads(p) - self.assertEqual(t2, t) + self.assert_is_copy(t, t2) def test_bytes(self): for proto in protocols: for s in b'', b'xyz', b'xyz'*100: p = self.dumps(s, proto) - self.assertEqual(self.loads(p), s) + self.assert_is_copy(s, self.loads(p)) for s in [bytes([i]) for i in range(256)]: p = self.dumps(s, proto) - self.assertEqual(self.loads(p), s) + self.assert_is_copy(s, self.loads(p)) for s in [bytes([i, i]) for i in range(256)]: p = self.dumps(s, proto) - self.assertEqual(self.loads(p), s) + self.assert_is_copy(s, self.loads(p)) def test_ints(self): import sys @@ -639,14 +692,14 @@ for expected in (-n, n): s = self.dumps(expected, proto) n2 = self.loads(s) - self.assertEqual(expected, n2) + self.assert_is_copy(expected, n2) n = n >> 1 def test_maxint64(self): maxint64 = (1 << 63) - 1 data = b'I' + str(maxint64).encode("ascii") + b'\n.' got = self.loads(data) - self.assertEqual(got, maxint64) + self.assert_is_copy(maxint64, got) # Try too with a bogus literal. data = b'I' + str(maxint64).encode("ascii") + b'JUNK\n.' @@ -661,7 +714,7 @@ for n in npos, -npos: pickle = self.dumps(n, proto) got = self.loads(pickle) - self.assertEqual(n, got) + self.assert_is_copy(n, got) # Try a monster. This is quadratic-time in protos 0 & 1, so don't # bother with those. 
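[Illustrative note, not part of the changeset] The integer round-trip tests above ultimately rest on the two's-complement helpers that this patch moves near the top of pickle.py; they are easy to sanity-check directly:

    from pickle import encode_long, decode_long

    for n in (0, 127, -128, 255, 32767, -32768, 10**30, -10**30):
        assert decode_long(encode_long(n)) == n
    print(encode_long(255))    # b'\xff\x00', matching the doctest above
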
nbase = int("deadbeeffeedface", 16) @@ -669,7 +722,7 @@ for n in nbase, -nbase: p = self.dumps(n, 2) got = self.loads(p) - self.assertEqual(n, got) + self.assert_is_copy(n, got) def test_float(self): test_values = [0.0, 4.94e-324, 1e-310, 7e-308, 6.626e-34, 0.1, 0.5, @@ -679,7 +732,7 @@ for value in test_values: pickle = self.dumps(value, proto) got = self.loads(pickle) - self.assertEqual(value, got) + self.assert_is_copy(value, got) @run_with_locale('LC_ALL', 'de_DE', 'fr_FR') def test_float_format(self): @@ -711,6 +764,7 @@ s = self.dumps(a, proto) b = self.loads(s) self.assertEqual(a, b) + self.assertIs(type(a), type(b)) def test_structseq(self): import time @@ -720,48 +774,48 @@ for proto in protocols: s = self.dumps(t, proto) u = self.loads(s) - self.assertEqual(t, u) + self.assert_is_copy(t, u) if hasattr(os, "stat"): t = os.stat(os.curdir) s = self.dumps(t, proto) u = self.loads(s) - self.assertEqual(t, u) + self.assert_is_copy(t, u) if hasattr(os, "statvfs"): t = os.statvfs(os.curdir) s = self.dumps(t, proto) u = self.loads(s) - self.assertEqual(t, u) + self.assert_is_copy(t, u) def test_ellipsis(self): for proto in protocols: s = self.dumps(..., proto) u = self.loads(s) - self.assertEqual(..., u) + self.assertIs(..., u) def test_notimplemented(self): for proto in protocols: s = self.dumps(NotImplemented, proto) u = self.loads(s) - self.assertEqual(NotImplemented, u) + self.assertIs(NotImplemented, u) # Tests for protocol 2 def test_proto(self): - build_none = pickle.NONE + pickle.STOP for proto in protocols: - expected = build_none + pickled = self.dumps(None, proto) if proto >= 2: - expected = pickle.PROTO + bytes([proto]) + expected - p = self.dumps(None, proto) - self.assertEqual(p, expected) + proto_header = pickle.PROTO + bytes([proto]) + self.assertTrue(pickled.startswith(proto_header)) + else: + self.assertEqual(count_opcode(pickle.PROTO, pickled), 0) oob = protocols[-1] + 1 # a future protocol + build_none = pickle.NONE + pickle.STOP badpickle = pickle.PROTO + bytes([oob]) + build_none try: self.loads(badpickle) - except ValueError as detail: - self.assertTrue(str(detail).startswith( - "unsupported pickle protocol")) + except ValueError as err: + self.assertIn("unsupported pickle protocol", str(err)) else: self.fail("expected bad protocol number to raise ValueError") @@ -770,7 +824,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) self.assertEqual(opcode_in_pickle(pickle.LONG1, s), proto >= 2) def test_long4(self): @@ -778,7 +832,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) self.assertEqual(opcode_in_pickle(pickle.LONG4, s), proto >= 2) def test_short_tuples(self): @@ -816,9 +870,9 @@ for x in a, b, c, d, e: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y, (proto, x, s, y)) - expected = expected_opcode[proto, len(x)] - self.assertEqual(opcode_in_pickle(expected, s), True) + self.assert_is_copy(x, y) + expected = expected_opcode[min(proto, 3), len(x)] + self.assertTrue(opcode_in_pickle(expected, s)) def test_singletons(self): # Map (proto, singleton) to expected opcode. 
@@ -842,8 +896,8 @@ s = self.dumps(x, proto) y = self.loads(s) self.assertTrue(x is y, (proto, x, s, y)) - expected = expected_opcode[proto, x] - self.assertEqual(opcode_in_pickle(expected, s), True) + expected = expected_opcode[min(proto, 3), x] + self.assertTrue(opcode_in_pickle(expected, s)) def test_newobj_tuple(self): x = MyTuple([1, 2, 3]) @@ -852,8 +906,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(tuple(x), tuple(y)) - self.assertEqual(x.__dict__, y.__dict__) + self.assert_is_copy(x, y) def test_newobj_list(self): x = MyList([1, 2, 3]) @@ -862,8 +915,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(list(x), list(y)) - self.assertEqual(x.__dict__, y.__dict__) + self.assert_is_copy(x, y) def test_newobj_generic(self): for proto in protocols: @@ -874,6 +926,7 @@ s = self.dumps(x, proto) y = self.loads(s) detail = (proto, C, B, x, y, type(y)) + self.assert_is_copy(x, y) # XXX revisit self.assertEqual(B(x), B(y), detail) self.assertEqual(x.__dict__, y.__dict__, detail) @@ -912,11 +965,10 @@ s1 = self.dumps(x, 1) self.assertIn(__name__.encode("utf-8"), s1) self.assertIn(b"MyList", s1) - self.assertEqual(opcode_in_pickle(opcode, s1), False) + self.assertFalse(opcode_in_pickle(opcode, s1)) y = self.loads(s1) - self.assertEqual(list(x), list(y)) - self.assertEqual(x.__dict__, y.__dict__) + self.assert_is_copy(x, y) # Dump using protocol 2 for test. s2 = self.dumps(x, 2) @@ -925,9 +977,7 @@ self.assertEqual(opcode_in_pickle(opcode, s2), True, repr(s2)) y = self.loads(s2) - self.assertEqual(list(x), list(y)) - self.assertEqual(x.__dict__, y.__dict__) - + self.assert_is_copy(x, y) finally: e.restore() @@ -951,7 +1001,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) num_appends = count_opcode(pickle.APPENDS, s) self.assertEqual(num_appends, proto > 0) @@ -960,7 +1010,7 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) num_appends = count_opcode(pickle.APPENDS, s) if proto == 0: self.assertEqual(num_appends, 0) @@ -974,7 +1024,7 @@ s = self.dumps(x, proto) self.assertIsInstance(s, bytes_types) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) num_setitems = count_opcode(pickle.SETITEMS, s) self.assertEqual(num_setitems, proto > 0) @@ -983,22 +1033,49 @@ for proto in protocols: s = self.dumps(x, proto) y = self.loads(s) - self.assertEqual(x, y) + self.assert_is_copy(x, y) num_setitems = count_opcode(pickle.SETITEMS, s) if proto == 0: self.assertEqual(num_setitems, 0) else: self.assertTrue(num_setitems >= 2) + def test_set_chunking(self): + n = 10 # too small to chunk + x = set(range(n)) + for proto in protocols: + s = self.dumps(x, proto) + y = self.loads(s) + self.assert_is_copy(x, y) + num_additems = count_opcode(pickle.ADDITEMS, s) + if proto < 4: + self.assertEqual(num_additems, 0) + else: + self.assertEqual(num_additems, 1) + + n = 2500 # expect at least two chunks when proto >= 4 + x = set(range(n)) + for proto in protocols: + s = self.dumps(x, proto) + y = self.loads(s) + self.assert_is_copy(x, y) + num_additems = count_opcode(pickle.ADDITEMS, s) + if proto < 4: + self.assertEqual(num_additems, 0) + else: + self.assertGreaterEqual(num_additems, 2) + def test_simple_newobj(self): x = object.__new__(SimpleNewObj) # avoid __init__ x.abc = 666 for proto in protocols: s = self.dumps(x, proto) - self.assertEqual(opcode_in_pickle(pickle.NEWOBJ, 
s), proto >= 2) + self.assertEqual(opcode_in_pickle(pickle.NEWOBJ, s), + 2 <= proto < 4) + self.assertEqual(opcode_in_pickle(pickle.NEWOBJ_EX, s), + proto >= 4) y = self.loads(s) # will raise TypeError if __init__ called - self.assertEqual(y.abc, 666) - self.assertEqual(x.__dict__, y.__dict__) + self.assert_is_copy(x, y) def test_newobj_list_slots(self): x = SlotList([1, 2, 3]) @@ -1006,10 +1083,7 @@ x.bar = "hello" s = self.dumps(x, 2) y = self.loads(s) - self.assertEqual(list(x), list(y)) - self.assertEqual(x.__dict__, y.__dict__) - self.assertEqual(x.foo, y.foo) - self.assertEqual(x.bar, y.bar) + self.assert_is_copy(x, y) def test_reduce_overrides_default_reduce_ex(self): for proto in protocols: @@ -1058,11 +1132,10 @@ @no_tracing def test_bad_getattr(self): + # Issue #3514: crash when there is an infinite loop in __getattr__ x = BadGetattr() - for proto in 0, 1: + for proto in protocols: self.assertRaises(RuntimeError, self.dumps, x, proto) - # protocol 2 don't raise a RuntimeError. - d = self.dumps(x, 2) def test_reduce_bad_iterator(self): # Issue4176: crash when 4th and 5th items of __reduce__() @@ -1095,11 +1168,10 @@ obj = [dict(large_dict), dict(large_dict), dict(large_dict)] for proto in protocols: - dumped = self.dumps(obj, proto) - loaded = self.loads(dumped) - self.assertEqual(loaded, obj, - "Failed protocol %d: %r != %r" - % (proto, obj, loaded)) + with self.subTest(proto=proto): + dumped = self.dumps(obj, proto) + loaded = self.loads(dumped) + self.assert_is_copy(obj, loaded) def test_attribute_name_interning(self): # Test that attribute names of pickled objects are interned when @@ -1155,11 +1227,14 @@ def test_int_pickling_efficiency(self): # Test compacity of int representation (see issue #12744) for proto in protocols: - sizes = [len(self.dumps(2**n, proto)) for n in range(70)] - # the size function is monotonic - self.assertEqual(sorted(sizes), sizes) - if proto >= 2: - self.assertLessEqual(sizes[-1], 14) + with self.subTest(proto=proto): + pickles = [self.dumps(2**n, proto) for n in range(70)] + sizes = list(map(len, pickles)) + # the size function is monotonic + self.assertEqual(sorted(sizes), sizes) + if proto >= 2: + for p in pickles: + self.assertFalse(opcode_in_pickle(pickle.LONG, p)) def check_negative_32b_binXXX(self, dumped): if sys.maxsize > 2**32: @@ -1242,6 +1317,137 @@ else: self._check_pickling_with_opcode(obj, pickle.SETITEMS, proto) + # Exercise framing (proto >= 4) for significant workloads + + FRAME_SIZE_TARGET = 64 * 1024 + + def test_framing_many_objects(self): + obj = list(range(10**5)) + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + with self.subTest(proto=proto): + pickled = self.dumps(obj, proto) + unpickled = self.loads(pickled) + self.assertEqual(obj, unpickled) + # Test the framing heuristic is sane, + # assuming a given frame size target. + bytes_per_frame = (len(pickled) / + pickled.count(b'\x00\x00\x00\x00\x00')) + self.assertGreater(bytes_per_frame, + self.FRAME_SIZE_TARGET / 2) + self.assertLessEqual(bytes_per_frame, + self.FRAME_SIZE_TARGET * 1) + + def test_framing_large_objects(self): + N = 1024 * 1024 + obj = [b'x' * N, b'y' * N, b'z' * N] + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + with self.subTest(proto=proto): + pickled = self.dumps(obj, proto) + unpickled = self.loads(pickled) + self.assertEqual(obj, unpickled) + # At least one frame was emitted per large bytes object. 
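[Illustrative note, not part of the changeset] The framing tests above estimate frame counts from runs of zero bytes in the 8-byte frame lengths; with pickletools you can count FRAME opcodes directly and eyeball the roughly 64 KiB target the tests encode as FRAME_SIZE_TARGET:

    import pickle
    import pickletools

    payload = pickle.dumps(list(range(10**5)), protocol=4)
    n_frames = sum(op.name == 'FRAME'
                   for op, arg, pos in pickletools.genops(payload))
    print(n_frames, len(payload) // max(n_frames, 1))   # several frames, ~64 KiB each
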
+ n_frames = pickled.count(b'\x00\x00\x00\x00\x00') + self.assertGreaterEqual(n_frames, len(obj)) + + def test_nested_names(self): + global Nested + class Nested: + class A: + class B: + class C: + pass + + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + for obj in [Nested.A, Nested.A.B, Nested.A.B.C]: + with self.subTest(proto=proto, obj=obj): + unpickled = self.loads(self.dumps(obj, proto)) + self.assertIs(obj, unpickled) + + def test_py_methods(self): + global PyMethodsTest + class PyMethodsTest: + @staticmethod + def cheese(): + return "cheese" + @classmethod + def wine(cls): + assert cls is PyMethodsTest + return "wine" + def biscuits(self): + assert isinstance(self, PyMethodsTest) + return "biscuits" + class Nested: + "Nested class" + @staticmethod + def ketchup(): + return "ketchup" + @classmethod + def maple(cls): + assert cls is PyMethodsTest.Nested + return "maple" + def pie(self): + assert isinstance(self, PyMethodsTest.Nested) + return "pie" + + py_methods = ( + PyMethodsTest.cheese, + PyMethodsTest.wine, + PyMethodsTest().biscuits, + PyMethodsTest.Nested.ketchup, + PyMethodsTest.Nested.maple, + PyMethodsTest.Nested().pie + ) + py_unbound_methods = ( + (PyMethodsTest.biscuits, PyMethodsTest), + (PyMethodsTest.Nested.pie, PyMethodsTest.Nested) + ) + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + for method in py_methods: + with self.subTest(proto=proto, method=method): + unpickled = self.loads(self.dumps(method, proto)) + self.assertEqual(method(), unpickled()) + for method, cls in py_unbound_methods: + obj = cls() + with self.subTest(proto=proto, method=method): + unpickled = self.loads(self.dumps(method, proto)) + self.assertEqual(method(obj), unpickled(obj)) + + + def test_c_methods(self): + global Subclass + class Subclass(tuple): + class Nested(str): + pass + + c_methods = ( + # bound built-in method + ("abcd".index, ("c",)), + # unbound built-in method + (str.index, ("abcd", "c")), + # bound "slot" method + ([1, 2, 3].__len__, ()), + # unbound "slot" method + (list.__len__, ([1, 2, 3],)), + # bound "coexist" method + ({1, 2}.__contains__, (2,)), + # unbound "coexist" method + (set.__contains__, ({1, 2}, 2)), + # built-in class method + (dict.fromkeys, (("a", 1), ("b", 2))), + # built-in static method + (bytearray.maketrans, (b"abc", b"xyz")), + # subclass methods + (Subclass([1,2,2]).count, (2,)), + (Subclass.count, (Subclass([1,2,2]), 2)), + (Subclass.Nested("sweet").count, ("e",)), + (Subclass.Nested.count, (Subclass.Nested("sweet"), "e")), + ) + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + for method, args in c_methods: + with self.subTest(proto=proto, method=method): + unpickled = self.loads(self.dumps(method, proto)) + self.assertEqual(method(*args), unpickled(*args)) + class BigmemPickleTests(unittest.TestCase): @@ -1252,10 +1458,11 @@ data = 1 << (8 * size) try: for proto in protocols: - if proto < 2: - continue - with self.assertRaises((ValueError, OverflowError)): - self.dumps(data, protocol=proto) + with self.subTest(proto=proto): + if proto < 2: + continue + with self.assertRaises((ValueError, OverflowError)): + self.dumps(data, protocol=proto) finally: data = None @@ -1268,14 +1475,15 @@ data = b"abcd" * (size // 4) try: for proto in protocols: - if proto < 3: - continue - try: - pickled = self.dumps(data, protocol=proto) - self.assertTrue(b"abcd" in pickled[:15]) - self.assertTrue(b"abcd" in pickled[-15:]) - finally: - pickled = None + with self.subTest(proto=proto): + if proto < 3: + continue + try: + pickled = self.dumps(data, 
protocol=proto) + self.assertTrue(b"abcd" in pickled[:19]) + self.assertTrue(b"abcd" in pickled[-18:]) + finally: + pickled = None finally: data = None @@ -1284,10 +1492,11 @@ data = b"a" * size try: for proto in protocols: - if proto < 3: - continue - with self.assertRaises((ValueError, OverflowError)): - self.dumps(data, protocol=proto) + with self.subTest(proto=proto): + if proto < 3: + continue + with self.assertRaises((ValueError, OverflowError)): + self.dumps(data, protocol=proto) finally: data = None @@ -1299,27 +1508,38 @@ data = "abcd" * (size // 4) try: for proto in protocols: - try: - pickled = self.dumps(data, protocol=proto) - self.assertTrue(b"abcd" in pickled[:15]) - self.assertTrue(b"abcd" in pickled[-15:]) - finally: - pickled = None + with self.subTest(proto=proto): + try: + pickled = self.dumps(data, protocol=proto) + self.assertTrue(b"abcd" in pickled[:19]) + self.assertTrue(b"abcd" in pickled[-18:]) + finally: + pickled = None finally: data = None - # BINUNICODE (protocols 1, 2 and 3) cannot carry more than - # 2**32 - 1 bytes of utf-8 encoded unicode. + # BINUNICODE (protocols 1, 2 and 3) cannot carry more than 2**32 - 1 bytes + # of utf-8 encoded unicode. BINUNICODE8 (protocol 4) supports these huge + # unicode strings however. - @bigmemtest(size=_4G, memuse=1 + ascii_char_size, dry_run=False) + @bigmemtest(size=_4G, memuse=2 + ascii_char_size, dry_run=False) def test_huge_str_64b(self, size): - data = "a" * size + data = "abcd" * (size // 4) try: for proto in protocols: - if proto == 0: - continue - with self.assertRaises((ValueError, OverflowError)): - self.dumps(data, protocol=proto) + with self.subTest(proto=proto): + if proto == 0: + continue + if proto < 4: + with self.assertRaises((ValueError, OverflowError)): + self.dumps(data, protocol=proto) + else: + try: + pickled = self.dumps(data, protocol=proto) + self.assertTrue(b"abcd" in pickled[:19]) + self.assertTrue(b"abcd" in pickled[-18:]) + finally: + pickled = None finally: data = None @@ -1363,8 +1583,8 @@ return object.__reduce__(self) class REX_six(object): - """This class is used to check the 4th argument (list iterator) of the reduce - protocol. + """This class is used to check the 4th argument (list iterator) of + the reduce protocol. """ def __init__(self, items=None): self.items = items if items is not None else [] @@ -1376,8 +1596,8 @@ return type(self), (), None, iter(self.items), None class REX_seven(object): - """This class is used to check the 5th argument (dict iterator) of the reduce - protocol. + """This class is used to check the 5th argument (dict iterator) of + the reduce protocol. """ def __init__(self, table=None): self.table = table if table is not None else {} @@ -1415,10 +1635,16 @@ class MyDict(dict): sample = {"a": 1, "b": 2} +class MySet(set): + sample = {"a", "b"} + +class MyFrozenSet(frozenset): + sample = frozenset({"a", "b"}) + myclasses = [MyInt, MyFloat, MyComplex, MyStr, MyUnicode, - MyTuple, MyList, MyDict] + MyTuple, MyList, MyDict, MySet, MyFrozenSet] class SlotList(MyList): @@ -1428,6 +1654,8 @@ def __init__(self, a, b, c): # raise an error, to make sure this isn't called raise TypeError("SimpleNewObj.__init__() didn't expect to get called") + def __eq__(self, other): + return self.__dict__ == other.__dict__ class BadGetattr: def __getattr__(self, key): @@ -1464,7 +1692,7 @@ def test_highest_protocol(self): # Of course this needs to be changed when HIGHEST_PROTOCOL changes. 
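[Illustrative note, not part of the changeset] test_highest_protocol above pins HIGHEST_PROTOCOL to 4 after this change, while the default stays at 3 (see the Pickler docstring earlier in the diff). Checking the constants is a one-liner; later Python versions bump them again:

    import pickle

    # Right after this changeset: HIGHEST_PROTOCOL == 4, DEFAULT_PROTOCOL == 3.
    # On newer interpreters both values may be higher.
    print(pickle.HIGHEST_PROTOCOL, pickle.DEFAULT_PROTOCOL)
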
- self.assertEqual(pickle.HIGHEST_PROTOCOL, 3) + self.assertEqual(pickle.HIGHEST_PROTOCOL, 4) def test_callapi(self): f = io.BytesIO() @@ -1645,22 +1873,23 @@ def _check_multiple_unpicklings(self, ioclass): for proto in protocols: - data1 = [(x, str(x)) for x in range(2000)] + [b"abcde", len] - f = ioclass() - pickler = self.pickler_class(f, protocol=proto) - pickler.dump(data1) - pickled = f.getvalue() + with self.subTest(proto=proto): + data1 = [(x, str(x)) for x in range(2000)] + [b"abcde", len] + f = ioclass() + pickler = self.pickler_class(f, protocol=proto) + pickler.dump(data1) + pickled = f.getvalue() - N = 5 - f = ioclass(pickled * N) - unpickler = self.unpickler_class(f) - for i in range(N): - if f.seekable(): - pos = f.tell() - self.assertEqual(unpickler.load(), data1) - if f.seekable(): - self.assertEqual(f.tell(), pos + len(pickled)) - self.assertRaises(EOFError, unpickler.load) + N = 5 + f = ioclass(pickled * N) + unpickler = self.unpickler_class(f) + for i in range(N): + if f.seekable(): + pos = f.tell() + self.assertEqual(unpickler.load(), data1) + if f.seekable(): + self.assertEqual(f.tell(), pos + len(pickled)) + self.assertRaises(EOFError, unpickler.load) def test_multiple_unpicklings_seekable(self): self._check_multiple_unpicklings(io.BytesIO) diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -1,8 +1,11 @@ import builtins +import copyreg import gc +import itertools +import math +import pickle import sys import types -import math import unittest import weakref @@ -3153,176 +3156,6 @@ self.assertEqual(e.a, 1) self.assertEqual(can_delete_dict(e), can_delete_dict(ValueError())) - def test_pickles(self): - # Testing pickling and copying new-style classes and objects... 
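[Illustrative note, not part of the changeset] _check_multiple_unpicklings above exists because framing must not make the unpickler read past the STOP of the pickle it is currently loading; several pickles written back to back onto one stream still load one at a time:

    import io
    import pickle

    buf = io.BytesIO()
    for obj in (1, "two", [3, 4]):
        pickle.dump(obj, buf, protocol=4)
    buf.seek(0)
    unpickler = pickle.Unpickler(buf)
    print(unpickler.load(), unpickler.load(), unpickler.load())   # 1 two [3, 4]
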
- import pickle - - def sorteditems(d): - L = list(d.items()) - L.sort() - return L - - global C - class C(object): - def __init__(self, a, b): - super(C, self).__init__() - self.a = a - self.b = b - def __repr__(self): - return "C(%r, %r)" % (self.a, self.b) - - global C1 - class C1(list): - def __new__(cls, a, b): - return super(C1, cls).__new__(cls) - def __getnewargs__(self): - return (self.a, self.b) - def __init__(self, a, b): - self.a = a - self.b = b - def __repr__(self): - return "C1(%r, %r)<%r>" % (self.a, self.b, list(self)) - - global C2 - class C2(int): - def __new__(cls, a, b, val=0): - return super(C2, cls).__new__(cls, val) - def __getnewargs__(self): - return (self.a, self.b, int(self)) - def __init__(self, a, b, val=0): - self.a = a - self.b = b - def __repr__(self): - return "C2(%r, %r)<%r>" % (self.a, self.b, int(self)) - - global C3 - class C3(object): - def __init__(self, foo): - self.foo = foo - def __getstate__(self): - return self.foo - def __setstate__(self, foo): - self.foo = foo - - global C4classic, C4 - class C4classic: # classic - pass - class C4(C4classic, object): # mixed inheritance - pass - - for bin in 0, 1: - for cls in C, C1, C2: - s = pickle.dumps(cls, bin) - cls2 = pickle.loads(s) - self.assertIs(cls2, cls) - - a = C1(1, 2); a.append(42); a.append(24) - b = C2("hello", "world", 42) - s = pickle.dumps((a, b), bin) - x, y = pickle.loads(s) - self.assertEqual(x.__class__, a.__class__) - self.assertEqual(sorteditems(x.__dict__), sorteditems(a.__dict__)) - self.assertEqual(y.__class__, b.__class__) - self.assertEqual(sorteditems(y.__dict__), sorteditems(b.__dict__)) - self.assertEqual(repr(x), repr(a)) - self.assertEqual(repr(y), repr(b)) - # Test for __getstate__ and __setstate__ on new style class - u = C3(42) - s = pickle.dumps(u, bin) - v = pickle.loads(s) - self.assertEqual(u.__class__, v.__class__) - self.assertEqual(u.foo, v.foo) - # Test for picklability of hybrid class - u = C4() - u.foo = 42 - s = pickle.dumps(u, bin) - v = pickle.loads(s) - self.assertEqual(u.__class__, v.__class__) - self.assertEqual(u.foo, v.foo) - - # Testing copy.deepcopy() - import copy - for cls in C, C1, C2: - cls2 = copy.deepcopy(cls) - self.assertIs(cls2, cls) - - a = C1(1, 2); a.append(42); a.append(24) - b = C2("hello", "world", 42) - x, y = copy.deepcopy((a, b)) - self.assertEqual(x.__class__, a.__class__) - self.assertEqual(sorteditems(x.__dict__), sorteditems(a.__dict__)) - self.assertEqual(y.__class__, b.__class__) - self.assertEqual(sorteditems(y.__dict__), sorteditems(b.__dict__)) - self.assertEqual(repr(x), repr(a)) - self.assertEqual(repr(y), repr(b)) - - def test_pickle_slots(self): - # Testing pickling of classes with __slots__ ... 
- import pickle - # Pickling of classes with __slots__ but without __getstate__ should fail - # (if using protocol 0 or 1) - global B, C, D, E - class B(object): - pass - for base in [object, B]: - class C(base): - __slots__ = ['a'] - class D(C): - pass - try: - pickle.dumps(C(), 0) - except TypeError: - pass - else: - self.fail("should fail: pickle C instance - %s" % base) - try: - pickle.dumps(C(), 0) - except TypeError: - pass - else: - self.fail("should fail: pickle D instance - %s" % base) - # Give C a nice generic __getstate__ and __setstate__ - class C(base): - __slots__ = ['a'] - def __getstate__(self): - try: - d = self.__dict__.copy() - except AttributeError: - d = {} - for cls in self.__class__.__mro__: - for sn in cls.__dict__.get('__slots__', ()): - try: - d[sn] = getattr(self, sn) - except AttributeError: - pass - return d - def __setstate__(self, d): - for k, v in list(d.items()): - setattr(self, k, v) - class D(C): - pass - # Now it should work - x = C() - y = pickle.loads(pickle.dumps(x)) - self.assertNotHasAttr(y, 'a') - x.a = 42 - y = pickle.loads(pickle.dumps(x)) - self.assertEqual(y.a, 42) - x = D() - x.a = 42 - x.b = 100 - y = pickle.loads(pickle.dumps(x)) - self.assertEqual(y.a + y.b, 142) - # A subclass that adds a slot should also work - class E(C): - __slots__ = ['b'] - x = E() - x.a = 42 - x.b = "foo" - y = pickle.loads(pickle.dumps(x)) - self.assertEqual(y.a, x.a) - self.assertEqual(y.b, x.b) - def test_binary_operator_override(self): # Testing overrides of binary operations... class I(int): @@ -4690,11 +4523,439 @@ self.assertEqual(X.mykey2, 'from Base2') +class PicklingTests(unittest.TestCase): + + def _check_reduce(self, proto, obj, args=(), kwargs={}, state=None, + listitems=None, dictitems=None): + if proto >= 4: + reduce_value = obj.__reduce_ex__(proto) + self.assertEqual(reduce_value[:3], + (copyreg.__newobj_ex__, + (type(obj), args, kwargs), + state)) + if listitems is not None: + self.assertListEqual(list(reduce_value[3]), listitems) + else: + self.assertIsNone(reduce_value[3]) + if dictitems is not None: + self.assertDictEqual(dict(reduce_value[4]), dictitems) + else: + self.assertIsNone(reduce_value[4]) + elif proto >= 2: + reduce_value = obj.__reduce_ex__(proto) + self.assertEqual(reduce_value[:3], + (copyreg.__newobj__, + (type(obj),) + args, + state)) + if listitems is not None: + self.assertListEqual(list(reduce_value[3]), listitems) + else: + self.assertIsNone(reduce_value[3]) + if dictitems is not None: + self.assertDictEqual(dict(reduce_value[4]), dictitems) + else: + self.assertIsNone(reduce_value[4]) + else: + base_type = type(obj).__base__ + reduce_value = (copyreg._reconstructor, + (type(obj), + base_type, + None if base_type is object else base_type(obj))) + if state is not None: + reduce_value += (state,) + self.assertEqual(obj.__reduce_ex__(proto), reduce_value) + self.assertEqual(obj.__reduce__(), reduce_value) + + def test_reduce(self): + protocols = range(pickle.HIGHEST_PROTOCOL + 1) + args = (-101, "spam") + kwargs = {'bacon': -201, 'fish': -301} + state = {'cheese': -401} + + class C1: + def __getnewargs__(self): + return args + obj = C1() + for proto in protocols: + self._check_reduce(proto, obj, args) + + for name, value in state.items(): + setattr(obj, name, value) + for proto in protocols: + self._check_reduce(proto, obj, args, state=state) + + class C2: + def __getnewargs__(self): + return "bad args" + obj = C2() + for proto in protocols: + if proto >= 2: + with self.assertRaises(TypeError): + obj.__reduce_ex__(proto) + + class 
C3: + def __getnewargs_ex__(self): + return (args, kwargs) + obj = C3() + for proto in protocols: + if proto >= 4: + self._check_reduce(proto, obj, args, kwargs) + elif proto >= 2: + with self.assertRaises(ValueError): + obj.__reduce_ex__(proto) + + class C4: + def __getnewargs_ex__(self): + return (args, "bad dict") + class C5: + def __getnewargs_ex__(self): + return ("bad tuple", kwargs) + class C6: + def __getnewargs_ex__(self): + return () + class C7: + def __getnewargs_ex__(self): + return "bad args" + for proto in protocols: + for cls in C4, C5, C6, C7: + obj = cls() + if proto >= 2: + with self.assertRaises((TypeError, ValueError)): + obj.__reduce_ex__(proto) + + class C8: + def __getnewargs_ex__(self): + return (args, kwargs) + obj = C8() + for proto in protocols: + if 2 <= proto < 4: + with self.assertRaises(ValueError): + obj.__reduce_ex__(proto) + class C9: + def __getnewargs_ex__(self): + return (args, {}) + obj = C9() + for proto in protocols: + self._check_reduce(proto, obj, args) + + class C10: + def __getnewargs_ex__(self): + raise IndexError + obj = C10() + for proto in protocols: + if proto >= 2: + with self.assertRaises(IndexError): + obj.__reduce_ex__(proto) + + class C11: + def __getstate__(self): + return state + obj = C11() + for proto in protocols: + self._check_reduce(proto, obj, state=state) + + class C12: + def __getstate__(self): + return "not dict" + obj = C12() + for proto in protocols: + self._check_reduce(proto, obj, state="not dict") + + class C13: + def __getstate__(self): + raise IndexError + obj = C13() + for proto in protocols: + with self.assertRaises(IndexError): + obj.__reduce_ex__(proto) + if proto < 2: + with self.assertRaises(IndexError): + obj.__reduce__() + + class C14: + __slots__ = tuple(state) + def __init__(self): + for name, value in state.items(): + setattr(self, name, value) + + obj = C14() + for proto in protocols: + if proto >= 2: + self._check_reduce(proto, obj, state=(None, state)) + else: + with self.assertRaises(TypeError): + obj.__reduce_ex__(proto) + with self.assertRaises(TypeError): + obj.__reduce__() + + class C15(dict): + pass + obj = C15({"quebec": -601}) + for proto in protocols: + self._check_reduce(proto, obj, dictitems=dict(obj)) + + class C16(list): + pass + obj = C16(["yukon"]) + for proto in protocols: + self._check_reduce(proto, obj, listitems=list(obj)) + + def _assert_is_copy(self, obj, objcopy, msg=None): + """Utility method to verify if two objects are copies of each others. + """ + if msg is None: + msg = "{!r} is not a copy of {!r}".format(obj, objcopy) + if type(obj).__repr__ is object.__repr__: + # We have this limitation for now because we use the object's repr + # to help us verify that the two objects are copies. This allows + # us to delegate the non-generic verification logic to the objects + # themselves. 
+ raise ValueError("object passed to _assert_is_copy must " + + "override the __repr__ method.") + self.assertIsNot(obj, objcopy, msg=msg) + self.assertIs(type(obj), type(objcopy), msg=msg) + if hasattr(obj, '__dict__'): + self.assertDictEqual(obj.__dict__, objcopy.__dict__, msg=msg) + self.assertIsNot(obj.__dict__, objcopy.__dict__, msg=msg) + if hasattr(obj, '__slots__'): + self.assertListEqual(obj.__slots__, objcopy.__slots__, msg=msg) + for slot in obj.__slots__: + self.assertEqual( + hasattr(obj, slot), hasattr(objcopy, slot), msg=msg) + self.assertEqual(getattr(obj, slot, None), + getattr(objcopy, slot, None), msg=msg) + self.assertEqual(repr(obj), repr(objcopy), msg=msg) + + @staticmethod + def _generate_pickle_copiers(): + """Utility method to generate the many possible pickle configurations. + """ + class PickleCopier: + "This class copies object using pickle." + def __init__(self, proto, dumps, loads): + self.proto = proto + self.dumps = dumps + self.loads = loads + def copy(self, obj): + return self.loads(self.dumps(obj, self.proto)) + def __repr__(self): + # We try to be as descriptive as possible here since this is + # the string which we will allow us to tell the pickle + # configuration we are using during debugging. + return ("PickleCopier(proto={}, dumps={}.{}, loads={}.{})" + .format(self.proto, + self.dumps.__module__, self.dumps.__qualname__, + self.loads.__module__, self.loads.__qualname__)) + return (PickleCopier(*args) for args in + itertools.product(range(pickle.HIGHEST_PROTOCOL + 1), + {pickle.dumps, pickle._dumps}, + {pickle.loads, pickle._loads})) + + def test_pickle_slots(self): + # Tests pickling of classes with __slots__. + + # Pickling of classes with __slots__ but without __getstate__ should + # fail (if using protocol 0 or 1) + global C + class C: + __slots__ = ['a'] + with self.assertRaises(TypeError): + pickle.dumps(C(), 0) + + global D + class D(C): + pass + with self.assertRaises(TypeError): + pickle.dumps(D(), 0) + + class C: + "A class with __getstate__ and __setstate__ implemented." + __slots__ = ['a'] + def __getstate__(self): + state = getattr(self, '__dict__', {}).copy() + for cls in type(self).__mro__: + for slot in cls.__dict__.get('__slots__', ()): + try: + state[slot] = getattr(self, slot) + except AttributeError: + pass + return state + def __setstate__(self, state): + for k, v in state.items(): + setattr(self, k, v) + def __repr__(self): + return "%s()<%r>" % (type(self).__name__, self.__getstate__()) + + class D(C): + "A subclass of a class with slots." + pass + + global E + class E(C): + "A subclass with an extra slot." + __slots__ = ['b'] + + # Now it should work + for pickle_copier in self._generate_pickle_copiers(): + with self.subTest(pickle_copier=pickle_copier): + x = C() + y = pickle_copier.copy(x) + self._assert_is_copy(x, y) + + x.a = 42 + y = pickle_copier.copy(x) + self._assert_is_copy(x, y) + + x = D() + x.a = 42 + x.b = 100 + y = pickle_copier.copy(x) + self._assert_is_copy(x, y) + + x = E() + x.a = 42 + x.b = "foo" + y = pickle_copier.copy(x) + self._assert_is_copy(x, y) + + def test_reduce_copying(self): + # Tests pickling and copying new-style classes and objects. + global C1 + class C1: + "The state of this class is copyable via its instance dict." + ARGS = (1, 2) + NEED_DICT_COPYING = True + def __init__(self, a, b): + super().__init__() + self.a = a + self.b = b + def __repr__(self): + return "C1(%r, %r)" % (self.a, self.b) + + global C2 + class C2(list): + "A list subclass copyable via __getnewargs__." 
+ ARGS = (1, 2) + NEED_DICT_COPYING = False + def __new__(cls, a, b): + self = super().__new__(cls) + self.a = a + self.b = b + return self + def __init__(self, *args): + super().__init__() + # This helps testing that __init__ is not called during the + # unpickling process, which would cause extra appends. + self.append("cheese") + @classmethod + def __getnewargs__(cls): + return cls.ARGS + def __repr__(self): + return "C2(%r, %r)<%r>" % (self.a, self.b, list(self)) + + global C3 + class C3(list): + "A list subclass copyable via __getstate__." + ARGS = (1, 2) + NEED_DICT_COPYING = False + def __init__(self, a, b): + self.a = a + self.b = b + # This helps testing that __init__ is not called during the + # unpickling process, which would cause extra appends. + self.append("cheese") + @classmethod + def __getstate__(cls): + return cls.ARGS + def __setstate__(self, state): + a, b = state + self.a = a + self.b = b + def __repr__(self): + return "C3(%r, %r)<%r>" % (self.a, self.b, list(self)) + + global C4 + class C4(int): + "An int subclass copyable via __getnewargs__." + ARGS = ("hello", "world", 1) + NEED_DICT_COPYING = False + def __new__(cls, a, b, value): + self = super().__new__(cls, value) + self.a = a + self.b = b + return self + @classmethod + def __getnewargs__(cls): + return cls.ARGS + def __repr__(self): + return "C4(%r, %r)<%r>" % (self.a, self.b, int(self)) + + global C5 + class C5(int): + "An int subclass copyable via __getnewargs_ex__." + ARGS = (1, 2) + KWARGS = {'value': 3} + NEED_DICT_COPYING = False + def __new__(cls, a, b, *, value=0): + self = super().__new__(cls, value) + self.a = a + self.b = b + return self + @classmethod + def __getnewargs_ex__(cls): + return (cls.ARGS, cls.KWARGS) + def __repr__(self): + return "C5(%r, %r)<%r>" % (self.a, self.b, int(self)) + + test_classes = (C1, C2, C3, C4, C5) + # Testing copying through pickle + pickle_copiers = self._generate_pickle_copiers() + for cls, pickle_copier in itertools.product(test_classes, pickle_copiers): + with self.subTest(cls=cls, pickle_copier=pickle_copier): + kwargs = getattr(cls, 'KWARGS', {}) + obj = cls(*cls.ARGS, **kwargs) + proto = pickle_copier.proto + if 2 <= proto < 4 and hasattr(cls, '__getnewargs_ex__'): + with self.assertRaises(ValueError): + pickle_copier.dumps(obj, proto) + continue + objcopy = pickle_copier.copy(obj) + self._assert_is_copy(obj, objcopy) + # For test classes that supports this, make sure we didn't go + # around the reduce protocol by simply copying the attribute + # dictionary. We clear attributes using the previous copy to + # not mutate the original argument. + if proto >= 2 and not cls.NEED_DICT_COPYING: + objcopy.__dict__.clear() + objcopy2 = pickle_copier.copy(objcopy) + self._assert_is_copy(obj, objcopy2) + + # Testing copying through copy.deepcopy() + for cls in test_classes: + with self.subTest(cls=cls): + kwargs = getattr(cls, 'KWARGS', {}) + obj = cls(*cls.ARGS, **kwargs) + # XXX: We need to modify the copy module to support PEP 3154's + # reduce protocol 4. + if hasattr(cls, '__getnewargs_ex__'): + continue + objcopy = deepcopy(obj) + self._assert_is_copy(obj, objcopy) + # For test classes that supports this, make sure we didn't go + # around the reduce protocol by simply copying the attribute + # dictionary. We clear attributes using the previous copy to + # not mutate the original argument. 
+ if not cls.NEED_DICT_COPYING: + objcopy.__dict__.clear() + objcopy2 = deepcopy(objcopy) + self._assert_is_copy(obj, objcopy2) + + def test_main(): # Run all local test cases, with PTypesLongInitTest first. support.run_unittest(PTypesLongInitTest, OperatorsTest, ClassPropertiesAndMethods, DictProxyTests, - MiscTests) + MiscTests, PicklingTests) if __name__ == "__main__": test_main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,8 @@ Library ------- +- Issue #17810: Implement PEP 3154, pickle protocol 4. + - Issue #19668: Added support for the cp1125 encoding. - Issue #19689: Add ssl.create_default_context() factory function. It creates diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -6,7 +6,7 @@ /* Bump this when new opcodes are added to the pickle protocol. */ enum { - HIGHEST_PROTOCOL = 3, + HIGHEST_PROTOCOL = 4, DEFAULT_PROTOCOL = 3 }; @@ -71,7 +71,19 @@ /* Protocol 3 (Python 3.x) */ BINBYTES = 'B', - SHORT_BINBYTES = 'C' + SHORT_BINBYTES = 'C', + + /* Protocol 4 */ + SHORT_BINUNICODE = '\x8c', + BINUNICODE8 = '\x8d', + BINBYTES8 = '\x8e', + EMPTY_SET = '\x8f', + ADDITEMS = '\x90', + FROZENSET = '\x91', + NEWOBJ_EX = '\x92', + STACK_GLOBAL = '\x93', + MEMOIZE = '\x94', + FRAME = '\x95' }; /* These aren't opcodes -- they're ways to pickle bools before protocol 2 @@ -103,7 +115,11 @@ MAX_WRITE_BUF_SIZE = 64 * 1024, /* Prefetch size when unpickling (disabled on unpeekable streams) */ - PREFETCH = 8192 * 16 + PREFETCH = 8192 * 16, + + FRAME_SIZE_TARGET = 64 * 1024, + + FRAME_HEADER_SIZE = 9 }; /* Exception classes for pickle. These should override the ones defined in @@ -136,9 +152,6 @@ /* For looking up name pairs in copyreg._extension_registry. */ static PyObject *two_tuple = NULL; -_Py_IDENTIFIER(__name__); -_Py_IDENTIFIER(modules); - static int stack_underflow(void) { @@ -332,7 +345,12 @@ Py_ssize_t max_output_len; /* Allocation size of output_buffer. */ int proto; /* Pickle protocol number, >= 0 */ int bin; /* Boolean, true if proto > 0 */ - Py_ssize_t buf_size; /* Size of the current buffered pickle data */ + int framing; /* True when framing is enabled, proto >= 4 */ + Py_ssize_t frame_start; /* Position in output_buffer where the + where the current frame begins. -1 if there + is no frame currently open. */ + + Py_ssize_t buf_size; /* Size of the current buffered pickle data */ int fast; /* Enable fast mode if set to a true value. The fast mode disable the usage of memo, therefore speeding the pickling process by @@ -352,7 +370,8 @@ /* The unpickler memo is just an array of PyObject *s. Using a dict is unnecessary, since the keys are contiguous ints. */ PyObject **memo; - Py_ssize_t memo_size; + Py_ssize_t memo_size; /* Capacity of the memo array */ + Py_ssize_t memo_len; /* Number of objects in the memo */ PyObject *arg; PyObject *pers_func; /* persistent_load() method, can be NULL. */ @@ -362,7 +381,9 @@ char *input_line; Py_ssize_t input_len; Py_ssize_t next_read_idx; + Py_ssize_t frame_end_idx; Py_ssize_t prefetched_idx; /* index of first prefetched byte */ + PyObject *read; /* read() method of the input stream. */ PyObject *readline; /* readline() method of the input stream. */ PyObject *peek; /* peek() method of the input stream, or NULL */ @@ -380,6 +401,7 @@ int proto; /* Protocol of the pickle loaded. */ int fix_imports; /* Indicate whether Unpickler should fix the name of globals pickled by Python 2.x. 
*/ + int framing; /* True when framing is enabled, proto >= 4 */ } UnpicklerObject; /* Forward declarations */ @@ -673,15 +695,63 @@ if (self->output_buffer == NULL) return -1; self->output_len = 0; + self->frame_start = -1; return 0; } +static void +_Pickler_WriteFrameHeader(PicklerObject *self, char *qdata, size_t frame_len) +{ + qdata[0] = (unsigned char)FRAME; + qdata[1] = (unsigned char)(frame_len & 0xff); + qdata[2] = (unsigned char)((frame_len >> 8) & 0xff); + qdata[3] = (unsigned char)((frame_len >> 16) & 0xff); + qdata[4] = (unsigned char)((frame_len >> 24) & 0xff); + qdata[5] = (unsigned char)((frame_len >> 32) & 0xff); + qdata[6] = (unsigned char)((frame_len >> 40) & 0xff); + qdata[7] = (unsigned char)((frame_len >> 48) & 0xff); + qdata[8] = (unsigned char)((frame_len >> 56) & 0xff); +} + +static int +_Pickler_CommitFrame(PicklerObject *self) +{ + size_t frame_len; + char *qdata; + + if (!self->framing || self->frame_start == -1) + return 0; + frame_len = self->output_len - self->frame_start - FRAME_HEADER_SIZE; + qdata = PyBytes_AS_STRING(self->output_buffer) + self->frame_start; + _Pickler_WriteFrameHeader(self, qdata, frame_len); + self->frame_start = -1; + return 0; +} + +static int +_Pickler_OpcodeBoundary(PicklerObject *self) +{ + Py_ssize_t frame_len; + + if (!self->framing || self->frame_start == -1) + return 0; + frame_len = self->output_len - self->frame_start - FRAME_HEADER_SIZE; + if (frame_len >= FRAME_SIZE_TARGET) + return _Pickler_CommitFrame(self); + else + return 0; +} + static PyObject * _Pickler_GetString(PicklerObject *self) { PyObject *output_buffer = self->output_buffer; assert(self->output_buffer != NULL); + + if (_Pickler_CommitFrame(self)) + return NULL; + self->output_buffer = NULL; /* Resize down to exact size */ if (_PyBytes_Resize(&output_buffer, self->output_len) < 0) @@ -696,6 +766,7 @@ assert(self->write != NULL); + /* This will commit the frame first */ output = _Pickler_GetString(self); if (output == NULL) return -1; @@ -706,57 +777,93 @@ } static Py_ssize_t -_Pickler_Write(PicklerObject *self, const char *s, Py_ssize_t n) -{ - Py_ssize_t i, required; +_Pickler_Write(PicklerObject *self, const char *s, Py_ssize_t data_len) +{ + Py_ssize_t i, n, required; char *buffer; + int need_new_frame; assert(s != NULL); + need_new_frame = (self->framing && self->frame_start == -1); + + if (need_new_frame) + n = data_len + FRAME_HEADER_SIZE; + else + n = data_len; required = self->output_len + n; - if (required > self->max_output_len) { - if (self->write != NULL && required > MAX_WRITE_BUF_SIZE) { - /* XXX This reallocates a new buffer every time, which is a bit - wasteful. */ - if (_Pickler_FlushToFile(self) < 0) - return -1; - if (_Pickler_ClearBuffer(self) < 0) - return -1; - } - if (self->write != NULL && n > MAX_WRITE_BUF_SIZE) { - /* we already flushed above, so the buffer is empty */ - PyObject *result; - /* XXX we could spare an intermediate copy and pass - a memoryview instead */ - PyObject *output = PyBytes_FromStringAndSize(s, n); - if (s == NULL) + if (self->write != NULL && required > MAX_WRITE_BUF_SIZE) { + /* XXX This reallocates a new buffer every time, which is a bit + wasteful. 
*/ + if (_Pickler_FlushToFile(self) < 0) + return -1; + if (_Pickler_ClearBuffer(self) < 0) + return -1; + /* The previous frame was just committed by _Pickler_FlushToFile */ + need_new_frame = self->framing; + if (need_new_frame) + n = data_len + FRAME_HEADER_SIZE; + else + n = data_len; + required = self->output_len + n; + } + if (self->write != NULL && n > MAX_WRITE_BUF_SIZE) { + /* For large pickle chunks, we write directly to the output + file instead of buffering. Note the buffer is empty at this + point (it was flushed above, since required >= n). */ + PyObject *output, *result; + if (need_new_frame) { + char frame_header[FRAME_HEADER_SIZE]; + _Pickler_WriteFrameHeader(self, frame_header, (size_t) data_len); + output = PyBytes_FromStringAndSize(frame_header, FRAME_HEADER_SIZE); + if (output == NULL) return -1; result = _Pickler_FastCall(self, self->write, output); Py_XDECREF(result); - return (result == NULL) ? -1 : 0; - } - else { - if (self->output_len >= PY_SSIZE_T_MAX / 2 - n) { - PyErr_NoMemory(); - return -1; - } - self->max_output_len = (self->output_len + n) / 2 * 3; - if (_PyBytes_Resize(&self->output_buffer, self->max_output_len) < 0) + if (result == NULL) return -1; } + /* XXX we could spare an intermediate copy and pass + a memoryview instead */ + output = PyBytes_FromStringAndSize(s, data_len); + if (output == NULL) + return -1; + result = _Pickler_FastCall(self, self->write, output); + Py_XDECREF(result); + return (result == NULL) ? -1 : 0; + } + if (required > self->max_output_len) { + /* Make place in buffer for the pickle chunk */ + if (self->output_len >= PY_SSIZE_T_MAX / 2 - n) { + PyErr_NoMemory(); + return -1; + } + self->max_output_len = (self->output_len + n) / 2 * 3; + if (_PyBytes_Resize(&self->output_buffer, self->max_output_len) < 0) + return -1; } buffer = PyBytes_AS_STRING(self->output_buffer); - if (n < 8) { + if (need_new_frame) { + /* Setup new frame */ + Py_ssize_t frame_start = self->output_len; + self->frame_start = frame_start; + for (i = 0; i < FRAME_HEADER_SIZE; i++) { + /* Write an invalid value, for debugging */ + buffer[frame_start + i] = 0xFE; + } + self->output_len += FRAME_HEADER_SIZE; + } + if (data_len < 8) { /* This is faster than memcpy when the string is short. */ - for (i = 0; i < n; i++) { + for (i = 0; i < data_len; i++) { buffer[self->output_len + i] = s[i]; } } else { - memcpy(buffer + self->output_len, s, n); - } - self->output_len += n; - return n; + memcpy(buffer + self->output_len, s, data_len); + } + self->output_len += data_len; + return data_len; } static PicklerObject * @@ -774,6 +881,8 @@ self->write = NULL; self->proto = 0; self->bin = 0; + self->framing = 0; + self->frame_start = -1; self->fast = 0; self->fast_nesting = 0; self->fix_imports = 0; @@ -868,6 +977,7 @@ self->input_buffer = self->buffer.buf; self->input_len = self->buffer.len; self->next_read_idx = 0; + self->frame_end_idx = -1; self->prefetched_idx = self->input_len; return self->input_len; } @@ -932,7 +1042,7 @@ return -1; /* Prefetch some data without advancing the file pointer, if possible */ - if (self->peek) { + if (self->peek && !self->framing) { PyObject *len, *prefetched; len = PyLong_FromSsize_t(PREFETCH); if (len == NULL) { @@ -980,7 +1090,7 @@ Returns -1 (with an exception set) on failure. On success, return the number of chars read. 
*/ static Py_ssize_t -_Unpickler_Read(UnpicklerObject *self, char **s, Py_ssize_t n) +_Unpickler_ReadUnframed(UnpicklerObject *self, char **s, Py_ssize_t n) { Py_ssize_t num_read; @@ -1006,6 +1116,67 @@ } static Py_ssize_t +_Unpickler_Read(UnpicklerObject *self, char **s, Py_ssize_t n) +{ + if (self->framing && + (self->frame_end_idx == -1 || + self->frame_end_idx <= self->next_read_idx)) { + /* Need to read new frame */ + char *dummy; + unsigned char *frame_start; + size_t frame_len; + if (_Unpickler_ReadUnframed(self, &dummy, FRAME_HEADER_SIZE) < 0) + return -1; + frame_start = (unsigned char *) dummy; + if (frame_start[0] != (unsigned char)FRAME) { + PyErr_Format(UnpicklingError, + "expected FRAME opcode, got 0x%x instead", + frame_start[0]); + return -1; + } + frame_len = (size_t) frame_start[1]; + frame_len |= (size_t) frame_start[2] << 8; + frame_len |= (size_t) frame_start[3] << 16; + frame_len |= (size_t) frame_start[4] << 24; +#if SIZEOF_SIZE_T >= 8 + frame_len |= (size_t) frame_start[5] << 32; + frame_len |= (size_t) frame_start[6] << 40; + frame_len |= (size_t) frame_start[7] << 48; + frame_len |= (size_t) frame_start[8] << 56; +#else + if (frame_start[5] || frame_start[6] || + frame_start[7] || frame_start[8]) { + PyErr_Format(PyExc_OverflowError, + "Frame size too large for 32-bit build"); + return -1; + } +#endif + if (frame_len > PY_SSIZE_T_MAX) { + PyErr_Format(UnpicklingError, "Invalid frame length"); + return -1; + } + if (frame_len < n) { + PyErr_Format(UnpicklingError, "Bad framing"); + return -1; + } + if (_Unpickler_ReadUnframed(self, &dummy /* unused */, + frame_len) < 0) + return -1; + /* Rewind to start of frame */ + self->frame_end_idx = self->next_read_idx; + self->next_read_idx -= frame_len; + } + if (self->framing) { + /* Check for bad input */ + if (n + self->next_read_idx > self->frame_end_idx) { + PyErr_Format(UnpicklingError, "Bad framing"); + return -1; + } + } + return _Unpickler_ReadUnframed(self, s, n); +} + +static Py_ssize_t _Unpickler_CopyLine(UnpicklerObject *self, char *line, Py_ssize_t len, char **result) { @@ -1102,7 +1273,12 @@ Py_INCREF(value); old_item = self->memo[idx]; self->memo[idx] = value; - Py_XDECREF(old_item); + if (old_item != NULL) { + Py_DECREF(old_item); + } + else { + self->memo_len++; + } return 0; } @@ -1150,6 +1326,7 @@ self->input_line = NULL; self->input_len = 0; self->next_read_idx = 0; + self->frame_end_idx = -1; self->prefetched_idx = 0; self->read = NULL; self->readline = NULL; @@ -1160,9 +1337,11 @@ self->num_marks = 0; self->marks_size = 0; self->proto = 0; + self->framing = 0; self->fix_imports = 0; memset(&self->buffer, 0, sizeof(Py_buffer)); self->memo_size = 32; + self->memo_len = 0; self->memo = _Unpickler_NewMemo(self->memo_size); self->stack = (Pdata *)Pdata_New(); @@ -1277,36 +1456,44 @@ static int memo_put(PicklerObject *self, PyObject *obj) { - Py_ssize_t x; char pdata[30]; Py_ssize_t len; - int status = 0; + Py_ssize_t idx; + + const char memoize_op = MEMOIZE; if (self->fast) return 0; - - x = PyMemoTable_Size(self->memo); - if (PyMemoTable_Set(self->memo, obj, x) < 0) - goto error; - - if (!self->bin) { + if (_Pickler_OpcodeBoundary(self)) + return -1; + + idx = PyMemoTable_Size(self->memo); + if (PyMemoTable_Set(self->memo, obj, idx) < 0) + return -1; + + if (self->proto >= 4) { + if (_Pickler_Write(self, &memoize_op, 1) < 0) + return -1; + return 0; + } + else if (!self->bin) { pdata[0] = PUT; PyOS_snprintf(pdata + 1, sizeof(pdata) - 1, - "%" PY_FORMAT_SIZE_T "d\n", x); + "%" PY_FORMAT_SIZE_T "d\n", idx); 
len = strlen(pdata); } else { - if (x < 256) { + if (idx < 256) { pdata[0] = BINPUT; - pdata[1] = (unsigned char)x; + pdata[1] = (unsigned char)idx; len = 2; } - else if (x <= 0xffffffffL) { + else if (idx <= 0xffffffffL) { pdata[0] = LONG_BINPUT; - pdata[1] = (unsigned char)(x & 0xff); - pdata[2] = (unsigned char)((x >> 8) & 0xff); - pdata[3] = (unsigned char)((x >> 16) & 0xff); - pdata[4] = (unsigned char)((x >> 24) & 0xff); + pdata[1] = (unsigned char)(idx & 0xff); + pdata[2] = (unsigned char)((idx >> 8) & 0xff); + pdata[3] = (unsigned char)((idx >> 16) & 0xff); + pdata[4] = (unsigned char)((idx >> 24) & 0xff); len = 5; } else { /* unlikely */ @@ -1315,57 +1502,94 @@ return -1; } } if (_Pickler_Write(self, pdata, len) < 0) - goto error; - - if (0) { - error: - status = -1; - } - - return status; + return -1; + + return 0; } static PyObject * -whichmodule(PyObject *global, PyObject *global_name) -{ - Py_ssize_t i, j; - static PyObject *module_str = NULL; - static PyObject *main_str = NULL; +getattribute(PyObject *obj, PyObject *name, int allow_qualname) { + PyObject *dotted_path; + Py_ssize_t i; + _Py_static_string(PyId_dot, "."); + _Py_static_string(PyId_locals, "<locals>"); + + dotted_path = PyUnicode_Split(name, _PyUnicode_FromId(&PyId_dot), -1); + if (dotted_path == NULL) { + return NULL; + } + assert(Py_SIZE(dotted_path) >= 1); + if (!allow_qualname && Py_SIZE(dotted_path) > 1) { + PyErr_Format(PyExc_AttributeError, + "Can't get qualified attribute %R on %R; " + "use protocols >= 4 to enable support", + name, obj); + Py_DECREF(dotted_path); + return NULL; + } + Py_INCREF(obj); + for (i = 0; i < Py_SIZE(dotted_path); i++) { + PyObject *subpath = PyList_GET_ITEM(dotted_path, i); + PyObject *tmp; + PyObject *result = PyUnicode_RichCompare( + subpath, _PyUnicode_FromId(&PyId_locals), Py_EQ); + int is_equal = (result == Py_True); + assert(PyBool_Check(result)); + Py_DECREF(result); + if (is_equal) { + PyErr_Format(PyExc_AttributeError, + "Can't get local attribute %R on %R", name, obj); + Py_DECREF(dotted_path); + Py_DECREF(obj); + return NULL; + } + tmp = PyObject_GetAttr(obj, subpath); + Py_DECREF(obj); + if (tmp == NULL) { + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { + PyErr_Clear(); + PyErr_Format(PyExc_AttributeError, + "Can't get attribute %R on %R", name, obj); + } + Py_DECREF(dotted_path); + return NULL; + } + obj = tmp; + } + Py_DECREF(dotted_path); + return obj; +} + +static PyObject * +whichmodule(PyObject *global, PyObject *global_name, int allow_qualname) +{ PyObject *module_name; PyObject *modules_dict; PyObject *module; PyObject *obj; - - if (module_str == NULL) { - module_str = PyUnicode_InternFromString("__module__"); - if (module_str == NULL) + Py_ssize_t i, j; + _Py_IDENTIFIER(__module__); + _Py_IDENTIFIER(modules); + _Py_IDENTIFIER(__main__); + + module_name = _PyObject_GetAttrId(global, &PyId___module__); + + if (module_name == NULL) { + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) return NULL; - main_str = PyUnicode_InternFromString("__main__"); - if (main_str == NULL) - return NULL; - } - - module_name = PyObject_GetAttr(global, module_str); - - /* In some rare cases (e.g., bound methods of extension types), - __module__ can be None. If it is so, then search sys.modules - for the module of global. 
*/ - if (module_name == Py_None) { - Py_DECREF(module_name); - goto search; - } - - if (module_name) { - return module_name; - } - if (PyErr_ExceptionMatches(PyExc_AttributeError)) PyErr_Clear(); - else - return NULL; - - search: + } + else { + /* In some rare cases (e.g., bound methods of extension types), + __module__ can be None. If it is so, then search sys.modules for + the module of global. */ + if (module_name != Py_None) + return module_name; + Py_CLEAR(module_name); + } + assert(module_name == NULL); + modules_dict = _PySys_GetObjectId(&PyId_modules); if (modules_dict == NULL) { PyErr_SetString(PyExc_RuntimeError, "unable to get sys.modules"); @@ -1373,34 +1597,35 @@ } i = 0; - module_name = NULL; while ((j = PyDict_Next(modules_dict, &i, &module_name, &module))) { - if (PyObject_RichCompareBool(module_name, main_str, Py_EQ) == 1) + PyObject *result = PyUnicode_RichCompare( + module_name, _PyUnicode_FromId(&PyId___main__), Py_EQ); + int is_equal = (result == Py_True); + assert(PyBool_Check(result)); + Py_DECREF(result); + if (is_equal) continue; - - obj = PyObject_GetAttr(module, global_name); + if (module == Py_None) + continue; + + obj = getattribute(module, global_name, allow_qualname); if (obj == NULL) { - if (PyErr_ExceptionMatches(PyExc_AttributeError)) - PyErr_Clear(); - else + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) return NULL; + PyErr_Clear(); continue; } - if (obj != global) { + if (obj == global) { Py_DECREF(obj); - continue; + Py_INCREF(module_name); + return module_name; } - Py_DECREF(obj); - break; } /* If no module is found, use __main__. */ - if (!j) { - module_name = main_str; - } - + module_name = _PyUnicode_FromId(&PyId___main__); Py_INCREF(module_name); return module_name; } @@ -1744,22 +1969,17 @@ reduce_value = Py_BuildValue("(O())", (PyObject*)&PyBytes_Type); } else { - static PyObject *latin1 = NULL; PyObject *unicode_str = PyUnicode_DecodeLatin1(PyBytes_AS_STRING(obj), PyBytes_GET_SIZE(obj), "strict"); + _Py_IDENTIFIER(latin1); + if (unicode_str == NULL) return -1; - if (latin1 == NULL) { - latin1 = PyUnicode_InternFromString("latin1"); - if (latin1 == NULL) { - Py_DECREF(unicode_str); - return -1; - } - } reduce_value = Py_BuildValue("(O(OO))", - codecs_encode, unicode_str, latin1); + codecs_encode, unicode_str, + _PyUnicode_FromId(&PyId_latin1)); Py_DECREF(unicode_str); } @@ -1773,14 +1993,14 @@ } else { Py_ssize_t size; - char header[5]; + char header[9]; Py_ssize_t len; size = PyBytes_GET_SIZE(obj); if (size < 0) return -1; - if (size < 256) { + if (size <= 0xff) { header[0] = SHORT_BINBYTES; header[1] = (unsigned char)size; len = 2; } @@ -1793,6 +2013,14 @@ header[4] = (unsigned char)((size >> 24) & 0xff); len = 5; } + else if (self->proto >= 4) { + int i; + header[0] = BINBYTES8; + for (i = 0; i < 8; i++) { + header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + } + len = 9; + } else { PyErr_SetString(PyExc_OverflowError, "cannot serialize a bytes object larger than 4 GiB"); @@ -1882,26 +2110,39 @@ static int write_utf8(PicklerObject *self, char *data, Py_ssize_t size) { - char pdata[5]; - -#if SIZEOF_SIZE_T > 4 - if (size > 0xffffffffUL) { - /* string too large */ + char header[9]; + Py_ssize_t len; + + if (size <= 0xff && self->proto >= 4) { + header[0] = SHORT_BINUNICODE; + header[1] = (unsigned char)(size & 0xff); + len = 2; + } + else if (size <= 0xffffffffUL) { + header[0] = BINUNICODE; + header[1] = (unsigned char)(size & 0xff); + header[2] = (unsigned char)((size >> 8) & 0xff); + header[3] = (unsigned char)((size >> 16) & 
0xff); + header[4] = (unsigned char)((size >> 24) & 0xff); + len = 5; + } + else if (self->proto >= 4) { + int i; + + header[0] = BINUNICODE8; + for (i = 0; i < 8; i++) { + header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + } + len = 9; + } + else { PyErr_SetString(PyExc_OverflowError, "cannot serialize a string larger than 4GiB"); return -1; } -#endif - - pdata[0] = BINUNICODE; - pdata[1] = (unsigned char)(size & 0xff); - pdata[2] = (unsigned char)((size >> 8) & 0xff); - pdata[3] = (unsigned char)((size >> 16) & 0xff); - pdata[4] = (unsigned char)((size >> 24) & 0xff); - - if (_Pickler_Write(self, pdata, sizeof(pdata)) < 0) - return -1; - + + if (_Pickler_Write(self, header, len) < 0) + return -1; if (_Pickler_Write(self, data, size) < 0) return -1; @@ -2598,6 +2839,214 @@ } static int +save_set(PicklerObject *self, PyObject *obj) +{ + PyObject *item; + int i; + Py_ssize_t set_size, ppos = 0; + Py_hash_t hash; + + const char empty_set_op = EMPTY_SET; + const char mark_op = MARK; + const char additems_op = ADDITEMS; + + if (self->proto < 4) { + PyObject *items; + PyObject *reduce_value; + int status; + + items = PySequence_List(obj); + if (items == NULL) { + return -1; + } + reduce_value = Py_BuildValue("(O(O))", (PyObject*)&PySet_Type, items); + Py_DECREF(items); + if (reduce_value == NULL) { + return -1; + } + /* save_reduce() will memoize the object automatically. */ + status = save_reduce(self, reduce_value, obj); + Py_DECREF(reduce_value); + return status; + } + + if (_Pickler_Write(self, &empty_set_op, 1) < 0) + return -1; + + if (memo_put(self, obj) < 0) + return -1; + + set_size = PySet_GET_SIZE(obj); + if (set_size == 0) + return 0; /* nothing to do */ + + /* Write in batches of BATCHSIZE. */ + do { + i = 0; + if (_Pickler_Write(self, &mark_op, 1) < 0) + return -1; + while (_PySet_NextEntry(obj, &ppos, &item, &hash)) { + if (save(self, item, 0) < 0) + return -1; + if (++i == BATCHSIZE) + break; + } + if (_Pickler_Write(self, &additems_op, 1) < 0) + return -1; + if (PySet_GET_SIZE(obj) != set_size) { + PyErr_Format( + PyExc_RuntimeError, + "set changed size during iteration"); + return -1; + } + } while (i == BATCHSIZE); + + return 0; +} + +static int +save_frozenset(PicklerObject *self, PyObject *obj) +{ + PyObject *iter; + + const char mark_op = MARK; + const char frozenset_op = FROZENSET; + + if (self->fast && !fast_save_enter(self, obj)) + return -1; + + if (self->proto < 4) { + PyObject *items; + PyObject *reduce_value; + int status; + + items = PySequence_List(obj); + if (items == NULL) { + return -1; + } + reduce_value = Py_BuildValue("(O(O))", (PyObject*)&PyFrozenSet_Type, + items); + Py_DECREF(items); + if (reduce_value == NULL) { + return -1; + } + /* save_reduce() will memoize the object automatically. */ + status = save_reduce(self, reduce_value, obj); + Py_DECREF(reduce_value); + return status; + } + + if (_Pickler_Write(self, &mark_op, 1) < 0) + return -1; + + iter = PyObject_GetIter(obj); + for (;;) { + PyObject *item; + + item = PyIter_Next(iter); + if (item == NULL) { + if (PyErr_Occurred()) { + Py_DECREF(iter); + return -1; + } + break; + } + if (save(self, item, 0) < 0) { + Py_DECREF(item); + Py_DECREF(iter); + return -1; + } + Py_DECREF(item); + } + Py_DECREF(iter); + + /* If the object is already in the memo, this means it is + recursive. In this case, throw away everything we put on the + stack, and fetch the object back from the memo. 
*/ + if (PyMemoTable_Get(self->memo, obj)) { + const char pop_mark_op = POP_MARK; + + if (_Pickler_Write(self, &pop_mark_op, 1) < 0) + return -1; + if (memo_get(self, obj) < 0) + return -1; + return 0; + } + + if (_Pickler_Write(self, &frozenset_op, 1) < 0) + return -1; + if (memo_put(self, obj) < 0) + return -1; + + return 0; +} + +static int +fix_imports(PyObject **module_name, PyObject **global_name) +{ + PyObject *key; + PyObject *item; + + key = PyTuple_Pack(2, *module_name, *global_name); + if (key == NULL) + return -1; + item = PyDict_GetItemWithError(name_mapping_3to2, key); + Py_DECREF(key); + if (item) { + PyObject *fixed_module_name; + PyObject *fixed_global_name; + + if (!PyTuple_Check(item) || PyTuple_GET_SIZE(item) != 2) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.REVERSE_NAME_MAPPING values " + "should be 2-tuples, not %.200s", + Py_TYPE(item)->tp_name); + return -1; + } + fixed_module_name = PyTuple_GET_ITEM(item, 0); + fixed_global_name = PyTuple_GET_ITEM(item, 1); + if (!PyUnicode_Check(fixed_module_name) || + !PyUnicode_Check(fixed_global_name)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.REVERSE_NAME_MAPPING values " + "should be pairs of str, not (%.200s, %.200s)", + Py_TYPE(fixed_module_name)->tp_name, + Py_TYPE(fixed_global_name)->tp_name); + return -1; + } + + Py_CLEAR(*module_name); + Py_CLEAR(*global_name); + Py_INCREF(fixed_module_name); + Py_INCREF(fixed_global_name); + *module_name = fixed_module_name; + *global_name = fixed_global_name; + } + else if (PyErr_Occurred()) { + return -1; + } + + item = PyDict_GetItemWithError(import_mapping_3to2, *module_name); + if (item) { + if (!PyUnicode_Check(item)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.REVERSE_IMPORT_MAPPING values " + "should be strings, not %.200s", + Py_TYPE(item)->tp_name); + return -1; + } + Py_CLEAR(*module_name); + Py_INCREF(item); + *module_name = item; + } + else if (PyErr_Occurred()) { + return -1; + } + + return 0; +} + +static int save_global(PicklerObject *self, PyObject *obj, PyObject *name) { PyObject *global_name = NULL; @@ -2605,20 +3054,32 @@ PyObject *module = NULL; PyObject *cls; int status = 0; + _Py_IDENTIFIER(__name__); + _Py_IDENTIFIER(__qualname__); const char global_op = GLOBAL; if (name) { + Py_INCREF(name); global_name = name; - Py_INCREF(global_name); } else { - global_name = _PyObject_GetAttrId(obj, &PyId___name__); - if (global_name == NULL) - goto error; - } - - module_name = whichmodule(obj, global_name); + if (self->proto >= 4) { + global_name = _PyObject_GetAttrId(obj, &PyId___qualname__); + if (global_name == NULL) { + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) + goto error; + PyErr_Clear(); + } + } + if (global_name == NULL) { + global_name = _PyObject_GetAttrId(obj, &PyId___name__); + if (global_name == NULL) + goto error; + } + } + + module_name = whichmodule(obj, global_name, self->proto >= 4); if (module_name == NULL) goto error; @@ -2637,11 +3098,11 @@ obj, module_name); goto error; } - cls = PyObject_GetAttr(module, global_name); + cls = getattribute(module, global_name, self->proto >= 4); if (cls == NULL) { PyErr_Format(PicklingError, - "Can't pickle %R: attribute lookup %S.%S failed", - obj, module_name, global_name); + "Can't pickle %R: attribute lookup %S on %S failed", + obj, global_name, module_name); goto error; } if (cls != obj) { @@ -2715,120 +3176,82 @@ goto error; } else { - /* Generate a normal global opcode if we are using a pickle - protocol <= 2, or if the object is not registered in the - extension 
registry. */ - PyObject *encoded; - PyObject *(*unicode_encoder)(PyObject *); - gen_global: - if (_Pickler_Write(self, &global_op, 1) < 0) - goto error; - - /* Since Python 3.0 now supports non-ASCII identifiers, we encode both - the module name and the global name using UTF-8. We do so only when - we are using the pickle protocol newer than version 3. This is to - ensure compatibility with older Unpickler running on Python 2.x. */ - if (self->proto >= 3) { - unicode_encoder = PyUnicode_AsUTF8String; + if (self->proto >= 4) { + const char stack_global_op = STACK_GLOBAL; + + save(self, module_name, 0); + save(self, global_name, 0); + + if (_Pickler_Write(self, &stack_global_op, 1) < 0) + goto error; } else { - unicode_encoder = PyUnicode_AsASCIIString; - } - - /* For protocol < 3 and if the user didn't request against doing so, - we convert module names to the old 2.x module names. */ - if (self->fix_imports) { - PyObject *key; - PyObject *item; - - key = PyTuple_Pack(2, module_name, global_name); - if (key == NULL) + /* Generate a normal global opcode if we are using a pickle + protocol < 4, or if the object is not registered in the + extension registry. */ + PyObject *encoded; + PyObject *(*unicode_encoder)(PyObject *); + + if (_Pickler_Write(self, &global_op, 1) < 0) goto error; - item = PyDict_GetItemWithError(name_mapping_3to2, key); - Py_DECREF(key); - if (item) { - if (!PyTuple_Check(item) || PyTuple_GET_SIZE(item) != 2) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.REVERSE_NAME_MAPPING values " - "should be 2-tuples, not %.200s", - Py_TYPE(item)->tp_name); + + /* For protocol < 3 and if the user didn't request against doing + so, we convert module names to the old 2.x module names. */ + if (self->proto < 3 && self->fix_imports) { + if (fix_imports(&module_name, &global_name) < 0) { goto error; } - Py_CLEAR(module_name); - Py_CLEAR(global_name); - module_name = PyTuple_GET_ITEM(item, 0); - global_name = PyTuple_GET_ITEM(item, 1); - if (!PyUnicode_Check(module_name) || - !PyUnicode_Check(global_name)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.REVERSE_NAME_MAPPING values " - "should be pairs of str, not (%.200s, %.200s)", - Py_TYPE(module_name)->tp_name, - Py_TYPE(global_name)->tp_name); - goto error; - } - Py_INCREF(module_name); - Py_INCREF(global_name); } - else if (PyErr_Occurred()) { + + /* Since Python 3.0 now supports non-ASCII identifiers, we encode + both the module name and the global name using UTF-8. We do so + only when we are using the pickle protocol newer than version + 3. This is to ensure compatibility with older Unpickler running + on Python 2.x. 
*/ + if (self->proto == 3) { + unicode_encoder = PyUnicode_AsUTF8String; + } + else { + unicode_encoder = PyUnicode_AsASCIIString; + } + encoded = unicode_encoder(module_name); + if (encoded == NULL) { + if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) + PyErr_Format(PicklingError, + "can't pickle module identifier '%S' using " + "pickle protocol %i", + module_name, self->proto); goto error; } - - item = PyDict_GetItemWithError(import_mapping_3to2, module_name); - if (item) { - if (!PyUnicode_Check(item)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.REVERSE_IMPORT_MAPPING values " - "should be strings, not %.200s", - Py_TYPE(item)->tp_name); - goto error; - } - Py_CLEAR(module_name); - module_name = item; - Py_INCREF(module_name); - } - else if (PyErr_Occurred()) { + if (_Pickler_Write(self, PyBytes_AS_STRING(encoded), + PyBytes_GET_SIZE(encoded)) < 0) { + Py_DECREF(encoded); goto error; } + Py_DECREF(encoded); + if(_Pickler_Write(self, "\n", 1) < 0) + goto error; + + /* Save the name of the module. */ + encoded = unicode_encoder(global_name); + if (encoded == NULL) { + if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) + PyErr_Format(PicklingError, + "can't pickle global identifier '%S' using " + "pickle protocol %i", + global_name, self->proto); + goto error; + } + if (_Pickler_Write(self, PyBytes_AS_STRING(encoded), + PyBytes_GET_SIZE(encoded)) < 0) { + Py_DECREF(encoded); + goto error; + } + Py_DECREF(encoded); + if (_Pickler_Write(self, "\n", 1) < 0) + goto error; } - - /* Save the name of the module. */ - encoded = unicode_encoder(module_name); - if (encoded == NULL) { - if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) - PyErr_Format(PicklingError, - "can't pickle module identifier '%S' using " - "pickle protocol %i", module_name, self->proto); - goto error; - } - if (_Pickler_Write(self, PyBytes_AS_STRING(encoded), - PyBytes_GET_SIZE(encoded)) < 0) { - Py_DECREF(encoded); - goto error; - } - Py_DECREF(encoded); - if(_Pickler_Write(self, "\n", 1) < 0) - goto error; - - /* Save the name of the module. */ - encoded = unicode_encoder(global_name); - if (encoded == NULL) { - if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) - PyErr_Format(PicklingError, - "can't pickle global identifier '%S' using " - "pickle protocol %i", global_name, self->proto); - goto error; - } - if (_Pickler_Write(self, PyBytes_AS_STRING(encoded), - PyBytes_GET_SIZE(encoded)) < 0) { - Py_DECREF(encoded); - goto error; - } - Py_DECREF(encoded); - if(_Pickler_Write(self, "\n", 1) < 0) - goto error; - /* Memoize the object. */ if (memo_put(self, obj) < 0) goto error; @@ -2927,14 +3350,9 @@ get_class(PyObject *obj) { PyObject *cls; - static PyObject *str_class; - - if (str_class == NULL) { - str_class = PyUnicode_InternFromString("__class__"); - if (str_class == NULL) - return NULL; - } - cls = PyObject_GetAttr(obj, str_class); + _Py_IDENTIFIER(__class__); + + cls = _PyObject_GetAttrId(obj, &PyId___class__); if (cls == NULL) { if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); @@ -2957,12 +3375,12 @@ PyObject *listitems = Py_None; PyObject *dictitems = Py_None; Py_ssize_t size; - - int use_newobj = self->proto >= 2; + int use_newobj = 0, use_newobj_ex = 0; const char reduce_op = REDUCE; const char build_op = BUILD; const char newobj_op = NEWOBJ; + const char newobj_ex_op = NEWOBJ_EX; size = PyTuple_Size(args); if (size < 2 || size > 5) { @@ -3007,33 +3425,75 @@ return -1; } - /* Protocol 2 special case: if callable's name is __newobj__, use - NEWOBJ. 
*/ - if (use_newobj) { - static PyObject *newobj_str = NULL; + if (self->proto >= 2) { PyObject *name; - - if (newobj_str == NULL) { - newobj_str = PyUnicode_InternFromString("__newobj__"); - if (newobj_str == NULL) - return -1; - } + _Py_IDENTIFIER(__name__); name = _PyObject_GetAttrId(callable, &PyId___name__); if (name == NULL) { - if (PyErr_ExceptionMatches(PyExc_AttributeError)) - PyErr_Clear(); - else + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { return -1; - use_newobj = 0; + } + PyErr_Clear(); + } + else if (self->proto >= 4) { + _Py_IDENTIFIER(__newobj_ex__); + use_newobj_ex = PyUnicode_Check(name) && + PyUnicode_Compare( + name, _PyUnicode_FromId(&PyId___newobj_ex__)) == 0; + Py_DECREF(name); } else { + _Py_IDENTIFIER(__newobj__); use_newobj = PyUnicode_Check(name) && - PyUnicode_Compare(name, newobj_str) == 0; + PyUnicode_Compare( + name, _PyUnicode_FromId(&PyId___newobj__)) == 0; Py_DECREF(name); } } - if (use_newobj) { + + if (use_newobj_ex) { + PyObject *cls; + PyObject *args; + PyObject *kwargs; + + if (Py_SIZE(argtup) != 3) { + PyErr_Format(PicklingError, + "length of the NEWOBJ_EX argument tuple must be " + "exactly 3, not %zd", Py_SIZE(argtup)); + return -1; + } + + cls = PyTuple_GET_ITEM(argtup, 0); + if (!PyType_Check(cls)) { + PyErr_Format(PicklingError, + "first item from NEWOBJ_EX argument tuple must " + "be a class, not %.200s", Py_TYPE(cls)->tp_name); + return -1; + } + args = PyTuple_GET_ITEM(argtup, 1); + if (!PyTuple_Check(args)) { + PyErr_Format(PicklingError, + "second item from NEWOBJ_EX argument tuple must " + "be a tuple, not %.200s", Py_TYPE(args)->tp_name); + return -1; + } + kwargs = PyTuple_GET_ITEM(argtup, 2); + if (!PyDict_Check(kwargs)) { + PyErr_Format(PicklingError, + "third item from NEWOBJ_EX argument tuple must " + "be a dict, not %.200s", Py_TYPE(kwargs)->tp_name); + return -1; + } + + if (save(self, cls, 0) < 0 || + save(self, args, 0) < 0 || + save(self, kwargs, 0) < 0 || + _Pickler_Write(self, &newobj_ex_op, 1) < 0) { + return -1; + } + } + else if (use_newobj) { PyObject *cls; PyObject *newargtup; PyObject *obj_class; @@ -3117,8 +3577,23 @@ the caller do not want to memoize the object. Not particularly useful, but that is to mimic the behavior save_reduce() in pickle.py when obj is None. */ - if (obj && memo_put(self, obj) < 0) - return -1; + if (obj != NULL) { + /* If the object is already in the memo, this means it is + recursive. In this case, throw away everything we put on the + stack, and fetch the object back from the memo. 
*/ + if (PyMemoTable_Get(self->memo, obj)) { + const char pop_op = POP; + + if (_Pickler_Write(self, &pop_op, 1) < 0) + return -1; + if (memo_get(self, obj) < 0) + return -1; + + return 0; + } + else if (memo_put(self, obj) < 0) + return -1; + } if (listitems && batch_list(self, listitems) < 0) return -1; @@ -3136,6 +3611,34 @@ } static int +save_method(PicklerObject *self, PyObject *obj) +{ + PyObject *method_self = PyCFunction_GET_SELF(obj); + + if (method_self == NULL || PyModule_Check(method_self)) { + return save_global(self, obj, NULL); + } + else { + PyObject *builtins; + PyObject *getattr; + PyObject *reduce_value; + int status = -1; + _Py_IDENTIFIER(getattr); + + builtins = PyEval_GetBuiltins(); + getattr = _PyDict_GetItemId(builtins, &PyId_getattr); + reduce_value = \ + Py_BuildValue("O(Os)", getattr, method_self, + ((PyCFunctionObject *)obj)->m_ml->ml_name); + if (reduce_value != NULL) { + status = save_reduce(self, reduce_value, obj); + Py_DECREF(reduce_value); + } + return status; + } +} + +static int save(PicklerObject *self, PyObject *obj, int pers_save) { PyTypeObject *type; @@ -3213,6 +3716,14 @@ status = save_dict(self, obj); goto done; } + else if (type == &PySet_Type) { + status = save_set(self, obj); + goto done; + } + else if (type == &PyFrozenSet_Type) { + status = save_frozenset(self, obj); + goto done; + } else if (type == &PyList_Type) { status = save_list(self, obj); goto done; @@ -3236,7 +3747,7 @@ } } else if (type == &PyCFunction_Type) { - status = save_global(self, obj, NULL); + status = save_method(self, obj); goto done; } @@ -3269,18 +3780,9 @@ goto done; } else { - static PyObject *reduce_str = NULL; - static PyObject *reduce_ex_str = NULL; - - /* Cache the name of the reduce methods. */ - if (reduce_str == NULL) { - reduce_str = PyUnicode_InternFromString("__reduce__"); - if (reduce_str == NULL) - goto error; - reduce_ex_str = PyUnicode_InternFromString("__reduce_ex__"); - if (reduce_ex_str == NULL) - goto error; - } + _Py_IDENTIFIER(__reduce__); + _Py_IDENTIFIER(__reduce_ex__); + /* XXX: If the __reduce__ method is defined, __reduce_ex__ is automatically defined as __reduce__. While this is convenient, this @@ -3291,7 +3793,7 @@ don't actually have to check for a __reduce__ method. */ /* Check for a __reduce_ex__ method. */ - reduce_func = PyObject_GetAttr(obj, reduce_ex_str); + reduce_func = _PyObject_GetAttrId(obj, &PyId___reduce_ex__); if (reduce_func != NULL) { PyObject *proto; proto = PyLong_FromLong(self->proto); @@ -3305,7 +3807,7 @@ else goto error; /* Check for a __reduce__ method. */ - reduce_func = PyObject_GetAttr(obj, reduce_str); + reduce_func = _PyObject_GetAttrId(obj, &PyId___reduce__); if (reduce_func != NULL) { reduce_value = PyObject_Call(reduce_func, empty_tuple, NULL); } @@ -3338,6 +3840,8 @@ status = -1; } done: + if (status == 0) + status = _Pickler_OpcodeBoundary(self); Py_LeaveRecursiveCall(); Py_XDECREF(reduce_func); Py_XDECREF(reduce_value); @@ -3358,6 +3862,8 @@ header[1] = (unsigned char)self->proto; if (_Pickler_Write(self, header, 2) < 0) return -1; + if (self->proto >= 4) + self->framing = 1; } if (save(self, obj, 0) < 0 || @@ -3478,9 +3984,9 @@ "This takes a binary file for writing a pickle data stream.\n" "\n" "The optional protocol argument tells the pickler to use the\n" -"given protocol; supported protocols are 0, 1, 2, 3. The default\n" -"protocol is 3; a backward-incompatible protocol designed for\n" -"Python 3.0.\n" +"given protocol; supported protocols are 0, 1, 2, 3 and 4. 
The\n" +"default protocol is 3; a backward-incompatible protocol designed for\n" +"Python 3.\n" "\n" "Specifying a negative protocol version selects the highest\n" "protocol version supported. The higher the protocol used, the\n" @@ -3493,8 +3999,8 @@ "meets this interface.\n" "\n" "If fix_imports is True and protocol is less than 3, pickle will try to\n" -"map the new Python 3.x names to the old module names used in Python\n" -"2.x, so that the pickle data stream is readable with Python 2.x.\n"); +"map the new Python 3 names to the old module names used in Python 2,\n" +"so that the pickle data stream is readable with Python 2.\n"); static int Pickler_init(PicklerObject *self, PyObject *args, PyObject *kwds) @@ -3987,17 +4493,15 @@ * as a C Py_ssize_t, or -1 if it's higher than PY_SSIZE_T_MAX. */ static Py_ssize_t -calc_binsize(char *bytes, int size) +calc_binsize(char *bytes, int nbytes) { unsigned char *s = (unsigned char *)bytes; + int i; size_t x = 0; - assert(size == 4); - - x = (size_t) s[0]; - x |= (size_t) s[1] << 8; - x |= (size_t) s[2] << 16; - x |= (size_t) s[3] << 24; + for (i = 0; i < nbytes; i++) { + x |= (size_t) s[i] << (8 * i); + } if (x > PY_SSIZE_T_MAX) return -1; @@ -4011,21 +4515,21 @@ * of x-platform bugs. */ static long -calc_binint(char *bytes, int size) +calc_binint(char *bytes, int nbytes) { unsigned char *s = (unsigned char *)bytes; - int i = size; + int i; long x = 0; - for (i = 0; i < size; i++) { - x |= (long)s[i] << (i * 8); + for (i = 0; i < nbytes; i++) { + x |= (long)s[i] << (8 * i); } /* Unlike BININT1 and BININT2, BININT (more accurately BININT4) * is signed, so on a box with longs bigger than 4 bytes we need * to extend a BININT's sign bit to the full width. */ - if (SIZEOF_LONG > 4 && size == 4) { + if (SIZEOF_LONG > 4 && nbytes == 4) { x |= -(x & (1L << 31)); } @@ -4233,26 +4737,27 @@ } static int -load_binbytes(UnpicklerObject *self) +load_counted_binbytes(UnpicklerObject *self, int nbytes) { PyObject *bytes; - Py_ssize_t x; + Py_ssize_t size; char *s; - if (_Unpickler_Read(self, &s, 4) < 0) - return -1; - - x = calc_binsize(s, 4); - if (x < 0) { + if (_Unpickler_Read(self, &s, nbytes) < 0) + return -1; + + size = calc_binsize(s, nbytes); + if (size < 0) { PyErr_Format(PyExc_OverflowError, "BINBYTES exceeds system's maximum size of %zd bytes", PY_SSIZE_T_MAX); return -1; } - if (_Unpickler_Read(self, &s, x) < 0) - return -1; - bytes = PyBytes_FromStringAndSize(s, x); + if (_Unpickler_Read(self, &s, size) < 0) + return -1; + + bytes = PyBytes_FromStringAndSize(s, size); if (bytes == NULL) return -1; @@ -4261,74 +4766,27 @@ } static int -load_short_binbytes(UnpicklerObject *self) -{ - PyObject *bytes; - Py_ssize_t x; +load_counted_binstring(UnpicklerObject *self, int nbytes) +{ + PyObject *str; + Py_ssize_t size; char *s; - if (_Unpickler_Read(self, &s, 1) < 0) - return -1; - - x = (unsigned char)s[0]; - - if (_Unpickler_Read(self, &s, x) < 0) - return -1; - - bytes = PyBytes_FromStringAndSize(s, x); - if (bytes == NULL) - return -1; - - PDATA_PUSH(self->stack, bytes, -1); - return 0; -} - -static int -load_binstring(UnpicklerObject *self) -{ - PyObject *str; - Py_ssize_t x; - char *s; - - if (_Unpickler_Read(self, &s, 4) < 0) - return -1; - - x = calc_binint(s, 4); - if (x < 0) { - PyErr_SetString(UnpicklingError, - "BINSTRING pickle has negative byte count"); - return -1; - } - - if (_Unpickler_Read(self, &s, x) < 0) - return -1; - + if (_Unpickler_Read(self, &s, nbytes) < 0) + return -1; + + size = calc_binsize(s, nbytes); + if (size < 0) { + 
PyErr_Format(UnpicklingError, + "BINSTRING exceeds system's maximum size of %zd bytes", + PY_SSIZE_T_MAX); + return -1; + } + + if (_Unpickler_Read(self, &s, size) < 0) + return -1; /* Convert Python 2.x strings to unicode. */ - str = PyUnicode_Decode(s, x, self->encoding, self->errors); - if (str == NULL) - return -1; - - PDATA_PUSH(self->stack, str, -1); - return 0; -} - -static int -load_short_binstring(UnpicklerObject *self) -{ - PyObject *str; - Py_ssize_t x; - char *s; - - if (_Unpickler_Read(self, &s, 1) < 0) - return -1; - - x = (unsigned char)s[0]; - - if (_Unpickler_Read(self, &s, x) < 0) - return -1; - - /* Convert Python 2.x strings to unicode. */ - str = PyUnicode_Decode(s, x, self->encoding, self->errors); + str = PyUnicode_Decode(s, size, self->encoding, self->errors); if (str == NULL) return -1; @@ -4357,16 +4815,16 @@ } static int -load_binunicode(UnpicklerObject *self) +load_counted_binunicode(UnpicklerObject *self, int nbytes) { PyObject *str; Py_ssize_t size; char *s; - if (_Unpickler_Read(self, &s, 4) < 0) - return -1; - - size = calc_binsize(s, 4); + if (_Unpickler_Read(self, &s, nbytes) < 0) + return -1; + + size = calc_binsize(s, nbytes); if (size < 0) { PyErr_Format(PyExc_OverflowError, "BINUNICODE exceeds system's maximum size of %zd bytes", @@ -4374,7 +4832,6 @@ return -1; } - if (_Unpickler_Read(self, &s, size) < 0) return -1; @@ -4446,6 +4903,17 @@ } static int +load_empty_set(UnpicklerObject *self) +{ + PyObject *set; + + if ((set = PySet_New(NULL)) == NULL) + return -1; + PDATA_PUSH(self->stack, set, -1); + return 0; +} + +static int load_list(UnpicklerObject *self) { PyObject *list; @@ -4487,6 +4955,29 @@ return 0; } +static int +load_frozenset(UnpicklerObject *self) +{ + PyObject *items; + PyObject *frozenset; + Py_ssize_t i; + + if ((i = marker(self)) < 0) + return -1; + + items = Pdata_poptuple(self->stack, i); + if (items == NULL) + return -1; + + frozenset = PyFrozenSet_New(items); + Py_DECREF(items); + if (frozenset == NULL) + return -1; + + PDATA_PUSH(self->stack, frozenset, -1); + return 0; +} + static PyObject * instantiate(PyObject *cls, PyObject *args) { @@ -4638,6 +5129,57 @@ } static int +load_newobj_ex(UnpicklerObject *self) +{ + PyObject *cls, *args, *kwargs; + PyObject *obj; + + PDATA_POP(self->stack, kwargs); + if (kwargs == NULL) { + return -1; + } + PDATA_POP(self->stack, args); + if (args == NULL) { + Py_DECREF(kwargs); + return -1; + } + PDATA_POP(self->stack, cls); + if (cls == NULL) { + Py_DECREF(kwargs); + Py_DECREF(args); + return -1; + } + + if (!PyType_Check(cls)) { + Py_DECREF(kwargs); + Py_DECREF(args); + Py_DECREF(cls); + PyErr_Format(UnpicklingError, + "NEWOBJ_EX class argument must be a type, not %.200s", + Py_TYPE(cls)->tp_name); + return -1; + } + + if (((PyTypeObject *)cls)->tp_new == NULL) { + Py_DECREF(kwargs); + Py_DECREF(args); + Py_DECREF(cls); + PyErr_SetString(UnpicklingError, + "NEWOBJ_EX class argument doesn't have __new__"); + return -1; + } + obj = ((PyTypeObject *)cls)->tp_new((PyTypeObject *)cls, args, kwargs); + Py_DECREF(kwargs); + Py_DECREF(args); + Py_DECREF(cls); + if (obj == NULL) { + return -1; + } + PDATA_PUSH(self->stack, obj, -1); + return 0; +} + +static int load_global(UnpicklerObject *self) { PyObject *global = NULL; @@ -4674,6 +5216,31 @@ } static int +load_stack_global(UnpicklerObject *self) +{ + PyObject *global; + PyObject *module_name; + PyObject *global_name; + + PDATA_POP(self->stack, global_name); + PDATA_POP(self->stack, module_name); + if (module_name == NULL || 
!PyUnicode_CheckExact(module_name) || + global_name == NULL || !PyUnicode_CheckExact(global_name)) { + PyErr_SetString(UnpicklingError, "STACK_GLOBAL requires str"); + Py_XDECREF(global_name); + Py_XDECREF(module_name); + return -1; + } + global = find_class(self, module_name, global_name); + Py_DECREF(global_name); + Py_DECREF(module_name); + if (global == NULL) + return -1; + PDATA_PUSH(self->stack, global, -1); + return 0; +} + +static int load_persid(UnpicklerObject *self) { PyObject *pid; @@ -5017,6 +5584,18 @@ } static int +load_memoize(UnpicklerObject *self) +{ + PyObject *value; + + if (Py_SIZE(self->stack) <= 0) + return stack_underflow(); + value = self->stack->data[Py_SIZE(self->stack) - 1]; + + return _Unpickler_MemoPut(self, self->memo_len, value); +} + +static int do_append(UnpicklerObject *self, Py_ssize_t x) { PyObject *value; @@ -5132,6 +5711,59 @@ } static int +load_additems(UnpicklerObject *self) +{ + PyObject *set; + Py_ssize_t mark, len, i; + + mark = marker(self); + len = Py_SIZE(self->stack); + if (mark > len || mark <= 0) + return stack_underflow(); + if (len == mark) /* nothing to do */ + return 0; + + set = self->stack->data[mark - 1]; + + if (PySet_Check(set)) { + PyObject *items; + int status; + + items = Pdata_poptuple(self->stack, mark); + if (items == NULL) + return -1; + + status = _PySet_Update(set, items); + Py_DECREF(items); + return status; + } + else { + PyObject *add_func; + _Py_IDENTIFIER(add); + + add_func = _PyObject_GetAttrId(set, &PyId_add); + if (add_func == NULL) + return -1; + for (i = mark; i < len; i++) { + PyObject *result; + PyObject *item; + + item = self->stack->data[i]; + result = _Unpickler_FastCall(self, add_func, item); + if (result == NULL) { + Pdata_clear(self->stack, i + 1); + Py_SIZE(self->stack) = mark; + return -1; + } + Py_DECREF(result); + } + Py_SIZE(self->stack) = mark; + } + + return 0; +} + +static int load_build(UnpicklerObject *self) { PyObject *state, *inst, *slotstate; @@ -5325,6 +5957,7 @@ i = (unsigned char)s[0]; if (i <= HIGHEST_PROTOCOL) { self->proto = i; + self->framing = (self->proto >= 4); return 0; } @@ -5340,6 +5973,8 @@ char *s; self->num_marks = 0; + self->proto = 0; + self->framing = 0; if (Py_SIZE(self->stack)) Pdata_clear(self->stack, 0); @@ -5365,13 +6000,16 @@ OP_ARG(LONG4, load_counted_long, 4) OP(FLOAT, load_float) OP(BINFLOAT, load_binfloat) - OP(BINBYTES, load_binbytes) - OP(SHORT_BINBYTES, load_short_binbytes) - OP(BINSTRING, load_binstring) - OP(SHORT_BINSTRING, load_short_binstring) + OP_ARG(SHORT_BINBYTES, load_counted_binbytes, 1) + OP_ARG(BINBYTES, load_counted_binbytes, 4) + OP_ARG(BINBYTES8, load_counted_binbytes, 8) + OP_ARG(SHORT_BINSTRING, load_counted_binstring, 1) + OP_ARG(BINSTRING, load_counted_binstring, 4) OP(STRING, load_string) OP(UNICODE, load_unicode) - OP(BINUNICODE, load_binunicode) + OP_ARG(SHORT_BINUNICODE, load_counted_binunicode, 1) + OP_ARG(BINUNICODE, load_counted_binunicode, 4) + OP_ARG(BINUNICODE8, load_counted_binunicode, 8) OP_ARG(EMPTY_TUPLE, load_counted_tuple, 0) OP_ARG(TUPLE1, load_counted_tuple, 1) OP_ARG(TUPLE2, load_counted_tuple, 2) @@ -5381,10 +6019,15 @@ OP(LIST, load_list) OP(EMPTY_DICT, load_empty_dict) OP(DICT, load_dict) + OP(EMPTY_SET, load_empty_set) + OP(ADDITEMS, load_additems) + OP(FROZENSET, load_frozenset) OP(OBJ, load_obj) OP(INST, load_inst) OP(NEWOBJ, load_newobj) + OP(NEWOBJ_EX, load_newobj_ex) OP(GLOBAL, load_global) + OP(STACK_GLOBAL, load_stack_global) OP(APPEND, load_append) OP(APPENDS, load_appends) OP(BUILD, load_build) @@ -5396,6 
+6039,7 @@ OP(BINPUT, load_binput) OP(LONG_BINPUT, load_long_binput) OP(PUT, load_put) + OP(MEMOIZE, load_memoize) OP(POP, load_pop) OP(POP_MARK, load_pop_mark) OP(SETITEM, load_setitem) @@ -5485,6 +6129,7 @@ PyObject *modules_dict; PyObject *module; PyObject *module_name, *global_name; + _Py_IDENTIFIER(modules); if (!PyArg_UnpackTuple(args, "find_class", 2, 2, &module_name, &global_name)) @@ -5556,11 +6201,11 @@ module = PyImport_Import(module_name); if (module == NULL) return NULL; - global = PyObject_GetAttr(module, global_name); + global = getattribute(module, global_name, self->proto >= 4); Py_DECREF(module); } else { - global = PyObject_GetAttr(module, global_name); + global = getattribute(module, global_name, self->proto >= 4); } return global; } @@ -5723,6 +6368,7 @@ self->arg = NULL; self->proto = 0; + self->framing = 0; return 0; } diff --git a/Objects/classobject.c b/Objects/classobject.c --- a/Objects/classobject.c +++ b/Objects/classobject.c @@ -69,6 +69,30 @@ return (PyObject *)im; } +static PyObject * +method_reduce(PyMethodObject *im) +{ + PyObject *self = PyMethod_GET_SELF(im); + PyObject *func = PyMethod_GET_FUNCTION(im); + PyObject *builtins; + PyObject *getattr; + PyObject *funcname; + _Py_IDENTIFIER(getattr); + + funcname = _PyObject_GetAttrId(func, &PyId___name__); + if (funcname == NULL) { + return NULL; + } + builtins = PyEval_GetBuiltins(); + getattr = _PyDict_GetItemId(builtins, &PyId_getattr); + return Py_BuildValue("O(ON)", getattr, self, funcname); +} + +static PyMethodDef method_methods[] = { + {"__reduce__", (PyCFunction)method_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + /* Descriptors for PyMethod attributes */ /* im_func and im_self are stored in the PyMethod object */ @@ -367,7 +391,7 @@ offsetof(PyMethodObject, im_weakreflist), /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + method_methods, /* tp_methods */ method_memberlist, /* tp_members */ method_getset, /* tp_getset */ 0, /* tp_base */ diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -398,6 +398,24 @@ return descr->d_qualname; } +static PyObject * +descr_reduce(PyDescrObject *descr) +{ + PyObject *builtins; + PyObject *getattr; + _Py_IDENTIFIER(getattr); + + builtins = PyEval_GetBuiltins(); + getattr = _PyDict_GetItemId(builtins, &PyId_getattr); + return Py_BuildValue("O(OO)", getattr, PyDescr_TYPE(descr), + PyDescr_NAME(descr)); +} + +static PyMethodDef descr_methods[] = { + {"__reduce__", (PyCFunction)descr_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + static PyMemberDef descr_members[] = { {"__objclass__", T_OBJECT, offsetof(PyDescrObject, d_type), READONLY}, {"__name__", T_OBJECT, offsetof(PyDescrObject, d_name), READONLY}, @@ -494,7 +512,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + descr_methods, /* tp_methods */ descr_members, /* tp_members */ method_getset, /* tp_getset */ 0, /* tp_base */ @@ -532,7 +550,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + descr_methods, /* tp_methods */ descr_members, /* tp_members */ method_getset, /* tp_getset */ 0, /* tp_base */ @@ -569,7 +587,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + descr_methods, /* tp_methods */ descr_members, /* tp_members */ member_getset, /* tp_getset */ 0, /* tp_base */ @@ -643,7 +661,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + 
descr_methods, /* tp_methods */ descr_members, /* tp_members */ wrapperdescr_getset, /* tp_getset */ 0, /* tp_base */ @@ -1085,6 +1103,23 @@ wp->self); } +static PyObject * +wrapper_reduce(wrapperobject *wp) +{ + PyObject *builtins; + PyObject *getattr; + _Py_IDENTIFIER(getattr); + + builtins = PyEval_GetBuiltins(); + getattr = _PyDict_GetItemId(builtins, &PyId_getattr); + return Py_BuildValue("O(OO)", getattr, wp->self, PyDescr_NAME(wp->descr)); +} + +static PyMethodDef wrapper_methods[] = { + {"__reduce__", (PyCFunction)wrapper_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + static PyMemberDef wrapper_members[] = { {"__self__", T_OBJECT, offsetof(wrapperobject, self), READONLY}, {0} @@ -1193,7 +1228,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + wrapper_methods, /* tp_methods */ wrapper_members, /* tp_members */ wrapper_getsets, /* tp_getset */ 0, /* tp_base */ diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -3405,149 +3405,428 @@ return cached_copyreg_module; } -static PyObject * -slotnames(PyObject *cls) -{ - PyObject *clsdict; +Py_LOCAL(PyObject *) +_PyType_GetSlotNames(PyTypeObject *cls) +{ PyObject *copyreg; PyObject *slotnames; _Py_IDENTIFIER(__slotnames__); _Py_IDENTIFIER(_slotnames); - clsdict = ((PyTypeObject *)cls)->tp_dict; - slotnames = _PyDict_GetItemId(clsdict, &PyId___slotnames__); - if (slotnames != NULL && PyList_Check(slotnames)) { + assert(PyType_Check(cls)); + + /* Get the slot names from the cache in the class if possible. */ + slotnames = _PyDict_GetItemIdWithError(cls->tp_dict, &PyId___slotnames__); + if (slotnames != NULL) { + if (slotnames != Py_None && !PyList_Check(slotnames)) { + PyErr_Format(PyExc_TypeError, + "%.200s.__slotnames__ should be a list or None, " + "not %.200s", + cls->tp_name, Py_TYPE(slotnames)->tp_name); + return NULL; + } Py_INCREF(slotnames); return slotnames; } + else { + if (PyErr_Occurred()) { + return NULL; + } + /* The class does not have the slot names cached yet. */ + } copyreg = import_copyreg(); if (copyreg == NULL) return NULL; - slotnames = _PyObject_CallMethodId(copyreg, &PyId__slotnames, "O", cls); + /* Use _slotnames function from the copyreg module to find the slots + by this class and its bases. This function will cache the result + in __slotnames__. 
*/ + slotnames = _PyObject_CallMethodIdObjArgs(copyreg, &PyId__slotnames, + cls, NULL); Py_DECREF(copyreg); - if (slotnames != NULL && - slotnames != Py_None && - !PyList_Check(slotnames)) - { + if (slotnames == NULL) + return NULL; + + if (slotnames != Py_None && !PyList_Check(slotnames)) { PyErr_SetString(PyExc_TypeError, - "copyreg._slotnames didn't return a list or None"); + "copyreg._slotnames didn't return a list or None"); Py_DECREF(slotnames); - slotnames = NULL; + return NULL; } return slotnames; } -static PyObject * -reduce_2(PyObject *obj) -{ - PyObject *cls, *getnewargs; - PyObject *args = NULL, *args2 = NULL; - PyObject *getstate = NULL, *state = NULL, *names = NULL; - PyObject *slots = NULL, *listitems = NULL, *dictitems = NULL; - PyObject *copyreg = NULL, *newobj = NULL, *res = NULL; - Py_ssize_t i, n; - _Py_IDENTIFIER(__getnewargs__); +Py_LOCAL(PyObject *) +_PyObject_GetState(PyObject *obj) +{ + PyObject *state; + PyObject *getstate; _Py_IDENTIFIER(__getstate__); - _Py_IDENTIFIER(__newobj__); - - cls = (PyObject *) Py_TYPE(obj); - - getnewargs = _PyObject_GetAttrId(obj, &PyId___getnewargs__); - if (getnewargs != NULL) { - args = PyObject_CallObject(getnewargs, NULL); - Py_DECREF(getnewargs); - if (args != NULL && !PyTuple_Check(args)) { - PyErr_Format(PyExc_TypeError, - "__getnewargs__ should return a tuple, " - "not '%.200s'", Py_TYPE(args)->tp_name); - goto end; + + getstate = _PyObject_GetAttrId(obj, &PyId___getstate__); + if (getstate == NULL) { + PyObject *slotnames; + + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { + return NULL; } - } - else { PyErr_Clear(); - args = PyTuple_New(0); - } - if (args == NULL) - goto end; - - getstate = _PyObject_GetAttrId(obj, &PyId___getstate__); - if (getstate != NULL) { + + { + PyObject **dict; + dict = _PyObject_GetDictPtr(obj); + /* It is possible that the object's dict is not initialized + yet. In this case, we will return None for the state. + We also return None if the dict is empty to make the behavior + consistent regardless whether the dict was initialized or not. + This make unit testing easier. */ + if (dict != NULL && *dict != NULL && PyDict_Size(*dict) > 0) { + state = *dict; + } + else { + state = Py_None; + } + Py_INCREF(state); + } + + slotnames = _PyType_GetSlotNames(Py_TYPE(obj)); + if (slotnames == NULL) { + Py_DECREF(state); + return NULL; + } + + assert(slotnames == Py_None || PyList_Check(slotnames)); + if (slotnames != Py_None && Py_SIZE(slotnames) > 0) { + PyObject *slots; + Py_ssize_t slotnames_size, i; + + slots = PyDict_New(); + if (slots == NULL) { + Py_DECREF(slotnames); + Py_DECREF(state); + return NULL; + } + + slotnames_size = Py_SIZE(slotnames); + for (i = 0; i < slotnames_size; i++) { + PyObject *name, *value; + + name = PyList_GET_ITEM(slotnames, i); + value = PyObject_GetAttr(obj, name); + if (value == NULL) { + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { + goto error; + } + /* It is not an error if the attribute is not present. */ + PyErr_Clear(); + } + else { + int err = PyDict_SetItem(slots, name, value); + Py_DECREF(value); + if (err) { + goto error; + } + } + + /* The list is stored on the class so it may mutates while we + iterate over it */ + if (slotnames_size != Py_SIZE(slotnames)) { + PyErr_Format(PyExc_RuntimeError, + "__slotsname__ changed size during iteration"); + goto error; + } + + /* We handle errors within the loop here. 
*/ + if (0) { + error: + Py_DECREF(slotnames); + Py_DECREF(slots); + Py_DECREF(state); + return NULL; + } + } + + /* If we found some slot attributes, pack them in a tuple along + the orginal attribute dictionary. */ + if (PyDict_Size(slots) > 0) { + PyObject *state2; + + state2 = PyTuple_Pack(2, state, slots); + Py_DECREF(state); + if (state2 == NULL) { + Py_DECREF(slotnames); + Py_DECREF(slots); + return NULL; + } + state = state2; + } + Py_DECREF(slots); + } + Py_DECREF(slotnames); + } + else { /* getstate != NULL */ state = PyObject_CallObject(getstate, NULL); Py_DECREF(getstate); if (state == NULL) - goto end; + return NULL; + } + + return state; +} + +Py_LOCAL(int) +_PyObject_GetNewArguments(PyObject *obj, PyObject **args, PyObject **kwargs) +{ + PyObject *getnewargs, *getnewargs_ex; + _Py_IDENTIFIER(__getnewargs_ex__); + _Py_IDENTIFIER(__getnewargs__); + + if (args == NULL || kwargs == NULL) { + PyErr_BadInternalCall(); + return -1; + } + + /* We first attempt to fetch the arguments for __new__ by calling + __getnewargs_ex__ on the object. */ + getnewargs_ex = _PyObject_GetAttrId(obj, &PyId___getnewargs_ex__); + if (getnewargs_ex != NULL) { + PyObject *newargs = PyObject_CallObject(getnewargs_ex, NULL); + Py_DECREF(getnewargs_ex); + if (newargs == NULL) { + return -1; + } + if (!PyTuple_Check(newargs)) { + PyErr_Format(PyExc_TypeError, + "__getnewargs_ex__ should return a tuple, " + "not '%.200s'", Py_TYPE(newargs)->tp_name); + Py_DECREF(newargs); + return -1; + } + if (Py_SIZE(newargs) != 2) { + PyErr_Format(PyExc_ValueError, + "__getnewargs_ex__ should return a tuple of " + "length 2, not %zd", Py_SIZE(newargs)); + Py_DECREF(newargs); + return -1; + } + *args = PyTuple_GET_ITEM(newargs, 0); + Py_INCREF(*args); + *kwargs = PyTuple_GET_ITEM(newargs, 1); + Py_INCREF(*kwargs); + Py_DECREF(newargs); + + /* XXX We should perhaps allow None to be passed here. */ + if (!PyTuple_Check(*args)) { + PyErr_Format(PyExc_TypeError, + "first item of the tuple returned by " + "__getnewargs_ex__ must be a tuple, not '%.200s'", + Py_TYPE(*args)->tp_name); + Py_CLEAR(*args); + Py_CLEAR(*kwargs); + return -1; + } + if (!PyDict_Check(*kwargs)) { + PyErr_Format(PyExc_TypeError, + "second item of the tuple returned by " + "__getnewargs_ex__ must be a dict, not '%.200s'", + Py_TYPE(*kwargs)->tp_name); + Py_CLEAR(*args); + Py_CLEAR(*kwargs); + return -1; + } + return 0; + } else { + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { + return -1; + } + PyErr_Clear(); + } + + /* The object does not have __getnewargs_ex__ so we fallback on using + __getnewargs__ instead. */ + getnewargs = _PyObject_GetAttrId(obj, &PyId___getnewargs__); + if (getnewargs != NULL) { + *args = PyObject_CallObject(getnewargs, NULL); + Py_DECREF(getnewargs); + if (*args == NULL) { + return -1; + } + if (!PyTuple_Check(*args)) { + PyErr_Format(PyExc_TypeError, + "__getnewargs__ should return a tuple, " + "not '%.200s'", Py_TYPE(*args)->tp_name); + Py_CLEAR(*args); + return -1; + } + *kwargs = NULL; + return 0; + } else { + if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { + return -1; + } + PyErr_Clear(); + } + + /* The object does not have __getnewargs_ex__ and __getnewargs__. This may + means __new__ does not takes any arguments on this object, or that the + object does not implement the reduce protocol for pickling or + copying. 
*/ + *args = NULL; + *kwargs = NULL; + return 0; +} + +Py_LOCAL(int) +_PyObject_GetItemsIter(PyObject *obj, PyObject **listitems, + PyObject **dictitems) +{ + if (listitems == NULL || dictitems == NULL) { + PyErr_BadInternalCall(); + return -1; + } + + if (!PyList_Check(obj)) { + *listitems = Py_None; + Py_INCREF(*listitems); } else { - PyObject **dict; - PyErr_Clear(); - dict = _PyObject_GetDictPtr(obj); - if (dict && *dict) - state = *dict; - else - state = Py_None; - Py_INCREF(state); - names = slotnames(cls); - if (names == NULL) - goto end; - if (names != Py_None && PyList_GET_SIZE(names) > 0) { - assert(PyList_Check(names)); - slots = PyDict_New(); - if (slots == NULL) - goto end; - n = 0; - /* Can't pre-compute the list size; the list - is stored on the class so accessible to other - threads, which may be run by DECREF */ - for (i = 0; i < PyList_GET_SIZE(names); i++) { - PyObject *name, *value; - name = PyList_GET_ITEM(names, i); - value = PyObject_GetAttr(obj, name); - if (value == NULL) - PyErr_Clear(); - else { - int err = PyDict_SetItem(slots, name, - value); - Py_DECREF(value); - if (err) - goto end; - n++; - } - } - if (n) { - state = Py_BuildValue("(NO)", state, slots); - if (state == NULL) - goto end; - } + *listitems = PyObject_GetIter(obj); + if (listitems == NULL) + return -1; + } + + if (!PyDict_Check(obj)) { + *dictitems = Py_None; + Py_INCREF(*dictitems); + } + else { + PyObject *items; + _Py_IDENTIFIER(items); + + items = _PyObject_CallMethodIdObjArgs(obj, &PyId_items, NULL); + if (items == NULL) { + Py_CLEAR(*listitems); + return -1; } - } - - if (!PyList_Check(obj)) { - listitems = Py_None; - Py_INCREF(listitems); - } - else { - listitems = PyObject_GetIter(obj); - if (listitems == NULL) - goto end; - } - - if (!PyDict_Check(obj)) { - dictitems = Py_None; - Py_INCREF(dictitems); - } - else { - _Py_IDENTIFIER(items); - PyObject *items = _PyObject_CallMethodId(obj, &PyId_items, ""); - if (items == NULL) - goto end; - dictitems = PyObject_GetIter(items); + *dictitems = PyObject_GetIter(items); Py_DECREF(items); - if (dictitems == NULL) - goto end; - } + if (*dictitems == NULL) { + Py_CLEAR(*listitems); + return -1; + } + } + + assert(*listitems != NULL && *dictitems != NULL); + + return 0; +} + +static PyObject * +reduce_4(PyObject *obj) +{ + PyObject *args = NULL, *kwargs = NULL; + PyObject *copyreg; + PyObject *newobj, *newargs, *state, *listitems, *dictitems; + PyObject *result; + _Py_IDENTIFIER(__newobj_ex__); + + if (_PyObject_GetNewArguments(obj, &args, &kwargs) < 0) { + return NULL; + } + if (args == NULL) { + args = PyTuple_New(0); + if (args == NULL) + return NULL; + } + if (kwargs == NULL) { + kwargs = PyDict_New(); + if (kwargs == NULL) + return NULL; + } + + copyreg = import_copyreg(); + if (copyreg == NULL) { + Py_DECREF(args); + Py_DECREF(kwargs); + return NULL; + } + newobj = _PyObject_GetAttrId(copyreg, &PyId___newobj_ex__); + Py_DECREF(copyreg); + if (newobj == NULL) { + Py_DECREF(args); + Py_DECREF(kwargs); + return NULL; + } + newargs = PyTuple_Pack(3, Py_TYPE(obj), args, kwargs); + Py_DECREF(args); + Py_DECREF(kwargs); + if (newargs == NULL) { + Py_DECREF(newobj); + return NULL; + } + state = _PyObject_GetState(obj); + if (state == NULL) { + Py_DECREF(newobj); + Py_DECREF(newargs); + return NULL; + } + if (_PyObject_GetItemsIter(obj, &listitems, &dictitems) < 0) { + Py_DECREF(newobj); + Py_DECREF(newargs); + Py_DECREF(state); + return NULL; + } + + result = PyTuple_Pack(5, newobj, newargs, state, listitems, dictitems); + Py_DECREF(newobj); + 
Py_DECREF(newargs); + Py_DECREF(state); + Py_DECREF(listitems); + Py_DECREF(dictitems); + return result; +} + +static PyObject * +reduce_2(PyObject *obj) +{ + PyObject *cls; + PyObject *args = NULL, *args2 = NULL, *kwargs = NULL; + PyObject *state = NULL, *listitems = NULL, *dictitems = NULL; + PyObject *copyreg = NULL, *newobj = NULL, *res = NULL; + Py_ssize_t i, n; + _Py_IDENTIFIER(__newobj__); + + if (_PyObject_GetNewArguments(obj, &args, &kwargs) < 0) { + return NULL; + } + if (args == NULL) { + assert(kwargs == NULL); + args = PyTuple_New(0); + if (args == NULL) { + return NULL; + } + } + else if (kwargs != NULL) { + if (PyDict_Size(kwargs) > 0) { + PyErr_SetString(PyExc_ValueError, + "must use protocol 4 or greater to copy this " + "object; since __getnewargs_ex__ returned " + "keyword arguments."); + Py_DECREF(args); + Py_DECREF(kwargs); + return NULL; + } + Py_CLEAR(kwargs); + } + + state = _PyObject_GetState(obj); + if (state == NULL) + goto end; + + if (_PyObject_GetItemsIter(obj, &listitems, &dictitems) < 0) + goto end; copyreg = import_copyreg(); if (copyreg == NULL) @@ -3560,6 +3839,7 @@ args2 = PyTuple_New(n+1); if (args2 == NULL) goto end; + cls = (PyObject *) Py_TYPE(obj); Py_INCREF(cls); PyTuple_SET_ITEM(args2, 0, cls); for (i = 0; i < n; i++) { @@ -3573,9 +3853,7 @@ end: Py_XDECREF(args); Py_XDECREF(args2); - Py_XDECREF(slots); Py_XDECREF(state); - Py_XDECREF(names); Py_XDECREF(listitems); Py_XDECREF(dictitems); Py_XDECREF(copyreg); @@ -3603,7 +3881,9 @@ { PyObject *copyreg, *res; - if (proto >= 2) + if (proto >= 4) + return reduce_4(self); + else if (proto >= 2) return reduce_2(self); copyreg = import_copyreg(); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 19:01:48 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 19:01:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_whitespace?= Message-ID: <3dRj4r2KrSz7Lqd@mail.python.org> http://hg.python.org/cpython/rev/91faefbeb343 changeset: 87439:91faefbeb343 user: Antoine Pitrou date: Sat Nov 23 19:01:36 2013 +0100 summary: Fix whitespace files: Lib/test/test_descr.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -4525,7 +4525,7 @@ class PicklingTests(unittest.TestCase): - def _check_reduce(self, proto, obj, args=(), kwargs={}, state=None, + def _check_reduce(self, proto, obj, args=(), kwargs={}, state=None, listitems=None, dictitems=None): if proto >= 4: reduce_value = obj.__reduce_ex__(proto) @@ -4559,7 +4559,7 @@ base_type = type(obj).__base__ reduce_value = (copyreg._reconstructor, (type(obj), - base_type, + base_type, None if base_type is object else base_type(obj))) if state is not None: reduce_value += (state,) @@ -4774,7 +4774,7 @@ def __getstate__(self): state = getattr(self, '__dict__', {}).copy() for cls in type(self).__mro__: - for slot in cls.__dict__.get('__slots__', ()): + for slot in cls.__dict__.get('__slots__', ()): try: state[slot] = getattr(self, slot) except AttributeError: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 19:06:58 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 19:06:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Mark_PEP_3154_final=2E?= Message-ID: <3dRjBp4JYbz7LjR@mail.python.org> http://hg.python.org/peps/rev/caec577195c3 changeset: 5316:caec577195c3 user: Antoine 
Pitrou date: Sat Nov 23 19:06:53 2013 +0100 summary: Mark PEP 3154 final. files: pep-3154.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-3154.txt b/pep-3154.txt --- a/pep-3154.txt +++ b/pep-3154.txt @@ -3,7 +3,7 @@ Version: $Revision$ Last-Modified: $Date$ Author: Antoine Pitrou -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/x-rst Created: 2011-08-11 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sat Nov 23 19:27:43 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 23 Nov 2013 19:27:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2319639=3A_update_the_rep?= =?utf-8?q?r_of_the_match_objects_in_the_docs=2E__Patch_by_Claudiu?= Message-ID: <3dRjfl5svDz7LjR@mail.python.org> http://hg.python.org/cpython/rev/bbfc559f7190 changeset: 87440:bbfc559f7190 user: Ezio Melotti date: Sat Nov 23 20:27:27 2013 +0200 summary: #19639: update the repr of the match objects in the docs. Patch by Claudiu Popa. files: Doc/howto/regex.rst | 16 ++++++++-------- Doc/library/fnmatch.rst | 2 +- Doc/library/re.rst | 18 +++++++++--------- 3 files changed, 18 insertions(+), 18 deletions(-) diff --git a/Doc/howto/regex.rst b/Doc/howto/regex.rst --- a/Doc/howto/regex.rst +++ b/Doc/howto/regex.rst @@ -402,7 +402,7 @@ >>> m = p.match('tempo') >>> m #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(0, 5), match='tempo'> Now you can query the :ref:`match object ` for information about the matching string. :ref:`match object ` instances @@ -441,7 +441,7 @@ >>> print(p.match('::: message')) None >>> m = p.search('::: message'); print(m) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(4, 11), match='message'> >>> m.group() 'message' >>> m.span() @@ -493,7 +493,7 @@ >>> print(re.match(r'From\s+', 'Fromage amk')) None >>> re.match(r'From\s+', 'From amk Thu May 14 19:12:10 1998') #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(0, 5), match='From '> Under the hood, these functions simply create a pattern object for you and call the appropriate method on it. They also store the compiled @@ -685,7 +685,7 @@ line, the RE to use is ``^From``. :: >>> print(re.search('^From', 'From Here to Eternity')) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(0, 4), match='From'> >>> print(re.search('^From', 'Reciting From Memory')) None @@ -697,11 +697,11 @@ or any location followed by a newline character. :: >>> print(re.search('}$', '{block}')) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(6, 7), match='}'> >>> print(re.search('}$', '{block} ')) None >>> print(re.search('}$', '{block}\n')) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(6, 7), match='}'> To match a literal ``'$'``, use ``\$`` or enclose it inside a character class, as in ``[$]``. 
@@ -726,7 +726,7 @@ >>> p = re.compile(r'\bclass\b') >>> print(p.search('no class at all')) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(3, 8), match='class'> >>> print(p.search('the declassified algorithm')) None >>> print(p.search('one subclass is')) @@ -744,7 +744,7 @@ >>> print(p.search('no class at all')) None >>> print(p.search('\b' + 'class' + '\b')) #doctest: +ELLIPSIS - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(0, 7), match='\x08class\x08'> Second, inside a character class, where there's no use for this assertion, ``\b`` represents the backspace character, for compatibility with Python's diff --git a/Doc/library/fnmatch.rst b/Doc/library/fnmatch.rst --- a/Doc/library/fnmatch.rst +++ b/Doc/library/fnmatch.rst @@ -86,7 +86,7 @@ '.*\\.txt$' >>> reobj = re.compile(regex) >>> reobj.match('foobar.txt') - <_sre.SRE_Match object at 0x...> + <_sre.SRE_Match object; span=(0, 10), match='foobar.txt'> .. seealso:: diff --git a/Doc/library/re.rst b/Doc/library/re.rst --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -755,7 +755,7 @@ >>> pattern = re.compile("d") >>> pattern.search("dog") # Match at index 0 - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 1), match='d'> >>> pattern.search("dog", 1) # No match; search doesn't include the "d" @@ -772,7 +772,7 @@ >>> pattern = re.compile("o") >>> pattern.match("dog") # No match as "o" is not at the start of "dog". >>> pattern.match("dog", 1) # Match as "o" is the 2nd character of "dog". - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(1, 2), match='o'> If you want to locate a match anywhere in *string*, use :meth:`~regex.search` instead (see also :ref:`search-vs-match`). @@ -1139,7 +1139,7 @@ >>> re.match("c", "abcdef") # No match >>> re.search("c", "abcdef") # Match - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(2, 3), match='c'> Regular expressions beginning with ``'^'`` can be used with :func:`search` to restrict the match at the beginning of the string:: @@ -1147,7 +1147,7 @@ >>> re.match("c", "abcdef") # No match >>> re.search("^c", "abcdef") # No match >>> re.search("^a", "abcdef") # Match - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 1), match='a'> Note however that in :const:`MULTILINE` mode :func:`match` only matches at the beginning of the string, whereas using :func:`search` with a regular expression @@ -1155,7 +1155,7 @@ >>> re.match('X', 'A\nB\nX', re.MULTILINE) # No match >>> re.search('^X', 'A\nB\nX', re.MULTILINE) # Match - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(4, 5), match='X'> Making a Phonebook @@ -1274,9 +1274,9 @@ functionally identical: >>> re.match(r"\W(.)\1\W", " ff ") - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 4), match=' ff '> >>> re.match("\\W(.)\\1\\W", " ff ") - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 4), match=' ff '> When one wants to match a literal backslash, it must be escaped in the regular expression. With raw string notation, this means ``r"\\"``. 
Without raw string @@ -1284,9 +1284,9 @@ functionally identical: >>> re.match(r"\\", r"\\") - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 1), match='\\'> >>> re.match("\\\\", r"\\") - <_sre.SRE_Match object at ...> + <_sre.SRE_Match object; span=(0, 1), match='\\'> Writing a Tokenizer -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 20:15:01 2013 From: python-checkins at python.org (ezio.melotti) Date: Sat, 23 Nov 2013 20:15:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2310712=3A_2to3_has_a_new?= =?utf-8?q?_=22asserts=22_fixer_that_replaces_deprecated_names_of?= Message-ID: <3dRkjK2QzZz7Llf@mail.python.org> http://hg.python.org/cpython/rev/cc0fc4e9b494 changeset: 87441:cc0fc4e9b494 user: Ezio Melotti date: Sat Nov 23 21:14:42 2013 +0200 summary: #10712: 2to3 has a new "asserts" fixer that replaces deprecated names of unittest methods. files: Doc/library/2to3.rst | 33 +++++++++++++ Lib/lib2to3/fixes/fix_asserts.py | 34 +++++++++++++ Lib/lib2to3/tests/test_fixers.py | 50 ++++++++++++++++++++ Misc/NEWS | 3 + 4 files changed, 120 insertions(+), 0 deletions(-) diff --git a/Doc/library/2to3.rst b/Doc/library/2to3.rst --- a/Doc/library/2to3.rst +++ b/Doc/library/2to3.rst @@ -142,6 +142,39 @@ Removes usage of :func:`apply`. For example ``apply(function, *args, **kwargs)`` is converted to ``function(*args, **kwargs)``. +.. 2to3fixer:: asserts + + Replaces deprecated :mod:`unittest` method names with the correct ones. + + ================================ ========================================== + From To + ================================ ========================================== + ``failUnlessEqual(a, b)`` :meth:`assertEqual(a, b) + ` + ``assertEquals(a, b)`` :meth:`assertEqual(a, b) + ` + ``failIfEqual(a, b)`` :meth:`assertNotEqual(a, b) + ` + ``assertNotEquals(a, b)`` :meth:`assertNotEqual(a, b) + ` + ``failUnless(a)`` :meth:`assertTrue(a) + ` + ``assert_(a)`` :meth:`assertTrue(a) + ` + ``failIf(a)`` :meth:`assertFalse(a) + ` + ``failUnlessRaises(exc, cal)`` :meth:`assertRaises(exc, cal) + ` + ``failUnlessAlmostEqual(a, b)`` :meth:`assertAlmostEqual(a, b) + ` + ``assertAlmostEquals(a, b)`` :meth:`assertAlmostEqual(a, b) + ` + ``failIfAlmostEqual(a, b)`` :meth:`assertNotAlmostEqual(a, b) + ` + ``assertNotAlmostEquals(a, b)`` :meth:`assertNotAlmostEqual(a, b) + ` + ================================ ========================================== + .. 2to3fixer:: basestring Converts :class:`basestring` to :class:`str`. diff --git a/Lib/lib2to3/fixes/fix_asserts.py b/Lib/lib2to3/fixes/fix_asserts.py new file mode 100644 --- /dev/null +++ b/Lib/lib2to3/fixes/fix_asserts.py @@ -0,0 +1,34 @@ +"""Fixer that replaces deprecated unittest method names.""" + +# Author: Ezio Melotti + +from ..fixer_base import BaseFix +from ..fixer_util import Name + +NAMES = dict( + assert_="assertTrue", + assertEquals="assertEqual", + assertNotEquals="assertNotEqual", + assertAlmostEquals="assertAlmostEqual", + assertNotAlmostEquals="assertNotAlmostEqual", + assertRegexpMatches="assertRegex", + assertRaisesRegexp="assertRaisesRegex", + failUnlessEqual="assertEqual", + failIfEqual="assertNotEqual", + failUnlessAlmostEqual="assertAlmostEqual", + failIfAlmostEqual="assertNotAlmostEqual", + failUnless="assertTrue", + failUnlessRaises="assertRaises", + failIf="assertFalse", +) + + +class FixAsserts(BaseFix): + + PATTERN = """ + power< any+ trailer< '.' 
meth=(%s)> any* > + """ % '|'.join(map(repr, NAMES)) + + def transform(self, node, results): + name = results["meth"][0] + name.replace(Name(NAMES[str(name)], prefix=name.prefix)) diff --git a/Lib/lib2to3/tests/test_fixers.py b/Lib/lib2to3/tests/test_fixers.py --- a/Lib/lib2to3/tests/test_fixers.py +++ b/Lib/lib2to3/tests/test_fixers.py @@ -4635,3 +4635,53 @@ def test_unchanged(self): s = """f(sys.exitfunc)""" self.unchanged(s) + + +class Test_asserts(FixerTestCase): + + fixer = "asserts" + + def test_deprecated_names(self): + tests = [ + ('self.assert_(True)', 'self.assertTrue(True)'), + ('self.assertEquals(2, 2)', 'self.assertEqual(2, 2)'), + ('self.assertNotEquals(2, 3)', 'self.assertNotEqual(2, 3)'), + ('self.assertAlmostEquals(2, 3)', 'self.assertAlmostEqual(2, 3)'), + ('self.assertNotAlmostEquals(2, 8)', 'self.assertNotAlmostEqual(2, 8)'), + ('self.failUnlessEqual(2, 2)', 'self.assertEqual(2, 2)'), + ('self.failIfEqual(2, 3)', 'self.assertNotEqual(2, 3)'), + ('self.failUnlessAlmostEqual(2, 3)', 'self.assertAlmostEqual(2, 3)'), + ('self.failIfAlmostEqual(2, 8)', 'self.assertNotAlmostEqual(2, 8)'), + ('self.failUnless(True)', 'self.assertTrue(True)'), + ('self.failUnlessRaises(foo)', 'self.assertRaises(foo)'), + ('self.failIf(False)', 'self.assertFalse(False)'), + ] + for b, a in tests: + self.check(b, a) + + def test_variants(self): + b = 'eq = self.assertEquals' + a = 'eq = self.assertEqual' + self.check(b, a) + b = 'self.assertEquals(2, 3, msg="fail")' + a = 'self.assertEqual(2, 3, msg="fail")' + self.check(b, a) + b = 'self.assertEquals(2, 3, msg="fail") # foo' + a = 'self.assertEqual(2, 3, msg="fail") # foo' + self.check(b, a) + b = 'self.assertEquals (2, 3)' + a = 'self.assertEqual (2, 3)' + self.check(b, a) + b = ' self.assertEquals (2, 3)' + a = ' self.assertEqual (2, 3)' + self.check(b, a) + b = 'with self.failUnlessRaises(Explosion): explode()' + a = 'with self.assertRaises(Explosion): explode()' + self.check(b, a) + b = 'with self.failUnlessRaises(Explosion) as cm: explode()' + a = 'with self.assertRaises(Explosion) as cm: explode()' + self.check(b, a) + + def test_unchanged(self): + self.unchanged('self.assertEqualsOnSaturday') + self.unchanged('self.assertEqualsOnSaturday(3, 5)') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -575,6 +575,9 @@ - Issue #18716: Deprecate the formatter module. +- Issue #10712: 2to3 has a new "asserts" fixer that replaces deprecated names + of unittest methods (e.g. failUnlessEqual -> assertEqual). + - Issue #18037: 2to3 now escapes '\u' and '\U' in native strings. - Issue #17839: base64.decodebytes and base64.encodebytes now accept any -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 20:53:12 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 20:53:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_typo=2E?= Message-ID: <3dRlYN3qbfz7LrK@mail.python.org> http://hg.python.org/cpython/rev/5e55461fa80b changeset: 87442:5e55461fa80b user: Guido van Rossum date: Sat Nov 23 11:51:09 2013 -0800 summary: Fix typo. files: Lib/asyncio/transports.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/transports.py b/Lib/asyncio/transports.py --- a/Lib/asyncio/transports.py +++ b/Lib/asyncio/transports.py @@ -105,7 +105,7 @@ raise NotImplementedError def abort(self): - """Closs the transport immediately. + """Close the transport immediately. Buffered data will be lost. No more data will be received. 
The protocol's connection_lost() method will (eventually) be -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 20:53:13 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sat, 23 Nov 2013 20:53:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Use_socketpair?= =?utf-8?q?=28=29_from_test=5Futils_in_tests_=28Sa=C3=BAl_Ibarra_Corretg?= =?utf-8?b?w6kpLg==?= Message-ID: <3dRlYP6SS4z7Llf@mail.python.org> http://hg.python.org/cpython/rev/80f48fbb25b8 changeset: 87443:80f48fbb25b8 user: Guido van Rossum date: Sat Nov 23 11:51:53 2013 -0800 summary: asyncio: Use socketpair() from test_utils in tests (Sa?l Ibarra Corretg?). files: Lib/test/test_asyncio/test_events.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -896,7 +896,7 @@ proto = MyWritePipeProto(loop=self.loop) return proto - rsock, wsock = self.loop._socketpair() + rsock, wsock = test_utils.socketpair() pipeobj = io.open(wsock.detach(), 'wb', 1024) @tasks.coroutine -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:01:50 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 21:01:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317810=3A_Add_NULL?= =?utf-8?q?_check_to_save=5Ffrozenset?= Message-ID: <3dRllL3MCJz7LjR@mail.python.org> http://hg.python.org/cpython/rev/d719975f4d25 changeset: 87444:d719975f4d25 user: Christian Heimes date: Sat Nov 23 21:01:40 2013 +0100 summary: Issue #17810: Add NULL check to save_frozenset CID 1131949: Dereference null return value (NULL_RETURNS) files: Modules/_pickle.c | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -2940,6 +2940,9 @@ return -1; iter = PyObject_GetIter(obj); + if (iter == NULL) { + return NULL; + } for (;;) { PyObject *item; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:05:53 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 21:05:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317810=3A_return_-?= =?utf-8?q?1_on_error?= Message-ID: <3dRlr14xG5zN4m@mail.python.org> http://hg.python.org/cpython/rev/c54becd69805 changeset: 87445:c54becd69805 user: Christian Heimes date: Sat Nov 23 21:05:31 2013 +0100 summary: Issue #17810: return -1 on error files: Modules/_pickle.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -2941,7 +2941,7 @@ iter = PyObject_GetIter(obj); if (iter == NULL) { - return NULL; + return -1; } for (;;) { PyObject *item; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:07:46 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 21:07:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_writing_out_64-bit_siz?= =?utf-8?q?e_fields_on_32-bit_builds?= Message-ID: <3dRltB3dMgzN4m@mail.python.org> http://hg.python.org/cpython/rev/458340ed0606 changeset: 87446:458340ed0606 parent: 87444:d719975f4d25 user: Antoine Pitrou date: Sat Nov 23 21:05:08 2013 +0100 summary: Fix writing out 64-bit size fields on 32-bit 
builds files: Lib/test/test_range.py | 7 +++-- Modules/_pickle.c | 32 +++++++++++++++++++---------- 2 files changed, 25 insertions(+), 14 deletions(-) diff --git a/Lib/test/test_range.py b/Lib/test/test_range.py --- a/Lib/test/test_range.py +++ b/Lib/test/test_range.py @@ -353,9 +353,10 @@ (13, 21, 3), (-2, 2, 2), (2**65, 2**65+2)] for proto in range(pickle.HIGHEST_PROTOCOL + 1): for t in testcases: - r = range(*t) - self.assertEqual(list(pickle.loads(pickle.dumps(r, proto))), - list(r)) + with self.subTest(proto=proto, test=t): + r = range(*t) + self.assertEqual(list(pickle.loads(pickle.dumps(r, proto))), + list(r)) def test_iterator_pickling(self): testcases = [(13,), (0, 11), (-22, 10), (20, 3, -1), diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -700,17 +700,27 @@ } static void +_write_size64(char *out, size_t value) +{ + out[0] = (unsigned char)(value & 0xff); + out[1] = (unsigned char)((value >> 8) & 0xff); + out[2] = (unsigned char)((value >> 16) & 0xff); + out[3] = (unsigned char)((value >> 24) & 0xff); +#if SIZEOF_SIZE_T >= 8 + out[4] = (unsigned char)((value >> 32) & 0xff); + out[5] = (unsigned char)((value >> 40) & 0xff); + out[6] = (unsigned char)((value >> 48) & 0xff); + out[7] = (unsigned char)((value >> 56) & 0xff); +#else + out[4] = out[5] = out[6] = out[7] = 0; +#endif +} + +static void _Pickler_WriteFrameHeader(PicklerObject *self, char *qdata, size_t frame_len) { - qdata[0] = (unsigned char)FRAME; - qdata[1] = (unsigned char)(frame_len & 0xff); - qdata[2] = (unsigned char)((frame_len >> 8) & 0xff); - qdata[3] = (unsigned char)((frame_len >> 16) & 0xff); - qdata[4] = (unsigned char)((frame_len >> 24) & 0xff); - qdata[5] = (unsigned char)((frame_len >> 32) & 0xff); - qdata[6] = (unsigned char)((frame_len >> 40) & 0xff); - qdata[7] = (unsigned char)((frame_len >> 48) & 0xff); - qdata[8] = (unsigned char)((frame_len >> 56) & 0xff); + qdata[0] = FRAME; + _write_size64(qdata + 1, frame_len); } static int @@ -2017,7 +2027,7 @@ int i; header[0] = BINBYTES8; for (i = 0; i < 8; i++) { - header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + _write_size64(header + 1, size); } len = 8; } @@ -2131,7 +2141,7 @@ header[0] = BINUNICODE8; for (i = 0; i < 8; i++) { - header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + _write_size64(header + 1, size); } len = 9; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:07:47 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 21:07:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge?= Message-ID: <3dRltC5RHpzN4m@mail.python.org> http://hg.python.org/cpython/rev/3c222d1f81e9 changeset: 87447:3c222d1f81e9 parent: 87446:458340ed0606 parent: 87445:c54becd69805 user: Antoine Pitrou date: Sat Nov 23 21:06:21 2013 +0100 summary: Merge files: Modules/_pickle.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -2951,7 +2951,7 @@ iter = PyObject_GetIter(obj); if (iter == NULL) { - return NULL; + return -1; } for (;;) { PyObject *item; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:13:14 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 21:13:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2315204=3A_Deprecat?= 
=?utf-8?q?ed_the_=27U=27_mode_in_file-like_objects=2E?= Message-ID: <3dRm0V4hY7z7Lkh@mail.python.org> http://hg.python.org/cpython/rev/70bd6f7e013b changeset: 87448:70bd6f7e013b parent: 87443:80f48fbb25b8 user: Serhiy Storchaka date: Sat Nov 23 22:12:06 2013 +0200 summary: Issue #15204: Deprecated the 'U' mode in file-like objects. files: Doc/library/fileinput.rst | 3 +++ Doc/library/functions.rst | 6 ++++-- Doc/library/zipfile.rst | 3 +++ Lib/_pyio.py | 10 ++++++++-- Lib/fileinput.py | 4 ++++ Lib/imp.py | 4 ++-- Lib/test/test_imp.py | 2 +- Lib/zipfile.py | 4 ++++ Misc/NEWS | 2 ++ Modules/_io/_iomodule.c | 10 ++++++++-- Tools/iobench/iobench.py | 4 +++- Tools/scripts/diff.py | 4 ++-- Tools/scripts/ndiff.py | 2 +- 13 files changed, 45 insertions(+), 13 deletions(-) diff --git a/Doc/library/fileinput.rst b/Doc/library/fileinput.rst --- a/Doc/library/fileinput.rst +++ b/Doc/library/fileinput.rst @@ -160,6 +160,9 @@ .. versionchanged:: 3.2 Can be used as a context manager. + .. deprecated:: 3.4 + The ``'rU'`` and ``'U'`` modes. + **Optional in-place filtering:** if the keyword argument ``inplace=True`` is passed to :func:`fileinput.input` or to the :class:`FileInput` constructor, the diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -872,8 +872,7 @@ ``'b'`` binary mode ``'t'`` text mode (default) ``'+'`` open a disk file for updating (reading and writing) - ``'U'`` universal newlines mode (for backwards compatibility; should - not be used in new code) + ``'U'`` :term:`universal newlines` mode (deprecated) ========= =============================================================== The default mode is ``'r'`` (open for reading text, synonym of ``'rt'``). @@ -1029,6 +1028,9 @@ .. versionchanged:: 3.4 The file is now non-inheritable. + .. deprecated-removed:: 3.4 4.0 + The ``'U'`` mode. + .. XXX works for bytes too, but should it? .. function:: ord(c) diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -234,6 +234,9 @@ or a :class:`ZipInfo` object. You will appreciate this when trying to read a ZIP file that contains members with duplicate names. + .. deprecated-removed:: 3.4 3.6 + The ``'U'`` or ``'rU'`` mode. Use :class:`io.TextIOWrapper` for reading + compressed text files in :term:`universal newlines` mode. .. method:: ZipFile.extract(member, path=None, pwd=None) diff --git a/Lib/_pyio.py b/Lib/_pyio.py --- a/Lib/_pyio.py +++ b/Lib/_pyio.py @@ -62,8 +62,7 @@ 'b' binary mode 't' text mode (default) '+' open a disk file for updating (reading and writing) - 'U' universal newline mode (for backwards compatibility; unneeded - for new code) + 'U' universal newline mode (deprecated) ========= =============================================================== The default mode is 'rt' (open for reading text). For binary random @@ -79,6 +78,10 @@ returned as strings, the bytes having been first decoded using a platform-dependent encoding or using the specified encoding if given. + 'U' mode is deprecated and will raise an exception in future versions + of Python. It has no effect in Python 3. Use newline to control + universal newlines mode. + buffering is an optional integer used to set the buffering policy. 
Pass 0 to switch buffering off (only allowed in binary mode), 1 to select line buffering (only usable in text mode), and an integer > 1 to indicate @@ -174,6 +177,9 @@ if "U" in modes: if creating or writing or appending: raise ValueError("can't use U and writing mode at once") + import warnings + warnings.warn("'U' mode is deprecated", + DeprecationWarning, 2) reading = True if text and binary: raise ValueError("can't have text and binary mode at once") diff --git a/Lib/fileinput.py b/Lib/fileinput.py --- a/Lib/fileinput.py +++ b/Lib/fileinput.py @@ -222,6 +222,10 @@ if mode not in ('r', 'rU', 'U', 'rb'): raise ValueError("FileInput opening mode must be one of " "'r', 'rU', 'U' and 'rb'") + if 'U' in mode: + import warnings + warnings.warn("Use of 'U' mode is deprecated", + DeprecationWarning, 2) self._mode = mode if openhook: if inplace: diff --git a/Lib/imp.py b/Lib/imp.py --- a/Lib/imp.py +++ b/Lib/imp.py @@ -103,7 +103,7 @@ def get_suffixes(): """**DEPRECATED**""" extensions = [(s, 'rb', C_EXTENSION) for s in machinery.EXTENSION_SUFFIXES] - source = [(s, 'U', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES] + source = [(s, 'r', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES] bytecode = [(s, 'rb', PY_COMPILED) for s in machinery.BYTECODE_SUFFIXES] return extensions + source + bytecode @@ -297,7 +297,7 @@ raise ImportError(_ERR_MSG.format(name), name=name) encoding = None - if mode == 'U': + if 'b' not in mode: with open(file_path, 'rb') as file: encoding = tokenize.detect_encoding(file.readline)[0] file = open(file_path, mode, encoding=encoding) diff --git a/Lib/test/test_imp.py b/Lib/test/test_imp.py --- a/Lib/test/test_imp.py +++ b/Lib/test/test_imp.py @@ -165,7 +165,7 @@ self.assertIsNotNone(file) self.assertTrue(filename[:-3].endswith(temp_mod_name)) self.assertEqual(info[0], '.py') - self.assertEqual(info[1], 'U') + self.assertEqual(info[1], 'r') self.assertEqual(info[2], imp.PY_SOURCE) mod = imp.load_module(temp_mod_name, file, filename, info) diff --git a/Lib/zipfile.py b/Lib/zipfile.py --- a/Lib/zipfile.py +++ b/Lib/zipfile.py @@ -1117,6 +1117,10 @@ """Return file-like object for 'name'.""" if mode not in ("r", "U", "rU"): raise RuntimeError('open() requires mode "r", "U", or "rU"') + if 'U' in mode: + import warnings + warnings.warn("'U' mode is deprecated", + DeprecationWarning, 2) if pwd and not isinstance(pwd, bytes): raise TypeError("pwd: expected bytes, got %s" % type(pwd)) if not self.fp: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,8 @@ Library ------- +- Issue #15204: Deprecated the 'U' mode in file-like objects. + - Issue #17810: Implement PEP 3154, pickle protocol 4. - Issue #19668: Added support for the cp1125 encoding. diff --git a/Modules/_io/_iomodule.c b/Modules/_io/_iomodule.c --- a/Modules/_io/_iomodule.c +++ b/Modules/_io/_iomodule.c @@ -126,8 +126,7 @@ "'b' binary mode\n" "'t' text mode (default)\n" "'+' open a disk file for updating (reading and writing)\n" -"'U' universal newline mode (for backwards compatibility; unneeded\n" -" for new code)\n" +"'U' universal newline mode (deprecated)\n" "========= ===============================================================\n" "\n" "The default mode is 'rt' (open for reading text). For binary random\n" @@ -143,6 +142,10 @@ "returned as strings, the bytes having been first decoded using a\n" "platform-dependent encoding or using the specified encoding if given.\n" "\n" +"'U' mode is deprecated and will raise an exception in future versions\n" +"of Python. It has no effect in Python 3. 
Use newline to control\n" +"universal newlines mode.\n" +"\n" "buffering is an optional integer used to set the buffering policy.\n" "Pass 0 to switch buffering off (only allowed in binary mode), 1 to select\n" "line buffering (only usable in text mode), and an integer > 1 to indicate\n" @@ -310,6 +313,9 @@ "can't use U and writing mode at once"); return NULL; } + if (PyErr_WarnEx(PyExc_DeprecationWarning, + "'U' mode is deprecated", 1) < 0) + return NULL; reading = 1; } diff --git a/Tools/iobench/iobench.py b/Tools/iobench/iobench.py --- a/Tools/iobench/iobench.py +++ b/Tools/iobench/iobench.py @@ -24,6 +24,8 @@ try: return open(fn, mode, encoding=encoding or TEXT_ENCODING) except TypeError: + if 'r' in mode: + mode += 'U' # 'U' mode is needed only in Python 2.x return open(fn, mode) def get_file_sizes(): @@ -380,7 +382,7 @@ f.write(os.urandom(size)) # Text files chunk = [] - with text_open(__file__, "rU", encoding='utf8') as f: + with text_open(__file__, "r", encoding='utf8') as f: for line in f: if line.startswith("# "): break diff --git a/Tools/scripts/diff.py b/Tools/scripts/diff.py --- a/Tools/scripts/diff.py +++ b/Tools/scripts/diff.py @@ -38,9 +38,9 @@ fromdate = file_mtime(fromfile) todate = file_mtime(tofile) - with open(fromfile, 'U') as ff: + with open(fromfile) as ff: fromlines = ff.readlines() - with open(tofile, 'U') as tf: + with open(tofile) as tf: tolines = tf.readlines() if options.u: diff --git a/Tools/scripts/ndiff.py b/Tools/scripts/ndiff.py --- a/Tools/scripts/ndiff.py +++ b/Tools/scripts/ndiff.py @@ -60,7 +60,7 @@ # couldn't be opened def fopen(fname): try: - return open(fname, 'U') + return open(fname) except IOError as detail: return fail("couldn't open " + fname + ": " + str(detail)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:13:15 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 21:13:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dRm0W6Ydgz7Lqf@mail.python.org> http://hg.python.org/cpython/rev/d469b3e557d2 changeset: 87449:d469b3e557d2 parent: 87448:70bd6f7e013b parent: 87447:3c222d1f81e9 user: Serhiy Storchaka date: Sat Nov 23 22:12:36 2013 +0200 summary: Merge heads files: Lib/test/test_range.py | 7 +++-- Modules/_pickle.c | 35 ++++++++++++++++++++--------- 2 files changed, 28 insertions(+), 14 deletions(-) diff --git a/Lib/test/test_range.py b/Lib/test/test_range.py --- a/Lib/test/test_range.py +++ b/Lib/test/test_range.py @@ -353,9 +353,10 @@ (13, 21, 3), (-2, 2, 2), (2**65, 2**65+2)] for proto in range(pickle.HIGHEST_PROTOCOL + 1): for t in testcases: - r = range(*t) - self.assertEqual(list(pickle.loads(pickle.dumps(r, proto))), - list(r)) + with self.subTest(proto=proto, test=t): + r = range(*t) + self.assertEqual(list(pickle.loads(pickle.dumps(r, proto))), + list(r)) def test_iterator_pickling(self): testcases = [(13,), (0, 11), (-22, 10), (20, 3, -1), diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -700,17 +700,27 @@ } static void +_write_size64(char *out, size_t value) +{ + out[0] = (unsigned char)(value & 0xff); + out[1] = (unsigned char)((value >> 8) & 0xff); + out[2] = (unsigned char)((value >> 16) & 0xff); + out[3] = (unsigned char)((value >> 24) & 0xff); +#if SIZEOF_SIZE_T >= 8 + out[4] = (unsigned char)((value >> 32) & 0xff); + out[5] = (unsigned char)((value >> 40) & 0xff); + out[6] = (unsigned char)((value >> 48) & 
0xff); + out[7] = (unsigned char)((value >> 56) & 0xff); +#else + out[4] = out[5] = out[6] = out[7] = 0; +#endif +} + +static void _Pickler_WriteFrameHeader(PicklerObject *self, char *qdata, size_t frame_len) { - qdata[0] = (unsigned char)FRAME; - qdata[1] = (unsigned char)(frame_len & 0xff); - qdata[2] = (unsigned char)((frame_len >> 8) & 0xff); - qdata[3] = (unsigned char)((frame_len >> 16) & 0xff); - qdata[4] = (unsigned char)((frame_len >> 24) & 0xff); - qdata[5] = (unsigned char)((frame_len >> 32) & 0xff); - qdata[6] = (unsigned char)((frame_len >> 40) & 0xff); - qdata[7] = (unsigned char)((frame_len >> 48) & 0xff); - qdata[8] = (unsigned char)((frame_len >> 56) & 0xff); + qdata[0] = FRAME; + _write_size64(qdata + 1, frame_len); } static int @@ -2017,7 +2027,7 @@ int i; header[0] = BINBYTES8; for (i = 0; i < 8; i++) { - header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + _write_size64(header + 1, size); } len = 8; } @@ -2131,7 +2141,7 @@ header[0] = BINUNICODE8; for (i = 0; i < 8; i++) { - header[i+1] = (unsigned char)((size >> (8 * i)) & 0xff); + _write_size64(header + 1, size); } len = 9; } @@ -2940,6 +2950,9 @@ return -1; iter = PyObject_GetIter(obj); + if (iter == NULL) { + return -1; + } for (;;) { PyObject *item; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:14:13 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 21:14:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317810=3A_Add_two_?= =?utf-8?q?missing_error_checks_to_save=5Fglobal?= Message-ID: <3dRm1d0M7cz7LjR@mail.python.org> http://hg.python.org/cpython/rev/a02adfb3260a changeset: 87450:a02adfb3260a parent: 87447:3c222d1f81e9 user: Christian Heimes date: Sat Nov 23 21:13:39 2013 +0100 summary: Issue #17810: Add two missing error checks to save_global CID 1131946: Unchecked return value (CHECKED_RETURN) files: Modules/_pickle.c | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -3193,8 +3193,10 @@ if (self->proto >= 4) { const char stack_global_op = STACK_GLOBAL; - save(self, module_name, 0); - save(self, global_name, 0); + if (save(self, module_name, 0) < 0) + goto error; + if (save(self, global_name, 0) < 0) + goto error; if (_Pickler_Write(self, &stack_global_op, 1) < 0) goto error; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:14:14 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 21:14:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dRm1f3V2Hz7Llf@mail.python.org> http://hg.python.org/cpython/rev/8a8392aff111 changeset: 87451:8a8392aff111 parent: 87450:a02adfb3260a parent: 87449:d469b3e557d2 user: Christian Heimes date: Sat Nov 23 21:14:01 2013 +0100 summary: merge files: Doc/library/fileinput.rst | 3 +++ Doc/library/functions.rst | 6 ++++-- Doc/library/zipfile.rst | 3 +++ Lib/_pyio.py | 10 ++++++++-- Lib/fileinput.py | 4 ++++ Lib/imp.py | 4 ++-- Lib/test/test_imp.py | 2 +- Lib/zipfile.py | 4 ++++ Misc/NEWS | 2 ++ Modules/_io/_iomodule.c | 10 ++++++++-- Tools/iobench/iobench.py | 4 +++- Tools/scripts/diff.py | 4 ++-- Tools/scripts/ndiff.py | 2 +- 13 files changed, 45 insertions(+), 13 deletions(-) diff --git a/Doc/library/fileinput.rst b/Doc/library/fileinput.rst --- a/Doc/library/fileinput.rst +++ b/Doc/library/fileinput.rst @@ 
-160,6 +160,9 @@ .. versionchanged:: 3.2 Can be used as a context manager. + .. deprecated:: 3.4 + The ``'rU'`` and ``'U'`` modes. + **Optional in-place filtering:** if the keyword argument ``inplace=True`` is passed to :func:`fileinput.input` or to the :class:`FileInput` constructor, the diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -872,8 +872,7 @@ ``'b'`` binary mode ``'t'`` text mode (default) ``'+'`` open a disk file for updating (reading and writing) - ``'U'`` universal newlines mode (for backwards compatibility; should - not be used in new code) + ``'U'`` :term:`universal newlines` mode (deprecated) ========= =============================================================== The default mode is ``'r'`` (open for reading text, synonym of ``'rt'``). @@ -1029,6 +1028,9 @@ .. versionchanged:: 3.4 The file is now non-inheritable. + .. deprecated-removed:: 3.4 4.0 + The ``'U'`` mode. + .. XXX works for bytes too, but should it? .. function:: ord(c) diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -234,6 +234,9 @@ or a :class:`ZipInfo` object. You will appreciate this when trying to read a ZIP file that contains members with duplicate names. + .. deprecated-removed:: 3.4 3.6 + The ``'U'`` or ``'rU'`` mode. Use :class:`io.TextIOWrapper` for reading + compressed text files in :term:`universal newlines` mode. .. method:: ZipFile.extract(member, path=None, pwd=None) diff --git a/Lib/_pyio.py b/Lib/_pyio.py --- a/Lib/_pyio.py +++ b/Lib/_pyio.py @@ -62,8 +62,7 @@ 'b' binary mode 't' text mode (default) '+' open a disk file for updating (reading and writing) - 'U' universal newline mode (for backwards compatibility; unneeded - for new code) + 'U' universal newline mode (deprecated) ========= =============================================================== The default mode is 'rt' (open for reading text). For binary random @@ -79,6 +78,10 @@ returned as strings, the bytes having been first decoded using a platform-dependent encoding or using the specified encoding if given. + 'U' mode is deprecated and will raise an exception in future versions + of Python. It has no effect in Python 3. Use newline to control + universal newlines mode. + buffering is an optional integer used to set the buffering policy. 
Pass 0 to switch buffering off (only allowed in binary mode), 1 to select line buffering (only usable in text mode), and an integer > 1 to indicate @@ -174,6 +177,9 @@ if "U" in modes: if creating or writing or appending: raise ValueError("can't use U and writing mode at once") + import warnings + warnings.warn("'U' mode is deprecated", + DeprecationWarning, 2) reading = True if text and binary: raise ValueError("can't have text and binary mode at once") diff --git a/Lib/fileinput.py b/Lib/fileinput.py --- a/Lib/fileinput.py +++ b/Lib/fileinput.py @@ -222,6 +222,10 @@ if mode not in ('r', 'rU', 'U', 'rb'): raise ValueError("FileInput opening mode must be one of " "'r', 'rU', 'U' and 'rb'") + if 'U' in mode: + import warnings + warnings.warn("Use of 'U' mode is deprecated", + DeprecationWarning, 2) self._mode = mode if openhook: if inplace: diff --git a/Lib/imp.py b/Lib/imp.py --- a/Lib/imp.py +++ b/Lib/imp.py @@ -103,7 +103,7 @@ def get_suffixes(): """**DEPRECATED**""" extensions = [(s, 'rb', C_EXTENSION) for s in machinery.EXTENSION_SUFFIXES] - source = [(s, 'U', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES] + source = [(s, 'r', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES] bytecode = [(s, 'rb', PY_COMPILED) for s in machinery.BYTECODE_SUFFIXES] return extensions + source + bytecode @@ -297,7 +297,7 @@ raise ImportError(_ERR_MSG.format(name), name=name) encoding = None - if mode == 'U': + if 'b' not in mode: with open(file_path, 'rb') as file: encoding = tokenize.detect_encoding(file.readline)[0] file = open(file_path, mode, encoding=encoding) diff --git a/Lib/test/test_imp.py b/Lib/test/test_imp.py --- a/Lib/test/test_imp.py +++ b/Lib/test/test_imp.py @@ -165,7 +165,7 @@ self.assertIsNotNone(file) self.assertTrue(filename[:-3].endswith(temp_mod_name)) self.assertEqual(info[0], '.py') - self.assertEqual(info[1], 'U') + self.assertEqual(info[1], 'r') self.assertEqual(info[2], imp.PY_SOURCE) mod = imp.load_module(temp_mod_name, file, filename, info) diff --git a/Lib/zipfile.py b/Lib/zipfile.py --- a/Lib/zipfile.py +++ b/Lib/zipfile.py @@ -1117,6 +1117,10 @@ """Return file-like object for 'name'.""" if mode not in ("r", "U", "rU"): raise RuntimeError('open() requires mode "r", "U", or "rU"') + if 'U' in mode: + import warnings + warnings.warn("'U' mode is deprecated", + DeprecationWarning, 2) if pwd and not isinstance(pwd, bytes): raise TypeError("pwd: expected bytes, got %s" % type(pwd)) if not self.fp: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,8 @@ Library ------- +- Issue #15204: Deprecated the 'U' mode in file-like objects. + - Issue #17810: Implement PEP 3154, pickle protocol 4. - Issue #19668: Added support for the cp1125 encoding. diff --git a/Modules/_io/_iomodule.c b/Modules/_io/_iomodule.c --- a/Modules/_io/_iomodule.c +++ b/Modules/_io/_iomodule.c @@ -126,8 +126,7 @@ "'b' binary mode\n" "'t' text mode (default)\n" "'+' open a disk file for updating (reading and writing)\n" -"'U' universal newline mode (for backwards compatibility; unneeded\n" -" for new code)\n" +"'U' universal newline mode (deprecated)\n" "========= ===============================================================\n" "\n" "The default mode is 'rt' (open for reading text). For binary random\n" @@ -143,6 +142,10 @@ "returned as strings, the bytes having been first decoded using a\n" "platform-dependent encoding or using the specified encoding if given.\n" "\n" +"'U' mode is deprecated and will raise an exception in future versions\n" +"of Python. It has no effect in Python 3. 
Use newline to control\n" +"universal newlines mode.\n" +"\n" "buffering is an optional integer used to set the buffering policy.\n" "Pass 0 to switch buffering off (only allowed in binary mode), 1 to select\n" "line buffering (only usable in text mode), and an integer > 1 to indicate\n" @@ -310,6 +313,9 @@ "can't use U and writing mode at once"); return NULL; } + if (PyErr_WarnEx(PyExc_DeprecationWarning, + "'U' mode is deprecated", 1) < 0) + return NULL; reading = 1; } diff --git a/Tools/iobench/iobench.py b/Tools/iobench/iobench.py --- a/Tools/iobench/iobench.py +++ b/Tools/iobench/iobench.py @@ -24,6 +24,8 @@ try: return open(fn, mode, encoding=encoding or TEXT_ENCODING) except TypeError: + if 'r' in mode: + mode += 'U' # 'U' mode is needed only in Python 2.x return open(fn, mode) def get_file_sizes(): @@ -380,7 +382,7 @@ f.write(os.urandom(size)) # Text files chunk = [] - with text_open(__file__, "rU", encoding='utf8') as f: + with text_open(__file__, "r", encoding='utf8') as f: for line in f: if line.startswith("# "): break diff --git a/Tools/scripts/diff.py b/Tools/scripts/diff.py --- a/Tools/scripts/diff.py +++ b/Tools/scripts/diff.py @@ -38,9 +38,9 @@ fromdate = file_mtime(fromfile) todate = file_mtime(tofile) - with open(fromfile, 'U') as ff: + with open(fromfile) as ff: fromlines = ff.readlines() - with open(tofile, 'U') as tf: + with open(tofile) as tf: tolines = tf.readlines() if options.u: diff --git a/Tools/scripts/ndiff.py b/Tools/scripts/ndiff.py --- a/Tools/scripts/ndiff.py +++ b/Tools/scripts/ndiff.py @@ -60,7 +60,7 @@ # couldn't be opened def fopen(fname): try: - return open(fname, 'U') + return open(fname) except IOError as detail: return fail("couldn't open " + fname + ": " + str(detail)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:19:52 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 21:19:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317810=3A_Fixed_NU?= =?utf-8?q?LL_check_in_=5FPyObject=5FGetItemsIter=28=29?= Message-ID: <3dRm885nF3z7LjR@mail.python.org> http://hg.python.org/cpython/rev/3e16c8c34e69 changeset: 87452:3e16c8c34e69 user: Christian Heimes date: Sat Nov 23 21:19:43 2013 +0100 summary: Issue #17810: Fixed NULL check in _PyObject_GetItemsIter() CID 1131948: Logically dead code (DEADCODE) files: Objects/typeobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -3693,7 +3693,7 @@ } else { *listitems = PyObject_GetIter(obj); - if (listitems == NULL) + if (*listitems == NULL) return -1; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:21:46 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sat, 23 Nov 2013 21:21:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_gcc_doesn=27t_realize_that?= =?utf-8?q?_dummy_is_always_initialized_by_the_function_call?= Message-ID: <3dRmBL0WxZz7LjR@mail.python.org> http://hg.python.org/cpython/rev/6eca53cf005d changeset: 87453:6eca53cf005d user: Gregory P. Smith date: Sat Nov 23 20:21:28 2013 +0000 summary: gcc doesn't realize that dummy is always initialized by the function call and warns about potential uninitialized use. Silence that by initializing it to null. 
files: Modules/_pickle.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1132,7 +1132,7 @@ (self->frame_end_idx == -1 || self->frame_end_idx <= self->next_read_idx)) { /* Need to read new frame */ - char *dummy; + char *dummy = NULL; unsigned char *frame_start; size_t frame_len; if (_Unpickler_ReadUnframed(self, &dummy, FRAME_HEADER_SIZE) < 0) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:27:41 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 21:27:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319641=3A_Added_th?= =?utf-8?q?e_audioop=2Ebyteswap=28=29_function_to_convert_big-endian?= Message-ID: <3dRmK91KxZz7Lr2@mail.python.org> http://hg.python.org/cpython/rev/fbebc90abcd1 changeset: 87454:fbebc90abcd1 user: Serhiy Storchaka date: Sat Nov 23 22:26:01 2013 +0200 summary: Issue #19641: Added the audioop.byteswap() function to convert big-endian samples to little-endian and vice versa. files: Doc/library/audioop.rst | 8 ++++ Doc/whatsnew/3.4.rst | 3 + Lib/test/audiotests.py | 18 --------- Lib/test/test_aifc.py | 5 +- Lib/test/test_audioop.py | 17 +++++++++ Lib/test/test_sunau.py | 3 +- Lib/test/test_wave.py | 18 ++------- Lib/wave.py | 51 ++++----------------------- Misc/NEWS | 3 + Modules/audioop.c | 32 +++++++++++++++++ 10 files changed, 80 insertions(+), 78 deletions(-) diff --git a/Doc/library/audioop.rst b/Doc/library/audioop.rst --- a/Doc/library/audioop.rst +++ b/Doc/library/audioop.rst @@ -77,6 +77,14 @@ sample. Samples wrap around in case of overflow. +.. function:: byteswap(fragment, width) + + "Byteswap" all samples in a fragment and returns the modified fragment. + Converts big-endian samples to little-endian and vice versa. + + .. versionadded: 3.4 + + .. function:: cross(fragment, width) Return the number of zero crossings in the fragment passed as an argument. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -415,6 +415,9 @@ Added support for 24-bit samples (:issue:`12866`). +Added the :func:`~audioop.byteswap` function to convert big-endian samples +to little-endian and vice versa (:issue:`19641`). 
+ base64 ------ diff --git a/Lib/test/audiotests.py b/Lib/test/audiotests.py --- a/Lib/test/audiotests.py +++ b/Lib/test/audiotests.py @@ -5,24 +5,6 @@ import pickle import sys -def byteswap2(data): - a = array.array('h') - a.frombytes(data) - a.byteswap() - return a.tobytes() - -def byteswap3(data): - ba = bytearray(data) - ba[::3] = data[2::3] - ba[2::3] = data[::3] - return bytes(ba) - -def byteswap4(data): - a = array.array('i') - a.frombytes(data) - a.byteswap() - return a.tobytes() - class UnseekableIO(io.FileIO): def tell(self): raise io.UnsupportedOperation diff --git a/Lib/test/test_aifc.py b/Lib/test/test_aifc.py --- a/Lib/test/test_aifc.py +++ b/Lib/test/test_aifc.py @@ -1,6 +1,7 @@ from test.support import findfile, TESTFN, unlink import unittest from test import audiotests +from audioop import byteswap import os import io import sys @@ -122,7 +123,7 @@ E5040CBC 617C0A3C 08BC0A3C 2C7C0B3C 517C0E3C 8A8410FC B6840EBC 457C0A3C \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap2(frames) + frames = byteswap(frames, 2) class AifcALAWTest(AifcTest, unittest.TestCase): @@ -143,7 +144,7 @@ E4800CC0 62000A40 08C00A40 2B000B40 52000E40 8A001180 B6000EC0 46000A40 \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap2(frames) + frames = byteswap(frames, 2) class AifcMiscTest(audiotests.AudioTests, unittest.TestCase): diff --git a/Lib/test/test_audioop.py b/Lib/test/test_audioop.py --- a/Lib/test/test_audioop.py +++ b/Lib/test/test_audioop.py @@ -448,6 +448,23 @@ self.assertEqual(audioop.getsample(data, w, 3), maxvalues[w]) self.assertEqual(audioop.getsample(data, w, 4), minvalues[w]) + def test_byteswap(self): + swapped_datas = { + 1: datas[1], + 2: packs[2](0, 0x3412, 0x6745, -0x6646, -0x81, 0x80, -1), + 3: packs[3](0, 0x563412, -0x7698bb, 0x7798ba, -0x81, 0x80, -1), + 4: packs[4](0, 0x78563412, -0x547698bb, 0x557698ba, + -0x81, 0x80, -1), + } + for w in 1, 2, 3, 4: + self.assertEqual(audioop.byteswap(b'', w), b'') + self.assertEqual(audioop.byteswap(datas[w], w), swapped_datas[w]) + self.assertEqual(audioop.byteswap(swapped_datas[w], w), datas[w]) + self.assertEqual(audioop.byteswap(bytearray(datas[w]), w), + swapped_datas[w]) + self.assertEqual(audioop.byteswap(memoryview(datas[w]), w), + swapped_datas[w]) + def test_negativelen(self): # from issue 3306, previously it segfaulted self.assertRaises(audioop.error, diff --git a/Lib/test/test_sunau.py b/Lib/test/test_sunau.py --- a/Lib/test/test_sunau.py +++ b/Lib/test/test_sunau.py @@ -1,6 +1,7 @@ from test.support import TESTFN import unittest from test import audiotests +from audioop import byteswap import sys import sunau @@ -124,7 +125,7 @@ E5040CBC 617C0A3C 08BC0A3C 2C7C0B3C 517C0E3C 8A8410FC B6840EBC 457C0A3C \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap2(frames) + frames = byteswap(frames, 2) if __name__ == "__main__": diff --git a/Lib/test/test_wave.py b/Lib/test/test_wave.py --- a/Lib/test/test_wave.py +++ b/Lib/test/test_wave.py @@ -1,6 +1,7 @@ from test.support import TESTFN import unittest from test import audiotests +from audioop import byteswap import sys import wave @@ -46,13 +47,7 @@ E4B50CEB 63440A5A 08CA0A1F 2BBA0B0B 51460E47 8BCB113C B6F50EEA 44150A59 \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap2(frames) - - if sys.byteorder == 'big': - @unittest.expectedFailure - def test_unseekable_incompleted_write(self): - super().test_unseekable_incompleted_write() - + frames = byteswap(frames, 2) class WavePCM24Test(audiotests.AudioWriteTests, @@ -82,7 +77,7 @@ 
51486F0E44E1 8BCC64113B05 B6F4EC0EEB36 4413170A5B48 \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap3(frames) + frames = byteswap(frames, 3) class WavePCM32Test(audiotests.AudioWriteTests, @@ -112,12 +107,7 @@ 51486F800E44E190 8BCC6480113B0580 B6F4EC000EEB3630 441317800A5B48A0 \ """) if sys.byteorder != 'big': - frames = audiotests.byteswap4(frames) - - if sys.byteorder == 'big': - @unittest.expectedFailure - def test_unseekable_incompleted_write(self): - super().test_unseekable_incompleted_write() + frames = byteswap(frames, 4) if __name__ == '__main__': diff --git a/Lib/wave.py b/Lib/wave.py --- a/Lib/wave.py +++ b/Lib/wave.py @@ -82,17 +82,12 @@ _array_fmts = None, 'b', 'h', None, 'i' +import audioop import struct import sys from chunk import Chunk from collections import namedtuple -def _byteswap3(data): - ba = bytearray(data) - ba[::3] = data[2::3] - ba[2::3] = data[::3] - return bytes(ba) - _wave_params = namedtuple('_wave_params', 'nchannels sampwidth framerate nframes comptype compname') @@ -243,29 +238,9 @@ self._data_seek_needed = 0 if nframes == 0: return b'' - if self._sampwidth in (2, 4) and sys.byteorder == 'big': - # unfortunately the fromfile() method does not take - # something that only looks like a file object, so - # we have to reach into the innards of the chunk object - import array - chunk = self._data_chunk - data = array.array(_array_fmts[self._sampwidth]) - assert data.itemsize == self._sampwidth - nitems = nframes * self._nchannels - if nitems * self._sampwidth > chunk.chunksize - chunk.size_read: - nitems = (chunk.chunksize - chunk.size_read) // self._sampwidth - data.fromfile(chunk.file.file, nitems) - # "tell" data chunk how much was read - chunk.size_read = chunk.size_read + nitems * self._sampwidth - # do the same for the outermost chunk - chunk = chunk.file - chunk.size_read = chunk.size_read + nitems * self._sampwidth - data.byteswap() - data = data.tobytes() - else: - data = self._data_chunk.read(nframes * self._framesize) - if self._sampwidth == 3 and sys.byteorder == 'big': - data = _byteswap3(data) + data = self._data_chunk.read(nframes * self._framesize) + if self._sampwidth != 1 and sys.byteorder == 'big': + data = audioop.byteswap(data, self._sampwidth) if self._convert and data: data = self._convert(data) self._soundpos = self._soundpos + len(data) // (self._nchannels * self._sampwidth) @@ -441,20 +416,10 @@ nframes = len(data) // (self._sampwidth * self._nchannels) if self._convert: data = self._convert(data) - if self._sampwidth in (2, 4) and sys.byteorder == 'big': - import array - a = array.array(_array_fmts[self._sampwidth]) - a.frombytes(data) - data = a - assert data.itemsize == self._sampwidth - data.byteswap() - data.tofile(self._file) - self._datawritten = self._datawritten + len(data) * self._sampwidth - else: - if self._sampwidth == 3 and sys.byteorder == 'big': - data = _byteswap3(data) - self._file.write(data) - self._datawritten = self._datawritten + len(data) + if self._sampwidth != 1 and sys.byteorder == 'big': + data = audioop.byteswap(data, self._sampwidth) + self._file.write(data) + self._datawritten += len(data) self._nframeswritten = self._nframeswritten + nframes def writeframes(self, data): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #19641: Added the audioop.byteswap() function to convert big-endian + samples to little-endian and vice versa. + - Issue #15204: Deprecated the 'U' mode in file-like objects. 
- Issue #17810: Implement PEP 3154, pickle protocol 4. diff --git a/Modules/audioop.c b/Modules/audioop.c --- a/Modules/audioop.c +++ b/Modules/audioop.c @@ -1108,6 +1108,37 @@ } static PyObject * +audioop_byteswap(PyObject *self, PyObject *args) +{ + Py_buffer view; + unsigned char *ncp; + Py_ssize_t i; + int size; + PyObject *rv = NULL; + + if (!PyArg_ParseTuple(args, "y*i:swapbytes", + &view, &size)) + return NULL; + + if (!audioop_check_parameters(view.len, size)) + goto exit; + + rv = PyBytes_FromStringAndSize(NULL, view.len); + if (rv == NULL) + goto exit; + ncp = (unsigned char *)PyBytes_AsString(rv); + + for (i = 0; i < view.len; i += size) { + int j; + for (j = 0; j < size; j++) + ncp[i + size - 1 - j] = ((unsigned char *)view.buf)[i + j]; + } + exit: + PyBuffer_Release(&view); + return rv; +} + +static PyObject * audioop_lin2lin(PyObject *self, PyObject *args) { Py_buffer view; @@ -1698,6 +1729,7 @@ { "tostereo", audioop_tostereo, METH_VARARGS }, { "getsample", audioop_getsample, METH_VARARGS }, { "reverse", audioop_reverse, METH_VARARGS }, + { "byteswap", audioop_byteswap, METH_VARARGS }, { "ratecv", audioop_ratecv, METH_VARARGS }, { 0, 0 } }; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:38:13 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 21:38:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_test=5Fpickle=3A_speed_up_?= =?utf-8?q?test=5Flong?= Message-ID: <3dRmYK1tXXz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/e6f13c40a020 changeset: 87455:e6f13c40a020 user: Antoine Pitrou date: Sat Nov 23 21:20:49 2013 +0100 summary: test_pickle: speed up test_long files: Lib/test/pickletester.py | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -722,7 +722,11 @@ for n in nbase, -nbase: p = self.dumps(n, 2) got = self.loads(p) - self.assert_is_copy(n, got) + # assert_is_copy is very expensive here as it precomputes + # a failure message by computing the repr() of n and got, + # we just do the check ourselves. 
+ self.assertIs(type(got), int) + self.assertEqual(n, got) def test_float(self): test_values = [0.0, 4.94e-324, 1e-310, 7e-308, 6.626e-34, 0.1, 0.5, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:38:14 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sat, 23 Nov 2013 21:38:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_signed_/_unsigned_comp?= =?utf-8?q?arison?= Message-ID: <3dRmYL3gPzz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/c3fd79b17983 changeset: 87456:c3fd79b17983 user: Antoine Pitrou date: Sat Nov 23 21:34:04 2013 +0100 summary: Fix signed / unsigned comparison files: Modules/_pickle.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1165,7 +1165,7 @@ PyErr_Format(UnpicklingError, "Invalid frame length"); return -1; } - if (frame_len < n) { + if ((Py_ssize_t) frame_len < n) { PyErr_Format(UnpicklingError, "Bad framing"); return -1; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:41:04 2013 From: python-checkins at python.org (gregory.p.smith) Date: Sat, 23 Nov 2013 21:41:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_test=5Fpickletools=2Ep?= =?utf-8?q?y_doctest=27s_on_32-bit_platforms=2E__I_hate_doctests=2E?= Message-ID: <3dRmcc45qKzQ3P@mail.python.org> http://hg.python.org/cpython/rev/eb6fd9e96015 changeset: 87457:eb6fd9e96015 user: Gregory P. Smith date: Sat Nov 23 20:40:46 2013 +0000 summary: Fix test_pickletools.py doctest's on 32-bit platforms. I hate doctests. files: Lib/pickletools.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/pickletools.py b/Lib/pickletools.py --- a/Lib/pickletools.py +++ b/Lib/pickletools.py @@ -562,15 +562,16 @@ def read_bytes8(f): r""" - >>> import io + >>> import io, struct, sys >>> read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x00\x00abc")) b'' >>> read_bytes8(io.BytesIO(b"\x03\x00\x00\x00\x00\x00\x00\x00abcdef")) b'abc' - >>> read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x03\x00abcdef")) + >>> bigsize8 = struct.pack(">> read_bytes8(io.BytesIO(bigsize8 + b"abcdef")) #doctest: +ELLIPSIS Traceback (most recent call last): ... - ValueError: expected 844424930131968 bytes in a bytes8, but only 6 remain + ValueError: expected ... bytes in a bytes8, but only 6 remain """ n = read_uint8(f) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:45:27 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 21:45:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2313592=3A_Improved?= =?utf-8?q?_the_repr_for_regular_expression_pattern_objects=2E?= Message-ID: <3dRmjg12fTz7LjR@mail.python.org> http://hg.python.org/cpython/rev/8c00677da6c0 changeset: 87458:8c00677da6c0 parent: 87456:c3fd79b17983 user: Serhiy Storchaka date: Sat Nov 23 22:42:43 2013 +0200 summary: Issue #13592: Improved the repr for regular expression pattern objects. Based on patch by Hugo Lopes Tavares. 
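For readers skimming the archive, a minimal sketch of the new behaviour, derived
from the PatternReprTests added in this patch (the example itself is not part of
the original commit message):

    import re

    p = re.compile('random pattern', re.IGNORECASE)
    # Before this change the pattern object used the default object repr;
    # with the patch applied, repr() round-trips the pattern and its flags:
    print(repr(p))    # re.compile('random pattern', re.IGNORECASE)

Patterns compiled with only re.UNICODE (the default for str patterns) omit that
flag, and unknown flag bits are rendered in hex, as exercised by the tests below.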
files: Lib/sre_constants.py | 2 + Lib/test/test_re.py | 62 +++++++++++++++++++++ Misc/NEWS | 3 + Modules/_sre.c | 82 ++++++++++++++++++++++++++++- Modules/sre_constants.h | 2 + 5 files changed, 150 insertions(+), 1 deletions(-) diff --git a/Lib/sre_constants.py b/Lib/sre_constants.py --- a/Lib/sre_constants.py +++ b/Lib/sre_constants.py @@ -250,6 +250,8 @@ f.write("#define SRE_FLAG_DOTALL %d\n" % SRE_FLAG_DOTALL) f.write("#define SRE_FLAG_UNICODE %d\n" % SRE_FLAG_UNICODE) f.write("#define SRE_FLAG_VERBOSE %d\n" % SRE_FLAG_VERBOSE) + f.write("#define SRE_FLAG_DEBUG %d\n" % SRE_FLAG_DEBUG) + f.write("#define SRE_FLAG_ASCII %d\n" % SRE_FLAG_ASCII) f.write("#define SRE_INFO_PREFIX %d\n" % SRE_INFO_PREFIX) f.write("#define SRE_INFO_LITERAL %d\n" % SRE_INFO_LITERAL) diff --git a/Lib/test/test_re.py b/Lib/test/test_re.py --- a/Lib/test/test_re.py +++ b/Lib/test/test_re.py @@ -1164,6 +1164,68 @@ self.assertEqual(m.group(2), "y") +class PatternReprTests(unittest.TestCase): + def check(self, pattern, expected): + self.assertEqual(repr(re.compile(pattern)), expected) + + def check_flags(self, pattern, flags, expected): + self.assertEqual(repr(re.compile(pattern, flags)), expected) + + def test_without_flags(self): + self.check('random pattern', + "re.compile('random pattern')") + + def test_single_flag(self): + self.check_flags('random pattern', re.IGNORECASE, + "re.compile('random pattern', re.IGNORECASE)") + + def test_multiple_flags(self): + self.check_flags('random pattern', re.I|re.S|re.X, + "re.compile('random pattern', " + "re.IGNORECASE|re.DOTALL|re.VERBOSE)") + + def test_unicode_flag(self): + self.check_flags('random pattern', re.U, + "re.compile('random pattern')") + self.check_flags('random pattern', re.I|re.S|re.U, + "re.compile('random pattern', " + "re.IGNORECASE|re.DOTALL)") + + def test_inline_flags(self): + self.check('(?i)pattern', + "re.compile('(?i)pattern', re.IGNORECASE)") + + def test_unknown_flags(self): + self.check_flags('random pattern', 0x123000, + "re.compile('random pattern', 0x123000)") + self.check_flags('random pattern', 0x123000|re.I, + "re.compile('random pattern', re.IGNORECASE|0x123000)") + + def test_bytes(self): + self.check(b'bytes pattern', + "re.compile(b'bytes pattern')") + self.check_flags(b'bytes pattern', re.A, + "re.compile(b'bytes pattern', re.ASCII)") + + def test_quotes(self): + self.check('random "double quoted" pattern', + '''re.compile('random "double quoted" pattern')''') + self.check("random 'single quoted' pattern", + '''re.compile("random 'single quoted' pattern")''') + self.check('''both 'single' and "double" quotes''', + '''re.compile('both \\'single\\' and "double" quotes')''') + + def test_long_pattern(self): + pattern = 'Very %spattern' % ('long ' * 1000) + r = repr(re.compile(pattern)) + self.assertLess(len(r), 300) + self.assertEqual(r[:30], "re.compile('Very long long lon") + r = repr(re.compile(pattern, re.I)) + self.assertLess(len(r), 300) + self.assertEqual(r[:30], "re.compile('Very long long lon") + self.assertEqual(r[-16:], ", re.IGNORECASE)") + + class ImplementationTest(unittest.TestCase): """ Test implementation details of the re module. diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #13592: Improved the repr for regular expression pattern objects. + Based on patch by Hugo Lopes Tavares. + - Issue #19641: Added the audioop.byteswap() function to convert big-endian samples to little-endian and vice versa. 
diff --git a/Modules/_sre.c b/Modules/_sre.c --- a/Modules/_sre.c +++ b/Modules/_sre.c @@ -1139,6 +1139,86 @@ #endif } +static PyObject * +pattern_repr(PatternObject *obj) +{ + static const struct { + const char *name; + int value; + } flag_names[] = { + {"re.TEMPLATE", SRE_FLAG_TEMPLATE}, + {"re.IGNORECASE", SRE_FLAG_IGNORECASE}, + {"re.LOCALE", SRE_FLAG_LOCALE}, + {"re.MULTILINE", SRE_FLAG_MULTILINE}, + {"re.DOTALL", SRE_FLAG_DOTALL}, + {"re.UNICODE", SRE_FLAG_UNICODE}, + {"re.VERBOSE", SRE_FLAG_VERBOSE}, + {"re.DEBUG", SRE_FLAG_DEBUG}, + {"re.ASCII", SRE_FLAG_ASCII}, + }; + PyObject *result = NULL; + PyObject *flag_items; + int i; + int flags = obj->flags; + + /* Omit re.UNICODE for valid string patterns. */ + if (obj->isbytes == 0 && + (flags & (SRE_FLAG_LOCALE|SRE_FLAG_UNICODE|SRE_FLAG_ASCII)) == + SRE_FLAG_UNICODE) + flags &= ~SRE_FLAG_UNICODE; + + flag_items = PyList_New(0); + if (!flag_items) + return NULL; + + for (i = 0; i < Py_ARRAY_LENGTH(flag_names); i++) { + if (flags & flag_names[i].value) { + PyObject *item = PyUnicode_FromString(flag_names[i].name); + if (!item) + goto done; + + if (PyList_Append(flag_items, item) < 0) { + Py_DECREF(item); + goto done; + } + Py_DECREF(item); + flags &= ~flag_names[i].value; + } + } + if (flags) { + PyObject *item = PyUnicode_FromFormat("0x%x", flags); + if (!item) + goto done; + + if (PyList_Append(flag_items, item) < 0) { + Py_DECREF(item); + goto done; + } + Py_DECREF(item); + } + + if (PyList_Size(flag_items) > 0) { + PyObject *flags_result; + PyObject *sep = PyUnicode_FromString("|"); + if (!sep) + goto done; + flags_result = PyUnicode_Join(sep, flag_items); + Py_DECREF(sep); + if (!flags_result) + goto done; + result = PyUnicode_FromFormat("re.compile(%.200R, %S)", + obj->pattern, flags_result); + Py_DECREF(flags_result); + } + else { + result = PyUnicode_FromFormat("re.compile(%.200R)", obj->pattern); + } + +done: + Py_DECREF(flag_items); + return result; +} + PyDoc_STRVAR(pattern_match_doc, "match(string[, pos[, endpos]]) -> match object or None.\n\ Matches zero or more characters at the beginning of the string"); @@ -1214,7 +1294,7 @@ 0, /* tp_getattr */ 0, /* tp_setattr */ 0, /* tp_reserved */ - 0, /* tp_repr */ + (reprfunc)pattern_repr, /* tp_repr */ 0, /* tp_as_number */ 0, /* tp_as_sequence */ 0, /* tp_as_mapping */ diff --git a/Modules/sre_constants.h b/Modules/sre_constants.h --- a/Modules/sre_constants.h +++ b/Modules/sre_constants.h @@ -81,6 +81,8 @@ #define SRE_FLAG_DOTALL 16 #define SRE_FLAG_UNICODE 32 #define SRE_FLAG_VERBOSE 64 +#define SRE_FLAG_DEBUG 128 +#define SRE_FLAG_ASCII 256 #define SRE_INFO_PREFIX 1 #define SRE_INFO_LITERAL 2 #define SRE_INFO_CHARSET 4 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 21:45:28 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 21:45:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dRmjh2ryFz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/68b6014b3f20 changeset: 87459:68b6014b3f20 parent: 87458:8c00677da6c0 parent: 87457:eb6fd9e96015 user: Serhiy Storchaka date: Sat Nov 23 22:45:06 2013 +0200 summary: Merge heads files: Lib/pickletools.py | 7 ++++--- 1 files changed, 4 insertions(+), 3 deletions(-) diff --git a/Lib/pickletools.py b/Lib/pickletools.py --- a/Lib/pickletools.py +++ b/Lib/pickletools.py @@ -562,15 +562,16 @@ def read_bytes8(f): r""" - >>> import io + >>> import io, struct, sys >>> 
read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x00\x00abc")) b'' >>> read_bytes8(io.BytesIO(b"\x03\x00\x00\x00\x00\x00\x00\x00abcdef")) b'abc' - >>> read_bytes8(io.BytesIO(b"\x00\x00\x00\x00\x00\x00\x03\x00abcdef")) + >>> bigsize8 = struct.pack(">> read_bytes8(io.BytesIO(bigsize8 + b"abcdef")) #doctest: +ELLIPSIS Traceback (most recent call last): ... - ValueError: expected 844424930131968 bytes in a bytes8, but only 6 remain + ValueError: expected ... bytes in a bytes8, but only 6 remain """ n = read_uint8(f) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 22:10:26 2013 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 23 Nov 2013 22:10:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Package_=5Foverlapped=2E?= Message-ID: <3dRnGV4kX5z7LjR@mail.python.org> http://hg.python.org/cpython/rev/83ab827ea2be changeset: 87460:83ab827ea2be user: Martin v. L?wis date: Sat Nov 23 22:02:00 2013 +0100 summary: Package _overlapped. files: Tools/msi/msi.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Tools/msi/msi.py b/Tools/msi/msi.py --- a/Tools/msi/msi.py +++ b/Tools/msi/msi.py @@ -102,6 +102,7 @@ '_testbuffer.pyd', '_sha3.pyd', '_testimportmultiple.pyd', + '_overlapped.pyd', ] # Well-known component UUIDs -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 22:20:54 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 23 Nov 2013 22:20:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2316203=3A_Add_re?= =?utf-8?q?=2Efullmatch=28=29_function_and_regex=2Efullmatch=28=29_method?= =?utf-8?q?=2C?= Message-ID: <3dRnVZ6DpXz7Llr@mail.python.org> http://hg.python.org/cpython/rev/89dfa2671c83 changeset: 87461:89dfa2671c83 user: Serhiy Storchaka date: Sat Nov 23 23:20:30 2013 +0200 summary: Issue #16203: Add re.fullmatch() function and regex.fullmatch() method, which anchor the pattern at both ends of the string to match. Original patch by Matthew Barnett. files: Doc/library/re.rst | 30 ++++++++++++- Doc/whatsnew/3.4.rst | 7 ++ Lib/re.py | 28 +++++++---- Lib/test/test_re.py | 30 ++++++++++++ Misc/NEWS | 4 + Modules/_sre.c | 77 ++++++++++++++++++++++++------- Modules/sre.h | 1 + Modules/sre_lib.h | 28 ++++++++-- 8 files changed, 168 insertions(+), 37 deletions(-) diff --git a/Doc/library/re.rst b/Doc/library/re.rst --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -481,7 +481,7 @@ .. note:: The compiled versions of the most recent patterns passed to - :func:`re.match`, :func:`re.search` or :func:`re.compile` are cached, so + :func:`re.compile` and the module-level matching functions are cached, so programs that use only a few regular expressions at a time needn't worry about compiling regular expressions. @@ -584,6 +584,16 @@ instead (see also :ref:`search-vs-match`). +.. function:: fullmatch(pattern, string, flags=0) + + If the whole *string* matches the regular expression *pattern*, return a + corresponding :ref:`match object `. Return ``None`` if the + string does not match the pattern; note that this is different from a + zero-length match. + + .. versionadded:: 3.4 + + .. function:: split(pattern, string, maxsplit=0, flags=0) Split *string* by the occurrences of *pattern*. If capturing parentheses are @@ -778,6 +788,24 @@ :meth:`~regex.search` instead (see also :ref:`search-vs-match`). +.. 
method:: regex.fullmatch(string[, pos[, endpos]]) + + If the whole *string* matches this regular expression, return a corresponding + :ref:`match object `. Return ``None`` if the string does not + match the pattern; note that this is different from a zero-length match. + + The optional *pos* and *endpos* parameters have the same meaning as for the + :meth:`~regex.search` method. + + >>> pattern = re.compile("o[gh]") + >>> pattern.fullmatch("dog") # No match as "o" is not at the start of "dog". + >>> pattern.fullmatch("ogre") # No match as not the full string matches. + >>> pattern.fullmatch("doggie", 1, 3) # Matches within given limits. + <_sre.SRE_Match object at ...> + + .. versionadded:: 3.4 + + .. method:: regex.split(string, maxsplit=0) Identical to the :func:`split` function, using the compiled pattern. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -647,6 +647,13 @@ the :mod:`inspect` module. +re +-- + +Added :func:`re.fullmatch` function and :meth:`regex.fullmatch` method, +which anchor the pattern at both ends of the string to match. +(Contributed by Matthew Barnett in :issue:`16203`.) + resource -------- diff --git a/Lib/re.py b/Lib/re.py --- a/Lib/re.py +++ b/Lib/re.py @@ -85,16 +85,17 @@ \\ Matches a literal backslash. This module exports the following functions: - match Match a regular expression pattern to the beginning of a string. - search Search a string for the presence of a pattern. - sub Substitute occurrences of a pattern found in a string. - subn Same as sub, but also return the number of substitutions made. - split Split a string by the occurrences of a pattern. - findall Find all occurrences of a pattern in a string. - finditer Return an iterator yielding a match object for each match. - compile Compile a pattern into a RegexObject. - purge Clear the regular expression cache. - escape Backslash all non-alphanumerics in a string. + match Match a regular expression pattern to the beginning of a string. + fullmatch Match a regular expression pattern to all of a string. + search Search a string for the presence of a pattern. + sub Substitute occurrences of a pattern found in a string. + subn Same as sub, but also return the number of substitutions made. + split Split a string by the occurrences of a pattern. + findall Find all occurrences of a pattern in a string. + finditer Return an iterator yielding a match object for each match. + compile Compile a pattern into a RegexObject. + purge Clear the regular expression cache. + escape Backslash all non-alphanumerics in a string. 
Some of the functions in this module takes flags as optional parameters: A ASCII For string patterns, make \w, \W, \b, \B, \d, \D @@ -123,7 +124,7 @@ import sre_parse # public symbols -__all__ = [ "match", "search", "sub", "subn", "split", "findall", +__all__ = [ "match", "fullmatch", "search", "sub", "subn", "split", "findall", "compile", "purge", "template", "escape", "A", "I", "L", "M", "S", "X", "U", "ASCII", "IGNORECASE", "LOCALE", "MULTILINE", "DOTALL", "VERBOSE", "UNICODE", "error" ] @@ -154,6 +155,11 @@ a match object, or None if no match was found.""" return _compile(pattern, flags).match(string) +def fullmatch(pattern, string, flags=0): + """Try to apply the pattern to all of the string, returning + a match object, or None if no match was found.""" + return _compile(pattern, flags).fullmatch(string) + def search(pattern, string, flags=0): """Scan through string looking for a match to the pattern, returning a match object, or None if no match was found.""" diff --git a/Lib/test/test_re.py b/Lib/test/test_re.py --- a/Lib/test/test_re.py +++ b/Lib/test/test_re.py @@ -349,6 +349,36 @@ (None, 'b', None)) self.assertEqual(pat.match('ac').group(1, 'b2', 3), ('a', None, 'c')) + def test_re_fullmatch(self): + # Issue 16203: Proposal: add re.fullmatch() method. + self.assertEqual(re.fullmatch(r"a", "a").span(), (0, 1)) + for string in "ab", S("ab"): + self.assertEqual(re.fullmatch(r"a|ab", string).span(), (0, 2)) + for string in b"ab", B(b"ab"), bytearray(b"ab"), memoryview(b"ab"): + self.assertEqual(re.fullmatch(br"a|ab", string).span(), (0, 2)) + for a, b in "\xe0\xdf", "\u0430\u0431", "\U0001d49c\U0001d49e": + r = r"%s|%s" % (a, a + b) + self.assertEqual(re.fullmatch(r, a + b).span(), (0, 2)) + self.assertEqual(re.fullmatch(r".*?$", "abc").span(), (0, 3)) + self.assertEqual(re.fullmatch(r".*?", "abc").span(), (0, 3)) + self.assertEqual(re.fullmatch(r"a.*?b", "ab").span(), (0, 2)) + self.assertEqual(re.fullmatch(r"a.*?b", "abb").span(), (0, 3)) + self.assertEqual(re.fullmatch(r"a.*?b", "axxb").span(), (0, 4)) + self.assertIsNone(re.fullmatch(r"a+", "ab")) + self.assertIsNone(re.fullmatch(r"abc$", "abc\n")) + self.assertIsNone(re.fullmatch(r"abc\Z", "abc\n")) + self.assertIsNone(re.fullmatch(r"(?m)abc$", "abc\n")) + self.assertEqual(re.fullmatch(r"ab(?=c)cd", "abcd").span(), (0, 4)) + self.assertEqual(re.fullmatch(r"ab(?<=b)cd", "abcd").span(), (0, 4)) + self.assertEqual(re.fullmatch(r"(?=a|ab)ab", "ab").span(), (0, 2)) + + self.assertEqual( + re.compile(r"bc").fullmatch("abcd", pos=1, endpos=3).span(), (1, 3)) + self.assertEqual( + re.compile(r".*?$").fullmatch("abcd", pos=1, endpos=3).span(), (1, 3)) + self.assertEqual( + re.compile(r".*?").fullmatch("abcd", pos=1, endpos=3).span(), (1, 3)) + def test_re_groupref_exists(self): self.assertEqual(re.match('^(\()?([^()]+)(?(1)\))$', '(a)').groups(), ('(', 'a')) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,10 @@ Library ------- +- Issue #16203: Add re.fullmatch() function and regex.fullmatch() method, + which anchor the pattern at both ends of the string to match. + Original patch by Matthew Barnett. + - Issue #13592: Improved the repr for regular expression pattern objects. Based on patch by Hugo Lopes Tavares. 
diff --git a/Modules/_sre.c b/Modules/_sre.c --- a/Modules/_sre.c +++ b/Modules/_sre.c @@ -4,24 +4,25 @@ * regular expression matching engine * * partial history: - * 1999-10-24 fl created (based on existing template matcher code) - * 2000-03-06 fl first alpha, sort of - * 2000-08-01 fl fixes for 1.6b1 - * 2000-08-07 fl use PyOS_CheckStack() if available - * 2000-09-20 fl added expand method - * 2001-03-20 fl lots of fixes for 2.1b2 - * 2001-04-15 fl export copyright as Python attribute, not global - * 2001-04-28 fl added __copy__ methods (work in progress) - * 2001-05-14 fl fixes for 1.5.2 compatibility - * 2001-07-01 fl added BIGCHARSET support (from Martin von Loewis) - * 2001-10-18 fl fixed group reset issue (from Matthew Mueller) - * 2001-10-20 fl added split primitive; reenable unicode for 1.6/2.0/2.1 - * 2001-10-21 fl added sub/subn primitive - * 2001-10-24 fl added finditer primitive (for 2.2 only) - * 2001-12-07 fl fixed memory leak in sub/subn (Guido van Rossum) - * 2002-11-09 fl fixed empty sub/subn return type - * 2003-04-18 mvl fully support 4-byte codes - * 2003-10-17 gn implemented non recursive scheme + * 1999-10-24 fl created (based on existing template matcher code) + * 2000-03-06 fl first alpha, sort of + * 2000-08-01 fl fixes for 1.6b1 + * 2000-08-07 fl use PyOS_CheckStack() if available + * 2000-09-20 fl added expand method + * 2001-03-20 fl lots of fixes for 2.1b2 + * 2001-04-15 fl export copyright as Python attribute, not global + * 2001-04-28 fl added __copy__ methods (work in progress) + * 2001-05-14 fl fixes for 1.5.2 compatibility + * 2001-07-01 fl added BIGCHARSET support (from Martin von Loewis) + * 2001-10-18 fl fixed group reset issue (from Matthew Mueller) + * 2001-10-20 fl added split primitive; reenable unicode for 1.6/2.0/2.1 + * 2001-10-21 fl added sub/subn primitive + * 2001-10-24 fl added finditer primitive (for 2.2 only) + * 2001-12-07 fl fixed memory leak in sub/subn (Guido van Rossum) + * 2002-11-09 fl fixed empty sub/subn return type + * 2003-04-18 mvl fully support 4-byte codes + * 2003-10-17 gn implemented non recursive scheme + * 2013-02-04 mrab added fullmatch primitive * * Copyright (c) 1997-2001 by Secret Labs AB. All rights reserved. 
* @@ -559,6 +560,40 @@ } static PyObject* +pattern_fullmatch(PatternObject* self, PyObject* args, PyObject* kw) +{ + SRE_STATE state; + Py_ssize_t status; + + PyObject* string; + Py_ssize_t start = 0; + Py_ssize_t end = PY_SSIZE_T_MAX; + static char* kwlist[] = { "pattern", "pos", "endpos", NULL }; + if (!PyArg_ParseTupleAndKeywords(args, kw, "O|nn:fullmatch", kwlist, + &string, &start, &end)) + return NULL; + + string = state_init(&state, self, string, start, end); + if (!string) + return NULL; + + state.match_all = 1; + state.ptr = state.start; + + TRACE(("|%p|%p|FULLMATCH\n", PatternObject_GetCode(self), state.ptr)); + + status = sre_match(&state, PatternObject_GetCode(self)); + + TRACE(("|%p|%p|END\n", PatternObject_GetCode(self), state.ptr)); + if (PyErr_Occurred()) + return NULL; + + state_fini(&state); + + return pattern_new_match(self, &state, status); +} + +static PyObject* pattern_search(PatternObject* self, PyObject* args, PyObject* kw) { SRE_STATE state; @@ -1223,6 +1258,10 @@ "match(string[, pos[, endpos]]) -> match object or None.\n\ Matches zero or more characters at the beginning of the string"); +PyDoc_STRVAR(pattern_fullmatch_doc, +"fullmatch(string[, pos[, endpos]]) -> match object or None.\n\ + Matches against all of the string"); + PyDoc_STRVAR(pattern_search_doc, "search(string[, pos[, endpos]]) -> match object or None.\n\ Scan through string looking for a match, and return a corresponding\n\ @@ -1258,6 +1297,8 @@ static PyMethodDef pattern_methods[] = { {"match", (PyCFunction) pattern_match, METH_VARARGS|METH_KEYWORDS, pattern_match_doc}, + {"fullmatch", (PyCFunction) pattern_fullmatch, METH_VARARGS|METH_KEYWORDS, + pattern_fullmatch_doc}, {"search", (PyCFunction) pattern_search, METH_VARARGS|METH_KEYWORDS, pattern_search_doc}, {"sub", (PyCFunction) pattern_sub, METH_VARARGS|METH_KEYWORDS, diff --git a/Modules/sre.h b/Modules/sre.h --- a/Modules/sre.h +++ b/Modules/sre.h @@ -86,6 +86,7 @@ SRE_REPEAT *repeat; /* hooks */ SRE_TOLOWER_HOOK lower; + int match_all; } SRE_STATE; typedef struct { diff --git a/Modules/sre_lib.h b/Modules/sre_lib.h --- a/Modules/sre_lib.h +++ b/Modules/sre_lib.h @@ -454,17 +454,24 @@ #define JUMP_ASSERT 12 #define JUMP_ASSERT_NOT 13 -#define DO_JUMP(jumpvalue, jumplabel, nextpattern) \ +#define DO_JUMPX(jumpvalue, jumplabel, nextpattern, matchall) \ DATA_ALLOC(SRE(match_context), nextctx); \ nextctx->last_ctx_pos = ctx_pos; \ nextctx->jump = jumpvalue; \ nextctx->pattern = nextpattern; \ + nextctx->match_all = matchall; \ ctx_pos = alloc_pos; \ ctx = nextctx; \ goto entrance; \ jumplabel: \ while (0) /* gcc doesn't like labels at end of scopes */ \ +#define DO_JUMP(jumpvalue, jumplabel, nextpattern) \ + DO_JUMPX(jumpvalue, jumplabel, nextpattern, ctx->match_all) + +#define DO_JUMP0(jumpvalue, jumplabel, nextpattern) \ + DO_JUMPX(jumpvalue, jumplabel, nextpattern, 0) + typedef struct { Py_ssize_t last_ctx_pos; Py_ssize_t jump; @@ -477,6 +484,7 @@ SRE_CODE chr; SRE_REPEAT* rep; } u; + int match_all; } SRE(match_context); /* check if string matches the given pattern. 
returns <0 for @@ -499,6 +507,7 @@ ctx->last_ctx_pos = -1; ctx->jump = JUMP_NONE; ctx->pattern = pattern; + ctx->match_all = state->match_all; ctx_pos = alloc_pos; entrance: @@ -571,8 +580,11 @@ case SRE_OP_SUCCESS: /* end of pattern */ TRACE(("|%p|%p|SUCCESS\n", ctx->pattern, ctx->ptr)); - state->ptr = ctx->ptr; - RETURN_SUCCESS; + if (!ctx->match_all || ctx->ptr == state->end) { + state->ptr = ctx->ptr; + RETURN_SUCCESS; + } + RETURN_FAILURE; case SRE_OP_AT: /* match at given position */ @@ -726,7 +738,8 @@ if (ctx->count < (Py_ssize_t) ctx->pattern[1]) RETURN_FAILURE; - if (ctx->pattern[ctx->pattern[0]] == SRE_OP_SUCCESS) { + if (ctx->pattern[ctx->pattern[0]] == SRE_OP_SUCCESS && + (!ctx->match_all || ctx->ptr == state->end)) { /* tail is empty. we're finished */ state->ptr = ctx->ptr; RETURN_SUCCESS; @@ -810,7 +823,8 @@ ctx->ptr += ctx->count; } - if (ctx->pattern[ctx->pattern[0]] == SRE_OP_SUCCESS) { + if (ctx->pattern[ctx->pattern[0]] == SRE_OP_SUCCESS && + (!ctx->match_all || ctx->ptr == state->end)) { /* tail is empty. we're finished */ state->ptr = ctx->ptr; RETURN_SUCCESS; @@ -1082,7 +1096,7 @@ state->ptr = ctx->ptr - ctx->pattern[1]; if (state->ptr < state->beginning) RETURN_FAILURE; - DO_JUMP(JUMP_ASSERT, jump_assert, ctx->pattern+2); + DO_JUMP0(JUMP_ASSERT, jump_assert, ctx->pattern+2); RETURN_ON_FAILURE(ret); ctx->pattern += ctx->pattern[0]; break; @@ -1094,7 +1108,7 @@ ctx->ptr, ctx->pattern[1])); state->ptr = ctx->ptr - ctx->pattern[1]; if (state->ptr >= state->beginning) { - DO_JUMP(JUMP_ASSERT_NOT, jump_assert_not, ctx->pattern+2); + DO_JUMP0(JUMP_ASSERT_NOT, jump_assert_not, ctx->pattern+2); if (ret) { RETURN_ON_ERROR(ret); RETURN_FAILURE; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 22:24:58 2013 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 23 Nov 2013 22:24:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogQWRkIDIuNy43IHV1?= =?utf-8?q?ids=2E?= Message-ID: <3dRnbG1tQQz7LjR@mail.python.org> http://hg.python.org/cpython/rev/f075a7178108 changeset: 87462:f075a7178108 branch: 2.7 parent: 87432:ef4636faf8bd user: Martin v. L?wis date: Sun Nov 10 19:42:47 2013 +0100 summary: Add 2.7.7 uuids. files: Tools/msi/uuids.py | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Tools/msi/uuids.py b/Tools/msi/uuids.py --- a/Tools/msi/uuids.py +++ b/Tools/msi/uuids.py @@ -63,4 +63,6 @@ '2.7.5150':'{DBDD570E-0952-475f-9453-AB88F3DD5659}', # 2.7.5 '2.7.6121':'{D1EBC07F-A7B1-4163-83DB-AE813CEF392F}', # 2.7.6rc1 '2.7.6150':'{C3CC4DF5-39A5-4027-B136-2B3E1F5AB6E2}', # 2.7.6 + '2.7.7121':'{5E0D187D-238B-4e96-9C75-C4CF141F5385}', # 2.7.7rc1 + '2.7.7150':'{049CA433-77A0-4e48-AC76-180A282C4E10}', # 2.7.7 } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 22:44:02 2013 From: python-checkins at python.org (christian.heimes) Date: Sat, 23 Nov 2013 22:44:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319735=3A_Implemen?= =?utf-8?q?t_private_function_ssl=2E=5Fcreate=5Fstdlib=5Fcontext=28=29_to?= Message-ID: <3dRp1G1R2Yz7LjR@mail.python.org> http://hg.python.org/cpython/rev/03cc1b55faae changeset: 87463:03cc1b55faae parent: 87461:89dfa2671c83 user: Christian Heimes date: Sat Nov 23 22:43:47 2013 +0100 summary: Issue #19735: Implement private function ssl._create_stdlib_context() to create SSLContext objects in Python's stdlib module. It provides a single configuration point and makes use of SSLContext.load_default_certs(). 
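As a rough illustration of how the call sites below switch over to the new
helper (a sketch based on the diff that follows, not text from the original
commit message):

    import ssl

    # Stdlib default: PROTOCOL_SSLv23 with SSLv2 disabled and CERT_NONE,
    # replacing the hand-rolled contexts previously built in ftplib,
    # imaplib, poplib, smtplib, nntplib and http.client.
    ctx = ssl._create_stdlib_context()

    # Verifying client usage, as in urllib.request and asyncio: require
    # certificate verification and fall back to the system's default CA
    # certificates when no cafile/capath/cadata is supplied.
    verified_ctx = ssl._create_stdlib_context(cert_reqs=ssl.CERT_REQUIRED)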
files: Lib/asyncio/selector_events.py | 6 +- Lib/ftplib.py | 17 ++---- Lib/http/client.py | 4 +- Lib/imaplib.py | 12 ++--- Lib/nntplib.py | 4 +- Lib/poplib.py | 11 ++-- Lib/smtplib.py | 16 +++--- Lib/ssl.py | 50 +++++++++++++++++++-- Lib/test/test_ssl.py | 21 +++++++++ Lib/urllib/request.py | 10 +--- Misc/NEWS | 4 + 11 files changed, 100 insertions(+), 55 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -571,10 +571,8 @@ # context; in that case the sslcontext passed is None. # The default is the same as used by urllib with # cadefault=True. - sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - sslcontext.options |= ssl.OP_NO_SSLv2 - sslcontext.set_default_verify_paths() - sslcontext.verify_mode = ssl.CERT_REQUIRED + sslcontext = ssl._create_stdlib_context( + cert_reqs=ssl.CERT_REQUIRED) wrap_kwargs = { 'server_side': server_side, diff --git a/Lib/ftplib.py b/Lib/ftplib.py --- a/Lib/ftplib.py +++ b/Lib/ftplib.py @@ -727,6 +727,10 @@ "exclusive") self.keyfile = keyfile self.certfile = certfile + if context is None: + context = ssl._create_stdlib_context(self.ssl_version, + certfile=certfile, + keyfile=keyfile) self.context = context self._prot_p = False FTP.__init__(self, host, user, passwd, acct, timeout, source_address) @@ -744,12 +748,7 @@ resp = self.voidcmd('AUTH TLS') else: resp = self.voidcmd('AUTH SSL') - if self.context is not None: - self.sock = self.context.wrap_socket(self.sock) - else: - self.sock = ssl.wrap_socket(self.sock, self.keyfile, - self.certfile, - ssl_version=self.ssl_version) + self.sock = self.context.wrap_socket(self.sock) self.file = self.sock.makefile(mode='r', encoding=self.encoding) return resp @@ -788,11 +787,7 @@ def ntransfercmd(self, cmd, rest=None): conn, size = FTP.ntransfercmd(self, cmd, rest) if self._prot_p: - if self.context is not None: - conn = self.context.wrap_socket(conn) - else: - conn = ssl.wrap_socket(conn, self.keyfile, self.certfile, - ssl_version=self.ssl_version) + conn = self.context.wrap_socket(conn) return conn, size def abort(self): diff --git a/Lib/http/client.py b/Lib/http/client.py --- a/Lib/http/client.py +++ b/Lib/http/client.py @@ -1179,9 +1179,7 @@ self.key_file = key_file self.cert_file = cert_file if context is None: - # Some reasonable defaults - context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - context.options |= ssl.OP_NO_SSLv2 + context = ssl._create_stdlib_context() will_verify = context.verify_mode != ssl.CERT_NONE if check_hostname is None: check_hostname = will_verify diff --git a/Lib/imaplib.py b/Lib/imaplib.py --- a/Lib/imaplib.py +++ b/Lib/imaplib.py @@ -742,9 +742,7 @@ raise self.abort('TLS not supported by server') # Generate a default SSL context if none was passed. if ssl_context is None: - ssl_context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - # SSLv2 considered harmful. 
- ssl_context.options |= ssl.OP_NO_SSLv2 + ssl_context = ssl._create_stdlib_context() typ, dat = self._simple_command(name) if typ == 'OK': self.sock = ssl_context.wrap_socket(self.sock) @@ -1210,15 +1208,15 @@ self.keyfile = keyfile self.certfile = certfile + if ssl_context is None: + ssl_context = ssl._create_stdlib_context(certfile=certfile, + keyfile=keyfile) self.ssl_context = ssl_context IMAP4.__init__(self, host, port) def _create_socket(self): sock = IMAP4._create_socket(self) - if self.ssl_context: - return self.ssl_context.wrap_socket(sock) - else: - return ssl.wrap_socket(sock, self.keyfile, self.certfile) + return self.ssl_context.wrap_socket(sock) def open(self, host='', port=IMAP4_SSL_PORT): """Setup connection to remote server on "host:port". diff --git a/Lib/nntplib.py b/Lib/nntplib.py --- a/Lib/nntplib.py +++ b/Lib/nntplib.py @@ -288,9 +288,7 @@ """ # Generate a default SSL context if none was passed. if context is None: - context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - # SSLv2 considered harmful. - context.options |= ssl.OP_NO_SSLv2 + context = ssl._create_stdlib_context() return context.wrap_socket(sock) diff --git a/Lib/poplib.py b/Lib/poplib.py --- a/Lib/poplib.py +++ b/Lib/poplib.py @@ -385,8 +385,7 @@ if not 'STLS' in caps: raise error_proto('-ERR STLS not supported by server') if context is None: - context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - context.options |= ssl.OP_NO_SSLv2 + context = ssl._create_stdlib_context() resp = self._shortcmd('STLS') self.sock = context.wrap_socket(self.sock) self.file = self.sock.makefile('rb') @@ -421,15 +420,15 @@ "exclusive") self.keyfile = keyfile self.certfile = certfile + if context is None: + context = ssl._create_stdlib_context(certfile=certfile, + keyfile=keyfile) self.context = context POP3.__init__(self, host, port, timeout) def _create_socket(self, timeout): sock = POP3._create_socket(self, timeout) - if self.context is not None: - sock = self.context.wrap_socket(sock) - else: - sock = ssl.wrap_socket(sock, self.keyfile, self.certfile) + sock = self.context.wrap_socket(sock) return sock def stls(self, keyfile=None, certfile=None, context=None): diff --git a/Lib/smtplib.py b/Lib/smtplib.py --- a/Lib/smtplib.py +++ b/Lib/smtplib.py @@ -664,10 +664,10 @@ if context is not None and certfile is not None: raise ValueError("context and certfile arguments are mutually " "exclusive") - if context is not None: - self.sock = context.wrap_socket(self.sock) - else: - self.sock = ssl.wrap_socket(self.sock, keyfile, certfile) + if context is None: + context = ssl._create_stdlib_context(certfile=certfile, + keyfile=keyfile) + self.sock = context.wrap_socket(self.sock) self.file = None # RFC 3207: # The client MUST discard any knowledge obtained from @@ -880,6 +880,9 @@ "exclusive") self.keyfile = keyfile self.certfile = certfile + if context is None: + context = ssl._create_stdlib_context(certfile=certfile, + keyfile=keyfile) self.context = context SMTP.__init__(self, host, port, local_hostname, timeout, source_address) @@ -889,10 +892,7 @@ print('connect:', (host, port), file=stderr) new_socket = socket.create_connection((host, port), timeout, self.source_address) - if self.context is not None: - new_socket = self.context.wrap_socket(new_socket) - else: - new_socket = ssl.wrap_socket(new_socket, self.keyfile, self.certfile) + new_socket = self.context.wrap_socket(new_socket) return new_socket __all__.append("SMTP_SSL") diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -398,6 +398,43 @@ return context +def 
_create_stdlib_context(protocol=PROTOCOL_SSLv23, *, cert_reqs=None, + purpose=Purpose.SERVER_AUTH, + certfile=None, keyfile=None, + cafile=None, capath=None, cadata=None): + """Create a SSLContext object for Python stdlib modules + + All Python stdlib modules shall use this function to create SSLContext + objects in order to keep common settings in one place. The configuration + is less restrict than create_default_context()'s to increase backward + compatibility. + """ + if not isinstance(purpose, _ASN1Object): + raise TypeError(purpose) + + context = SSLContext(protocol) + # SSLv2 considered harmful. + context.options |= OP_NO_SSLv2 + + if cert_reqs is not None: + context.verify_mode = cert_reqs + + if keyfile and not certfile: + raise ValueError("certfile must be specified") + if certfile or keyfile: + context.load_cert_chain(certfile, keyfile) + + # load CA root certs + if cafile or capath or cadata: + context.load_verify_locations(cafile, capath, cadata) + elif context.verify_mode != CERT_NONE: + # no explicit cafile, capath or cadata but the verify mode is + # CERT_OPTIONAL or CERT_REQUIRED. Let's try to load default system + # root CA certificates for the given purpose. This may fail silently. + context.load_default_certs(purpose) + + return context + class SSLSocket(socket): """This class implements a subtype of socket.socket that wraps the underlying OS socket in an SSL context when necessary, and @@ -829,15 +866,16 @@ If 'ssl_version' is specified, use it in the connection attempt.""" host, port = addr - if (ca_certs is not None): + if ca_certs is not None: cert_reqs = CERT_REQUIRED else: cert_reqs = CERT_NONE - s = create_connection(addr) - s = wrap_socket(s, ssl_version=ssl_version, - cert_reqs=cert_reqs, ca_certs=ca_certs) - dercert = s.getpeercert(True) - s.close() + context = _create_stdlib_context(ssl_version, + cert_reqs=cert_reqs, + cafile=ca_certs) + with create_connection(addr) as sock: + with context.wrap_socket(sock) as sslsock: + dercert = sslsock.getpeercert(True) return DER_cert_to_PEM_cert(dercert) def get_protocol_name(protocol_code): diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1018,6 +1018,27 @@ self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + def test__create_stdlib_context(self): + ctx = ssl._create_stdlib_context() + self.assertEqual(ctx.protocol, ssl.PROTOCOL_SSLv23) + self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + ctx = ssl._create_stdlib_context(ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.protocol, ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + ctx = ssl._create_stdlib_context(ssl.PROTOCOL_TLSv1, + cert_reqs=ssl.CERT_REQUIRED) + self.assertEqual(ctx.protocol, ssl.PROTOCOL_TLSv1) + self.assertEqual(ctx.verify_mode, ssl.CERT_REQUIRED) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) + + ctx = ssl._create_stdlib_context(purpose=ssl.Purpose.CLIENT_AUTH) + self.assertEqual(ctx.protocol, ssl.PROTOCOL_SSLv23) + self.assertEqual(ctx.verify_mode, ssl.CERT_NONE) + self.assertEqual(ctx.options & ssl.OP_NO_SSLv2, ssl.OP_NO_SSLv2) class SSLErrorTests(unittest.TestCase): diff --git a/Lib/urllib/request.py b/Lib/urllib/request.py --- a/Lib/urllib/request.py +++ b/Lib/urllib/request.py @@ -141,13 +141,9 @@ if cafile or capath or cadefault: if not 
_have_ssl: raise ValueError('SSL support not available') - context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) - context.options |= ssl.OP_NO_SSLv2 - context.verify_mode = ssl.CERT_REQUIRED - if cafile or capath: - context.load_verify_locations(cafile, capath) - else: - context.set_default_verify_paths() + context = ssl._create_stdlib_context(cert_reqs=ssl.CERT_REQUIRED, + cafile=cafile, + capath=capath) https_handler = HTTPSHandler(context=context, check_hostname=True) opener = build_opener(https_handler) elif _opener is None: diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,10 @@ Library ------- +- Issue #19735: Implement private function ssl._create_stdlib_context() to + create SSLContext objects in Python's stdlib module. It provides a single + configuration point and makes use of SSLContext.load_default_certs(). + - Issue #16203: Add re.fullmatch() function and regex.fullmatch() method, which anchor the pattern at both ends of the string to match. Original patch by Matthew Barnett. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 23:09:09 2013 From: python-checkins at python.org (martin.v.loewis) Date: Sat, 23 Nov 2013 23:09:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Update_Tcl/Tk_to_8=2E6=2E1?= =?utf-8?q?=2E?= Message-ID: <3dRpZF3Sm8z7M3C@mail.python.org> http://hg.python.org/cpython/rev/730d89d66b38 changeset: 87464:730d89d66b38 user: Martin v. L?wis date: Sat Nov 23 23:05:27 2013 +0100 summary: Update Tcl/Tk to 8.6.1. files: Misc/NEWS | 2 +- PCbuild/build_tkinter.py | 6 +++--- PCbuild/pyproject.props | 8 ++++---- PCbuild/readme.txt | 4 ++-- Tools/buildbot/external-amd64.bat | 14 +++++++------- Tools/buildbot/external-common.bat | 8 ++++---- Tools/buildbot/external.bat | 14 +++++++------- 7 files changed, 28 insertions(+), 28 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -402,7 +402,7 @@ Build ----- -- Update SQLite to 3.8.1 and xz to 5.0.5 on Windows. +- Update SQLite to 3.8.1, xz to 5.0.5, and Tcl/Tk to 8.6.1 on Windows. - Issue #16632: Enable DEP and ASLR on Windows. diff --git a/PCbuild/build_tkinter.py b/PCbuild/build_tkinter.py --- a/PCbuild/build_tkinter.py +++ b/PCbuild/build_tkinter.py @@ -11,9 +11,9 @@ here = os.path.abspath(os.path.dirname(__file__)) par = os.path.pardir -TCL = "tcl8.5.11" -TK = "tk8.5.11" -TIX = "tix-8.4.3.x" +TCL = "tcl8.6.1" +TK = "tk8.6.1" +TIX = "tix-8.4.3.3" ROOT = os.path.abspath(os.path.join(here, par, par)) NMAKE = ('nmake /nologo /f %s %s %s') diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -23,10 +23,10 @@ $(externalsDir)\openssl-1.0.1e $(externalsDir)\tcltk $(externalsDir)\tcltk64 - $(tcltkDir)\lib\tcl85.lib;$(tcltkDir)\lib\tk85.lib - $(tcltkDir)\lib\tcl85g.lib;$(tcltkDir)\lib\tk85g.lib - $(tcltk64Dir)\lib\tcl85.lib;$(tcltk64Dir)\lib\tk85.lib - $(tcltk64Dir)\lib\tcl85g.lib;$(tcltk64Dir)\lib\tk85g.lib + $(tcltkDir)\lib\tcl86t.lib;$(tcltkDir)\lib\tk86t.lib + $(tcltkDir)\lib\tcl86tg.lib;$(tcltkDir)\lib\tk86tg.lib + $(tcltk64Dir)\lib\tcl86t.lib;$(tcltk64Dir)\lib\tk86t.lib + $(tcltk64Dir)\lib\tcl86tg.lib;$(tcltk64Dir)\lib\tk86tg.lib diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -211,7 +211,7 @@ Homepage: http://www.sqlite.org/ _tkinter - Wraps version 8.5.11 of the Tk windowing system. + Wraps version 8.6.1 of the Tk windowing system. 
Homepage: http://www.tcl.tk/ @@ -261,7 +261,7 @@ So for a release build, you'd call it as: nmake -f makefile.vc MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all install -Note that the above command is called from within ..\..\tcl-8.5.11.0\win +Note that the above command is called from within ..\..\tcl-8.6.1.0\win (relative to this directory); don't forget to build Tk as well as Tcl! This will be cleaned up in the future; http://bugs.python.org/issue15968 diff --git a/Tools/buildbot/external-amd64.bat b/Tools/buildbot/external-amd64.bat --- a/Tools/buildbot/external-amd64.bat +++ b/Tools/buildbot/external-amd64.bat @@ -4,18 +4,18 @@ call "Tools\buildbot\external-common.bat" call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 -if not exist tcltk64\bin\tcl85g.dll ( - cd tcl-8.5.11.0\win +if not exist tcltk64\bin\tcl86tg.dll ( + cd tcl-8.6.1.0\win nmake -f makefile.vc DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all nmake -f makefile.vc DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install cd ..\.. ) -if not exist tcltk64\bin\tk85g.dll ( - cd tk-8.5.11.0\win - nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.11.0 clean - nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.11.0 all - nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.5.11.0 install +if not exist tcltk64\bin\tk86tg.dll ( + cd tk-8.6.1.0\win + nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 clean + nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 all + nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 install cd ..\.. ) diff --git a/Tools/buildbot/external-common.bat b/Tools/buildbot/external-common.bat --- a/Tools/buildbot/external-common.bat +++ b/Tools/buildbot/external-common.bat @@ -30,11 +30,11 @@ ) @rem tcl/tk -if not exist tcl-8.5.11.0 ( - rd /s/q tcltk tcltk64 - svn export http://svn.python.org/projects/external/tcl-8.5.11.0 +if not exist tcl-8.6.1.0 ( + rd /s/q tcltk tcltk64 tcl-8.5.11.0 tk-8.5.11.0 + svn export http://svn.python.org/projects/external/tcl-8.6.1.0 ) -if not exist tk-8.5.11.0 svn export http://svn.python.org/projects/external/tk-8.5.11.0 +if not exist tk-8.6.1.0 svn export http://svn.python.org/projects/external/tk-8.6.1.0 @rem sqlite3 if not exist sqlite-3.8.1 ( diff --git a/Tools/buildbot/external.bat b/Tools/buildbot/external.bat --- a/Tools/buildbot/external.bat +++ b/Tools/buildbot/external.bat @@ -4,18 +4,18 @@ call "Tools\buildbot\external-common.bat" call "%VS100COMNTOOLS%\vsvars32.bat" -if not exist tcltk\bin\tcl85g.dll ( +if not exist tcltk\bin\tcl86tg.dll ( @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install - cd tcl-8.5.11.0\win + cd tcl-8.6.1.0\win nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk clean all nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install cd ..\.. 
) -if not exist tcltk\bin\tk85g.dll ( - cd tk-8.5.11.0\win - nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.11.0 clean - nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.11.0 all - nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.5.11.0 install +if not exist tcltk\bin\tk86tg.dll ( + cd tk-8.6.1.0\win + nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 clean + nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 all + nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 install cd ..\.. ) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 23:49:59 2013 From: python-checkins at python.org (larry.hastings) Date: Sat, 23 Nov 2013 23:49:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319722=3A_Added_op?= =?utf-8?q?code=2Estack=5Feffect=28=29=2C_which_accurately?= Message-ID: <3dRqTM0kHsz7Lq7@mail.python.org> http://hg.python.org/cpython/rev/5fe72b9ed48e changeset: 87465:5fe72b9ed48e user: Larry Hastings date: Sat Nov 23 14:49:22 2013 -0800 summary: Issue #19722: Added opcode.stack_effect(), which accurately computes the stack effect of bytecode instructions. files: Doc/library/dis.rst | 7 + Include/compile.h | 3 + Lib/opcode.py | 13 +++ Lib/test/test__opcode.py | 22 +++++ Misc/NEWS | 2 + Modules/_opcode.c | 116 +++++++++++++++++++++++++++ Python/compile.c | 20 ++- setup.py | 2 + 8 files changed, 177 insertions(+), 8 deletions(-) diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -217,6 +217,13 @@ Detect all offsets in the code object *code* which are jump targets, and return a list of these offsets. + +.. function:: stack_effect(opcode, [oparg]) + + Compute the stack effect of *opcode* with argument *oparg*. + + .. versionadded:: 3.4 + .. _bytecodes: Python Bytecode Instructions diff --git a/Include/compile.h b/Include/compile.h --- a/Include/compile.h +++ b/Include/compile.h @@ -54,6 +54,9 @@ /* _Py_Mangle is defined in compile.c */ PyAPI_FUNC(PyObject*) _Py_Mangle(PyObject *p, PyObject *name); +#define PY_INVALID_STACK_EFFECT INT_MAX +PyAPI_FUNC(int) PyCompile_OpcodeStackEffect(int opcode, int oparg); + #ifdef __cplusplus } #endif diff --git a/Lib/opcode.py b/Lib/opcode.py --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -8,6 +8,19 @@ "haslocal", "hascompare", "hasfree", "opname", "opmap", "HAVE_ARGUMENT", "EXTENDED_ARG", "hasnargs"] +# It's a chicken-and-egg I'm afraid: +# We're imported before _opcode's made. +# With exception unheeded +# (stack_effect is not needed) +# Both our chickens and eggs are allayed. 
+# --Larry Hastings, 2013/11/23 + +try: + from _opcode import stack_effect + __all__.append('stack_effect') +except ImportError: + pass + cmp_op = ('<', '<=', '==', '!=', '>', '>=', 'in', 'not in', 'is', 'is not', 'exception match', 'BAD') diff --git a/Lib/test/test__opcode.py b/Lib/test/test__opcode.py new file mode 100644 --- /dev/null +++ b/Lib/test/test__opcode.py @@ -0,0 +1,22 @@ +import dis +import _opcode +from test.support import run_unittest +import unittest + +class OpcodeTests(unittest.TestCase): + + def test_stack_effect(self): + self.assertEqual(_opcode.stack_effect(dis.opmap['POP_TOP']), -1) + self.assertEqual(_opcode.stack_effect(dis.opmap['DUP_TOP_TWO']), 2) + self.assertEqual(_opcode.stack_effect(dis.opmap['BUILD_SLICE'], 0), -1) + self.assertEqual(_opcode.stack_effect(dis.opmap['BUILD_SLICE'], 1), -1) + self.assertEqual(_opcode.stack_effect(dis.opmap['BUILD_SLICE'], 3), -2) + self.assertRaises(ValueError, _opcode.stack_effect, 30000) + self.assertRaises(ValueError, _opcode.stack_effect, dis.opmap['BUILD_SLICE']) + self.assertRaises(ValueError, _opcode.stack_effect, dis.opmap['POP_TOP'], 0) + +def test_main(): + run_unittest(OpcodeTests) + +if __name__ == "__main__": + test_main() diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -67,6 +67,8 @@ Library ------- +- Issue #19722: Added opcode.stack_effect(), which + computes the stack effect of bytecode instructions. - Issue #19735: Implement private function ssl._create_stdlib_context() to create SSLContext objects in Python's stdlib module. It provides a single diff --git a/Modules/_opcode.c b/Modules/_opcode.c new file mode 100644 --- /dev/null +++ b/Modules/_opcode.c @@ -0,0 +1,116 @@ +#include "Python.h" +#include "opcode.h" + + +/*[clinic] + +module _opcode + +_opcode.stack_effect -> int + + opcode: int + + [ + oparg: int + ] + / + +Compute the stack effect of the opcode. 
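[Editor's note, not part of the quoted patch: a minimal sketch of how the new API is driven from Python once this changeset lands. The opcodes and expected values below are taken from the Lib/test/test__opcode.py hunk above; treat them as illustrative of the patched behaviour rather than a normative reference.]

    # Usage sketch for opcode.stack_effect(), per the patch above.
    import dis
    import opcode

    print(opcode.stack_effect(dis.opmap['POP_TOP']))         # -1
    print(opcode.stack_effect(dis.opmap['BUILD_SLICE'], 3))  # -2

    # Per the patch, an oparg is required exactly when the opcode takes one:
    try:
        opcode.stack_effect(dis.opmap['POP_TOP'], 0)
    except ValueError as exc:
        print('ValueError:', exc)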
+[clinic]*/ + +PyDoc_STRVAR(_opcode_stack_effect__doc__, +"Compute the stack effect of the opcode.\n" +"\n" +"_opcode.stack_effect(opcode, [oparg])"); + +#define _OPCODE_STACK_EFFECT_METHODDEF \ + {"stack_effect", (PyCFunction)_opcode_stack_effect, METH_VARARGS, _opcode_stack_effect__doc__}, + +static int +_opcode_stack_effect_impl(PyObject *module, int opcode, int group_right_1, int oparg); + +static PyObject * +_opcode_stack_effect(PyObject *module, PyObject *args) +{ + PyObject *return_value = NULL; + int opcode; + int group_right_1 = 0; + int oparg = 0; + int _return_value; + + switch (PyTuple_Size(args)) { + case 1: + if (!PyArg_ParseTuple(args, "i:stack_effect", &opcode)) + return NULL; + break; + case 2: + if (!PyArg_ParseTuple(args, "ii:stack_effect", &opcode, &oparg)) + return NULL; + group_right_1 = 1; + break; + default: + PyErr_SetString(PyExc_TypeError, "_opcode.stack_effect requires 1 to 2 arguments"); + return NULL; + } + _return_value = _opcode_stack_effect_impl(module, opcode, group_right_1, oparg); + if ((_return_value == -1) && PyErr_Occurred()) + goto exit; + return_value = PyLong_FromLong((long)_return_value); + +exit: + return return_value; +} + +static int +_opcode_stack_effect_impl(PyObject *module, int opcode, int group_right_1, int oparg) +/*[clinic checksum: 2312ded40abc9bcbce718942de21f53e61a2dfd3]*/ +{ + int effect; + if (HAS_ARG(opcode)) { + if (!group_right_1) { + PyErr_SetString(PyExc_ValueError, + "stack_effect: opcode requires oparg but oparg was not specified"); + return -1; + } + } + else if (group_right_1) { + PyErr_SetString(PyExc_ValueError, + "stack_effect: opcode does not permit oparg but oparg was specified"); + return -1; + } + effect = PyCompile_OpcodeStackEffect(opcode, oparg); + if (effect == PY_INVALID_STACK_EFFECT) { + PyErr_SetString(PyExc_ValueError, + "invalid opcode or oparg"); + return -1; + } + return effect; +} + + + + +static PyMethodDef +opcode_functions[] = { + _OPCODE_STACK_EFFECT_METHODDEF + {NULL, NULL, 0, NULL} +}; + + +static struct PyModuleDef opcodemodule = { + PyModuleDef_HEAD_INIT, + "_opcode", + "Opcode support module.", + -1, + opcode_functions, + NULL, + NULL, + NULL, + NULL +}; + +PyMODINIT_FUNC +PyInit__opcode(void) +{ + return PyModule_Create(&opcodemodule); +} diff --git a/Python/compile.c b/Python/compile.c --- a/Python/compile.c +++ b/Python/compile.c @@ -853,8 +853,8 @@ b->b_instr[off].i_lineno = c->u->u_lineno; } -static int -opcode_stack_effect(int opcode, int oparg) +int +PyCompile_OpcodeStackEffect(int opcode, int oparg) { switch (opcode) { case POP_TOP: @@ -1044,11 +1044,9 @@ case DELETE_DEREF: return 0; default: - fprintf(stderr, "opcode = %d\n", opcode); - Py_FatalError("opcode_stack_effect()"); - + return PY_INVALID_STACK_EFFECT; } - return 0; /* not reachable */ + return PY_INVALID_STACK_EFFECT; /* not reachable */ } /* Add an opcode with no argument. 
@@ -3829,7 +3827,7 @@ static int stackdepth_walk(struct compiler *c, basicblock *b, int depth, int maxdepth) { - int i, target_depth; + int i, target_depth, effect; struct instr *instr; if (b->b_seen || b->b_startdepth >= depth) return maxdepth; @@ -3837,7 +3835,13 @@ b->b_startdepth = depth; for (i = 0; i < b->b_iused; i++) { instr = &b->b_instr[i]; - depth += opcode_stack_effect(instr->i_opcode, instr->i_oparg); + effect = PyCompile_OpcodeStackEffect(instr->i_opcode, instr->i_oparg); + if (effect == PY_INVALID_STACK_EFFECT) { + fprintf(stderr, "opcode = %d\n", instr->i_opcode); + Py_FatalError("PyCompile_OpcodeStackEffect()"); + } + depth += effect; + if (depth > maxdepth) maxdepth = depth; assert(depth >= 0); /* invalid code or bug in stackdepth() */ diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -596,6 +596,8 @@ exts.append( Extension('_lsprof', ['_lsprof.c', 'rotatingtree.c']) ) # static Unicode character database exts.append( Extension('unicodedata', ['unicodedata.c']) ) + # _opcode module + exts.append( Extension('_opcode', ['_opcode.c']) ) # Modules with some UNIX dependencies -- on by default: # (If you have a really backward UNIX, select and socket may not be -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 23:54:37 2013 From: python-checkins at python.org (larry.hastings) Date: Sat, 23 Nov 2013 23:54:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319730=3A_Argument?= =?utf-8?q?_Clinic_now_supports_all_the_existing_PyArg?= Message-ID: <3dRqZj1Wg8z7Lq7@mail.python.org> http://hg.python.org/cpython/rev/760ccd78e874 changeset: 87466:760ccd78e874 user: Larry Hastings date: Sat Nov 23 14:54:00 2013 -0800 summary: Issue #19730: Argument Clinic now supports all the existing PyArg "format units" as legacy converters, as well as two new features: "self converters" and the "version" directive. files: Include/pyport.h | 7 + Misc/NEWS | 4 +- Modules/_datetimemodule.c | 10 +- Modules/_dbmmodule.c | 107 ++++++- Modules/_weakref.c | 8 +- Modules/posixmodule.c | 36 +- Modules/zlibmodule.c | 100 +++++- Objects/unicodeobject.c | 10 +- Tools/clinic/clinic.py | 355 +++++++++++++++++++------ 9 files changed, 474 insertions(+), 163 deletions(-) diff --git a/Include/pyport.h b/Include/pyport.h --- a/Include/pyport.h +++ b/Include/pyport.h @@ -188,6 +188,13 @@ #define SIZEOF_PY_UHASH_T SIZEOF_SIZE_T typedef size_t Py_uhash_t; +/* Only used for compatibility with code that may not be PY_SSIZE_T_CLEAN. */ +#ifdef PY_SSIZE_T_CLEAN +typedef Py_ssize_t Py_ssize_clean_t; +#else +typedef int Py_ssize_clean_t; +#endif + /* Largest possible value of size_t. SIZE_MAX is part of C99, so it might be defined on some platforms. If it is not defined, (size_t)-1 is a portable diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -9,7 +9,6 @@ Core and Builtins ----------------- - - Use the repr of a module name in more places in import, especially exceptions. @@ -450,6 +449,9 @@ Tools/Demos ----------- +- Issue #19730: Argument Clinic now supports all the existing PyArg + "format units" as legacy converters, as well as two new features: + "self converters" and the "version" directive. 
- Issue #19552: pyvenv now bootstraps pip into virtual environments by default (pass --without-pip to request the old behaviour) diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -4167,10 +4167,10 @@ {"now", (PyCFunction)datetime_datetime_now, METH_VARARGS|METH_KEYWORDS|METH_CLASS, datetime_datetime_now__doc__}, static PyObject * -datetime_datetime_now_impl(PyObject *cls, PyObject *tz); +datetime_datetime_now_impl(PyTypeObject *cls, PyObject *tz); static PyObject * -datetime_datetime_now(PyObject *cls, PyObject *args, PyObject *kwargs) +datetime_datetime_now(PyTypeObject *cls, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"tz", NULL}; @@ -4187,8 +4187,8 @@ } static PyObject * -datetime_datetime_now_impl(PyObject *cls, PyObject *tz) -/*[clinic checksum: cde1daca68c9b7dca6df51759db2de1d43a39774]*/ +datetime_datetime_now_impl(PyTypeObject *cls, PyObject *tz) +/*[clinic checksum: 5e61647d5d1feaf1ab096c5406ccea17bb7b061c]*/ { PyObject *self; @@ -4198,7 +4198,7 @@ if (check_tzinfo_subclass(tz) < 0) return NULL; - self = datetime_best_possible(cls, + self = datetime_best_possible((PyObject *)cls, tz == Py_None ? localtime : gmtime, tz); if (self != NULL && tz != Py_None) { diff --git a/Modules/_dbmmodule.c b/Modules/_dbmmodule.c --- a/Modules/_dbmmodule.c +++ b/Modules/_dbmmodule.c @@ -43,6 +43,20 @@ static PyObject *DbmError; +/*[clinic] +module dbm +class dbm.dbm +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + +/*[python] +class dbmobject_converter(self_converter): + type = "dbmobject *" + def converter_init(self): + self.name = 'dp' +[python]*/ +/*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + static PyObject * newdbmobject(const char *file, int flags, int mode) { @@ -248,27 +262,77 @@ 0, /* sq_inplace_repeat */ }; +/*[clinic] + +dbm.dbm.get + + self: dbmobject + + key: str(length=True) + [ + default: object + ] + / + +Return the value for key if present, otherwise default. 
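[Editor's note, not part of the quoted patch: the clinic conversion above is meant to keep dbm's Python-level behaviour unchanged. A small behaviour sketch, assuming a platform where the dbm.ndbm backend is available; the database path is made up.]

    # dbm.dbm.get() with and without the optional default argument.
    import dbm.ndbm

    db = dbm.ndbm.open('/tmp/clinic-demo', 'c')   # hypothetical path
    try:
        db[b'spam'] = b'eggs'
        print(db.get(b'spam'))               # b'eggs'
        print(db.get(b'missing', b'none'))   # b'none', the optional default
    finally:
        db.close()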
+[clinic]*/ + +PyDoc_STRVAR(dbm_dbm_get__doc__, +"Return the value for key if present, otherwise default.\n" +"\n" +"dbm.dbm.get(key, [default])"); + +#define DBM_DBM_GET_METHODDEF \ + {"get", (PyCFunction)dbm_dbm_get, METH_VARARGS, dbm_dbm_get__doc__}, + static PyObject * -dbm_get(dbmobject *dp, PyObject *args) +dbm_dbm_get_impl(dbmobject *dp, const char *key, Py_ssize_clean_t key_length, int group_right_1, PyObject *default_value); + +static PyObject * +dbm_dbm_get(PyObject *self, PyObject *args) { - datum key, val; - PyObject *defvalue = Py_None; - char *tmp_ptr; - Py_ssize_t tmp_size; + PyObject *return_value = NULL; + const char *key; + Py_ssize_clean_t key_length; + int group_right_1 = 0; + PyObject *default_value = NULL; - if (!PyArg_ParseTuple(args, "s#|O:get", - &tmp_ptr, &tmp_size, &defvalue)) - return NULL; - key.dptr = tmp_ptr; - key.dsize = tmp_size; + switch (PyTuple_Size(args)) { + case 1: + if (!PyArg_ParseTuple(args, "s#:get", &key, &key_length)) + return NULL; + break; + case 2: + if (!PyArg_ParseTuple(args, "s#O:get", &key, &key_length, &default_value)) + return NULL; + group_right_1 = 1; + break; + default: + PyErr_SetString(PyExc_TypeError, "dbm.dbm.get requires 1 to 2 arguments"); + return NULL; + } + return_value = dbm_dbm_get_impl((dbmobject *)self, key, key_length, group_right_1, default_value); + + return return_value; +} + +static PyObject * +dbm_dbm_get_impl(dbmobject *dp, const char *key, Py_ssize_clean_t key_length, int group_right_1, PyObject *default_value) +/*[clinic checksum: 5b4265e66568f163ef0fc7efec09410eaf793508]*/ +{ + datum dbm_key, val; + + if (!group_right_1) + default_value = Py_None; + dbm_key.dptr = (char *)key; + dbm_key.dsize = key_length; check_dbmobject_open(dp); - val = dbm_fetch(dp->di_dbm, key); + val = dbm_fetch(dp->di_dbm, dbm_key); if (val.dptr != NULL) return PyBytes_FromStringAndSize(val.dptr, val.dsize); - else { - Py_INCREF(defvalue); - return defvalue; - } + + Py_INCREF(default_value); + return default_value; } static PyObject * @@ -333,9 +397,7 @@ "close()\nClose the database."}, {"keys", (PyCFunction)dbm_keys, METH_NOARGS, "keys() -> list\nReturn a list of all keys in the database."}, - {"get", (PyCFunction)dbm_get, METH_VARARGS, - "get(key[, default]) -> value\n" - "Return the value for key if present, otherwise default."}, + DBM_DBM_GET_METHODDEF {"setdefault", (PyCFunction)dbm_setdefault, METH_VARARGS, "setdefault(key[, default]) -> value\n" "Return the value for key if present, otherwise default. 
If key\n" @@ -379,7 +441,6 @@ /* ----------------------------------------------------------------- */ /*[clinic] -module dbm dbm.open as dbmopen @@ -415,10 +476,10 @@ {"open", (PyCFunction)dbmopen, METH_VARARGS, dbmopen__doc__}, static PyObject * -dbmopen_impl(PyObject *module, const char *filename, const char *flags, int mode); +dbmopen_impl(PyModuleDef *module, const char *filename, const char *flags, int mode); static PyObject * -dbmopen(PyObject *module, PyObject *args) +dbmopen(PyModuleDef *module, PyObject *args) { PyObject *return_value = NULL; const char *filename; @@ -436,8 +497,8 @@ } static PyObject * -dbmopen_impl(PyObject *module, const char *filename, const char *flags, int mode) -/*[clinic checksum: 2b0ec9e3c6ecd19e06d16c9f0ba33848245cb1ab]*/ +dbmopen_impl(PyModuleDef *module, const char *filename, const char *flags, int mode) +/*[clinic checksum: c1f2036017ec36a43ac6f59893732751e67c19d5]*/ { int iflags; diff --git a/Modules/_weakref.c b/Modules/_weakref.c --- a/Modules/_weakref.c +++ b/Modules/_weakref.c @@ -25,10 +25,10 @@ {"getweakrefcount", (PyCFunction)_weakref_getweakrefcount, METH_O, _weakref_getweakrefcount__doc__}, static Py_ssize_t -_weakref_getweakrefcount_impl(PyObject *module, PyObject *object); +_weakref_getweakrefcount_impl(PyModuleDef *module, PyObject *object); static PyObject * -_weakref_getweakrefcount(PyObject *module, PyObject *object) +_weakref_getweakrefcount(PyModuleDef *module, PyObject *object) { PyObject *return_value = NULL; Py_ssize_t _return_value; @@ -42,8 +42,8 @@ } static Py_ssize_t -_weakref_getweakrefcount_impl(PyObject *module, PyObject *object) -/*[clinic checksum: 05cffbc3a4b193a0b7e645da81be281748704f69]*/ +_weakref_getweakrefcount_impl(PyModuleDef *module, PyObject *object) +/*[clinic checksum: 015113be0c9a0a8672d35df10c63e3642cc23da4]*/ { PyWeakReference **list; diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -2460,10 +2460,10 @@ {"stat", (PyCFunction)os_stat, METH_VARARGS|METH_KEYWORDS, os_stat__doc__}, static PyObject * -os_stat_impl(PyObject *module, path_t *path, int dir_fd, int follow_symlinks); - -static PyObject * -os_stat(PyObject *module, PyObject *args, PyObject *kwargs) +os_stat_impl(PyModuleDef *module, path_t *path, int dir_fd, int follow_symlinks); + +static PyObject * +os_stat(PyModuleDef *module, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"path", "dir_fd", "follow_symlinks", NULL}; @@ -2485,8 +2485,8 @@ } static PyObject * -os_stat_impl(PyObject *module, path_t *path, int dir_fd, int follow_symlinks) -/*[clinic checksum: 89390f78327e3f045a81974d758d3996e2a71f68]*/ +os_stat_impl(PyModuleDef *module, path_t *path, int dir_fd, int follow_symlinks) +/*[clinic checksum: b08112eff0ceab3ec2c72352da95ce73f245d104]*/ { return posix_do_stat("stat", path, dir_fd, follow_symlinks); } @@ -2600,10 +2600,10 @@ {"access", (PyCFunction)os_access, METH_VARARGS|METH_KEYWORDS, os_access__doc__}, static PyObject * -os_access_impl(PyObject *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks); - -static PyObject * -os_access(PyObject *module, PyObject *args, PyObject *kwargs) +os_access_impl(PyModuleDef *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks); + +static PyObject * +os_access(PyModuleDef *module, PyObject *args, PyObject *kwargs) { PyObject *return_value = NULL; static char *_keywords[] = {"path", "mode", "dir_fd", "effective_ids", 
"follow_symlinks", NULL}; @@ -2627,8 +2627,8 @@ } static PyObject * -os_access_impl(PyObject *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks) -/*[clinic checksum: aa3e145816a748172e62df8e44af74169c7e1247]*/ +os_access_impl(PyModuleDef *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks) +/*[clinic checksum: b9f8ececb061d31b64220c29526bfee642d1b602]*/ { PyObject *return_value = NULL; @@ -2734,10 +2734,10 @@ {"ttyname", (PyCFunction)os_ttyname, METH_VARARGS, os_ttyname__doc__}, static char * -os_ttyname_impl(PyObject *module, int fd); - -static PyObject * -os_ttyname(PyObject *module, PyObject *args) +os_ttyname_impl(PyModuleDef *module, int fd); + +static PyObject * +os_ttyname(PyModuleDef *module, PyObject *args) { PyObject *return_value = NULL; int fd; @@ -2757,8 +2757,8 @@ } static char * -os_ttyname_impl(PyObject *module, int fd) -/*[clinic checksum: c742dd621ec98d0f81d37d264e1d3c89c7a5fb1a]*/ +os_ttyname_impl(PyModuleDef *module, int fd) +/*[clinic checksum: 61e4e525984cb293f949ccae6ae393c0011dfe8e]*/ { char *ret; diff --git a/Modules/zlibmodule.c b/Modules/zlibmodule.c --- a/Modules/zlibmodule.c +++ b/Modules/zlibmodule.c @@ -81,6 +81,13 @@ PyErr_Format(ZlibError, "Error %d %s: %.200s", err, msg, zmsg); } +/*[clinic] +module zlib +class zlib.Compress +class zlib.Decompress +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + PyDoc_STRVAR(compressobj__doc__, "compressobj(level=-1, method=DEFLATED, wbits=15, memlevel=8,\n" " strategy=Z_DEFAULT_STRATEGY[, zdict])\n" @@ -157,32 +164,86 @@ PyMem_RawFree(ptr); } -PyDoc_STRVAR(compress__doc__, -"compress(string[, level]) -- Returned compressed string.\n" +/*[clinic] +zlib.compress + bytes: Py_buffer + Binary data to be compressed. + [ + level: int + Compression level, in 0-9. + ] + / + +Returns compressed string. 
+ +[clinic]*/ + +PyDoc_STRVAR(zlib_compress__doc__, +"Returns compressed string.\n" "\n" -"Optional arg level is the compression level, in 0-9."); +"zlib.compress(bytes, [level])\n" +" bytes\n" +" Binary data to be compressed.\n" +" level\n" +" Compression level, in 0-9."); + +#define ZLIB_COMPRESS_METHODDEF \ + {"compress", (PyCFunction)zlib_compress, METH_VARARGS, zlib_compress__doc__}, static PyObject * -PyZlib_compress(PyObject *self, PyObject *args) +zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int level); + +static PyObject * +zlib_compress(PyModuleDef *module, PyObject *args) +{ + PyObject *return_value = NULL; + Py_buffer bytes; + int group_right_1 = 0; + int level = 0; + + switch (PyTuple_Size(args)) { + case 1: + if (!PyArg_ParseTuple(args, "y*:compress", &bytes)) + return NULL; + break; + case 2: + if (!PyArg_ParseTuple(args, "y*i:compress", &bytes, &level)) + return NULL; + group_right_1 = 1; + break; + default: + PyErr_SetString(PyExc_TypeError, "zlib.compress requires 1 to 2 arguments"); + return NULL; + } + return_value = zlib_compress_impl(module, &bytes, group_right_1, level); + + /* Cleanup for bytes */ + if (bytes.buf) + PyBuffer_Release(&bytes); + + return return_value; +} + +static PyObject * +zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int level) +/*[clinic checksum: 03e857836db25448d4d572da537eb7faf7695d71]*/ { PyObject *ReturnVal = NULL; - Py_buffer pinput; Byte *input, *output = NULL; unsigned int length; - int level=Z_DEFAULT_COMPRESSION, err; + int err; z_stream zst; - /* require Python string object, optional 'level' arg */ - if (!PyArg_ParseTuple(args, "y*|i:compress", &pinput, &level)) - return NULL; + if (!group_right_1) + level = Z_DEFAULT_COMPRESSION; - if ((size_t)pinput.len > UINT_MAX) { + if ((size_t)bytes->len > UINT_MAX) { PyErr_SetString(PyExc_OverflowError, "Size does not fit in an unsigned int"); goto error; } - input = pinput.buf; - length = (unsigned int)pinput.len; + input = bytes->buf; + length = (unsigned int)bytes->len; zst.avail_out = length + length/1000 + 12 + 1; @@ -239,7 +300,6 @@ zlib_error(zst, err, "while finishing compression"); error: - PyBuffer_Release(&pinput); PyMem_Free(output); return ReturnVal; @@ -682,10 +742,6 @@ } /*[clinic] - -module zlib -class zlib.Decompress - zlib.Decompress.decompress data: Py_buffer @@ -739,14 +795,15 @@ exit: /* Cleanup for data */ - PyBuffer_Release(&data); + if (data.buf) + PyBuffer_Release(&data); return return_value; } static PyObject * zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length) -/*[clinic checksum: 76ca9259e3f5ca86bae9da3d0e75637b5d492234]*/ +/*[clinic checksum: f83e91728d327462d7ccbee95299514f26b92253]*/ { compobject *zself = (compobject *)self; int err; @@ -966,8 +1023,6 @@ #ifdef HAVE_ZLIB_COPY /*[clinic] - -class zlib.Compress zlib.Compress.copy Return a copy of the compression object. 
@@ -1295,8 +1350,7 @@ { {"adler32", (PyCFunction)PyZlib_adler32, METH_VARARGS, adler32__doc__}, - {"compress", (PyCFunction)PyZlib_compress, METH_VARARGS, - compress__doc__}, + ZLIB_COMPRESS_METHODDEF {"compressobj", (PyCFunction)PyZlib_compressobj, METH_VARARGS|METH_KEYWORDS, compressobj__doc__}, {"crc32", (PyCFunction)PyZlib_crc32, METH_VARARGS, diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -12924,10 +12924,10 @@ {"maketrans", (PyCFunction)unicode_maketrans, METH_VARARGS|METH_STATIC, unicode_maketrans__doc__}, static PyObject * -unicode_maketrans_impl(PyObject *x, PyObject *y, PyObject *z); +unicode_maketrans_impl(void *null, PyObject *x, PyObject *y, PyObject *z); static PyObject * -unicode_maketrans(PyObject *null, PyObject *args) +unicode_maketrans(void *null, PyObject *args) { PyObject *return_value = NULL; PyObject *x; @@ -12938,15 +12938,15 @@ "O|UU:maketrans", &x, &y, &z)) goto exit; - return_value = unicode_maketrans_impl(x, y, z); + return_value = unicode_maketrans_impl(null, x, y, z); exit: return return_value; } static PyObject * -unicode_maketrans_impl(PyObject *x, PyObject *y, PyObject *z) -/*[clinic checksum: 137db9c3199e7906b7967009f511c24fa3235b5f]*/ +unicode_maketrans_impl(void *null, PyObject *x, PyObject *y, PyObject *z) +/*[clinic checksum: 6d522e3aea2f2e123da3c5d367132a99d803f9b9]*/ { PyObject *new = NULL, *key, *value; Py_ssize_t i = 0; diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -23,7 +23,6 @@ import tempfile import textwrap - # TODO: # converters for # @@ -52,6 +51,8 @@ # is too new for us. # +version = '1' + _empty = inspect._empty _void = inspect._void @@ -195,6 +196,46 @@ return output()[:-1] +def version_splitter(s): + """Splits a version string into a tuple of integers. + + The following ASCII characters are allowed, and employ + the following conversions: + a -> -3 + b -> -2 + c -> -1 + (This permits Python-style version strings such as "1.4b3".) 
+ """ + version = [] + accumulator = [] + def flush(): + if not accumulator: + raise ValueError('Malformed version string: ' + repr(s)) + version.append(int(''.join(accumulator))) + accumulator.clear() + + for c in s: + if c.isdigit(): + accumulator.append(c) + elif c == '.': + flush() + elif c in 'abc': + flush() + version.append('abc'.index(c) - 3) + else: + raise ValueError('Illegal character ' + repr(c) + ' in version string ' + repr(s)) + flush() + return tuple(version) + +def version_comparitor(version1, version2): + iterator = itertools.zip_longest(version_splitter(version1), version_splitter(version2), fillvalue=0) + for i, (a, b) in enumerate(iterator): + if a < b: + return -1 + if a > b: + return 1 + return 0 + class CRenderData: def __init__(self): @@ -373,22 +414,22 @@ {docstring}); #define {methoddef_name} \\ - {{"{name}", (PyCFunction){c_basename}, {meth_flags}, {c_basename}__doc__}}, -""".replace('{meth_flags}', flags) - - def meth_noargs_pyobject_template(self, meth_flags=""): - return self.template_base("METH_NOARGS", meth_flags) + """ + {{"{name}", (PyCFunction){c_basename}, {methoddef_flags}, {c_basename}__doc__}}, +""".replace('{methoddef_flags}', flags) + + def meth_noargs_pyobject_template(self, methoddef_flags=""): + return self.template_base("METH_NOARGS", methoddef_flags) + """ static PyObject * -{c_basename}(PyObject *{self_name}) +{c_basename}({self_type}{self_name}) """ - def meth_noargs_template(self, meth_flags=""): - return self.template_base("METH_NOARGS", meth_flags) + """ + def meth_noargs_template(self, methoddef_flags=""): + return self.template_base("METH_NOARGS", methoddef_flags) + """ static {impl_return_type} {impl_prototype}; static PyObject * -{c_basename}(PyObject *{self_name}) +{c_basename}({self_type}{self_name}) {{ PyObject *return_value = NULL; {declarations} @@ -406,14 +447,14 @@ {impl_prototype} """ - def meth_o_template(self, meth_flags=""): - return self.template_base("METH_O", meth_flags) + """ + def meth_o_template(self, methoddef_flags=""): + return self.template_base("METH_O", methoddef_flags) + """ static PyObject * {c_basename}({impl_parameters}) """ - def meth_o_return_converter_template(self, meth_flags=""): - return self.template_base("METH_O", meth_flags) + """ + def meth_o_return_converter_template(self, methoddef_flags=""): + return self.template_base("METH_O", methoddef_flags) + """ static {impl_return_type} {impl_prototype}; @@ -435,13 +476,13 @@ {impl_prototype} """ - def option_group_template(self, meth_flags=""): - return self.template_base("METH_VARARGS", meth_flags) + """ + def option_group_template(self, methoddef_flags=""): + return self.template_base("METH_VARARGS", methoddef_flags) + """ static {impl_return_type} {impl_prototype}; static PyObject * -{c_basename}(PyObject *{self_name}, PyObject *args) +{c_basename}({self_type}{self_name}, PyObject *args) {{ PyObject *return_value = NULL; {declarations} @@ -460,13 +501,13 @@ {impl_prototype} """ - def keywords_template(self, meth_flags=""): - return self.template_base("METH_VARARGS|METH_KEYWORDS", meth_flags) + """ + def keywords_template(self, methoddef_flags=""): + return self.template_base("METH_VARARGS|METH_KEYWORDS", methoddef_flags) + """ static {impl_return_type} {impl_prototype}; static PyObject * -{c_basename}(PyObject *{self_name}, PyObject *args, PyObject *kwargs) +{c_basename}({self_type}{self_name}, PyObject *args, PyObject *kwargs) {{ PyObject *return_value = NULL; static char *_keywords[] = {{{keywords}, NULL}}; @@ -489,13 +530,13 @@ {impl_prototype} """ 
- def positional_only_template(self, meth_flags=""): - return self.template_base("METH_VARARGS", meth_flags) + """ + def positional_only_template(self, methoddef_flags=""): + return self.template_base("METH_VARARGS", methoddef_flags) + """ static {impl_return_type} {impl_prototype}; static PyObject * -{c_basename}(PyObject *{self_name}, PyObject *args) +{c_basename}({self_type}{self_name}, PyObject *args) {{ PyObject *return_value = NULL; {declarations} @@ -614,27 +655,6 @@ add, output = text_accumulator() data = CRenderData() - if f.kind == STATIC_METHOD: - meth_flags = 'METH_STATIC' - self_name = "null" - else: - if f.kind == CALLABLE: - meth_flags = '' - self_name = "self" if f.cls else "module" - elif f.kind == CLASS_METHOD: - meth_flags = 'METH_CLASS' - self_name = "cls" - else: - fail("Unrecognized 'kind' " + repr(f.kind) + " for function " + f.name) - - data.impl_parameters.append("PyObject *" + self_name) - data.impl_arguments.append(self_name) - - if f.coexist: - if meth_flags: - meth_flags += '|' - meth_flags += 'METH_COEXIST' - parameters = list(f.parameters.values()) converters = [p.converter for p in parameters] @@ -654,8 +674,6 @@ template_dict['docstring'] = self.docstring_for_c_string(f) - template_dict['self_name'] = self_name - positional = has_option_groups = False if parameters: @@ -680,6 +698,18 @@ if has_option_groups: assert positional + # now insert our "self" (or whatever) parameters + # (we deliberately don't call render on self converters) + stock_self = self_converter('self', f) + template_dict['self_name'] = stock_self.name + template_dict['self_type'] = stock_self.type + data.impl_parameters.insert(0, f.self_converter.type + ("" if f.self_converter.type.endswith('*') else " ") + f.self_converter.name) + if f.self_converter.type != stock_self.type: + self_cast = '(' + f.self_converter.type + ')' + else: + self_cast = '' + data.impl_arguments.insert(0, self_cast + stock_self.name) + f.return_converter.render(f, data) template_dict['impl_return_type'] = f.return_converter.type @@ -701,25 +731,26 @@ if not parameters: if default_return_converter: - template = self.meth_noargs_pyobject_template(meth_flags) + template = self.meth_noargs_pyobject_template(f.methoddef_flags) else: - template = self.meth_noargs_template(meth_flags) + template = self.meth_noargs_template(f.methoddef_flags) elif (len(parameters) == 1 and parameters[0].kind == inspect.Parameter.POSITIONAL_ONLY and not converters[0].is_optional() and isinstance(converters[0], object_converter) and converters[0].format_unit == 'O'): if default_return_converter: - template = self.meth_o_template(meth_flags) + template = self.meth_o_template(f.methoddef_flags) else: # HACK # we're using "impl_parameters" for the # non-impl function, because that works # better for METH_O. but that means we - # must surpress actually declaring the + # must supress actually declaring the # impl's parameters as variables in the # non-impl. but since it's METH_O, we - # only have one anyway, and it's the first one. + # only have one anyway, so + # we don't have any problem finding it. 
declarations_copy = list(data.declarations) before, pyobject, after = declarations_copy[0].partition('PyObject *') assert not before, "hack failed, see comment" @@ -727,16 +758,16 @@ assert after and after[0].isalpha(), "hack failed, see comment" del declarations_copy[0] template_dict['declarations'] = "\n".join(declarations_copy) - template = self.meth_o_return_converter_template(meth_flags) + template = self.meth_o_return_converter_template(f.methoddef_flags) elif has_option_groups: self.render_option_group_parsing(f, template_dict) - template = self.option_group_template(meth_flags) + template = self.option_group_template(f.methoddef_flags) template = linear_format(template, option_group_parsing=template_dict['option_group_parsing']) elif positional: - template = self.positional_only_template(meth_flags) + template = self.positional_only_template(f.methoddef_flags) else: - template = self.keywords_template(meth_flags) + template = self.keywords_template(f.methoddef_flags) template = linear_format(template, declarations=template_dict['declarations'], @@ -1178,6 +1209,20 @@ self.docstring = docstring or '' self.kind = kind self.coexist = coexist + self.self_converter = None + + @property + def methoddef_flags(self): + flags = [] + if self.kind == CLASS_METHOD: + flags.append('METH_CLASS') + elif self.kind == STATIC_METHOD: + flags.append('METH_STATIC') + else: + assert self.kind == CALLABLE, "unknown kind: " + repr(self.kind) + if self.coexist: + flags.append('METH_COEXIST') + return '|'.join(flags) def __repr__(self): return '' @@ -1307,6 +1352,7 @@ # The C converter *function* to be used, if any. # (If this is not None, format_unit must be 'O&'.) converter = None + encoding = None impl_by_reference = False parse_by_reference = True @@ -1354,6 +1400,8 @@ # impl_arguments s = ("&" if self.impl_by_reference else "") + name data.impl_arguments.append(s) + if self.length: + data.impl_arguments.append(self.length_name()) # keywords data.keywords.append(name) @@ -1370,12 +1418,20 @@ # impl_parameters data.impl_parameters.append(self.simple_declaration(by_reference=self.impl_by_reference)) + if self.length: + data.impl_parameters.append("Py_ssize_clean_t " + self.length_name()) # cleanup cleanup = self.cleanup() if cleanup: data.cleanup.append('/* Cleanup for ' + name + ' */\n' + cleanup.rstrip() + "\n") + def length_name(self): + """Computes the name of the associated "length" variable.""" + if not self.length: + return None + return ensure_legal_c_identifier(self.name) + "_length" + # Why is this one broken out separately? # For "positional-only" function parsing, # which generates a bunch of PyArg_ParseTuple calls. @@ -1388,9 +1444,13 @@ if self.encoding: list.append(self.encoding) - s = ("&" if self.parse_by_reference else "") + ensure_legal_c_identifier(self.name) + legal_name = ensure_legal_c_identifier(self.name) + s = ("&" if self.parse_by_reference else "") + legal_name list.append(s) + if self.length: + list.append("&" + self.length_name()) + # # All the functions after here are intended as extension points. 
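[Editor's note, not part of the quoted patch: the comment above marks CConverter's extension points. As a loose, hypothetical illustration of how they are used, a custom converter is just a subclass that sets a few class attributes, much like the dbmobject_converter defined in the _dbmmodule.c hunk earlier. Nothing like the class below appears in this changeset, and the sys.path line is an assumption about running from a CPython checkout.]

    # Hypothetical converter subclass, sketched against Tools/clinic/clinic.py.
    import sys
    sys.path.insert(0, 'Tools/clinic')   # assumed checkout-relative path
    from clinic import CConverter

    class uint8_converter(CConverter):
        """Parse an unsigned byte via the existing 'b' format unit."""
        type = 'unsigned char'
        format_unit = 'b'
        c_ignored_default = '0'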
# @@ -1421,6 +1481,10 @@ declaration.append(" = ") declaration.append(default) declaration.append(";") + if self.length: + declaration.append('\nPy_ssize_clean_t ') + declaration.append(self.length_name()) + declaration.append(';') return "".join(declaration) def initialize(self): @@ -1462,7 +1526,7 @@ def converter_init(self, *, bitwise=False): if bitwise: - format_unit = 'B' + self.format_unit = 'B' class short_converter(CConverter): type = 'short' @@ -1478,15 +1542,17 @@ if not bitwise: fail("Unsigned shorts must be bitwise (for now).") - at add_legacy_c_converter('C', from_str=True) + at add_legacy_c_converter('C', types='str') class int_converter(CConverter): type = 'int' format_unit = 'i' c_ignored_default = "0" - def converter_init(self, *, from_str=False): - if from_str: - format_unit = 'C' + def converter_init(self, *, types='int'): + if types == 'str': + self.format_unit = 'C' + elif types != 'int': + fail("int_converter: illegal 'types' argument") class unsigned_int_converter(CConverter): type = 'unsigned int' @@ -1568,18 +1634,66 @@ self.encoding = type - at add_legacy_c_converter('y', from_bytes=True) + at add_legacy_c_converter('s#', length=True) + at add_legacy_c_converter('y', type="bytes") + at add_legacy_c_converter('y#', type="bytes", length=True) @add_legacy_c_converter('z', nullable=True) + at add_legacy_c_converter('z#', nullable=True, length=True) class str_converter(CConverter): type = 'const char *' format_unit = 's' - def converter_init(self, *, nullable=False, from_bytes=False): - if from_bytes: - assert not nullable - format_unit = 'y' - if nullable: - format_unit = 'z' + def converter_init(self, *, encoding=None, types="str", + length=False, nullable=False, zeroes=False): + + types = set(types.strip().split()) + bytes_type = set(("bytes",)) + str_type = set(("str",)) + all_3_type = set(("bytearray",)) | bytes_type | str_type + is_bytes = types == bytes_type + is_str = types == str_type + is_all_3 = types == all_3_type + + self.length = bool(length) + format_unit = None + + if encoding: + self.encoding = encoding + + if is_str and not (length or zeroes or nullable): + format_unit = 'es' + elif is_all_3 and not (length or zeroes or nullable): + format_unit = 'et' + elif is_str and length and zeroes and not nullable: + format_unit = 'es#' + elif is_all_3 and length and not (nullable or zeroes): + format_unit = 'et#' + + if format_unit.endswith('#'): + # TODO set pointer to NULL + # TODO add cleanup for buffer + pass + + else: + if zeroes: + fail("str_converter: illegal combination of arguments (zeroes is only legal with an encoding)") + + if is_bytes and not (nullable or length): + format_unit = 'y' + elif is_bytes and length and not nullable: + format_unit = 'y#' + elif is_str and not (nullable or length): + format_unit = 's' + elif is_str and length and not nullable: + format_unit = 's#' + elif is_str and nullable and not length: + format_unit = 'z' + elif is_str and nullable and length: + format_unit = 'z#' + + if not format_unit: + fail("str_converter: illegal combination of arguments") + self.format_unit = format_unit class PyBytesObject_converter(CConverter): @@ -1594,36 +1708,89 @@ type = 'PyObject *' format_unit = 'U' + at add_legacy_c_converter('u#', length=True) @add_legacy_c_converter('Z', nullable=True) + at add_legacy_c_converter('Z#', nullable=True, length=True) class Py_UNICODE_converter(CConverter): type = 'Py_UNICODE *' format_unit = 'u' - def converter_init(self, *, nullable=False): - if nullable: - format_unit = 'Z' - - at 
add_legacy_c_converter('s*', zeroes=True) - at add_legacy_c_converter('w*', read_write=True) - at add_legacy_c_converter('z*', zeroes=True, nullable=True) + def converter_init(self, *, nullable=False, length=False): + format_unit = 'Z' if nullable else 'u' + if length: + format_unit += '#' + self.length = True + self.format_unit = format_unit + +# +# We define three string conventions for buffer types in the 'types' argument: +# 'buffer' : any object supporting the buffer interface +# 'rwbuffer': any object supporting the buffer interface, but must be writeable +# 'robuffer': any object supporting the buffer interface, but must not be writeable +# + at add_legacy_c_converter('s*', types='str bytes bytearray buffer') + at add_legacy_c_converter('z*', types='str bytes bytearray buffer', nullable=True) + at add_legacy_c_converter('w*', types='bytearray rwbuffer') class Py_buffer_converter(CConverter): type = 'Py_buffer' format_unit = 'y*' impl_by_reference = True c_ignored_default = "{NULL, NULL, 0, 0, 0, 0, NULL, NULL, NULL, NULL, NULL}" - def converter_init(self, *, str=False, zeroes=False, nullable=False, read_write=False): - if not str: - assert not (zeroes or nullable or read_write) - elif read_write: - assert not (zeroes or nullable) - self.format_unit = 'w*' + def converter_init(self, *, types='bytes bytearray buffer', nullable=False): + types = set(types.strip().split()) + bytes_type = set(('bytes',)) + bytearray_type = set(('bytearray',)) + buffer_type = set(('buffer',)) + rwbuffer_type = set(('rwbuffer',)) + robuffer_type = set(('robuffer',)) + str_type = set(('str',)) + bytes_bytearray_buffer_type = bytes_type | bytearray_type | buffer_type + + format_unit = None + if types == (str_type | bytes_bytearray_buffer_type): + format_unit = 's*' if not nullable else 'z*' else: - assert zeroes - self.format_unit = 'z*' if nullable else 's*' + if nullable: + fail('Py_buffer_converter: illegal combination of arguments (nullable=True)') + elif types == (bytes_bytearray_buffer_type): + format_unit = 'y*' + elif types == (bytearray_type | rwuffer_type): + format_unit = 'w*' + if not format_unit: + fail("Py_buffer_converter: illegal combination of arguments") + + self.format_unit = format_unit def cleanup(self): - return "PyBuffer_Release(&" + ensure_legal_c_identifier(self.name) + ");\n" + name = ensure_legal_c_identifier(self.name) + return "".join(["if (", name, ".buf)\n PyBuffer_Release(&", name, ");\n"]) + + +class self_converter(CConverter): + """ + A special-case converter: + this is the default converter used for "self". 
+ """ + type = "PyObject *" + def converter_init(self): + f = self.function + if f.kind == CALLABLE: + if f.cls: + self.name = "self" + else: + self.name = "module" + self.type = "PyModuleDef *" + elif f.kind == STATIC_METHOD: + self.name = "null" + self.type = "void *" + elif f.kind == CLASS_METHOD: + self.name = "cls" + self.type = "PyTypeObject *" + + def render(self, parameter, data): + fail("render() should never be called on self_converter instances") + def add_c_return_converter(f, name=None): @@ -1830,6 +1997,11 @@ self.kind = CALLABLE self.coexist = False + def directive_version(self, required): + global version + if version_comparitor(version, required) < 0: + fail("Insufficient Clinic version!\n Version: " + version + "\n Required: " + required) + def directive_module(self, name): fields = name.split('.') new = fields.pop() @@ -1867,6 +2039,7 @@ assert self.coexist == False self.coexist = True + def parse(self, block): self.reset() self.block = block @@ -2128,6 +2301,17 @@ fail('{} is not a valid {}converter'.format(name, legacy_str)) converter = dict[name](parameter_name, self.function, value, **kwargs) + # special case: if it's the self converter, + # don't actually add it to the parameter list + if isinstance(converter, self_converter): + if self.function.parameters or (self.parameter_state != self.ps_required): + fail("The 'self' parameter, if specified, must be the very first thing in the parameter block.") + if self.function.self_converter: + fail("You can't specify the 'self' parameter more than once.") + self.function.self_converter = converter + self.parameter_state = self.ps_start + return + kind = inspect.Parameter.KEYWORD_ONLY if self.keyword_only else inspect.Parameter.POSITIONAL_OR_KEYWORD p = Parameter(parameter_name, kind, function=self.function, converter=converter, default=value, group=self.group) self.function.parameters[parameter_name] = p @@ -2224,6 +2408,9 @@ # the final stanza of the DSL is the docstring. def state_function_docstring(self, line): + if not self.function.self_converter: + self.function.self_converter = self_converter("self", self.function) + if self.group: fail("Function " + self.function.name + " has a ] without a matching [.") -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 23 23:59:17 2013 From: python-checkins at python.org (larry.hastings) Date: Sat, 23 Nov 2013 23:59:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319358=3A_=22make_?= =?utf-8?q?clinic=22_now_runs_the_Argument_Clinic_preprocessor?= Message-ID: <3dRqh56MVfz7Ls5@mail.python.org> http://hg.python.org/cpython/rev/29370c25e0f1 changeset: 87467:29370c25e0f1 user: Larry Hastings date: Sat Nov 23 14:58:45 2013 -0800 summary: Issue #19358: "make clinic" now runs the Argument Clinic preprocessor over all CPython source files. 
files: Makefile.pre.in | 6 ++++++ Misc/NEWS | 6 ++++++ Tools/clinic/clinic.py | 27 ++++++++++++++++++++++++--- 3 files changed, 36 insertions(+), 3 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -526,6 +526,12 @@ : # build lcov report $(MAKE) coverage-lcov +# Run "Argument Clinic" over all source files +# (depends on python having already been built) +.PHONY=clinic +clinic: $(BUILDPYTHON) + $(RUNSHARED) $(PYTHON_FOR_BUILD) ./Tools/clinic/clinic.py --make + # Build the interpreter $(BUILDPYTHON): Modules/python.o $(LIBRARY) $(LDLIBRARY) $(PY3LIBRARY) $(LINKCC) $(PY_LDFLAGS) $(LINKFORSHARED) -o $@ Modules/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -9,6 +9,7 @@ Core and Builtins ----------------- + - Use the repr of a module name in more places in import, especially exceptions. @@ -66,6 +67,7 @@ Library ------- + - Issue #19722: Added opcode.stack_effect(), which computes the stack effect of bytecode instructions. @@ -403,6 +405,9 @@ Build ----- +- Issue #19358: "make clinic" now runs the Argument Clinic preprocessor + over all CPython source files. + - Update SQLite to 3.8.1, xz to 5.0.5, and Tcl/Tk to 8.6.1 on Windows. - Issue #16632: Enable DEP and ASLR on Windows. @@ -449,6 +454,7 @@ Tools/Demos ----------- + - Issue #19730: Argument Clinic now supports all the existing PyArg "format units" as legacy converters, as well as two new features: "self converters" and the "version" directive. diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -988,7 +988,6 @@ dsl_name = block.dsl_name write = self.f.write - assert (not input) or (input.endswith('\n')) assert not ((dsl_name == None) ^ (output == None)), "you must specify dsl_name and output together, dsl_name " + repr(dsl_name) if not dsl_name: @@ -1122,12 +1121,16 @@ clinic = Clinic(language, verify=verify, filename=filename) with open(filename, 'r', encoding=encoding) as f: - text = clinic.parse(f.read()) + raw = f.read() + + cooked = clinic.parse(raw) + if cooked == raw: + return directory = os.path.dirname(filename) or '.' 
with tempfile.TemporaryDirectory(prefix="clinic", dir=directory) as tmpdir: - bytes = text.encode(encoding) + bytes = cooked.encode(encoding) tmpfilename = os.path.join(tmpdir, os.path.basename(filename)) with open(tmpfilename, "wb") as f: f.write(bytes) @@ -2619,6 +2622,7 @@ cmdline.add_argument("-f", "--force", action='store_true') cmdline.add_argument("-o", "--output", type=str) cmdline.add_argument("--converters", action='store_true') + cmdline.add_argument("--make", action='store_true') cmdline.add_argument("filename", type=str, nargs="*") ns = cmdline.parse_args(argv) @@ -2697,6 +2701,23 @@ print("All return converters also accept (doc_default=None).") sys.exit(0) + if ns.make: + if ns.output or ns.filename: + print("Usage error: can't use -o or filenames with --make.") + print() + cmdline.print_usage() + sys.exit(-1) + for root, dirs, files in os.walk('.'): + for rcs_dir in ('.svn', '.git', '.hg'): + if rcs_dir in dirs: + dirs.remove(rcs_dir) + for filename in files: + if not filename.endswith('.c'): + continue + path = os.path.join(root, filename) + parse_file(path, verify=not ns.force) + return + if not ns.filename: cmdline.print_usage() sys.exit(-1) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 00:09:25 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sun, 24 Nov 2013 00:09:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Change_bounded_?= =?utf-8?q?semaphore_into_a_subclass=2C_like?= Message-ID: <3dRqvn17KPz7Lq7@mail.python.org> http://hg.python.org/cpython/rev/6bee0fdcba39 changeset: 87468:6bee0fdcba39 user: Guido van Rossum date: Sat Nov 23 15:09:16 2013 -0800 summary: asyncio: Change bounded semaphore into a subclass, like threading.[Bounded]Semaphore. files: Lib/asyncio/locks.py | 36 ++++++++-------- Lib/test/test_asyncio/test_locks.py | 2 +- 2 files changed, 20 insertions(+), 18 deletions(-) diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py --- a/Lib/asyncio/locks.py +++ b/Lib/asyncio/locks.py @@ -336,22 +336,15 @@ Semaphores also support the context manager protocol. - The first optional argument gives the initial value for the internal + The optional argument gives the initial value for the internal counter; it defaults to 1. If the value given is less than 0, ValueError is raised. - - The second optional argument determines if the semaphore can be released - more than initial internal counter value; it defaults to False. If the - value given is True and number of release() is more than number of - successful acquire() calls ValueError is raised. """ - def __init__(self, value=1, bound=False, *, loop=None): + def __init__(self, value=1, *, loop=None): if value < 0: raise ValueError("Semaphore initial value must be >= 0") self._value = value - self._bound = bound - self._bound_value = value self._waiters = collections.deque() self._locked = (value == 0) if loop is not None: @@ -402,17 +395,9 @@ """Release a semaphore, incrementing the internal counter by one. When it was zero on entry and another coroutine is waiting for it to become larger than zero again, wake up that coroutine. - - If Semaphore is created with "bound" parameter equals true, then - release() method checks to make sure its current value doesn't exceed - its initial value. If it does, ValueError is raised. 
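[Editor's note, not part of the quoted patch: with the bound logic removed from Semaphore, the over-release check now lives in the BoundedSemaphore subclass further down this diff. A behaviour sketch against the API as patched here; it imports from asyncio.locks directly, since the quoted diff does not show whether the package namespace re-exports the new class.]

    # BoundedSemaphore raises once release() outruns acquire().
    import asyncio
    from asyncio.locks import BoundedSemaphore

    sem = BoundedSemaphore(1)
    loop = asyncio.get_event_loop()
    loop.run_until_complete(sem.acquire())
    sem.release()
    try:
        sem.release()                # one release() too many
    except ValueError as exc:
        print('ValueError:', exc)    # "BoundedSemaphore released too many times"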
""" - if self._bound and self._value >= self._bound_value: - raise ValueError('Semaphore released too many times') - self._value += 1 self._locked = False - for waiter in self._waiters: if not waiter.done(): waiter.set_result(True) @@ -429,3 +414,20 @@ def __iter__(self): yield from self.acquire() return self + + +class BoundedSemaphore(Semaphore): + """A bounded semaphore implementation. + + This raises ValueError in release() if it would increase the value + above the initial value. + """ + + def __init__(self, value=1, *, loop=None): + self._bound_value = value + super().__init__(value, loop=loop) + + def release(self): + if self._value >= self._bound_value: + raise ValueError('BoundedSemaphore released too many times') + super().release() diff --git a/Lib/test/test_asyncio/test_locks.py b/Lib/test/test_asyncio/test_locks.py --- a/Lib/test/test_asyncio/test_locks.py +++ b/Lib/test/test_asyncio/test_locks.py @@ -805,7 +805,7 @@ self.assertFalse(sem._waiters) def test_release_not_acquired(self): - sem = locks.Semaphore(bound=True, loop=self.loop) + sem = locks.BoundedSemaphore(loop=self.loop) self.assertRaises(ValueError, sem.release) -- Repository URL: http://hg.python.org/cpython From tim.peters at gmail.com Sun Nov 24 00:18:37 2013 From: tim.peters at gmail.com (Tim Peters) Date: Sat, 23 Nov 2013 17:18:37 -0600 Subject: [Python-checkins] cpython: asyncio: Change bounded semaphore into a subclass, like In-Reply-To: <3dRqvn17KPz7Lq7@mail.python.org> References: <3dRqvn17KPz7Lq7@mail.python.org> Message-ID: [guido] > http://hg.python.org/cpython/rev/6bee0fdcba39 > changeset: 87468:6bee0fdcba39 > user: Guido van Rossum > date: Sat Nov 23 15:09:16 2013 -0800 > summary: > asyncio: Change bounded semaphore into a subclass, like threading.[Bounded]Semaphore. > > files: > Lib/asyncio/locks.py | 36 ++++++++-------- > Lib/test/test_asyncio/test_locks.py | 2 +- > 2 files changed, 20 insertions(+), 18 deletions(-) > > > diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py > --- a/Lib/asyncio/locks.py > +++ b/Lib/asyncio/locks.py > @@ -336,22 +336,15 @@ ... > +class BoundedSemaphore(Semaphore): ... > + def release(self): > + if self._value >= self._bound_value: > + raise ValueError('BoundedSemaphore released too many times') > + super().release() If there's a lock and parallelism involved, this release() implementation is vulnerable to races: any number of threads can see "self._value < self._bound_value" before one of them manages to call the superclass release(), and so self._value can become arbitrarily larger than self._bound_value. I fixed the same bug in threading's similar bounded semaphore class a few weeks ago. From python-checkins at python.org Sun Nov 24 00:36:50 2013 From: python-checkins at python.org (guido.van.rossum) Date: Sun, 24 Nov 2013 00:36:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Keep_asyncio_working_with_?= =?utf-8?q?Python_3=2E3_too=2E?= Message-ID: <3dRrWQ6RXyz7LsS@mail.python.org> http://hg.python.org/cpython/rev/8c98d0900b0f changeset: 87469:8c98d0900b0f user: Guido van Rossum date: Sat Nov 23 15:36:43 2013 -0800 summary: Keep asyncio working with Python 3.3 too. files: Lib/asyncio/selector_events.py | 11 +++++++++-- 1 files changed, 9 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -571,8 +571,15 @@ # context; in that case the sslcontext passed is None. 
# The default is the same as used by urllib with # cadefault=True. - sslcontext = ssl._create_stdlib_context( - cert_reqs=ssl.CERT_REQUIRED) + if hasattr(ssl, '_create_stdlib_context'): + sslcontext = ssl._create_stdlib_context( + cert_reqs=ssl.CERT_REQUIRED) + else: + # Fallback for Python 3.3. + sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + sslcontext.options |= ssl.OP_NO_SSLv2 + sslcontext.set_default_verify_paths() + sslcontext.verify_mode = ssl.CERT_REQUIRED wrap_kwargs = { 'server_side': server_side, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 00:38:34 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 00:38:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319674=3A_inspect?= =?utf-8?q?=2Esignature=28=29_now_produces_a_correct_signature?= Message-ID: <3dRrYQ5wHZz7Ls3@mail.python.org> http://hg.python.org/cpython/rev/78ec18f5cb45 changeset: 87470:78ec18f5cb45 user: Larry Hastings date: Sat Nov 23 15:37:55 2013 -0800 summary: Issue #19674: inspect.signature() now produces a correct signature for some builtins. files: Lib/inspect.py | 62 +++++++++++++++++++++++ Lib/pydoc.py | 52 +++++++++---------- Lib/test/test_capi.py | 29 +++++++++++ Lib/test/test_inspect.py | 7 +- Misc/NEWS | 3 + Modules/_cursesmodule.c | 12 +++- Modules/_datetimemodule.c | 12 +++- Modules/_dbmmodule.c | 23 ++++---- Modules/_opcode.c | 19 +++--- Modules/_testcapimodule.c | 44 ++++++++++++++++ Modules/_weakref.c | 12 ++- Modules/posixmodule.c | 19 +++--- Modules/unicodedata.c | 14 +++- Modules/zlibmodule.c | 17 +++--- Objects/dictobject.c | 13 +++- Objects/methodobject.c | 71 +++++++++++++++++++++++++- Objects/unicodeobject.c | 11 ++- Tools/clinic/clinic.py | 59 +++++++++------------ 18 files changed, 343 insertions(+), 136 deletions(-) diff --git a/Lib/inspect.py b/Lib/inspect.py --- a/Lib/inspect.py +++ b/Lib/inspect.py @@ -31,6 +31,7 @@ __author__ = ('Ka-Ping Yee ', 'Yury Selivanov ') +import ast import importlib.machinery import itertools import linecache @@ -1461,6 +1462,9 @@ if isinstance(obj, types.FunctionType): return Signature.from_function(obj) + if isinstance(obj, types.BuiltinFunctionType): + return Signature.from_builtin(obj) + if isinstance(obj, functools.partial): sig = signature(obj.func) @@ -1942,6 +1946,64 @@ return_annotation=annotations.get('return', _empty), __validate_parameters__=False) + @classmethod + def from_builtin(cls, func): + s = getattr(func, "__text_signature__", None) + if not s: + return None + + if s.endswith("/)"): + kind = Parameter.POSITIONAL_ONLY + s = s[:-2] + ')' + else: + kind = Parameter.POSITIONAL_OR_KEYWORD + + s = "def foo" + s + ": pass" + + try: + module = ast.parse(s) + except SyntaxError: + return None + if not isinstance(module, ast.Module): + return None + + # ast.FunctionDef + f = module.body[0] + + parameters = [] + empty = Parameter.empty + + def p(name_node, default_node, default=empty): + name = name_node.arg + + if isinstance(default_node, ast.Num): + default = default.n + elif isinstance(default_node, ast.NameConstant): + default = default_node.value + parameters.append(Parameter(name, kind, default=default, annotation=empty)) + + # non-keyword-only parameters + for name, default in reversed(list(itertools.zip_longest(reversed(f.args.args), reversed(f.args.defaults), fillvalue=None))): + p(name, default) + + # *args + if f.args.vararg: + kind = Parameter.VAR_POSITIONAL + p(f.args.vararg, empty) + + # keyword-only arguments + kind = 
Parameter.KEYWORD_ONLY + for name, default in zip(f.args.kwonlyargs, f.args.kw_defaults): + p(name, default) + + # **kwargs + if f.args.kwarg: + kind = Parameter.VAR_KEYWORD + p(f.args.kwarg, empty) + + return cls(parameters, return_annotation=cls.empty) + + @property def parameters(self): return self._parameters diff --git a/Lib/pydoc.py b/Lib/pydoc.py --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -916,20 +916,18 @@ reallink = realname title = '%s = %s' % ( anchor, name, reallink) - if inspect.isfunction(object): - args, varargs, kwonlyargs, kwdefaults, varkw, defaults, ann = \ - inspect.getfullargspec(object) - argspec = inspect.formatargspec( - args, varargs, kwonlyargs, kwdefaults, varkw, defaults, ann, - formatvalue=self.formatvalue, - formatannotation=inspect.formatannotationrelativeto(object)) - if realname == '': - title = '%s lambda ' % name - # XXX lambda's won't usually have func_annotations['return'] - # since the syntax doesn't support but it is possible. - # So removing parentheses isn't truly safe. - argspec = argspec[1:-1] # remove parentheses - else: + argspec = None + if inspect.isfunction(object) or inspect.isbuiltin(object): + signature = inspect.signature(object) + if signature: + argspec = str(signature) + if realname == '': + title = '%s lambda ' % name + # XXX lambda's won't usually have func_annotations['return'] + # since the syntax doesn't support but it is possible. + # So removing parentheses isn't truly safe. + argspec = argspec[1:-1] # remove parentheses + if not argspec: argspec = '(...)' decl = title + argspec + (note and self.grey( @@ -1313,20 +1311,18 @@ cl.__dict__[realname] is object): skipdocs = 1 title = self.bold(name) + ' = ' + realname - if inspect.isfunction(object): - args, varargs, varkw, defaults, kwonlyargs, kwdefaults, ann = \ - inspect.getfullargspec(object) - argspec = inspect.formatargspec( - args, varargs, varkw, defaults, kwonlyargs, kwdefaults, ann, - formatvalue=self.formatvalue, - formatannotation=inspect.formatannotationrelativeto(object)) - if realname == '': - title = self.bold(name) + ' lambda ' - # XXX lambda's won't usually have func_annotations['return'] - # since the syntax doesn't support but it is possible. - # So removing parentheses isn't truly safe. - argspec = argspec[1:-1] # remove parentheses - else: + argspec = None + if inspect.isfunction(object) or inspect.isbuiltin(object): + signature = inspect.signature(object) + if signature: + argspec = str(signature) + if realname == '': + title = self.bold(name) + ' lambda ' + # XXX lambda's won't usually have func_annotations['return'] + # since the syntax doesn't support but it is possible. + # So removing parentheses isn't truly safe. 
+ argspec = argspec[1:-1] # remove parentheses + if not argspec: argspec = '(...)' decl = title + argspec + note diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -109,6 +109,35 @@ self.assertRaises(TypeError, _posixsubprocess.fork_exec, Z(),[b'1'],3,[1, 2],5,6,7,8,9,10,11,12,13,14,15,16,17) + def test_docstring_signature_parsing(self): + + self.assertEqual(_testcapi.no_docstring.__doc__, None) + self.assertEqual(_testcapi.no_docstring.__text_signature__, None) + + self.assertEqual(_testcapi.docstring_empty.__doc__, "") + self.assertEqual(_testcapi.docstring_empty.__text_signature__, None) + + self.assertEqual(_testcapi.docstring_no_signature.__doc__, + "This docstring has no signature.") + self.assertEqual(_testcapi.docstring_no_signature.__text_signature__, None) + + self.assertEqual(_testcapi.docstring_with_invalid_signature.__doc__, + "docstring_with_invalid_signature (boo)\n" + "\n" + "This docstring has an invalid signature." + ) + self.assertEqual(_testcapi.docstring_with_invalid_signature.__text_signature__, None) + + self.assertEqual(_testcapi.docstring_with_signature.__doc__, + "This docstring has a valid signature.") + self.assertEqual(_testcapi.docstring_with_signature.__text_signature__, "(sig)") + + self.assertEqual(_testcapi.docstring_with_signature_and_extra_newlines.__doc__, + "This docstring has a valid signature and some extra newlines.") + self.assertEqual(_testcapi.docstring_with_signature_and_extra_newlines.__text_signature__, + "(parameter)") + + @unittest.skipUnless(threading, 'Threading required for this test.') class TestPendingCalls(unittest.TestCase): diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -1588,10 +1588,9 @@ with self.assertRaisesRegex(ValueError, 'not supported by signature'): # support for 'method-wrapper' inspect.signature(min.__call__) - with self.assertRaisesRegex(ValueError, - 'no signature found for builtin function'): - # support for 'method-wrapper' - inspect.signature(min) + self.assertEqual(inspect.signature(min), None) + signature = inspect.signature(os.stat) + self.assertTrue(isinstance(signature, inspect.Signature)) def test_signature_on_non_function(self): with self.assertRaisesRegex(TypeError, 'is not a callable object'): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #19674: inspect.signature() now produces a correct signature + for some builtins. + - Issue #19722: Added opcode.stack_effect(), which computes the stack effect of bytecode instructions. 
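Taken together, the changes above mean that a builtin whose docstring starts with a "name(...)" line now exposes that line as __text_signature__, Signature.from_builtin() turns it into a real inspect.Signature, and pydoc falls back to "(...)" when signature() returns None. A small usage sketch of the behaviour exercised by the test_inspect changes above (the describe() helper is hypothetical; the os.stat signature matches the Argument Clinic docstring later in this patch):

    import inspect
    import os

    def describe(func):
        # signature() can return None for a builtin without a parseable
        # __text_signature__ (as of this changeset).
        sig = inspect.signature(func)
        return func.__name__ + (str(sig) if sig is not None else '(...)')

    print(describe(os.stat))  # stat(path, *, dir_fd=None, follow_symlinks=True)
    print(describe(min))      # min(...) -- min's docstring signature is not valid Python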
diff --git a/Modules/_cursesmodule.c b/Modules/_cursesmodule.c --- a/Modules/_cursesmodule.c +++ b/Modules/_cursesmodule.c @@ -134,6 +134,12 @@ #define STRICT_SYSV_CURSES #endif +/*[clinic] +module curses +class curses.window +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + /* Definition of exception curses.error */ static PyObject *PyCursesError; @@ -550,8 +556,6 @@ /* Addch, Addstr, Addnstr */ /*[clinic] -module curses -class curses.window curses.window.addch @@ -580,9 +584,9 @@ [clinic]*/ PyDoc_STRVAR(curses_window_addch__doc__, +"addch([x, y,] ch, [attr])\n" "Paint character ch at (y, x) with attributes attr.\n" "\n" -"curses.window.addch([x, y,] ch, [attr])\n" " x\n" " X-coordinate.\n" " y\n" @@ -646,7 +650,7 @@ static PyObject * curses_window_addch_impl(PyObject *self, int group_left_1, int x, int y, PyObject *ch, int group_right_1, long attr) -/*[clinic checksum: 094d012af1019387c0219a9c0bc76e90729c833f]*/ +/*[clinic checksum: 44ed958b891cde91205e584c766e048f3999714f]*/ { PyCursesWindowObject *cwself = (PyCursesWindowObject *)self; int coordinates_group = group_left_1; diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -16,6 +16,12 @@ #include "datetime.h" #undef Py_BUILD_CORE +/*[clinic] +module datetime +class datetime.datetime +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + /* We require that C int be at least 32 bits, and use int virtually * everywhere. In just a few cases we use a temp long, where a Python * API returns a C long. In such cases, we have to ensure that the @@ -4140,8 +4146,6 @@ } /*[clinic] -module datetime -class datetime.datetime @classmethod datetime.datetime.now @@ -4155,9 +4159,9 @@ [clinic]*/ PyDoc_STRVAR(datetime_datetime_now__doc__, +"now(tz=None)\n" "Returns new datetime object representing current time local to tz.\n" "\n" -"datetime.datetime.now(tz=None)\n" " tz\n" " Timezone object.\n" "\n" @@ -4188,7 +4192,7 @@ static PyObject * datetime_datetime_now_impl(PyTypeObject *cls, PyObject *tz) -/*[clinic checksum: 5e61647d5d1feaf1ab096c5406ccea17bb7b061c]*/ +/*[clinic checksum: ca3d26a423b3f633b260c7622e303f0915a96f7c]*/ { PyObject *self; diff --git a/Modules/_dbmmodule.c b/Modules/_dbmmodule.c --- a/Modules/_dbmmodule.c +++ b/Modules/_dbmmodule.c @@ -28,6 +28,12 @@ #error "No ndbm.h available!" 
#endif +/*[clinic] +module dbm +class dbm.dbm +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + typedef struct { PyObject_HEAD int di_size; /* -1 means recompute */ @@ -43,12 +49,6 @@ static PyObject *DbmError; -/*[clinic] -module dbm -class dbm.dbm -[clinic]*/ -/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ - /*[python] class dbmobject_converter(self_converter): type = "dbmobject *" @@ -278,9 +278,8 @@ [clinic]*/ PyDoc_STRVAR(dbm_dbm_get__doc__, -"Return the value for key if present, otherwise default.\n" -"\n" -"dbm.dbm.get(key, [default])"); +"get(key, [default])\n" +"Return the value for key if present, otherwise default."); #define DBM_DBM_GET_METHODDEF \ {"get", (PyCFunction)dbm_dbm_get, METH_VARARGS, dbm_dbm_get__doc__}, @@ -318,7 +317,7 @@ static PyObject * dbm_dbm_get_impl(dbmobject *dp, const char *key, Py_ssize_clean_t key_length, int group_right_1, PyObject *default_value) -/*[clinic checksum: 5b4265e66568f163ef0fc7efec09410eaf793508]*/ +/*[clinic checksum: 28cf8928811bde51e535d67ae98ea039d79df717]*/ { datum dbm_key, val; @@ -461,9 +460,9 @@ [clinic]*/ PyDoc_STRVAR(dbmopen__doc__, +"open(filename, flags=\'r\', mode=0o666)\n" "Return a database object.\n" "\n" -"dbm.open(filename, flags=\'r\', mode=0o666)\n" " filename\n" " The filename to open.\n" " flags\n" @@ -498,7 +497,7 @@ static PyObject * dbmopen_impl(PyModuleDef *module, const char *filename, const char *flags, int mode) -/*[clinic checksum: c1f2036017ec36a43ac6f59893732751e67c19d5]*/ +/*[clinic checksum: fb265f75641553ccd963f84c143b35c11f9121fc]*/ { int iflags; diff --git a/Modules/_opcode.c b/Modules/_opcode.c --- a/Modules/_opcode.c +++ b/Modules/_opcode.c @@ -1,11 +1,13 @@ #include "Python.h" #include "opcode.h" +/*[clinic] +module _opcode +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ /*[clinic] -module _opcode - _opcode.stack_effect -> int opcode: int @@ -19,18 +21,17 @@ [clinic]*/ PyDoc_STRVAR(_opcode_stack_effect__doc__, -"Compute the stack effect of the opcode.\n" -"\n" -"_opcode.stack_effect(opcode, [oparg])"); +"stack_effect(opcode, [oparg])\n" +"Compute the stack effect of the opcode."); #define _OPCODE_STACK_EFFECT_METHODDEF \ {"stack_effect", (PyCFunction)_opcode_stack_effect, METH_VARARGS, _opcode_stack_effect__doc__}, static int -_opcode_stack_effect_impl(PyObject *module, int opcode, int group_right_1, int oparg); +_opcode_stack_effect_impl(PyModuleDef *module, int opcode, int group_right_1, int oparg); static PyObject * -_opcode_stack_effect(PyObject *module, PyObject *args) +_opcode_stack_effect(PyModuleDef *module, PyObject *args) { PyObject *return_value = NULL; int opcode; @@ -62,8 +63,8 @@ } static int -_opcode_stack_effect_impl(PyObject *module, int opcode, int group_right_1, int oparg) -/*[clinic checksum: 2312ded40abc9bcbce718942de21f53e61a2dfd3]*/ +_opcode_stack_effect_impl(PyModuleDef *module, int opcode, int group_right_1, int oparg) +/*[clinic checksum: e880e62dc7b0de73403026eaf4f8074aa106358b]*/ { int effect; if (HAS_ARG(opcode)) { diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -2842,6 +2842,33 @@ return test_setallocators(PYMEM_DOMAIN_OBJ); } +PyDoc_STRVAR(docstring_empty, +"" +); + +PyDoc_STRVAR(docstring_no_signature, +"This docstring has no signature." +); + +PyDoc_STRVAR(docstring_with_invalid_signature, +"docstring_with_invalid_signature (boo)\n" +"\n" +"This docstring has an invalid signature." 
+); + +PyDoc_STRVAR(docstring_with_signature, +"docstring_with_signature(sig)\n" +"This docstring has a valid signature." +); + +PyDoc_STRVAR(docstring_with_signature_and_extra_newlines, +"docstring_with_signature_and_extra_newlines(parameter)\n" +"\n" +"\n" +"\n" +"This docstring has a valid signature and some extra newlines." +); + static PyMethodDef TestMethods[] = { {"raise_exception", raise_exception, METH_VARARGS}, {"raise_memoryerror", (PyCFunction)raise_memoryerror, METH_NOARGS}, @@ -2953,6 +2980,23 @@ (PyCFunction)test_pymem_setallocators, METH_NOARGS}, {"test_pyobject_setallocators", (PyCFunction)test_pyobject_setallocators, METH_NOARGS}, + {"no_docstring", + (PyCFunction)test_with_docstring, METH_NOARGS}, + {"docstring_empty", + (PyCFunction)test_with_docstring, METH_NOARGS, + docstring_empty}, + {"docstring_no_signature", + (PyCFunction)test_with_docstring, METH_NOARGS, + docstring_no_signature}, + {"docstring_with_invalid_signature", + (PyCFunction)test_with_docstring, METH_NOARGS, + docstring_with_invalid_signature}, + {"docstring_with_signature", + (PyCFunction)test_with_docstring, METH_NOARGS, + docstring_with_signature}, + {"docstring_with_signature_and_extra_newlines", + (PyCFunction)test_with_docstring, METH_NOARGS, + docstring_with_signature_and_extra_newlines}, {NULL, NULL} /* sentinel */ }; diff --git a/Modules/_weakref.c b/Modules/_weakref.c --- a/Modules/_weakref.c +++ b/Modules/_weakref.c @@ -5,8 +5,11 @@ ((PyWeakReference **) PyObject_GET_WEAKREFS_LISTPTR(o)) /*[clinic] +module _weakref +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ -module _weakref +/*[clinic] _weakref.getweakrefcount -> Py_ssize_t @@ -17,9 +20,8 @@ [clinic]*/ PyDoc_STRVAR(_weakref_getweakrefcount__doc__, -"Return the number of weak references to \'object\'.\n" -"\n" -"_weakref.getweakrefcount(object)"); +"getweakrefcount(object)\n" +"Return the number of weak references to \'object\'."); #define _WEAKREF_GETWEAKREFCOUNT_METHODDEF \ {"getweakrefcount", (PyCFunction)_weakref_getweakrefcount, METH_O, _weakref_getweakrefcount__doc__}, @@ -43,7 +45,7 @@ static Py_ssize_t _weakref_getweakrefcount_impl(PyModuleDef *module, PyObject *object) -/*[clinic checksum: 015113be0c9a0a8672d35df10c63e3642cc23da4]*/ +/*[clinic checksum: 436e8fbe0297434375f039d8c2d9fc3a9bbe773c]*/ { PyWeakReference **list; diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -190,7 +190,10 @@ #endif /* ! 
__WATCOMC__ || __QNX__ */ - +/*[clinic] +module os +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ #ifndef _MSC_VER @@ -2404,7 +2407,6 @@ /*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ /*[clinic] -module os os.stat -> object(doc_default='stat_result') @@ -2435,9 +2437,9 @@ [clinic]*/ PyDoc_STRVAR(os_stat__doc__, +"stat(path, *, dir_fd=None, follow_symlinks=True)\n" "Perform a stat system call on the given path.\n" "\n" -"os.stat(path, *, dir_fd=None, follow_symlinks=True) -> stat_result\n" " path\n" " Path to be examined; can be string, bytes, or open-file-descriptor int.\n" " dir_fd\n" @@ -2486,7 +2488,7 @@ static PyObject * os_stat_impl(PyModuleDef *module, path_t *path, int dir_fd, int follow_symlinks) -/*[clinic checksum: b08112eff0ceab3ec2c72352da95ce73f245d104]*/ +/*[clinic checksum: 85a71ad602e89f8e280118da976f70cd2f9abdf1]*/ { return posix_do_stat("stat", path, dir_fd, follow_symlinks); } @@ -2567,9 +2569,9 @@ [clinic]*/ PyDoc_STRVAR(os_access__doc__, +"access(path, mode, *, dir_fd=None, effective_ids=False, follow_symlinks=True)\n" "Use the real uid/gid to test for access to a path.\n" "\n" -"os.access(path, mode, *, dir_fd=None, effective_ids=False, follow_symlinks=True) -> True if granted, False otherwise\n" " path\n" " Path to be tested; can be string, bytes, or open-file-descriptor int.\n" " mode\n" @@ -2587,7 +2589,6 @@ " access will examine the symbolic link itself instead of the file\n" " the link points to.\n" "\n" -"{parameters}\n" "dir_fd, effective_ids, and follow_symlinks may not be implemented\n" " on your platform. If they are unavailable, using them will raise a\n" " NotImplementedError.\n" @@ -2628,7 +2629,7 @@ static PyObject * os_access_impl(PyModuleDef *module, path_t *path, int mode, int dir_fd, int effective_ids, int follow_symlinks) -/*[clinic checksum: b9f8ececb061d31b64220c29526bfee642d1b602]*/ +/*[clinic checksum: 636e835c36562a2fc11acab75314634127fdf769]*/ { PyObject *return_value = NULL; @@ -2724,9 +2725,9 @@ [clinic]*/ PyDoc_STRVAR(os_ttyname__doc__, +"ttyname(fd)\n" "Return the name of the terminal device connected to \'fd\'.\n" "\n" -"os.ttyname(fd)\n" " fd\n" " Integer file descriptor handle."); @@ -2758,7 +2759,7 @@ static char * os_ttyname_impl(PyModuleDef *module, int fd) -/*[clinic checksum: 61e4e525984cb293f949ccae6ae393c0011dfe8e]*/ +/*[clinic checksum: 0f368134dc0a7f21f25185e2e6bacf7675fb473a]*/ { char *ret; diff --git a/Modules/unicodedata.c b/Modules/unicodedata.c --- a/Modules/unicodedata.c +++ b/Modules/unicodedata.c @@ -17,6 +17,12 @@ #include "ucnhash.h" #include "structmember.h" +/*[clinic] +module unicodedata +class unicodedata.UCD +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + /* character properties */ typedef struct { @@ -108,8 +114,7 @@ /* --- Module API --------------------------------------------------------- */ /*[clinic] -module unicodedata -class unicodedata.UCD + unicodedata.UCD.decimal unichr: object(type='str') @@ -124,10 +129,9 @@ [clinic]*/ PyDoc_STRVAR(unicodedata_UCD_decimal__doc__, +"decimal(unichr, default=None)\n" "Converts a Unicode character into its equivalent decimal value.\n" "\n" -"unicodedata.UCD.decimal(unichr, default=None)\n" -"\n" "Returns the decimal value assigned to the Unicode character unichr\n" "as integer. 
If no such value is defined, default is returned, or, if\n" "not given, ValueError is raised."); @@ -157,7 +161,7 @@ static PyObject * unicodedata_UCD_decimal_impl(PyObject *self, PyObject *unichr, PyObject *default_value) -/*[clinic checksum: a0980c387387287e2ac230c37d95b26f6903e0d2]*/ +/*[clinic checksum: 9576fa55f4ea0be82968af39dc9d0283e634beeb]*/ { PyUnicodeObject *v = (PyUnicodeObject *)unichr; int have_old = 0; diff --git a/Modules/zlibmodule.c b/Modules/zlibmodule.c --- a/Modules/zlibmodule.c +++ b/Modules/zlibmodule.c @@ -165,6 +165,7 @@ } /*[clinic] + zlib.compress bytes: Py_buffer Binary data to be compressed. @@ -179,9 +180,9 @@ [clinic]*/ PyDoc_STRVAR(zlib_compress__doc__, +"compress(bytes, [level])\n" "Returns compressed string.\n" "\n" -"zlib.compress(bytes, [level])\n" " bytes\n" " Binary data to be compressed.\n" " level\n" @@ -226,7 +227,7 @@ static PyObject * zlib_compress_impl(PyModuleDef *module, Py_buffer *bytes, int group_right_1, int level) -/*[clinic checksum: 03e857836db25448d4d572da537eb7faf7695d71]*/ +/*[clinic checksum: f490708eff84be652b5ebe7fe622ab73ac12c888]*/ { PyObject *ReturnVal = NULL; Byte *input, *output = NULL; @@ -742,6 +743,7 @@ } /*[clinic] + zlib.Decompress.decompress data: Py_buffer @@ -760,9 +762,9 @@ [clinic]*/ PyDoc_STRVAR(zlib_Decompress_decompress__doc__, +"decompress(data, max_length=0)\n" "Return a string containing the decompressed version of the data.\n" "\n" -"zlib.Decompress.decompress(data, max_length=0)\n" " data\n" " The binary data to decompress.\n" " max_length\n" @@ -803,7 +805,7 @@ static PyObject * zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length) -/*[clinic checksum: f83e91728d327462d7ccbee95299514f26b92253]*/ +/*[clinic checksum: 4683928665a1fa6987f5c57cada4a22807a78fbb]*/ { compobject *zself = (compobject *)self; int err; @@ -1029,16 +1031,15 @@ [clinic]*/ PyDoc_STRVAR(zlib_Compress_copy__doc__, -"Return a copy of the compression object.\n" -"\n" -"zlib.Compress.copy()"); +"copy()\n" +"Return a copy of the compression object."); #define ZLIB_COMPRESS_COPY_METHODDEF \ {"copy", (PyCFunction)zlib_Compress_copy, METH_NOARGS, zlib_Compress_copy__doc__}, static PyObject * zlib_Compress_copy(PyObject *self) -/*[clinic checksum: 2551952e72329f0f2beb48a1dde3780e485a220b]*/ +/*[clinic checksum: 8d30351f05defbc2b335c2a78d18f07aa367bb1d]*/ { compobject *zself = (compobject *)self; compobject *retval = NULL; diff --git a/Objects/dictobject.c b/Objects/dictobject.c --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -69,6 +69,11 @@ #include "Python.h" #include "stringlib/eq.h" +/*[clinic] +class dict +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + typedef struct { /* Cached hash code of me_key. 
*/ Py_hash_t me_hash; @@ -2160,7 +2165,6 @@ } /*[clinic] -class dict @coexist dict.__contains__ @@ -2172,16 +2176,15 @@ [clinic]*/ PyDoc_STRVAR(dict___contains____doc__, -"True if D has a key k, else False\"\n" -"\n" -"dict.__contains__(key)"); +"__contains__(key)\n" +"True if D has a key k, else False\""); #define DICT___CONTAINS___METHODDEF \ {"__contains__", (PyCFunction)dict___contains__, METH_O|METH_COEXIST, dict___contains____doc__}, static PyObject * dict___contains__(PyObject *self, PyObject *key) -/*[clinic checksum: 61c5c802ea1d35699a1a754f1f3538ea9b259cf4]*/ +/*[clinic checksum: 3bbac5ce898ae630d9668fa1c8b3afb645ff22e8]*/ { register PyDictObject *mp = (PyDictObject *)self; Py_hash_t hash; diff --git a/Objects/methodobject.c b/Objects/methodobject.c --- a/Objects/methodobject.c +++ b/Objects/methodobject.c @@ -159,15 +159,75 @@ } } +/* + * finds the docstring's introspection signature. + * if present, returns a pointer pointing to the first '('. + * otherwise returns NULL. + */ +static const char *find_signature(PyCFunctionObject *m) +{ + const char *trace = m->m_ml->ml_doc; + const char *name = m->m_ml->ml_name; + size_t length; + if (!trace || !name) + return NULL; + length = strlen(name); + if (strncmp(trace, name, length)) + return NULL; + trace += length; + if (*trace != '(') + return NULL; + return trace; +} + +/* + * skips to the end of the docstring's instrospection signature. + */ +static const char *skip_signature(const char *trace) +{ + while (*trace && *trace != '\n') + trace++; + return trace; +} + +static const char *skip_eols(const char *trace) +{ + while (*trace == '\n') + trace++; + return trace; +} + +static PyObject * +meth_get__text_signature__(PyCFunctionObject *m, void *closure) +{ + const char *start = find_signature(m); + const char *trace; + + if (!start) { + Py_INCREF(Py_None); + return Py_None; + } + + trace = skip_signature(start); + return PyUnicode_FromStringAndSize(start, trace - start); +} + static PyObject * meth_get__doc__(PyCFunctionObject *m, void *closure) { - const char *doc = m->m_ml->ml_doc; + const char *doc = find_signature(m); - if (doc != NULL) - return PyUnicode_FromString(doc); - Py_INCREF(Py_None); - return Py_None; + if (doc) + doc = skip_eols(skip_signature(doc)); + else + doc = m->m_ml->ml_doc; + + if (!doc) { + Py_INCREF(Py_None); + return Py_None; + } + + return PyUnicode_FromString(doc); } static PyObject * @@ -236,6 +296,7 @@ {"__name__", (getter)meth_get__name__, NULL, NULL}, {"__qualname__", (getter)meth_get__qualname__, NULL, NULL}, {"__self__", (getter)meth_get__self__, NULL, NULL}, + {"__text_signature__", (getter)meth_get__text_signature__, NULL, NULL}, {0} }; diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -47,6 +47,11 @@ #include #endif +/*[clinic] +class str +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + /* --- Globals ------------------------------------------------------------ NOTE: In the interpreter's initialization phase, some globals are currently @@ -12883,7 +12888,6 @@ } /*[clinic] -class str @staticmethod str.maketrans as unicode_maketrans @@ -12908,10 +12912,9 @@ [clinic]*/ PyDoc_STRVAR(unicode_maketrans__doc__, +"maketrans(x, y=None, z=None)\n" "Return a translation table usable for str.translate().\n" "\n" -"str.maketrans(x, y=None, z=None)\n" -"\n" "If there is only one argument, it must be a dictionary mapping Unicode\n" "ordinals (integers) or characters to Unicode ordinals, strings or None.\n" 
"Character keys will be then converted to ordinals.\n" @@ -12946,7 +12949,7 @@ static PyObject * unicode_maketrans_impl(void *null, PyObject *x, PyObject *y, PyObject *z) -/*[clinic checksum: 6d522e3aea2f2e123da3c5d367132a99d803f9b9]*/ +/*[clinic checksum: 7f76f414a0dfd0c614e0d4717872eeb520516da7]*/ { PyObject *new = NULL, *key, *value; Py_ssize_t i = 0; diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -24,17 +24,6 @@ import textwrap # TODO: -# converters for -# -# es -# es# -# et -# et# -# s# -# u# -# y# -# z# -# Z# # # soon: # @@ -44,12 +33,6 @@ # * max and min use positional only with an optional group # and keyword-only # -# * Generate forward slash for docstring first line -# (if I get positional-only syntax pep accepted) -# -# * Add "version" directive, so we can complain if the file -# is too new for us. -# version = '1' @@ -2441,7 +2424,7 @@ ## docstring first line ## - add(f.full_name) + add(f.name) add('(') # populate "right_bracket_count" field for every parameter @@ -2498,29 +2481,32 @@ add(fix_right_bracket_count(0)) add(')') - if f.return_converter.doc_default: - add(' -> ') - add(f.return_converter.doc_default) + # if f.return_converter.doc_default: + # add(' -> ') + # add(f.return_converter.doc_default) docstring_first_line = output() # now fix up the places where the brackets look wrong docstring_first_line = docstring_first_line.replace(', ]', ',] ') - # okay. now we're officially building the - # "prototype" section. - add(docstring_first_line) - + # okay. now we're officially building the "parameters" section. # create substitution text for {parameters} + spacer_line = False for p in parameters: if not p.docstring.strip(): continue - add('\n') + if spacer_line: + add('\n') + else: + spacer_line = True add(" ") add(p.name) add('\n') add(textwrap.indent(rstrip_lines(p.docstring.rstrip()), " ")) - prototype = output() + parameters = output() + if parameters: + parameters += '\n' ## ## docstring body @@ -2549,21 +2535,26 @@ elif len(lines) == 1: # the docstring is only one line right now--the summary line. # add an empty line after the summary line so we have space - # between it and the {prototype} we're about to add. + # between it and the {parameters} we're about to add. 
lines.append('') - prototype_marker_count = len(docstring.split('{prototype}')) - 1 - if prototype_marker_count: - fail('You may not specify {prototype} in a docstring!') - # insert *after* the summary line - lines.insert(2, '{prototype}\n') + parameters_marker_count = len(docstring.split('{parameters}')) - 1 + if parameters_marker_count > 1: + fail('You may not specify {parameters} more than once in a docstring!') + + if not parameters_marker_count: + # insert after summary line + lines.insert(2, '{parameters}') + + # insert at front of docstring + lines.insert(0, docstring_first_line) docstring = "\n".join(lines) add(docstring) docstring = output() - docstring = linear_format(docstring, prototype=prototype) + docstring = linear_format(docstring, parameters=parameters) docstring = docstring.rstrip() return docstring -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 00:54:32 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 00:54:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2313477=3A_Added_co?= =?utf-8?q?mmand_line_interface_to_the_tarfile_module=2E?= Message-ID: <3dRrvr1bq0z7Lkh@mail.python.org> http://hg.python.org/cpython/rev/a5b6c8cbc473 changeset: 87471:a5b6c8cbc473 user: Serhiy Storchaka date: Sun Nov 24 01:53:29 2013 +0200 summary: Issue #13477: Added command line interface to the tarfile module. Original patch by Berker Peksag. files: Doc/library/tarfile.rst | 61 +++++++++ Lib/tarfile.py | 94 +++++++++++++++ Lib/test/test_tarfile.py | 166 ++++++++++++++++++++++++++- Misc/NEWS | 3 + 4 files changed, 323 insertions(+), 1 deletions(-) diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -591,6 +591,67 @@ Return :const:`True` if it is one of character device, block device or FIFO. +.. _tarfile-commandline: + +Command Line Interface +---------------------- + +.. versionadded:: 3.4 + +The :mod:`tarfile` module provides a simple command line interface to interact +with tar archives. + +If you want to create a new tar archive, specify its name after the :option:`-c` +option and then list the filename(s) that should be included:: + + $ python -m tarfile -c monty.tar spam.txt eggs.txt + +Passing a directory is also acceptable:: + + $ python -m tarfile -c monty.tar life-of-brian_1979/ + +If you want to extract a tar archive into the current directory, use +the :option:`-e` option:: + + $ python -m tarfile -e monty.tar + +You can also extract a tar archive into a different directory by passing the +directory's name:: + + $ python -m tarfile -e monty.tar other-dir/ + +For a list of the files in a tar archive, use the :option:`-l` option:: + + $ python -m tarfile -l monty.tar + + +Command line options +~~~~~~~~~~~~~~~~~~~~ + +.. cmdoption:: -l + --list + + List files in a tarfile. + +.. cmdoption:: -c + --create + + Create tarfile from source files. + +.. cmdoption:: -e [] + --extract [] + + Extract tarfile into the current directory if *output_dir* is not specified. + +.. cmdoption:: -t + --test + + Test whether the tarfile is valid or not. + +.. cmdoption:: -v, --verbose + + Verbose output + .. _tar-examples: Examples diff --git a/Lib/tarfile.py b/Lib/tarfile.py --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -2404,3 +2404,97 @@ bltn_open = open open = TarFile.open + + +def main(): + import argparse + + description = 'A simple command line interface for tarfile module.' 
+ parser = argparse.ArgumentParser(description=description) + parser.add_argument('-v', '--verbose', action='store_true', default=False, + help='Verbose output') + group = parser.add_mutually_exclusive_group() + group.add_argument('-l', '--list', metavar='', + help='Show listing of a tarfile') + group.add_argument('-e', '--extract', nargs='+', + metavar=('', ''), + help='Extract tarfile into target dir') + group.add_argument('-c', '--create', nargs='+', + metavar=('', ''), + help='Create tarfile from sources') + group.add_argument('-t', '--test', metavar='', + help='Test if a tarfile is valid') + args = parser.parse_args() + + if args.test: + src = args.test + if is_tarfile(src): + with open(src, 'r') as tar: + tar.getmembers() + print(tar.getmembers(), file=sys.stderr) + if args.verbose: + print('{!r} is a tar archive.'.format(src)) + else: + parser.exit(1, '{!r} is not a tar archive.\n'.format(src)) + + elif args.list: + src = args.list + if is_tarfile(src): + with TarFile.open(src, 'r:*') as tf: + tf.list(verbose=args.verbose) + else: + parser.exit(1, '{!r} is not a tar archive.\n'.format(src)) + + elif args.extract: + if len(args.extract) == 1: + src = args.extract[0] + curdir = os.curdir + elif len(args.extract) == 2: + src, curdir = args.extract + else: + parser.exit(1, parser.format_help()) + + if is_tarfile(src): + with TarFile.open(src, 'r:*') as tf: + tf.extractall(path=curdir) + if args.verbose: + if curdir == '.': + msg = '{!r} file is extracted.'.format(src) + else: + msg = ('{!r} file is extracted ' + 'into {!r} directory.').format(src, curdir) + print(msg) + else: + parser.exit(1, '{!r} is not a tar archive.\n'.format(src)) + + elif args.create: + tar_name = args.create.pop(0) + _, ext = os.path.splitext(tar_name) + compressions = { + # gz + 'gz': 'gz', + 'tgz': 'gz', + # xz + 'xz': 'xz', + 'txz': 'xz', + # bz2 + 'bz2': 'bz2', + 'tbz': 'bz2', + 'tbz2': 'bz2', + 'tb2': 'bz2', + } + tar_mode = 'w:' + compressions[ext] if ext in compressions else 'w' + tar_files = args.create + + with TarFile.open(tar_name, tar_mode) as tf: + for file_name in tar_files: + tf.add(file_name) + + if args.verbose: + print('{!r} file created.'.format(tar_name)) + + else: + parser.exit(1, parser.format_help()) + +if __name__ == '__main__': + main() diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -7,7 +7,7 @@ import unittest import tarfile -from test import support +from test import support, script_helper # Check for our compression modules. 
try: @@ -27,11 +27,13 @@ return md5(data).hexdigest() TEMPDIR = os.path.abspath(support.TESTFN) + "-tardir" +tarextdir = TEMPDIR + '-extract-test' tarname = support.findfile("testtar.tar") gzipname = os.path.join(TEMPDIR, "testtar.tar.gz") bz2name = os.path.join(TEMPDIR, "testtar.tar.bz2") xzname = os.path.join(TEMPDIR, "testtar.tar.xz") tmpname = os.path.join(TEMPDIR, "tmp.tar") +dotlessname = os.path.join(TEMPDIR, "testtar") md5_regtype = "65f477c818ad9e15f7feab0c6d37742f" md5_sparse = "a54fbc4ca4f4399a90e1b27164012fc6" @@ -1724,6 +1726,168 @@ tarfile.itn(0x10000000000, 6, tarfile.GNU_FORMAT) +class CommandLineTest(unittest.TestCase): + + def tarfilecmd(self, *args): + rc, out, err = script_helper.assert_python_ok('-m', 'tarfile', *args) + return out + + def tarfilecmd_failure(self, *args): + return script_helper.assert_python_failure('-m', 'tarfile', *args) + + def make_simple_tarfile(self, tar_name): + files = [support.findfile('tokenize_tests.txt'), + support.findfile('tokenize_tests-no-coding-cookie-' + 'and-utf8-bom-sig-only.txt')] + self.addCleanup(support.unlink, tar_name) + with tarfile.open(tar_name, 'w') as tf: + for tardata in files: + tf.add(tardata, arcname=os.path.basename(tardata)) + + def test_test_command(self): + for tar_name in (tarname, gzipname, bz2name, xzname): + for opt in '-t', '--test': + out = self.tarfilecmd(opt, tar_name) + self.assertEqual(out, b'') + + def test_test_command_verbose(self): + for tar_name in (tarname, gzipname, bz2name, xzname): + for opt in '-v', '--verbose': + out = self.tarfilecmd(opt, '-t', tar_name) + self.assertIn(b'is a tar archive.\n', out) + + def test_test_command_invalid_file(self): + zipname = support.findfile('zipdir.zip') + rc, out, err = self.tarfilecmd_failure('-t', zipname) + self.assertIn(b' is not a tar archive.', err) + self.assertEqual(out, b'') + self.assertEqual(rc, 1) + + for tar_name in (tarname, gzipname, bz2name, xzname): + with self.subTest(tar_name=tar_name): + with open(tar_name, 'rb') as f: + data = f.read() + try: + with open(tmpname, 'wb') as f: + f.write(data[:511]) + rc, out, err = self.tarfilecmd_failure('-t', tmpname) + self.assertEqual(out, b'') + self.assertEqual(rc, 1) + finally: + support.unlink(tmpname) + + def test_list_command(self): + self.make_simple_tarfile(tmpname) + with support.captured_stdout() as t: + with tarfile.open(tmpname, 'r') as tf: + tf.list(verbose=False) + expected = t.getvalue().encode(sys.getfilesystemencoding()) + for opt in '-l', '--list': + out = self.tarfilecmd(opt, tmpname) + self.assertEqual(out, expected) + + def test_list_command_verbose(self): + self.make_simple_tarfile(tmpname) + with support.captured_stdout() as t: + with tarfile.open(tmpname, 'r') as tf: + tf.list(verbose=True) + expected = t.getvalue().encode(sys.getfilesystemencoding()) + for opt in '-v', '--verbose': + out = self.tarfilecmd(opt, '-l', tmpname) + self.assertEqual(out, expected) + + def test_list_command_invalid_file(self): + zipname = support.findfile('zipdir.zip') + rc, out, err = self.tarfilecmd_failure('-l', zipname) + self.assertIn(b' is not a tar archive.', err) + self.assertEqual(out, b'') + self.assertEqual(rc, 1) + + def test_create_command(self): + files = [support.findfile('tokenize_tests.txt'), + support.findfile('tokenize_tests-no-coding-cookie-' + 'and-utf8-bom-sig-only.txt')] + for opt in '-c', '--create': + try: + out = self.tarfilecmd(opt, tmpname, *files) + self.assertEqual(out, b'') + with tarfile.open(tmpname) as tar: + tar.getmembers() + finally: + support.unlink(tmpname) + + def 
test_create_command_verbose(self): + files = [support.findfile('tokenize_tests.txt'), + support.findfile('tokenize_tests-no-coding-cookie-' + 'and-utf8-bom-sig-only.txt')] + for opt in '-v', '--verbose': + try: + out = self.tarfilecmd(opt, '-c', tmpname, *files) + self.assertIn(b' file created.', out) + with tarfile.open(tmpname) as tar: + tar.getmembers() + finally: + support.unlink(tmpname) + + def test_create_command_dotless_filename(self): + files = [support.findfile('tokenize_tests.txt')] + try: + out = self.tarfilecmd('-c', dotlessname, *files) + self.assertEqual(out, b'') + with tarfile.open(dotlessname) as tar: + tar.getmembers() + finally: + support.unlink(dotlessname) + + def test_create_command_dot_started_filename(self): + tar_name = os.path.join(TEMPDIR, ".testtar") + files = [support.findfile('tokenize_tests.txt')] + try: + out = self.tarfilecmd('-c', tar_name, *files) + self.assertEqual(out, b'') + with tarfile.open(tar_name) as tar: + tar.getmembers() + finally: + support.unlink(tar_name) + + def test_extract_command(self): + self.make_simple_tarfile(tmpname) + for opt in '-e', '--extract': + try: + with support.temp_cwd(tarextdir): + out = self.tarfilecmd(opt, tmpname) + self.assertEqual(out, b'') + finally: + support.rmtree(tarextdir) + + def test_extract_command_verbose(self): + self.make_simple_tarfile(tmpname) + for opt in '-v', '--verbose': + try: + with support.temp_cwd(tarextdir): + out = self.tarfilecmd(opt, '-e', tmpname) + self.assertIn(b' file is extracted.', out) + finally: + support.rmtree(tarextdir) + + def test_extract_command_different_directory(self): + self.make_simple_tarfile(tmpname) + try: + with support.temp_cwd(tarextdir): + out = self.tarfilecmd('-e', tmpname, 'spamdir') + self.assertEqual(out, b'') + finally: + support.rmtree(tarextdir) + + def test_extract_command_invalid_file(self): + zipname = support.findfile('zipdir.zip') + with support.temp_cwd(tarextdir): + rc, out, err = self.tarfilecmd_failure('-e', zipname) + self.assertIn(b' is not a tar archive.', err) + self.assertEqual(out, b'') + self.assertEqual(rc, 1) + + class ContextManagerTest(unittest.TestCase): def test_basic(self): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #13477: Added command line interface to the tarfile module. + Original patch by Berker Peksag. + - Issue #19674: inspect.signature() now produces a correct signature for some builtins. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 01:11:51 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 01:11:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Don=27t_attempt_to_run_the?= =?utf-8?q?_=5Fopcode_test_if_it_wasn=27t_built=2E?= Message-ID: <3dRsHq2tsrz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/895cf5801f4c changeset: 87472:895cf5801f4c user: Larry Hastings date: Sat Nov 23 16:11:17 2013 -0800 summary: Don't attempt to run the _opcode test if it wasn't built. 
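The change below swaps a plain "import _opcode" for test.support.import_module(), which raises unittest.SkipTest when the import fails, so builds without the extension report the test as skipped instead of as an error. A minimal sketch of the pattern (the test body is only illustrative):

    import unittest
    from test.support import import_module

    # Raises unittest.SkipTest if the optional extension was not built,
    # so the whole test module is skipped cleanly.
    _opcode = import_module("_opcode")

    class OpcodeModuleTest(unittest.TestCase):
        def test_has_stack_effect(self):
            self.assertTrue(hasattr(_opcode, "stack_effect"))

    if __name__ == "__main__":
        unittest.main()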
files: Lib/test/test__opcode.py | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Lib/test/test__opcode.py b/Lib/test/test__opcode.py --- a/Lib/test/test__opcode.py +++ b/Lib/test/test__opcode.py @@ -1,8 +1,9 @@ import dis -import _opcode -from test.support import run_unittest +from test.support import run_unittest, import_module import unittest +_opcode = import_module("_opcode") + class OpcodeTests(unittest.TestCase): def test_stack_effect(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 01:13:17 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 24 Nov 2013 01:13:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_=5Fopcode_to_Windows_b?= =?utf-8?q?uild_env?= Message-ID: <3dRsKT0p7Tz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/f8d910b7c330 changeset: 87473:f8d910b7c330 parent: 87471:a5b6c8cbc473 user: Christian Heimes date: Sun Nov 24 01:11:57 2013 +0100 summary: Add _opcode to Windows build env files: PC/config.c | 2 ++ PCbuild/pythoncore.vcxproj | 1 + 2 files changed, 3 insertions(+), 0 deletions(-) diff --git a/PC/config.c b/PC/config.c --- a/PC/config.c +++ b/PC/config.c @@ -66,6 +66,7 @@ extern PyObject* _PyWarnings_Init(void); extern PyObject* PyInit__string(void); extern PyObject* PyInit__stat(void); +extern PyObject* PyInit__opcode(void); /* tools/freeze/makeconfig.py marker for additional "extern" */ /* -- ADDMODULE MARKER 1 -- */ @@ -158,6 +159,7 @@ {"_pickle", PyInit__pickle}, {"atexit", PyInit_atexit}, {"_stat", PyInit__stat}, + {"_opcode", PyInit__opcode}, /* Sentinel */ {0, 0} diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -524,6 +524,7 @@ + -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 01:13:18 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 24 Nov 2013 01:13:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dRsKV2Vwkz7Lr2@mail.python.org> http://hg.python.org/cpython/rev/4f5dbf26459a changeset: 87474:4f5dbf26459a parent: 87473:f8d910b7c330 parent: 87472:895cf5801f4c user: Christian Heimes date: Sun Nov 24 01:12:22 2013 +0100 summary: merge files: Lib/test/test__opcode.py | 5 +++-- 1 files changed, 3 insertions(+), 2 deletions(-) diff --git a/Lib/test/test__opcode.py b/Lib/test/test__opcode.py --- a/Lib/test/test__opcode.py +++ b/Lib/test/test__opcode.py @@ -1,8 +1,9 @@ import dis -import _opcode -from test.support import run_unittest +from test.support import run_unittest, import_module import unittest +_opcode = import_module("_opcode") + class OpcodeTests(unittest.TestCase): def test_stack_effect(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 01:14:12 2013 From: python-checkins at python.org (barry.warsaw) Date: Sun, 24 Nov 2013 01:14:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_A_few_more_PEP_updates_-_like?= =?utf-8?q?ly_more_to_come_once_beta_1_is_spun=2E?= Message-ID: <3dRsLX2mR7z7Lsk@mail.python.org> http://hg.python.org/peps/rev/8efed935ad3a changeset: 5317:8efed935ad3a user: Barry Warsaw date: Sat Nov 23 19:14:08 2013 -0500 summary: A few more PEP updates - likely more to come once beta 1 is spun. 
files: pep-0429.txt | 7 ++----- pep-0454.txt | 2 +- 2 files changed, 3 insertions(+), 6 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -76,14 +76,11 @@ * PEP 450, basic statistics module for the standard library * PEP 451, a ModuleSpec Type for the Import System * PEP 453, pip bootstrapping/bundling with CPython +* PEP 454, the tracemalloc module for tracing Python memory allocations * PEP 456, secure and interchangeable hash algorithm +* PEP 3154, Pickle protocol revision 4 * PEP 3156, improved asynchronous IO support -Accepted but not yet implemented/merged: - -* PEP 454, the tracemalloc module for tracing Python memory allocations -* PEP 3154, Pickle protocol revision 4 - Other final large-scale changes: * None so far diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -4,7 +4,7 @@ Last-Modified: $Date$ Author: Victor Stinner BDFL-Delegate: Charles-Fran?ois Natali -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/x-rst Created: 3-September-2013 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 24 01:32:47 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 01:32:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Build_a_list_of_supported_?= =?utf-8?q?test_tarfiles_dynamically_for_CLI_=22test=22_command?= Message-ID: <3dRslz5bNZz7Lqj@mail.python.org> http://hg.python.org/cpython/rev/70b9d22b900a changeset: 87475:70b9d22b900a user: Serhiy Storchaka date: Sun Nov 24 02:30:59 2013 +0200 summary: Build a list of supported test tarfiles dynamically for CLI "test" command tests (issue13477). files: Lib/test/test_tarfile.py | 9 ++++++--- 1 files changed, 6 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -1745,13 +1745,13 @@ tf.add(tardata, arcname=os.path.basename(tardata)) def test_test_command(self): - for tar_name in (tarname, gzipname, bz2name, xzname): + for tar_name in testtarnames: for opt in '-t', '--test': out = self.tarfilecmd(opt, tar_name) self.assertEqual(out, b'') def test_test_command_verbose(self): - for tar_name in (tarname, gzipname, bz2name, xzname): + for tar_name in testtarnames: for opt in '-v', '--verbose': out = self.tarfilecmd(opt, '-t', tar_name) self.assertIn(b'is a tar archive.\n', out) @@ -1763,7 +1763,7 @@ self.assertEqual(out, b'') self.assertEqual(rc, 1) - for tar_name in (tarname, gzipname, bz2name, xzname): + for tar_name in testtarnames: with self.subTest(tar_name=tar_name): with open(tar_name, 'rb') as f: data = f.read() @@ -2015,6 +2015,8 @@ support.unlink(TEMPDIR) os.makedirs(TEMPDIR) + global testtarnames + testtarnames = [tarname] with open(tarname, "rb") as fobj: data = fobj.read() @@ -2022,6 +2024,7 @@ for c in GzipTest, Bz2Test, LzmaTest: if c.open: support.unlink(c.tarname) + testtarnames.append(c.tarname) with c.open(c.tarname, "wb") as tar: tar.write(data) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 01:55:44 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 24 Nov 2013 01:55:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Try_to_fix_test=5Ftarfile_?= =?utf-8?q?under_Windows?= Message-ID: <3dRtGS0dQkz7Lsd@mail.python.org> http://hg.python.org/cpython/rev/a539c85aec51 changeset: 87476:a539c85aec51 user: Antoine Pitrou date: Sun Nov 24 01:55:05 2013 +0100 summary: Try to fix 
test_tarfile under Windows files: Lib/test/test_tarfile.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -1730,7 +1730,7 @@ def tarfilecmd(self, *args): rc, out, err = script_helper.assert_python_ok('-m', 'tarfile', *args) - return out + return out.replace(os.linesep.encode(), b'\n') def tarfilecmd_failure(self, *args): return script_helper.assert_python_failure('-m', 'tarfile', *args) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 02:36:33 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 02:36:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_inspect=2Esignature_te?= =?utf-8?q?sts_for_builtins_when_docstrings_are_compiled_out=2E?= Message-ID: <3dRv9Y694zz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/2dfb81a51eba changeset: 87477:2dfb81a51eba user: Larry Hastings date: Sat Nov 23 17:35:48 2013 -0800 summary: Fix inspect.signature tests for builtins when docstrings are compiled out. files: Lib/test/test_capi.py | 3 +++ Lib/test/test_inspect.py | 7 ++++++- 2 files changed, 9 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -9,6 +9,7 @@ import time import unittest from test import support +from test.support import MISSING_C_DOCSTRINGS try: import _posixsubprocess except ImportError: @@ -109,6 +110,8 @@ self.assertRaises(TypeError, _posixsubprocess.fork_exec, Z(),[b'1'],3,[1, 2],5,6,7,8,9,10,11,12,13,14,15,16,17) + @unittest.skipIf(MISSING_C_DOCSTRINGS, + "Signature information for builtins requires docstrings") def test_docstring_signature_parsing(self): self.assertEqual(_testcapi.no_docstring.__doc__, None) diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -17,6 +17,7 @@ ThreadPoolExecutor = None from test.support import run_unittest, TESTFN, DirsOnSysPath +from test.support import MISSING_C_DOCSTRINGS from test.script_helper import assert_python_ok, assert_python_failure from test import inspect_fodder as mod from test import inspect_fodder2 as mod2 @@ -1579,7 +1580,7 @@ ('kwargs', ..., int, "var_keyword")), ...)) - def test_signature_on_builtin_function(self): + def test_signature_on_unsupported_builtins(self): with self.assertRaisesRegex(ValueError, 'not supported by signature'): inspect.signature(type) with self.assertRaisesRegex(ValueError, 'not supported by signature'): @@ -1588,6 +1589,10 @@ with self.assertRaisesRegex(ValueError, 'not supported by signature'): # support for 'method-wrapper' inspect.signature(min.__call__) + + @unittest.skipIf(MISSING_C_DOCSTRINGS, + "Signature information for builtins requires docstrings") + def test_signature_on_builtins(self): self.assertEqual(inspect.signature(min), None) signature = inspect.signature(os.stat) self.assertTrue(isinstance(signature, inspect.Signature)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 02:47:51 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 02:47:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_19734=3A_better_diag?= =?utf-8?q?nostics_for_test=5Fvenv_failures?= Message-ID: <3dRvQb1WXlz7Lpx@mail.python.org> http://hg.python.org/cpython/rev/cb598129837c changeset: 87478:cb598129837c user: Nick Coghlan date: 
Sun Nov 24 11:36:31 2013 +1000 summary: Issue 19734: better diagnostics for test_venv failures files: Lib/test/test_venv.py | 20 ++++++++++++++++---- Lib/venv/__init__.py | 4 ++-- 2 files changed, 18 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -285,15 +285,27 @@ # warnings in current versions of Python. Ensure related # environment settings don't cause venv to fail. envvars["PYTHONWARNINGS"] = "e" - self.run_with_capture(venv.create, self.env_dir, with_pip=True) + try: + self.run_with_capture(venv.create, self.env_dir, with_pip=True) + except subprocess.CalledProcessError as exc: + # The output this produces can be a little hard to read, but + # least it has all the details + details = exc.output.decode(errors="replace") + msg = "{}\n\n**Subprocess Output**\n{}".format(exc, details) + self.fail(msg) envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) cmd = [envpy, '-m', 'pip', '--version'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) out, err = p.communicate() - self.assertEqual(err, b"") - self.assertTrue(out.startswith(b"pip")) - self.assertIn(self.env_dir.encode(), out) + # We force everything to text, so unittest gives the detailed diff + # if we get unexpected results + err = err.decode("latin-1") # Force to text, prevent decoding errors + self.assertEqual(err, "") + out = out.decode("latin-1") # Force to text, prevent decoding errors + env_dir = os.fsencode(self.env_dir).decode("latin-1") + self.assertTrue(out.startswith("pip")) + self.assertIn(env_dir, out) def test_main(): diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -237,9 +237,9 @@ # We run ensurepip in isolated mode to avoid side effects from # environment vars, the current directory and anything else # intended for the global Python environment - cmd = [context.env_exe, '-Im', 'ensurepip', '--upgrade', + cmd = [context.env_exe, '-m', 'ensurepip', '--upgrade', '--default-pip'] - subprocess.check_output(cmd) + subprocess.check_output(cmd, stderr=subprocess.STDOUT) def setup_scripts(self, context): """ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 02:53:16 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 02:53:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319734=3A_venv_sti?= =?utf-8?q?ll_needs_isolated_mode?= Message-ID: <3dRvXr66fnz7Lpx@mail.python.org> http://hg.python.org/cpython/rev/989de1a267b1 changeset: 87479:989de1a267b1 user: Nick Coghlan date: Sun Nov 24 11:53:03 2013 +1000 summary: Issue #19734: venv still needs isolated mode files: Lib/venv/__init__.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -237,7 +237,7 @@ # We run ensurepip in isolated mode to avoid side effects from # environment vars, the current directory and anything else # intended for the global Python environment - cmd = [context.env_exe, '-m', 'ensurepip', '--upgrade', + cmd = [context.env_exe, '-Im', 'ensurepip', '--upgrade', '--default-pip'] subprocess.check_output(cmd, stderr=subprocess.STDOUT) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 03:33:01 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 24 Nov 2013 03:33:01 +0100 (CET) 
Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_make_distc?= =?utf-8?q?lean_for_out-of-tree_builds?= Message-ID: <3dRwQj4yypz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/73c72bd2322b changeset: 87480:73c72bd2322b branch: 3.3 parent: 87434:f5a626f762f6 user: Christian Heimes date: Sun Nov 24 03:32:40 2013 +0100 summary: Fix make distclean for out-of-tree builds files: Makefile.pre.in | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1413,8 +1413,8 @@ # remove all generated files, even Makefile[.pre] # Keep configure and Python-ast.[ch], it's possible they can't be generated distclean: clobber - for file in Lib/test/data/* ; do \ - if test "$$file" != "Lib/test/data/README"; then rm "$$file"; fi; \ + for file in $(srcdir)/Lib/test/data/* ; do \ + if test "$$file" != "$(srcdir)/Lib/test/data/README"; then rm "$$file"; fi; \ done -rm -f core Makefile Makefile.pre config.status \ Modules/Setup Modules/Setup.local Modules/Setup.config \ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 03:33:02 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 24 Nov 2013 03:33:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Fix_make_distclean_for_out-of-tree_builds?= Message-ID: <3dRwQk6n8cz7Lpx@mail.python.org> http://hg.python.org/cpython/rev/7a18ddbeaaee changeset: 87481:7a18ddbeaaee parent: 87479:989de1a267b1 parent: 87480:73c72bd2322b user: Christian Heimes date: Sun Nov 24 03:32:51 2013 +0100 summary: Fix make distclean for out-of-tree builds files: Makefile.pre.in | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1533,8 +1533,8 @@ # remove all generated files, even Makefile[.pre] # Keep configure and Python-ast.[ch], it's possible they can't be generated distclean: clobber - for file in Lib/test/data/* ; do \ - if test "$$file" != "Lib/test/data/README"; then rm "$$file"; fi; \ + for file in $(srcdir)/Lib/test/data/* ; do \ + if test "$$file" != "$(srcdir)/Lib/test/data/README"; then rm "$$file"; fi; \ done -rm -f core Makefile Makefile.pre config.status \ Modules/Setup Modules/Setup.local Modules/Setup.config \ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 03:41:20 2013 From: python-checkins at python.org (ned.deily) Date: Sun, 24 Nov 2013 03:41:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_typo_in_and_reformat_O?= =?utf-8?q?S_X_Installer_ReadMe?= Message-ID: <3dRwcJ5yG9z7Lry@mail.python.org> http://hg.python.org/cpython/rev/4990fa0baa8b changeset: 87482:4990fa0baa8b user: Ned Deily date: Sat Nov 23 18:40:41 2013 -0800 summary: Fix typo in and reformat OS X Installer ReadMe files: Mac/BuildScript/resources/ReadMe.txt | 15 ++++++++------- 1 files changed, 8 insertions(+), 7 deletions(-) diff --git a/Mac/BuildScript/resources/ReadMe.txt b/Mac/BuildScript/resources/ReadMe.txt --- a/Mac/BuildScript/resources/ReadMe.txt +++ b/Mac/BuildScript/resources/ReadMe.txt @@ -36,18 +36,19 @@ The Python installer now includes an option to automatically install or upgrade pip, a tool for installing and managing Python packages. This option is enabled by default and no Internet access is required. 
-If you do want the installer to do this, select the "Customize" option -at the "Installation Type" step and uncheck the "Install or ugprade -pip" option. +If you do not want the installer to do this, select the "Customize" +option at the "Installation Type" step and uncheck the "Install or +ugprade pip" option. To make it easier to use scripts installed by third-party Python packages, with pip or by other means, the "Shell profile updater" option is now enabled by default, as has been the case with Python 2.7.x installers. You can also turn this option off by selecting -"Customize" and unchecking the "Shell profile updater" option. You can -also update your shell profile later by launching the "Update Shell -Profile" command found in the /Applications/Python $VERSION folder. You may -need to start a new terminal window for the changes to take effect. +"Customize" and unchecking the "Shell profile updater" option. You +can also update your shell profile later by launching the "Update +Shell Profile" command found in the /Applications/Python $VERSION +folder. You may need to start a new terminal window for the +changes to take effect. Python.org Python $VERSION and 2.7.x versions can both be installed and will not conflict. Command names for Python 3 contain a 3 in them, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 03:45:38 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 03:45:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319744_=28temp_wor?= =?utf-8?q?karound=29=3A_without_ssl=2C_skip_pip_test?= Message-ID: <3dRwjG5lpPz7Ll8@mail.python.org> http://hg.python.org/cpython/rev/9891ba920f3c changeset: 87483:9891ba920f3c user: Nick Coghlan date: Sun Nov 24 12:45:25 2013 +1000 summary: Issue #19744 (temp workaround): without ssl, skip pip test files: Lib/test/test_venv.py | 6 ++++++ 1 files changed, 6 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -15,6 +15,10 @@ can_symlink, EnvironmentVarGuard) import unittest import venv +try: + import ssl +except ImportError: + ssl = None skipInVenv = unittest.skipIf(sys.prefix != sys.base_prefix, 'Test not appropriate in a venv') @@ -278,6 +282,8 @@ self.assertEqual(err, b"") self.assertEqual(out.strip(), b"OK") + # Temporary skip for http://bugs.python.org/issue19744 + @unittest.skipIf(ssl is None, 'pip needs SSL support') def test_with_pip(self): shutil.rmtree(self.env_dir) with EnvironmentVarGuard() as envvars: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 03:54:05 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 03:54:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319734=3A_ctypes_r?= =?utf-8?q?esource_management_fixes?= Message-ID: <3dRwv13Yv9z7Ll8@mail.python.org> http://hg.python.org/cpython/rev/7c080ee796a6 changeset: 87484:7c080ee796a6 user: Nick Coghlan date: Sun Nov 24 12:53:50 2013 +1000 summary: Issue #19734: ctypes resource management fixes files: Lib/ctypes/util.py | 15 +++++++++------ 1 files changed, 9 insertions(+), 6 deletions(-) diff --git a/Lib/ctypes/util.py b/Lib/ctypes/util.py --- a/Lib/ctypes/util.py +++ b/Lib/ctypes/util.py @@ -132,8 +132,10 @@ cmd = 'if ! 
type objdump >/dev/null 2>&1; then exit 10; fi;' \ "objdump -p -j .dynamic 2>/dev/null " + f f = os.popen(cmd) - dump = f.read() - rv = f.close() + try: + dump = f.read() + finally: + rv = f.close() if rv == 10: raise OSError('objdump command not found') res = re.search(r'\sSONAME\s+([^\s]+)', dump) @@ -176,10 +178,11 @@ else: cmd = 'env LC_ALL=C /usr/bin/crle 2>/dev/null' - for line in os.popen(cmd).readlines(): - line = line.strip() - if line.startswith('Default Library Path (ELF):'): - paths = line.split()[4] + with contextlib.closing(os.popen(cmd)) as f: + for line in f.readlines(): + line = line.strip() + if line.startswith('Default Library Path (ELF):'): + paths = line.split()[4] if not paths: return None -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 05:29:56 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 05:29:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Make_framing_optional_in_p?= =?utf-8?q?ickle_protocol_4=2E?= Message-ID: <3dRz1c1ZhCz7Ll8@mail.python.org> http://hg.python.org/cpython/rev/de9bda43d552 changeset: 87485:de9bda43d552 user: Alexandre Vassalotti date: Sat Nov 23 20:30:03 2013 -0800 summary: Make framing optional in pickle protocol 4. This will allow us to control in the future whether to use framing or not. For example, we may want to turn it off for tiny pickle where it doesn't help. The change also improves performance slightly: ### fastpickle ### Min: 0.608517 -> 0.557358: 1.09x faster Avg: 0.798892 -> 0.694738: 1.15x faster Significant (t=3.45) Stddev: 0.17145 -> 0.12704: 1.3496x smaller Timeline: http://goo.gl/3xQE1J ### pickle_dict ### Min: 0.669920 -> 0.615271: 1.09x faster Avg: 0.733633 -> 0.645058: 1.14x faster Significant (t=5.05) Stddev: 0.12041 -> 0.02961: 4.0662x smaller Timeline: http://goo.gl/LpLSXI ### pickle_list ### Min: 0.397583 -> 0.368112: 1.08x faster Avg: 0.412784 -> 0.397223: 1.04x faster Significant (t=2.78) Stddev: 0.01518 -> 0.03653: 2.4068x larger Timeline: http://goo.gl/v39E59 ### unpickle_list ### Min: 0.692935 -> 0.594870: 1.16x faster Avg: 0.730012 -> 0.628395: 1.16x faster Significant (t=17.76) Stddev: 0.02720 -> 0.02995: 1.1012x larger Timeline: http://goo.gl/2P9AEt The following not significant results are hidden, use -v to show them: fastunpickle. 
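[Editor's note] A minimal sketch, not part of the commit, of how the now-optional framing can be observed from Python code. It assumes CPython 3.4+ (pickle protocol 4) and the default frame-size target of roughly 64 KiB; the payload and variable names below are made up for illustration.

    import pickle
    import pickletools

    # A payload large enough that the framer is expected to emit FRAME opcodes.
    obj = [bytes([i]) * (64 * 1024) for i in range(4)]
    data = pickle.dumps(obj, protocol=4)

    # Count FRAME opcodes the same way the new test_optional_frames does.
    frames = sum(1 for opcode, _, _ in pickletools.genops(data)
                 if opcode.name == 'FRAME')
    print("FRAME opcodes emitted:", frames)

    # Round-trip still works; per this commit, the unpickler would also accept
    # the same data with FRAME opcodes removed (see test_optional_frames).
    assert pickle.loads(data) == obj
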
files: Lib/pickle.py | 130 +++++++++----------- Lib/test/pickletester.py | 39 ++++++ Modules/_pickle.c | 160 +++++--------------------- 3 files changed, 132 insertions(+), 197 deletions(-) diff --git a/Lib/pickle.py b/Lib/pickle.py --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -188,87 +188,72 @@ self.file_write = file_write self.current_frame = None - def _commit_frame(self): - f = self.current_frame - with f.getbuffer() as data: - n = len(data) - write = self.file_write - write(FRAME) - write(pack(" 0: + self.commit_frame(force=True) self.current_frame = None + def commit_frame(self, force=False): + if self.current_frame: + f = self.current_frame + if f.tell() >= self._FRAME_SIZE_TARGET or force: + with f.getbuffer() as data: + n = len(data) + write = self.file_write + write(FRAME) + write(pack("= self._FRAME_SIZE_TARGET: - self._commit_frame() - return f.write(data) + class _Unframer: def __init__(self, file_read, file_readline, file_tell=None): self.file_read = file_read self.file_readline = file_readline - self.file_tell = file_tell - self.framing_enabled = False self.current_frame = None - self.frame_start = None def read(self, n): - if n == 0: - return b'' - _file_read = self.file_read - if not self.framing_enabled: - return _file_read(n) - f = self.current_frame - if f is not None: - data = f.read(n) - if data: - if len(data) < n: - raise UnpicklingError( - "pickle exhausted before end of frame") - return data - frame_opcode = _file_read(1) - if frame_opcode != FRAME: - raise UnpicklingError( - "expected a FRAME opcode, got {} instead".format(frame_opcode)) - frame_size, = unpack(" sys.maxsize: - raise ValueError("frame size > sys.maxsize: %d" % frame_size) - if self.file_tell is not None: - self.frame_start = self.file_tell() - f = self.current_frame = io.BytesIO(_file_read(frame_size)) - self.readline = f.readline - data = f.read(n) - assert len(data) == n, (len(data), n) - return data + if self.current_frame: + data = self.current_frame.read(n) + if not data and n != 0: + self.current_frame = None + return self.file_read(n) + if len(data) < n: + raise UnpicklingError( + "pickle exhausted before end of frame") + return data + else: + return self.file_read(n) def readline(self): - if not self.framing_enabled: + if self.current_frame: + data = self.current_frame.readline() + if not data: + self.current_frame = None + return self.file_readline() + if data[-1] != b'\n': + raise UnpicklingError( + "pickle exhausted before end of frame") + return data + else: return self.file_readline() - else: - return self.current_frame.readline() - def tell(self): - if self.file_tell is None: - return None - elif self.current_frame is None: - return self.file_tell() - else: - return self.frame_start + self.current_frame.tell() + def load_frame(self, frame_size): + if self.current_frame and self.current_frame.read() != b'': + raise UnpicklingError( + "beginning of a new frame before end of current frame") + self.current_frame = io.BytesIO(self.file_read(frame_size)) # Tools used for pickling. 
@@ -392,6 +377,8 @@ self._file_write = file.write except AttributeError: raise TypeError("file must have a 'write' attribute") + self.framer = _Framer(self._file_write) + self.write = self.framer.write self.memo = {} self.proto = int(protocol) self.bin = protocol >= 1 @@ -417,18 +404,12 @@ raise PicklingError("Pickler.__init__() was not called by " "%s.__init__()" % (self.__class__.__name__,)) if self.proto >= 2: - self._file_write(PROTO + pack("= 4: - framer = _Framer(self._file_write) - framer.start_framing() - self.write = framer.write - else: - framer = None - self.write = self._file_write + self.framer.start_framing() self.save(obj) self.write(STOP) - if framer is not None: - framer.end_framing() + self.framer.end_framing() def memoize(self, obj): """Store an object in the memo.""" @@ -475,6 +456,8 @@ return GET + repr(i).encode("ascii") + b'\n' def save(self, obj, save_persistent_id=True): + self.framer.commit_frame() + # Check for persistent id (defined by a subclass) pid = self.persistent_id(obj) if pid is not None and save_persistent_id: @@ -1078,10 +1061,15 @@ if not 0 <= proto <= HIGHEST_PROTOCOL: raise ValueError("unsupported pickle protocol: %d" % proto) self.proto = proto - if proto >= 4: - self._unframer.framing_enabled = True dispatch[PROTO[0]] = load_proto + def load_frame(self): + frame_size, = unpack(' sys.maxsize: + raise ValueError("frame size > sys.maxsize: %d" % frame_size) + self._unframer.load_frame(frame_size) + dispatch[FRAME[0]] = load_frame + def load_persid(self): pid = self.readline()[:-1].decode("ascii") self.append(self.persistent_load(pid)) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1353,6 +1353,45 @@ n_frames = pickled.count(b'\x00\x00\x00\x00\x00') self.assertGreaterEqual(n_frames, len(obj)) + def test_optional_frames(self): + if pickle.HIGHEST_PROTOCOL < 4: + return + + def remove_frames(pickled, keep_frame=None): + """Remove frame opcodes from the given pickle.""" + frame_starts = [] + # 1 byte for the opcode and 8 for the argument + frame_opcode_size = 9 + for opcode, _, pos in pickletools.genops(pickled): + if opcode.name == 'FRAME': + frame_starts.append(pos) + + newpickle = bytearray() + last_frame_end = 0 + for i, pos in enumerate(frame_starts): + if keep_frame and keep_frame(i): + continue + newpickle += pickled[last_frame_end:pos] + last_frame_end = pos + frame_opcode_size + newpickle += pickled[last_frame_end:] + return newpickle + + target_frame_size = 64 * 1024 + num_frames = 20 + obj = [bytes([i]) * target_frame_size for i in range(num_frames)] + + for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): + pickled = self.dumps(obj, proto) + + frameless_pickle = remove_frames(pickled) + self.assertEqual(count_opcode(pickle.FRAME, frameless_pickle), 0) + self.assertEqual(obj, self.loads(frameless_pickle)) + + some_frames_pickle = remove_frames(pickled, lambda i: i % 2 == 0) + self.assertLess(count_opcode(pickle.FRAME, some_frames_pickle), + count_opcode(pickle.FRAME, pickled)) + self.assertEqual(obj, self.loads(some_frames_pickle)) + def test_nested_names(self): global Nested class Nested: diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -110,10 +110,6 @@ /* Initial size of the write buffer of Pickler. */ WRITE_BUF_SIZE = 4096, - /* Maximum size of the write buffer of Pickler when pickling to a - stream. This is ignored for in-memory pickling. 
*/ - MAX_WRITE_BUF_SIZE = 64 * 1024, - /* Prefetch size when unpickling (disabled on unpeekable streams) */ PREFETCH = 8192 * 16, @@ -381,7 +377,6 @@ char *input_line; Py_ssize_t input_len; Py_ssize_t next_read_idx; - Py_ssize_t frame_end_idx; Py_ssize_t prefetched_idx; /* index of first prefetched byte */ PyObject *read; /* read() method of the input stream. */ @@ -401,7 +396,6 @@ int proto; /* Protocol of the pickle loaded. */ int fix_imports; /* Indicate whether Unpickler should fix the name of globals pickled by Python 2.x. */ - int framing; /* True when framing is enabled, proto >= 4 */ } UnpicklerObject; /* Forward declarations */ @@ -802,46 +796,6 @@ n = data_len; required = self->output_len + n; - if (self->write != NULL && required > MAX_WRITE_BUF_SIZE) { - /* XXX This reallocates a new buffer every time, which is a bit - wasteful. */ - if (_Pickler_FlushToFile(self) < 0) - return -1; - if (_Pickler_ClearBuffer(self) < 0) - return -1; - /* The previous frame was just committed by _Pickler_FlushToFile */ - need_new_frame = self->framing; - if (need_new_frame) - n = data_len + FRAME_HEADER_SIZE; - else - n = data_len; - required = self->output_len + n; - } - if (self->write != NULL && n > MAX_WRITE_BUF_SIZE) { - /* For large pickle chunks, we write directly to the output - file instead of buffering. Note the buffer is empty at this - point (it was flushed above, since required >= n). */ - PyObject *output, *result; - if (need_new_frame) { - char frame_header[FRAME_HEADER_SIZE]; - _Pickler_WriteFrameHeader(self, frame_header, (size_t) data_len); - output = PyBytes_FromStringAndSize(frame_header, FRAME_HEADER_SIZE); - if (output == NULL) - return -1; - result = _Pickler_FastCall(self, self->write, output); - Py_XDECREF(result); - if (result == NULL) - return -1; - } - /* XXX we could spare an intermediate copy and pass - a memoryview instead */ - output = PyBytes_FromStringAndSize(s, data_len); - if (output == NULL) - return -1; - result = _Pickler_FastCall(self, self->write, output); - Py_XDECREF(result); - return (result == NULL) ? -1 : 0; - } if (required > self->max_output_len) { /* Make place in buffer for the pickle chunk */ if (self->output_len >= PY_SSIZE_T_MAX / 2 - n) { @@ -987,7 +941,6 @@ self->input_buffer = self->buffer.buf; self->input_len = self->buffer.len; self->next_read_idx = 0; - self->frame_end_idx = -1; self->prefetched_idx = self->input_len; return self->input_len; } @@ -1052,7 +1005,7 @@ return -1; /* Prefetch some data without advancing the file pointer, if possible */ - if (self->peek && !self->framing) { + if (self->peek) { PyObject *len, *prefetched; len = PyLong_FromSsize_t(PREFETCH); if (len == NULL) { @@ -1100,7 +1053,7 @@ Returns -1 (with an exception set) on failure. On success, return the number of chars read. 
*/ static Py_ssize_t -_Unpickler_ReadUnframed(UnpicklerObject *self, char **s, Py_ssize_t n) +_Unpickler_Read(UnpicklerObject *self, char **s, Py_ssize_t n) { Py_ssize_t num_read; @@ -1126,67 +1079,6 @@ } static Py_ssize_t -_Unpickler_Read(UnpicklerObject *self, char **s, Py_ssize_t n) -{ - if (self->framing && - (self->frame_end_idx == -1 || - self->frame_end_idx <= self->next_read_idx)) { - /* Need to read new frame */ - char *dummy = NULL; - unsigned char *frame_start; - size_t frame_len; - if (_Unpickler_ReadUnframed(self, &dummy, FRAME_HEADER_SIZE) < 0) - return -1; - frame_start = (unsigned char *) dummy; - if (frame_start[0] != (unsigned char)FRAME) { - PyErr_Format(UnpicklingError, - "expected FRAME opcode, got 0x%x instead", - frame_start[0]); - return -1; - } - frame_len = (size_t) frame_start[1]; - frame_len |= (size_t) frame_start[2] << 8; - frame_len |= (size_t) frame_start[3] << 16; - frame_len |= (size_t) frame_start[4] << 24; -#if SIZEOF_SIZE_T >= 8 - frame_len |= (size_t) frame_start[5] << 32; - frame_len |= (size_t) frame_start[6] << 40; - frame_len |= (size_t) frame_start[7] << 48; - frame_len |= (size_t) frame_start[8] << 56; -#else - if (frame_start[5] || frame_start[6] || - frame_start[7] || frame_start[8]) { - PyErr_Format(PyExc_OverflowError, - "Frame size too large for 32-bit build"); - return -1; - } -#endif - if (frame_len > PY_SSIZE_T_MAX) { - PyErr_Format(UnpicklingError, "Invalid frame length"); - return -1; - } - if ((Py_ssize_t) frame_len < n) { - PyErr_Format(UnpicklingError, "Bad framing"); - return -1; - } - if (_Unpickler_ReadUnframed(self, &dummy /* unused */, - frame_len) < 0) - return -1; - /* Rewind to start of frame */ - self->frame_end_idx = self->next_read_idx; - self->next_read_idx -= frame_len; - } - if (self->framing) { - /* Check for bad input */ - if (n + self->next_read_idx > self->frame_end_idx) { - PyErr_Format(UnpicklingError, "Bad framing"); - return -1; - } - } - return _Unpickler_ReadUnframed(self, s, n); -} - -static Py_ssize_t _Unpickler_CopyLine(UnpicklerObject *self, char *line, Py_ssize_t len, char **result) { @@ -1336,7 +1228,6 @@ self->input_line = NULL; self->input_len = 0; self->next_read_idx = 0; - self->frame_end_idx = -1; self->prefetched_idx = 0; self->read = NULL; self->readline = NULL; @@ -1347,7 +1238,6 @@ self->num_marks = 0; self->marks_size = 0; self->proto = 0; - self->framing = 0; self->fix_imports = 0; memset(&self->buffer, 0, sizeof(Py_buffer)); self->memo_size = 32; @@ -1474,8 +1364,6 @@ if (self->fast) return 0; - if (_Pickler_OpcodeBoundary(self)) - return -1; idx = PyMemoTable_Size(self->memo); if (PyMemoTable_Set(self->memo, obj, idx) < 0) @@ -3661,6 +3549,9 @@ PyObject *reduce_value = NULL; int status = 0; + if (_Pickler_OpcodeBoundary(self) < 0) + return -1; + if (Py_EnterRecursiveCall(" while pickling an object")) return -1; @@ -3855,8 +3746,7 @@ status = -1; } done: - if (status == 0) - status = _Pickler_OpcodeBoundary(self); + Py_LeaveRecursiveCall(); Py_XDECREF(reduce_func); Py_XDECREF(reduce_value); @@ -4514,7 +4404,7 @@ int i; size_t x = 0; - for (i = 0; i < nbytes; i++) { + for (i = 0; i < nbytes && i < sizeof(size_t); i++) { x |= (size_t) s[i] << (8 * i); } @@ -5972,7 +5862,6 @@ i = (unsigned char)s[0]; if (i <= HIGHEST_PROTOCOL) { self->proto = i; - self->framing = (self->proto >= 4); return 0; } @@ -5980,16 +5869,39 @@ return -1; } +static int +load_frame(UnpicklerObject *self) +{ + char *s; + Py_ssize_t frame_len; + + if (_Unpickler_Read(self, &s, 8) < 0) + return -1; + + frame_len = 
calc_binsize(s, 8); + if (frame_len < 0) { + PyErr_Format(PyExc_OverflowError, + "FRAME length exceeds system's maximum of %zd bytes", + PY_SSIZE_T_MAX); + return -1; + } + + if (_Unpickler_Read(self, &s, frame_len) < 0) + return -1; + + /* Rewind to start of frame */ + self->next_read_idx -= frame_len; + return 0; +} + static PyObject * load(UnpicklerObject *self) { - PyObject *err; PyObject *value = NULL; char *s; self->num_marks = 0; self->proto = 0; - self->framing = 0; if (Py_SIZE(self->stack)) Pdata_clear(self->stack, 0); @@ -6063,6 +5975,7 @@ OP(BINPERSID, load_binpersid) OP(REDUCE, load_reduce) OP(PROTO, load_proto) + OP(FRAME, load_frame) OP_ARG(EXT1, load_extension, 1) OP_ARG(EXT2, load_extension, 2) OP_ARG(EXT4, load_extension, 4) @@ -6084,11 +5997,7 @@ break; /* and we are done! */ } - /* XXX: It is not clear what this is actually for. */ - if ((err = PyErr_Occurred())) { - if (err == PyExc_EOFError) { - PyErr_SetNone(PyExc_EOFError); - } + if (PyErr_Occurred()) { return NULL; } @@ -6383,7 +6292,6 @@ self->arg = NULL; self->proto = 0; - self->framing = 0; return 0; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 05:58:24 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 05:58:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Disable_annoying_tests_whi?= =?utf-8?q?ch_doesn=27t_work_optimized_pickles=2E?= Message-ID: <3dRzfS13cMz7Ll8@mail.python.org> http://hg.python.org/cpython/rev/a68c303eb8dc changeset: 87486:a68c303eb8dc user: Alexandre Vassalotti date: Sat Nov 23 20:58:24 2013 -0800 summary: Disable annoying tests which doesn't work optimized pickles. files: Lib/test/pickletester.py | 24 +++++++++++++----------- 1 files changed, 13 insertions(+), 11 deletions(-) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1334,12 +1334,13 @@ self.assertEqual(obj, unpickled) # Test the framing heuristic is sane, # assuming a given frame size target. - bytes_per_frame = (len(pickled) / - pickled.count(b'\x00\x00\x00\x00\x00')) - self.assertGreater(bytes_per_frame, - self.FRAME_SIZE_TARGET / 2) - self.assertLessEqual(bytes_per_frame, - self.FRAME_SIZE_TARGET * 1) + # XXX Assumptions here are wrong when the pickle are optimized + # bytes_per_frame = (len(pickled) / + # count_opcode(pickle.FRAME, pickled)) + # self.assertGreater(bytes_per_frame, + # self.FRAME_SIZE_TARGET / 2) + # self.assertLessEqual(bytes_per_frame, + # self.FRAME_SIZE_TARGET * 1) def test_framing_large_objects(self): N = 1024 * 1024 @@ -1350,8 +1351,9 @@ unpickled = self.loads(pickled) self.assertEqual(obj, unpickled) # At least one frame was emitted per large bytes object. 
- n_frames = pickled.count(b'\x00\x00\x00\x00\x00') - self.assertGreaterEqual(n_frames, len(obj)) + # XXX Assumptions here are wrong when the pickle are optimized + # n_frames = count_opcode(pickle.FRAME, pickled) + # self.assertGreaterEqual(n_frames, len(obj)) def test_optional_frames(self): if pickle.HIGHEST_PROTOCOL < 4: @@ -1376,9 +1378,9 @@ newpickle += pickled[last_frame_end:] return newpickle - target_frame_size = 64 * 1024 + frame_size = self.FRAME_SIZE_TARGET num_frames = 20 - obj = [bytes([i]) * target_frame_size for i in range(num_frames)] + obj = [bytes([i]) * frame_size for i in range(num_frames)] for proto in range(4, pickle.HIGHEST_PROTOCOL + 1): pickled = self.dumps(obj, proto) @@ -1387,7 +1389,7 @@ self.assertEqual(count_opcode(pickle.FRAME, frameless_pickle), 0) self.assertEqual(obj, self.loads(frameless_pickle)) - some_frames_pickle = remove_frames(pickled, lambda i: i % 2 == 0) + some_frames_pickle = remove_frames(pickled, lambda i: i % 2) self.assertLess(count_opcode(pickle.FRAME, some_frames_pickle), count_opcode(pickle.FRAME, pickled)) self.assertEqual(obj, self.loads(some_frames_pickle)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 05:59:35 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 05:59:35 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319734=3A_Ensure_t?= =?utf-8?q?est=5Fvenv_ignores_PIP=5FREQUIRE=5FVIRTUALENV?= Message-ID: <3dRzgq0mYYz7Ll8@mail.python.org> http://hg.python.org/cpython/rev/9186fdae7e1f changeset: 87487:9186fdae7e1f user: Nick Coghlan date: Sun Nov 24 14:58:31 2013 +1000 summary: Issue #19734: Ensure test_venv ignores PIP_REQUIRE_VIRTUALENV files: Lib/test/test_venv.py | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -291,6 +291,10 @@ # warnings in current versions of Python. Ensure related # environment settings don't cause venv to fail. 
envvars["PYTHONWARNINGS"] = "e" + # pip doesn't ignore environment variables when running in + # isolated mode, and we don't have an active virtualenv here + # See http://bugs.python.org/issue19734 for details + del envvars["PIP_REQUIRE_VIRTUALENV"] try: self.run_with_capture(venv.create, self.env_dir, with_pip=True) except subprocess.CalledProcessError as exc: -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sun Nov 24 07:47:44 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sun, 24 Nov 2013 07:47:44 +0100 Subject: [Python-checkins] Daily reference leaks (989de1a267b1): sum=0 Message-ID: results for 989de1a267b1 on branch "default" -------------------------------------------- test_site leaked [0, 2, -2] references, sum=0 test_site leaked [0, 2, -2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogmWKC97', '-x'] From python-checkins at python.org Sun Nov 24 07:49:32 2013 From: python-checkins at python.org (nick.coghlan) Date: Sun, 24 Nov 2013 07:49:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319734=3A_Also_run?= =?utf-8?q?_pip_version_check_in_isolated_mode?= Message-ID: <3dS26h6zR5z7LlL@mail.python.org> http://hg.python.org/cpython/rev/124e51c19e4f changeset: 87488:124e51c19e4f user: Nick Coghlan date: Sun Nov 24 16:49:20 2013 +1000 summary: Issue #19734: Also run pip version check in isolated mode files: Lib/test/test_venv.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -304,9 +304,9 @@ msg = "{}\n\n**Subprocess Output**\n{}".format(exc, details) self.fail(msg) envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) - cmd = [envpy, '-m', 'pip', '--version'] + cmd = [envpy, '-Im', 'pip', '--version'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, - stderr=subprocess.PIPE) + stderr=subprocess.PIPE) out, err = p.communicate() # We force everything to text, so unittest gives the detailed diff # if we get unexpected results -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 08:20:31 2013 From: python-checkins at python.org (zach.ware) Date: Sun, 24 Nov 2013 08:20:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=233158=3A_doctest_c?= =?utf-8?q?an_now_find_doctests_in_functions_and_methods?= Message-ID: <3dS2pR4bPwz7LlL@mail.python.org> http://hg.python.org/cpython/rev/95dc3054959b changeset: 87489:95dc3054959b parent: 87487:9186fdae7e1f user: Zachary Ware date: Sun Nov 24 01:19:09 2013 -0600 summary: Issue #3158: doctest can now find doctests in functions and methods written in C. As a part of this, a few doctests have been added to the builtins module (on hex(), oct(), and bin()), a doctest has been fixed (hopefully on all platforms) on float, and test_builtins now runs doctests in builtins. 
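[Editor's note] A short illustration, not part of the patch, of what this change enables: DocTestFinder walking a module implemented in C. It mirrors the new doctest added to test_doctest.py; the exact number of objects found varies between builds, so only the shape of the output is suggested.

    import builtins
    import doctest

    # Collect DocTests from a C module; before this change, functions and
    # methods written in C were skipped by the finder.
    tests = doctest.DocTestFinder().find(builtins)
    with_examples = [t for t in tests if t.examples]
    for t in sorted(with_examples, key=lambda t: t.name):
        # e.g. "1 builtins.bin", "2 builtins.float.fromhex", ...
        print(len(t.examples), t.name)
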
files: Doc/library/doctest.rst | 9 +++++-- Lib/doctest.py | 11 +++++---- Lib/test/test_builtin.py | 21 ++++--------------- Lib/test/test_doctest.py | 29 ++++++++++++++++++++++++++++ Misc/NEWS | 3 ++ Objects/floatobject.c | 2 +- Python/bltinmodule.c | 18 ++++++++++++++-- 7 files changed, 65 insertions(+), 28 deletions(-) diff --git a/Doc/library/doctest.rst b/Doc/library/doctest.rst --- a/Doc/library/doctest.rst +++ b/Doc/library/doctest.rst @@ -278,6 +278,10 @@ Any classes found are recursively searched similarly, to test docstrings in their contained methods and nested classes. +.. impl-detail:: + Prior to version 3.4, extension modules written in C were not fully + searched by doctest. + .. _doctest-finding-examples: @@ -1285,9 +1289,8 @@ A processing class used to extract the :class:`DocTest`\ s that are relevant to a given object, from its docstring and the docstrings of its contained objects. - :class:`DocTest`\ s can currently be extracted from the following object types: - modules, functions, classes, methods, staticmethods, classmethods, and - properties. + :class:`DocTest`\ s can be extracted from modules, classes, functions, + methods, staticmethods, classmethods, and properties. The optional argument *verbose* can be used to display the objects searched by the finder. It defaults to ``False`` (no output). diff --git a/Lib/doctest.py b/Lib/doctest.py --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -918,6 +918,8 @@ return module is inspect.getmodule(object) elif inspect.isfunction(object): return module.__dict__ is object.__globals__ + elif inspect.ismethoddescriptor(object): + return module.__name__ == object.__objclass__.__module__ elif inspect.isclass(object): return module.__name__ == object.__module__ elif hasattr(object, '__module__'): @@ -950,7 +952,7 @@ for valname, val in obj.__dict__.items(): valname = '%s.%s' % (name, valname) # Recurse to functions & classes. - if ((inspect.isfunction(val) or inspect.isclass(val)) and + if ((inspect.isroutine(val) or inspect.isclass(val)) and self._from_module(module, val)): self._find(tests, val, valname, module, source_lines, globs, seen) @@ -962,9 +964,8 @@ raise ValueError("DocTestFinder.find: __test__ keys " "must be strings: %r" % (type(valname),)) - if not (inspect.isfunction(val) or inspect.isclass(val) or - inspect.ismethod(val) or inspect.ismodule(val) or - isinstance(val, str)): + if not (inspect.isroutine(val) or inspect.isclass(val) or + inspect.ismodule(val) or isinstance(val, str)): raise ValueError("DocTestFinder.find: __test__ values " "must be strings, functions, methods, " "classes, or modules: %r" % @@ -983,7 +984,7 @@ val = getattr(obj, valname).__func__ # Recurse to methods, properties, and nested classes. 
- if ((inspect.isfunction(val) or inspect.isclass(val) or + if ((inspect.isroutine(val) or inspect.isclass(val) or isinstance(val, property)) and self._from_module(module, val)): valname = '%s.%s' % (name, valname) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -1592,21 +1592,10 @@ data = 'The quick Brown fox Jumped over The lazy Dog'.split() self.assertRaises(TypeError, sorted, data, None, lambda x,y: 0) -def test_main(verbose=None): - test_classes = (BuiltinTest, TestSorted) - - run_unittest(*test_classes) - - # verify reference counting - if verbose and hasattr(sys, "gettotalrefcount"): - import gc - counts = [None] * 5 - for i in range(len(counts)): - run_unittest(*test_classes) - gc.collect() - counts[i] = sys.gettotalrefcount() - print(counts) - +def load_tests(loader, tests, pattern): + from doctest import DocTestSuite + tests.addTest(DocTestSuite(builtins)) + return tests if __name__ == "__main__": - test_main(verbose=True) + unittest.main() diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -644,6 +644,35 @@ >>> test = doctest.DocTestFinder().find(f)[0] >>> [e.lineno for e in test.examples] [1, 9, 12] + +Finding Doctests in Modules Not Written in Python +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +DocTestFinder can also find doctests in most modules not written in Python. +We'll use builtins as an example, since it almost certainly isn't written in +plain ol' Python and is guaranteed to be available. + + >>> import builtins + >>> tests = doctest.DocTestFinder().find(builtins) + >>> len(tests) # how many objects checked for doctests + 794 + >>> real_tests = [t for t in tests if len(t.examples) > 0] + >>> len(real_tests) # how many objects actually have doctests + 8 + >>> for t in real_tests: + ... print('{} {}'.format(len(t.examples), t.name)) + ... + 1 builtins.bin + 3 builtins.float.as_integer_ratio + 2 builtins.float.fromhex + 2 builtins.float.hex + 1 builtins.hex + 1 builtins.int + 2 builtins.int.bit_length + 1 builtins.oct + +Note here that 'bin', 'oct', and 'hex' are functions; 'float.as_integer_ratio', +'float.hex', and 'int.bit_length' are methods; 'float.fromhex' is a classmethod, +and 'int' is a type. """ def test_DocTestParser(): r""" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #3158: doctest can now find doctests in functions and methods + written in C. + - Issue #13477: Added command line interface to the tarfile module. Original patch by Berker Peksag. 
diff --git a/Objects/floatobject.c b/Objects/floatobject.c --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -1417,7 +1417,7 @@ >>> float.fromhex('0x1.ffffp10')\n\ 2047.984375\n\ >>> float.fromhex('-0x1p-1074')\n\ --4.9406564584124654e-324"); +-5e-324"); static PyObject * diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -350,7 +350,11 @@ PyDoc_STRVAR(bin_doc, "bin(number) -> string\n\ \n\ -Return the binary representation of an integer."); +Return the binary representation of an integer.\n\ +\n\ + >>> bin(2796202)\n\ + '0b1010101010101010101010'\n\ +"); static PyObject * @@ -1276,7 +1280,11 @@ PyDoc_STRVAR(hex_doc, "hex(number) -> string\n\ \n\ -Return the hexadecimal representation of an integer."); +Return the hexadecimal representation of an integer.\n\ +\n\ + >>> hex(3735928559)\n\ + '0xdeadbeef'\n\ +"); static PyObject * @@ -1476,7 +1484,11 @@ PyDoc_STRVAR(oct_doc, "oct(number) -> string\n\ \n\ -Return the octal representation of an integer."); +Return the octal representation of an integer.\n\ +\n\ + >>> oct(342391)\n\ + '0o1234567'\n\ +"); static PyObject * -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 08:20:33 2013 From: python-checkins at python.org (zach.ware) Date: Sun, 24 Nov 2013 08:20:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dS2pT0c32z7LqY@mail.python.org> http://hg.python.org/cpython/rev/a9d4690c27ad changeset: 87490:a9d4690c27ad parent: 87488:124e51c19e4f parent: 87489:95dc3054959b user: Zachary Ware date: Sun Nov 24 01:20:14 2013 -0600 summary: Merge heads files: Doc/library/doctest.rst | 9 +++++-- Lib/doctest.py | 11 +++++---- Lib/test/test_builtin.py | 21 ++++--------------- Lib/test/test_doctest.py | 29 ++++++++++++++++++++++++++++ Misc/NEWS | 3 ++ Objects/floatobject.c | 2 +- Python/bltinmodule.c | 18 ++++++++++++++-- 7 files changed, 65 insertions(+), 28 deletions(-) diff --git a/Doc/library/doctest.rst b/Doc/library/doctest.rst --- a/Doc/library/doctest.rst +++ b/Doc/library/doctest.rst @@ -278,6 +278,10 @@ Any classes found are recursively searched similarly, to test docstrings in their contained methods and nested classes. +.. impl-detail:: + Prior to version 3.4, extension modules written in C were not fully + searched by doctest. + .. _doctest-finding-examples: @@ -1285,9 +1289,8 @@ A processing class used to extract the :class:`DocTest`\ s that are relevant to a given object, from its docstring and the docstrings of its contained objects. - :class:`DocTest`\ s can currently be extracted from the following object types: - modules, functions, classes, methods, staticmethods, classmethods, and - properties. + :class:`DocTest`\ s can be extracted from modules, classes, functions, + methods, staticmethods, classmethods, and properties. The optional argument *verbose* can be used to display the objects searched by the finder. It defaults to ``False`` (no output). 
diff --git a/Lib/doctest.py b/Lib/doctest.py --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -918,6 +918,8 @@ return module is inspect.getmodule(object) elif inspect.isfunction(object): return module.__dict__ is object.__globals__ + elif inspect.ismethoddescriptor(object): + return module.__name__ == object.__objclass__.__module__ elif inspect.isclass(object): return module.__name__ == object.__module__ elif hasattr(object, '__module__'): @@ -950,7 +952,7 @@ for valname, val in obj.__dict__.items(): valname = '%s.%s' % (name, valname) # Recurse to functions & classes. - if ((inspect.isfunction(val) or inspect.isclass(val)) and + if ((inspect.isroutine(val) or inspect.isclass(val)) and self._from_module(module, val)): self._find(tests, val, valname, module, source_lines, globs, seen) @@ -962,9 +964,8 @@ raise ValueError("DocTestFinder.find: __test__ keys " "must be strings: %r" % (type(valname),)) - if not (inspect.isfunction(val) or inspect.isclass(val) or - inspect.ismethod(val) or inspect.ismodule(val) or - isinstance(val, str)): + if not (inspect.isroutine(val) or inspect.isclass(val) or + inspect.ismodule(val) or isinstance(val, str)): raise ValueError("DocTestFinder.find: __test__ values " "must be strings, functions, methods, " "classes, or modules: %r" % @@ -983,7 +984,7 @@ val = getattr(obj, valname).__func__ # Recurse to methods, properties, and nested classes. - if ((inspect.isfunction(val) or inspect.isclass(val) or + if ((inspect.isroutine(val) or inspect.isclass(val) or isinstance(val, property)) and self._from_module(module, val)): valname = '%s.%s' % (name, valname) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -1592,21 +1592,10 @@ data = 'The quick Brown fox Jumped over The lazy Dog'.split() self.assertRaises(TypeError, sorted, data, None, lambda x,y: 0) -def test_main(verbose=None): - test_classes = (BuiltinTest, TestSorted) - - run_unittest(*test_classes) - - # verify reference counting - if verbose and hasattr(sys, "gettotalrefcount"): - import gc - counts = [None] * 5 - for i in range(len(counts)): - run_unittest(*test_classes) - gc.collect() - counts[i] = sys.gettotalrefcount() - print(counts) - +def load_tests(loader, tests, pattern): + from doctest import DocTestSuite + tests.addTest(DocTestSuite(builtins)) + return tests if __name__ == "__main__": - test_main(verbose=True) + unittest.main() diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -644,6 +644,35 @@ >>> test = doctest.DocTestFinder().find(f)[0] >>> [e.lineno for e in test.examples] [1, 9, 12] + +Finding Doctests in Modules Not Written in Python +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +DocTestFinder can also find doctests in most modules not written in Python. +We'll use builtins as an example, since it almost certainly isn't written in +plain ol' Python and is guaranteed to be available. + + >>> import builtins + >>> tests = doctest.DocTestFinder().find(builtins) + >>> len(tests) # how many objects checked for doctests + 794 + >>> real_tests = [t for t in tests if len(t.examples) > 0] + >>> len(real_tests) # how many objects actually have doctests + 8 + >>> for t in real_tests: + ... print('{} {}'.format(len(t.examples), t.name)) + ... 
+ 1 builtins.bin + 3 builtins.float.as_integer_ratio + 2 builtins.float.fromhex + 2 builtins.float.hex + 1 builtins.hex + 1 builtins.int + 2 builtins.int.bit_length + 1 builtins.oct + +Note here that 'bin', 'oct', and 'hex' are functions; 'float.as_integer_ratio', +'float.hex', and 'int.bit_length' are methods; 'float.fromhex' is a classmethod, +and 'int' is a type. """ def test_DocTestParser(): r""" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #3158: doctest can now find doctests in functions and methods + written in C. + - Issue #13477: Added command line interface to the tarfile module. Original patch by Berker Peksag. diff --git a/Objects/floatobject.c b/Objects/floatobject.c --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -1417,7 +1417,7 @@ >>> float.fromhex('0x1.ffffp10')\n\ 2047.984375\n\ >>> float.fromhex('-0x1p-1074')\n\ --4.9406564584124654e-324"); +-5e-324"); static PyObject * diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -350,7 +350,11 @@ PyDoc_STRVAR(bin_doc, "bin(number) -> string\n\ \n\ -Return the binary representation of an integer."); +Return the binary representation of an integer.\n\ +\n\ + >>> bin(2796202)\n\ + '0b1010101010101010101010'\n\ +"); static PyObject * @@ -1276,7 +1280,11 @@ PyDoc_STRVAR(hex_doc, "hex(number) -> string\n\ \n\ -Return the hexadecimal representation of an integer."); +Return the hexadecimal representation of an integer.\n\ +\n\ + >>> hex(3735928559)\n\ + '0xdeadbeef'\n\ +"); static PyObject * @@ -1476,7 +1484,11 @@ PyDoc_STRVAR(oct_doc, "oct(number) -> string\n\ \n\ -Return the octal representation of an integer."); +Return the octal representation of an integer.\n\ +\n\ + >>> oct(342391)\n\ + '0o1234567'\n\ +"); static PyObject * -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 09:22:24 2013 From: python-checkins at python.org (zach.ware) Date: Sun, 24 Nov 2013 09:22:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=233158=3A_Relax_new?= =?utf-8?q?_doctests_a_bit=2E?= Message-ID: <3dS49r42nfz7Lqr@mail.python.org> http://hg.python.org/cpython/rev/30b95368d253 changeset: 87491:30b95368d253 user: Zachary Ware date: Sun Nov 24 02:21:57 2013 -0600 summary: Issue #3158: Relax new doctests a bit. Apparently, the number of objects with docstrings in builtins varies with --with-pydebug (non-debug has one fewer). Also, skip the new tests entirely if built --without-doc-strings. files: Lib/test/test_doctest.py | 13 +++++++++---- 1 files changed, 9 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -409,7 +409,8 @@ """ -def test_DocTestFinder(): r""" +class test_DocTestFinder: + def basics(): r""" Unit tests for the `DocTestFinder` class. 
DocTestFinder is used to extract DocTests from an object's docstring @@ -644,6 +645,10 @@ >>> test = doctest.DocTestFinder().find(f)[0] >>> [e.lineno for e in test.examples] [1, 9, 12] +""" + + if int.__doc__: # simple check for --without-doc-strings, skip if lacking + def non_Python_modules(): r""" Finding Doctests in Modules Not Written in Python ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -653,10 +658,10 @@ >>> import builtins >>> tests = doctest.DocTestFinder().find(builtins) - >>> len(tests) # how many objects checked for doctests - 794 + >>> 790 < len(tests) < 800 # approximate number of objects with docstrings + True >>> real_tests = [t for t in tests if len(t.examples) > 0] - >>> len(real_tests) # how many objects actually have doctests + >>> len(real_tests) # objects that actually have doctests 8 >>> for t in real_tests: ... print('{} {}'.format(len(t.examples), t.name)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 11:29:11 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 11:29:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319741=3A_fix_trac?= =?utf-8?b?ZW1hbGxvY19sb2dfYWxsb2MoKSwgaGFuZGxlIF9QeV9IQVNIVEFCTEVfU0VU?= =?utf-8?q?=28=29_failure?= Message-ID: <3dS7071Fx7z7LlL@mail.python.org> http://hg.python.org/cpython/rev/c189ea6b586b changeset: 87492:c189ea6b586b user: Victor Stinner date: Sun Nov 24 11:28:20 2013 +0100 summary: Issue #19741: fix tracemalloc_log_alloc(), handle _Py_HASHTABLE_SET() failure files: Modules/_tracemalloc.c | 23 ++++++++++++++++------- 1 files changed, 16 insertions(+), 7 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -447,19 +447,28 @@ #endif traceback = traceback_new(); - if (traceback == NULL) + if (traceback == NULL) { + /* Memory allocation failed. The error cannot be reported to the + caller, because realloc() may already have shrink the memory block + and so removed bytes. */ return; + } trace.size = size; trace.traceback = traceback; TABLES_LOCK(); - assert(tracemalloc_traced_memory <= PY_SIZE_MAX - size); - tracemalloc_traced_memory += size; - if (tracemalloc_traced_memory > tracemalloc_max_traced_memory) - tracemalloc_max_traced_memory = tracemalloc_traced_memory; - - _Py_HASHTABLE_SET(tracemalloc_traces, ptr, trace); + if (_Py_HASHTABLE_SET(tracemalloc_traces, ptr, trace) == 0) { + assert(tracemalloc_traced_memory <= PY_SIZE_MAX - size); + tracemalloc_traced_memory += size; + if (tracemalloc_traced_memory > tracemalloc_max_traced_memory) + tracemalloc_max_traced_memory = tracemalloc_traced_memory; + } + else { + /* Hashtabled failed to add a new entry because of a memory allocation + failure. Same than above, the error cannot be reported to the + caller. */ + } TABLES_UNLOCK(); } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 11:41:30 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 11:41:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Make_built-in_methods_pick?= =?utf-8?q?lable_through_the_reduce_protocol=2E?= Message-ID: <3dS7GL2dfdz7Lsh@mail.python.org> http://hg.python.org/cpython/rev/24c84d00cf1f changeset: 87493:24c84d00cf1f user: Alexandre Vassalotti date: Sun Nov 24 02:41:05 2013 -0800 summary: Make built-in methods picklable through the reduce protocol. 
files: Lib/pickle.py | 9 +------- Modules/_pickle.c | 32 ------------------------------ Objects/methodobject.c | 22 +++++++++++++++++++- 3 files changed, 22 insertions(+), 41 deletions(-) diff --git a/Lib/pickle.py b/Lib/pickle.py --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -23,7 +23,7 @@ """ -from types import FunctionType, BuiltinFunctionType, ModuleType +from types import FunctionType, ModuleType from copyreg import dispatch_table from copyreg import _extension_registry, _inverted_registry, _extension_cache from itertools import islice @@ -962,14 +962,7 @@ self.memoize(obj) - def save_method(self, obj): - if obj.__self__ is None or type(obj.__self__) is ModuleType: - self.save_global(obj) - else: - self.save_reduce(getattr, (obj.__self__, obj.__name__), obj=obj) - dispatch[FunctionType] = save_global - dispatch[BuiltinFunctionType] = save_method dispatch[type] = save_global diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -3514,34 +3514,6 @@ } static int -save_method(PicklerObject *self, PyObject *obj) -{ - PyObject *method_self = PyCFunction_GET_SELF(obj); - - if (method_self == NULL || PyModule_Check(method_self)) { - return save_global(self, obj, NULL); - } - else { - PyObject *builtins; - PyObject *getattr; - PyObject *reduce_value; - int status = -1; - _Py_IDENTIFIER(getattr); - - builtins = PyEval_GetBuiltins(); - getattr = _PyDict_GetItemId(builtins, &PyId_getattr); - reduce_value = \ - Py_BuildValue("O(Os)", getattr, method_self, - ((PyCFunctionObject *)obj)->m_ml->ml_name); - if (reduce_value != NULL) { - status = save_reduce(self, reduce_value, obj); - Py_DECREF(reduce_value); - } - return status; - } -} - -static int save(PicklerObject *self, PyObject *obj, int pers_save) { PyTypeObject *type; @@ -3652,10 +3624,6 @@ goto done; } } - else if (type == &PyCFunction_Type) { - status = save_method(self, obj); - goto done; - } /* XXX: This part needs some unit tests. */ diff --git a/Objects/methodobject.c b/Objects/methodobject.c --- a/Objects/methodobject.c +++ b/Objects/methodobject.c @@ -159,6 +159,26 @@ } } +static PyObject * +meth_reduce(PyCFunctionObject *m) +{ + PyObject *builtins; + PyObject *getattr; + _Py_IDENTIFIER(getattr); + + if (m->m_self == NULL || PyModule_Check(m->m_self)) + return PyUnicode_FromString(m->m_ml->ml_name); + + builtins = PyEval_GetBuiltins(); + getattr = _PyDict_GetItemId(builtins, &PyId_getattr); + return Py_BuildValue("O(Os)", getattr, m->m_self, m->m_ml->ml_name); +} + +static PyMethodDef meth_methods[] = { + {"__reduce__", (PyCFunction)meth_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + /* * finds the docstring's introspection signature. * if present, returns a pointer pointing to the first '('. 
@@ -394,7 +414,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + meth_methods, /* tp_methods */ meth_members, /* tp_members */ meth_getsets, /* tp_getset */ 0, /* tp_base */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 11:53:50 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 11:53:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Make_Ellipsis_and_NotImple?= =?utf-8?q?mented_picklable_through_the_reduce_protocol=2E?= Message-ID: <3dS7XZ0Pm8z7Ll8@mail.python.org> http://hg.python.org/cpython/rev/04ccb2e0307e changeset: 87494:04ccb2e0307e user: Alexandre Vassalotti date: Sun Nov 24 02:53:45 2013 -0800 summary: Make Ellipsis and NotImplemented picklable through the reduce protocol. files: Lib/pickle.py | 8 ------- Modules/_pickle.c | 32 ------------------------------- Objects/object.c | 13 +++++++++++- Objects/sliceobject.c | 13 +++++++++++- 4 files changed, 24 insertions(+), 42 deletions(-) diff --git a/Lib/pickle.py b/Lib/pickle.py --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -633,14 +633,6 @@ self.write(NONE) dispatch[type(None)] = save_none - def save_ellipsis(self, obj): - self.save_global(Ellipsis, 'Ellipsis') - dispatch[type(Ellipsis)] = save_ellipsis - - def save_notimplemented(self, obj): - self.save_global(NotImplemented, 'NotImplemented') - dispatch[type(NotImplemented)] = save_notimplemented - def save_bool(self, obj): if self.proto >= 2: self.write(NEWTRUE if obj else NEWFALSE) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -3172,30 +3172,6 @@ } static int -save_ellipsis(PicklerObject *self, PyObject *obj) -{ - PyObject *str = PyUnicode_FromString("Ellipsis"); - int res; - if (str == NULL) - return -1; - res = save_global(self, Py_Ellipsis, str); - Py_DECREF(str); - return res; -} - -static int -save_notimplemented(PicklerObject *self, PyObject *obj) -{ - PyObject *str = PyUnicode_FromString("NotImplemented"); - int res; - if (str == NULL) - return -1; - res = save_global(self, Py_NotImplemented, str); - Py_DECREF(str); - return res; -} - -static int save_pers(PicklerObject *self, PyObject *obj, PyObject *func) { PyObject *pid = NULL; @@ -3552,14 +3528,6 @@ status = save_none(self, obj); goto done; } - else if (obj == Py_Ellipsis) { - status = save_ellipsis(self, obj); - goto done; - } - else if (obj == Py_NotImplemented) { - status = save_notimplemented(self, obj); - goto done; - } else if (obj == Py_False || obj == Py_True) { status = save_bool(self, obj); goto done; diff --git a/Objects/object.c b/Objects/object.c --- a/Objects/object.c +++ b/Objects/object.c @@ -1465,6 +1465,17 @@ } static PyObject * +NotImplemented_reduce(PyObject *op) +{ + return PyUnicode_FromString("NotImplemented"); +} + +static PyMethodDef notimplemented_methods[] = { + {"__reduce__", (PyCFunction)NotImplemented_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + +static PyObject * notimplemented_new(PyTypeObject *type, PyObject *args, PyObject *kwargs) { if (PyTuple_GET_SIZE(args) || (kwargs && PyDict_Size(kwargs))) { @@ -1511,7 +1522,7 @@ 0, /*tp_weaklistoffset */ 0, /*tp_iter */ 0, /*tp_iternext */ - 0, /*tp_methods */ + notimplemented_methods, /*tp_methods */ 0, /*tp_members */ 0, /*tp_getset */ 0, /*tp_base */ diff --git a/Objects/sliceobject.c b/Objects/sliceobject.c --- a/Objects/sliceobject.c +++ b/Objects/sliceobject.c @@ -33,6 +33,17 @@ return PyUnicode_FromString("Ellipsis"); } +static PyObject * 
+ellipsis_reduce(PyObject *op) +{ + return PyUnicode_FromString("Ellipsis"); +} + +static PyMethodDef ellipsis_methods[] = { + {"__reduce__", (PyCFunction)ellipsis_reduce, METH_NOARGS, NULL}, + {NULL, NULL} +}; + PyTypeObject PyEllipsis_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "ellipsis", /* tp_name */ @@ -61,7 +72,7 @@ 0, /* tp_weaklistoffset */ 0, /* tp_iter */ 0, /* tp_iternext */ - 0, /* tp_methods */ + ellipsis_methods, /* tp_methods */ 0, /* tp_members */ 0, /* tp_getset */ 0, /* tp_base */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 12:02:49 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 12:02:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319741=3A_tracemal?= =?utf-8?q?loc=3A_report_tracemalloc=5Flog=5Falloc=28=29_failure_to_the_ca?= =?utf-8?q?ller?= Message-ID: <3dS7kx3Y75z7Lsq@mail.python.org> http://hg.python.org/cpython/rev/d8de3f4c7662 changeset: 87495:d8de3f4c7662 user: Victor Stinner date: Sun Nov 24 11:37:15 2013 +0100 summary: Issue #19741: tracemalloc: report tracemalloc_log_alloc() failure to the caller for new allocations, but not when a memory block was already resized files: Modules/_tracemalloc.c | 47 ++++++++++++++++++----------- 1 files changed, 29 insertions(+), 18 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -436,40 +436,35 @@ return traceback; } -static void +static int tracemalloc_log_alloc(void *ptr, size_t size) { traceback_t *traceback; trace_t trace; + int res; #ifdef WITH_THREAD assert(PyGILState_Check()); #endif traceback = traceback_new(); - if (traceback == NULL) { - /* Memory allocation failed. The error cannot be reported to the - caller, because realloc() may already have shrink the memory block - and so removed bytes. */ - return; - } + if (traceback == NULL) + return -1; trace.size = size; trace.traceback = traceback; TABLES_LOCK(); - if (_Py_HASHTABLE_SET(tracemalloc_traces, ptr, trace) == 0) { + res = _Py_HASHTABLE_SET(tracemalloc_traces, ptr, trace); + if (res == 0) { assert(tracemalloc_traced_memory <= PY_SIZE_MAX - size); tracemalloc_traced_memory += size; if (tracemalloc_traced_memory > tracemalloc_max_traced_memory) tracemalloc_max_traced_memory = tracemalloc_traced_memory; } - else { - /* Hashtabled failed to add a new entry because of a memory allocation - failure. Same than above, the error cannot be reported to the - caller. */ - } TABLES_UNLOCK(); + + return res; } static void @@ -512,11 +507,16 @@ #endif #endif ptr = alloc->malloc(alloc->ctx, size); + + if (ptr != NULL) { + if (tracemalloc_log_alloc(ptr, size) < 0) { + /* Memory allocation failed */ + alloc->free(alloc->ctx, ptr); + ptr = NULL; + } + } set_reentrant(0); - if (ptr != NULL) - tracemalloc_log_alloc(ptr, size); - #if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) if (!gil_held) PyGILState_Release(gil_state); @@ -561,14 +561,25 @@ #endif #endif ptr2 = alloc->realloc(alloc->ctx, ptr, new_size); - set_reentrant(0); if (ptr2 != NULL) { if (ptr != NULL) tracemalloc_log_free(ptr); - tracemalloc_log_alloc(ptr2, new_size); + if (tracemalloc_log_alloc(ptr2, new_size) < 0) { + if (ptr == NULL) { + /* Memory allocation failed */ + alloc->free(alloc->ctx, ptr2); + ptr2 = NULL; + } + else { + /* Memory allocation failed. The error cannot be reported to + the caller, because realloc() may already have shrinked the + memory block and so removed bytes. 
*/ + } + } } + set_reentrant(0); #if defined(TRACE_RAW_MALLOC) && defined(WITH_THREAD) if (!gil_held) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 12:07:39 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 12:07:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_code_path_in_cpickl?= =?utf-8?q?e_that_does_not_exist_in_pickle=2E?= Message-ID: <3dS7rW0Ggyz7Ll8@mail.python.org> http://hg.python.org/cpython/rev/6bd1f0a27e8e changeset: 87496:6bd1f0a27e8e user: Alexandre Vassalotti date: Sun Nov 24 03:07:35 2013 -0800 summary: Remove code path in cpickle that does not exist in pickle. files: Modules/_pickle.c | 8 +------- 1 files changed, 1 insertions(+), 7 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -3584,13 +3584,7 @@ } else if (type == &PyFunction_Type) { status = save_global(self, obj, NULL); - if (status < 0 && PyErr_ExceptionMatches(PickleError)) { - /* fall back to reduce */ - PyErr_Clear(); - } - else { - goto done; - } + goto done; } /* XXX: This part needs some unit tests. */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 12:28:31 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 12:28:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319741=3A_cleanup_?= =?utf-8?q?tracemalloc=5Frealloc=28=29?= Message-ID: <3dS8Jb5KFkzS9B@mail.python.org> http://hg.python.org/cpython/rev/dadb5ed301c7 changeset: 87497:dadb5ed301c7 user: Victor Stinner date: Sun Nov 24 12:27:59 2013 +0100 summary: Issue #19741: cleanup tracemalloc_realloc() Explain that unhandled error case is very unlikely files: Modules/_tracemalloc.c | 23 +++++++++++++++-------- 1 files changed, 15 insertions(+), 8 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -563,20 +563,27 @@ ptr2 = alloc->realloc(alloc->ctx, ptr, new_size); if (ptr2 != NULL) { - if (ptr != NULL) + if (ptr != NULL) { + /* resize */ tracemalloc_log_free(ptr); - if (tracemalloc_log_alloc(ptr2, new_size) < 0) { - if (ptr == NULL) { + if (tracemalloc_log_alloc(ptr2, new_size) < 0) { + /* Memory allocation failed. The error cannot be reported to + the caller, because realloc() may already have shrinked the + memory block and so removed bytes. + + This case is very unlikely since we just released an hash + entry, so we have enough free bytes to allocate the new + entry. */ + } + } + else { + /* new allocation */ + if (tracemalloc_log_alloc(ptr2, new_size) < 0) { /* Memory allocation failed */ alloc->free(alloc->ctx, ptr2); ptr2 = NULL; } - else { - /* Memory allocation failed. The error cannot be reported to - the caller, because realloc() may already have shrinked the - memory block and so removed bytes. */ - } } } set_reentrant(0); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 12:41:18 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 24 Nov 2013 12:41:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_a_few_markup_problems_?= =?utf-8?q?in_the_new_import_doc=2E?= Message-ID: <3dS8bL4yfhz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/f958f3a2b1d8 changeset: 87498:f958f3a2b1d8 user: Georg Brandl date: Sun Nov 24 12:39:56 2013 +0100 summary: Fix a few markup problems in the new import doc. 
files: Doc/reference/import.rst | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Doc/reference/import.rst b/Doc/reference/import.rst --- a/Doc/reference/import.rst +++ b/Doc/reference/import.rst @@ -231,7 +231,7 @@ range and scope of module searching. Finders do not actually load modules. If they can find the named module, they -return a :term:`module spec`, an encapsulation of the module's import-related +return a :dfn:`module spec`, an encapsulation of the module's import-related information, which the import machinery then uses when loading the module. The following sections describe the protocol for finders and loaders in more @@ -381,7 +381,7 @@ * After the module is created but before execution, the import machinery sets the import-related module attributes ("init_module_attrs"), as - summarized in a `later section `_. + summarized in a :ref:`later section `. * Module execution is the key moment of loading in which the module's namespace gets populated. Execution is entirely delegated to the @@ -474,6 +474,8 @@ .. versionadded:: 3.4 +.. _import-mod-attrs: + Import-related module attributes -------------------------------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 13:24:10 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 13:24:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Clinic=3A_Add_warning_for_?= =?utf-8?q?untested_=28and_unused_in_CPython!=29_format_units=2E?= Message-ID: <3dS9Xp5lXTz7LlL@mail.python.org> http://hg.python.org/cpython/rev/3f7a00a61c67 changeset: 87499:3f7a00a61c67 user: Larry Hastings date: Sun Nov 24 04:23:35 2013 -0800 summary: Clinic: Add warning for untested (and unused in CPython!) format units. files: Tools/clinic/clinic.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -1656,6 +1656,7 @@ format_unit = 'et#' if format_unit.endswith('#'): + print("Warning: code using format unit ", repr(format_unit), "probably doesn't work properly.") # TODO set pointer to NULL # TODO add cleanup for buffer pass -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 13:42:31 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 13:42:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Clinic=3A_fix_=22self_conv?= =?utf-8?q?erters=22_with_METH=5FNOARGS_functions=2E?= Message-ID: <3dS9xz1NkJz7Lkh@mail.python.org> http://hg.python.org/cpython/rev/b67b19859fb1 changeset: 87500:b67b19859fb1 user: Larry Hastings date: Sun Nov 24 04:41:57 2013 -0800 summary: Clinic: fix "self converters" with METH_NOARGS functions. files: Modules/zlibmodule.c | 87 +++++++++++++++-------------- Tools/clinic/clinic.py | 2 +- 2 files changed, 47 insertions(+), 42 deletions(-) diff --git a/Modules/zlibmodule.c b/Modules/zlibmodule.c --- a/Modules/zlibmodule.c +++ b/Modules/zlibmodule.c @@ -312,6 +312,9 @@ type = 'unsigned int' converter = 'uint_converter' +class compobject_converter(self_converter): + type = "compobject *" + [python]*/ /*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ @@ -746,6 +749,8 @@ zlib.Decompress.decompress + self: compobject + data: Py_buffer The binary data to decompress. 
max_length: uint = 0 @@ -780,7 +785,7 @@ {"decompress", (PyCFunction)zlib_Decompress_decompress, METH_VARARGS, zlib_Decompress_decompress__doc__}, static PyObject * -zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length); +zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int max_length); static PyObject * zlib_Decompress_decompress(PyObject *self, PyObject *args) @@ -793,7 +798,7 @@ "y*|O&:decompress", &data, uint_converter, &max_length)) goto exit; - return_value = zlib_Decompress_decompress_impl(self, &data, max_length); + return_value = zlib_Decompress_decompress_impl((compobject *)self, &data, max_length); exit: /* Cleanup for data */ @@ -804,10 +809,9 @@ } static PyObject * -zlib_Decompress_decompress_impl(PyObject *self, Py_buffer *data, unsigned int max_length) -/*[clinic checksum: 4683928665a1fa6987f5c57cada4a22807a78fbb]*/ +zlib_Decompress_decompress_impl(compobject *self, Py_buffer *data, unsigned int max_length) +/*[clinic checksum: 3599698948f5a712f5a8309491671cc2ce969d2c]*/ { - compobject *zself = (compobject *)self; int err; unsigned int old_length, length = DEFAULTALLOC; PyObject *RetVal = NULL; @@ -825,21 +829,21 @@ if (!(RetVal = PyBytes_FromStringAndSize(NULL, length))) return NULL; - ENTER_ZLIB(zself); + ENTER_ZLIB(self); - start_total_out = zself->zst.total_out; - zself->zst.avail_in = (unsigned int)data->len; - zself->zst.next_in = data->buf; - zself->zst.avail_out = length; - zself->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal); + start_total_out = self->zst.total_out; + self->zst.avail_in = (unsigned int)data->len; + self->zst.next_in = data->buf; + self->zst.avail_out = length; + self->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal); Py_BEGIN_ALLOW_THREADS - err = inflate(&(zself->zst), Z_SYNC_FLUSH); + err = inflate(&(self->zst), Z_SYNC_FLUSH); Py_END_ALLOW_THREADS - if (err == Z_NEED_DICT && zself->zdict != NULL) { + if (err == Z_NEED_DICT && self->zdict != NULL) { Py_buffer zdict_buf; - if (PyObject_GetBuffer(zself->zdict, &zdict_buf, PyBUF_SIMPLE) == -1) { + if (PyObject_GetBuffer(self->zdict, &zdict_buf, PyBUF_SIMPLE) == -1) { Py_DECREF(RetVal); RetVal = NULL; goto error; @@ -853,24 +857,24 @@ goto error; } - err = inflateSetDictionary(&(zself->zst), + err = inflateSetDictionary(&(self->zst), zdict_buf.buf, (unsigned int)zdict_buf.len); PyBuffer_Release(&zdict_buf); if (err != Z_OK) { - zlib_error(zself->zst, err, "while decompressing data"); + zlib_error(self->zst, err, "while decompressing data"); Py_CLEAR(RetVal); goto error; } /* Repeat the call to inflate. */ Py_BEGIN_ALLOW_THREADS - err = inflate(&(zself->zst), Z_SYNC_FLUSH); + err = inflate(&(self->zst), Z_SYNC_FLUSH); Py_END_ALLOW_THREADS } /* While Z_OK and the output buffer is full, there might be more output. So extend the output buffer and try again. */ - while (err == Z_OK && zself->zst.avail_out == 0) { + while (err == Z_OK && self->zst.avail_out == 0) { /* If max_length set, don't continue decompressing if we've already reached the limit. 
*/ @@ -887,16 +891,16 @@ Py_CLEAR(RetVal); goto error; } - zself->zst.next_out = + self->zst.next_out = (unsigned char *)PyBytes_AS_STRING(RetVal) + old_length; - zself->zst.avail_out = length - old_length; + self->zst.avail_out = length - old_length; Py_BEGIN_ALLOW_THREADS - err = inflate(&(zself->zst), Z_SYNC_FLUSH); + err = inflate(&(self->zst), Z_SYNC_FLUSH); Py_END_ALLOW_THREADS } - if (save_unconsumed_input(zself, err) < 0) { + if (save_unconsumed_input(self, err) < 0) { Py_DECREF(RetVal); RetVal = NULL; goto error; @@ -905,24 +909,24 @@ if (err == Z_STREAM_END) { /* This is the logical place to call inflateEnd, but the old behaviour of only calling it on flush() is preserved. */ - zself->eof = 1; + self->eof = 1; } else if (err != Z_OK && err != Z_BUF_ERROR) { /* We will only get Z_BUF_ERROR if the output buffer was full but there wasn't more output when we tried again, so it is not an error condition. */ - zlib_error(zself->zst, err, "while decompressing data"); + zlib_error(self->zst, err, "while decompressing data"); Py_DECREF(RetVal); RetVal = NULL; goto error; } - if (_PyBytes_Resize(&RetVal, zself->zst.total_out - start_total_out) < 0) { + if (_PyBytes_Resize(&RetVal, self->zst.total_out - start_total_out) < 0) { Py_CLEAR(RetVal); } error: - LEAVE_ZLIB(zself); + LEAVE_ZLIB(self); return RetVal; } @@ -1027,6 +1031,8 @@ /*[clinic] zlib.Compress.copy + self: compobject + Return a copy of the compression object. [clinic]*/ @@ -1038,10 +1044,9 @@ {"copy", (PyCFunction)zlib_Compress_copy, METH_NOARGS, zlib_Compress_copy__doc__}, static PyObject * -zlib_Compress_copy(PyObject *self) -/*[clinic checksum: 8d30351f05defbc2b335c2a78d18f07aa367bb1d]*/ +zlib_Compress_copy(compobject *self) +/*[clinic checksum: 0b37c07f8f27deb7d4769951fbecf600a1006ef8]*/ { - compobject *zself = (compobject *)self; compobject *retval = NULL; int err; @@ -1051,8 +1056,8 @@ /* Copy the zstream state * We use ENTER_ZLIB / LEAVE_ZLIB to make this thread-safe */ - ENTER_ZLIB(zself); - err = deflateCopy(&retval->zst, &zself->zst); + ENTER_ZLIB(self); + err = deflateCopy(&retval->zst, &self->zst); switch(err) { case(Z_OK): break; @@ -1064,28 +1069,28 @@ "Can't allocate memory for compression object"); goto error; default: - zlib_error(zself->zst, err, "while copying compression object"); + zlib_error(self->zst, err, "while copying compression object"); goto error; } - Py_INCREF(zself->unused_data); - Py_INCREF(zself->unconsumed_tail); - Py_XINCREF(zself->zdict); + Py_INCREF(self->unused_data); + Py_INCREF(self->unconsumed_tail); + Py_XINCREF(self->zdict); Py_XDECREF(retval->unused_data); Py_XDECREF(retval->unconsumed_tail); Py_XDECREF(retval->zdict); - retval->unused_data = zself->unused_data; - retval->unconsumed_tail = zself->unconsumed_tail; - retval->zdict = zself->zdict; - retval->eof = zself->eof; + retval->unused_data = self->unused_data; + retval->unconsumed_tail = self->unconsumed_tail; + retval->zdict = self->zdict; + retval->eof = self->eof; /* Mark it as being initialized */ retval->is_initialised = 1; - LEAVE_ZLIB(zself); + LEAVE_ZLIB(self); return (PyObject *)retval; error: - LEAVE_ZLIB(zself); + LEAVE_ZLIB(self); Py_XDECREF(retval); return NULL; } diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -403,7 +403,7 @@ def meth_noargs_pyobject_template(self, methoddef_flags=""): return self.template_base("METH_NOARGS", methoddef_flags) + """ static PyObject * -{c_basename}({self_type}{self_name}) +{c_basename}({impl_parameters}) """ 
def meth_noargs_template(self, methoddef_flags=""): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 14:33:43 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 24 Nov 2013 14:33:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Selectively_re-enable_fram?= =?utf-8?q?ing_tests?= Message-ID: <3dSC531F2Gz7LpQ@mail.python.org> http://hg.python.org/cpython/rev/dda396830882 changeset: 87501:dda396830882 user: Antoine Pitrou date: Sun Nov 24 14:33:37 2013 +0100 summary: Selectively re-enable framing tests files: Lib/test/pickletester.py | 30 +++++++++++++++-------- Lib/test/test_pickletools.py | 2 + 2 files changed, 21 insertions(+), 11 deletions(-) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -438,6 +438,8 @@ class AbstractPickleTests(unittest.TestCase): # Subclass must define self.dumps, self.loads. + optimized = False + _testdata = create_data() def setUp(self): @@ -1334,13 +1336,16 @@ self.assertEqual(obj, unpickled) # Test the framing heuristic is sane, # assuming a given frame size target. - # XXX Assumptions here are wrong when the pickle are optimized - # bytes_per_frame = (len(pickled) / - # count_opcode(pickle.FRAME, pickled)) - # self.assertGreater(bytes_per_frame, - # self.FRAME_SIZE_TARGET / 2) - # self.assertLessEqual(bytes_per_frame, - # self.FRAME_SIZE_TARGET * 1) + if self.optimized: + # These assumptions are currently invalid for optimized + # pickles (see e.g. issue19754). + continue + bytes_per_frame = (len(pickled) / + count_opcode(pickle.FRAME, pickled)) + self.assertGreater(bytes_per_frame, + self.FRAME_SIZE_TARGET / 2) + self.assertLessEqual(bytes_per_frame, + self.FRAME_SIZE_TARGET * 1) def test_framing_large_objects(self): N = 1024 * 1024 @@ -1350,10 +1355,13 @@ pickled = self.dumps(obj, proto) unpickled = self.loads(pickled) self.assertEqual(obj, unpickled) - # At least one frame was emitted per large bytes object. - # XXX Assumptions here are wrong when the pickle are optimized - # n_frames = count_opcode(pickle.FRAME, pickled) - # self.assertGreaterEqual(n_frames, len(obj)) + n_frames = count_opcode(pickle.FRAME, pickled) + if self.optimized: + # At least one frame was emitted (see issue19754). + self.assertGreaterEqual(n_frames, 1) + else: + # At least one frame was emitted per large bytes object. 
+ self.assertGreaterEqual(n_frames, len(obj)) def test_optional_frames(self): if pickle.HIGHEST_PROTOCOL < 4: diff --git a/Lib/test/test_pickletools.py b/Lib/test/test_pickletools.py --- a/Lib/test/test_pickletools.py +++ b/Lib/test/test_pickletools.py @@ -6,6 +6,8 @@ class OptimizedPickleTests(AbstractPickleTests, AbstractPickleModuleTests): + optimized = True + def dumps(self, arg, proto=None): return pickletools.optimize(pickle.dumps(arg, proto)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 14:58:25 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 24 Nov 2013 14:58:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319743=3A_fix_test?= =?utf-8?q?=5Fgdb_on_some_optimized_Python_builds?= Message-ID: <3dSCdY31Zpz7Lrt@mail.python.org> http://hg.python.org/cpython/rev/7a14cde3c4df changeset: 87502:7a14cde3c4df user: Antoine Pitrou date: Sun Nov 24 14:58:17 2013 +0100 summary: Issue #19743: fix test_gdb on some optimized Python builds files: Lib/test/test_gdb.py | 12 ++++++++---- 1 files changed, 8 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -145,7 +145,7 @@ args += [script] # print args - # print ' '.join(args) + # print (' '.join(args)) # Use "args" to invoke gdb, capturing stdout, stderr: out, err = run_gdb(*args, PYTHONHASHSEED=PYTHONHASHSEED) @@ -191,6 +191,11 @@ # # For a nested structure, the first time we hit the breakpoint will # give us the top-level structure + + # NOTE: avoid decoding too much of the traceback as some + # undecodable characters may lurk there in optimized mode + # (issue #19743). + cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, cmds_after_breakpoint=cmds_after_breakpoint, import_site=import_site) @@ -221,11 +226,10 @@ gdb_output = self.get_stack_trace('id(42)') self.assertTrue(BREAKPOINT_FN in gdb_output) - def assertGdbRepr(self, val, exp_repr=None, cmds_after_breakpoint=None): + def assertGdbRepr(self, val, exp_repr=None): # Ensure that gdb's rendering of the value in a debugged process # matches repr(value) in this process: - gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')', - cmds_after_breakpoint) + gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')') if not exp_repr: exp_repr = repr(val) self.assertEqual(gdb_repr, exp_repr, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 15:01:49 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 24 Nov 2013 15:01:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NzQz?= =?utf-8?q?=3A_fix_test=5Fgdb_on_some_optimized_Python_builds?= Message-ID: <3dSCjT3q18zSV8@mail.python.org> http://hg.python.org/cpython/rev/e42b449f73fd changeset: 87503:e42b449f73fd branch: 3.3 parent: 87480:73c72bd2322b user: Antoine Pitrou date: Sun Nov 24 14:58:17 2013 +0100 summary: Issue #19743: fix test_gdb on some optimized Python builds files: Lib/test/test_gdb.py | 12 ++++++++---- 1 files changed, 8 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -141,7 +141,7 @@ args += [script] # print args - # print ' '.join(args) + # print (' '.join(args)) # Use "args" to invoke gdb, capturing stdout, stderr: out, err = run_gdb(*args, PYTHONHASHSEED='0') @@ -186,6 
+186,11 @@ # # For a nested structure, the first time we hit the breakpoint will # give us the top-level structure + + # NOTE: avoid decoding too much of the traceback as some + # undecodable characters may lurk there in optimized mode + # (issue #19743). + cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, cmds_after_breakpoint=cmds_after_breakpoint, import_site=import_site) @@ -216,11 +221,10 @@ gdb_output = self.get_stack_trace('id(42)') self.assertTrue(BREAKPOINT_FN in gdb_output) - def assertGdbRepr(self, val, exp_repr=None, cmds_after_breakpoint=None): + def assertGdbRepr(self, val, exp_repr=None): # Ensure that gdb's rendering of the value in a debugged process # matches repr(value) in this process: - gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')', - cmds_after_breakpoint) + gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')') if not exp_repr: exp_repr = repr(val) self.assertEqual(gdb_repr, exp_repr, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 15:01:50 2013 From: python-checkins at python.org (antoine.pitrou) Date: Sun, 24 Nov 2013 15:01:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dSCjV5g9xz7LrW@mail.python.org> http://hg.python.org/cpython/rev/790b1dc94fe0 changeset: 87504:790b1dc94fe0 parent: 87502:7a14cde3c4df parent: 87503:e42b449f73fd user: Antoine Pitrou date: Sun Nov 24 15:01:39 2013 +0100 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 15:55:10 2013 From: python-checkins at python.org (eli.bendersky) Date: Sun, 24 Nov 2013 15:55:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Cosmetic_fixes?= Message-ID: <3dSDv210w5z7LlL@mail.python.org> http://hg.python.org/cpython/rev/e02bf9919f01 changeset: 87505:e02bf9919f01 user: Eli Bendersky date: Sun Nov 24 06:55:04 2013 -0800 summary: Cosmetic fixes files: Modules/_elementtree.c | 21 ++++++++------------- 1 files changed, 8 insertions(+), 13 deletions(-) diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -567,8 +567,9 @@ PyObject* attrib = NULL; if (!PyArg_ParseTuple(args, "O!O|O!:SubElement", &Element_Type, &parent, &tag, - &PyDict_Type, &attrib)) + &PyDict_Type, &attrib)) { return NULL; + } if (attrib) { /* attrib passed as positional arg */ @@ -652,7 +653,6 @@ } /* -------------------------------------------------------------------- */ -/* methods (in alphabetical order) */ static PyObject* element_append(ElementObject* self, PyObject* args) @@ -696,8 +696,7 @@ return NULL; element = (ElementObject*) create_new_element( - self->tag, (self->extra) ? self->extra->attrib : Py_None - ); + self->tag, (self->extra) ? 
self->extra->attrib : Py_None); if (!element) return NULL; @@ -710,7 +709,6 @@ Py_INCREF(JOIN_OBJ(element->tail)); if (self->extra) { - if (element_resize(element, self->extra->length) < 0) { Py_DECREF(element); return NULL; @@ -722,7 +720,6 @@ } element->extra->length = self->extra->length; - } return (PyObject*) element; @@ -779,7 +776,6 @@ element->tail = JOIN_SET(tail, JOIN_GET(self->tail)); if (self->extra) { - if (element_resize(element, self->extra->length) < 0) goto error; @@ -793,7 +789,6 @@ } element->extra->length = self->extra->length; - } /* add object to memo dictionary (so deepcopy won't visit it again) */ @@ -1141,8 +1136,8 @@ for (i = 0; i < self->extra->length; i++) { ElementObject* item = (ElementObject*) self->extra->children[i]; - if (Element_CheckExact(item) && (PyObject_RichCompareBool(item->tag, tag, Py_EQ) == 1)) { - + if (Element_CheckExact(item) && + (PyObject_RichCompareBool(item->tag, tag, Py_EQ) == 1)) { PyObject* text = element_get_text(item); if (text == Py_None) return PyUnicode_New(0, 0); @@ -1207,12 +1202,12 @@ elementtreestate *st = ET_STATE_GLOBAL; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:iterfind", kwlist, - &tag, &namespaces)) + &tag, &namespaces)) { return NULL; + } return _PyObject_CallMethodId( - st->elementpath_obj, &PyId_iterfind, "OOO", self, tag, namespaces - ); + st->elementpath_obj, &PyId_iterfind, "OOO", self, tag, namespaces); } static PyObject* -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 16:11:26 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 24 Nov 2013 16:11:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_suspicious_markup_in_t?= =?utf-8?q?he_docs=2E?= Message-ID: <3dSFFp5BL2z7Lln@mail.python.org> http://hg.python.org/cpython/rev/f99e67cdedfb changeset: 87506:f99e67cdedfb user: Georg Brandl date: Sun Nov 24 16:09:26 2013 +0100 summary: Fix suspicious markup in the docs. files: Doc/library/email.contentmanager.rst | 4 ++-- Doc/library/email.message.rst | 2 +- Doc/library/gettext.rst | 6 +++--- Doc/library/unittest.rst | 4 ++-- Doc/tools/sphinxext/susp-ignored.csv | 16 +++++++++------- Doc/whatsnew/3.4.rst | 2 +- Misc/NEWS | 8 ++++---- 7 files changed, 22 insertions(+), 20 deletions(-) diff --git a/Doc/library/email.contentmanager.rst b/Doc/library/email.contentmanager.rst --- a/Doc/library/email.contentmanager.rst +++ b/Doc/library/email.contentmanager.rst @@ -96,7 +96,7 @@ only it when looking for candidate matches. Otherwise consider only the first (default root) part of the ``multipart/related``. - If a part has a :mailheader:``Content-Disposition`` header, only consider + If a part has a :mailheader:`Content-Disposition` header, only consider the part a candidate match if the value of the header is ``inline``. If none of the candidates matches any of the preferences in @@ -134,7 +134,7 @@ Return an iterator over all of the immediate sub-parts of the message, which will be empty for a non-``multipart``. (See also - :meth:``~email.message.walk``.) + :meth:`~email.message.walk`.) .. method:: get_content(*args, content_manager=None, **kw) diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -35,7 +35,7 @@ If *policy* is specified (it must be an instance of a :mod:`~email.policy` class) use the rules it specifies to udpate and serialize the representation - of the message. If *policy* is not set, use the :class`compat32 + of the message. 
If *policy* is not set, use the :class:`compat32 ` policy, which maintains backward compatibility with the Python 3.2 version of the email package. For more information see the :mod:`~email.policy` documentation. diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -476,9 +476,9 @@ :program:`xgettext`, :program:`pygettext`, and similar tools generate :file:`.po` files that are message catalogs. They are structured -:human-readable files that contain every marked string in the source -:code, along with a placeholder for the translated versions of these -:strings. +human-readable files that contain every marked string in the source +code, along with a placeholder for the translated versions of these +strings. Copies of these :file:`.po` files are then handed over to the individual human translators who write translations for every diff --git a/Doc/library/unittest.rst b/Doc/library/unittest.rst --- a/Doc/library/unittest.rst +++ b/Doc/library/unittest.rst @@ -901,8 +901,8 @@ | :meth:`assertWarnsRegex(warn, r, fun, *args, **kwds) | ``fun(*args, **kwds)`` raises *warn* | 3.2 | | ` | and the message matches regex *r* | | +---------------------------------------------------------+--------------------------------------+------------+ - | :meth:`assertLogs(logger, level)` | The ``with`` block logs on *logger* | 3.4 | - | ` | with minimum *level* | | + | :meth:`assertLogs(logger, level) | The ``with`` block logs on *logger* | 3.4 | + | ` | with minimum *level* | | +---------------------------------------------------------+--------------------------------------+------------+ .. method:: assertRaises(exception, callable, *args, **kwds) diff --git a/Doc/tools/sphinxext/susp-ignored.csv b/Doc/tools/sphinxext/susp-ignored.csv --- a/Doc/tools/sphinxext/susp-ignored.csv +++ b/Doc/tools/sphinxext/susp-ignored.csv @@ -141,15 +141,8 @@ library/logging.handlers,,:port,host:port library/mmap,,:i2,obj[i1:i2] library/multiprocessing,,`,# Add more tasks using `put()` -library/multiprocessing,,`,# A test file for the `multiprocessing` package -library/multiprocessing,,`,# A test of `multiprocessing.Pool` class -library/multiprocessing,,`,# `BaseManager`. -library/multiprocessing,,`,# in the original order then consider using `Pool.map()` or library/multiprocessing,,`,">>> l._callmethod('__getitem__', (20,)) # equiv to `l[20]`" library/multiprocessing,,`,">>> l._callmethod('__getslice__', (2, 7)) # equiv to `l[2:7]`" -library/multiprocessing,,`,# Not sure if we should synchronize access to `socket.accept()` method by -library/multiprocessing,,`,# object. (We import `multiprocessing.reduction` to enable this pickling.) -library/multiprocessing,,`,# `Pool.imap()` (which will save on the amount of code needed anyway). 
library/multiprocessing,,:queue,">>> QueueManager.register('get_queue', callable=lambda:queue)" library/multiprocessing,,`,# register the Foo class; make `f()` and `g()` accessible via proxy library/multiprocessing,,`,# register the Foo class; make `g()` and `_h()` accessible via proxy @@ -158,6 +151,10 @@ library/nntplib,,:lines,:lines library/optparse,,:len,"del parser.rargs[:len(value)]" library/os.path,,:foo,c:foo +library/pathlib,,:bar,">>> PureWindowsPath('c:/Windows', 'd:bar')" +library/pathlib,,:bar,PureWindowsPath('d:bar') +library/pathlib,,:Program,>>> PureWindowsPath('c:Program Files/').root +library/pathlib,,:Program,>>> PureWindowsPath('c:Program Files/').anchor library/pdb,,:lineno,filename:lineno library/pickle,,:memory,"conn = sqlite3.connect("":memory:"")" library/posix,,`,"CFLAGS=""`getconf LFS_CFLAGS`"" OPT=""-g -O2 $CFLAGS""" @@ -200,7 +197,12 @@ library/tarfile,,:xz,'w:xz' library/time,,:mm, library/time,,:ss, +library/tracemalloc,,:limit,"for index, stat in enumerate(top_stats[:limit], 1):" library/turtle,,::,Example:: +library/unittest,1412,:foo,"self.assertEqual(cm.output, ['INFO:foo:first message'," +library/unittest,1412,:first,"self.assertEqual(cm.output, ['INFO:foo:first message'," +library/unittest,1412,:foo,'ERROR:foo.bar:second message']) +library/unittest,1412,:second,'ERROR:foo.bar:second message']) library/urllib.request,,:close,Connection:close library/urllib.request,,:lang,"xmlns=""http://www.w3.org/1999/xhtml"" xml:lang=""en"" lang=""en"">\n\n\n" library/urllib.request,,:password,"""joe:password at python.org""" diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -424,7 +424,7 @@ The encoding and decoding functions in :mod:`base64` now accept any :term:`bytes-like object` in cases where it previously required a -:class:`bytes` or :class:`bytearray` instance (:issue`17839`) +:class:`bytes` or :class:`bytearray` instance (:issue:`17839`). colorsys diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2518,7 +2518,7 @@ - Issue #14398: Fix size truncation and overflow bugs in the bz2 module. - Issue #12692: Fix resource leak in urllib.request when talking to an HTTP - server that does not include a "Connection: close" header in its responses. + server that does not include a ``Connection: close`` header in its responses. - Issue #12034: Fix bogus caching of result in check_GetFinalPathNameByHandle. Patch by Atsuo Ishimoto. @@ -6091,7 +6091,7 @@ given as a low fd, it gets overwritten. - Issue #12576: Fix urlopen behavior on sites which do not send (or obfuscates) - Connection:close header. + ``Connection: close`` header. - Issue #12560: Build libpython.so on OpenBSD. Patch by Stefan Sperling. @@ -6686,7 +6686,7 @@ - Issue #11127: Raise a TypeError when trying to pickle a socket object. -- Issue #11563: Connection:close header is sent by requests using URLOpener +- Issue #11563: ``Connection: close`` header is sent by requests using URLOpener class which helps in closing of sockets after connection is over. Patch contributions by Jeff McNeil and Nadeem Vawda. @@ -7262,7 +7262,7 @@ - Issue #11505: improves test coverage of string.py. Patch by Alicia Arlen -- Issue #11490: test_subprocess:test_leaking_fds_on_error no longer gives a +- Issue #11490: test_subprocess.test_leaking_fds_on_error no longer gives a false positive if the last directory in the path is inaccessible. 
- Issue #11223: Fix test_threadsignals to fail, not hang, when the -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 16:18:30 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 24 Nov 2013 16:18:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Doc/Makefile?= =?utf-8?q?=3A_also_do_=22make_suspicious=22_during_daily_autobuild?= Message-ID: <3dSFPy478Wz7Lln@mail.python.org> http://hg.python.org/cpython/rev/30763929082c changeset: 87507:30763929082c branch: 2.7 parent: 87462:f075a7178108 user: Georg Brandl date: Sun Nov 24 16:17:54 2013 +0100 summary: Doc/Makefile: also do "make suspicious" during daily autobuild files: Doc/Makefile | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Doc/Makefile b/Doc/Makefile --- a/Doc/Makefile +++ b/Doc/Makefile @@ -170,6 +170,7 @@ autobuild-dev: make update make dist SPHINXOPTS='-A daily=1 -A versionswitcher=1' + -make suspicious # for quick rebuilds (HTML only) autobuild-html: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 16:18:53 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 24 Nov 2013 16:18:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Doc/Makefile?= =?utf-8?q?=3A_also_do_=22make_suspicious=22_during_daily_autobuild?= Message-ID: <3dSFQP0ZRhz7Lln@mail.python.org> http://hg.python.org/cpython/rev/27347b955d4a changeset: 87508:27347b955d4a branch: 3.3 parent: 87503:e42b449f73fd user: Georg Brandl date: Sun Nov 24 16:17:54 2013 +0100 summary: Doc/Makefile: also do "make suspicious" during daily autobuild files: Doc/Makefile | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Doc/Makefile b/Doc/Makefile --- a/Doc/Makefile +++ b/Doc/Makefile @@ -186,6 +186,7 @@ autobuild-dev: make update make dist SPHINXOPTS='-A daily=1 -A versionswitcher=1' + -make suspicious # for quick rebuilds (HTML only) autobuild-html: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 16:18:54 2013 From: python-checkins at python.org (georg.brandl) Date: Sun, 24 Nov 2013 16:18:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dSFQQ2cfGz7Lln@mail.python.org> http://hg.python.org/cpython/rev/7cc76d9564fa changeset: 87509:7cc76d9564fa parent: 87506:f99e67cdedfb parent: 87508:27347b955d4a user: Georg Brandl date: Sun Nov 24 16:18:23 2013 +0100 summary: merge with 3.3 files: Doc/Makefile | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Doc/Makefile b/Doc/Makefile --- a/Doc/Makefile +++ b/Doc/Makefile @@ -186,6 +186,7 @@ autobuild-dev: make update make dist SPHINXOPTS='-A daily=1 -A versionswitcher=1' + -make suspicious # for quick rebuilds (HTML only) autobuild-html: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 17:17:45 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 17:17:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTQ1?= =?utf-8?q?=3A_Avoid_chained_exceptions_while_passing_stray_=25_to?= Message-ID: <3dSGkK0sHhz7Lln@mail.python.org> http://hg.python.org/cpython/rev/ce1578f4d105 changeset: 87510:ce1578f4d105 branch: 3.3 parent: 87508:27347b955d4a user: Serhiy Storchaka date: Sun Nov 24 18:15:37 2013 +0200 summary: Issue #19545: Avoid chained exceptions while passing stray % to time.strptime(). 
Initial patch by Claudiu Popa. files: Lib/_strptime.py | 2 +- Lib/test/test_strptime.py | 4 ++++ Lib/test/test_time.py | 4 ++++ Misc/NEWS | 3 +++ 4 files changed, 12 insertions(+), 1 deletions(-) diff --git a/Lib/_strptime.py b/Lib/_strptime.py --- a/Lib/_strptime.py +++ b/Lib/_strptime.py @@ -329,7 +329,7 @@ (bad_directive, format)) from None # IndexError only occurs when the format string is "%" except IndexError: - raise ValueError("stray %% in format '%s'" % format) + raise ValueError("stray %% in format '%s'" % format) from None _regex_cache[format] = format_regex found = format_regex.match(data_string) if not found: diff --git a/Lib/test/test_strptime.py b/Lib/test/test_strptime.py --- a/Lib/test/test_strptime.py +++ b/Lib/test/test_strptime.py @@ -223,6 +223,10 @@ with self.assertRaises(ValueError) as e: _strptime._strptime_time('', '%D') self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + _strptime._strptime_time('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_unconverteddata(self): # Check ValueError is raised when there is unconverted data diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -198,6 +198,10 @@ with self.assertRaises(ValueError) as e: time.strptime('', '%D') self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + time.strptime('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_asctime(self): time.asctime(time.gmtime(self.t)) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -13,6 +13,9 @@ Library ------- +- Issue #19545: Avoid chained exceptions while passing stray % to + time.strptime(). Initial patch by Claudiu Popa. + - Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 17:17:46 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 17:17:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319545=3A_Avoid_chained_exceptions_while_passing?= =?utf-8?q?_stray_=25_to?= Message-ID: <3dSGkL2nmTz7Lsh@mail.python.org> http://hg.python.org/cpython/rev/2bf4741515a7 changeset: 87511:2bf4741515a7 parent: 87509:7cc76d9564fa parent: 87510:ce1578f4d105 user: Serhiy Storchaka date: Sun Nov 24 18:17:11 2013 +0200 summary: Issue #19545: Avoid chained exceptions while passing stray % to time.strptime(). Initial patch by Claudiu Popa. 
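A brief illustration of what the "raise ... from None" added above buys: the internal IndexError that detects the bare trailing "%" is recorded but suppressed, so the caller sees a single clean ValueError. This is only a sketch of the behaviour the new tests assert, not additional test code:

    import time

    try:
        time.strptime('19', '%Y %')         # format ends with a stray '%'
    except ValueError as exc:
        print(exc)                          # stray % in format '%Y %'
        print(exc.__suppress_context__)     # True: the IndexError is not chained
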
files: Lib/_strptime.py | 2 +- Lib/test/test_strptime.py | 4 ++++ Lib/test/test_time.py | 4 ++++ Misc/NEWS | 3 +++ 4 files changed, 12 insertions(+), 1 deletions(-) diff --git a/Lib/_strptime.py b/Lib/_strptime.py --- a/Lib/_strptime.py +++ b/Lib/_strptime.py @@ -329,7 +329,7 @@ (bad_directive, format)) from None # IndexError only occurs when the format string is "%" except IndexError: - raise ValueError("stray %% in format '%s'" % format) + raise ValueError("stray %% in format '%s'" % format) from None _regex_cache[format] = format_regex found = format_regex.match(data_string) if not found: diff --git a/Lib/test/test_strptime.py b/Lib/test/test_strptime.py --- a/Lib/test/test_strptime.py +++ b/Lib/test/test_strptime.py @@ -223,6 +223,10 @@ with self.assertRaises(ValueError) as e: _strptime._strptime_time('', '%D') self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + _strptime._strptime_time('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_unconverteddata(self): # Check ValueError is raised when there is unconverted data diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -198,6 +198,10 @@ with self.assertRaises(ValueError) as e: time.strptime('', '%D') self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + time.strptime('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_asctime(self): time.asctime(time.gmtime(self.t)) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -68,6 +68,9 @@ Library ------- +- Issue #19545: Avoid chained exceptions while passing stray % to + time.strptime(). Initial patch by Claudiu Popa. + - Issue #3158: doctest can now find doctests in functions and methods written in C. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 17:27:33 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 17:27:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fixed_merging_?= =?utf-8?q?error_in_changeset_3912934e99ba_=28issue_=2319733=29=2E?= Message-ID: <3dSGxd6CR8z7Lln@mail.python.org> http://hg.python.org/cpython/rev/77e3e395f446 changeset: 87512:77e3e395f446 branch: 2.7 parent: 87507:30763929082c user: Serhiy Storchaka date: Sun Nov 24 18:26:20 2013 +0200 summary: Fixed merging error in changeset 3912934e99ba (issue #19733). 
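The fix summarized above removes stray Mercurial conflict markers that slipped into the committed file, as the diff below shows. As a purely hypothetical illustration (not part of CPython or of this changeset), a tree-wide scan along these lines is one way such merge accidents can be caught before they land:

    import os
    import re

    # Lines made up of seven '<', '=' or '>' characters are the usual
    # conflict-marker shapes; this is a rough heuristic, not an exact check.
    MARKER = re.compile(r'^(<{7}|={7}|>{7})')

    def find_conflict_markers(root):
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith('.py'):
                    continue
                path = os.path.join(dirpath, name)
                with open(path, encoding='utf-8', errors='replace') as f:
                    for lineno, line in enumerate(f, 1):
                        if MARKER.match(line):
                            yield path, lineno, line.rstrip()

    for path, lineno, line in find_conflict_markers('Lib'):
        print('%s:%d: %s' % (path, lineno, line))
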
files: Lib/lib-tk/test/test_tkinter/test_widgets.py | 5 ----- 1 files changed, 0 insertions(+), 5 deletions(-) diff --git a/Lib/lib-tk/test/test_tkinter/test_widgets.py b/Lib/lib-tk/test/test_tkinter/test_widgets.py --- a/Lib/lib-tk/test/test_tkinter/test_widgets.py +++ b/Lib/lib-tk/test/test_tkinter/test_widgets.py @@ -260,13 +260,8 @@ test_highlightthickness = StandardOptionsTests.test_highlightthickness.im_func -<<<<<<< -======= - test_highlightthickness = StandardOptionsTests.test_highlightthickness - @unittest.skipIf(sys.platform == 'darwin', 'crashes with Cocoa Tk (issue19733)') ->>>>>>> def test_image(self): widget = self.create() image = Tkinter.PhotoImage('image1') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 17:34:29 2013 From: python-checkins at python.org (barry.warsaw) Date: Sun, 24 Nov 2013 17:34:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_397_is_marked_Final=2E__?= =?utf-8?q?=28Closes_issue19755=29=2E?= Message-ID: <3dSH5d3B8Mz7Lp5@mail.python.org> http://hg.python.org/peps/rev/fa10b8c5bb00 changeset: 5318:fa10b8c5bb00 user: Barry Warsaw date: Sun Nov 24 11:34:23 2013 -0500 summary: PEP 397 is marked Final. (Closes issue19755). files: pep-0397.txt | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/pep-0397.txt b/pep-0397.txt --- a/pep-0397.txt +++ b/pep-0397.txt @@ -4,7 +4,7 @@ Last-Modified: $Date: 2012/06/19 15:13:49 $ Author: Mark Hammond , Martin v. L?wis -Status: Accepted +Status: Final Type: Standards Track Content-Type: text/plain Created: 15-Mar-2011 -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 24 18:51:55 2013 From: python-checkins at python.org (richard.oudkerk) Date: Sun, 24 Nov 2013 18:51:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319740=3A_Use_Wait?= =?utf-8?q?ForSingleObject=28=29_instead_of_trusting_TimerOrWaitFired=2E?= Message-ID: <3dSJpz4RTgz7Lq0@mail.python.org> http://hg.python.org/cpython/rev/d71db7fe4872 changeset: 87513:d71db7fe4872 parent: 87511:2bf4741515a7 user: Richard Oudkerk date: Sun Nov 24 17:50:40 2013 +0000 summary: Issue #19740: Use WaitForSingleObject() instead of trusting TimerOrWaitFired. files: Lib/asyncio/windows_events.py | 11 +++++++++-- 1 files changed, 9 insertions(+), 2 deletions(-) diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -327,14 +327,21 @@ handle, self._iocp, ov.address, ms) f = _WaitHandleFuture(wh, loop=self._loop) - def finish(timed_out, _, ov): + def finish(trans, key, ov): if not f.cancelled(): try: _overlapped.UnregisterWait(wh) except OSError as e: if e.winerror != _overlapped.ERROR_IO_PENDING: raise - return not timed_out + # Note that this second wait means that we should only use + # this with handles types where a successful wait has no + # effect. So events or processes are all right, but locks + # or semaphores are not. Also note if the handle is + # signalled and then quickly reset, then we may return + # False even though we have not timed out. 
+ return (_winapi.WaitForSingleObject(handle, 0) == + _winapi.WAIT_OBJECT_0) self._cache[ov.address] = (f, ov, None, finish) return f -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 18:56:55 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 18:56:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319753=3A_Try_to_f?= =?utf-8?q?ix_test=5Fgdb_on_SystemZ_buildbot?= Message-ID: <3dSJwl02D4z7Lp5@mail.python.org> http://hg.python.org/cpython/rev/6e5eab3add6c changeset: 87514:6e5eab3add6c user: Victor Stinner date: Sun Nov 24 18:55:25 2013 +0100 summary: Issue #19753: Try to fix test_gdb on SystemZ buildbot files: Lib/test/test_gdb.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -170,6 +170,7 @@ 'Do you need "set solib-search-path" or ' '"set sysroot"?', 'warning: Source file is more recent than executable.', + 'Missing separate debuginfo for ', ) for line in errlines: if not line.startswith(ignore_patterns): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 19:23:52 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 19:23:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319636=3A_Fix_posi?= =?utf-8?q?x=5F=5Fgetvolumepathname=28=29=2C_raise_an_OverflowError_if?= Message-ID: <3dSKWr3x9xz7LjM@mail.python.org> http://hg.python.org/cpython/rev/b3eb42a657a3 changeset: 87515:b3eb42a657a3 user: Victor Stinner date: Sun Nov 24 19:22:57 2013 +0100 summary: Issue #19636: Fix posix__getvolumepathname(), raise an OverflowError if the length doesn't fit in an DWORD files: Modules/posixmodule.c | 19 ++++++++++++++----- 1 files changed, 14 insertions(+), 5 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -385,6 +385,8 @@ #endif #endif +#define DWORD_MAX 4294967295U + #ifdef MS_WINDOWS static int @@ -4039,24 +4041,31 @@ { PyObject *po, *result; wchar_t *path, *mountpath=NULL; - size_t bufsize; + size_t buflen; BOOL ret; if (!PyArg_ParseTuple(args, "U|:_getvolumepathname", &po)) return NULL; - path = PyUnicode_AsUnicode(po); + path = PyUnicode_AsUnicodeAndSize(po, &buflen); if (path == NULL) return NULL; + buflen += 1; /* Volume path should be shorter than entire path */ - bufsize = max(MAX_PATH, wcslen(path) * 2 * sizeof(wchar_t)+1); - mountpath = (wchar_t *)PyMem_Malloc(bufsize); + buflen = Py_MAX(buflen, MAX_PATH); + + if (buflen > DWORD_MAX) { + PyErr_SetString(PyExc_OverflowError, "path too long"); + return NULL; + } + + mountpath = (wchar_t *)PyMem_Malloc(buflen * sizeof(wchar_t)); if (mountpath == NULL) return PyErr_NoMemory(); Py_BEGIN_ALLOW_THREADS ret = GetVolumePathNameW(path, mountpath, - Py_SAFE_DOWNCAST(bufsize, size_t, DWORD)); + Py_SAFE_DOWNCAST(buflen, size_t, DWORD)); Py_END_ALLOW_THREADS if (!ret) { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 19:23:53 2013 From: python-checkins at python.org (victor.stinner) Date: Sun, 24 Nov 2013 19:23:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319636=3A_Fix_usag?= =?utf-8?q?e_of_MAX=5FPATH_in_posixmodule=2Ec?= Message-ID: <3dSKWs5r1zz7LjM@mail.python.org> http://hg.python.org/cpython/rev/46aecfc5e374 changeset: 87516:46aecfc5e374 user: Victor Stinner date: Sun Nov 24 19:23:25 2013 +0100 
summary: Issue #19636: Fix usage of MAX_PATH in posixmodule.c files: Modules/posixmodule.c | 28 ++++++++++++++-------------- 1 files changed, 14 insertions(+), 14 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -899,7 +899,7 @@ length = PyBytes_GET_SIZE(bytes); #ifdef MS_WINDOWS - if (length > MAX_PATH) { + if (length > MAX_PATH-1) { FORMAT_EXCEPTION(PyExc_ValueError, "%s too long for Windows"); Py_DECREF(bytes); return 0; @@ -1378,18 +1378,18 @@ static BOOL __stdcall win32_chdir(LPCSTR path) { - char new_path[MAX_PATH+1]; + char new_path[MAX_PATH]; int result; char env[4] = "=x:"; if(!SetCurrentDirectoryA(path)) return FALSE; - result = GetCurrentDirectoryA(MAX_PATH+1, new_path); + result = GetCurrentDirectoryA(Py_ARRAY_LENGTH(new_path), new_path); if (!result) return FALSE; /* In the ANSI API, there should not be any paths longer - than MAX_PATH. */ - assert(result <= MAX_PATH+1); + than MAX_PATH-1 (not including the final null character). */ + assert(result < Py_ARRAY_LENGTH(new_path)); if (strncmp(new_path, "\\\\", 2) == 0 || strncmp(new_path, "//", 2) == 0) /* UNC path, nothing to do. */ @@ -1403,16 +1403,16 @@ static BOOL __stdcall win32_wchdir(LPCWSTR path) { - wchar_t _new_path[MAX_PATH+1], *new_path = _new_path; + wchar_t _new_path[MAX_PATH], *new_path = _new_path; int result; wchar_t env[4] = L"=x:"; if(!SetCurrentDirectoryW(path)) return FALSE; - result = GetCurrentDirectoryW(MAX_PATH+1, new_path); + result = GetCurrentDirectoryW(Py_ARRAY_LENGTH(new_path), new_path); if (!result) return FALSE; - if (result > MAX_PATH+1) { + if (result > Py_ARRAY_LENGTH(new_path)) { new_path = PyMem_RawMalloc(result * sizeof(wchar_t)); if (!new_path) { SetLastError(ERROR_OUTOFMEMORY); @@ -3398,11 +3398,11 @@ PyObject *resobj; DWORD len; Py_BEGIN_ALLOW_THREADS - len = GetCurrentDirectoryW(sizeof wbuf/ sizeof wbuf[0], wbuf); + len = GetCurrentDirectoryW(Py_ARRAY_LENGTH(wbuf), wbuf); /* If the buffer is large enough, len does not include the terminating \0. If the buffer is too small, len includes the space needed for the terminator. 
*/ - if (len >= sizeof wbuf/ sizeof wbuf[0]) { + if (len >= Py_ARRAY_LENGTH(wbuf)) { wbuf2 = PyMem_RawMalloc(len * sizeof(wchar_t)); if (wbuf2) len = GetCurrentDirectoryW(len, wbuf2); @@ -3583,10 +3583,10 @@ HANDLE hFindFile = INVALID_HANDLE_VALUE; BOOL result; WIN32_FIND_DATA FileData; - char namebuf[MAX_PATH+5]; /* Overallocate for \\*.*\0 */ + char namebuf[MAX_PATH+4]; /* Overallocate for "\*.*" */ char *bufptr = namebuf; /* only claim to have space for MAX_PATH */ - Py_ssize_t len = sizeof(namebuf)-5; + Py_ssize_t len = Py_ARRAY_LENGTH(namebuf)-4; PyObject *po = NULL; wchar_t *wnamebuf = NULL; @@ -3875,14 +3875,14 @@ posix__getfullpathname(PyObject *self, PyObject *args) { const char *path; - char outbuf[MAX_PATH*2]; + char outbuf[MAX_PATH]; char *temp; PyObject *po; if (PyArg_ParseTuple(args, "U|:_getfullpathname", &po)) { wchar_t *wpath; - wchar_t woutbuf[MAX_PATH*2], *woutbufp = woutbuf; + wchar_t woutbuf[MAX_PATH], *woutbufp = woutbuf; wchar_t *wtemp; DWORD result; PyObject *v; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 19:57:15 2013 From: python-checkins at python.org (stefan.krah) Date: Sun, 24 Nov 2013 19:57:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogMSkgUHJlcGFyZSBs?= =?utf-8?q?ibmpdec_for_the_2=2E4=2E0_release=2E__None_of_the_following_cha?= =?utf-8?q?nges?= Message-ID: <3dSLGM1qWKz7LjP@mail.python.org> http://hg.python.org/cpython/rev/cbd78679080b changeset: 87517:cbd78679080b branch: 3.3 parent: 87510:ce1578f4d105 user: Stefan Krah date: Sun Nov 24 19:44:57 2013 +0100 summary: 1) Prepare libmpdec for the 2.4.0 release. None of the following changes affects _decimal: o Make all "mpd_t to C integer" conversion functions available in both the 64-bit and the 32-bit versions. o Make all mixed mpd_t/C integer arithmetic functions available in the 32-bit version. o Better handling of __STDC_LIMIT_MACROS for C++ users. o Add struct tags (at the request of C++ users). 2) Check for libmpdec.so.2 if --with-system-libmpdec is used. 
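The libmpdec update summarized above also surfaces the bundled library version at the Python level. The sketch below mirrors the new test_decimal check that appears in the diff that follows; it assumes a build in which the _decimal extension module is available alongside the pure Python decimal.py:

    from test.support import import_fresh_module

    C = import_fresh_module('decimal', fresh=['_decimal'])    # C accelerator
    P = import_fresh_module('decimal', blocked=['_decimal'])  # pure Python

    print(C.__libmpdec_version__)   # e.g. '2.4.0' with the bundled libmpdec
    assert C.__libmpdec_version__ == P.__libmpdec_version__
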
files: Lib/decimal.py | 1 + Lib/test/test_decimal.py | 1 + Modules/_decimal/_decimal.c | 8 +- Modules/_decimal/libmpdec/mpdecimal.c | 260 ++++++++++++++ Modules/_decimal/libmpdec/mpdecimal.h | 94 +++- setup.py | 2 +- 6 files changed, 329 insertions(+), 37 deletions(-) diff --git a/Lib/decimal.py b/Lib/decimal.py --- a/Lib/decimal.py +++ b/Lib/decimal.py @@ -140,6 +140,7 @@ __version__ = '1.70' # Highest version of the spec this complies with # See http://speleotrove.com/decimal/ +__libmpdec_version__ = "2.4.0" # compatible libmpdec version import copy as _copy import math as _math diff --git a/Lib/test/test_decimal.py b/Lib/test/test_decimal.py --- a/Lib/test/test_decimal.py +++ b/Lib/test/test_decimal.py @@ -4150,6 +4150,7 @@ self.assertTrue(P.HAVE_THREADS is True or P.HAVE_THREADS is False) self.assertEqual(C.__version__, P.__version__) + self.assertEqual(C.__libmpdec_version__, P.__libmpdec_version__) x = dir(C) y = [s for s in dir(P) if '__' in s or not s.startswith('_')] diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -39,6 +39,11 @@ #include "memory.h" +#if MPD_MAJOR_VERSION != 2 + #error "libmpdec major version 2 required" +#endif + + /* * Type sizes with assertions in mpdecimal.h and pyport.h: * sizeof(size_t) == sizeof(Py_ssize_t) @@ -5730,7 +5735,8 @@ } /* Add specification version number */ - CHECK_INT(PyModule_AddStringConstant(m, "__version__", " 1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__version__", "1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__libmpdec_version__", mpd_version())); return m; diff --git a/Modules/_decimal/libmpdec/mpdecimal.c b/Modules/_decimal/libmpdec/mpdecimal.c --- a/Modules/_decimal/libmpdec/mpdecimal.c +++ b/Modules/_decimal/libmpdec/mpdecimal.c @@ -97,6 +97,8 @@ mpd_ssize_t exp); static inline mpd_ssize_t _mpd_real_size(mpd_uint_t *data, mpd_ssize_t size); +static int _mpd_cmp_abs(const mpd_t *a, const mpd_t *b); + static void _mpd_qadd(mpd_t *result, const mpd_t *a, const mpd_t *b, const mpd_context_t *ctx, uint32_t *status); static inline void _mpd_qmul(mpd_t *result, const mpd_t *a, const mpd_t *b, @@ -111,6 +113,17 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +const char * +mpd_version(void) +{ + return MPD_VERSION; +} + + +/******************************************************************************/ /* Performance critical inline functions */ /******************************************************************************/ @@ -1345,6 +1358,91 @@ return MPD_SSIZE_MAX; } +#if defined(CONFIG_32) && !defined(LEGACY_COMPILER) +/* + * Quietly get a uint64_t from a decimal. If the operation is impossible, + * MPD_Invalid_operation is set. 
+ */ +static uint64_t +_c32_qget_u64(int use_sign, const mpd_t *a, uint32_t *status) +{ + MPD_NEW_STATIC(tmp,0,0,20,3); + mpd_context_t maxcontext; + uint64_t ret; + + tmp_data[0] = 709551615; + tmp_data[1] = 446744073; + tmp_data[2] = 18; + + if (mpd_isspecial(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (mpd_iszero(a)) { + return 0; + } + if (use_sign && mpd_isnegative(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (!_mpd_isint(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + if (_mpd_cmp_abs(a, &tmp) > 0) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + mpd_maxcontext(&maxcontext); + mpd_qrescale(&tmp, a, 0, &maxcontext, &maxcontext.status); + maxcontext.status &= ~MPD_Rounded; + if (maxcontext.status != 0) { + *status |= (maxcontext.status|MPD_Invalid_operation); /* GCOV_NOT_REACHED */ + return UINT64_MAX; /* GCOV_NOT_REACHED */ + } + + ret = 0; + switch (tmp.len) { + case 3: + ret += (uint64_t)tmp_data[2] * 1000000000000000000ULL; + case 2: + ret += (uint64_t)tmp_data[1] * 1000000000ULL; + case 1: + ret += tmp_data[0]; + break; + default: + abort(); /* GCOV_NOT_REACHED */ + } + + return ret; +} + +static int64_t +_c32_qget_i64(const mpd_t *a, uint32_t *status) +{ + uint64_t u; + int isneg; + + u = _c32_qget_u64(0, a, status); + if (*status&MPD_Invalid_operation) { + return INT64_MAX; + } + + isneg = mpd_isnegative(a); + if (u <= INT64_MAX) { + return isneg ? -((int64_t)u) : (int64_t)u; + } + else if (isneg && u+(INT64_MIN+INT64_MAX) == INT64_MAX) { + return INT64_MIN; + } + + *status |= MPD_Invalid_operation; + return INT64_MAX; +} +#endif /* CONFIG_32 && !LEGACY_COMPILER */ + #ifdef CONFIG_64 /* quietly get a uint64_t from a decimal */ uint64_t @@ -1359,7 +1457,57 @@ { return mpd_qget_ssize(a, status); } + +/* quietly get a uint32_t from a decimal */ +uint32_t +mpd_qget_u32(const mpd_t *a, uint32_t *status) +{ + uint64_t x = mpd_qget_uint(a, status); + + if (*status&MPD_Invalid_operation) { + return UINT32_MAX; + } + if (x > UINT32_MAX) { + *status |= MPD_Invalid_operation; + return UINT32_MAX; + } + + return (uint32_t)x; +} + +/* quietly get an int32_t from a decimal */ +int32_t +mpd_qget_i32(const mpd_t *a, uint32_t *status) +{ + int64_t x = mpd_qget_ssize(a, status); + + if (*status&MPD_Invalid_operation) { + return INT32_MAX; + } + if (x < INT32_MIN || x > INT32_MAX) { + *status |= MPD_Invalid_operation; + return INT32_MAX; + } + + return (int32_t)x; +} #else +#ifndef LEGACY_COMPILER +/* quietly get a uint64_t from a decimal */ +uint64_t +mpd_qget_u64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_u64(1, a, status); +} + +/* quietly get an int64_t from a decimal */ +int64_t +mpd_qget_i64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_i64(a, status); +} +#endif + /* quietly get a uint32_t from a decimal */ uint32_t mpd_qget_u32(const mpd_t *a, uint32_t *status) @@ -3386,6 +3534,34 @@ { mpd_qadd_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Add decimal and int64_t. */ +void +mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Add decimal and uint64_t. 
*/ +void +mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Subtract int32_t from decimal. */ @@ -3420,6 +3596,34 @@ { mpd_qsub_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Subtract int64_t from decimal. */ +void +mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Subtract uint64_t from decimal. */ +void +mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif @@ -3871,6 +4075,34 @@ { mpd_qdiv_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Divide decimal by int64_t. */ +void +mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Divide decimal by uint64_t. */ +void +mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Pad the result with trailing zeros if it has fewer digits than prec. */ @@ -5664,6 +5896,34 @@ { mpd_qmul_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Multiply decimal and int64_t. */ +void +mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Multiply decimal and uint64_t. */ +void +mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Like the minus operator. 
*/ diff --git a/Modules/_decimal/libmpdec/mpdecimal.h b/Modules/_decimal/libmpdec/mpdecimal.h --- a/Modules/_decimal/libmpdec/mpdecimal.h +++ b/Modules/_decimal/libmpdec/mpdecimal.h @@ -32,7 +32,6 @@ #ifdef __cplusplus extern "C" { -#define __STDC_LIMIT_MACROS #endif @@ -56,12 +55,18 @@ #define MPD_HIDE_SYMBOLS_END #define EXTINLINE extern inline #else - #ifdef HAVE_STDINT_H - #include - #endif #ifdef HAVE_INTTYPES_H #include #endif + #ifdef HAVE_STDINT_H + #if defined(__cplusplus) && !defined(__STDC_LIMIT_MACROS) + #define __STDC_LIMIT_MACROS + #include + #undef __STDC_LIMIT_MACROS + #else + #include + #endif + #endif #ifndef __GNUC_STDC_INLINE__ #define __GNUC_STDC_INLINE__ 1 #endif @@ -100,6 +105,19 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +#define MPD_MAJOR_VERSION 2 +#define MPD_MINOR_VERSION 4 +#define MPD_MICRO_VERSION 0 + +#define MPD_VERSION "2.4.0" + +const char *mpd_version(void); + + +/******************************************************************************/ /* Configuration */ /******************************************************************************/ @@ -241,7 +259,7 @@ extern const char *mpd_clamp_string[MPD_CLAMP_GUARD]; -typedef struct { +typedef struct mpd_context_t { mpd_ssize_t prec; /* precision */ mpd_ssize_t emax; /* max positive exp */ mpd_ssize_t emin; /* min negative exp */ @@ -353,7 +371,7 @@ #define MPD_DATAFLAGS (MPD_STATIC_DATA|MPD_SHARED_DATA|MPD_CONST_DATA) /* mpd_t */ -typedef struct { +typedef struct mpd_t { uint8_t flags; mpd_ssize_t exp; mpd_ssize_t digits; @@ -371,7 +389,7 @@ /******************************************************************************/ /* format specification */ -typedef struct { +typedef struct mpd_spec_t { mpd_ssize_t min_width; /* minimum field width */ mpd_ssize_t prec; /* fraction digits or significant digits */ char type; /* conversion specifier */ @@ -437,6 +455,12 @@ mpd_uint_t mpd_qget_uint(const mpd_t *dec, uint32_t *status); mpd_uint_t mpd_qabs_uint(const mpd_t *dec, uint32_t *status); +int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); +uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); +#ifndef LEGACY_COMPILER +int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); +uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); +#endif /* quiet functions */ int mpd_qcheck_nan(mpd_t *nanresult, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); @@ -528,6 +552,17 @@ void mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); void mpd_qinvroot(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); +#ifndef LEGACY_COMPILER +void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_u64(mpd_t *result, const mpd_t 
*a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +#endif + size_t mpd_sizeinbase(const mpd_t *a, uint32_t base); void mpd_qimport_u16(mpd_t *result, const uint16_t *srcdata, size_t srclen, @@ -571,6 +606,12 @@ mpd_ssize_t mpd_get_ssize(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_get_uint(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_abs_uint(const mpd_t *a, mpd_context_t *ctx); +int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); +uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); +uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); +#endif void mpd_and(mpd_t *result, const mpd_t *a, const mpd_t *b, mpd_context_t *ctx); void mpd_copy(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_canonical(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); @@ -641,6 +682,17 @@ void mpd_sqrt(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_invroot(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +#endif + /******************************************************************************/ /* Configuration specific */ @@ -649,36 +701,8 @@ #ifdef CONFIG_64 void mpd_qsset_i64(mpd_t *result, int64_t a, const mpd_context_t *ctx, uint32_t *status); void mpd_qsset_u64(mpd_t *result, uint64_t a, const mpd_context_t *ctx, uint32_t *status); -int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); -uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); - -void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); - void mpd_sset_i64(mpd_t *result, int64_t a, mpd_context_t *ctx); void mpd_sset_u64(mpd_t *result, uint64_t a, mpd_context_t *ctx); -int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); -uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); - -void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void 
mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -#else -int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); -uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); -int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); -uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); #endif diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -1945,7 +1945,7 @@ undef_macros = [] if '--with-system-libmpdec' in sysconfig.get_config_var("CONFIG_ARGS"): include_dirs = [] - libraries = ['mpdec'] + libraries = [':libmpdec.so.2'] sources = ['_decimal/_decimal.c'] depends = ['_decimal/docstrings.h'] else: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 19:57:17 2013 From: python-checkins at python.org (stefan.krah) Date: Sun, 24 Nov 2013 19:57:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2UgZnJvbSAzLjMu?= Message-ID: <3dSLGP0HTLz7LjP@mail.python.org> http://hg.python.org/cpython/rev/9d07b3eb34e3 changeset: 87518:9d07b3eb34e3 parent: 87516:46aecfc5e374 parent: 87517:cbd78679080b user: Stefan Krah date: Sun Nov 24 19:56:23 2013 +0100 summary: Merge from 3.3. files: Lib/decimal.py | 1 + Lib/test/test_decimal.py | 1 + Modules/_decimal/_decimal.c | 8 +- Modules/_decimal/libmpdec/mpdecimal.c | 260 ++++++++++++++ Modules/_decimal/libmpdec/mpdecimal.h | 94 +++- setup.py | 2 +- 6 files changed, 329 insertions(+), 37 deletions(-) diff --git a/Lib/decimal.py b/Lib/decimal.py --- a/Lib/decimal.py +++ b/Lib/decimal.py @@ -140,6 +140,7 @@ __version__ = '1.70' # Highest version of the spec this complies with # See http://speleotrove.com/decimal/ +__libmpdec_version__ = "2.4.0" # compatible libmpdec version import copy as _copy import math as _math diff --git a/Lib/test/test_decimal.py b/Lib/test/test_decimal.py --- a/Lib/test/test_decimal.py +++ b/Lib/test/test_decimal.py @@ -4149,6 +4149,7 @@ self.assertTrue(P.HAVE_THREADS is True or P.HAVE_THREADS is False) self.assertEqual(C.__version__, P.__version__) + self.assertEqual(C.__libmpdec_version__, P.__libmpdec_version__) x = dir(C) y = [s for s in dir(P) if '__' in s or not s.startswith('_')] diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -39,6 +39,11 @@ #include "memory.h" +#if MPD_MAJOR_VERSION != 2 + #error "libmpdec major version 2 required" +#endif + + /* * Type sizes with assertions in mpdecimal.h and pyport.h: * sizeof(size_t) == sizeof(Py_ssize_t) @@ -5730,7 +5735,8 @@ } /* Add specification version number */ - CHECK_INT(PyModule_AddStringConstant(m, "__version__", " 1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__version__", "1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__libmpdec_version__", mpd_version())); return m; diff --git a/Modules/_decimal/libmpdec/mpdecimal.c b/Modules/_decimal/libmpdec/mpdecimal.c --- a/Modules/_decimal/libmpdec/mpdecimal.c +++ b/Modules/_decimal/libmpdec/mpdecimal.c @@ -97,6 +97,8 @@ mpd_ssize_t exp); static inline mpd_ssize_t _mpd_real_size(mpd_uint_t *data, mpd_ssize_t size); +static int _mpd_cmp_abs(const mpd_t *a, const mpd_t *b); + static 
void _mpd_qadd(mpd_t *result, const mpd_t *a, const mpd_t *b, const mpd_context_t *ctx, uint32_t *status); static inline void _mpd_qmul(mpd_t *result, const mpd_t *a, const mpd_t *b, @@ -111,6 +113,17 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +const char * +mpd_version(void) +{ + return MPD_VERSION; +} + + +/******************************************************************************/ /* Performance critical inline functions */ /******************************************************************************/ @@ -1345,6 +1358,91 @@ return MPD_SSIZE_MAX; } +#if defined(CONFIG_32) && !defined(LEGACY_COMPILER) +/* + * Quietly get a uint64_t from a decimal. If the operation is impossible, + * MPD_Invalid_operation is set. + */ +static uint64_t +_c32_qget_u64(int use_sign, const mpd_t *a, uint32_t *status) +{ + MPD_NEW_STATIC(tmp,0,0,20,3); + mpd_context_t maxcontext; + uint64_t ret; + + tmp_data[0] = 709551615; + tmp_data[1] = 446744073; + tmp_data[2] = 18; + + if (mpd_isspecial(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (mpd_iszero(a)) { + return 0; + } + if (use_sign && mpd_isnegative(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (!_mpd_isint(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + if (_mpd_cmp_abs(a, &tmp) > 0) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + mpd_maxcontext(&maxcontext); + mpd_qrescale(&tmp, a, 0, &maxcontext, &maxcontext.status); + maxcontext.status &= ~MPD_Rounded; + if (maxcontext.status != 0) { + *status |= (maxcontext.status|MPD_Invalid_operation); /* GCOV_NOT_REACHED */ + return UINT64_MAX; /* GCOV_NOT_REACHED */ + } + + ret = 0; + switch (tmp.len) { + case 3: + ret += (uint64_t)tmp_data[2] * 1000000000000000000ULL; + case 2: + ret += (uint64_t)tmp_data[1] * 1000000000ULL; + case 1: + ret += tmp_data[0]; + break; + default: + abort(); /* GCOV_NOT_REACHED */ + } + + return ret; +} + +static int64_t +_c32_qget_i64(const mpd_t *a, uint32_t *status) +{ + uint64_t u; + int isneg; + + u = _c32_qget_u64(0, a, status); + if (*status&MPD_Invalid_operation) { + return INT64_MAX; + } + + isneg = mpd_isnegative(a); + if (u <= INT64_MAX) { + return isneg ? 
-((int64_t)u) : (int64_t)u; + } + else if (isneg && u+(INT64_MIN+INT64_MAX) == INT64_MAX) { + return INT64_MIN; + } + + *status |= MPD_Invalid_operation; + return INT64_MAX; +} +#endif /* CONFIG_32 && !LEGACY_COMPILER */ + #ifdef CONFIG_64 /* quietly get a uint64_t from a decimal */ uint64_t @@ -1359,7 +1457,57 @@ { return mpd_qget_ssize(a, status); } + +/* quietly get a uint32_t from a decimal */ +uint32_t +mpd_qget_u32(const mpd_t *a, uint32_t *status) +{ + uint64_t x = mpd_qget_uint(a, status); + + if (*status&MPD_Invalid_operation) { + return UINT32_MAX; + } + if (x > UINT32_MAX) { + *status |= MPD_Invalid_operation; + return UINT32_MAX; + } + + return (uint32_t)x; +} + +/* quietly get an int32_t from a decimal */ +int32_t +mpd_qget_i32(const mpd_t *a, uint32_t *status) +{ + int64_t x = mpd_qget_ssize(a, status); + + if (*status&MPD_Invalid_operation) { + return INT32_MAX; + } + if (x < INT32_MIN || x > INT32_MAX) { + *status |= MPD_Invalid_operation; + return INT32_MAX; + } + + return (int32_t)x; +} #else +#ifndef LEGACY_COMPILER +/* quietly get a uint64_t from a decimal */ +uint64_t +mpd_qget_u64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_u64(1, a, status); +} + +/* quietly get an int64_t from a decimal */ +int64_t +mpd_qget_i64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_i64(a, status); +} +#endif + /* quietly get a uint32_t from a decimal */ uint32_t mpd_qget_u32(const mpd_t *a, uint32_t *status) @@ -3386,6 +3534,34 @@ { mpd_qadd_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Add decimal and int64_t. */ +void +mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Add decimal and uint64_t. */ +void +mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Subtract int32_t from decimal. */ @@ -3420,6 +3596,34 @@ { mpd_qsub_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Subtract int64_t from decimal. */ +void +mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Subtract uint64_t from decimal. */ +void +mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif @@ -3871,6 +4075,34 @@ { mpd_qdiv_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Divide decimal by int64_t. 
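The three coefficient words hard-coded into the temporary decimal in _c32_qget_u64() above are UINT64_MAX split into 9-digit base-10**9 limbs, and the switch statement reassembles them with the matching powers of ten. A stand-alone check of those constants (not part of the library) is:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* 18 | 446744073 | 709551615, reassembled exactly as in the switch
           of _c32_qget_u64(): 18*10^18 + 446744073*10^9 + 709551615 */
        uint64_t v = (uint64_t)18 * 1000000000000000000ULL
                   + (uint64_t)446744073 * 1000000000ULL
                   + (uint64_t)709551615;

        printf("%llu\n", (unsigned long long)v);  /* 18446744073709551615 */
        return v == UINT64_MAX ? 0 : 1;
    }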
*/ +void +mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Divide decimal by uint64_t. */ +void +mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Pad the result with trailing zeros if it has fewer digits than prec. */ @@ -5664,6 +5896,34 @@ { mpd_qmul_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Multiply decimal and int64_t. */ +void +mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Multiply decimal and uint64_t. */ +void +mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Like the minus operator. */ diff --git a/Modules/_decimal/libmpdec/mpdecimal.h b/Modules/_decimal/libmpdec/mpdecimal.h --- a/Modules/_decimal/libmpdec/mpdecimal.h +++ b/Modules/_decimal/libmpdec/mpdecimal.h @@ -32,7 +32,6 @@ #ifdef __cplusplus extern "C" { -#define __STDC_LIMIT_MACROS #endif @@ -56,12 +55,18 @@ #define MPD_HIDE_SYMBOLS_END #define EXTINLINE extern inline #else - #ifdef HAVE_STDINT_H - #include - #endif #ifdef HAVE_INTTYPES_H #include #endif + #ifdef HAVE_STDINT_H + #if defined(__cplusplus) && !defined(__STDC_LIMIT_MACROS) + #define __STDC_LIMIT_MACROS + #include + #undef __STDC_LIMIT_MACROS + #else + #include + #endif + #endif #ifndef __GNUC_STDC_INLINE__ #define __GNUC_STDC_INLINE__ 1 #endif @@ -100,6 +105,19 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +#define MPD_MAJOR_VERSION 2 +#define MPD_MINOR_VERSION 4 +#define MPD_MICRO_VERSION 0 + +#define MPD_VERSION "2.4.0" + +const char *mpd_version(void); + + +/******************************************************************************/ /* Configuration */ /******************************************************************************/ @@ -241,7 +259,7 @@ extern const char *mpd_clamp_string[MPD_CLAMP_GUARD]; -typedef struct { +typedef struct mpd_context_t { mpd_ssize_t prec; /* precision */ mpd_ssize_t emax; /* max positive exp */ mpd_ssize_t emin; /* min negative exp */ @@ -353,7 +371,7 @@ #define MPD_DATAFLAGS (MPD_STATIC_DATA|MPD_SHARED_DATA|MPD_CONST_DATA) /* mpd_t */ -typedef struct { +typedef struct mpd_t { uint8_t flags; mpd_ssize_t exp; mpd_ssize_t digits; @@ -371,7 +389,7 @@ /******************************************************************************/ /* format specification */ -typedef struct { +typedef struct mpd_spec_t { mpd_ssize_t min_width; /* minimum field width */ mpd_ssize_t prec; /* fraction digits or significant digits */ char type; /* conversion 
specifier */ @@ -437,6 +455,12 @@ mpd_uint_t mpd_qget_uint(const mpd_t *dec, uint32_t *status); mpd_uint_t mpd_qabs_uint(const mpd_t *dec, uint32_t *status); +int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); +uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); +#ifndef LEGACY_COMPILER +int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); +uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); +#endif /* quiet functions */ int mpd_qcheck_nan(mpd_t *nanresult, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); @@ -528,6 +552,17 @@ void mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); void mpd_qinvroot(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); +#ifndef LEGACY_COMPILER +void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +#endif + size_t mpd_sizeinbase(const mpd_t *a, uint32_t base); void mpd_qimport_u16(mpd_t *result, const uint16_t *srcdata, size_t srclen, @@ -571,6 +606,12 @@ mpd_ssize_t mpd_get_ssize(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_get_uint(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_abs_uint(const mpd_t *a, mpd_context_t *ctx); +int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); +uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); +uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); +#endif void mpd_and(mpd_t *result, const mpd_t *a, const mpd_t *b, mpd_context_t *ctx); void mpd_copy(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_canonical(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); @@ -641,6 +682,17 @@ void mpd_sqrt(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_invroot(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +#endif + /******************************************************************************/ /* Configuration specific */ @@ -649,36 +701,8 @@ #ifdef CONFIG_64 void mpd_qsset_i64(mpd_t *result, int64_t a, const mpd_context_t *ctx, uint32_t *status); void 
mpd_qsset_u64(mpd_t *result, uint64_t a, const mpd_context_t *ctx, uint32_t *status); -int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); -uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); - -void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); - void mpd_sset_i64(mpd_t *result, int64_t a, mpd_context_t *ctx); void mpd_sset_u64(mpd_t *result, uint64_t a, mpd_context_t *ctx); -int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); -uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); - -void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -#else -int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); -uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); -int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); -uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); #endif diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -1956,7 +1956,7 @@ undef_macros = [] if '--with-system-libmpdec' in sysconfig.get_config_var("CONFIG_ARGS"): include_dirs = [] - libraries = ['mpdec'] + libraries = [':libmpdec.so.2'] sources = ['_decimal/_decimal.c'] depends = ['_decimal/docstrings.h'] else: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 21:26:15 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sun, 24 Nov 2013 21:26:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Use_Clinic_to_process_argu?= =?utf-8?q?ments_in_cpickle=2E?= Message-ID: <3dSNF33T4zz7LjT@mail.python.org> http://hg.python.org/cpython/rev/ead3f5a907bd changeset: 87519:ead3f5a907bd user: Alexandre Vassalotti date: Sun Nov 24 12:25:48 2013 -0800 summary: Use Clinic to process arguments in cpickle. This doesn't make any functional changes to the exisiting implementation. The conversion did help however uncover documentation bugs. The best thing about this conversion is less C code to maintain by hand. 
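For readers who have not seen Argument Clinic output before, the pattern that repeats throughout the diff below is always the same: a /*[clinic] ... [clinic]*/ comment declares the signature and docstring, and the tool generates the PyDoc_STRVAR, the *_METHODDEF table entry and the argument-parsing wrapper, so that only the *_impl function is written by hand. A condensed, hypothetical illustration of that shape (module and function names are made up, and the generated checksum lines are omitted):

    /*[clinic]
    spam.ping

        delay: int = 0

    Return "pong" after an optional delay.
    [clinic]*/

    PyDoc_STRVAR(spam_ping__doc__,
    "ping(delay=0)\n"
    "Return \"pong\" after an optional delay.");

    #define SPAM_PING_METHODDEF \
        {"ping", (PyCFunction)spam_ping, METH_VARARGS|METH_KEYWORDS, spam_ping__doc__},

    static PyObject *
    spam_ping_impl(PyModuleDef *module, int delay);

    static PyObject *
    spam_ping(PyModuleDef *module, PyObject *args, PyObject *kwargs)
    {
        PyObject *return_value = NULL;
        static char *_keywords[] = {"delay", NULL};
        int delay = 0;

        if (!PyArg_ParseTupleAndKeywords(args, kwargs,
                                         "|i:ping", _keywords,
                                         &delay))
            goto exit;
        return_value = spam_ping_impl(module, delay);

    exit:
        return return_value;
    }

    static PyObject *
    spam_ping_impl(PyModuleDef *module, int delay)
    {
        /* The hand-written body only ever sees already-converted C values. */
        return PyUnicode_FromString("pong");
    }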
files: Modules/_pickle.c | 977 ++++++++++++++++++++++++--------- 1 files changed, 696 insertions(+), 281 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1,6 +1,30 @@ #include "Python.h" #include "structmember.h" +/*[clinic] +module _pickle +class _pickle.Pickler +class _pickle.PicklerMemoProxy +class _pickle.Unpickler +class _pickle.UnpicklerMemoProxy +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + +/*[python] +class PicklerObject_converter(self_converter): + type = "PicklerObject *" + +class PicklerMemoProxyObject_converter(self_converter): + type = "PicklerMemoProxyObject *" + +class UnpicklerObject_converter(self_converter): + type = "UnpicklerObject *" + +class UnpicklerMemoProxyObject_converter(self_converter): + type = "UnpicklerMemoProxyObject *" +[python]*/ +/*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + PyDoc_STRVAR(pickle_module_doc, "Optimized C implementation for the Python pickle module."); @@ -866,34 +890,29 @@ } static int -_Pickler_SetProtocol(PicklerObject *self, PyObject *proto_obj, - PyObject *fix_imports_obj) -{ - long proto = 0; - int fix_imports; - - if (proto_obj == NULL || proto_obj == Py_None) +_Pickler_SetProtocol(PicklerObject *self, PyObject *protocol, int fix_imports) +{ + long proto; + + if (protocol == NULL || protocol == Py_None) { proto = DEFAULT_PROTOCOL; + } else { - proto = PyLong_AsLong(proto_obj); - if (proto == -1 && PyErr_Occurred()) + proto = PyLong_AsLong(protocol); + if (proto < 0) { + if (proto == -1 && PyErr_Occurred()) + return -1; + proto = HIGHEST_PROTOCOL; + } + else if (proto > HIGHEST_PROTOCOL) { + PyErr_Format(PyExc_ValueError, "pickle protocol must be <= %d", + HIGHEST_PROTOCOL); return -1; - } - if (proto < 0) - proto = HIGHEST_PROTOCOL; - if (proto > HIGHEST_PROTOCOL) { - PyErr_Format(PyExc_ValueError, "pickle protocol must be <= %d", - HIGHEST_PROTOCOL); - return -1; - } - fix_imports = PyObject_IsTrue(fix_imports_obj); - if (fix_imports == -1) - return -1; - - self->proto = proto; + } + } + self->proto = (int)proto; self->bin = proto > 0; self->fix_imports = fix_imports && proto < 3; - return 0; } @@ -3708,16 +3727,35 @@ return 0; } -PyDoc_STRVAR(Pickler_clear_memo_doc, -"clear_memo() -> None. Clears the pickler's \"memo\"." +/*[clinic] + +_pickle.Pickler.clear_memo + + self: PicklerObject + +Clears the pickler's "memo". + +The memo is the data structure that remembers which objects the +pickler has already seen, so that shared or recursive objects are +pickled by reference and not by value. This method is useful when +re-using picklers. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler_clear_memo__doc__, +"clear_memo()\n" +"Clears the pickler\'s \"memo\".\n" "\n" "The memo is the data structure that remembers which objects the\n" "pickler has already seen, so that shared or recursive objects are\n" "pickled by reference and not by value. This method is useful when\n" "re-using picklers."); +#define _PICKLE_PICKLER_CLEAR_MEMO_METHODDEF \ + {"clear_memo", (PyCFunction)_pickle_Pickler_clear_memo, METH_NOARGS, _pickle_Pickler_clear_memo__doc__}, + static PyObject * -Pickler_clear_memo(PicklerObject *self) +_pickle_Pickler_clear_memo(PicklerObject *self) +/*[clinic checksum: 9c32be7e7a17ff82a81aae409d0d4f469033a5b2]*/ { if (self->memo) PyMemoTable_Clear(self->memo); @@ -3725,14 +3763,28 @@ Py_RETURN_NONE; } -PyDoc_STRVAR(Pickler_dump_doc, -"dump(obj) -> None. 
Write a pickled representation of obj to the open file."); +/*[clinic] + +_pickle.Pickler.dump + + self: PicklerObject + obj: object + / + +Write a pickled representation of the given object to the open file. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler_dump__doc__, +"dump(obj)\n" +"Write a pickled representation of the given object to the open file."); + +#define _PICKLE_PICKLER_DUMP_METHODDEF \ + {"dump", (PyCFunction)_pickle_Pickler_dump, METH_O, _pickle_Pickler_dump__doc__}, static PyObject * -Pickler_dump(PicklerObject *self, PyObject *args) -{ - PyObject *obj; - +_pickle_Pickler_dump(PicklerObject *self, PyObject *obj) +/*[clinic checksum: b72a69ec98737fabf66dae7c5a3210178bdbd3e6]*/ +{ /* Check whether the Pickler was initialized correctly (issue3664). Developers often forget to call __init__() in their subclasses, which would trigger a segfault without this check. */ @@ -3743,9 +3795,6 @@ return NULL; } - if (!PyArg_ParseTuple(args, "O:dump", &obj)) - return NULL; - if (_Pickler_ClearBuffer(self) < 0) return NULL; @@ -3759,10 +3808,8 @@ } static struct PyMethodDef Pickler_methods[] = { - {"dump", (PyCFunction)Pickler_dump, METH_VARARGS, - Pickler_dump_doc}, - {"clear_memo", (PyCFunction)Pickler_clear_memo, METH_NOARGS, - Pickler_clear_memo_doc}, + _PICKLE_PICKLER_DUMP_METHODDEF + _PICKLE_PICKLER_CLEAR_MEMO_METHODDEF {NULL, NULL} /* sentinel */ }; @@ -3813,9 +3860,39 @@ } -PyDoc_STRVAR(Pickler_doc, -"Pickler(file, protocol=None)" -"\n" +/*[clinic] + +_pickle.Pickler.__init__ + + self: PicklerObject + file: object + protocol: object = NULL + fix_imports: bool = True + +This takes a binary file for writing a pickle data stream. + +The optional protocol argument tells the pickler to use the +given protocol; supported protocols are 0, 1, 2, 3 and 4. The +default protocol is 3; a backward-incompatible protocol designed for +Python 3. + +Specifying a negative protocol version selects the highest +protocol version supported. The higher the protocol used, the +more recent the version of Python needed to read the pickle +produced. + +The file argument must have a write() method that accepts a single +bytes argument. It can thus be a file object opened for binary +writing, a io.BytesIO instance, or any other custom object that +meets this interface. + +If fix_imports is True and protocol is less than 3, pickle will try to +map the new Python 3 names to the old module names used in Python 2, +so that the pickle data stream is readable with Python 2. 
+[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler___init____doc__, +"__init__(file, protocol=None, fix_imports=True)\n" "This takes a binary file for writing a pickle data stream.\n" "\n" "The optional protocol argument tells the pickler to use the\n" @@ -3835,37 +3912,55 @@ "\n" "If fix_imports is True and protocol is less than 3, pickle will try to\n" "map the new Python 3 names to the old module names used in Python 2,\n" -"so that the pickle data stream is readable with Python 2.\n"); - -static int -Pickler_init(PicklerObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "protocol", "fix_imports", 0}; +"so that the pickle data stream is readable with Python 2."); + +#define _PICKLE_PICKLER___INIT___METHODDEF \ + {"__init__", (PyCFunction)_pickle_Pickler___init__, METH_VARARGS|METH_KEYWORDS, _pickle_Pickler___init____doc__}, + +static PyObject * +_pickle_Pickler___init___impl(PicklerObject *self, PyObject *file, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_Pickler___init__(PyObject *self, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "protocol", "fix_imports", NULL}; PyObject *file; - PyObject *proto_obj = NULL; - PyObject *fix_imports = Py_True; + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|Op:__init__", _keywords, + &file, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_Pickler___init___impl((PicklerObject *)self, file, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_Pickler___init___impl(PicklerObject *self, PyObject *file, PyObject *protocol, int fix_imports) +/*[clinic checksum: c99ff417bd703a74affc4b708167e56e135e8969]*/ +{ _Py_IDENTIFIER(persistent_id); _Py_IDENTIFIER(dispatch_table); - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|OO:Pickler", - kwlist, &file, &proto_obj, &fix_imports)) - return -1; - /* In case of multiple __init__() calls, clear previous content. */ if (self->write != NULL) (void)Pickler_clear(self); - if (_Pickler_SetProtocol(self, proto_obj, fix_imports) < 0) - return -1; + if (_Pickler_SetProtocol(self, protocol, fix_imports) < 0) + return NULL; if (_Pickler_SetOutputStream(self, file) < 0) - return -1; + return NULL; /* memo and output_buffer may have already been created in _Pickler_New */ if (self->memo == NULL) { self->memo = PyMemoTable_New(); if (self->memo == NULL) - return -1; + return NULL; } self->output_len = 0; if (self->output_buffer == NULL) { @@ -3873,7 +3968,7 @@ self->output_buffer = PyBytes_FromStringAndSize(NULL, self->max_output_len); if (self->output_buffer == NULL) - return -1; + return NULL; } self->arg = NULL; @@ -3885,14 +3980,24 @@ self->pers_func = _PyObject_GetAttrId((PyObject *)self, &PyId_persistent_id); if (self->pers_func == NULL) - return -1; + return NULL; } self->dispatch_table = NULL; if (_PyObject_HasAttrId((PyObject *)self, &PyId_dispatch_table)) { self->dispatch_table = _PyObject_GetAttrId((PyObject *)self, &PyId_dispatch_table); if (self->dispatch_table == NULL) - return -1; + return NULL; + } + return Py_None; +} + +/* XXX Slight hack to slot a Clinic generated signature in tp_init. */ +static int +Pickler_init(PyObject *self, PyObject *args, PyObject *kwargs) +{ + if (_pickle_Pickler___init__(self, args, kwargs) == NULL) { + return -1; } return 0; } @@ -3912,22 +4017,48 @@ PicklerObject *pickler; /* Pickler whose memo table we're proxying. 
*/ } PicklerMemoProxyObject; -PyDoc_STRVAR(pmp_clear_doc, -"memo.clear() -> None. Remove all items from memo."); +/*[clinic] +_pickle.PicklerMemoProxy.clear + + self: PicklerMemoProxyObject + +Remove all items from memo. +[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy_clear__doc__, +"clear()\n" +"Remove all items from memo."); + +#define _PICKLE_PICKLERMEMOPROXY_CLEAR_METHODDEF \ + {"clear", (PyCFunction)_pickle_PicklerMemoProxy_clear, METH_NOARGS, _pickle_PicklerMemoProxy_clear__doc__}, static PyObject * -pmp_clear(PicklerMemoProxyObject *self) +_pickle_PicklerMemoProxy_clear(PicklerMemoProxyObject *self) +/*[clinic checksum: 507f13938721992e175a3e58b5ad02620045a1cc]*/ { if (self->pickler->memo) PyMemoTable_Clear(self->pickler->memo); Py_RETURN_NONE; } -PyDoc_STRVAR(pmp_copy_doc, -"memo.copy() -> new_memo. Copy the memo to a new object."); +/*[clinic] +_pickle.PicklerMemoProxy.copy + + self: PicklerMemoProxyObject + +Copy the memo to a new object. +[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy_copy__doc__, +"copy()\n" +"Copy the memo to a new object."); + +#define _PICKLE_PICKLERMEMOPROXY_COPY_METHODDEF \ + {"copy", (PyCFunction)_pickle_PicklerMemoProxy_copy, METH_NOARGS, _pickle_PicklerMemoProxy_copy__doc__}, static PyObject * -pmp_copy(PicklerMemoProxyObject *self) +_pickle_PicklerMemoProxy_copy(PicklerMemoProxyObject *self) +/*[clinic checksum: 73a5117ab354290ebdbe07bd0bf7232d0936a69d]*/ { Py_ssize_t i; PyMemoTable *memo; @@ -3964,14 +4095,27 @@ return NULL; } -PyDoc_STRVAR(pmp_reduce_doc, -"memo.__reduce__(). Pickling support."); +/*[clinic] +_pickle.PicklerMemoProxy.__reduce__ + + self: PicklerMemoProxyObject + +Implement pickle support. +[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy___reduce____doc__, +"__reduce__()\n" +"Implement pickle support."); + +#define _PICKLE_PICKLERMEMOPROXY___REDUCE___METHODDEF \ + {"__reduce__", (PyCFunction)_pickle_PicklerMemoProxy___reduce__, METH_NOARGS, _pickle_PicklerMemoProxy___reduce____doc__}, static PyObject * -pmp_reduce(PicklerMemoProxyObject *self, PyObject *args) +_pickle_PicklerMemoProxy___reduce__(PicklerMemoProxyObject *self) +/*[clinic checksum: 40f0bf7a9b161e77130674f0481bda0a0184dcce]*/ { PyObject *reduce_value, *dict_args; - PyObject *contents = pmp_copy(self); + PyObject *contents = _pickle_PicklerMemoProxy_copy(self); if (contents == NULL) return NULL; @@ -3994,9 +4138,9 @@ } static PyMethodDef picklerproxy_methods[] = { - {"clear", (PyCFunction)pmp_clear, METH_NOARGS, pmp_clear_doc}, - {"copy", (PyCFunction)pmp_copy, METH_NOARGS, pmp_copy_doc}, - {"__reduce__", (PyCFunction)pmp_reduce, METH_VARARGS, pmp_reduce_doc}, + _PICKLE_PICKLERMEMOPROXY_CLEAR_METHODDEF + _PICKLE_PICKLERMEMOPROXY_COPY_METHODDEF + _PICKLE_PICKLERMEMOPROXY___REDUCE___METHODDEF {NULL, NULL} /* sentinel */ }; @@ -4208,7 +4352,7 @@ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, - Pickler_doc, /*tp_doc*/ + _pickle_Pickler___init____doc__, /*tp_doc*/ (traverseproc)Pickler_traverse, /*tp_traverse*/ (inquiry)Pickler_clear, /*tp_clear*/ 0, /*tp_richcompare*/ @@ -4223,7 +4367,7 @@ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ - (initproc)Pickler_init, /*tp_init*/ + Pickler_init, /*tp_init*/ PyType_GenericAlloc, /*tp_alloc*/ PyType_GenericNew, /*tp_new*/ PyObject_GC_Del, /*tp_free*/ @@ -5938,57 +6082,111 @@ return value; } -PyDoc_STRVAR(Unpickler_load_doc, -"load() -> object. Load a pickle." +/*[clinic] + +_pickle.Unpickler.load + +Load a pickle. 
+ +Read a pickled object representation from the open file object given in +the constructor, and return the reconstituted object hierarchy specified +therein. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler_load__doc__, +"load()\n" +"Load a pickle.\n" "\n" "Read a pickled object representation from the open file object given in\n" "the constructor, and return the reconstituted object hierarchy specified\n" -"therein.\n"); +"therein."); + +#define _PICKLE_UNPICKLER_LOAD_METHODDEF \ + {"load", (PyCFunction)_pickle_Unpickler_load, METH_NOARGS, _pickle_Unpickler_load__doc__}, static PyObject * -Unpickler_load(UnpicklerObject *self) -{ +_pickle_Unpickler_load(PyObject *self) +/*[clinic checksum: 9a30ba4e4d9221d4dcd705e1471ab11b2c9e3ac6]*/ +{ + UnpicklerObject *unpickler = (UnpicklerObject*)self; /* Check whether the Unpickler was initialized correctly. This prevents segfaulting if a subclass overridden __init__ with a function that does not call Unpickler.__init__(). Here, we simply ensure that self->read is not NULL. */ - if (self->read == NULL) { + if (unpickler->read == NULL) { PyErr_Format(UnpicklingError, "Unpickler.__init__() was not called by %s.__init__()", - Py_TYPE(self)->tp_name); + Py_TYPE(unpickler)->tp_name); return NULL; } - return load(self); + return load(unpickler); } /* The name of find_class() is misleading. In newer pickle protocols, this function is used for loading any global (i.e., functions), not just classes. The name is kept only for backward compatibility. */ -PyDoc_STRVAR(Unpickler_find_class_doc, -"find_class(module_name, global_name) -> object.\n" +/*[clinic] + +_pickle.Unpickler.find_class + + self: UnpicklerObject + module_name: object + global_name: object + / + +Return an object from a specified module. + +If necessary, the module will be imported. Subclasses may override this +method (e.g. to restrict unpickling of arbitrary classes and functions). + +This method is called whenever a class or a function object is +needed. Both arguments passed are str objects. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler_find_class__doc__, +"find_class(module_name, global_name)\n" +"Return an object from a specified module.\n" "\n" -"Return an object from a specified module, importing the module if\n" -"necessary. Subclasses may override this method (e.g. to restrict\n" -"unpickling of arbitrary classes and functions).\n" +"If necessary, the module will be imported. Subclasses may override this\n" +"method (e.g. to restrict unpickling of arbitrary classes and functions).\n" "\n" "This method is called whenever a class or a function object is\n" -"needed. Both arguments passed are str objects.\n"); +"needed. 
Both arguments passed are str objects."); + +#define _PICKLE_UNPICKLER_FIND_CLASS_METHODDEF \ + {"find_class", (PyCFunction)_pickle_Unpickler_find_class, METH_VARARGS, _pickle_Unpickler_find_class__doc__}, static PyObject * -Unpickler_find_class(UnpicklerObject *self, PyObject *args) +_pickle_Unpickler_find_class_impl(UnpicklerObject *self, PyObject *module_name, PyObject *global_name); + +static PyObject * +_pickle_Unpickler_find_class(PyObject *self, PyObject *args) +{ + PyObject *return_value = NULL; + PyObject *module_name; + PyObject *global_name; + + if (!PyArg_ParseTuple(args, + "OO:find_class", + &module_name, &global_name)) + goto exit; + return_value = _pickle_Unpickler_find_class_impl((UnpicklerObject *)self, module_name, global_name); + +exit: + return return_value; +} + +static PyObject * +_pickle_Unpickler_find_class_impl(UnpicklerObject *self, PyObject *module_name, PyObject *global_name) +/*[clinic checksum: b7d05d4dd8adc698e5780c1ac2be0f5062d33915]*/ { PyObject *global; PyObject *modules_dict; PyObject *module; - PyObject *module_name, *global_name; _Py_IDENTIFIER(modules); - if (!PyArg_UnpackTuple(args, "find_class", 2, 2, - &module_name, &global_name)) - return NULL; - /* Try to map the old names used in Python 2.x to the new ones used in Python 3.x. We do this only with old pickle protocols and when the user has not disabled the feature. */ @@ -6065,10 +6263,8 @@ } static struct PyMethodDef Unpickler_methods[] = { - {"load", (PyCFunction)Unpickler_load, METH_NOARGS, - Unpickler_load_doc}, - {"find_class", (PyCFunction)Unpickler_find_class, METH_VARARGS, - Unpickler_find_class_doc}, + _PICKLE_UNPICKLER_LOAD_METHODDEF + _PICKLE_UNPICKLER_FIND_CLASS_METHODDEF {NULL, NULL} /* sentinel */ }; @@ -6135,9 +6331,41 @@ return 0; } -PyDoc_STRVAR(Unpickler_doc, -"Unpickler(file, *, encoding='ASCII', errors='strict')" -"\n" +/*[clinic] + +_pickle.Unpickler.__init__ + + self: UnpicklerObject + file: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +This takes a binary file for reading a pickle data stream. + +The protocol version of the pickle is detected automatically, so no +proto argument is needed. + +The file-like object must have two methods, a read() method +that takes an integer argument, and a readline() method that +requires no arguments. Both methods should return bytes. +Thus file-like object can be a binary file object opened for +reading, a BytesIO object, or any other custom object that +meets this interface. + +Optional keyword arguments are *fix_imports*, *encoding* and *errors*, +which are used to control compatiblity support for pickle stream +generated by Python 2.x. If *fix_imports* is True, pickle will try to +map the old Python 2.x names to the new names used in Python 3.x. The +*encoding* and *errors* tell pickle how to decode 8-bit string +instances pickled by Python 2.x; these default to 'ASCII' and +'strict', respectively. + +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler___init____doc__, +"__init__(file, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" "This takes a binary file for reading a pickle data stream.\n" "\n" "The protocol version of the pickle is detected automatically, so no\n" @@ -6155,57 +6383,60 @@ "generated by Python 2.x. If *fix_imports* is True, pickle will try to\n" "map the old Python 2.x names to the new names used in Python 3.x. 
The\n" "*encoding* and *errors* tell pickle how to decode 8-bit string\n" -"instances pickled by Python 2.x; these default to 'ASCII' and\n" -"'strict', respectively.\n"); - -static int -Unpickler_init(UnpicklerObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "fix_imports", "encoding", "errors", 0}; +"instances pickled by Python 2.x; these default to \'ASCII\' and\n" +"\'strict\', respectively."); + +#define _PICKLE_UNPICKLER___INIT___METHODDEF \ + {"__init__", (PyCFunction)_pickle_Unpickler___init__, METH_VARARGS|METH_KEYWORDS, _pickle_Unpickler___init____doc__}, + +static PyObject * +_pickle_Unpickler___init___impl(UnpicklerObject *self, PyObject *file, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_Unpickler___init__(PyObject *self, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "fix_imports", "encoding", "errors", NULL}; PyObject *file; - PyObject *fix_imports = Py_True; - char *encoding = NULL; - char *errors = NULL; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:__init__", _keywords, + &file, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_Unpickler___init___impl((UnpicklerObject *)self, file, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_Unpickler___init___impl(UnpicklerObject *self, PyObject *file, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: bed0d8bbe1c647960ccc6f997b33bf33935fa56f]*/ +{ _Py_IDENTIFIER(persistent_load); - /* XXX: That is an horrible error message. But, I don't know how to do - better... */ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "%s takes exactly one positional argument (%zd given)", - Py_TYPE(self)->tp_name, Py_SIZE(args)); - return -1; - } - - /* Arguments parsing needs to be done in the __init__() method to allow - subclasses to define their own __init__() method, which may (or may - not) support Unpickler arguments. However, this means we need to be - extra careful in the other Unpickler methods, since a subclass could - forget to call Unpickler.__init__() thus breaking our internal - invariants. */ - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:Unpickler", kwlist, - &file, &fix_imports, &encoding, &errors)) - return -1; - /* In case of multiple __init__() calls, clear previous content. */ if (self->read != NULL) (void)Unpickler_clear(self); if (_Unpickler_SetInputStream(self, file) < 0) - return -1; + return NULL; if (_Unpickler_SetInputEncoding(self, encoding, errors) < 0) - return -1; - - self->fix_imports = PyObject_IsTrue(fix_imports); + return NULL; + + self->fix_imports = fix_imports; if (self->fix_imports == -1) - return -1; + return NULL; if (_PyObject_HasAttrId((PyObject *)self, &PyId_persistent_load)) { self->pers_func = _PyObject_GetAttrId((PyObject *)self, &PyId_persistent_load); if (self->pers_func == NULL) - return -1; + return NULL; } else { self->pers_func = NULL; @@ -6213,16 +6444,26 @@ self->stack = (Pdata *)Pdata_New(); if (self->stack == NULL) - return -1; + return NULL; self->memo_size = 32; self->memo = _Unpickler_NewMemo(self->memo_size); if (self->memo == NULL) - return -1; + return NULL; self->arg = NULL; self->proto = 0; + return Py_None; +} + +/* XXX Slight hack to slot a Clinic generated signature in tp_init. 
*/ +static int +Unpickler_init(PyObject *self, PyObject *args, PyObject *kwargs) +{ + if (_pickle_Unpickler___init__(self, args, kwargs) == NULL) { + return -1; + } return 0; } @@ -6244,11 +6485,24 @@ UnpicklerObject *unpickler; } UnpicklerMemoProxyObject; -PyDoc_STRVAR(ump_clear_doc, -"memo.clear() -> None. Remove all items from memo."); +/*[clinic] +_pickle.UnpicklerMemoProxy.clear + + self: UnpicklerMemoProxyObject + +Remove all items from memo. +[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy_clear__doc__, +"clear()\n" +"Remove all items from memo."); + +#define _PICKLE_UNPICKLERMEMOPROXY_CLEAR_METHODDEF \ + {"clear", (PyCFunction)_pickle_UnpicklerMemoProxy_clear, METH_NOARGS, _pickle_UnpicklerMemoProxy_clear__doc__}, static PyObject * -ump_clear(UnpicklerMemoProxyObject *self) +_pickle_UnpicklerMemoProxy_clear(UnpicklerMemoProxyObject *self) +/*[clinic checksum: 46fecf4e33c0c873124f845edf6cc3a2e9864bd5]*/ { _Unpickler_MemoCleanup(self->unpickler); self->unpickler->memo = _Unpickler_NewMemo(self->unpickler->memo_size); @@ -6257,11 +6511,24 @@ Py_RETURN_NONE; } -PyDoc_STRVAR(ump_copy_doc, -"memo.copy() -> new_memo. Copy the memo to a new object."); +/*[clinic] +_pickle.UnpicklerMemoProxy.copy + + self: UnpicklerMemoProxyObject + +Copy the memo to a new object. +[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy_copy__doc__, +"copy()\n" +"Copy the memo to a new object."); + +#define _PICKLE_UNPICKLERMEMOPROXY_COPY_METHODDEF \ + {"copy", (PyCFunction)_pickle_UnpicklerMemoProxy_copy, METH_NOARGS, _pickle_UnpicklerMemoProxy_copy__doc__}, static PyObject * -ump_copy(UnpicklerMemoProxyObject *self) +_pickle_UnpicklerMemoProxy_copy(UnpicklerMemoProxyObject *self) +/*[clinic checksum: f8856c4e8a33540886dfbb245f286af3008fa0ad]*/ { Py_ssize_t i; PyObject *new_memo = PyDict_New(); @@ -6291,15 +6558,28 @@ return NULL; } -PyDoc_STRVAR(ump_reduce_doc, -"memo.__reduce__(). Pickling support."); +/*[clinic] +_pickle.UnpicklerMemoProxy.__reduce__ + + self: UnpicklerMemoProxyObject + +Implement pickling support. 
+[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy___reduce____doc__, +"__reduce__()\n" +"Implement pickling support."); + +#define _PICKLE_UNPICKLERMEMOPROXY___REDUCE___METHODDEF \ + {"__reduce__", (PyCFunction)_pickle_UnpicklerMemoProxy___reduce__, METH_NOARGS, _pickle_UnpicklerMemoProxy___reduce____doc__}, static PyObject * -ump_reduce(UnpicklerMemoProxyObject *self, PyObject *args) +_pickle_UnpicklerMemoProxy___reduce__(UnpicklerMemoProxyObject *self) +/*[clinic checksum: ab5516a77659144e1191c7dd70a0c6c7455660bc]*/ { PyObject *reduce_value; PyObject *constructor_args; - PyObject *contents = ump_copy(self); + PyObject *contents = _pickle_UnpicklerMemoProxy_copy(self); if (contents == NULL) return NULL; @@ -6322,9 +6602,9 @@ } static PyMethodDef unpicklerproxy_methods[] = { - {"clear", (PyCFunction)ump_clear, METH_NOARGS, ump_clear_doc}, - {"copy", (PyCFunction)ump_copy, METH_NOARGS, ump_copy_doc}, - {"__reduce__", (PyCFunction)ump_reduce, METH_VARARGS, ump_reduce_doc}, + _PICKLE_UNPICKLERMEMOPROXY_CLEAR_METHODDEF + _PICKLE_UNPICKLERMEMOPROXY_COPY_METHODDEF + _PICKLE_UNPICKLERMEMOPROXY___REDUCE___METHODDEF {NULL, NULL} /* sentinel */ }; @@ -6548,7 +6828,7 @@ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, - Unpickler_doc, /*tp_doc*/ + _pickle_Unpickler___init____doc__, /*tp_doc*/ (traverseproc)Unpickler_traverse, /*tp_traverse*/ (inquiry)Unpickler_clear, /*tp_clear*/ 0, /*tp_richcompare*/ @@ -6563,21 +6843,53 @@ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ - (initproc)Unpickler_init, /*tp_init*/ + Unpickler_init, /*tp_init*/ PyType_GenericAlloc, /*tp_alloc*/ PyType_GenericNew, /*tp_new*/ PyObject_GC_Del, /*tp_free*/ 0, /*tp_is_gc*/ }; -PyDoc_STRVAR(pickle_dump_doc, -"dump(obj, file, protocol=None, *, fix_imports=True) -> None\n" +/*[clinic] + +_pickle.dump + + obj: object + file: object + protocol: object = NULL + * + fix_imports: bool = True + +Write a pickled representation of obj to the open file object file. + +This is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more +efficient. + +The optional protocol argument tells the pickler to use the given protocol +supported protocols are 0, 1, 2, 3. The default protocol is 3; a +backward-incompatible protocol designed for Python 3.0. + +Specifying a negative protocol version selects the highest protocol version +supported. The higher the protocol used, the more recent the version of +Python needed to read the pickle produced. + +The file argument must have a write() method that accepts a single bytes +argument. It can thus be a file object opened for binary writing, a +io.BytesIO instance, or any other custom object that meets this interface. + +If fix_imports is True and protocol is less than 3, pickle will try to +map the new Python 3.x names to the old module names used in Python 2.x, +so that the pickle data stream is readable with Python 2.x. +[clinic]*/ + +PyDoc_STRVAR(_pickle_dump__doc__, +"dump(obj, file, protocol=None, *, fix_imports=True)\n" +"Write a pickled representation of obj to the open file object file.\n" "\n" -"Write a pickled representation of obj to the open file object file. 
This\n" -"is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more\n" +"This is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more\n" "efficient.\n" "\n" -"The optional protocol argument tells the pickler to use the given protocol;\n" +"The optional protocol argument tells the pickler to use the given protocol\n" "supported protocols are 0, 1, 2, 3. The default protocol is 3; a\n" "backward-incompatible protocol designed for Python 3.0.\n" "\n" @@ -6591,35 +6903,44 @@ "\n" "If fix_imports is True and protocol is less than 3, pickle will try to\n" "map the new Python 3.x names to the old module names used in Python 2.x,\n" -"so that the pickle data stream is readable with Python 2.x.\n"); +"so that the pickle data stream is readable with Python 2.x."); + +#define _PICKLE_DUMP_METHODDEF \ + {"dump", (PyCFunction)_pickle_dump, METH_VARARGS|METH_KEYWORDS, _pickle_dump__doc__}, static PyObject * -pickle_dump(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"obj", "file", "protocol", "fix_imports", 0}; +_pickle_dump_impl(PyModuleDef *module, PyObject *obj, PyObject *file, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_dump(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"obj", "file", "protocol", "fix_imports", NULL}; PyObject *obj; PyObject *file; - PyObject *proto = NULL; - PyObject *fix_imports = Py_True; - PicklerObject *pickler; - - /* fix_imports is a keyword-only argument. */ - if (Py_SIZE(args) > 3) { - PyErr_Format(PyExc_TypeError, - "pickle.dump() takes at most 3 positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "OO|OO:dump", kwlist, - &obj, &file, &proto, &fix_imports)) - return NULL; - - pickler = _Pickler_New(); + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "OO|O$p:dump", _keywords, + &obj, &file, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_dump_impl(module, obj, file, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_dump_impl(PyModuleDef *module, PyObject *obj, PyObject *file, PyObject *protocol, int fix_imports) +/*[clinic checksum: e442721b16052d921b5e3fbd146d0a62e94a459e]*/ +{ + PicklerObject *pickler = _Pickler_New(); + if (pickler == NULL) return NULL; - if (_Pickler_SetProtocol(pickler, proto, fix_imports) < 0) + if (_Pickler_SetProtocol(pickler, protocol, fix_imports) < 0) goto error; if (_Pickler_SetOutputStream(pickler, file) < 0) @@ -6639,11 +6960,33 @@ return NULL; } -PyDoc_STRVAR(pickle_dumps_doc, -"dumps(obj, protocol=None, *, fix_imports=True) -> bytes\n" -"\n" -"Return the pickled representation of the object as a bytes\n" -"object, instead of writing it to a file.\n" +/*[clinic] + +_pickle.dumps + + obj: object + protocol: object = NULL + * + fix_imports: bool = True + +Return the pickled representation of the object as a bytes object. + +The optional protocol argument tells the pickler to use the given protocol; +supported protocols are 0, 1, 2, 3. The default protocol is 3; a +backward-incompatible protocol designed for Python 3.0. + +Specifying a negative protocol version selects the highest protocol version +supported. The higher the protocol used, the more recent the version of +Python needed to read the pickle produced. 
+ +If fix_imports is True and *protocol* is less than 3, pickle will try to +map the new Python 3.x names to the old module names used in Python 2.x, +so that the pickle data stream is readable with Python 2.x. +[clinic]*/ + +PyDoc_STRVAR(_pickle_dumps__doc__, +"dumps(obj, protocol=None, *, fix_imports=True)\n" +"Return the pickled representation of the object as a bytes object.\n" "\n" "The optional protocol argument tells the pickler to use the given protocol;\n" "supported protocols are 0, 1, 2, 3. The default protocol is 3; a\n" @@ -6655,35 +6998,44 @@ "\n" "If fix_imports is True and *protocol* is less than 3, pickle will try to\n" "map the new Python 3.x names to the old module names used in Python 2.x,\n" -"so that the pickle data stream is readable with Python 2.x.\n"); +"so that the pickle data stream is readable with Python 2.x."); + +#define _PICKLE_DUMPS_METHODDEF \ + {"dumps", (PyCFunction)_pickle_dumps, METH_VARARGS|METH_KEYWORDS, _pickle_dumps__doc__}, static PyObject * -pickle_dumps(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"obj", "protocol", "fix_imports", 0}; +_pickle_dumps_impl(PyModuleDef *module, PyObject *obj, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_dumps(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"obj", "protocol", "fix_imports", NULL}; PyObject *obj; - PyObject *proto = NULL; + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|O$p:dumps", _keywords, + &obj, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_dumps_impl(module, obj, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_dumps_impl(PyModuleDef *module, PyObject *obj, PyObject *protocol, int fix_imports) +/*[clinic checksum: df6262c4c487f537f47aec8a1709318204c1e174]*/ +{ PyObject *result; - PyObject *fix_imports = Py_True; - PicklerObject *pickler; - - /* fix_imports is a keyword-only argument. */ - if (Py_SIZE(args) > 2) { - PyErr_Format(PyExc_TypeError, - "pickle.dumps() takes at most 2 positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|OO:dumps", kwlist, - &obj, &proto, &fix_imports)) - return NULL; - - pickler = _Pickler_New(); + PicklerObject *pickler = _Pickler_New(); + if (pickler == NULL) return NULL; - if (_Pickler_SetProtocol(pickler, proto, fix_imports) < 0) + if (_Pickler_SetProtocol(pickler, protocol, fix_imports) < 0) goto error; if (dump(pickler, obj) < 0) @@ -6698,15 +7050,46 @@ return NULL; } -PyDoc_STRVAR(pickle_load_doc, -"load(file, *, fix_imports=True, encoding='ASCII', errors='strict') -> object\n" +/*[clinic] + +_pickle.load + + file: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +Return a reconstituted object from the pickle data stored in a file. + +This is equivalent to ``Unpickler(file).load()``, but may be more efficient. + +The protocol version of the pickle is detected automatically, so no protocol +argument is needed. Bytes past the pickled object's representation are +ignored. + +The argument file must have two methods, a read() method that takes an +integer argument, and a readline() method that requires no arguments. Both +methods should return bytes. Thus *file* can be a binary file object opened +for reading, a BytesIO object, or any other custom object that meets this +interface. 
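Since dump(), dumps(), load() and loads() remain ordinary module-level callables, the Clinic rewrite is also transparent to C code that drives the accelerator through the abstract call API. A rough round-trip sketch (illustrative only, with most error handling trimmed; it is not code from this changeset):

    #include <Python.h>

    static int
    pickle_roundtrip_example(void)
    {
        PyObject *pickle = PyImport_ImportModule("_pickle");
        if (pickle == NULL)
            return -1;

        PyObject *data = Py_BuildValue("[iii]", 1, 2, 3);
        /* dumps(obj, protocol): protocol 2 keeps the stream readable by Python 2.x */
        PyObject *blob = PyObject_CallMethod(pickle, "dumps", "Oi", data, 2);
        PyObject *copy = blob ? PyObject_CallMethod(pickle, "loads", "O", blob) : NULL;

        int ok = (copy != NULL &&
                  PyObject_RichCompareBool(data, copy, Py_EQ) == 1) ? 0 : -1;

        Py_XDECREF(copy);
        Py_XDECREF(blob);
        Py_XDECREF(data);
        Py_DECREF(pickle);
        return ok;
    }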
+ +Optional keyword arguments are fix_imports, encoding and errors, +which are used to control compatiblity support for pickle stream generated +by Python 2.x. If fix_imports is True, pickle will try to map the old +Python 2.x names to the new names used in Python 3.x. The encoding and +errors tell pickle how to decode 8-bit string instances pickled by Python +2.x; these default to 'ASCII' and 'strict', respectively. +[clinic]*/ + +PyDoc_STRVAR(_pickle_load__doc__, +"load(file, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" +"Return a reconstituted object from the pickle data stored in a file.\n" "\n" -"Read a pickled object representation from the open file object file and\n" -"return the reconstituted object hierarchy specified therein. This is\n" -"equivalent to ``Unpickler(file).load()``, but may be more efficient.\n" +"This is equivalent to ``Unpickler(file).load()``, but may be more efficient.\n" "\n" "The protocol version of the pickle is detected automatically, so no protocol\n" -"argument is needed. Bytes past the pickled object's representation are\n" +"argument is needed. Bytes past the pickled object\'s representation are\n" "ignored.\n" "\n" "The argument file must have two methods, a read() method that takes an\n" @@ -6720,32 +7103,41 @@ "by Python 2.x. If fix_imports is True, pickle will try to map the old\n" "Python 2.x names to the new names used in Python 3.x. The encoding and\n" "errors tell pickle how to decode 8-bit string instances pickled by Python\n" -"2.x; these default to 'ASCII' and 'strict', respectively.\n"); +"2.x; these default to \'ASCII\' and \'strict\', respectively."); + +#define _PICKLE_LOAD_METHODDEF \ + {"load", (PyCFunction)_pickle_load, METH_VARARGS|METH_KEYWORDS, _pickle_load__doc__}, static PyObject * -pickle_load(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "fix_imports", "encoding", "errors", 0}; +_pickle_load_impl(PyModuleDef *module, PyObject *file, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_load(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "fix_imports", "encoding", "errors", NULL}; PyObject *file; - PyObject *fix_imports = Py_True; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:load", _keywords, + &file, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_load_impl(module, file, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_load_impl(PyModuleDef *module, PyObject *file, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: e10796f6765b22ce48dca6940f11b3933853ca35]*/ +{ PyObject *result; - char *encoding = NULL; - char *errors = NULL; - UnpicklerObject *unpickler; - - /* fix_imports, encoding and errors are a keyword-only argument. 
*/ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "pickle.load() takes exactly one positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:load", kwlist, - &file, &fix_imports, &encoding, &errors)) - return NULL; - - unpickler = _Unpickler_New(); + UnpicklerObject *unpickler = _Unpickler_New(); + if (unpickler == NULL) return NULL; @@ -6755,9 +7147,7 @@ if (_Unpickler_SetInputEncoding(unpickler, encoding, errors) < 0) goto error; - unpickler->fix_imports = PyObject_IsTrue(fix_imports); - if (unpickler->fix_imports == -1) - goto error; + unpickler->fix_imports = fix_imports; result = load(unpickler); Py_DECREF(unpickler); @@ -6768,14 +7158,36 @@ return NULL; } -PyDoc_STRVAR(pickle_loads_doc, -"loads(input, *, fix_imports=True, encoding='ASCII', errors='strict') -> object\n" -"\n" -"Read a pickled object hierarchy from a bytes object and return the\n" -"reconstituted object hierarchy specified therein\n" +/*[clinic] + +_pickle.loads + + data: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +Return a reconstituted object from the given pickle data. + +The protocol version of the pickle is detected automatically, so no protocol +argument is needed. Bytes past the pickled object's representation are +ignored. + +Optional keyword arguments are fix_imports, encoding and errors, which +are used to control compatiblity support for pickle stream generated +by Python 2.x. If fix_imports is True, pickle will try to map the old +Python 2.x names to the new names used in Python 3.x. The encoding and +errors tell pickle how to decode 8-bit string instances pickled by Python +2.x; these default to 'ASCII' and 'strict', respectively. +[clinic]*/ + +PyDoc_STRVAR(_pickle_loads__doc__, +"loads(data, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" +"Return a reconstituted object from the given pickle data.\n" "\n" "The protocol version of the pickle is detected automatically, so no protocol\n" -"argument is needed. Bytes past the pickled object's representation are\n" +"argument is needed. Bytes past the pickled object\'s representation are\n" "ignored.\n" "\n" "Optional keyword arguments are fix_imports, encoding and errors, which\n" @@ -6783,44 +7195,51 @@ "by Python 2.x. If fix_imports is True, pickle will try to map the old\n" "Python 2.x names to the new names used in Python 3.x. 
The encoding and\n" "errors tell pickle how to decode 8-bit string instances pickled by Python\n" -"2.x; these default to 'ASCII' and 'strict', respectively.\n"); +"2.x; these default to \'ASCII\' and \'strict\', respectively."); + +#define _PICKLE_LOADS_METHODDEF \ + {"loads", (PyCFunction)_pickle_loads, METH_VARARGS|METH_KEYWORDS, _pickle_loads__doc__}, static PyObject * -pickle_loads(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"input", "fix_imports", "encoding", "errors", 0}; - PyObject *input; - PyObject *fix_imports = Py_True; +_pickle_loads_impl(PyModuleDef *module, PyObject *data, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_loads(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"data", "fix_imports", "encoding", "errors", NULL}; + PyObject *data; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:loads", _keywords, + &data, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_loads_impl(module, data, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_loads_impl(PyModuleDef *module, PyObject *data, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: 29ee725efcbf51a3533c19cb8261a8e267b7080a]*/ +{ PyObject *result; - char *encoding = NULL; - char *errors = NULL; - UnpicklerObject *unpickler; - - /* fix_imports, encoding and errors are a keyword-only argument. */ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "pickle.loads() takes exactly one positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:loads", kwlist, - &input, &fix_imports, &encoding, &errors)) - return NULL; - - unpickler = _Unpickler_New(); + UnpicklerObject *unpickler = _Unpickler_New(); + if (unpickler == NULL) return NULL; - if (_Unpickler_SetStringInput(unpickler, input) < 0) + if (_Unpickler_SetStringInput(unpickler, data) < 0) goto error; if (_Unpickler_SetInputEncoding(unpickler, encoding, errors) < 0) goto error; - unpickler->fix_imports = PyObject_IsTrue(fix_imports); - if (unpickler->fix_imports == -1) - goto error; + unpickler->fix_imports = fix_imports; result = load(unpickler); Py_DECREF(unpickler); @@ -6833,14 +7252,10 @@ static struct PyMethodDef pickle_methods[] = { - {"dump", (PyCFunction)pickle_dump, METH_VARARGS|METH_KEYWORDS, - pickle_dump_doc}, - {"dumps", (PyCFunction)pickle_dumps, METH_VARARGS|METH_KEYWORDS, - pickle_dumps_doc}, - {"load", (PyCFunction)pickle_load, METH_VARARGS|METH_KEYWORDS, - pickle_load_doc}, - {"loads", (PyCFunction)pickle_loads, METH_VARARGS|METH_KEYWORDS, - pickle_loads_doc}, + _PICKLE_DUMP_METHODDEF + _PICKLE_DUMPS_METHODDEF + _PICKLE_LOAD_METHODDEF + _PICKLE_LOADS_METHODDEF {NULL, NULL} /* sentinel */ }; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 21:39:41 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 21:39:41 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?release=3A_hgtouch_leaves_behind_a_?= =?utf-8?q?=2Epyc_file=2C_which_has_been_in_all_the_3=2E4_alphas=2E__Grrr?= =?utf-8?q?=2E?= Message-ID: <3dSNXY4NTsz7LjT@mail.python.org> http://hg.python.org/release/rev/2b3d34c79b36 changeset: 65:2b3d34c79b36 parent: 63:2dd672c67008 user: Larry Hastings date: Sun Nov 24 
12:38:31 2013 -0800 summary: hgtouch leaves behind a .pyc file, which has been in all the 3.4 alphas. Grrr. files: release.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/release.py b/release.py --- a/release.py +++ b/release.py @@ -281,9 +281,9 @@ os.utime(name, None) # Remove files we don't want to ship in tarballs. - print('Removing VCS .*ignore and .hg*') + print('Removing VCS .*ignore, .hg*, and the hgtouch.pyc we JUST CREATED') for name in ('.hgignore', '.hgeol', '.hgtags', '.hgtouch', - '.bzrignore', '.gitignore'): + '.bzrignore', '.gitignore', 'Tools/hg/hgtouch.pyc'): try: os.unlink(name) except OSError: -- Repository URL: http://hg.python.org/release From python-checkins at python.org Sun Nov 24 21:39:42 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 21:39:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?release_=28merge_default_-=3E_default?= =?utf-8?b?KTogTWVyZ2Uu?= Message-ID: <3dSNXZ5wKnz7LjW@mail.python.org> http://hg.python.org/release/rev/bd34f111a4e0 changeset: 66:bd34f111a4e0 parent: 65:2b3d34c79b36 parent: 64:1f49b5c5f6c9 user: Larry Hastings date: Sun Nov 24 12:38:56 2013 -0800 summary: Merge. files: release.py | 11 ----------- 1 files changed, 0 insertions(+), 11 deletions(-) diff --git a/release.py b/release.py --- a/release.py +++ b/release.py @@ -217,33 +217,23 @@ print('Making .tgz') base = os.path.basename(source) tgz = base + '.tgz' - bz = base + '.tar.bz2' xz = base + '.tar.xz' run_cmd(['tar cf - %s | gzip -9 > %s' % (source, tgz)]) - print("Making .tar.bz2") - run_cmd(['tar cf - %s | bzip2 -9 > %s' % (source, bz)]) print("Making .tar.xz") run_cmd(['tar cf - %s | xz > %s' % (source, xz)]) print('Calculating md5 sums') checksum_tgz = hashlib.md5() with open(tgz, 'rb') as data: checksum_tgz.update(data.read()) - checksum_bz2 = hashlib.md5() - with open(bz, 'rb') as data: - checksum_bz2.update(data.read()) checksum_xz = hashlib.md5() with open(xz, 'rb') as data: checksum_xz.update(data.read()) print(' %s %8s %s' % ( checksum_tgz.hexdigest(), int(os.path.getsize(tgz)), tgz)) print(' %s %8s %s' % ( - checksum_bz2.hexdigest(), int(os.path.getsize(bz)), bz)) - print(' %s %8s %s' % ( checksum_xz.hexdigest(), int(os.path.getsize(xz)), xz)) with open(tgz + '.md5', 'w', encoding="ascii") as fp: fp.write(checksum_tgz.hexdigest()) - with open(bz + '.md5', 'w', encoding="ascii") as fp: - fp.write(checksum_bz2.hexdigest()) with open(xz + '.md5', 'w', encoding="ascii") as fp: fp.write(checksum_xz.hexdigest()) @@ -252,7 +242,6 @@ run_cmd(['gpg -K | grep -A 1 "^sec"']) uid = input('Please enter key ID to use for signing: ') os.system('gpg -bas -u ' + uid + ' ' + tgz) - os.system('gpg -bas -u ' + uid + ' ' + bz) os.system('gpg -bas -u ' + uid + ' ' + xz) -- Repository URL: http://hg.python.org/release From python-checkins at python.org Sun Nov 24 22:13:50 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sun, 24 Nov 2013 22:13:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2315204=3A_Silence_?= =?utf-8?q?and_check_the_=27U=27_mode_deprecation_warnings_in_tests=2E?= Message-ID: <3dSPHy285fz7LjP@mail.python.org> http://hg.python.org/cpython/rev/694e2708b4a8 changeset: 87520:694e2708b4a8 user: Serhiy Storchaka date: Sun Nov 24 23:13:26 2013 +0200 summary: Issue #15204: Silence and check the 'U' mode deprecation warnings in tests. Changed deprecation message in the fileinput module. 
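The patch below settles on two idioms for the 'U' mode deprecation in the test suite: test.support.check_warnings() to silence an expected DeprecationWarning around a deprecated call, and unittest's assertWarns() to assert that the warning really fires. A minimal, self-contained sketch of both idioms, using a hypothetical deprecated_open() helper in place of the real fileinput/zipfile/codecs call sites:

import unittest
import warnings
from test import support

def deprecated_open(path):
    # Hypothetical stand-in for a call site that still uses 'U' mode;
    # all that matters here is that it emits the DeprecationWarning.
    warnings.warn("'U' mode is deprecated", DeprecationWarning, stacklevel=2)
    return open(path)

class UniversalNewlineDeprecationTest(unittest.TestCase):
    def test_silenced(self):
        # check_warnings(('', DeprecationWarning)) swallows the expected
        # warning so the rest of the test runs without noise.
        with support.check_warnings(('', DeprecationWarning)):
            f = deprecated_open(__file__)
        f.close()

    def test_raised(self):
        # assertWarns() fails unless the DeprecationWarning is emitted.
        with self.assertWarns(DeprecationWarning):
            deprecated_open(__file__).close()

if __name__ == '__main__':
    unittest.main()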
files: Lib/fileinput.py | 2 +- Lib/test/test_codecs.py | 4 ++- Lib/test/test_fileinput.py | 8 ++++-- Lib/test/test_io.py | 3 +- Lib/test/test_zipfile.py | 29 +++++++++++++++++++------ 5 files changed, 33 insertions(+), 13 deletions(-) diff --git a/Lib/fileinput.py b/Lib/fileinput.py --- a/Lib/fileinput.py +++ b/Lib/fileinput.py @@ -224,7 +224,7 @@ "'r', 'rU', 'U' and 'rb'") if 'U' in mode: import warnings - warnings.warn("Use of 'U' mode is deprecated", + warnings.warn("'U' mode is deprecated", DeprecationWarning, 2) self._mode = mode if openhook: diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -602,7 +602,9 @@ self.addCleanup(support.unlink, support.TESTFN) with open(support.TESTFN, 'wb') as fp: fp.write(s) - with codecs.open(support.TESTFN, 'U', encoding=self.encoding) as reader: + with support.check_warnings(('', DeprecationWarning)): + reader = codecs.open(support.TESTFN, 'U', encoding=self.encoding) + with reader: self.assertEqual(reader.read(), s1) class UTF16LETest(ReadTest, unittest.TestCase): diff --git a/Lib/test/test_fileinput.py b/Lib/test/test_fileinput.py --- a/Lib/test/test_fileinput.py +++ b/Lib/test/test_fileinput.py @@ -22,7 +22,7 @@ from io import StringIO from fileinput import FileInput, hook_encoded -from test.support import verbose, TESTFN, run_unittest +from test.support import verbose, TESTFN, run_unittest, check_warnings from test.support import unlink as safe_unlink @@ -224,8 +224,10 @@ try: # try opening in universal newline mode t1 = writeTmp(1, [b"A\nB\r\nC\rD"], mode="wb") - fi = FileInput(files=t1, mode="U") - lines = list(fi) + with check_warnings(('', DeprecationWarning)): + fi = FileInput(files=t1, mode="U") + with check_warnings(('', DeprecationWarning)): + lines = list(fi) self.assertEqual(lines, ["A\n", "B\n", "C\n", "D"]) finally: remove_tempfiles(t1) diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -2777,7 +2777,8 @@ self.assertEqual(f.mode, "wb") f.close() - f = self.open(support.TESTFN, "U") + with support.check_warnings(('', DeprecationWarning)): + f = self.open(support.TESTFN, "U") self.assertEqual(f.name, support.TESTFN) self.assertEqual(f.buffer.name, support.TESTFN) self.assertEqual(f.buffer.raw.name, support.TESTFN) diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -14,7 +14,7 @@ from test.support import (TESTFN, findfile, unlink, requires_zlib, requires_bz2, requires_lzma, - captured_stdout) + captured_stdout, check_warnings) TESTFN2 = TESTFN + "2" TESTFNDIR = TESTFN + "d" @@ -35,6 +35,10 @@ yield f test.assertFalse(f.closed) +def openU(zipfp, fn): + with check_warnings(('', DeprecationWarning)): + return zipfp.open(fn, 'rU') + class AbstractTestsWithSourceFile: @classmethod def setUpClass(cls): @@ -875,6 +879,17 @@ data += zipfp.read(info) self.assertIn(data, {b"foobar", b"barfoo"}) + def test_universal_deprecation(self): + f = io.BytesIO() + with zipfile.ZipFile(f, "w") as zipfp: + zipfp.writestr('spam.txt', b'ababagalamaga') + + with zipfile.ZipFile(f, "r") as zipfp: + for mode in 'U', 'rU': + with self.assertWarns(DeprecationWarning): + zipopen = zipfp.open('spam.txt', mode) + zipopen.close() + def test_universal_readaheads(self): f = io.BytesIO() @@ -884,7 +899,7 @@ data2 = b'' with zipfile.ZipFile(f, 'r') as zipfp, \ - zipfp.open(TESTFN, 'rU') as zipopen: + openU(zipfp, TESTFN) as zipopen: for line in zipopen: data2 += 
line @@ -1613,7 +1628,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: zipdata = fp.read() self.assertEqual(self.arcdata[sep], zipdata) @@ -1627,7 +1642,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as zipopen: + with openU(zipfp, fn) as zipopen: data = b'' while True: read = zipopen.readline() @@ -1652,7 +1667,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as zipopen: + with openU(zipfp, fn) as zipopen: for line in self.line_gen: linedata = zipopen.readline() self.assertEqual(linedata, line + b'\n') @@ -1667,7 +1682,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: ziplines = fp.readlines() for line, zipline in zip(self.line_gen, ziplines): self.assertEqual(zipline, line + b'\n') @@ -1682,7 +1697,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: for line, zipline in zip(self.line_gen, fp): self.assertEqual(zipline, line + b'\n') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 22:15:04 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 22:15:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?release=3A_Hacked_up_size=2Epy_so_it_?= =?utf-8?q?ignores_files_it_doesn=27t_care_about=2E?= Message-ID: <3dSPKN0z03z7Ljt@mail.python.org> http://hg.python.org/release/rev/45eaf8ee1ad2 changeset: 67:45eaf8ee1ad2 user: Larry Hastings date: Sun Nov 24 13:14:38 2013 -0800 summary: Hacked up size.py so it ignores files it doesn't care about. files: size.py | 10 ++++++++-- 1 files changed, 8 insertions(+), 2 deletions(-) diff --git a/size.py b/size.py --- a/size.py +++ b/size.py @@ -11,7 +11,11 @@ # For consistency with historical use. sort_order = dict((ext, i) for i, ext in enumerate( - ('tgz', 'tar.bz2', 'amd64.msi', 'msi', 'dmg'))) + ('tgz', 'tar.bz2', 'tar.xz', 'pdb.zip', 'amd64.msi', 'msi', 'chm', 'dmg'))) + + +def ignore(filename): + return not any(filename.endswith(DOT + ext) for ext in sort_order) def key(filename): @@ -21,10 +25,12 @@ if ext not in sort_order: ext = parts[-1] # Let KeyError propagate. - return sort_order[ext] + return sort_order.get(ext, 9999) for filename in sorted(sys.argv[1:], key=key): + if ignore(filename): + continue md5 = hashlib.md5() with open(filename, 'rb') as fp: md5.update(fp.read()) -- Repository URL: http://hg.python.org/release From python-checkins at python.org Sun Nov 24 23:07:19 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 23:07:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Updated_pydoc_topics_for_3?= =?utf-8?b?LjQuMGIxLg==?= Message-ID: <3dSQTg0ZGQz7Lkl@mail.python.org> http://hg.python.org/cpython/rev/0f4af2a31b1e changeset: 87521:0f4af2a31b1e parent: 87504:790b1dc94fe0 user: Larry Hastings date: Sun Nov 24 06:53:15 2013 -0800 summary: Updated pydoc topics for 3.4.0b1. 
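The regenerated file is a plain module-level dict keyed by topic name ('assert', 'assignment', 'formatstrings', ...), whose values are the plain-text renderings served by interactive help(). A minimal sketch of reading it directly, assuming a checkout or install where Lib/pydoc_data/topics.py is importable as pydoc_data.topics:

from pydoc_data import topics

# Keys visible in this regeneration include 'assert', 'assignment' and
# 'formatstrings'; each value is a ready-to-print text block.
print(len(topics.topics), "help topics")
print(topics.topics['assert'][:200])

# Interactive help('assert') renders the same entry via pydoc.Helper.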
files: Lib/pydoc_data/topics.py | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/pydoc_data/topics.py b/Lib/pydoc_data/topics.py --- a/Lib/pydoc_data/topics.py +++ b/Lib/pydoc_data/topics.py @@ -1,5 +1,5 @@ # -*- coding: utf-8 -*- -# Autogenerated by Sphinx on Sun Oct 20 01:58:36 2013 +# Autogenerated by Sphinx on Sun Nov 24 06:50:34 2013 topics = {'assert': '\nThe ``assert`` statement\n************************\n\nAssert statements are a convenient way to insert debugging assertions\ninto a program:\n\n assert_stmt ::= "assert" expression ["," expression]\n\nThe simple form, ``assert expression``, is equivalent to\n\n if __debug__:\n if not expression: raise AssertionError\n\nThe extended form, ``assert expression1, expression2``, is equivalent\nto\n\n if __debug__:\n if not expression1: raise AssertionError(expression2)\n\nThese equivalences assume that ``__debug__`` and ``AssertionError``\nrefer to the built-in variables with those names. In the current\nimplementation, the built-in variable ``__debug__`` is ``True`` under\nnormal circumstances, ``False`` when optimization is requested\n(command line option -O). The current code generator emits no code\nfor an assert statement when optimization is requested at compile\ntime. Note that it is unnecessary to include the source code for the\nexpression that failed in the error message; it will be displayed as\npart of the stack trace.\n\nAssignments to ``__debug__`` are illegal. The value for the built-in\nvariable is determined when the interpreter starts.\n', 'assignment': '\nAssignment statements\n*********************\n\nAssignment statements are used to (re)bind names to values and to\nmodify attributes or items of mutable objects:\n\n assignment_stmt ::= (target_list "=")+ (expression_list | yield_expression)\n target_list ::= target ("," target)* [","]\n target ::= identifier\n | "(" target_list ")"\n | "[" target_list "]"\n | attributeref\n | subscription\n | slicing\n | "*" target\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn assignment statement evaluates the expression list (remember that\nthis can be a single expression or a comma-separated list, the latter\nyielding a tuple) and assigns the single resulting object to each of\nthe target lists, from left to right.\n\nAssignment is defined recursively depending on the form of the target\n(list). When a target is part of a mutable object (an attribute\nreference, subscription or slicing), the mutable object must\nultimately perform the assignment and decide about its validity, and\nmay raise an exception if the assignment is unacceptable. The rules\nobserved by various types and the exceptions raised are given with the\ndefinition of the object types (see section *The standard type\nhierarchy*).\n\nAssignment of an object to a target list, optionally enclosed in\nparentheses or square brackets, is recursively defined as follows.\n\n* If the target list is a single target: The object is assigned to\n that target.\n\n* If the target list is a comma-separated list of targets: The object\n must be an iterable with the same number of items as there are\n targets in the target list, and the items are assigned, from left to\n right, to the corresponding targets.\n\n * If the target list contains one target prefixed with an asterisk,\n called a "starred" target: The object must be a sequence with at\n least as many items as there are targets in the target list, minus\n one. 
The first items of the sequence are assigned, from left to\n right, to the targets before the starred target. The final items\n of the sequence are assigned to the targets after the starred\n target. A list of the remaining items in the sequence is then\n assigned to the starred target (the list can be empty).\n\n * Else: The object must be a sequence with the same number of items\n as there are targets in the target list, and the items are\n assigned, from left to right, to the corresponding targets.\n\nAssignment of an object to a single target is recursively defined as\nfollows.\n\n* If the target is an identifier (name):\n\n * If the name does not occur in a ``global`` or ``nonlocal``\n statement in the current code block: the name is bound to the\n object in the current local namespace.\n\n * Otherwise: the name is bound to the object in the global namespace\n or the outer namespace determined by ``nonlocal``, respectively.\n\n The name is rebound if it was already bound. This may cause the\n reference count for the object previously bound to the name to reach\n zero, causing the object to be deallocated and its destructor (if it\n has one) to be called.\n\n* If the target is a target list enclosed in parentheses or in square\n brackets: The object must be an iterable with the same number of\n items as there are targets in the target list, and its items are\n assigned, from left to right, to the corresponding targets.\n\n* If the target is an attribute reference: The primary expression in\n the reference is evaluated. It should yield an object with\n assignable attributes; if this is not the case, ``TypeError`` is\n raised. That object is then asked to assign the assigned object to\n the given attribute; if it cannot perform the assignment, it raises\n an exception (usually but not necessarily ``AttributeError``).\n\n Note: If the object is a class instance and the attribute reference\n occurs on both sides of the assignment operator, the RHS expression,\n ``a.x`` can access either an instance attribute or (if no instance\n attribute exists) a class attribute. The LHS target ``a.x`` is\n always set as an instance attribute, creating it if necessary.\n Thus, the two occurrences of ``a.x`` do not necessarily refer to the\n same attribute: if the RHS expression refers to a class attribute,\n the LHS creates a new instance attribute as the target of the\n assignment:\n\n class Cls:\n x = 3 # class variable\n inst = Cls()\n inst.x = inst.x + 1 # writes inst.x as 4 leaving Cls.x as 3\n\n This description does not necessarily apply to descriptor\n attributes, such as properties created with ``property()``.\n\n* If the target is a subscription: The primary expression in the\n reference is evaluated. It should yield either a mutable sequence\n object (such as a list) or a mapping object (such as a dictionary).\n Next, the subscript expression is evaluated.\n\n If the primary is a mutable sequence object (such as a list), the\n subscript must yield an integer. If it is negative, the sequence\'s\n length is added to it. The resulting value must be a nonnegative\n integer less than the sequence\'s length, and the sequence is asked\n to assign the assigned object to its item with that index. 
If the\n index is out of range, ``IndexError`` is raised (assignment to a\n subscripted sequence cannot add new items to a list).\n\n If the primary is a mapping object (such as a dictionary), the\n subscript must have a type compatible with the mapping\'s key type,\n and the mapping is then asked to create a key/datum pair which maps\n the subscript to the assigned object. This can either replace an\n existing key/value pair with the same key value, or insert a new\n key/value pair (if no key with the same value existed).\n\n For user-defined objects, the ``__setitem__()`` method is called\n with appropriate arguments.\n\n* If the target is a slicing: The primary expression in the reference\n is evaluated. It should yield a mutable sequence object (such as a\n list). The assigned object should be a sequence object of the same\n type. Next, the lower and upper bound expressions are evaluated,\n insofar they are present; defaults are zero and the sequence\'s\n length. The bounds should evaluate to integers. If either bound is\n negative, the sequence\'s length is added to it. The resulting\n bounds are clipped to lie between zero and the sequence\'s length,\n inclusive. Finally, the sequence object is asked to replace the\n slice with the items of the assigned sequence. The length of the\n slice may be different from the length of the assigned sequence,\n thus changing the length of the target sequence, if the object\n allows it.\n\n**CPython implementation detail:** In the current implementation, the\nsyntax for targets is taken to be the same as for expressions, and\ninvalid syntax is rejected during the code generation phase, causing\nless detailed error messages.\n\nWARNING: Although the definition of assignment implies that overlaps\nbetween the left-hand side and the right-hand side are \'safe\' (for\nexample ``a, b = b, a`` swaps two variables), overlaps *within* the\ncollection of assigned-to variables are not safe! For instance, the\nfollowing program prints ``[0, 2]``:\n\n x = [0, 1]\n i = 0\n i, x[i] = 1, 2\n print(x)\n\nSee also:\n\n **PEP 3132** - Extended Iterable Unpacking\n The specification for the ``*target`` feature.\n\n\nAugmented assignment statements\n===============================\n\nAugmented assignment is the combination, in a single statement, of a\nbinary operation and an assignment statement:\n\n augmented_assignment_stmt ::= augtarget augop (expression_list | yield_expression)\n augtarget ::= identifier | attributeref | subscription | slicing\n augop ::= "+=" | "-=" | "*=" | "/=" | "//=" | "%=" | "**="\n | ">>=" | "<<=" | "&=" | "^=" | "|="\n\n(See section *Primaries* for the syntax definitions for the last three\nsymbols.)\n\nAn augmented assignment evaluates the target (which, unlike normal\nassignment statements, cannot be an unpacking) and the expression\nlist, performs the binary operation specific to the type of assignment\non the two operands, and assigns the result to the original target.\nThe target is only evaluated once.\n\nAn augmented assignment expression like ``x += 1`` can be rewritten as\n``x = x + 1`` to achieve a similar, but not exactly equal effect. In\nthe augmented version, ``x`` is only evaluated once. 
Also, when\npossible, the actual operation is performed *in-place*, meaning that\nrather than creating a new object and assigning that to the target,\nthe old object is modified instead.\n\nWith the exception of assigning to tuples and multiple targets in a\nsingle statement, the assignment done by augmented assignment\nstatements is handled the same way as normal assignments. Similarly,\nwith the exception of the possible *in-place* behavior, the binary\noperation performed by augmented assignment is the same as the normal\nbinary operations.\n\nFor targets which are attribute references, the same *caveat about\nclass and instance attributes* applies as for regular assignments.\n', 'atom-identifiers': '\nIdentifiers (Names)\n*******************\n\nAn identifier occurring as an atom is a name. See section\n*Identifiers and keywords* for lexical definition and section *Naming\nand binding* for documentation of naming and binding.\n\nWhen the name is bound to an object, evaluation of the atom yields\nthat object. When a name is not bound, an attempt to evaluate it\nraises a ``NameError`` exception.\n\n**Private name mangling:** When an identifier that textually occurs in\na class definition begins with two or more underscore characters and\ndoes not end in two or more underscores, it is considered a *private\nname* of that class. Private names are transformed to a longer form\nbefore code is generated for them. The transformation inserts the\nclass name, with leading underscores removed and a single underscore\ninserted, in front of the name. For example, the identifier\n``__spam`` occurring in a class named ``Ham`` will be transformed to\n``_Ham__spam``. This transformation is independent of the syntactical\ncontext in which the identifier is used. If the transformed name is\nextremely long (longer than 255 characters), implementation defined\ntruncation may happen. If the class name consists only of underscores,\nno transformation is done.\n', @@ -34,7 +34,7 @@ 'exprlists': '\nExpression lists\n****************\n\n expression_list ::= expression ( "," expression )* [","]\n\nAn expression list containing at least one comma yields a tuple. The\nlength of the tuple is the number of expressions in the list. The\nexpressions are evaluated from left to right.\n\nThe trailing comma is required only to create a single tuple (a.k.a. a\n*singleton*); it is optional in all other cases. A single expression\nwithout a trailing comma doesn\'t create a tuple, but rather yields the\nvalue of that expression. (To create an empty tuple, use an empty pair\nof parentheses: ``()``.)\n', 'floating': '\nFloating point literals\n***********************\n\nFloating point literals are described by the following lexical\ndefinitions:\n\n floatnumber ::= pointfloat | exponentfloat\n pointfloat ::= [intpart] fraction | intpart "."\n exponentfloat ::= (intpart | pointfloat) exponent\n intpart ::= digit+\n fraction ::= "." digit+\n exponent ::= ("e" | "E") ["+" | "-"] digit+\n\nNote that the integer and exponent parts are always interpreted using\nradix 10. For example, ``077e010`` is legal, and denotes the same\nnumber as ``77e10``. The allowed range of floating point literals is\nimplementation-dependent. Some examples of floating point literals:\n\n 3.14 10. 
.001 1e100 3.14e-10 0e0\n\nNote that numeric literals do not include a sign; a phrase like ``-1``\nis actually an expression composed of the unary operator ``-`` and the\nliteral ``1``.\n', 'for': '\nThe ``for`` statement\n*********************\n\nThe ``for`` statement is used to iterate over the elements of a\nsequence (such as a string, tuple or list) or other iterable object:\n\n for_stmt ::= "for" target_list "in" expression_list ":" suite\n ["else" ":" suite]\n\nThe expression list is evaluated once; it should yield an iterable\nobject. An iterator is created for the result of the\n``expression_list``. The suite is then executed once for each item\nprovided by the iterator, in the order of ascending indices. Each\nitem in turn is assigned to the target list using the standard rules\nfor assignments (see *Assignment statements*), and then the suite is\nexecuted. When the items are exhausted (which is immediately when the\nsequence is empty or an iterator raises a ``StopIteration``\nexception), the suite in the ``else`` clause, if present, is executed,\nand the loop terminates.\n\nA ``break`` statement executed in the first suite terminates the loop\nwithout executing the ``else`` clause\'s suite. A ``continue``\nstatement executed in the first suite skips the rest of the suite and\ncontinues with the next item, or with the ``else`` clause if there was\nno next item.\n\nThe suite may assign to the variable(s) in the target list; this does\nnot affect the next item assigned to it.\n\nNames in the target list are not deleted when the loop is finished,\nbut if the sequence is empty, it will not have been assigned to at all\nby the loop. Hint: the built-in function ``range()`` returns an\niterator of integers suitable to emulate the effect of Pascal\'s ``for\ni := a to b do``; e.g., ``list(range(3))`` returns the list ``[0, 1,\n2]``.\n\nNote: There is a subtlety when the sequence is being modified by the loop\n (this can only occur for mutable sequences, i.e. lists). An\n internal counter is used to keep track of which item is used next,\n and this is incremented on each iteration. When this counter has\n reached the length of the sequence the loop terminates. This means\n that if the suite deletes the current (or a previous) item from the\n sequence, the next item will be skipped (since it gets the index of\n the current item which has already been treated). Likewise, if the\n suite inserts an item in the sequence before the current item, the\n current item will be treated again the next time through the loop.\n This can lead to nasty bugs that can be avoided by making a\n temporary copy using a slice of the whole sequence, e.g.,\n\n for x in a[:]:\n if x < 0: a.remove(x)\n', - 'formatstrings': '\nFormat String Syntax\n********************\n\nThe ``str.format()`` method and the ``Formatter`` class share the same\nsyntax for format strings (although in the case of ``Formatter``,\nsubclasses can define their own format string syntax).\n\nFormat strings contain "replacement fields" surrounded by curly braces\n``{}``. Anything that is not contained in braces is considered literal\ntext, which is copied unchanged to the output. If you need to include\na brace character in the literal text, it can be escaped by doubling:\n``{{`` and ``}}``.\n\nThe grammar for a replacement field is as follows:\n\n replacement_field ::= "{" [field_name] ["!" conversion] [":" format_spec] "}"\n field_name ::= arg_name ("." 
attribute_name | "[" element_index "]")*\n arg_name ::= [identifier | integer]\n attribute_name ::= identifier\n element_index ::= integer | index_string\n index_string ::= +\n conversion ::= "r" | "s" | "a"\n format_spec ::= \n\nIn less formal terms, the replacement field can start with a\n*field_name* that specifies the object whose value is to be formatted\nand inserted into the output instead of the replacement field. The\n*field_name* is optionally followed by a *conversion* field, which is\npreceded by an exclamation point ``\'!\'``, and a *format_spec*, which\nis preceded by a colon ``\':\'``. These specify a non-default format\nfor the replacement value.\n\nSee also the *Format Specification Mini-Language* section.\n\nThe *field_name* itself begins with an *arg_name* that is either a\nnumber or a keyword. If it\'s a number, it refers to a positional\nargument, and if it\'s a keyword, it refers to a named keyword\nargument. If the numerical arg_names in a format string are 0, 1, 2,\n... in sequence, they can all be omitted (not just some) and the\nnumbers 0, 1, 2, ... will be automatically inserted in that order.\nBecause *arg_name* is not quote-delimited, it is not possible to\nspecify arbitrary dictionary keys (e.g., the strings ``\'10\'`` or\n``\':-]\'``) within a format string. The *arg_name* can be followed by\nany number of index or attribute expressions. An expression of the\nform ``\'.name\'`` selects the named attribute using ``getattr()``,\nwhile an expression of the form ``\'[index]\'`` does an index lookup\nusing ``__getitem__()``.\n\nChanged in version 3.1: The positional argument specifiers can be\nomitted, so ``\'{} {}\'`` is equivalent to ``\'{0} {1}\'``.\n\nSome simple format string examples:\n\n "First, thou shalt count to {0}" # References first positional argument\n "Bring me a {}" # Implicitly references the first positional argument\n "From {} to {}" # Same as "From {0} to {1}"\n "My quest is {name}" # References keyword argument \'name\'\n "Weight in tons {0.weight}" # \'weight\' attribute of first positional arg\n "Units destroyed: {players[0]}" # First element of keyword argument \'players\'.\n\nThe *conversion* field causes a type coercion before formatting.\nNormally, the job of formatting a value is done by the\n``__format__()`` method of the value itself. However, in some cases\nit is desirable to force a type to be formatted as a string,\noverriding its own definition of formatting. By converting the value\nto a string before calling ``__format__()``, the normal formatting\nlogic is bypassed.\n\nThree conversion flags are currently supported: ``\'!s\'`` which calls\n``str()`` on the value, ``\'!r\'`` which calls ``repr()`` and ``\'!a\'``\nwhich calls ``ascii()``.\n\nSome examples:\n\n "Harold\'s a clever {0!s}" # Calls str() on the argument first\n "Bring out the holy {name!r}" # Calls repr() on the argument first\n "More {!a}" # Calls ascii() on the argument first\n\nThe *format_spec* field contains a specification of how the value\nshould be presented, including such details as field width, alignment,\npadding, decimal precision and so on. Each value type can define its\nown "formatting mini-language" or interpretation of the *format_spec*.\n\nMost built-in types support a common formatting mini-language, which\nis described in the next section.\n\nA *format_spec* field can also include nested replacement fields\nwithin it. These nested replacement fields can contain only a field\nname; conversion flags and format specifications are not allowed. 
The\nreplacement fields within the format_spec are substituted before the\n*format_spec* string is interpreted. This allows the formatting of a\nvalue to be dynamically specified.\n\nSee the *Format examples* section for some examples.\n\n\nFormat Specification Mini-Language\n==================================\n\n"Format specifications" are used within replacement fields contained\nwithin a format string to define how individual values are presented\n(see *Format String Syntax*). They can also be passed directly to the\nbuilt-in ``format()`` function. Each formattable type may define how\nthe format specification is to be interpreted.\n\nMost built-in types implement the following options for format\nspecifications, although some of the formatting options are only\nsupported by the numeric types.\n\nA general convention is that an empty format string (``""``) produces\nthe same result as if you had called ``str()`` on the value. A non-\nempty format string typically modifies the result.\n\nThe general form of a *standard format specifier* is:\n\n format_spec ::= [[fill]align][sign][#][0][width][,][.precision][type]\n fill ::= \n align ::= "<" | ">" | "=" | "^"\n sign ::= "+" | "-" | " "\n width ::= integer\n precision ::= integer\n type ::= "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%"\n\nThe *fill* character can be any character other than \'{\' or \'}\'. The\npresence of a fill character is signaled by the character following\nit, which must be one of the alignment options. If the second\ncharacter of *format_spec* is not a valid alignment option, then it is\nassumed that both the fill character and the alignment option are\nabsent.\n\nThe meaning of the various alignment options is as follows:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'<\'`` | Forces the field to be left-aligned within the available |\n | | space (this is the default for most objects). |\n +-----------+------------------------------------------------------------+\n | ``\'>\'`` | Forces the field to be right-aligned within the available |\n | | space (this is the default for numbers). |\n +-----------+------------------------------------------------------------+\n | ``\'=\'`` | Forces the padding to be placed after the sign (if any) |\n | | but before the digits. This is used for printing fields |\n | | in the form \'+000000120\'. This alignment option is only |\n | | valid for numeric types. |\n +-----------+------------------------------------------------------------+\n | ``\'^\'`` | Forces the field to be centered within the available |\n | | space. |\n +-----------+------------------------------------------------------------+\n\nNote that unless a minimum field width is defined, the field width\nwill always be the same size as the data to fill it, so that the\nalignment option has no meaning in this case.\n\nThe *sign* option is only valid for number types, and can be one of\nthe following:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'+\'`` | indicates that a sign should be used for both positive as |\n | | well as negative numbers. 
|\n +-----------+------------------------------------------------------------+\n | ``\'-\'`` | indicates that a sign should be used only for negative |\n | | numbers (this is the default behavior). |\n +-----------+------------------------------------------------------------+\n | space | indicates that a leading space should be used on positive |\n | | numbers, and a minus sign on negative numbers. |\n +-----------+------------------------------------------------------------+\n\nThe ``\'#\'`` option causes the "alternate form" to be used for the\nconversion. The alternate form is defined differently for different\ntypes. This option is only valid for integer, float, complex and\nDecimal types. For integers, when binary, octal, or hexadecimal output\nis used, this option adds the prefix respective ``\'0b\'``, ``\'0o\'``, or\n``\'0x\'`` to the output value. For floats, complex and Decimal the\nalternate form causes the result of the conversion to always contain a\ndecimal-point character, even if no digits follow it. Normally, a\ndecimal-point character appears in the result of these conversions\nonly if a digit follows it. In addition, for ``\'g\'`` and ``\'G\'``\nconversions, trailing zeros are not removed from the result.\n\nThe ``\',\'`` option signals the use of a comma for a thousands\nseparator. For a locale aware separator, use the ``\'n\'`` integer\npresentation type instead.\n\nChanged in version 3.1: Added the ``\',\'`` option (see also **PEP\n378**).\n\n*width* is a decimal integer defining the minimum field width. If not\nspecified, then the field width will be determined by the content.\n\nPreceding the *width* field by a zero (``\'0\'``) character enables\nsign-aware zero-padding for numeric types. This is equivalent to a\n*fill* character of ``\'0\'`` with an *alignment* type of ``\'=\'``.\n\nThe *precision* is a decimal number indicating how many digits should\nbe displayed after the decimal point for a floating point value\nformatted with ``\'f\'`` and ``\'F\'``, or before and after the decimal\npoint for a floating point value formatted with ``\'g\'`` or ``\'G\'``.\nFor non-number types the field indicates the maximum field size - in\nother words, how many characters will be used from the field content.\nThe *precision* is not allowed for integer values.\n\nFinally, the *type* determines how the data should be presented.\n\nThe available string presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'s\'`` | String format. This is the default type for strings and |\n | | may be omitted. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'s\'``. |\n +-----------+------------------------------------------------------------+\n\nThe available integer presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'b\'`` | Binary format. Outputs the number in base 2. |\n +-----------+------------------------------------------------------------+\n | ``\'c\'`` | Character. Converts the integer to the corresponding |\n | | unicode character before printing. |\n +-----------+------------------------------------------------------------+\n | ``\'d\'`` | Decimal Integer. Outputs the number in base 10. 
|\n +-----------+------------------------------------------------------------+\n | ``\'o\'`` | Octal format. Outputs the number in base 8. |\n +-----------+------------------------------------------------------------+\n | ``\'x\'`` | Hex format. Outputs the number in base 16, using lower- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'X\'`` | Hex format. Outputs the number in base 16, using upper- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'d\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'d\'``. |\n +-----------+------------------------------------------------------------+\n\nIn addition to the above presentation types, integers can be formatted\nwith the floating point presentation types listed below (except\n``\'n\'`` and None). When doing so, ``float()`` is used to convert the\ninteger to a floating point number before formatting.\n\nThe available presentation types for floating point and decimal values\nare:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'e\'`` | Exponent notation. Prints the number in scientific |\n | | notation using the letter \'e\' to indicate the exponent. |\n | | The default precision is ``6``. |\n +-----------+------------------------------------------------------------+\n | ``\'E\'`` | Exponent notation. Same as ``\'e\'`` except it uses an upper |\n | | case \'E\' as the separator character. |\n +-----------+------------------------------------------------------------+\n | ``\'f\'`` | Fixed point. Displays the number as a fixed-point number. |\n | | The default precision is ``6``. |\n +-----------+------------------------------------------------------------+\n | ``\'F\'`` | Fixed point. Same as ``\'f\'``, but converts ``nan`` to |\n | | ``NAN`` and ``inf`` to ``INF``. |\n +-----------+------------------------------------------------------------+\n | ``\'g\'`` | General format. For a given precision ``p >= 1``, this |\n | | rounds the number to ``p`` significant digits and then |\n | | formats the result in either fixed-point format or in |\n | | scientific notation, depending on its magnitude. The |\n | | precise rules are as follows: suppose that the result |\n | | formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1`` would have exponent ``exp``. Then if ``-4 <= exp |\n | | < p``, the number is formatted with presentation type |\n | | ``\'f\'`` and precision ``p-1-exp``. Otherwise, the number |\n | | is formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1``. In both cases insignificant trailing zeros are |\n | | removed from the significand, and the decimal point is |\n | | also removed if there are no remaining digits following |\n | | it. Positive and negative infinity, positive and negative |\n | | zero, and nans, are formatted as ``inf``, ``-inf``, ``0``, |\n | | ``-0`` and ``nan`` respectively, regardless of the |\n | | precision. A precision of ``0`` is treated as equivalent |\n | | to a precision of ``1``. The default precision is ``6``. 
|\n +-----------+------------------------------------------------------------+\n | ``\'G\'`` | General format. Same as ``\'g\'`` except switches to ``\'E\'`` |\n | | if the number gets too large. The representations of |\n | | infinity and NaN are uppercased, too. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'g\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | ``\'%\'`` | Percentage. Multiplies the number by 100 and displays in |\n | | fixed (``\'f\'``) format, followed by a percent sign. |\n +-----------+------------------------------------------------------------+\n | None | Similar to ``\'g\'``, except with at least one digit past |\n | | the decimal point and a default precision of 12. This is |\n | | intended to match ``str()``, except you can add the other |\n | | format modifiers. |\n +-----------+------------------------------------------------------------+\n\n\nFormat examples\n===============\n\nThis section contains examples of the new format syntax and comparison\nwith the old ``%``-formatting.\n\nIn most of the cases the syntax is similar to the old\n``%``-formatting, with the addition of the ``{}`` and with ``:`` used\ninstead of ``%``. For example, ``\'%03.2f\'`` can be translated to\n``\'{:03.2f}\'``.\n\nThe new format syntax also supports new and different options, shown\nin the follow examples.\n\nAccessing arguments by position:\n\n >>> \'{0}, {1}, {2}\'.format(\'a\', \'b\', \'c\')\n \'a, b, c\'\n >>> \'{}, {}, {}\'.format(\'a\', \'b\', \'c\') # 3.1+ only\n \'a, b, c\'\n >>> \'{2}, {1}, {0}\'.format(\'a\', \'b\', \'c\')\n \'c, b, a\'\n >>> \'{2}, {1}, {0}\'.format(*\'abc\') # unpacking argument sequence\n \'c, b, a\'\n >>> \'{0}{1}{0}\'.format(\'abra\', \'cad\') # arguments\' indices can be repeated\n \'abracadabra\'\n\nAccessing arguments by name:\n\n >>> \'Coordinates: {latitude}, {longitude}\'.format(latitude=\'37.24N\', longitude=\'-115.81W\')\n \'Coordinates: 37.24N, -115.81W\'\n >>> coord = {\'latitude\': \'37.24N\', \'longitude\': \'-115.81W\'}\n >>> \'Coordinates: {latitude}, {longitude}\'.format(**coord)\n \'Coordinates: 37.24N, -115.81W\'\n\nAccessing arguments\' attributes:\n\n >>> c = 3-5j\n >>> (\'The complex number {0} is formed from the real part {0.real} \'\n ... \'and the imaginary part {0.imag}.\').format(c)\n \'The complex number (3-5j) is formed from the real part 3.0 and the imaginary part -5.0.\'\n >>> class Point:\n ... def __init__(self, x, y):\n ... self.x, self.y = x, y\n ... def __str__(self):\n ... 
return \'Point({self.x}, {self.y})\'.format(self=self)\n ...\n >>> str(Point(4, 2))\n \'Point(4, 2)\'\n\nAccessing arguments\' items:\n\n >>> coord = (3, 5)\n >>> \'X: {0[0]}; Y: {0[1]}\'.format(coord)\n \'X: 3; Y: 5\'\n\nReplacing ``%s`` and ``%r``:\n\n >>> "repr() shows quotes: {!r}; str() doesn\'t: {!s}".format(\'test1\', \'test2\')\n "repr() shows quotes: \'test1\'; str() doesn\'t: test2"\n\nAligning the text and specifying a width:\n\n >>> \'{:<30}\'.format(\'left aligned\')\n \'left aligned \'\n >>> \'{:>30}\'.format(\'right aligned\')\n \' right aligned\'\n >>> \'{:^30}\'.format(\'centered\')\n \' centered \'\n >>> \'{:*^30}\'.format(\'centered\') # use \'*\' as a fill char\n \'***********centered***********\'\n\nReplacing ``%+f``, ``%-f``, and ``% f`` and specifying a sign:\n\n >>> \'{:+f}; {:+f}\'.format(3.14, -3.14) # show it always\n \'+3.140000; -3.140000\'\n >>> \'{: f}; {: f}\'.format(3.14, -3.14) # show a space for positive numbers\n \' 3.140000; -3.140000\'\n >>> \'{:-f}; {:-f}\'.format(3.14, -3.14) # show only the minus -- same as \'{:f}; {:f}\'\n \'3.140000; -3.140000\'\n\nReplacing ``%x`` and ``%o`` and converting the value to different\nbases:\n\n >>> # format also supports binary numbers\n >>> "int: {0:d}; hex: {0:x}; oct: {0:o}; bin: {0:b}".format(42)\n \'int: 42; hex: 2a; oct: 52; bin: 101010\'\n >>> # with 0x, 0o, or 0b as prefix:\n >>> "int: {0:d}; hex: {0:#x}; oct: {0:#o}; bin: {0:#b}".format(42)\n \'int: 42; hex: 0x2a; oct: 0o52; bin: 0b101010\'\n\nUsing the comma as a thousands separator:\n\n >>> \'{:,}\'.format(1234567890)\n \'1,234,567,890\'\n\nExpressing a percentage:\n\n >>> points = 19\n >>> total = 22\n >>> \'Correct answers: {:.2%}\'.format(points/total)\n \'Correct answers: 86.36%\'\n\nUsing type-specific formatting:\n\n >>> import datetime\n >>> d = datetime.datetime(2010, 7, 4, 12, 15, 58)\n >>> \'{:%Y-%m-%d %H:%M:%S}\'.format(d)\n \'2010-07-04 12:15:58\'\n\nNesting arguments and more complex examples:\n\n >>> for align, text in zip(\'<^>\', [\'left\', \'center\', \'right\']):\n ... \'{0:{fill}{align}16}\'.format(text, fill=align, align=align)\n ...\n \'left<<<<<<<<<<<<\'\n \'^^^^^center^^^^^\'\n \'>>>>>>>>>>>right\'\n >>>\n >>> octets = [192, 168, 0, 1]\n >>> \'{:02X}{:02X}{:02X}{:02X}\'.format(*octets)\n \'C0A80001\'\n >>> int(_, 16)\n 3232235521\n >>>\n >>> width = 5\n >>> for num in range(5,12): #doctest: +NORMALIZE_WHITESPACE\n ... for base in \'dXob\':\n ... print(\'{0:{width}{base}}\'.format(num, base=base, width=width), end=\' \')\n ... print()\n ...\n 5 5 5 101\n 6 6 6 110\n 7 7 7 111\n 8 8 10 1000\n 9 9 11 1001\n 10 A 12 1010\n 11 B 13 1011\n', + 'formatstrings': '\nFormat String Syntax\n********************\n\nThe ``str.format()`` method and the ``Formatter`` class share the same\nsyntax for format strings (although in the case of ``Formatter``,\nsubclasses can define their own format string syntax).\n\nFormat strings contain "replacement fields" surrounded by curly braces\n``{}``. Anything that is not contained in braces is considered literal\ntext, which is copied unchanged to the output. If you need to include\na brace character in the literal text, it can be escaped by doubling:\n``{{`` and ``}}``.\n\nThe grammar for a replacement field is as follows:\n\n replacement_field ::= "{" [field_name] ["!" conversion] [":" format_spec] "}"\n field_name ::= arg_name ("." 
attribute_name | "[" element_index "]")*\n arg_name ::= [identifier | integer]\n attribute_name ::= identifier\n element_index ::= integer | index_string\n index_string ::= +\n conversion ::= "r" | "s" | "a"\n format_spec ::= \n\nIn less formal terms, the replacement field can start with a\n*field_name* that specifies the object whose value is to be formatted\nand inserted into the output instead of the replacement field. The\n*field_name* is optionally followed by a *conversion* field, which is\npreceded by an exclamation point ``\'!\'``, and a *format_spec*, which\nis preceded by a colon ``\':\'``. These specify a non-default format\nfor the replacement value.\n\nSee also the *Format Specification Mini-Language* section.\n\nThe *field_name* itself begins with an *arg_name* that is either a\nnumber or a keyword. If it\'s a number, it refers to a positional\nargument, and if it\'s a keyword, it refers to a named keyword\nargument. If the numerical arg_names in a format string are 0, 1, 2,\n... in sequence, they can all be omitted (not just some) and the\nnumbers 0, 1, 2, ... will be automatically inserted in that order.\nBecause *arg_name* is not quote-delimited, it is not possible to\nspecify arbitrary dictionary keys (e.g., the strings ``\'10\'`` or\n``\':-]\'``) within a format string. The *arg_name* can be followed by\nany number of index or attribute expressions. An expression of the\nform ``\'.name\'`` selects the named attribute using ``getattr()``,\nwhile an expression of the form ``\'[index]\'`` does an index lookup\nusing ``__getitem__()``.\n\nChanged in version 3.1: The positional argument specifiers can be\nomitted, so ``\'{} {}\'`` is equivalent to ``\'{0} {1}\'``.\n\nSome simple format string examples:\n\n "First, thou shalt count to {0}" # References first positional argument\n "Bring me a {}" # Implicitly references the first positional argument\n "From {} to {}" # Same as "From {0} to {1}"\n "My quest is {name}" # References keyword argument \'name\'\n "Weight in tons {0.weight}" # \'weight\' attribute of first positional arg\n "Units destroyed: {players[0]}" # First element of keyword argument \'players\'.\n\nThe *conversion* field causes a type coercion before formatting.\nNormally, the job of formatting a value is done by the\n``__format__()`` method of the value itself. However, in some cases\nit is desirable to force a type to be formatted as a string,\noverriding its own definition of formatting. By converting the value\nto a string before calling ``__format__()``, the normal formatting\nlogic is bypassed.\n\nThree conversion flags are currently supported: ``\'!s\'`` which calls\n``str()`` on the value, ``\'!r\'`` which calls ``repr()`` and ``\'!a\'``\nwhich calls ``ascii()``.\n\nSome examples:\n\n "Harold\'s a clever {0!s}" # Calls str() on the argument first\n "Bring out the holy {name!r}" # Calls repr() on the argument first\n "More {!a}" # Calls ascii() on the argument first\n\nThe *format_spec* field contains a specification of how the value\nshould be presented, including such details as field width, alignment,\npadding, decimal precision and so on. Each value type can define its\nown "formatting mini-language" or interpretation of the *format_spec*.\n\nMost built-in types support a common formatting mini-language, which\nis described in the next section.\n\nA *format_spec* field can also include nested replacement fields\nwithin it. These nested replacement fields can contain only a field\nname; conversion flags and format specifications are not allowed. 
The\nreplacement fields within the format_spec are substituted before the\n*format_spec* string is interpreted. This allows the formatting of a\nvalue to be dynamically specified.\n\nSee the *Format examples* section for some examples.\n\n\nFormat Specification Mini-Language\n==================================\n\n"Format specifications" are used within replacement fields contained\nwithin a format string to define how individual values are presented\n(see *Format String Syntax*). They can also be passed directly to the\nbuilt-in ``format()`` function. Each formattable type may define how\nthe format specification is to be interpreted.\n\nMost built-in types implement the following options for format\nspecifications, although some of the formatting options are only\nsupported by the numeric types.\n\nA general convention is that an empty format string (``""``) produces\nthe same result as if you had called ``str()`` on the value. A non-\nempty format string typically modifies the result.\n\nThe general form of a *standard format specifier* is:\n\n format_spec ::= [[fill]align][sign][#][0][width][,][.precision][type]\n fill ::= \n align ::= "<" | ">" | "=" | "^"\n sign ::= "+" | "-" | " "\n width ::= integer\n precision ::= integer\n type ::= "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%"\n\nIf a valid *align* value is specified, it can be preceded by a *fill*\ncharacter that can be any character and defaults to a space if\nomitted. Note that it is not possible to use ``{`` and ``}`` as *fill*\nchar while using the ``str.format()`` method; this limitation however\ndoesn\'t affect the ``format()`` function.\n\nThe meaning of the various alignment options is as follows:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'<\'`` | Forces the field to be left-aligned within the available |\n | | space (this is the default for most objects). |\n +-----------+------------------------------------------------------------+\n | ``\'>\'`` | Forces the field to be right-aligned within the available |\n | | space (this is the default for numbers). |\n +-----------+------------------------------------------------------------+\n | ``\'=\'`` | Forces the padding to be placed after the sign (if any) |\n | | but before the digits. This is used for printing fields |\n | | in the form \'+000000120\'. This alignment option is only |\n | | valid for numeric types. |\n +-----------+------------------------------------------------------------+\n | ``\'^\'`` | Forces the field to be centered within the available |\n | | space. |\n +-----------+------------------------------------------------------------+\n\nNote that unless a minimum field width is defined, the field width\nwill always be the same size as the data to fill it, so that the\nalignment option has no meaning in this case.\n\nThe *sign* option is only valid for number types, and can be one of\nthe following:\n\n +-----------+------------------------------------------------------------+\n | Option | Meaning |\n +===========+============================================================+\n | ``\'+\'`` | indicates that a sign should be used for both positive as |\n | | well as negative numbers. |\n +-----------+------------------------------------------------------------+\n | ``\'-\'`` | indicates that a sign should be used only for negative |\n | | numbers (this is the default behavior). 
|\n +-----------+------------------------------------------------------------+\n | space | indicates that a leading space should be used on positive |\n | | numbers, and a minus sign on negative numbers. |\n +-----------+------------------------------------------------------------+\n\nThe ``\'#\'`` option causes the "alternate form" to be used for the\nconversion. The alternate form is defined differently for different\ntypes. This option is only valid for integer, float, complex and\nDecimal types. For integers, when binary, octal, or hexadecimal output\nis used, this option adds the prefix respective ``\'0b\'``, ``\'0o\'``, or\n``\'0x\'`` to the output value. For floats, complex and Decimal the\nalternate form causes the result of the conversion to always contain a\ndecimal-point character, even if no digits follow it. Normally, a\ndecimal-point character appears in the result of these conversions\nonly if a digit follows it. In addition, for ``\'g\'`` and ``\'G\'``\nconversions, trailing zeros are not removed from the result.\n\nThe ``\',\'`` option signals the use of a comma for a thousands\nseparator. For a locale aware separator, use the ``\'n\'`` integer\npresentation type instead.\n\nChanged in version 3.1: Added the ``\',\'`` option (see also **PEP\n378**).\n\n*width* is a decimal integer defining the minimum field width. If not\nspecified, then the field width will be determined by the content.\n\nPreceding the *width* field by a zero (``\'0\'``) character enables\nsign-aware zero-padding for numeric types. This is equivalent to a\n*fill* character of ``\'0\'`` with an *alignment* type of ``\'=\'``.\n\nThe *precision* is a decimal number indicating how many digits should\nbe displayed after the decimal point for a floating point value\nformatted with ``\'f\'`` and ``\'F\'``, or before and after the decimal\npoint for a floating point value formatted with ``\'g\'`` or ``\'G\'``.\nFor non-number types the field indicates the maximum field size - in\nother words, how many characters will be used from the field content.\nThe *precision* is not allowed for integer values.\n\nFinally, the *type* determines how the data should be presented.\n\nThe available string presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'s\'`` | String format. This is the default type for strings and |\n | | may be omitted. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'s\'``. |\n +-----------+------------------------------------------------------------+\n\nThe available integer presentation types are:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'b\'`` | Binary format. Outputs the number in base 2. |\n +-----------+------------------------------------------------------------+\n | ``\'c\'`` | Character. Converts the integer to the corresponding |\n | | unicode character before printing. |\n +-----------+------------------------------------------------------------+\n | ``\'d\'`` | Decimal Integer. Outputs the number in base 10. |\n +-----------+------------------------------------------------------------+\n | ``\'o\'`` | Octal format. Outputs the number in base 8. 
|\n +-----------+------------------------------------------------------------+\n | ``\'x\'`` | Hex format. Outputs the number in base 16, using lower- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'X\'`` | Hex format. Outputs the number in base 16, using upper- |\n | | case letters for the digits above 9. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'d\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | None | The same as ``\'d\'``. |\n +-----------+------------------------------------------------------------+\n\nIn addition to the above presentation types, integers can be formatted\nwith the floating point presentation types listed below (except\n``\'n\'`` and None). When doing so, ``float()`` is used to convert the\ninteger to a floating point number before formatting.\n\nThe available presentation types for floating point and decimal values\nare:\n\n +-----------+------------------------------------------------------------+\n | Type | Meaning |\n +===========+============================================================+\n | ``\'e\'`` | Exponent notation. Prints the number in scientific |\n | | notation using the letter \'e\' to indicate the exponent. |\n | | The default precision is ``6``. |\n +-----------+------------------------------------------------------------+\n | ``\'E\'`` | Exponent notation. Same as ``\'e\'`` except it uses an upper |\n | | case \'E\' as the separator character. |\n +-----------+------------------------------------------------------------+\n | ``\'f\'`` | Fixed point. Displays the number as a fixed-point number. |\n | | The default precision is ``6``. |\n +-----------+------------------------------------------------------------+\n | ``\'F\'`` | Fixed point. Same as ``\'f\'``, but converts ``nan`` to |\n | | ``NAN`` and ``inf`` to ``INF``. |\n +-----------+------------------------------------------------------------+\n | ``\'g\'`` | General format. For a given precision ``p >= 1``, this |\n | | rounds the number to ``p`` significant digits and then |\n | | formats the result in either fixed-point format or in |\n | | scientific notation, depending on its magnitude. The |\n | | precise rules are as follows: suppose that the result |\n | | formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1`` would have exponent ``exp``. Then if ``-4 <= exp |\n | | < p``, the number is formatted with presentation type |\n | | ``\'f\'`` and precision ``p-1-exp``. Otherwise, the number |\n | | is formatted with presentation type ``\'e\'`` and precision |\n | | ``p-1``. In both cases insignificant trailing zeros are |\n | | removed from the significand, and the decimal point is |\n | | also removed if there are no remaining digits following |\n | | it. Positive and negative infinity, positive and negative |\n | | zero, and nans, are formatted as ``inf``, ``-inf``, ``0``, |\n | | ``-0`` and ``nan`` respectively, regardless of the |\n | | precision. A precision of ``0`` is treated as equivalent |\n | | to a precision of ``1``. The default precision is ``6``. |\n +-----------+------------------------------------------------------------+\n | ``\'G\'`` | General format. Same as ``\'g\'`` except switches to ``\'E\'`` |\n | | if the number gets too large. 
The representations of |\n | | infinity and NaN are uppercased, too. |\n +-----------+------------------------------------------------------------+\n | ``\'n\'`` | Number. This is the same as ``\'g\'``, except that it uses |\n | | the current locale setting to insert the appropriate |\n | | number separator characters. |\n +-----------+------------------------------------------------------------+\n | ``\'%\'`` | Percentage. Multiplies the number by 100 and displays in |\n | | fixed (``\'f\'``) format, followed by a percent sign. |\n +-----------+------------------------------------------------------------+\n | None | Similar to ``\'g\'``, except with at least one digit past |\n | | the decimal point and a default precision of 12. This is |\n | | intended to match ``str()``, except you can add the other |\n | | format modifiers. |\n +-----------+------------------------------------------------------------+\n\n\nFormat examples\n===============\n\nThis section contains examples of the new format syntax and comparison\nwith the old ``%``-formatting.\n\nIn most of the cases the syntax is similar to the old\n``%``-formatting, with the addition of the ``{}`` and with ``:`` used\ninstead of ``%``. For example, ``\'%03.2f\'`` can be translated to\n``\'{:03.2f}\'``.\n\nThe new format syntax also supports new and different options, shown\nin the follow examples.\n\nAccessing arguments by position:\n\n >>> \'{0}, {1}, {2}\'.format(\'a\', \'b\', \'c\')\n \'a, b, c\'\n >>> \'{}, {}, {}\'.format(\'a\', \'b\', \'c\') # 3.1+ only\n \'a, b, c\'\n >>> \'{2}, {1}, {0}\'.format(\'a\', \'b\', \'c\')\n \'c, b, a\'\n >>> \'{2}, {1}, {0}\'.format(*\'abc\') # unpacking argument sequence\n \'c, b, a\'\n >>> \'{0}{1}{0}\'.format(\'abra\', \'cad\') # arguments\' indices can be repeated\n \'abracadabra\'\n\nAccessing arguments by name:\n\n >>> \'Coordinates: {latitude}, {longitude}\'.format(latitude=\'37.24N\', longitude=\'-115.81W\')\n \'Coordinates: 37.24N, -115.81W\'\n >>> coord = {\'latitude\': \'37.24N\', \'longitude\': \'-115.81W\'}\n >>> \'Coordinates: {latitude}, {longitude}\'.format(**coord)\n \'Coordinates: 37.24N, -115.81W\'\n\nAccessing arguments\' attributes:\n\n >>> c = 3-5j\n >>> (\'The complex number {0} is formed from the real part {0.real} \'\n ... \'and the imaginary part {0.imag}.\').format(c)\n \'The complex number (3-5j) is formed from the real part 3.0 and the imaginary part -5.0.\'\n >>> class Point:\n ... def __init__(self, x, y):\n ... self.x, self.y = x, y\n ... def __str__(self):\n ... 
return \'Point({self.x}, {self.y})\'.format(self=self)\n ...\n >>> str(Point(4, 2))\n \'Point(4, 2)\'\n\nAccessing arguments\' items:\n\n >>> coord = (3, 5)\n >>> \'X: {0[0]}; Y: {0[1]}\'.format(coord)\n \'X: 3; Y: 5\'\n\nReplacing ``%s`` and ``%r``:\n\n >>> "repr() shows quotes: {!r}; str() doesn\'t: {!s}".format(\'test1\', \'test2\')\n "repr() shows quotes: \'test1\'; str() doesn\'t: test2"\n\nAligning the text and specifying a width:\n\n >>> \'{:<30}\'.format(\'left aligned\')\n \'left aligned \'\n >>> \'{:>30}\'.format(\'right aligned\')\n \' right aligned\'\n >>> \'{:^30}\'.format(\'centered\')\n \' centered \'\n >>> \'{:*^30}\'.format(\'centered\') # use \'*\' as a fill char\n \'***********centered***********\'\n\nReplacing ``%+f``, ``%-f``, and ``% f`` and specifying a sign:\n\n >>> \'{:+f}; {:+f}\'.format(3.14, -3.14) # show it always\n \'+3.140000; -3.140000\'\n >>> \'{: f}; {: f}\'.format(3.14, -3.14) # show a space for positive numbers\n \' 3.140000; -3.140000\'\n >>> \'{:-f}; {:-f}\'.format(3.14, -3.14) # show only the minus -- same as \'{:f}; {:f}\'\n \'3.140000; -3.140000\'\n\nReplacing ``%x`` and ``%o`` and converting the value to different\nbases:\n\n >>> # format also supports binary numbers\n >>> "int: {0:d}; hex: {0:x}; oct: {0:o}; bin: {0:b}".format(42)\n \'int: 42; hex: 2a; oct: 52; bin: 101010\'\n >>> # with 0x, 0o, or 0b as prefix:\n >>> "int: {0:d}; hex: {0:#x}; oct: {0:#o}; bin: {0:#b}".format(42)\n \'int: 42; hex: 0x2a; oct: 0o52; bin: 0b101010\'\n\nUsing the comma as a thousands separator:\n\n >>> \'{:,}\'.format(1234567890)\n \'1,234,567,890\'\n\nExpressing a percentage:\n\n >>> points = 19\n >>> total = 22\n >>> \'Correct answers: {:.2%}\'.format(points/total)\n \'Correct answers: 86.36%\'\n\nUsing type-specific formatting:\n\n >>> import datetime\n >>> d = datetime.datetime(2010, 7, 4, 12, 15, 58)\n >>> \'{:%Y-%m-%d %H:%M:%S}\'.format(d)\n \'2010-07-04 12:15:58\'\n\nNesting arguments and more complex examples:\n\n >>> for align, text in zip(\'<^>\', [\'left\', \'center\', \'right\']):\n ... \'{0:{fill}{align}16}\'.format(text, fill=align, align=align)\n ...\n \'left<<<<<<<<<<<<\'\n \'^^^^^center^^^^^\'\n \'>>>>>>>>>>>right\'\n >>>\n >>> octets = [192, 168, 0, 1]\n >>> \'{:02X}{:02X}{:02X}{:02X}\'.format(*octets)\n \'C0A80001\'\n >>> int(_, 16)\n 3232235521\n >>>\n >>> width = 5\n >>> for num in range(5,12): #doctest: +NORMALIZE_WHITESPACE\n ... for base in \'dXob\':\n ... print(\'{0:{width}{base}}\'.format(num, base=base, width=width), end=\' \')\n ... print()\n ...\n 5 5 5 101\n 6 6 6 110\n 7 7 7 111\n 8 8 10 1000\n 9 9 11 1001\n 10 A 12 1010\n 11 B 13 1011\n', 'function': '\nFunction definitions\n********************\n\nA function definition defines a user-defined function object (see\nsection *The standard type hierarchy*):\n\n funcdef ::= [decorators] "def" funcname "(" [parameter_list] ")" ["->" expression] ":" suite\n decorators ::= decorator+\n decorator ::= "@" dotted_name ["(" [parameter_list [","]] ")"] NEWLINE\n dotted_name ::= identifier ("." identifier)*\n parameter_list ::= (defparameter ",")*\n ( "*" [parameter] ("," defparameter)* ["," "**" parameter]\n | "**" parameter\n | defparameter [","] )\n parameter ::= identifier [":" expression]\n defparameter ::= parameter ["=" expression]\n funcname ::= identifier\n\nA function definition is an executable statement. Its execution binds\nthe function name in the current local namespace to a function object\n(a wrapper around the executable code for the function). 
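Because a ``def`` statement is executed like any ordinary statement, the binding can even be made conditionally. A rough sketch (the flag name is invented for the example):

   use_short_form = True               # hypothetical flag, not part of the topic text

   if use_short_form:
       def greet(name):
           return 'hi ' + name
   else:
       def greet(name):
           return 'Hello, {}!'.format(name)

   print(greet('world'))               # whichever body was bound is the one that runs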
This\nfunction object contains a reference to the current global namespace\nas the global namespace to be used when the function is called.\n\nThe function definition does not execute the function body; this gets\nexecuted only when the function is called. [3]\n\nA function definition may be wrapped by one or more *decorator*\nexpressions. Decorator expressions are evaluated when the function is\ndefined, in the scope that contains the function definition. The\nresult must be a callable, which is invoked with the function object\nas the only argument. The returned value is bound to the function name\ninstead of the function object. Multiple decorators are applied in\nnested fashion. For example, the following code\n\n @f1(arg)\n @f2\n def func(): pass\n\nis equivalent to\n\n def func(): pass\n func = f1(arg)(f2(func))\n\nWhen one or more *parameters* have the form *parameter* ``=``\n*expression*, the function is said to have "default parameter values."\nFor a parameter with a default value, the corresponding *argument* may\nbe omitted from a call, in which case the parameter\'s default value is\nsubstituted. If a parameter has a default value, all following\nparameters up until the "``*``" must also have a default value ---\nthis is a syntactic restriction that is not expressed by the grammar.\n\n**Default parameter values are evaluated from left to right when the\nfunction definition is executed.** This means that the expression is\nevaluated once, when the function is defined, and that the same "pre-\ncomputed" value is used for each call. This is especially important\nto understand when a default parameter is a mutable object, such as a\nlist or a dictionary: if the function modifies the object (e.g. by\nappending an item to a list), the default value is in effect modified.\nThis is generally not what was intended. A way around this is to use\n``None`` as the default, and explicitly test for it in the body of the\nfunction, e.g.:\n\n def whats_on_the_telly(penguin=None):\n if penguin is None:\n penguin = []\n penguin.append("property of the zoo")\n return penguin\n\nFunction call semantics are described in more detail in section\n*Calls*. A function call always assigns values to all parameters\nmentioned in the parameter list, either from position arguments, from\nkeyword arguments, or from default values. If the form\n"``*identifier``" is present, it is initialized to a tuple receiving\nany excess positional parameters, defaulting to the empty tuple. If\nthe form "``**identifier``" is present, it is initialized to a new\ndictionary receiving any excess keyword arguments, defaulting to a new\nempty dictionary. Parameters after "``*``" or "``*identifier``" are\nkeyword-only parameters and may only be passed used keyword arguments.\n\nParameters may have annotations of the form "``: expression``"\nfollowing the parameter name. Any parameter may have an annotation\neven those of the form ``*identifier`` or ``**identifier``. Functions\nmay have "return" annotation of the form "``-> expression``" after the\nparameter list. These annotations can be any valid Python expression\nand are evaluated when the function definition is executed.\nAnnotations may be evaluated in a different order than they appear in\nthe source code. The presence of annotations does not change the\nsemantics of a function. 
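A small sketch that pulls several of these pieces together: keyword-only parameters, annotations, and a ``None`` default standing in for a mutable value (all names are invented for the example):

   def scale(values: list, *, factor: float = 2.0, out: list = None) -> list:
       # 'out' defaults to None rather than [] so the list is not shared between calls
       if out is None:
           out = []
       for v in values:
           out.append(v * factor)
       return out

   print(scale([1, 2, 3], factor=10))   # [10, 20, 30]; factor must be passed by keyword
   print(scale.__annotations__)         # mapping of parameter names to the expressions above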
The annotation values are available as\nvalues of a dictionary keyed by the parameters\' names in the\n``__annotations__`` attribute of the function object.\n\nIt is also possible to create anonymous functions (functions not bound\nto a name), for immediate use in expressions. This uses lambda\nexpressions, described in section *Lambdas*. Note that the lambda\nexpression is merely a shorthand for a simplified function definition;\na function defined in a "``def``" statement can be passed around or\nassigned to another name just like a function defined by a lambda\nexpression. The "``def``" form is actually more powerful since it\nallows the execution of multiple statements and annotations.\n\n**Programmer\'s note:** Functions are first-class objects. A "``def``"\nstatement executed inside a function definition defines a local\nfunction that can be returned or passed around. Free variables used\nin the nested function can access the local variables of the function\ncontaining the def. See section *Naming and binding* for details.\n\nSee also:\n\n **PEP 3107** - Function Annotations\n The original specification for function annotations.\n', 'global': '\nThe ``global`` statement\n************************\n\n global_stmt ::= "global" identifier ("," identifier)*\n\nThe ``global`` statement is a declaration which holds for the entire\ncurrent code block. It means that the listed identifiers are to be\ninterpreted as globals. It would be impossible to assign to a global\nvariable without ``global``, although free variables may refer to\nglobals without being declared global.\n\nNames listed in a ``global`` statement must not be used in the same\ncode block textually preceding that ``global`` statement.\n\nNames listed in a ``global`` statement must not be defined as formal\nparameters or in a ``for`` loop control target, ``class`` definition,\nfunction definition, or ``import`` statement.\n\n**CPython implementation detail:** The current implementation does not\nenforce the latter two restrictions, but programs should not abuse\nthis freedom, as future implementations may enforce them or silently\nchange the meaning of the program.\n\n**Programmer\'s note:** the ``global`` is a directive to the parser.\nIt applies only to code parsed at the same time as the ``global``\nstatement. In particular, a ``global`` statement contained in a string\nor code object supplied to the built-in ``exec()`` function does not\naffect the code block *containing* the function call, and code\ncontained in such a string is unaffected by ``global`` statements in\nthe code containing the function call. The same applies to the\n``eval()`` and ``compile()`` functions.\n', 'id-classes': '\nReserved classes of identifiers\n*******************************\n\nCertain classes of identifiers (besides keywords) have special\nmeanings. These classes are identified by the patterns of leading and\ntrailing underscore characters:\n\n``_*``\n Not imported by ``from module import *``. The special identifier\n ``_`` is used in the interactive interpreter to store the result of\n the last evaluation; it is stored in the ``builtins`` module. When\n not in interactive mode, ``_`` has no special meaning and is not\n defined. See section *The import statement*.\n\n Note: The name ``_`` is often used in conjunction with\n internationalization; refer to the documentation for the\n ``gettext`` module for more information on this convention.\n\n``__*__``\n System-defined names. 
These names are defined by the interpreter\n and its implementation (including the standard library). Current\n system names are discussed in the *Special method names* section\n and elsewhere. More will likely be defined in future versions of\n Python. *Any* use of ``__*__`` names, in any context, that does\n not follow explicitly documented use, is subject to breakage\n without warning.\n\n``__*``\n Class-private names. Names in this category, when used within the\n context of a class definition, are re-written to use a mangled form\n to help avoid name clashes between "private" attributes of base and\n derived classes. See section *Identifiers (Names)*.\n', @@ -61,7 +61,7 @@ 'slicings': '\nSlicings\n********\n\nA slicing selects a range of items in a sequence object (e.g., a\nstring, tuple or list). Slicings may be used as expressions or as\ntargets in assignment or ``del`` statements. The syntax for a\nslicing:\n\n slicing ::= primary "[" slice_list "]"\n slice_list ::= slice_item ("," slice_item)* [","]\n slice_item ::= expression | proper_slice\n proper_slice ::= [lower_bound] ":" [upper_bound] [ ":" [stride] ]\n lower_bound ::= expression\n upper_bound ::= expression\n stride ::= expression\n\nThere is ambiguity in the formal syntax here: anything that looks like\nan expression list also looks like a slice list, so any subscription\ncan be interpreted as a slicing. Rather than further complicating the\nsyntax, this is disambiguated by defining that in this case the\ninterpretation as a subscription takes priority over the\ninterpretation as a slicing (this is the case if the slice list\ncontains no proper slice).\n\nThe semantics for a slicing are as follows. The primary must evaluate\nto a mapping object, and it is indexed (using the same\n``__getitem__()`` method as normal subscription) with a key that is\nconstructed from the slice list, as follows. If the slice list\ncontains at least one comma, the key is a tuple containing the\nconversion of the slice items; otherwise, the conversion of the lone\nslice item is the key. The conversion of a slice item that is an\nexpression is that expression. The conversion of a proper slice is a\nslice object (see section *The standard type hierarchy*) whose\n``start``, ``stop`` and ``step`` attributes are the values of the\nexpressions given as lower bound, upper bound and stride,\nrespectively, substituting ``None`` for missing expressions.\n', 'specialattrs': '\nSpecial Attributes\n******************\n\nThe implementation adds a few special read-only attributes to several\nobject types, where they are relevant. Some of these are not reported\nby the ``dir()`` built-in function.\n\nobject.__dict__\n\n A dictionary or other mapping object used to store an object\'s\n (writable) attributes.\n\ninstance.__class__\n\n The class to which a class instance belongs.\n\nclass.__bases__\n\n The tuple of base classes of a class object.\n\nclass.__name__\n\n The name of the class or type.\n\nclass.__qualname__\n\n The *qualified name* of the class or type.\n\n New in version 3.3.\n\nclass.__mro__\n\n This attribute is a tuple of classes that are considered when\n looking for base classes during method resolution.\n\nclass.mro()\n\n This method can be overridden by a metaclass to customize the\n method resolution order for its instances. It is called at class\n instantiation, and its result is stored in ``__mro__``.\n\nclass.__subclasses__()\n\n Each class keeps a list of weak references to its immediate\n subclasses. 
This method returns a list of all those references\n still alive. Example:\n\n >>> int.__subclasses__()\n []\n\n-[ Footnotes ]-\n\n[1] Additional information on these special methods may be found in\n the Python Reference Manual (*Basic customization*).\n\n[2] As a consequence, the list ``[1, 2]`` is considered equal to\n ``[1.0, 2.0]``, and similarly for tuples.\n\n[3] They must have since the parser can\'t tell the type of the\n operands.\n\n[4] Cased characters are those with general category property being\n one of "Lu" (Letter, uppercase), "Ll" (Letter, lowercase), or "Lt"\n (Letter, titlecase).\n\n[5] To format only a tuple you should therefore provide a singleton\n tuple whose only element is the tuple to be formatted.\n', 'specialnames': '\nSpecial method names\n********************\n\nA class can implement certain operations that are invoked by special\nsyntax (such as arithmetic operations or subscripting and slicing) by\ndefining methods with special names. This is Python\'s approach to\n*operator overloading*, allowing classes to define their own behavior\nwith respect to language operators. For instance, if a class defines\na method named ``__getitem__()``, and ``x`` is an instance of this\nclass, then ``x[i]`` is roughly equivalent to ``type(x).__getitem__(x,\ni)``. Except where mentioned, attempts to execute an operation raise\nan exception when no appropriate method is defined (typically\n``AttributeError`` or ``TypeError``).\n\nWhen implementing a class that emulates any built-in type, it is\nimportant that the emulation only be implemented to the degree that it\nmakes sense for the object being modelled. For example, some\nsequences may work well with retrieval of individual elements, but\nextracting a slice may not make sense. (One example of this is the\n``NodeList`` interface in the W3C\'s Document Object Model.)\n\n\nBasic customization\n===================\n\nobject.__new__(cls[, ...])\n\n Called to create a new instance of class *cls*. ``__new__()`` is a\n static method (special-cased so you need not declare it as such)\n that takes the class of which an instance was requested as its\n first argument. The remaining arguments are those passed to the\n object constructor expression (the call to the class). The return\n value of ``__new__()`` should be the new object instance (usually\n an instance of *cls*).\n\n Typical implementations create a new instance of the class by\n invoking the superclass\'s ``__new__()`` method using\n ``super(currentclass, cls).__new__(cls[, ...])`` with appropriate\n arguments and then modifying the newly-created instance as\n necessary before returning it.\n\n If ``__new__()`` returns an instance of *cls*, then the new\n instance\'s ``__init__()`` method will be invoked like\n ``__init__(self[, ...])``, where *self* is the new instance and the\n remaining arguments are the same as were passed to ``__new__()``.\n\n If ``__new__()`` does not return an instance of *cls*, then the new\n instance\'s ``__init__()`` method will not be invoked.\n\n ``__new__()`` is intended mainly to allow subclasses of immutable\n types (like int, str, or tuple) to customize instance creation. It\n is also commonly overridden in custom metaclasses in order to\n customize class creation.\n\nobject.__init__(self[, ...])\n\n Called when the instance is created. The arguments are those\n passed to the class constructor expression. 
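A compact sketch of the two hooks cooperating on an immutable subclass (the class and attribute names are illustrative only):

   class Celsius(float):
       def __new__(cls, degrees):
           # __new__ creates the immutable float value itself
           return super().__new__(cls, degrees)
       def __init__(self, degrees):
           # __init__ then attaches extra state to the already created instance
           self.unit = 'C'

   t = Celsius(21.5)
   print(t + 0.5, t.unit)               # 22.0 C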
If a base class has an\n ``__init__()`` method, the derived class\'s ``__init__()`` method,\n if any, must explicitly call it to ensure proper initialization of\n the base class part of the instance; for example:\n ``BaseClass.__init__(self, [args...])``. As a special constraint\n on constructors, no value may be returned; doing so will cause a\n ``TypeError`` to be raised at runtime.\n\nobject.__del__(self)\n\n Called when the instance is about to be destroyed. This is also\n called a destructor. If a base class has a ``__del__()`` method,\n the derived class\'s ``__del__()`` method, if any, must explicitly\n call it to ensure proper deletion of the base class part of the\n instance. Note that it is possible (though not recommended!) for\n the ``__del__()`` method to postpone destruction of the instance by\n creating a new reference to it. It may then be called at a later\n time when this new reference is deleted. It is not guaranteed that\n ``__del__()`` methods are called for objects that still exist when\n the interpreter exits.\n\n Note: ``del x`` doesn\'t directly call ``x.__del__()`` --- the former\n decrements the reference count for ``x`` by one, and the latter\n is only called when ``x``\'s reference count reaches zero. Some\n common situations that may prevent the reference count of an\n object from going to zero include: circular references between\n objects (e.g., a doubly-linked list or a tree data structure with\n parent and child pointers); a reference to the object on the\n stack frame of a function that caught an exception (the traceback\n stored in ``sys.exc_info()[2]`` keeps the stack frame alive); or\n a reference to the object on the stack frame that raised an\n unhandled exception in interactive mode (the traceback stored in\n ``sys.last_traceback`` keeps the stack frame alive). The first\n situation can only be remedied by explicitly breaking the cycles;\n the latter two situations can be resolved by storing ``None`` in\n ``sys.last_traceback``. Circular references which are garbage are\n detected and cleaned up when the cyclic garbage collector is\n enabled (it\'s on by default). Refer to the documentation for the\n ``gc`` module for more information about this topic.\n\n Warning: Due to the precarious circumstances under which ``__del__()``\n methods are invoked, exceptions that occur during their execution\n are ignored, and a warning is printed to ``sys.stderr`` instead.\n Also, when ``__del__()`` is invoked in response to a module being\n deleted (e.g., when execution of the program is done), other\n globals referenced by the ``__del__()`` method may already have\n been deleted or in the process of being torn down (e.g. the\n import machinery shutting down). For this reason, ``__del__()``\n methods should do the absolute minimum needed to maintain\n external invariants. Starting with version 1.5, Python\n guarantees that globals whose name begins with a single\n underscore are deleted from their module before other globals are\n deleted; if no other references to such globals exist, this may\n help in assuring that imported modules are still available at the\n time when the ``__del__()`` method is called.\n\nobject.__repr__(self)\n\n Called by the ``repr()`` built-in function to compute the\n "official" string representation of an object. If at all possible,\n this should look like a valid Python expression that could be used\n to recreate an object with the same value (given an appropriate\n environment). 
If this is not possible, a string of the form\n ``<...some useful description...>`` should be returned. The return\n value must be a string object. If a class defines ``__repr__()``\n but not ``__str__()``, then ``__repr__()`` is also used when an\n "informal" string representation of instances of that class is\n required.\n\n This is typically used for debugging, so it is important that the\n representation is information-rich and unambiguous.\n\nobject.__str__(self)\n\n Called by ``str(object)`` and the built-in functions ``format()``\n and ``print()`` to compute the "informal" or nicely printable\n string representation of an object. The return value must be a\n *string* object.\n\n This method differs from ``object.__repr__()`` in that there is no\n expectation that ``__str__()`` return a valid Python expression: a\n more convenient or concise representation can be used.\n\n The default implementation defined by the built-in type ``object``\n calls ``object.__repr__()``.\n\nobject.__bytes__(self)\n\n Called by ``bytes()`` to compute a byte-string representation of an\n object. This should return a ``bytes`` object.\n\nobject.__format__(self, format_spec)\n\n Called by the ``format()`` built-in function (and by extension, the\n ``str.format()`` method of class ``str``) to produce a "formatted"\n string representation of an object. The ``format_spec`` argument is\n a string that contains a description of the formatting options\n desired. The interpretation of the ``format_spec`` argument is up\n to the type implementing ``__format__()``, however most classes\n will either delegate formatting to one of the built-in types, or\n use a similar formatting option syntax.\n\n See *Format Specification Mini-Language* for a description of the\n standard formatting syntax.\n\n The return value must be a string object.\n\nobject.__lt__(self, other)\nobject.__le__(self, other)\nobject.__eq__(self, other)\nobject.__ne__(self, other)\nobject.__gt__(self, other)\nobject.__ge__(self, other)\n\n These are the so-called "rich comparison" methods. The\n correspondence between operator symbols and method names is as\n follows: ``xy`` calls ``x.__gt__(y)``, and ``x>=y`` calls\n ``x.__ge__(y)``.\n\n A rich comparison method may return the singleton\n ``NotImplemented`` if it does not implement the operation for a\n given pair of arguments. By convention, ``False`` and ``True`` are\n returned for a successful comparison. However, these methods can\n return any value, so if the comparison operator is used in a\n Boolean context (e.g., in the condition of an ``if`` statement),\n Python will call ``bool()`` on the value to determine if the result\n is true or false.\n\n There are no implied relationships among the comparison operators.\n The truth of ``x==y`` does not imply that ``x!=y`` is false.\n Accordingly, when defining ``__eq__()``, one should also define\n ``__ne__()`` so that the operators will behave as expected. 
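One hedged sketch of the usual pattern: define ``__eq__()`` and a single ordering method, return ``NotImplemented`` for operands you do not handle, and let ``functools.total_ordering()`` (mentioned below) derive the rest. The class is invented for the example:

   import functools

   @functools.total_ordering
   class Version:
       def __init__(self, major, minor):
           self.major, self.minor = major, minor
       def __eq__(self, other):
           if not isinstance(other, Version):
               return NotImplemented     # let the other operand try instead
           return (self.major, self.minor) == (other.major, other.minor)
       def __lt__(self, other):
           if not isinstance(other, Version):
               return NotImplemented
           return (self.major, self.minor) < (other.major, other.minor)

   print(Version(3, 4) > Version(3, 3)) # True; total_ordering supplies __gt__()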
See\n the paragraph on ``__hash__()`` for some important notes on\n creating *hashable* objects which support custom comparison\n operations and are usable as dictionary keys.\n\n There are no swapped-argument versions of these methods (to be used\n when the left argument does not support the operation but the right\n argument does); rather, ``__lt__()`` and ``__gt__()`` are each\n other\'s reflection, ``__le__()`` and ``__ge__()`` are each other\'s\n reflection, and ``__eq__()`` and ``__ne__()`` are their own\n reflection.\n\n Arguments to rich comparison methods are never coerced.\n\n To automatically generate ordering operations from a single root\n operation, see ``functools.total_ordering()``.\n\nobject.__hash__(self)\n\n Called by built-in function ``hash()`` and for operations on\n members of hashed collections including ``set``, ``frozenset``, and\n ``dict``. ``__hash__()`` should return an integer. The only\n required property is that objects which compare equal have the same\n hash value; it is advised to somehow mix together (e.g. using\n exclusive or) the hash values for the components of the object that\n also play a part in comparison of objects.\n\n Note: ``hash()`` truncates the value returned from an object\'s custom\n ``__hash__()`` method to the size of a ``Py_ssize_t``. This is\n typically 8 bytes on 64-bit builds and 4 bytes on 32-bit builds.\n If an object\'s ``__hash__()`` must interoperate on builds of\n different bit sizes, be sure to check the width on all supported\n builds. An easy way to do this is with ``python -c "import sys;\n print(sys.hash_info.width)"``\n\n If a class does not define an ``__eq__()`` method it should not\n define a ``__hash__()`` operation either; if it defines\n ``__eq__()`` but not ``__hash__()``, its instances will not be\n usable as items in hashable collections. If a class defines\n mutable objects and implements an ``__eq__()`` method, it should\n not implement ``__hash__()``, since the implementation of hashable\n collections requires that a key\'s hash value is immutable (if the\n object\'s hash value changes, it will be in the wrong hash bucket).\n\n User-defined classes have ``__eq__()`` and ``__hash__()`` methods\n by default; with them, all objects compare unequal (except with\n themselves) and ``x.__hash__()`` returns an appropriate value such\n that ``x == y`` implies both that ``x is y`` and ``hash(x) ==\n hash(y)``.\n\n A class that overrides ``__eq__()`` and does not define\n ``__hash__()`` will have its ``__hash__()`` implicitly set to\n ``None``. When the ``__hash__()`` method of a class is ``None``,\n instances of the class will raise an appropriate ``TypeError`` when\n a program attempts to retrieve their hash value, and will also be\n correctly identified as unhashable when checking ``isinstance(obj,\n collections.Hashable``).\n\n If a class that overrides ``__eq__()`` needs to retain the\n implementation of ``__hash__()`` from a parent class, the\n interpreter must be told this explicitly by setting ``__hash__ =\n .__hash__``.\n\n If a class that does not override ``__eq__()`` wishes to suppress\n hash support, it should include ``__hash__ = None`` in the class\n definition. 
A class which defines its own ``__hash__()`` that\n explicitly raises a ``TypeError`` would be incorrectly identified\n as hashable by an ``isinstance(obj, collections.Hashable)`` call.\n\n Note: By default, the ``__hash__()`` values of str, bytes and datetime\n objects are "salted" with an unpredictable random value.\n Although they remain constant within an individual Python\n process, they are not predictable between repeated invocations of\n Python.This is intended to provide protection against a denial-\n of-service caused by carefully-chosen inputs that exploit the\n worst case performance of a dict insertion, O(n^2) complexity.\n See http://www.ocert.org/advisories/ocert-2011-003.html for\n details.Changing hash values affects the iteration order of\n dicts, sets and other mappings. Python has never made guarantees\n about this ordering (and it typically varies between 32-bit and\n 64-bit builds).See also ``PYTHONHASHSEED``.\n\n Changed in version 3.3: Hash randomization is enabled by default.\n\nobject.__bool__(self)\n\n Called to implement truth value testing and the built-in operation\n ``bool()``; should return ``False`` or ``True``. When this method\n is not defined, ``__len__()`` is called, if it is defined, and the\n object is considered true if its result is nonzero. If a class\n defines neither ``__len__()`` nor ``__bool__()``, all its instances\n are considered true.\n\n\nCustomizing attribute access\n============================\n\nThe following methods can be defined to customize the meaning of\nattribute access (use of, assignment to, or deletion of ``x.name``)\nfor class instances.\n\nobject.__getattr__(self, name)\n\n Called when an attribute lookup has not found the attribute in the\n usual places (i.e. it is not an instance attribute nor is it found\n in the class tree for ``self``). ``name`` is the attribute name.\n This method should return the (computed) attribute value or raise\n an ``AttributeError`` exception.\n\n Note that if the attribute is found through the normal mechanism,\n ``__getattr__()`` is not called. (This is an intentional asymmetry\n between ``__getattr__()`` and ``__setattr__()``.) This is done both\n for efficiency reasons and because otherwise ``__getattr__()``\n would have no way to access other attributes of the instance. Note\n that at least for instance variables, you can fake total control by\n not inserting any values in the instance attribute dictionary (but\n instead inserting them in another object). See the\n ``__getattribute__()`` method below for a way to actually get total\n control over attribute access.\n\nobject.__getattribute__(self, name)\n\n Called unconditionally to implement attribute accesses for\n instances of the class. If the class also defines\n ``__getattr__()``, the latter will not be called unless\n ``__getattribute__()`` either calls it explicitly or raises an\n ``AttributeError``. This method should return the (computed)\n attribute value or raise an ``AttributeError`` exception. In order\n to avoid infinite recursion in this method, its implementation\n should always call the base class method with the same name to\n access any attributes it needs, for example,\n ``object.__getattribute__(self, name)``.\n\n Note: This method may still be bypassed when looking up special methods\n as the result of implicit invocation via language syntax or\n built-in functions. See *Special method lookup*.\n\nobject.__setattr__(self, name, value)\n\n Called when an attribute assignment is attempted. 
This is called\n instead of the normal mechanism (i.e. store the value in the\n instance dictionary). *name* is the attribute name, *value* is the\n value to be assigned to it.\n\n If ``__setattr__()`` wants to assign to an instance attribute, it\n should call the base class method with the same name, for example,\n ``object.__setattr__(self, name, value)``.\n\nobject.__delattr__(self, name)\n\n Like ``__setattr__()`` but for attribute deletion instead of\n assignment. This should only be implemented if ``del obj.name`` is\n meaningful for the object.\n\nobject.__dir__(self)\n\n Called when ``dir()`` is called on the object. A sequence must be\n returned. ``dir()`` converts the returned sequence to a list and\n sorts it.\n\n\nImplementing Descriptors\n------------------------\n\nThe following methods only apply when an instance of the class\ncontaining the method (a so-called *descriptor* class) appears in an\n*owner* class (the descriptor must be in either the owner\'s class\ndictionary or in the class dictionary for one of its parents). In the\nexamples below, "the attribute" refers to the attribute whose name is\nthe key of the property in the owner class\' ``__dict__``.\n\nobject.__get__(self, instance, owner)\n\n Called to get the attribute of the owner class (class attribute\n access) or of an instance of that class (instance attribute\n access). *owner* is always the owner class, while *instance* is the\n instance that the attribute was accessed through, or ``None`` when\n the attribute is accessed through the *owner*. This method should\n return the (computed) attribute value or raise an\n ``AttributeError`` exception.\n\nobject.__set__(self, instance, value)\n\n Called to set the attribute on an instance *instance* of the owner\n class to a new value, *value*.\n\nobject.__delete__(self, instance)\n\n Called to delete the attribute on an instance *instance* of the\n owner class.\n\n\nInvoking Descriptors\n--------------------\n\nIn general, a descriptor is an object attribute with "binding\nbehavior", one whose attribute access has been overridden by methods\nin the descriptor protocol: ``__get__()``, ``__set__()``, and\n``__delete__()``. If any of those methods are defined for an object,\nit is said to be a descriptor.\n\nThe default behavior for attribute access is to get, set, or delete\nthe attribute from an object\'s dictionary. For instance, ``a.x`` has a\nlookup chain starting with ``a.__dict__[\'x\']``, then\n``type(a).__dict__[\'x\']``, and continuing through the base classes of\n``type(a)`` excluding metaclasses.\n\nHowever, if the looked-up value is an object defining one of the\ndescriptor methods, then Python may override the default behavior and\ninvoke the descriptor method instead. 
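A small sketch of a data descriptor, i.e. one defining both ``__get__()`` and ``__set__()`` (the validation logic is invented for the example):

   class NonNegative:
       def __init__(self, name):
           self.name = name
       def __get__(self, instance, owner):
           if instance is None:
               return self               # accessed on the class itself
           return instance.__dict__[self.name]
       def __set__(self, instance, value):
           if value < 0:
               raise ValueError('must be >= 0')
           instance.__dict__[self.name] = value

   class Account:
       balance = NonNegative('balance')
       def __init__(self, balance):
           self.balance = balance        # routed through NonNegative.__set__()

   a = Account(10)
   a.balance = 5                         # accepted; a.balance = -1 would raise ValueError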
Where this occurs in the\nprecedence chain depends on which descriptor methods were defined and\nhow they were called.\n\nThe starting point for descriptor invocation is a binding, ``a.x``.\nHow the arguments are assembled depends on ``a``:\n\nDirect Call\n The simplest and least common call is when user code directly\n invokes a descriptor method: ``x.__get__(a)``.\n\nInstance Binding\n If binding to an object instance, ``a.x`` is transformed into the\n call: ``type(a).__dict__[\'x\'].__get__(a, type(a))``.\n\nClass Binding\n If binding to a class, ``A.x`` is transformed into the call:\n ``A.__dict__[\'x\'].__get__(None, A)``.\n\nSuper Binding\n If ``a`` is an instance of ``super``, then the binding ``super(B,\n obj).m()`` searches ``obj.__class__.__mro__`` for the base class\n ``A`` immediately preceding ``B`` and then invokes the descriptor\n with the call: ``A.__dict__[\'m\'].__get__(obj, obj.__class__)``.\n\nFor instance bindings, the precedence of descriptor invocation depends\non the which descriptor methods are defined. A descriptor can define\nany combination of ``__get__()``, ``__set__()`` and ``__delete__()``.\nIf it does not define ``__get__()``, then accessing the attribute will\nreturn the descriptor object itself unless there is a value in the\nobject\'s instance dictionary. If the descriptor defines ``__set__()``\nand/or ``__delete__()``, it is a data descriptor; if it defines\nneither, it is a non-data descriptor. Normally, data descriptors\ndefine both ``__get__()`` and ``__set__()``, while non-data\ndescriptors have just the ``__get__()`` method. Data descriptors with\n``__set__()`` and ``__get__()`` defined always override a redefinition\nin an instance dictionary. In contrast, non-data descriptors can be\noverridden by instances.\n\nPython methods (including ``staticmethod()`` and ``classmethod()``)\nare implemented as non-data descriptors. Accordingly, instances can\nredefine and override methods. This allows individual instances to\nacquire behaviors that differ from other instances of the same class.\n\nThe ``property()`` function is implemented as a data descriptor.\nAccordingly, instances cannot override the behavior of a property.\n\n\n__slots__\n---------\n\nBy default, instances of classes have a dictionary for attribute\nstorage. This wastes space for objects having very few instance\nvariables. The space consumption can become acute when creating large\nnumbers of instances.\n\nThe default can be overridden by defining *__slots__* in a class\ndefinition. The *__slots__* declaration takes a sequence of instance\nvariables and reserves just enough space in each instance to hold a\nvalue for each variable. Space is saved because *__dict__* is not\ncreated for each instance.\n\nobject.__slots__\n\n This class variable can be assigned a string, iterable, or sequence\n of strings with variable names used by instances. If defined in a\n class, *__slots__* reserves space for the declared variables and\n prevents the automatic creation of *__dict__* and *__weakref__* for\n each instance.\n\n\nNotes on using *__slots__*\n~~~~~~~~~~~~~~~~~~~~~~~~~~\n\n* When inheriting from a class without *__slots__*, the *__dict__*\n attribute of that class will always be accessible, so a *__slots__*\n definition in the subclass is meaningless.\n\n* Without a *__dict__* variable, instances cannot be assigned new\n variables not listed in the *__slots__* definition. Attempts to\n assign to an unlisted variable name raises ``AttributeError``. 
If\n dynamic assignment of new variables is desired, then add\n ``\'__dict__\'`` to the sequence of strings in the *__slots__*\n declaration.\n\n* Without a *__weakref__* variable for each instance, classes defining\n *__slots__* do not support weak references to its instances. If weak\n reference support is needed, then add ``\'__weakref__\'`` to the\n sequence of strings in the *__slots__* declaration.\n\n* *__slots__* are implemented at the class level by creating\n descriptors (*Implementing Descriptors*) for each variable name. As\n a result, class attributes cannot be used to set default values for\n instance variables defined by *__slots__*; otherwise, the class\n attribute would overwrite the descriptor assignment.\n\n* The action of a *__slots__* declaration is limited to the class\n where it is defined. As a result, subclasses will have a *__dict__*\n unless they also define *__slots__* (which must only contain names\n of any *additional* slots).\n\n* If a class defines a slot also defined in a base class, the instance\n variable defined by the base class slot is inaccessible (except by\n retrieving its descriptor directly from the base class). This\n renders the meaning of the program undefined. In the future, a\n check may be added to prevent this.\n\n* Nonempty *__slots__* does not work for classes derived from\n "variable-length" built-in types such as ``int``, ``str`` and\n ``tuple``.\n\n* Any non-string iterable may be assigned to *__slots__*. Mappings may\n also be used; however, in the future, special meaning may be\n assigned to the values corresponding to each key.\n\n* *__class__* assignment works only if both classes have the same\n *__slots__*.\n\n\nCustomizing class creation\n==========================\n\nBy default, classes are constructed using ``type()``. The class body\nis executed in a new namespace and the class name is bound locally to\nthe result of ``type(name, bases, namespace)``.\n\nThe class creation process can be customised by passing the\n``metaclass`` keyword argument in the class definition line, or by\ninheriting from an existing class that included such an argument. In\nthe following example, both ``MyClass`` and ``MySubclass`` are\ninstances of ``Meta``:\n\n class Meta(type):\n pass\n\n class MyClass(metaclass=Meta):\n pass\n\n class MySubclass(MyClass):\n pass\n\nAny other keyword arguments that are specified in the class definition\nare passed through to all metaclass operations described below.\n\nWhen a class definition is executed, the following steps occur:\n\n* the appropriate metaclass is determined\n\n* the class namespace is prepared\n\n* the class body is executed\n\n* the class object is created\n\n\nDetermining the appropriate metaclass\n-------------------------------------\n\nThe appropriate metaclass for a class definition is determined as\nfollows:\n\n* if no bases and no explicit metaclass are given, then ``type()`` is\n used\n\n* if an explicit metaclass is given and it is *not* an instance of\n ``type()``, then it is used directly as the metaclass\n\n* if an instance of ``type()`` is given as the explicit metaclass, or\n bases are defined, then the most derived metaclass is used\n\nThe most derived metaclass is selected from the explicitly specified\nmetaclass (if any) and the metaclasses (i.e. ``type(cls)``) of all\nspecified base classes. The most derived metaclass is one which is a\nsubtype of *all* of these candidate metaclasses. 
If none of the\ncandidate metaclasses meets that criterion, then the class definition\nwill fail with ``TypeError``.\n\n\nPreparing the class namespace\n-----------------------------\n\nOnce the appropriate metaclass has been identified, then the class\nnamespace is prepared. If the metaclass has a ``__prepare__``\nattribute, it is called as ``namespace = metaclass.__prepare__(name,\nbases, **kwds)`` (where the additional keyword arguments, if any, come\nfrom the class definition).\n\nIf the metaclass has no ``__prepare__`` attribute, then the class\nnamespace is initialised as an empty ``dict()`` instance.\n\nSee also:\n\n **PEP 3115** - Metaclasses in Python 3000\n Introduced the ``__prepare__`` namespace hook\n\n\nExecuting the class body\n------------------------\n\nThe class body is executed (approximately) as ``exec(body, globals(),\nnamespace)``. The key difference from a normal call to ``exec()`` is\nthat lexical scoping allows the class body (including any methods) to\nreference names from the current and outer scopes when the class\ndefinition occurs inside a function.\n\nHowever, even when the class definition occurs inside the function,\nmethods defined inside the class still cannot see names defined at the\nclass scope. Class variables must be accessed through the first\nparameter of instance or class methods, and cannot be accessed at all\nfrom static methods.\n\n\nCreating the class object\n-------------------------\n\nOnce the class namespace has been populated by executing the class\nbody, the class object is created by calling ``metaclass(name, bases,\nnamespace, **kwds)`` (the additional keywords passed here are the same\nas those passed to ``__prepare__``).\n\nThis class object is the one that will be referenced by the zero-\nargument form of ``super()``. ``__class__`` is an implicit closure\nreference created by the compiler if any methods in a class body refer\nto either ``__class__`` or ``super``. This allows the zero argument\nform of ``super()`` to correctly identify the class being defined\nbased on lexical scoping, while the class or instance that was used to\nmake the current call is identified based on the first argument passed\nto the method.\n\nAfter the class object is created, it is passed to the class\ndecorators included in the class definition (if any) and the resulting\nobject is bound in the local namespace as the defined class.\n\nSee also:\n\n **PEP 3135** - New super\n Describes the implicit ``__class__`` closure reference\n\n\nMetaclass example\n-----------------\n\nThe potential uses for metaclasses are boundless. 
Some ideas that have\nbeen explored include logging, interface checking, automatic\ndelegation, automatic property creation, proxies, frameworks, and\nautomatic resource locking/synchronization.\n\nHere is an example of a metaclass that uses an\n``collections.OrderedDict`` to remember the order that class members\nwere defined:\n\n class OrderedClass(type):\n\n @classmethod\n def __prepare__(metacls, name, bases, **kwds):\n return collections.OrderedDict()\n\n def __new__(cls, name, bases, namespace, **kwds):\n result = type.__new__(cls, name, bases, dict(namespace))\n result.members = tuple(namespace)\n return result\n\n class A(metaclass=OrderedClass):\n def one(self): pass\n def two(self): pass\n def three(self): pass\n def four(self): pass\n\n >>> A.members\n (\'__module__\', \'one\', \'two\', \'three\', \'four\')\n\nWhen the class definition for *A* gets executed, the process begins\nwith calling the metaclass\'s ``__prepare__()`` method which returns an\nempty ``collections.OrderedDict``. That mapping records the methods\nand attributes of *A* as they are defined within the body of the class\nstatement. Once those definitions are executed, the ordered dictionary\nis fully populated and the metaclass\'s ``__new__()`` method gets\ninvoked. That method builds the new type and it saves the ordered\ndictionary keys in an attribute called ``members``.\n\n\nCustomizing instance and subclass checks\n========================================\n\nThe following methods are used to override the default behavior of the\n``isinstance()`` and ``issubclass()`` built-in functions.\n\nIn particular, the metaclass ``abc.ABCMeta`` implements these methods\nin order to allow the addition of Abstract Base Classes (ABCs) as\n"virtual base classes" to any class or type (including built-in\ntypes), including other ABCs.\n\nclass.__instancecheck__(self, instance)\n\n Return true if *instance* should be considered a (direct or\n indirect) instance of *class*. If defined, called to implement\n ``isinstance(instance, class)``.\n\nclass.__subclasscheck__(self, subclass)\n\n Return true if *subclass* should be considered a (direct or\n indirect) subclass of *class*. If defined, called to implement\n ``issubclass(subclass, class)``.\n\nNote that these methods are looked up on the type (metaclass) of a\nclass. They cannot be defined as class methods in the actual class.\nThis is consistent with the lookup of special methods that are called\non instances, only in this case the instance is itself a class.\n\nSee also:\n\n **PEP 3119** - Introducing Abstract Base Classes\n Includes the specification for customizing ``isinstance()`` and\n ``issubclass()`` behavior through ``__instancecheck__()`` and\n ``__subclasscheck__()``, with motivation for this functionality\n in the context of adding Abstract Base Classes (see the ``abc``\n module) to the language.\n\n\nEmulating callable objects\n==========================\n\nobject.__call__(self[, args...])\n\n Called when the instance is "called" as a function; if this method\n is defined, ``x(arg1, arg2, ...)`` is a shorthand for\n ``x.__call__(arg1, arg2, ...)``.\n\n\nEmulating container types\n=========================\n\nThe following methods can be defined to implement container objects.\nContainers usually are sequences (such as lists or tuples) or mappings\n(like dictionaries), but can represent other containers as well. 
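For a rough feel of the protocol spelled out in the rest of this section, a minimal read-only mapping might look like the following (a sketch only; the names are invented):

   class LowerCaseDict:
       # wraps a dict and looks keys up case-insensitively
       def __init__(self, data):
           self._data = {k.lower(): v for k, v in data.items()}
       def __getitem__(self, key):
           return self._data[key.lower()]       # KeyError propagates for missing keys
       def __contains__(self, key):
           return key.lower() in self._data
       def __iter__(self):
           return iter(self._data)
       def __len__(self):
           return len(self._data)

   d = LowerCaseDict({'Spam': 1})
   print(d['SPAM'], 'spam' in d, len(d))        # 1 True 1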
The\nfirst set of methods is used either to emulate a sequence or to\nemulate a mapping; the difference is that for a sequence, the\nallowable keys should be the integers *k* for which ``0 <= k < N``\nwhere *N* is the length of the sequence, or slice objects, which\ndefine a range of items. It is also recommended that mappings provide\nthe methods ``keys()``, ``values()``, ``items()``, ``get()``,\n``clear()``, ``setdefault()``, ``pop()``, ``popitem()``, ``copy()``,\nand ``update()`` behaving similar to those for Python\'s standard\ndictionary objects. The ``collections`` module provides a\n``MutableMapping`` abstract base class to help create those methods\nfrom a base set of ``__getitem__()``, ``__setitem__()``,\n``__delitem__()``, and ``keys()``. Mutable sequences should provide\nmethods ``append()``, ``count()``, ``index()``, ``extend()``,\n``insert()``, ``pop()``, ``remove()``, ``reverse()`` and ``sort()``,\nlike Python standard list objects. Finally, sequence types should\nimplement addition (meaning concatenation) and multiplication (meaning\nrepetition) by defining the methods ``__add__()``, ``__radd__()``,\n``__iadd__()``, ``__mul__()``, ``__rmul__()`` and ``__imul__()``\ndescribed below; they should not define other numerical operators. It\nis recommended that both mappings and sequences implement the\n``__contains__()`` method to allow efficient use of the ``in``\noperator; for mappings, ``in`` should search the mapping\'s keys; for\nsequences, it should search through the values. It is further\nrecommended that both mappings and sequences implement the\n``__iter__()`` method to allow efficient iteration through the\ncontainer; for mappings, ``__iter__()`` should be the same as\n``keys()``; for sequences, it should iterate through the values.\n\nobject.__len__(self)\n\n Called to implement the built-in function ``len()``. Should return\n the length of the object, an integer ``>=`` 0. Also, an object\n that doesn\'t define a ``__bool__()`` method and whose ``__len__()``\n method returns zero is considered to be false in a Boolean context.\n\nobject.__length_hint__(self)\n\n Called to implement ``operator.length_hint()``. Should return an\n estimated length for the object (which may be greater or less than\n the actual length). The length must be an integer ``>=`` 0. This\n method is purely an optimization and is never required for\n correctness.\n\n New in version 3.4.\n\nNote: Slicing is done exclusively with the following three methods. A\n call like\n\n a[1:2] = b\n\n is translated to\n\n a[slice(1, 2, None)] = b\n\n and so forth. Missing slice items are always filled in with\n ``None``.\n\nobject.__getitem__(self, key)\n\n Called to implement evaluation of ``self[key]``. For sequence\n types, the accepted keys should be integers and slice objects.\n Note that the special interpretation of negative indexes (if the\n class wishes to emulate a sequence type) is up to the\n ``__getitem__()`` method. If *key* is of an inappropriate type,\n ``TypeError`` may be raised; if of a value outside the set of\n indexes for the sequence (after any special interpretation of\n negative values), ``IndexError`` should be raised. For mapping\n types, if *key* is missing (not in the container), ``KeyError``\n should be raised.\n\n Note: ``for`` loops expect that an ``IndexError`` will be raised for\n illegal indexes to allow proper detection of the end of the\n sequence.\n\nobject.__setitem__(self, key, value)\n\n Called to implement assignment to ``self[key]``. 
Same note as for\n ``__getitem__()``. This should only be implemented for mappings if\n the objects support changes to the values for keys, or if new keys\n can be added, or for sequences if elements can be replaced. The\n same exceptions should be raised for improper *key* values as for\n the ``__getitem__()`` method.\n\nobject.__delitem__(self, key)\n\n Called to implement deletion of ``self[key]``. Same note as for\n ``__getitem__()``. This should only be implemented for mappings if\n the objects support removal of keys, or for sequences if elements\n can be removed from the sequence. The same exceptions should be\n raised for improper *key* values as for the ``__getitem__()``\n method.\n\nobject.__iter__(self)\n\n This method is called when an iterator is required for a container.\n This method should return a new iterator object that can iterate\n over all the objects in the container. For mappings, it should\n iterate over the keys of the container, and should also be made\n available as the method ``keys()``.\n\n Iterator objects also need to implement this method; they are\n required to return themselves. For more information on iterator\n objects, see *Iterator Types*.\n\nobject.__reversed__(self)\n\n Called (if present) by the ``reversed()`` built-in to implement\n reverse iteration. It should return a new iterator object that\n iterates over all the objects in the container in reverse order.\n\n If the ``__reversed__()`` method is not provided, the\n ``reversed()`` built-in will fall back to using the sequence\n protocol (``__len__()`` and ``__getitem__()``). Objects that\n support the sequence protocol should only provide\n ``__reversed__()`` if they can provide an implementation that is\n more efficient than the one provided by ``reversed()``.\n\nThe membership test operators (``in`` and ``not in``) are normally\nimplemented as an iteration through a sequence. However, container\nobjects can supply the following special method with a more efficient\nimplementation, which also does not require the object be a sequence.\n\nobject.__contains__(self, item)\n\n Called to implement membership test operators. Should return true\n if *item* is in *self*, false otherwise. For mapping objects, this\n should consider the keys of the mapping rather than the values or\n the key-item pairs.\n\n For objects that don\'t define ``__contains__()``, the membership\n test first tries iteration via ``__iter__()``, then the old\n sequence iteration protocol via ``__getitem__()``, see *this\n section in the language reference*.\n\n\nEmulating numeric types\n=======================\n\nThe following methods can be defined to emulate numeric objects.\nMethods corresponding to operations that are not supported by the\nparticular kind of number implemented (e.g., bitwise operations for\nnon-integral numbers) should be left undefined.\n\nobject.__add__(self, other)\nobject.__sub__(self, other)\nobject.__mul__(self, other)\nobject.__truediv__(self, other)\nobject.__floordiv__(self, other)\nobject.__mod__(self, other)\nobject.__divmod__(self, other)\nobject.__pow__(self, other[, modulo])\nobject.__lshift__(self, other)\nobject.__rshift__(self, other)\nobject.__and__(self, other)\nobject.__xor__(self, other)\nobject.__or__(self, other)\n\n These methods are called to implement the binary arithmetic\n operations (``+``, ``-``, ``*``, ``/``, ``//``, ``%``,\n ``divmod()``, ``pow()``, ``**``, ``<<``, ``>>``, ``&``, ``^``,\n ``|``). 
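   A hedged sketch of one such method follows; the class name is invented, and returning ``NotImplemented`` defers to the other operand as described below:

      class Meters:
          """Toy length type that supports ``+`` with other Meters."""
          def __init__(self, value):
              self.value = value
          def __add__(self, other):
              if isinstance(other, Meters):
                  return Meters(self.value + other.value)
              return NotImplemented   # let Python try other.__radd__()

      assert (Meters(2) + Meters(3)).value == 5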
For instance, to evaluate the expression ``x + y``, where\n *x* is an instance of a class that has an ``__add__()`` method,\n ``x.__add__(y)`` is called. The ``__divmod__()`` method should be\n the equivalent to using ``__floordiv__()`` and ``__mod__()``; it\n should not be related to ``__truediv__()``. Note that\n ``__pow__()`` should be defined to accept an optional third\n argument if the ternary version of the built-in ``pow()`` function\n is to be supported.\n\n If one of those methods does not support the operation with the\n supplied arguments, it should return ``NotImplemented``.\n\nobject.__radd__(self, other)\nobject.__rsub__(self, other)\nobject.__rmul__(self, other)\nobject.__rtruediv__(self, other)\nobject.__rfloordiv__(self, other)\nobject.__rmod__(self, other)\nobject.__rdivmod__(self, other)\nobject.__rpow__(self, other)\nobject.__rlshift__(self, other)\nobject.__rrshift__(self, other)\nobject.__rand__(self, other)\nobject.__rxor__(self, other)\nobject.__ror__(self, other)\n\n These methods are called to implement the binary arithmetic\n operations (``+``, ``-``, ``*``, ``/``, ``//``, ``%``,\n ``divmod()``, ``pow()``, ``**``, ``<<``, ``>>``, ``&``, ``^``,\n ``|``) with reflected (swapped) operands. These functions are only\n called if the left operand does not support the corresponding\n operation and the operands are of different types. [2] For\n instance, to evaluate the expression ``x - y``, where *y* is an\n instance of a class that has an ``__rsub__()`` method,\n ``y.__rsub__(x)`` is called if ``x.__sub__(y)`` returns\n *NotImplemented*.\n\n Note that ternary ``pow()`` will not try calling ``__rpow__()``\n (the coercion rules would become too complicated).\n\n Note: If the right operand\'s type is a subclass of the left operand\'s\n type and that subclass provides the reflected method for the\n operation, this method will be called before the left operand\'s\n non-reflected method. This behavior allows subclasses to\n override their ancestors\' operations.\n\nobject.__iadd__(self, other)\nobject.__isub__(self, other)\nobject.__imul__(self, other)\nobject.__itruediv__(self, other)\nobject.__ifloordiv__(self, other)\nobject.__imod__(self, other)\nobject.__ipow__(self, other[, modulo])\nobject.__ilshift__(self, other)\nobject.__irshift__(self, other)\nobject.__iand__(self, other)\nobject.__ixor__(self, other)\nobject.__ior__(self, other)\n\n These methods are called to implement the augmented arithmetic\n assignments (``+=``, ``-=``, ``*=``, ``/=``, ``//=``, ``%=``,\n ``**=``, ``<<=``, ``>>=``, ``&=``, ``^=``, ``|=``). These methods\n should attempt to do the operation in-place (modifying *self*) and\n return the result (which could be, but does not have to be,\n *self*). If a specific method is not defined, the augmented\n assignment falls back to the normal methods. For instance, to\n execute the statement ``x += y``, where *x* is an instance of a\n class that has an ``__iadd__()`` method, ``x.__iadd__(y)`` is\n called. 
If *x* is an instance of a class that does not define a\n ``__iadd__()`` method, ``x.__add__(y)`` and ``y.__radd__(x)`` are\n considered, as with the evaluation of ``x + y``.\n\nobject.__neg__(self)\nobject.__pos__(self)\nobject.__abs__(self)\nobject.__invert__(self)\n\n Called to implement the unary arithmetic operations (``-``, ``+``,\n ``abs()`` and ``~``).\n\nobject.__complex__(self)\nobject.__int__(self)\nobject.__float__(self)\nobject.__round__(self[, n])\n\n Called to implement the built-in functions ``complex()``,\n ``int()``, ``float()`` and ``round()``. Should return a value of\n the appropriate type.\n\nobject.__index__(self)\n\n Called to implement ``operator.index()``. Also called whenever\n Python needs an integer object (such as in slicing, or in the\n built-in ``bin()``, ``hex()`` and ``oct()`` functions). Must return\n an integer.\n\n\nWith Statement Context Managers\n===============================\n\nA *context manager* is an object that defines the runtime context to\nbe established when executing a ``with`` statement. The context\nmanager handles the entry into, and the exit from, the desired runtime\ncontext for the execution of the block of code. Context managers are\nnormally invoked using the ``with`` statement (described in section\n*The with statement*), but can also be used by directly invoking their\nmethods.\n\nTypical uses of context managers include saving and restoring various\nkinds of global state, locking and unlocking resources, closing opened\nfiles, etc.\n\nFor more information on context managers, see *Context Manager Types*.\n\nobject.__enter__(self)\n\n Enter the runtime context related to this object. The ``with``\n statement will bind this method\'s return value to the target(s)\n specified in the ``as`` clause of the statement, if any.\n\nobject.__exit__(self, exc_type, exc_value, traceback)\n\n Exit the runtime context related to this object. The parameters\n describe the exception that caused the context to be exited. If the\n context was exited without an exception, all three arguments will\n be ``None``.\n\n If an exception is supplied, and the method wishes to suppress the\n exception (i.e., prevent it from being propagated), it should\n return a true value. Otherwise, the exception will be processed\n normally upon exit from this method.\n\n Note that ``__exit__()`` methods should not reraise the passed-in\n exception; this is the caller\'s responsibility.\n\nSee also:\n\n **PEP 0343** - The "with" statement\n The specification, background, and examples for the Python\n ``with`` statement.\n\n\nSpecial method lookup\n=====================\n\nFor custom classes, implicit invocations of special methods are only\nguaranteed to work correctly if defined on an object\'s type, not in\nthe object\'s instance dictionary. That behaviour is the reason why\nthe following code raises an exception:\n\n >>> class C:\n ... pass\n ...\n >>> c = C()\n >>> c.__len__ = lambda: 5\n >>> len(c)\n Traceback (most recent call last):\n File "", line 1, in \n TypeError: object of type \'C\' has no len()\n\nThe rationale behind this behaviour lies with a number of special\nmethods such as ``__hash__()`` and ``__repr__()`` that are implemented\nby all objects, including type objects. 
If the implicit lookup of\nthese methods used the conventional lookup process, they would fail\nwhen invoked on the type object itself:\n\n >>> 1 .__hash__() == hash(1)\n True\n >>> int.__hash__() == hash(int)\n Traceback (most recent call last):\n File "", line 1, in \n TypeError: descriptor \'__hash__\' of \'int\' object needs an argument\n\nIncorrectly attempting to invoke an unbound method of a class in this\nway is sometimes referred to as \'metaclass confusion\', and is avoided\nby bypassing the instance when looking up special methods:\n\n >>> type(1).__hash__(1) == hash(1)\n True\n >>> type(int).__hash__(int) == hash(int)\n True\n\nIn addition to bypassing any instance attributes in the interest of\ncorrectness, implicit special method lookup generally also bypasses\nthe ``__getattribute__()`` method even of the object\'s metaclass:\n\n >>> class Meta(type):\n ... def __getattribute__(*args):\n ... print("Metaclass getattribute invoked")\n ... return type.__getattribute__(*args)\n ...\n >>> class C(object, metaclass=Meta):\n ... def __len__(self):\n ... return 10\n ... def __getattribute__(*args):\n ... print("Class getattribute invoked")\n ... return object.__getattribute__(*args)\n ...\n >>> c = C()\n >>> c.__len__() # Explicit lookup via instance\n Class getattribute invoked\n 10\n >>> type(c).__len__(c) # Explicit lookup via type\n Metaclass getattribute invoked\n 10\n >>> len(c) # Implicit lookup\n 10\n\nBypassing the ``__getattribute__()`` machinery in this fashion\nprovides significant scope for speed optimisations within the\ninterpreter, at the cost of some flexibility in the handling of\nspecial methods (the special method *must* be set on the class object\nitself in order to be consistently invoked by the interpreter).\n\n-[ Footnotes ]-\n\n[1] It *is* possible in some cases to change an object\'s type, under\n certain controlled conditions. It generally isn\'t a good idea\n though, since it can lead to some very strange behaviour if it is\n handled incorrectly.\n\n[2] For operands of the same type, it is assumed that if the non-\n reflected method (such as ``__add__()``) fails the operation is\n not supported, which is why the reflected method is not called.\n', - 'string-methods': '\nString Methods\n**************\n\nStrings implement all of the *common* sequence operations, along with\nthe additional methods described below.\n\nStrings also support two styles of string formatting, one providing a\nlarge degree of flexibility and customization (see ``str.format()``,\n*Format String Syntax* and *String Formatting*) and the other based on\nC ``printf`` style formatting that handles a narrower range of types\nand is slightly harder to use correctly, but is often faster for the\ncases it can handle (*printf-style String Formatting*).\n\nThe *Text Processing Services* section of the standard library covers\na number of other modules that provide various text related utilities\n(including regular expression support in the ``re`` module).\n\nstr.capitalize()\n\n Return a copy of the string with its first character capitalized\n and the rest lowercased.\n\nstr.casefold()\n\n Return a casefolded copy of the string. Casefolded strings may be\n used for caseless matching.\n\n Casefolding is similar to lowercasing but more aggressive because\n it is intended to remove all case distinctions in a string. For\n example, the German lowercase letter ``\'\xc3\x9f\'`` is equivalent to\n ``"ss"``. 
Since it is already lowercase, ``lower()`` would do\n nothing to ``\'\xc3\x9f\'``; ``casefold()`` converts it to ``"ss"``.\n\n The casefolding algorithm is described in section 3.13 of the\n Unicode Standard.\n\n New in version 3.3.\n\nstr.center(width[, fillchar])\n\n Return centered in a string of length *width*. Padding is done\n using the specified *fillchar* (default is a space).\n\nstr.count(sub[, start[, end]])\n\n Return the number of non-overlapping occurrences of substring *sub*\n in the range [*start*, *end*]. Optional arguments *start* and\n *end* are interpreted as in slice notation.\n\nstr.encode(encoding="utf-8", errors="strict")\n\n Return an encoded version of the string as a bytes object. Default\n encoding is ``\'utf-8\'``. *errors* may be given to set a different\n error handling scheme. The default for *errors* is ``\'strict\'``,\n meaning that encoding errors raise a ``UnicodeError``. Other\n possible values are ``\'ignore\'``, ``\'replace\'``,\n ``\'xmlcharrefreplace\'``, ``\'backslashreplace\'`` and any other name\n registered via ``codecs.register_error()``, see section *Codec Base\n Classes*. For a list of possible encodings, see section *Standard\n Encodings*.\n\n Changed in version 3.1: Support for keyword arguments added.\n\nstr.endswith(suffix[, start[, end]])\n\n Return ``True`` if the string ends with the specified *suffix*,\n otherwise return ``False``. *suffix* can also be a tuple of\n suffixes to look for. With optional *start*, test beginning at\n that position. With optional *end*, stop comparing at that\n position.\n\nstr.expandtabs([tabsize])\n\n Return a copy of the string where all tab characters are replaced\n by one or more spaces, depending on the current column and the\n given tab size. Tab positions occur every *tabsize* characters\n (default is 8, giving tab positions at columns 0, 8, 16 and so on).\n To expand the string, the current column is set to zero and the\n string is examined character by character. If the character is a\n tab (``\\t``), one or more space characters are inserted in the\n result until the current column is equal to the next tab position.\n (The tab character itself is not copied.) If the character is a\n newline (``\\n``) or return (``\\r``), it is copied and the current\n column is reset to zero. Any other character is copied unchanged\n and the current column is incremented by one regardless of how the\n character is represented when printed.\n\n >>> \'01\\t012\\t0123\\t01234\'.expandtabs()\n \'01 012 0123 01234\'\n >>> \'01\\t012\\t0123\\t01234\'.expandtabs(4)\n \'01 012 0123 01234\'\n\nstr.find(sub[, start[, end]])\n\n Return the lowest index in the string where substring *sub* is\n found, such that *sub* is contained in the slice ``s[start:end]``.\n Optional arguments *start* and *end* are interpreted as in slice\n notation. Return ``-1`` if *sub* is not found.\n\n Note: The ``find()`` method should be used only if you need to know the\n position of *sub*. To check if *sub* is a substring or not, use\n the ``in`` operator:\n\n >>> \'Py\' in \'Python\'\n True\n\nstr.format(*args, **kwargs)\n\n Perform a string formatting operation. The string on which this\n method is called can contain literal text or replacement fields\n delimited by braces ``{}``. Each replacement field contains either\n the numeric index of a positional argument, or the name of a\n keyword argument. 
Returns a copy of the string where each\n replacement field is replaced with the string value of the\n corresponding argument.\n\n >>> "The sum of 1 + 2 is {0}".format(1+2)\n \'The sum of 1 + 2 is 3\'\n\n See *Format String Syntax* for a description of the various\n formatting options that can be specified in format strings.\n\nstr.format_map(mapping)\n\n Similar to ``str.format(**mapping)``, except that ``mapping`` is\n used directly and not copied to a ``dict`` . This is useful if for\n example ``mapping`` is a dict subclass:\n\n >>> class Default(dict):\n ... def __missing__(self, key):\n ... return key\n ...\n >>> \'{name} was born in {country}\'.format_map(Default(name=\'Guido\'))\n \'Guido was born in country\'\n\n New in version 3.2.\n\nstr.index(sub[, start[, end]])\n\n Like ``find()``, but raise ``ValueError`` when the substring is not\n found.\n\nstr.isalnum()\n\n Return true if all characters in the string are alphanumeric and\n there is at least one character, false otherwise. A character\n ``c`` is alphanumeric if one of the following returns ``True``:\n ``c.isalpha()``, ``c.isdecimal()``, ``c.isdigit()``, or\n ``c.isnumeric()``.\n\nstr.isalpha()\n\n Return true if all characters in the string are alphabetic and\n there is at least one character, false otherwise. Alphabetic\n characters are those characters defined in the Unicode character\n database as "Letter", i.e., those with general category property\n being one of "Lm", "Lt", "Lu", "Ll", or "Lo". Note that this is\n different from the "Alphabetic" property defined in the Unicode\n Standard.\n\nstr.isdecimal()\n\n Return true if all characters in the string are decimal characters\n and there is at least one character, false otherwise. Decimal\n characters are those from general category "Nd". This category\n includes digit characters, and all characters that can be used to\n form decimal-radix numbers, e.g. U+0660, ARABIC-INDIC DIGIT ZERO.\n\nstr.isdigit()\n\n Return true if all characters in the string are digits and there is\n at least one character, false otherwise. Digits include decimal\n characters and digits that need special handling, such as the\n compatibility superscript digits. Formally, a digit is a character\n that has the property value Numeric_Type=Digit or\n Numeric_Type=Decimal.\n\nstr.isidentifier()\n\n Return true if the string is a valid identifier according to the\n language definition, section *Identifiers and keywords*.\n\n Use ``keyword.iskeyword()`` to test for reserved identifiers such\n as ``def`` and ``class``.\n\nstr.islower()\n\n Return true if all cased characters [4] in the string are lowercase\n and there is at least one cased character, false otherwise.\n\nstr.isnumeric()\n\n Return true if all characters in the string are numeric characters,\n and there is at least one character, false otherwise. Numeric\n characters include digit characters, and all characters that have\n the Unicode numeric value property, e.g. U+2155, VULGAR FRACTION\n ONE FIFTH. Formally, numeric characters are those with the\n property value Numeric_Type=Digit, Numeric_Type=Decimal or\n Numeric_Type=Numeric.\n\nstr.isprintable()\n\n Return true if all characters in the string are printable or the\n string is empty, false otherwise. Nonprintable characters are\n those characters defined in the Unicode character database as\n "Other" or "Separator", excepting the ASCII space (0x20) which is\n considered printable. 
(Note that printable characters in this\n context are those which should not be escaped when ``repr()`` is\n invoked on a string. It has no bearing on the handling of strings\n written to ``sys.stdout`` or ``sys.stderr``.)\n\nstr.isspace()\n\n Return true if there are only whitespace characters in the string\n and there is at least one character, false otherwise. Whitespace\n characters are those characters defined in the Unicode character\n database as "Other" or "Separator" and those with bidirectional\n property being one of "WS", "B", or "S".\n\nstr.istitle()\n\n Return true if the string is a titlecased string and there is at\n least one character, for example uppercase characters may only\n follow uncased characters and lowercase characters only cased ones.\n Return false otherwise.\n\nstr.isupper()\n\n Return true if all cased characters [4] in the string are uppercase\n and there is at least one cased character, false otherwise.\n\nstr.join(iterable)\n\n Return a string which is the concatenation of the strings in the\n *iterable* *iterable*. A ``TypeError`` will be raised if there are\n any non-string values in *iterable*, including ``bytes`` objects.\n The separator between elements is the string providing this method.\n\nstr.ljust(width[, fillchar])\n\n Return the string left justified in a string of length *width*.\n Padding is done using the specified *fillchar* (default is a\n space). The original string is returned if *width* is less than or\n equal to ``len(s)``.\n\nstr.lower()\n\n Return a copy of the string with all the cased characters [4]\n converted to lowercase.\n\n The lowercasing algorithm used is described in section 3.13 of the\n Unicode Standard.\n\nstr.lstrip([chars])\n\n Return a copy of the string with leading characters removed. The\n *chars* argument is a string specifying the set of characters to be\n removed. If omitted or ``None``, the *chars* argument defaults to\n removing whitespace. The *chars* argument is not a prefix; rather,\n all combinations of its values are stripped:\n\n >>> \' spacious \'.lstrip()\n \'spacious \'\n >>> \'www.example.com\'.lstrip(\'cmowz.\')\n \'example.com\'\n\nstatic str.maketrans(x[, y[, z]])\n\n This static method returns a translation table usable for\n ``str.translate()``.\n\n If there is only one argument, it must be a dictionary mapping\n Unicode ordinals (integers) or characters (strings of length 1) to\n Unicode ordinals, strings (of arbitrary lengths) or None.\n Character keys will then be converted to ordinals.\n\n If there are two arguments, they must be strings of equal length,\n and in the resulting dictionary, each character in x will be mapped\n to the character at the same position in y. If there is a third\n argument, it must be a string, whose characters will be mapped to\n None in the result.\n\nstr.partition(sep)\n\n Split the string at the first occurrence of *sep*, and return a\n 3-tuple containing the part before the separator, the separator\n itself, and the part after the separator. If the separator is not\n found, return a 3-tuple containing the string itself, followed by\n two empty strings.\n\nstr.replace(old, new[, count])\n\n Return a copy of the string with all occurrences of substring *old*\n replaced by *new*. 
If the optional argument *count* is given, only\n the first *count* occurrences are replaced.\n\nstr.rfind(sub[, start[, end]])\n\n Return the highest index in the string where substring *sub* is\n found, such that *sub* is contained within ``s[start:end]``.\n Optional arguments *start* and *end* are interpreted as in slice\n notation. Return ``-1`` on failure.\n\nstr.rindex(sub[, start[, end]])\n\n Like ``rfind()`` but raises ``ValueError`` when the substring *sub*\n is not found.\n\nstr.rjust(width[, fillchar])\n\n Return the string right justified in a string of length *width*.\n Padding is done using the specified *fillchar* (default is a\n space). The original string is returned if *width* is less than or\n equal to ``len(s)``.\n\nstr.rpartition(sep)\n\n Split the string at the last occurrence of *sep*, and return a\n 3-tuple containing the part before the separator, the separator\n itself, and the part after the separator. If the separator is not\n found, return a 3-tuple containing two empty strings, followed by\n the string itself.\n\nstr.rsplit(sep=None, maxsplit=-1)\n\n Return a list of the words in the string, using *sep* as the\n delimiter string. If *maxsplit* is given, at most *maxsplit* splits\n are done, the *rightmost* ones. If *sep* is not specified or\n ``None``, any whitespace string is a separator. Except for\n splitting from the right, ``rsplit()`` behaves like ``split()``\n which is described in detail below.\n\nstr.rstrip([chars])\n\n Return a copy of the string with trailing characters removed. The\n *chars* argument is a string specifying the set of characters to be\n removed. If omitted or ``None``, the *chars* argument defaults to\n removing whitespace. The *chars* argument is not a suffix; rather,\n all combinations of its values are stripped:\n\n >>> \' spacious \'.rstrip()\n \' spacious\'\n >>> \'mississippi\'.rstrip(\'ipz\')\n \'mississ\'\n\nstr.split(sep=None, maxsplit=-1)\n\n Return a list of the words in the string, using *sep* as the\n delimiter string. If *maxsplit* is given, at most *maxsplit*\n splits are done (thus, the list will have at most ``maxsplit+1``\n elements). If *maxsplit* is not specified or ``-1``, then there is\n no limit on the number of splits (all possible splits are made).\n\n If *sep* is given, consecutive delimiters are not grouped together\n and are deemed to delimit empty strings (for example,\n ``\'1,,2\'.split(\',\')`` returns ``[\'1\', \'\', \'2\']``). The *sep*\n argument may consist of multiple characters (for example,\n ``\'1<>2<>3\'.split(\'<>\')`` returns ``[\'1\', \'2\', \'3\']``). Splitting\n an empty string with a specified separator returns ``[\'\']``.\n\n If *sep* is not specified or is ``None``, a different splitting\n algorithm is applied: runs of consecutive whitespace are regarded\n as a single separator, and the result will contain no empty strings\n at the start or end if the string has leading or trailing\n whitespace. Consequently, splitting an empty string or a string\n consisting of just whitespace with a ``None`` separator returns\n ``[]``.\n\n For example, ``\' 1 2 3 \'.split()`` returns ``[\'1\', \'2\', \'3\']``,\n and ``\' 1 2 3 \'.split(None, 1)`` returns ``[\'1\', \'2 3 \']``.\n\nstr.splitlines([keepends])\n\n Return a list of the lines in the string, breaking at line\n boundaries. This method uses the *universal newlines* approach to\n splitting lines. 
Line breaks are not included in the resulting list\n unless *keepends* is given and true.\n\n For example, ``\'ab c\\n\\nde fg\\rkl\\r\\n\'.splitlines()`` returns\n ``[\'ab c\', \'\', \'de fg\', \'kl\']``, while the same call with\n ``splitlines(True)`` returns ``[\'ab c\\n\', \'\\n\', \'de fg\\r\',\n \'kl\\r\\n\']``.\n\n Unlike ``split()`` when a delimiter string *sep* is given, this\n method returns an empty list for the empty string, and a terminal\n line break does not result in an extra line.\n\nstr.startswith(prefix[, start[, end]])\n\n Return ``True`` if string starts with the *prefix*, otherwise\n return ``False``. *prefix* can also be a tuple of prefixes to look\n for. With optional *start*, test string beginning at that\n position. With optional *end*, stop comparing string at that\n position.\n\nstr.strip([chars])\n\n Return a copy of the string with the leading and trailing\n characters removed. The *chars* argument is a string specifying the\n set of characters to be removed. If omitted or ``None``, the\n *chars* argument defaults to removing whitespace. The *chars*\n argument is not a prefix or suffix; rather, all combinations of its\n values are stripped:\n\n >>> \' spacious \'.strip()\n \'spacious\'\n >>> \'www.example.com\'.strip(\'cmowz.\')\n \'example\'\n\nstr.swapcase()\n\n Return a copy of the string with uppercase characters converted to\n lowercase and vice versa. Note that it is not necessarily true that\n ``s.swapcase().swapcase() == s``.\n\nstr.title()\n\n Return a titlecased version of the string where words start with an\n uppercase character and the remaining characters are lowercase.\n\n The algorithm uses a simple language-independent definition of a\n word as groups of consecutive letters. The definition works in\n many contexts but it means that apostrophes in contractions and\n possessives form word boundaries, which may not be the desired\n result:\n\n >>> "they\'re bill\'s friends from the UK".title()\n "They\'Re Bill\'S Friends From The Uk"\n\n A workaround for apostrophes can be constructed using regular\n expressions:\n\n >>> import re\n >>> def titlecase(s):\n ... return re.sub(r"[A-Za-z]+(\'[A-Za-z]+)?",\n ... lambda mo: mo.group(0)[0].upper() +\n ... mo.group(0)[1:].lower(),\n ... s)\n ...\n >>> titlecase("they\'re bill\'s friends.")\n "They\'re Bill\'s Friends."\n\nstr.translate(map)\n\n Return a copy of the *s* where all characters have been mapped\n through the *map* which must be a dictionary of Unicode ordinals\n (integers) to Unicode ordinals, strings or ``None``. Unmapped\n characters are left untouched. Characters mapped to ``None`` are\n deleted.\n\n You can use ``str.maketrans()`` to create a translation map from\n character-to-character mappings in different formats.\n\n Note: An even more flexible approach is to create a custom character\n mapping codec using the ``codecs`` module (see\n ``encodings.cp1251`` for an example).\n\nstr.upper()\n\n Return a copy of the string with all the cased characters [4]\n converted to uppercase. Note that ``str.upper().isupper()`` might\n be ``False`` if ``s`` contains uncased characters or if the Unicode\n category of the resulting character(s) is not "Lu" (Letter,\n uppercase), but e.g. "Lt" (Letter, titlecase).\n\n The uppercasing algorithm used is described in section 3.13 of the\n Unicode Standard.\n\nstr.zfill(width)\n\n Return the numeric string left filled with zeros in a string of\n length *width*. A sign prefix is handled correctly. 
The original\n string is returned if *width* is less than or equal to ``len(s)``.\n', + 'string-methods': '\nString Methods\n**************\n\nStrings implement all of the *common* sequence operations, along with\nthe additional methods described below.\n\nStrings also support two styles of string formatting, one providing a\nlarge degree of flexibility and customization (see ``str.format()``,\n*Format String Syntax* and *String Formatting*) and the other based on\nC ``printf`` style formatting that handles a narrower range of types\nand is slightly harder to use correctly, but is often faster for the\ncases it can handle (*printf-style String Formatting*).\n\nThe *Text Processing Services* section of the standard library covers\na number of other modules that provide various text related utilities\n(including regular expression support in the ``re`` module).\n\nstr.capitalize()\n\n Return a copy of the string with its first character capitalized\n and the rest lowercased.\n\nstr.casefold()\n\n Return a casefolded copy of the string. Casefolded strings may be\n used for caseless matching.\n\n Casefolding is similar to lowercasing but more aggressive because\n it is intended to remove all case distinctions in a string. For\n example, the German lowercase letter ``\'\xc3\x9f\'`` is equivalent to\n ``"ss"``. Since it is already lowercase, ``lower()`` would do\n nothing to ``\'\xc3\x9f\'``; ``casefold()`` converts it to ``"ss"``.\n\n The casefolding algorithm is described in section 3.13 of the\n Unicode Standard.\n\n New in version 3.3.\n\nstr.center(width[, fillchar])\n\n Return centered in a string of length *width*. Padding is done\n using the specified *fillchar* (default is a space).\n\nstr.count(sub[, start[, end]])\n\n Return the number of non-overlapping occurrences of substring *sub*\n in the range [*start*, *end*]. Optional arguments *start* and\n *end* are interpreted as in slice notation.\n\nstr.encode(encoding="utf-8", errors="strict")\n\n Return an encoded version of the string as a bytes object. Default\n encoding is ``\'utf-8\'``. *errors* may be given to set a different\n error handling scheme. The default for *errors* is ``\'strict\'``,\n meaning that encoding errors raise a ``UnicodeError``. Other\n possible values are ``\'ignore\'``, ``\'replace\'``,\n ``\'xmlcharrefreplace\'``, ``\'backslashreplace\'`` and any other name\n registered via ``codecs.register_error()``, see section *Codec Base\n Classes*. For a list of possible encodings, see section *Standard\n Encodings*.\n\n Changed in version 3.1: Support for keyword arguments added.\n\nstr.endswith(suffix[, start[, end]])\n\n Return ``True`` if the string ends with the specified *suffix*,\n otherwise return ``False``. *suffix* can also be a tuple of\n suffixes to look for. With optional *start*, test beginning at\n that position. With optional *end*, stop comparing at that\n position.\n\nstr.expandtabs(tabsize=8)\n\n Return a copy of the string where all tab characters are replaced\n by one or more spaces, depending on the current column and the\n given tab size. Tab positions occur every *tabsize* characters\n (default is 8, giving tab positions at columns 0, 8, 16 and so on).\n To expand the string, the current column is set to zero and the\n string is examined character by character. If the character is a\n tab (``\\t``), one or more space characters are inserted in the\n result until the current column is equal to the next tab position.\n (The tab character itself is not copied.) 
If the character is a\n newline (``\\n``) or return (``\\r``), it is copied and the current\n column is reset to zero. Any other character is copied unchanged\n and the current column is incremented by one regardless of how the\n character is represented when printed.\n\n >>> \'01\\t012\\t0123\\t01234\'.expandtabs()\n \'01 012 0123 01234\'\n >>> \'01\\t012\\t0123\\t01234\'.expandtabs(4)\n \'01 012 0123 01234\'\n\nstr.find(sub[, start[, end]])\n\n Return the lowest index in the string where substring *sub* is\n found, such that *sub* is contained in the slice ``s[start:end]``.\n Optional arguments *start* and *end* are interpreted as in slice\n notation. Return ``-1`` if *sub* is not found.\n\n Note: The ``find()`` method should be used only if you need to know the\n position of *sub*. To check if *sub* is a substring or not, use\n the ``in`` operator:\n\n >>> \'Py\' in \'Python\'\n True\n\nstr.format(*args, **kwargs)\n\n Perform a string formatting operation. The string on which this\n method is called can contain literal text or replacement fields\n delimited by braces ``{}``. Each replacement field contains either\n the numeric index of a positional argument, or the name of a\n keyword argument. Returns a copy of the string where each\n replacement field is replaced with the string value of the\n corresponding argument.\n\n >>> "The sum of 1 + 2 is {0}".format(1+2)\n \'The sum of 1 + 2 is 3\'\n\n See *Format String Syntax* for a description of the various\n formatting options that can be specified in format strings.\n\nstr.format_map(mapping)\n\n Similar to ``str.format(**mapping)``, except that ``mapping`` is\n used directly and not copied to a ``dict`` . This is useful if for\n example ``mapping`` is a dict subclass:\n\n >>> class Default(dict):\n ... def __missing__(self, key):\n ... return key\n ...\n >>> \'{name} was born in {country}\'.format_map(Default(name=\'Guido\'))\n \'Guido was born in country\'\n\n New in version 3.2.\n\nstr.index(sub[, start[, end]])\n\n Like ``find()``, but raise ``ValueError`` when the substring is not\n found.\n\nstr.isalnum()\n\n Return true if all characters in the string are alphanumeric and\n there is at least one character, false otherwise. A character\n ``c`` is alphanumeric if one of the following returns ``True``:\n ``c.isalpha()``, ``c.isdecimal()``, ``c.isdigit()``, or\n ``c.isnumeric()``.\n\nstr.isalpha()\n\n Return true if all characters in the string are alphabetic and\n there is at least one character, false otherwise. Alphabetic\n characters are those characters defined in the Unicode character\n database as "Letter", i.e., those with general category property\n being one of "Lm", "Lt", "Lu", "Ll", or "Lo". Note that this is\n different from the "Alphabetic" property defined in the Unicode\n Standard.\n\nstr.isdecimal()\n\n Return true if all characters in the string are decimal characters\n and there is at least one character, false otherwise. Decimal\n characters are those from general category "Nd". This category\n includes digit characters, and all characters that can be used to\n form decimal-radix numbers, e.g. U+0660, ARABIC-INDIC DIGIT ZERO.\n\nstr.isdigit()\n\n Return true if all characters in the string are digits and there is\n at least one character, false otherwise. Digits include decimal\n characters and digits that need special handling, such as the\n compatibility superscript digits. 
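   For example, superscript two is a digit but not a decimal character:

      >>> '123'.isdecimal(), '123'.isdigit()
      (True, True)
      >>> '\u00b2'.isdecimal(), '\u00b2'.isdigit()   # SUPERSCRIPT TWO
      (False, True)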
Formally, a digit is a character\n that has the property value Numeric_Type=Digit or\n Numeric_Type=Decimal.\n\nstr.isidentifier()\n\n Return true if the string is a valid identifier according to the\n language definition, section *Identifiers and keywords*.\n\n Use ``keyword.iskeyword()`` to test for reserved identifiers such\n as ``def`` and ``class``.\n\nstr.islower()\n\n Return true if all cased characters [4] in the string are lowercase\n and there is at least one cased character, false otherwise.\n\nstr.isnumeric()\n\n Return true if all characters in the string are numeric characters,\n and there is at least one character, false otherwise. Numeric\n characters include digit characters, and all characters that have\n the Unicode numeric value property, e.g. U+2155, VULGAR FRACTION\n ONE FIFTH. Formally, numeric characters are those with the\n property value Numeric_Type=Digit, Numeric_Type=Decimal or\n Numeric_Type=Numeric.\n\nstr.isprintable()\n\n Return true if all characters in the string are printable or the\n string is empty, false otherwise. Nonprintable characters are\n those characters defined in the Unicode character database as\n "Other" or "Separator", excepting the ASCII space (0x20) which is\n considered printable. (Note that printable characters in this\n context are those which should not be escaped when ``repr()`` is\n invoked on a string. It has no bearing on the handling of strings\n written to ``sys.stdout`` or ``sys.stderr``.)\n\nstr.isspace()\n\n Return true if there are only whitespace characters in the string\n and there is at least one character, false otherwise. Whitespace\n characters are those characters defined in the Unicode character\n database as "Other" or "Separator" and those with bidirectional\n property being one of "WS", "B", or "S".\n\nstr.istitle()\n\n Return true if the string is a titlecased string and there is at\n least one character, for example uppercase characters may only\n follow uncased characters and lowercase characters only cased ones.\n Return false otherwise.\n\nstr.isupper()\n\n Return true if all cased characters [4] in the string are uppercase\n and there is at least one cased character, false otherwise.\n\nstr.join(iterable)\n\n Return a string which is the concatenation of the strings in the\n *iterable* *iterable*. A ``TypeError`` will be raised if there are\n any non-string values in *iterable*, including ``bytes`` objects.\n The separator between elements is the string providing this method.\n\nstr.ljust(width[, fillchar])\n\n Return the string left justified in a string of length *width*.\n Padding is done using the specified *fillchar* (default is a\n space). The original string is returned if *width* is less than or\n equal to ``len(s)``.\n\nstr.lower()\n\n Return a copy of the string with all the cased characters [4]\n converted to lowercase.\n\n The lowercasing algorithm used is described in section 3.13 of the\n Unicode Standard.\n\nstr.lstrip([chars])\n\n Return a copy of the string with leading characters removed. The\n *chars* argument is a string specifying the set of characters to be\n removed. If omitted or ``None``, the *chars* argument defaults to\n removing whitespace. 
The *chars* argument is not a prefix; rather,\n all combinations of its values are stripped:\n\n >>> \' spacious \'.lstrip()\n \'spacious \'\n >>> \'www.example.com\'.lstrip(\'cmowz.\')\n \'example.com\'\n\nstatic str.maketrans(x[, y[, z]])\n\n This static method returns a translation table usable for\n ``str.translate()``.\n\n If there is only one argument, it must be a dictionary mapping\n Unicode ordinals (integers) or characters (strings of length 1) to\n Unicode ordinals, strings (of arbitrary lengths) or None.\n Character keys will then be converted to ordinals.\n\n If there are two arguments, they must be strings of equal length,\n and in the resulting dictionary, each character in x will be mapped\n to the character at the same position in y. If there is a third\n argument, it must be a string, whose characters will be mapped to\n None in the result.\n\nstr.partition(sep)\n\n Split the string at the first occurrence of *sep*, and return a\n 3-tuple containing the part before the separator, the separator\n itself, and the part after the separator. If the separator is not\n found, return a 3-tuple containing the string itself, followed by\n two empty strings.\n\nstr.replace(old, new[, count])\n\n Return a copy of the string with all occurrences of substring *old*\n replaced by *new*. If the optional argument *count* is given, only\n the first *count* occurrences are replaced.\n\nstr.rfind(sub[, start[, end]])\n\n Return the highest index in the string where substring *sub* is\n found, such that *sub* is contained within ``s[start:end]``.\n Optional arguments *start* and *end* are interpreted as in slice\n notation. Return ``-1`` on failure.\n\nstr.rindex(sub[, start[, end]])\n\n Like ``rfind()`` but raises ``ValueError`` when the substring *sub*\n is not found.\n\nstr.rjust(width[, fillchar])\n\n Return the string right justified in a string of length *width*.\n Padding is done using the specified *fillchar* (default is a\n space). The original string is returned if *width* is less than or\n equal to ``len(s)``.\n\nstr.rpartition(sep)\n\n Split the string at the last occurrence of *sep*, and return a\n 3-tuple containing the part before the separator, the separator\n itself, and the part after the separator. If the separator is not\n found, return a 3-tuple containing two empty strings, followed by\n the string itself.\n\nstr.rsplit(sep=None, maxsplit=-1)\n\n Return a list of the words in the string, using *sep* as the\n delimiter string. If *maxsplit* is given, at most *maxsplit* splits\n are done, the *rightmost* ones. If *sep* is not specified or\n ``None``, any whitespace string is a separator. Except for\n splitting from the right, ``rsplit()`` behaves like ``split()``\n which is described in detail below.\n\nstr.rstrip([chars])\n\n Return a copy of the string with trailing characters removed. The\n *chars* argument is a string specifying the set of characters to be\n removed. If omitted or ``None``, the *chars* argument defaults to\n removing whitespace. The *chars* argument is not a suffix; rather,\n all combinations of its values are stripped:\n\n >>> \' spacious \'.rstrip()\n \' spacious\'\n >>> \'mississippi\'.rstrip(\'ipz\')\n \'mississ\'\n\nstr.split(sep=None, maxsplit=-1)\n\n Return a list of the words in the string, using *sep* as the\n delimiter string. If *maxsplit* is given, at most *maxsplit*\n splits are done (thus, the list will have at most ``maxsplit+1``\n elements). 
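   For example, with an explicit separator and ``maxsplit=2``:

      >>> '1,2,3,4'.split(',', 2)
      ['1', '2', '3,4']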
If *maxsplit* is not specified or ``-1``, then there is\n no limit on the number of splits (all possible splits are made).\n\n If *sep* is given, consecutive delimiters are not grouped together\n and are deemed to delimit empty strings (for example,\n ``\'1,,2\'.split(\',\')`` returns ``[\'1\', \'\', \'2\']``). The *sep*\n argument may consist of multiple characters (for example,\n ``\'1<>2<>3\'.split(\'<>\')`` returns ``[\'1\', \'2\', \'3\']``). Splitting\n an empty string with a specified separator returns ``[\'\']``.\n\n If *sep* is not specified or is ``None``, a different splitting\n algorithm is applied: runs of consecutive whitespace are regarded\n as a single separator, and the result will contain no empty strings\n at the start or end if the string has leading or trailing\n whitespace. Consequently, splitting an empty string or a string\n consisting of just whitespace with a ``None`` separator returns\n ``[]``.\n\n For example, ``\' 1 2 3 \'.split()`` returns ``[\'1\', \'2\', \'3\']``,\n and ``\' 1 2 3 \'.split(None, 1)`` returns ``[\'1\', \'2 3 \']``.\n\nstr.splitlines([keepends])\n\n Return a list of the lines in the string, breaking at line\n boundaries. This method uses the *universal newlines* approach to\n splitting lines. Line breaks are not included in the resulting list\n unless *keepends* is given and true.\n\n For example, ``\'ab c\\n\\nde fg\\rkl\\r\\n\'.splitlines()`` returns\n ``[\'ab c\', \'\', \'de fg\', \'kl\']``, while the same call with\n ``splitlines(True)`` returns ``[\'ab c\\n\', \'\\n\', \'de fg\\r\',\n \'kl\\r\\n\']``.\n\n Unlike ``split()`` when a delimiter string *sep* is given, this\n method returns an empty list for the empty string, and a terminal\n line break does not result in an extra line.\n\nstr.startswith(prefix[, start[, end]])\n\n Return ``True`` if string starts with the *prefix*, otherwise\n return ``False``. *prefix* can also be a tuple of prefixes to look\n for. With optional *start*, test string beginning at that\n position. With optional *end*, stop comparing string at that\n position.\n\nstr.strip([chars])\n\n Return a copy of the string with the leading and trailing\n characters removed. The *chars* argument is a string specifying the\n set of characters to be removed. If omitted or ``None``, the\n *chars* argument defaults to removing whitespace. The *chars*\n argument is not a prefix or suffix; rather, all combinations of its\n values are stripped:\n\n >>> \' spacious \'.strip()\n \'spacious\'\n >>> \'www.example.com\'.strip(\'cmowz.\')\n \'example\'\n\nstr.swapcase()\n\n Return a copy of the string with uppercase characters converted to\n lowercase and vice versa. Note that it is not necessarily true that\n ``s.swapcase().swapcase() == s``.\n\nstr.title()\n\n Return a titlecased version of the string where words start with an\n uppercase character and the remaining characters are lowercase.\n\n The algorithm uses a simple language-independent definition of a\n word as groups of consecutive letters. The definition works in\n many contexts but it means that apostrophes in contractions and\n possessives form word boundaries, which may not be the desired\n result:\n\n >>> "they\'re bill\'s friends from the UK".title()\n "They\'Re Bill\'S Friends From The Uk"\n\n A workaround for apostrophes can be constructed using regular\n expressions:\n\n >>> import re\n >>> def titlecase(s):\n ... return re.sub(r"[A-Za-z]+(\'[A-Za-z]+)?",\n ... lambda mo: mo.group(0)[0].upper() +\n ... mo.group(0)[1:].lower(),\n ... 
s)\n ...\n >>> titlecase("they\'re bill\'s friends.")\n "They\'re Bill\'s Friends."\n\nstr.translate(map)\n\n Return a copy of the *s* where all characters have been mapped\n through the *map* which must be a dictionary of Unicode ordinals\n (integers) to Unicode ordinals, strings or ``None``. Unmapped\n characters are left untouched. Characters mapped to ``None`` are\n deleted.\n\n You can use ``str.maketrans()`` to create a translation map from\n character-to-character mappings in different formats.\n\n Note: An even more flexible approach is to create a custom character\n mapping codec using the ``codecs`` module (see\n ``encodings.cp1251`` for an example).\n\nstr.upper()\n\n Return a copy of the string with all the cased characters [4]\n converted to uppercase. Note that ``str.upper().isupper()`` might\n be ``False`` if ``s`` contains uncased characters or if the Unicode\n category of the resulting character(s) is not "Lu" (Letter,\n uppercase), but e.g. "Lt" (Letter, titlecase).\n\n The uppercasing algorithm used is described in section 3.13 of the\n Unicode Standard.\n\nstr.zfill(width)\n\n Return the numeric string left filled with zeros in a string of\n length *width*. A sign prefix is handled correctly. The original\n string is returned if *width* is less than or equal to ``len(s)``.\n', 'strings': '\nString and Bytes literals\n*************************\n\nString literals are described by the following lexical definitions:\n\n stringliteral ::= [stringprefix](shortstring | longstring)\n stringprefix ::= "r" | "u" | "R" | "U"\n shortstring ::= "\'" shortstringitem* "\'" | \'"\' shortstringitem* \'"\'\n longstring ::= "\'\'\'" longstringitem* "\'\'\'" | \'"""\' longstringitem* \'"""\'\n shortstringitem ::= shortstringchar | stringescapeseq\n longstringitem ::= longstringchar | stringescapeseq\n shortstringchar ::= \n longstringchar ::= \n stringescapeseq ::= "\\" \n\n bytesliteral ::= bytesprefix(shortbytes | longbytes)\n bytesprefix ::= "b" | "B" | "br" | "Br" | "bR" | "BR" | "rb" | "rB" | "Rb" | "RB"\n shortbytes ::= "\'" shortbytesitem* "\'" | \'"\' shortbytesitem* \'"\'\n longbytes ::= "\'\'\'" longbytesitem* "\'\'\'" | \'"""\' longbytesitem* \'"""\'\n shortbytesitem ::= shortbyteschar | bytesescapeseq\n longbytesitem ::= longbyteschar | bytesescapeseq\n shortbyteschar ::= \n longbyteschar ::= \n bytesescapeseq ::= "\\" \n\nOne syntactic restriction not indicated by these productions is that\nwhitespace is not allowed between the ``stringprefix`` or\n``bytesprefix`` and the rest of the literal. The source character set\nis defined by the encoding declaration; it is UTF-8 if no encoding\ndeclaration is given in the source file; see section *Encoding\ndeclarations*.\n\nIn plain English: Both types of literals can be enclosed in matching\nsingle quotes (``\'``) or double quotes (``"``). They can also be\nenclosed in matching groups of three single or double quotes (these\nare generally referred to as *triple-quoted strings*). 
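For example, the following literals all denote the same string:

   >>> 'spam' == "spam" == '''spam''' == """spam"""
   True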
The backslash\n(``\\``) character is used to escape characters that otherwise have a\nspecial meaning, such as newline, backslash itself, or the quote\ncharacter.\n\nBytes literals are always prefixed with ``\'b\'`` or ``\'B\'``; they\nproduce an instance of the ``bytes`` type instead of the ``str`` type.\nThey may only contain ASCII characters; bytes with a numeric value of\n128 or greater must be expressed with escapes.\n\nAs of Python 3.3 it is possible again to prefix unicode strings with a\n``u`` prefix to simplify maintenance of dual 2.x and 3.x codebases.\n\nBoth string and bytes literals may optionally be prefixed with a\nletter ``\'r\'`` or ``\'R\'``; such strings are called *raw strings* and\ntreat backslashes as literal characters. As a result, in string\nliterals, ``\'\\U\'`` and ``\'\\u\'`` escapes in raw strings are not treated\nspecially. Given that Python 2.x\'s raw unicode literals behave\ndifferently than Python 3.x\'s the ``\'ur\'`` syntax is not supported.\n\n New in version 3.3: The ``\'rb\'`` prefix of raw bytes literals has\n been added as a synonym of ``\'br\'``.\n\n New in version 3.3: Support for the unicode legacy literal\n (``u\'value\'``) was reintroduced to simplify the maintenance of dual\n Python 2.x and 3.x codebases. See **PEP 414** for more information.\n\nIn triple-quoted strings, unescaped newlines and quotes are allowed\n(and are retained), except that three unescaped quotes in a row\nterminate the string. (A "quote" is the character used to open the\nstring, i.e. either ``\'`` or ``"``.)\n\nUnless an ``\'r\'`` or ``\'R\'`` prefix is present, escape sequences in\nstrings are interpreted according to rules similar to those used by\nStandard C. The recognized escape sequences are:\n\n+-------------------+-----------------------------------+---------+\n| Escape Sequence | Meaning | Notes |\n+===================+===================================+=========+\n| ``\\newline`` | Backslash and newline ignored | |\n+-------------------+-----------------------------------+---------+\n| ``\\\\`` | Backslash (``\\``) | |\n+-------------------+-----------------------------------+---------+\n| ``\\\'`` | Single quote (``\'``) | |\n+-------------------+-----------------------------------+---------+\n| ``\\"`` | Double quote (``"``) | |\n+-------------------+-----------------------------------+---------+\n| ``\\a`` | ASCII Bell (BEL) | |\n+-------------------+-----------------------------------+---------+\n| ``\\b`` | ASCII Backspace (BS) | |\n+-------------------+-----------------------------------+---------+\n| ``\\f`` | ASCII Formfeed (FF) | |\n+-------------------+-----------------------------------+---------+\n| ``\\n`` | ASCII Linefeed (LF) | |\n+-------------------+-----------------------------------+---------+\n| ``\\r`` | ASCII Carriage Return (CR) | |\n+-------------------+-----------------------------------+---------+\n| ``\\t`` | ASCII Horizontal Tab (TAB) | |\n+-------------------+-----------------------------------+---------+\n| ``\\v`` | ASCII Vertical Tab (VT) | |\n+-------------------+-----------------------------------+---------+\n| ``\\ooo`` | Character with octal value *ooo* | (1,3) |\n+-------------------+-----------------------------------+---------+\n| ``\\xhh`` | Character with hex value *hh* | (2,3) |\n+-------------------+-----------------------------------+---------+\n\nEscape sequences only recognized in string literals are:\n\n+-------------------+-----------------------------------+---------+\n| Escape Sequence | Meaning | Notes 
|\n+===================+===================================+=========+\n| ``\\N{name}`` | Character named *name* in the | (4) |\n| | Unicode database | |\n+-------------------+-----------------------------------+---------+\n| ``\\uxxxx`` | Character with 16-bit hex value | (5) |\n| | *xxxx* | |\n+-------------------+-----------------------------------+---------+\n| ``\\Uxxxxxxxx`` | Character with 32-bit hex value | (6) |\n| | *xxxxxxxx* | |\n+-------------------+-----------------------------------+---------+\n\nNotes:\n\n1. As in Standard C, up to three octal digits are accepted.\n\n2. Unlike in Standard C, exactly two hex digits are required.\n\n3. In a bytes literal, hexadecimal and octal escapes denote the byte\n with the given value. In a string literal, these escapes denote a\n Unicode character with the given value.\n\n4. Changed in version 3.3: Support for name aliases [1] has been\n added.\n\n5. Individual code units which form parts of a surrogate pair can be\n encoded using this escape sequence. Exactly four hex digits are\n required.\n\n6. Any Unicode character can be encoded this way. Exactly eight hex\n digits are required.\n\nUnlike Standard C, all unrecognized escape sequences are left in the\nstring unchanged, i.e., *the backslash is left in the string*. (This\nbehavior is useful when debugging: if an escape sequence is mistyped,\nthe resulting output is more easily recognized as broken.) It is also\nimportant to note that the escape sequences only recognized in string\nliterals fall into the category of unrecognized escapes for bytes\nliterals.\n\nEven in a raw string, string quotes can be escaped with a backslash,\nbut the backslash remains in the string; for example, ``r"\\""`` is a\nvalid string literal consisting of two characters: a backslash and a\ndouble quote; ``r"\\"`` is not a valid string literal (even a raw\nstring cannot end in an odd number of backslashes). Specifically, *a\nraw string cannot end in a single backslash* (since the backslash\nwould escape the following quote character). Note also that a single\nbackslash followed by a newline is interpreted as those two characters\nas part of the string, *not* as a line continuation.\n', 'subscriptions': '\nSubscriptions\n*************\n\nA subscription selects an item of a sequence (string, tuple or list)\nor mapping (dictionary) object:\n\n subscription ::= primary "[" expression_list "]"\n\nThe primary must evaluate to an object that supports subscription,\ne.g. a list or dictionary. User-defined objects can support\nsubscription by defining a ``__getitem__()`` method.\n\nFor built-in objects, there are two types of objects that support\nsubscription:\n\nIf the primary is a mapping, the expression list must evaluate to an\nobject whose value is one of the keys of the mapping, and the\nsubscription selects the value in the mapping that corresponds to that\nkey. (The expression list is a tuple except if it has exactly one\nitem.)\n\nIf the primary is a sequence, the expression (list) must evaluate to\nan integer or a slice (as discussed in the following section).\n\nThe formal syntax makes no special provision for negative indices in\nsequences; however, built-in sequences all provide a ``__getitem__()``\nmethod that interprets negative indices by adding the length of the\nsequence to the index (so that ``x[-1]`` selects the last item of\n``x``). 
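A short sketch of the escape-sequence and raw-string rules described above (the sample literals are arbitrary):

    >>> '\x41\u0042\N{LATIN SMALL LETTER C}'
    'ABc'
    >>> len('a\nb'), len(r'a\nb')   # a raw string keeps the backslash
    (3, 4)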
The resulting value must be a nonnegative integer less than\nthe number of items in the sequence, and the subscription selects the\nitem whose index is that value (counting from zero). Since the support\nfor negative indices and slicing occurs in the object\'s\n``__getitem__()`` method, subclasses overriding this method will need\nto explicitly add that support.\n\nA string\'s items are characters. A character is not a separate data\ntype but a string of exactly one character.\n', 'truth': "\nTruth Value Testing\n*******************\n\nAny object can be tested for truth value, for use in an ``if`` or\n``while`` condition or as operand of the Boolean operations below. The\nfollowing values are considered false:\n\n* ``None``\n\n* ``False``\n\n* zero of any numeric type, for example, ``0``, ``0.0``, ``0j``.\n\n* any empty sequence, for example, ``''``, ``()``, ``[]``.\n\n* any empty mapping, for example, ``{}``.\n\n* instances of user-defined classes, if the class defines a\n ``__bool__()`` or ``__len__()`` method, when that method returns the\n integer zero or ``bool`` value ``False``. [1]\n\nAll other values are considered true --- so objects of many types are\nalways true.\n\nOperations and built-in functions that have a Boolean result always\nreturn ``0`` or ``False`` for false and ``1`` or ``True`` for true,\nunless otherwise stated. (Important exception: the Boolean operations\n``or`` and ``and`` always return one of their operands.)\n", @@ -71,7 +71,7 @@ 'typesmapping': '\nMapping Types --- ``dict``\n**************************\n\nA *mapping* object maps *hashable* values to arbitrary objects.\nMappings are mutable objects. There is currently only one standard\nmapping type, the *dictionary*. (For other containers see the built-\nin ``list``, ``set``, and ``tuple`` classes, and the ``collections``\nmodule.)\n\nA dictionary\'s keys are *almost* arbitrary values. Values that are\nnot *hashable*, that is, values containing lists, dictionaries or\nother mutable types (that are compared by value rather than by object\nidentity) may not be used as keys. Numeric types used for keys obey\nthe normal rules for numeric comparison: if two numbers compare equal\n(such as ``1`` and ``1.0``) then they can be used interchangeably to\nindex the same dictionary entry. (Note however, that since computers\nstore floating-point numbers as approximations it is usually unwise to\nuse them as dictionary keys.)\n\nDictionaries can be created by placing a comma-separated list of\n``key: value`` pairs within braces, for example: ``{\'jack\': 4098,\n\'sjoerd\': 4127}`` or ``{4098: \'jack\', 4127: \'sjoerd\'}``, or by the\n``dict`` constructor.\n\nclass class dict(**kwarg)\nclass class dict(mapping, **kwarg)\nclass class dict(iterable, **kwarg)\n\n Return a new dictionary initialized from an optional positional\n argument and a possibly empty set of keyword arguments.\n\n If no positional argument is given, an empty dictionary is created.\n If a positional argument is given and it is a mapping object, a\n dictionary is created with the same key-value pairs as the mapping\n object. Otherwise, the positional argument must be an *iterator*\n object. Each item in the iterable must itself be an iterator with\n exactly two objects. The first object of each item becomes a key\n in the new dictionary, and the second object the corresponding\n value. 
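A minimal sketch of the truth value rules described above, using an illustrative class that defines __len__():

    >>> bool(0), bool(''), bool({}), bool([1])
    (False, False, False, True)
    >>> class Empty:
    ...     def __len__(self):
    ...         return 0
    ...
    >>> bool(Empty())
    False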
If a key occurs more than once, the last value for that key\n becomes the corresponding value in the new dictionary.\n\n If keyword arguments are given, the keyword arguments and their\n values are added to the dictionary created from the positional\n argument. If a key being added is already present, the value from\n the keyword argument replaces the value from the positional\n argument.\n\n To illustrate, the following examples all return a dictionary equal\n to ``{"one": 1, "two": 2, "three": 3}``:\n\n >>> a = dict(one=1, two=2, three=3)\n >>> b = {\'one\': 1, \'two\': 2, \'three\': 3}\n >>> c = dict(zip([\'one\', \'two\', \'three\'], [1, 2, 3]))\n >>> d = dict([(\'two\', 2), (\'one\', 1), (\'three\', 3)])\n >>> e = dict({\'three\': 3, \'one\': 1, \'two\': 2})\n >>> a == b == c == d == e\n True\n\n Providing keyword arguments as in the first example only works for\n keys that are valid Python identifiers. Otherwise, any valid keys\n can be used.\n\n These are the operations that dictionaries support (and therefore,\n custom mapping types should support too):\n\n len(d)\n\n Return the number of items in the dictionary *d*.\n\n d[key]\n\n Return the item of *d* with key *key*. Raises a ``KeyError`` if\n *key* is not in the map.\n\n If a subclass of dict defines a method ``__missing__()``, if the\n key *key* is not present, the ``d[key]`` operation calls that\n method with the key *key* as argument. The ``d[key]`` operation\n then returns or raises whatever is returned or raised by the\n ``__missing__(key)`` call if the key is not present. No other\n operations or methods invoke ``__missing__()``. If\n ``__missing__()`` is not defined, ``KeyError`` is raised.\n ``__missing__()`` must be a method; it cannot be an instance\n variable:\n\n >>> class Counter(dict):\n ... def __missing__(self, key):\n ... return 0\n >>> c = Counter()\n >>> c[\'red\']\n 0\n >>> c[\'red\'] += 1\n >>> c[\'red\']\n 1\n\n See ``collections.Counter`` for a complete implementation\n including other methods helpful for accumulating and managing\n tallies.\n\n d[key] = value\n\n Set ``d[key]`` to *value*.\n\n del d[key]\n\n Remove ``d[key]`` from *d*. Raises a ``KeyError`` if *key* is\n not in the map.\n\n key in d\n\n Return ``True`` if *d* has a key *key*, else ``False``.\n\n key not in d\n\n Equivalent to ``not key in d``.\n\n iter(d)\n\n Return an iterator over the keys of the dictionary. This is a\n shortcut for ``iter(d.keys())``.\n\n clear()\n\n Remove all items from the dictionary.\n\n copy()\n\n Return a shallow copy of the dictionary.\n\n classmethod fromkeys(seq[, value])\n\n Create a new dictionary with keys from *seq* and values set to\n *value*.\n\n ``fromkeys()`` is a class method that returns a new dictionary.\n *value* defaults to ``None``.\n\n get(key[, default])\n\n Return the value for *key* if *key* is in the dictionary, else\n *default*. If *default* is not given, it defaults to ``None``,\n so that this method never raises a ``KeyError``.\n\n items()\n\n Return a new view of the dictionary\'s items (``(key, value)``\n pairs). See the *documentation of view objects*.\n\n keys()\n\n Return a new view of the dictionary\'s keys. See the\n *documentation of view objects*.\n\n pop(key[, default])\n\n If *key* is in the dictionary, remove it and return its value,\n else return *default*. 
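An illustrative sketch of fromkeys(), get() and pop() as described in this section (the keys are arbitrary examples):

    >>> d = dict.fromkeys(['jack', 'sjoerd'], 0)
    >>> d.get('guido', -1)          # missing key, so the default is returned
    -1
    >>> d.pop('jack')
    0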
If *default* is not given and *key* is\n not in the dictionary, a ``KeyError`` is raised.\n\n popitem()\n\n Remove and return an arbitrary ``(key, value)`` pair from the\n dictionary.\n\n ``popitem()`` is useful to destructively iterate over a\n dictionary, as often used in set algorithms. If the dictionary\n is empty, calling ``popitem()`` raises a ``KeyError``.\n\n setdefault(key[, default])\n\n If *key* is in the dictionary, return its value. If not, insert\n *key* with a value of *default* and return *default*. *default*\n defaults to ``None``.\n\n update([other])\n\n Update the dictionary with the key/value pairs from *other*,\n overwriting existing keys. Return ``None``.\n\n ``update()`` accepts either another dictionary object or an\n iterable of key/value pairs (as tuples or other iterables of\n length two). If keyword arguments are specified, the dictionary\n is then updated with those key/value pairs: ``d.update(red=1,\n blue=2)``.\n\n values()\n\n Return a new view of the dictionary\'s values. See the\n *documentation of view objects*.\n\nSee also:\n\n ``types.MappingProxyType`` can be used to create a read-only view\n of a ``dict``.\n\n\nDictionary view objects\n=======================\n\nThe objects returned by ``dict.keys()``, ``dict.values()`` and\n``dict.items()`` are *view objects*. They provide a dynamic view on\nthe dictionary\'s entries, which means that when the dictionary\nchanges, the view reflects these changes.\n\nDictionary views can be iterated over to yield their respective data,\nand support membership tests:\n\nlen(dictview)\n\n Return the number of entries in the dictionary.\n\niter(dictview)\n\n Return an iterator over the keys, values or items (represented as\n tuples of ``(key, value)``) in the dictionary.\n\n Keys and values are iterated over in an arbitrary order which is\n non-random, varies across Python implementations, and depends on\n the dictionary\'s history of insertions and deletions. If keys,\n values and items views are iterated over with no intervening\n modifications to the dictionary, the order of items will directly\n correspond. This allows the creation of ``(value, key)`` pairs\n using ``zip()``: ``pairs = zip(d.values(), d.keys())``. Another\n way to create the same list is ``pairs = [(v, k) for (k, v) in\n d.items()]``.\n\n Iterating views while adding or deleting entries in the dictionary\n may raise a ``RuntimeError`` or fail to iterate over all entries.\n\nx in dictview\n\n Return ``True`` if *x* is in the underlying dictionary\'s keys,\n values or items (in the latter case, *x* should be a ``(key,\n value)`` tuple).\n\nKeys views are set-like since their entries are unique and hashable.\nIf all values are hashable, so that ``(key, value)`` pairs are unique\nand hashable, then the items view is also set-like. (Values views are\nnot treated as set-like since the entries are generally not unique.)\nFor set-like views, all of the operations defined for the abstract\nbase class ``collections.abc.Set`` are available (for example, ``==``,\n``<``, or ``^``).\n\nAn example of dictionary view usage:\n\n >>> dishes = {\'eggs\': 2, \'sausage\': 1, \'bacon\': 1, \'spam\': 500}\n >>> keys = dishes.keys()\n >>> values = dishes.values()\n\n >>> # iteration\n >>> n = 0\n >>> for val in values:\n ... 
n += val\n >>> print(n)\n 504\n\n >>> # keys and values are iterated over in the same order\n >>> list(keys)\n [\'eggs\', \'bacon\', \'sausage\', \'spam\']\n >>> list(values)\n [2, 1, 1, 500]\n\n >>> # view objects are dynamic and reflect dict changes\n >>> del dishes[\'eggs\']\n >>> del dishes[\'sausage\']\n >>> list(keys)\n [\'spam\', \'bacon\']\n\n >>> # set operations\n >>> keys & {\'eggs\', \'bacon\', \'salad\'}\n {\'bacon\'}\n >>> keys ^ {\'sausage\', \'juice\'}\n {\'juice\', \'sausage\', \'bacon\', \'spam\'}\n', 'typesmethods': '\nMethods\n*******\n\nMethods are functions that are called using the attribute notation.\nThere are two flavors: built-in methods (such as ``append()`` on\nlists) and class instance methods. Built-in methods are described\nwith the types that support them.\n\nIf you access a method (a function defined in a class namespace)\nthrough an instance, you get a special object: a *bound method* (also\ncalled *instance method*) object. When called, it will add the\n``self`` argument to the argument list. Bound methods have two\nspecial read-only attributes: ``m.__self__`` is the object on which\nthe method operates, and ``m.__func__`` is the function implementing\nthe method. Calling ``m(arg-1, arg-2, ..., arg-n)`` is completely\nequivalent to calling ``m.__func__(m.__self__, arg-1, arg-2, ...,\narg-n)``.\n\nLike function objects, bound method objects support getting arbitrary\nattributes. However, since method attributes are actually stored on\nthe underlying function object (``meth.__func__``), setting method\nattributes on bound methods is disallowed. Attempting to set an\nattribute on a method results in an ``AttributeError`` being raised.\nIn order to set a method attribute, you need to explicitly set it on\nthe underlying function object:\n\n >>> class C:\n ... def method(self):\n ... pass\n ...\n >>> c = C()\n >>> c.method.whoami = \'my name is method\' # can\'t set on the method\n Traceback (most recent call last):\n File "", line 1, in \n AttributeError: \'method\' object has no attribute \'whoami\'\n >>> c.method.__func__.whoami = \'my name is method\'\n >>> c.method.whoami\n \'my name is method\'\n\nSee *The standard type hierarchy* for more information.\n', 'typesmodules': "\nModules\n*******\n\nThe only special operation on a module is attribute access:\n``m.name``, where *m* is a module and *name* accesses a name defined\nin *m*'s symbol table. Module attributes can be assigned to. (Note\nthat the ``import`` statement is not, strictly speaking, an operation\non a module object; ``import foo`` does not require a module object\nnamed *foo* to exist, rather it requires an (external) *definition*\nfor a module named *foo* somewhere.)\n\nA special attribute of every module is ``__dict__``. This is the\ndictionary containing the module's symbol table. Modifying this\ndictionary will actually change the module's symbol table, but direct\nassignment to the ``__dict__`` attribute is not possible (you can\nwrite ``m.__dict__['a'] = 1``, which defines ``m.a`` to be ``1``, but\nyou can't write ``m.__dict__ = {}``). Modifying ``__dict__`` directly\nis not recommended.\n\nModules built into the interpreter are written like this: ````. If loaded from a file, they are written as\n````.\n", - 'typesseq': '\nSequence Types --- ``list``, ``tuple``, ``range``\n*************************************************\n\nThere are three basic sequence types: lists, tuples, and range\nobjects. 
Additional sequence types tailored for processing of *binary\ndata* and *text strings* are described in dedicated sections.\n\n\nCommon Sequence Operations\n==========================\n\nThe operations in the following table are supported by most sequence\ntypes, both mutable and immutable. The ``collections.abc.Sequence``\nABC is provided to make it easier to correctly implement these\noperations on custom sequence types.\n\nThis table lists the sequence operations sorted in ascending priority\n(operations in the same box have the same priority). In the table,\n*s* and *t* are sequences of the same type, *n*, *i*, *j* and *k* are\nintegers and *x* is an arbitrary object that meets any type and value\nrestrictions imposed by *s*.\n\nThe ``in`` and ``not in`` operations have the same priorities as the\ncomparison operations. The ``+`` (concatenation) and ``*``\n(repetition) operations have the same priority as the corresponding\nnumeric operations.\n\n+----------------------------+----------------------------------+------------+\n| Operation | Result | Notes |\n+============================+==================================+============+\n| ``x in s`` | ``True`` if an item of *s* is | (1) |\n| | equal to *x*, else ``False`` | |\n+----------------------------+----------------------------------+------------+\n| ``x not in s`` | ``False`` if an item of *s* is | (1) |\n| | equal to *x*, else ``True`` | |\n+----------------------------+----------------------------------+------------+\n| ``s + t`` | the concatenation of *s* and *t* | (6)(7) |\n+----------------------------+----------------------------------+------------+\n| ``s * n`` or ``n * s`` | *n* shallow copies of *s* | (2)(7) |\n| | concatenated | |\n+----------------------------+----------------------------------+------------+\n| ``s[i]`` | *i*th item of *s*, origin 0 | (3) |\n+----------------------------+----------------------------------+------------+\n| ``s[i:j]`` | slice of *s* from *i* to *j* | (3)(4) |\n+----------------------------+----------------------------------+------------+\n| ``s[i:j:k]`` | slice of *s* from *i* to *j* | (3)(5) |\n| | with step *k* | |\n+----------------------------+----------------------------------+------------+\n| ``len(s)`` | length of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``min(s)`` | smallest item of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``max(s)`` | largest item of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``s.index(x[, i[, j]])`` | index of the first occurrence of | (8) |\n| | *x* in *s* (at or after index | |\n| | *i* and before index *j*) | |\n+----------------------------+----------------------------------+------------+\n| ``s.count(x)`` | total number of occurrences of | |\n| | *x* in *s* | |\n+----------------------------+----------------------------------+------------+\n\nSequences of the same type also support comparisons. In particular,\ntuples and lists are compared lexicographically by comparing\ncorresponding elements. This means that to compare equal, every\nelement must compare equal and the two sequences must be of the same\ntype and have the same length. (For full details see *Comparisons* in\nthe language reference.)\n\nNotes:\n\n1. 
While the ``in`` and ``not in`` operations are used only for simple\n containment testing in the general case, some specialised sequences\n (such as ``str``, ``bytes`` and ``bytearray``) also use them for\n subsequence testing:\n\n >>> "gg" in "eggs"\n True\n\n2. Values of *n* less than ``0`` are treated as ``0`` (which yields an\n empty sequence of the same type as *s*). Note also that the copies\n are shallow; nested structures are not copied. This often haunts\n new Python programmers; consider:\n\n >>> lists = [[]] * 3\n >>> lists\n [[], [], []]\n >>> lists[0].append(3)\n >>> lists\n [[3], [3], [3]]\n\n What has happened is that ``[[]]`` is a one-element list containing\n an empty list, so all three elements of ``[[]] * 3`` are (pointers\n to) this single empty list. Modifying any of the elements of\n ``lists`` modifies this single list. You can create a list of\n different lists this way:\n\n >>> lists = [[] for i in range(3)]\n >>> lists[0].append(3)\n >>> lists[1].append(5)\n >>> lists[2].append(7)\n >>> lists\n [[3], [5], [7]]\n\n3. If *i* or *j* is negative, the index is relative to the end of the\n string: ``len(s) + i`` or ``len(s) + j`` is substituted. But note\n that ``-0`` is still ``0``.\n\n4. The slice of *s* from *i* to *j* is defined as the sequence of\n items with index *k* such that ``i <= k < j``. If *i* or *j* is\n greater than ``len(s)``, use ``len(s)``. If *i* is omitted or\n ``None``, use ``0``. If *j* is omitted or ``None``, use\n ``len(s)``. If *i* is greater than or equal to *j*, the slice is\n empty.\n\n5. The slice of *s* from *i* to *j* with step *k* is defined as the\n sequence of items with index ``x = i + n*k`` such that ``0 <= n <\n (j-i)/k``. In other words, the indices are ``i``, ``i+k``,\n ``i+2*k``, ``i+3*k`` and so on, stopping when *j* is reached (but\n never including *j*). If *i* or *j* is greater than ``len(s)``,\n use ``len(s)``. If *i* or *j* are omitted or ``None``, they become\n "end" values (which end depends on the sign of *k*). Note, *k*\n cannot be zero. If *k* is ``None``, it is treated like ``1``.\n\n6. Concatenating immutable sequences always results in a new object.\n This means that building up a sequence by repeated concatenation\n will have a quadratic runtime cost in the total sequence length.\n To get a linear runtime cost, you must switch to one of the\n alternatives below:\n\n * if concatenating ``str`` objects, you can build a list and use\n ``str.join()`` at the end or else write to a ``io.StringIO``\n instance and retrieve its value when complete\n\n * if concatenating ``bytes`` objects, you can similarly use\n ``bytes.join()`` or ``io.BytesIO``, or you can do in-place\n concatenation with a ``bytearray`` object. ``bytearray`` objects\n are mutable and have an efficient overallocation mechanism\n\n * if concatenating ``tuple`` objects, extend a ``list`` instead\n\n * for other types, investigate the relevant class documentation\n\n7. Some sequence types (such as ``range``) only support item sequences\n that follow specific patterns, and hence don\'t support sequence\n concatenation or repetition.\n\n8. ``index`` raises ``ValueError`` when *x* is not found in *s*. When\n supported, the additional arguments to the index method allow\n efficient searching of subsections of the sequence. 
Passing the\n extra arguments is roughly equivalent to using ``s[i:j].index(x)``,\n only without copying any data and with the returned index being\n relative to the start of the sequence rather than the start of the\n slice.\n\n\nImmutable Sequence Types\n========================\n\nThe only operation that immutable sequence types generally implement\nthat is not also implemented by mutable sequence types is support for\nthe ``hash()`` built-in.\n\nThis support allows immutable sequences, such as ``tuple`` instances,\nto be used as ``dict`` keys and stored in ``set`` and ``frozenset``\ninstances.\n\nAttempting to hash an immutable sequence that contains unhashable\nvalues will result in ``TypeError``.\n\n\nMutable Sequence Types\n======================\n\nThe operations in the following table are defined on mutable sequence\ntypes. The ``collections.abc.MutableSequence`` ABC is provided to make\nit easier to correctly implement these operations on custom sequence\ntypes.\n\nIn the table *s* is an instance of a mutable sequence type, *t* is any\niterable object and *x* is an arbitrary object that meets any type and\nvalue restrictions imposed by *s* (for example, ``bytearray`` only\naccepts integers that meet the value restriction ``0 <= x <= 255``).\n\n+--------------------------------+----------------------------------+-----------------------+\n| Operation | Result | Notes |\n+================================+==================================+=======================+\n| ``s[i] = x`` | item *i* of *s* is replaced by | |\n| | *x* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j] = t`` | slice of *s* from *i* to *j* is | |\n| | replaced by the contents of the | |\n| | iterable *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j]`` | same as ``s[i:j] = []`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j:k] = t`` | the elements of ``s[i:j:k]`` are | (1) |\n| | replaced by those of *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j:k]`` | removes the elements of | |\n| | ``s[i:j:k]`` from the list | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.append(x)`` | appends *x* to the end of the | |\n| | sequence (same as | |\n| | ``s[len(s):len(s)] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.clear()`` | removes all items from ``s`` | (5) |\n| | (same as ``del s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.copy()`` | creates a shallow copy of ``s`` | (5) |\n| | (same as ``s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.extend(t)`` | extends *s* with the contents of | |\n| | *t* (same as ``s[len(s):len(s)] | |\n| | = t``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.insert(i, x)`` | inserts *x* into *s* at the | |\n| | index given by *i* (same as | |\n| | ``s[i:i] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.pop([i])`` | retrieves the item at *i* and | (2) |\n| | also removes it from *s* | 
|\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.remove(x)`` | remove the first item from *s* | (3) |\n| | where ``s[i] == x`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.reverse()`` | reverses the items of *s* in | (4) |\n| | place | |\n+--------------------------------+----------------------------------+-----------------------+\n\nNotes:\n\n1. *t* must have the same length as the slice it is replacing.\n\n2. The optional argument *i* defaults to ``-1``, so that by default\n the last item is removed and returned.\n\n3. ``remove`` raises ``ValueError`` when *x* is not found in *s*.\n\n4. The ``reverse()`` method modifies the sequence in place for economy\n of space when reversing a large sequence. To remind users that it\n operates by side effect, it does not return the reversed sequence.\n\n5. ``clear()`` and ``copy()`` are included for consistency with the\n interfaces of mutable containers that don\'t support slicing\n operations (such as ``dict`` and ``set``)\n\n New in version 3.3: ``clear()`` and ``copy()`` methods.\n\n\nLists\n=====\n\nLists are mutable sequences, typically used to store collections of\nhomogeneous items (where the precise degree of similarity will vary by\napplication).\n\nclass class list([iterable])\n\n Lists may be constructed in several ways:\n\n * Using a pair of square brackets to denote the empty list: ``[]``\n\n * Using square brackets, separating items with commas: ``[a]``,\n ``[a, b, c]``\n\n * Using a list comprehension: ``[x for x in iterable]``\n\n * Using the type constructor: ``list()`` or ``list(iterable)``\n\n The constructor builds a list whose items are the same and in the\n same order as *iterable*\'s items. *iterable* may be either a\n sequence, a container that supports iteration, or an iterator\n object. If *iterable* is already a list, a copy is made and\n returned, similar to ``iterable[:]``. For example, ``list(\'abc\')``\n returns ``[\'a\', \'b\', \'c\']`` and ``list( (1, 2, 3) )`` returns ``[1,\n 2, 3]``. If no argument is given, the constructor creates a new\n empty list, ``[]``.\n\n Many other operations also produce lists, including the\n ``sorted()`` built-in.\n\n Lists implement all of the *common* and *mutable* sequence\n operations. Lists also provide the following additional method:\n\n sort(*, key=None, reverse=None)\n\n This method sorts the list in place, using only ``<``\n comparisons between items. Exceptions are not suppressed - if\n any comparison operations fail, the entire sort operation will\n fail (and the list will likely be left in a partially modified\n state).\n\n *key* specifies a function of one argument that is used to\n extract a comparison key from each list element (for example,\n ``key=str.lower``). The key corresponding to each item in the\n list is calculated once and then used for the entire sorting\n process. The default value of ``None`` means that list items are\n sorted directly without calculating a separate key value.\n\n The ``functools.cmp_to_key()`` utility is available to convert a\n 2.x style *cmp* function to a *key* function.\n\n *reverse* is a boolean value. If set to ``True``, then the list\n elements are sorted as if each comparison were reversed.\n\n This method modifies the sequence in place for economy of space\n when sorting a large sequence. 
To remind users that it operates\n by side effect, it does not return the sorted sequence (use\n ``sorted()`` to explicitly request a new sorted list instance).\n\n The ``sort()`` method is guaranteed to be stable. A sort is\n stable if it guarantees not to change the relative order of\n elements that compare equal --- this is helpful for sorting in\n multiple passes (for example, sort by department, then by salary\n grade).\n\n **CPython implementation detail:** While a list is being sorted,\n the effect of attempting to mutate, or even inspect, the list is\n undefined. The C implementation of Python makes the list appear\n empty for the duration, and raises ``ValueError`` if it can\n detect that the list has been mutated during a sort.\n\n\nTuples\n======\n\nTuples are immutable sequences, typically used to store collections of\nheterogeneous data (such as the 2-tuples produced by the\n``enumerate()`` built-in). Tuples are also used for cases where an\nimmutable sequence of homogeneous data is needed (such as allowing\nstorage in a ``set`` or ``dict`` instance).\n\nclass class tuple([iterable])\n\n Tuples may be constructed in a number of ways:\n\n * Using a pair of parentheses to denote the empty tuple: ``()``\n\n * Using a trailing comma for a singleton tuple: ``a,`` or ``(a,)``\n\n * Separating items with commas: ``a, b, c`` or ``(a, b, c)``\n\n * Using the ``tuple()`` built-in: ``tuple()`` or\n ``tuple(iterable)``\n\n The constructor builds a tuple whose items are the same and in the\n same order as *iterable*\'s items. *iterable* may be either a\n sequence, a container that supports iteration, or an iterator\n object. If *iterable* is already a tuple, it is returned\n unchanged. For example, ``tuple(\'abc\')`` returns ``(\'a\', \'b\',\n \'c\')`` and ``tuple( [1, 2, 3] )`` returns ``(1, 2, 3)``. If no\n argument is given, the constructor creates a new empty tuple,\n ``()``.\n\n Note that it is actually the comma which makes a tuple, not the\n parentheses. The parentheses are optional, except in the empty\n tuple case, or when they are needed to avoid syntactic ambiguity.\n For example, ``f(a, b, c)`` is a function call with three\n arguments, while ``f((a, b, c))`` is a function call with a 3-tuple\n as the sole argument.\n\n Tuples implement all of the *common* sequence operations.\n\nFor heterogeneous collections of data where access by name is clearer\nthan access by index, ``collections.namedtuple()`` may be a more\nappropriate choice than a simple tuple object.\n\n\nRanges\n======\n\nThe ``range`` type represents an immutable sequence of numbers and is\ncommonly used for looping a specific number of times in ``for`` loops.\n\nclass class range(stop)\nclass class range(start, stop[, step])\n\n The arguments to the range constructor must be integers (either\n built-in ``int`` or any object that implements the ``__index__``\n special method). If the *step* argument is omitted, it defaults to\n ``1``. If the *start* argument is omitted, it defaults to ``0``. If\n *step* is zero, ``ValueError`` is raised.\n\n For a positive *step*, the contents of a range ``r`` are determined\n by the formula ``r[i] = start + step*i`` where ``i >= 0`` and\n ``r[i] < stop``.\n\n For a negative *step*, the contents of the range are still\n determined by the formula ``r[i] = start + step*i``, but the\n constraints are ``i >= 0`` and ``r[i] > stop``.\n\n A range object will be empty if ``r[0]`` does not meet the value\n constraint. 
Ranges do support negative indices, but these are\n interpreted as indexing from the end of the sequence determined by\n the positive indices.\n\n Ranges containing absolute values larger than ``sys.maxsize`` are\n permitted but some features (such as ``len()``) may raise\n ``OverflowError``.\n\n Range examples:\n\n >>> list(range(10))\n [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\n >>> list(range(1, 11))\n [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]\n >>> list(range(0, 30, 5))\n [0, 5, 10, 15, 20, 25]\n >>> list(range(0, 10, 3))\n [0, 3, 6, 9]\n >>> list(range(0, -10, -1))\n [0, -1, -2, -3, -4, -5, -6, -7, -8, -9]\n >>> list(range(0))\n []\n >>> list(range(1, 0))\n []\n\n Ranges implement all of the *common* sequence operations except\n concatenation and repetition (due to the fact that range objects\n can only represent sequences that follow a strict pattern and\n repetition and concatenation will usually violate that pattern).\n\nThe advantage of the ``range`` type over a regular ``list`` or\n``tuple`` is that a ``range`` object will always take the same (small)\namount of memory, no matter the size of the range it represents (as it\nonly stores the ``start``, ``stop`` and ``step`` values, calculating\nindividual items and subranges as needed).\n\nRange objects implement the ``collections.abc.Sequence`` ABC, and\nprovide features such as containment tests, element index lookup,\nslicing and support for negative indices (see *Sequence Types ---\nlist, tuple, range*):\n\n>>> r = range(0, 20, 2)\n>>> r\nrange(0, 20, 2)\n>>> 11 in r\nFalse\n>>> 10 in r\nTrue\n>>> r.index(10)\n5\n>>> r[5]\n10\n>>> r[:5]\nrange(0, 10, 2)\n>>> r[-1]\n18\n\nTesting range objects for equality with ``==`` and ``!=`` compares\nthem as sequences. That is, two range objects are considered equal if\nthey represent the same sequence of values. (Note that two range\nobjects that compare equal might have different ``start``, ``stop``\nand ``step`` attributes, for example ``range(0) == range(2, 1, 3)`` or\n``range(0, 3, 2) == range(0, 4, 2)``.)\n\nChanged in version 3.2: Implement the Sequence ABC. Support slicing\nand negative indices. Test ``int`` objects for membership in constant\ntime instead of iterating through all items.\n\nChanged in version 3.3: Define \'==\' and \'!=\' to compare range objects\nbased on the sequence of values they define (instead of comparing\nbased on object identity).\n\nNew in version 3.3: The ``start``, ``stop`` and ``step`` attributes.\n', + 'typesseq': '\nSequence Types --- ``list``, ``tuple``, ``range``\n*************************************************\n\nThere are three basic sequence types: lists, tuples, and range\nobjects. Additional sequence types tailored for processing of *binary\ndata* and *text strings* are described in dedicated sections.\n\n\nCommon Sequence Operations\n==========================\n\nThe operations in the following table are supported by most sequence\ntypes, both mutable and immutable. The ``collections.abc.Sequence``\nABC is provided to make it easier to correctly implement these\noperations on custom sequence types.\n\nThis table lists the sequence operations sorted in ascending priority\n(operations in the same box have the same priority). In the table,\n*s* and *t* are sequences of the same type, *n*, *i*, *j* and *k* are\nintegers and *x* is an arbitrary object that meets any type and value\nrestrictions imposed by *s*.\n\nThe ``in`` and ``not in`` operations have the same priorities as the\ncomparison operations. 
The ``+`` (concatenation) and ``*``\n(repetition) operations have the same priority as the corresponding\nnumeric operations.\n\n+----------------------------+----------------------------------+------------+\n| Operation | Result | Notes |\n+============================+==================================+============+\n| ``x in s`` | ``True`` if an item of *s* is | (1) |\n| | equal to *x*, else ``False`` | |\n+----------------------------+----------------------------------+------------+\n| ``x not in s`` | ``False`` if an item of *s* is | (1) |\n| | equal to *x*, else ``True`` | |\n+----------------------------+----------------------------------+------------+\n| ``s + t`` | the concatenation of *s* and *t* | (6)(7) |\n+----------------------------+----------------------------------+------------+\n| ``s * n`` or ``n * s`` | *n* shallow copies of *s* | (2)(7) |\n| | concatenated | |\n+----------------------------+----------------------------------+------------+\n| ``s[i]`` | *i*th item of *s*, origin 0 | (3) |\n+----------------------------+----------------------------------+------------+\n| ``s[i:j]`` | slice of *s* from *i* to *j* | (3)(4) |\n+----------------------------+----------------------------------+------------+\n| ``s[i:j:k]`` | slice of *s* from *i* to *j* | (3)(5) |\n| | with step *k* | |\n+----------------------------+----------------------------------+------------+\n| ``len(s)`` | length of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``min(s)`` | smallest item of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``max(s)`` | largest item of *s* | |\n+----------------------------+----------------------------------+------------+\n| ``s.index(x[, i[, j]])`` | index of the first occurrence of | (8) |\n| | *x* in *s* (at or after index | |\n| | *i* and before index *j*) | |\n+----------------------------+----------------------------------+------------+\n| ``s.count(x)`` | total number of occurrences of | |\n| | *x* in *s* | |\n+----------------------------+----------------------------------+------------+\n\nSequences of the same type also support comparisons. In particular,\ntuples and lists are compared lexicographically by comparing\ncorresponding elements. This means that to compare equal, every\nelement must compare equal and the two sequences must be of the same\ntype and have the same length. (For full details see *Comparisons* in\nthe language reference.)\n\nNotes:\n\n1. While the ``in`` and ``not in`` operations are used only for simple\n containment testing in the general case, some specialised sequences\n (such as ``str``, ``bytes`` and ``bytearray``) also use them for\n subsequence testing:\n\n >>> "gg" in "eggs"\n True\n\n2. Values of *n* less than ``0`` are treated as ``0`` (which yields an\n empty sequence of the same type as *s*). Note also that the copies\n are shallow; nested structures are not copied. This often haunts\n new Python programmers; consider:\n\n >>> lists = [[]] * 3\n >>> lists\n [[], [], []]\n >>> lists[0].append(3)\n >>> lists\n [[3], [3], [3]]\n\n What has happened is that ``[[]]`` is a one-element list containing\n an empty list, so all three elements of ``[[]] * 3`` are (pointers\n to) this single empty list. Modifying any of the elements of\n ``lists`` modifies this single list. 
You can create a list of\n different lists this way:\n\n >>> lists = [[] for i in range(3)]\n >>> lists[0].append(3)\n >>> lists[1].append(5)\n >>> lists[2].append(7)\n >>> lists\n [[3], [5], [7]]\n\n3. If *i* or *j* is negative, the index is relative to the end of the\n string: ``len(s) + i`` or ``len(s) + j`` is substituted. But note\n that ``-0`` is still ``0``.\n\n4. The slice of *s* from *i* to *j* is defined as the sequence of\n items with index *k* such that ``i <= k < j``. If *i* or *j* is\n greater than ``len(s)``, use ``len(s)``. If *i* is omitted or\n ``None``, use ``0``. If *j* is omitted or ``None``, use\n ``len(s)``. If *i* is greater than or equal to *j*, the slice is\n empty.\n\n5. The slice of *s* from *i* to *j* with step *k* is defined as the\n sequence of items with index ``x = i + n*k`` such that ``0 <= n <\n (j-i)/k``. In other words, the indices are ``i``, ``i+k``,\n ``i+2*k``, ``i+3*k`` and so on, stopping when *j* is reached (but\n never including *j*). If *i* or *j* is greater than ``len(s)``,\n use ``len(s)``. If *i* or *j* are omitted or ``None``, they become\n "end" values (which end depends on the sign of *k*). Note, *k*\n cannot be zero. If *k* is ``None``, it is treated like ``1``.\n\n6. Concatenating immutable sequences always results in a new object.\n This means that building up a sequence by repeated concatenation\n will have a quadratic runtime cost in the total sequence length.\n To get a linear runtime cost, you must switch to one of the\n alternatives below:\n\n * if concatenating ``str`` objects, you can build a list and use\n ``str.join()`` at the end or else write to a ``io.StringIO``\n instance and retrieve its value when complete\n\n * if concatenating ``bytes`` objects, you can similarly use\n ``bytes.join()`` or ``io.BytesIO``, or you can do in-place\n concatenation with a ``bytearray`` object. ``bytearray`` objects\n are mutable and have an efficient overallocation mechanism\n\n * if concatenating ``tuple`` objects, extend a ``list`` instead\n\n * for other types, investigate the relevant class documentation\n\n7. Some sequence types (such as ``range``) only support item sequences\n that follow specific patterns, and hence don\'t support sequence\n concatenation or repetition.\n\n8. ``index`` raises ``ValueError`` when *x* is not found in *s*. When\n supported, the additional arguments to the index method allow\n efficient searching of subsections of the sequence. Passing the\n extra arguments is roughly equivalent to using ``s[i:j].index(x)``,\n only without copying any data and with the returned index being\n relative to the start of the sequence rather than the start of the\n slice.\n\n\nImmutable Sequence Types\n========================\n\nThe only operation that immutable sequence types generally implement\nthat is not also implemented by mutable sequence types is support for\nthe ``hash()`` built-in.\n\nThis support allows immutable sequences, such as ``tuple`` instances,\nto be used as ``dict`` keys and stored in ``set`` and ``frozenset``\ninstances.\n\nAttempting to hash an immutable sequence that contains unhashable\nvalues will result in ``TypeError``.\n\n\nMutable Sequence Types\n======================\n\nThe operations in the following table are defined on mutable sequence\ntypes. 
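A small sketch of the hash() support for immutable sequences described above (the dictionary is illustrative):

    >>> d = {(1, 2): 'point'}       # a tuple is hashable, so it can be a key
    >>> d[(1, 2)]
    'point'
    >>> hash((1, [2]))              # an unhashable value inside the tuple
    Traceback (most recent call last):
      ...
    TypeError: unhashable type: 'list'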
The ``collections.abc.MutableSequence`` ABC is provided to make\nit easier to correctly implement these operations on custom sequence\ntypes.\n\nIn the table *s* is an instance of a mutable sequence type, *t* is any\niterable object and *x* is an arbitrary object that meets any type and\nvalue restrictions imposed by *s* (for example, ``bytearray`` only\naccepts integers that meet the value restriction ``0 <= x <= 255``).\n\n+--------------------------------+----------------------------------+-----------------------+\n| Operation | Result | Notes |\n+================================+==================================+=======================+\n| ``s[i] = x`` | item *i* of *s* is replaced by | |\n| | *x* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j] = t`` | slice of *s* from *i* to *j* is | |\n| | replaced by the contents of the | |\n| | iterable *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j]`` | same as ``s[i:j] = []`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j:k] = t`` | the elements of ``s[i:j:k]`` are | (1) |\n| | replaced by those of *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j:k]`` | removes the elements of | |\n| | ``s[i:j:k]`` from the list | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.append(x)`` | appends *x* to the end of the | |\n| | sequence (same as | |\n| | ``s[len(s):len(s)] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.clear()`` | removes all items from ``s`` | (5) |\n| | (same as ``del s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.copy()`` | creates a shallow copy of ``s`` | (5) |\n| | (same as ``s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.extend(t)`` | extends *s* with the contents of | |\n| | *t* (same as ``s[len(s):len(s)] | |\n| | = t``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.insert(i, x)`` | inserts *x* into *s* at the | |\n| | index given by *i* (same as | |\n| | ``s[i:i] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.pop([i])`` | retrieves the item at *i* and | (2) |\n| | also removes it from *s* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.remove(x)`` | remove the first item from *s* | (3) |\n| | where ``s[i] == x`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.reverse()`` | reverses the items of *s* in | (4) |\n| | place | |\n+--------------------------------+----------------------------------+-----------------------+\n\nNotes:\n\n1. *t* must have the same length as the slice it is replacing.\n\n2. The optional argument *i* defaults to ``-1``, so that by default\n the last item is removed and returned.\n\n3. ``remove`` raises ``ValueError`` when *x* is not found in *s*.\n\n4. The ``reverse()`` method modifies the sequence in place for economy\n of space when reversing a large sequence. 
To remind users that it\n operates by side effect, it does not return the reversed sequence.\n\n5. ``clear()`` and ``copy()`` are included for consistency with the\n interfaces of mutable containers that don\'t support slicing\n operations (such as ``dict`` and ``set``)\n\n New in version 3.3: ``clear()`` and ``copy()`` methods.\n\n\nLists\n=====\n\nLists are mutable sequences, typically used to store collections of\nhomogeneous items (where the precise degree of similarity will vary by\napplication).\n\nclass class list([iterable])\n\n Lists may be constructed in several ways:\n\n * Using a pair of square brackets to denote the empty list: ``[]``\n\n * Using square brackets, separating items with commas: ``[a]``,\n ``[a, b, c]``\n\n * Using a list comprehension: ``[x for x in iterable]``\n\n * Using the type constructor: ``list()`` or ``list(iterable)``\n\n The constructor builds a list whose items are the same and in the\n same order as *iterable*\'s items. *iterable* may be either a\n sequence, a container that supports iteration, or an iterator\n object. If *iterable* is already a list, a copy is made and\n returned, similar to ``iterable[:]``. For example, ``list(\'abc\')``\n returns ``[\'a\', \'b\', \'c\']`` and ``list( (1, 2, 3) )`` returns ``[1,\n 2, 3]``. If no argument is given, the constructor creates a new\n empty list, ``[]``.\n\n Many other operations also produce lists, including the\n ``sorted()`` built-in.\n\n Lists implement all of the *common* and *mutable* sequence\n operations. Lists also provide the following additional method:\n\n sort(*, key=None, reverse=None)\n\n This method sorts the list in place, using only ``<``\n comparisons between items. Exceptions are not suppressed - if\n any comparison operations fail, the entire sort operation will\n fail (and the list will likely be left in a partially modified\n state).\n\n ``sort()`` accepts two arguments that can only be passed by\n keyword (*keyword-only arguments*):\n\n *key* specifies a function of one argument that is used to\n extract a comparison key from each list element (for example,\n ``key=str.lower``). The key corresponding to each item in the\n list is calculated once and then used for the entire sorting\n process. The default value of ``None`` means that list items are\n sorted directly without calculating a separate key value.\n\n The ``functools.cmp_to_key()`` utility is available to convert a\n 2.x style *cmp* function to a *key* function.\n\n *reverse* is a boolean value. If set to ``True``, then the list\n elements are sorted as if each comparison were reversed.\n\n This method modifies the sequence in place for economy of space\n when sorting a large sequence. To remind users that it operates\n by side effect, it does not return the sorted sequence (use\n ``sorted()`` to explicitly request a new sorted list instance).\n\n The ``sort()`` method is guaranteed to be stable. A sort is\n stable if it guarantees not to change the relative order of\n elements that compare equal --- this is helpful for sorting in\n multiple passes (for example, sort by department, then by salary\n grade).\n\n **CPython implementation detail:** While a list is being sorted,\n the effect of attempting to mutate, or even inspect, the list is\n undefined. 
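A minimal sketch of sort() with the keyword-only key and reverse arguments described above (the word list is arbitrary):

    >>> words = ['banana', 'Apple', 'cherry']
    >>> words.sort(key=str.lower)
    >>> words
    ['Apple', 'banana', 'cherry']
    >>> words.sort(key=len, reverse=True)   # stable, so equal lengths keep their order
    >>> words
    ['banana', 'cherry', 'Apple']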
The C implementation of Python makes the list appear\n empty for the duration, and raises ``ValueError`` if it can\n detect that the list has been mutated during a sort.\n\n\nTuples\n======\n\nTuples are immutable sequences, typically used to store collections of\nheterogeneous data (such as the 2-tuples produced by the\n``enumerate()`` built-in). Tuples are also used for cases where an\nimmutable sequence of homogeneous data is needed (such as allowing\nstorage in a ``set`` or ``dict`` instance).\n\nclass class tuple([iterable])\n\n Tuples may be constructed in a number of ways:\n\n * Using a pair of parentheses to denote the empty tuple: ``()``\n\n * Using a trailing comma for a singleton tuple: ``a,`` or ``(a,)``\n\n * Separating items with commas: ``a, b, c`` or ``(a, b, c)``\n\n * Using the ``tuple()`` built-in: ``tuple()`` or\n ``tuple(iterable)``\n\n The constructor builds a tuple whose items are the same and in the\n same order as *iterable*\'s items. *iterable* may be either a\n sequence, a container that supports iteration, or an iterator\n object. If *iterable* is already a tuple, it is returned\n unchanged. For example, ``tuple(\'abc\')`` returns ``(\'a\', \'b\',\n \'c\')`` and ``tuple( [1, 2, 3] )`` returns ``(1, 2, 3)``. If no\n argument is given, the constructor creates a new empty tuple,\n ``()``.\n\n Note that it is actually the comma which makes a tuple, not the\n parentheses. The parentheses are optional, except in the empty\n tuple case, or when they are needed to avoid syntactic ambiguity.\n For example, ``f(a, b, c)`` is a function call with three\n arguments, while ``f((a, b, c))`` is a function call with a 3-tuple\n as the sole argument.\n\n Tuples implement all of the *common* sequence operations.\n\nFor heterogeneous collections of data where access by name is clearer\nthan access by index, ``collections.namedtuple()`` may be a more\nappropriate choice than a simple tuple object.\n\n\nRanges\n======\n\nThe ``range`` type represents an immutable sequence of numbers and is\ncommonly used for looping a specific number of times in ``for`` loops.\n\nclass class range(stop)\nclass class range(start, stop[, step])\n\n The arguments to the range constructor must be integers (either\n built-in ``int`` or any object that implements the ``__index__``\n special method). If the *step* argument is omitted, it defaults to\n ``1``. If the *start* argument is omitted, it defaults to ``0``. If\n *step* is zero, ``ValueError`` is raised.\n\n For a positive *step*, the contents of a range ``r`` are determined\n by the formula ``r[i] = start + step*i`` where ``i >= 0`` and\n ``r[i] < stop``.\n\n For a negative *step*, the contents of the range are still\n determined by the formula ``r[i] = start + step*i``, but the\n constraints are ``i >= 0`` and ``r[i] > stop``.\n\n A range object will be empty if ``r[0]`` does not meet the value\n constraint. 
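A brief illustration of the range formula given above for a negative step (values chosen arbitrarily):

    >>> list(range(10, 0, -3))      # r[i] = 10 + (-3)*i while r[i] > 0
    [10, 7, 4, 1]
    >>> list(range(5, 5, -1))       # empty: r[0] does not satisfy r[0] > stop
    []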
Ranges do support negative indices, but these are\n interpreted as indexing from the end of the sequence determined by\n the positive indices.\n\n Ranges containing absolute values larger than ``sys.maxsize`` are\n permitted but some features (such as ``len()``) may raise\n ``OverflowError``.\n\n Range examples:\n\n >>> list(range(10))\n [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\n >>> list(range(1, 11))\n [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]\n >>> list(range(0, 30, 5))\n [0, 5, 10, 15, 20, 25]\n >>> list(range(0, 10, 3))\n [0, 3, 6, 9]\n >>> list(range(0, -10, -1))\n [0, -1, -2, -3, -4, -5, -6, -7, -8, -9]\n >>> list(range(0))\n []\n >>> list(range(1, 0))\n []\n\n Ranges implement all of the *common* sequence operations except\n concatenation and repetition (due to the fact that range objects\n can only represent sequences that follow a strict pattern and\n repetition and concatenation will usually violate that pattern).\n\nThe advantage of the ``range`` type over a regular ``list`` or\n``tuple`` is that a ``range`` object will always take the same (small)\namount of memory, no matter the size of the range it represents (as it\nonly stores the ``start``, ``stop`` and ``step`` values, calculating\nindividual items and subranges as needed).\n\nRange objects implement the ``collections.abc.Sequence`` ABC, and\nprovide features such as containment tests, element index lookup,\nslicing and support for negative indices (see *Sequence Types ---\nlist, tuple, range*):\n\n>>> r = range(0, 20, 2)\n>>> r\nrange(0, 20, 2)\n>>> 11 in r\nFalse\n>>> 10 in r\nTrue\n>>> r.index(10)\n5\n>>> r[5]\n10\n>>> r[:5]\nrange(0, 10, 2)\n>>> r[-1]\n18\n\nTesting range objects for equality with ``==`` and ``!=`` compares\nthem as sequences. That is, two range objects are considered equal if\nthey represent the same sequence of values. (Note that two range\nobjects that compare equal might have different ``start``, ``stop``\nand ``step`` attributes, for example ``range(0) == range(2, 1, 3)`` or\n``range(0, 3, 2) == range(0, 4, 2)``.)\n\nChanged in version 3.2: Implement the Sequence ABC. Support slicing\nand negative indices. Test ``int`` objects for membership in constant\ntime instead of iterating through all items.\n\nChanged in version 3.3: Define \'==\' and \'!=\' to compare range objects\nbased on the sequence of values they define (instead of comparing\nbased on object identity).\n\nNew in version 3.3: The ``start``, ``stop`` and ``step`` attributes.\n', 'typesseq-mutable': "\nMutable Sequence Types\n**********************\n\nThe operations in the following table are defined on mutable sequence\ntypes. 
The ``collections.abc.MutableSequence`` ABC is provided to make\nit easier to correctly implement these operations on custom sequence\ntypes.\n\nIn the table *s* is an instance of a mutable sequence type, *t* is any\niterable object and *x* is an arbitrary object that meets any type and\nvalue restrictions imposed by *s* (for example, ``bytearray`` only\naccepts integers that meet the value restriction ``0 <= x <= 255``).\n\n+--------------------------------+----------------------------------+-----------------------+\n| Operation | Result | Notes |\n+================================+==================================+=======================+\n| ``s[i] = x`` | item *i* of *s* is replaced by | |\n| | *x* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j] = t`` | slice of *s* from *i* to *j* is | |\n| | replaced by the contents of the | |\n| | iterable *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j]`` | same as ``s[i:j] = []`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s[i:j:k] = t`` | the elements of ``s[i:j:k]`` are | (1) |\n| | replaced by those of *t* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``del s[i:j:k]`` | removes the elements of | |\n| | ``s[i:j:k]`` from the list | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.append(x)`` | appends *x* to the end of the | |\n| | sequence (same as | |\n| | ``s[len(s):len(s)] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.clear()`` | removes all items from ``s`` | (5) |\n| | (same as ``del s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.copy()`` | creates a shallow copy of ``s`` | (5) |\n| | (same as ``s[:]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.extend(t)`` | extends *s* with the contents of | |\n| | *t* (same as ``s[len(s):len(s)] | |\n| | = t``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.insert(i, x)`` | inserts *x* into *s* at the | |\n| | index given by *i* (same as | |\n| | ``s[i:i] = [x]``) | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.pop([i])`` | retrieves the item at *i* and | (2) |\n| | also removes it from *s* | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.remove(x)`` | remove the first item from *s* | (3) |\n| | where ``s[i] == x`` | |\n+--------------------------------+----------------------------------+-----------------------+\n| ``s.reverse()`` | reverses the items of *s* in | (4) |\n| | place | |\n+--------------------------------+----------------------------------+-----------------------+\n\nNotes:\n\n1. *t* must have the same length as the slice it is replacing.\n\n2. The optional argument *i* defaults to ``-1``, so that by default\n the last item is removed and returned.\n\n3. ``remove`` raises ``ValueError`` when *x* is not found in *s*.\n\n4. The ``reverse()`` method modifies the sequence in place for economy\n of space when reversing a large sequence. 
To remind users that it\n operates by side effect, it does not return the reversed sequence.\n\n5. ``clear()`` and ``copy()`` are included for consistency with the\n interfaces of mutable containers that don't support slicing\n operations (such as ``dict`` and ``set``)\n\n New in version 3.3: ``clear()`` and ``copy()`` methods.\n", 'unary': '\nUnary arithmetic and bitwise operations\n***************************************\n\nAll unary arithmetic and bitwise operations have the same priority:\n\n u_expr ::= power | "-" u_expr | "+" u_expr | "~" u_expr\n\nThe unary ``-`` (minus) operator yields the negation of its numeric\nargument.\n\nThe unary ``+`` (plus) operator yields its numeric argument unchanged.\n\nThe unary ``~`` (invert) operator yields the bitwise inversion of its\ninteger argument. The bitwise inversion of ``x`` is defined as\n``-(x+1)``. It only applies to integral numbers.\n\nIn all three cases, if the argument does not have the proper type, a\n``TypeError`` exception is raised.\n', 'while': '\nThe ``while`` statement\n***********************\n\nThe ``while`` statement is used for repeated execution as long as an\nexpression is true:\n\n while_stmt ::= "while" expression ":" suite\n ["else" ":" suite]\n\nThis repeatedly tests the expression and, if it is true, executes the\nfirst suite; if the expression is false (which may be the first time\nit is tested) the suite of the ``else`` clause, if present, is\nexecuted and the loop terminates.\n\nA ``break`` statement executed in the first suite terminates the loop\nwithout executing the ``else`` clause\'s suite. A ``continue``\nstatement executed in the first suite skips the rest of the suite and\ngoes back to testing the expression.\n', -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:07:20 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 23:07:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Bump_version_number_to_3?= =?utf-8?b?LjQuMGIxLg==?= Message-ID: <3dSQTh2XhQz7LjY@mail.python.org> http://hg.python.org/cpython/rev/3405dc9a6afa changeset: 87522:3405dc9a6afa tag: v3.4.0b1 user: Larry Hastings date: Sun Nov 24 06:59:35 2013 -0800 summary: Bump version number to 3.4.0b1. files: Include/patchlevel.h | 6 +++--- Lib/distutils/__init__.py | 2 +- Lib/idlelib/idlever.py | 2 +- Misc/NEWS | 2 +- Misc/RPM/python-3.4.spec | 2 +- README | 4 ++-- 6 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -19,11 +19,11 @@ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 4 #define PY_MICRO_VERSION 0 -#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_ALPHA -#define PY_RELEASE_SERIAL 4 +#define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_BETA +#define PY_RELEASE_SERIAL 1 /* Version as a string */ -#define PY_VERSION "3.4.0a4+" +#define PY_VERSION "3.4.0b1" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/distutils/__init__.py b/Lib/distutils/__init__.py --- a/Lib/distutils/__init__.py +++ b/Lib/distutils/__init__.py @@ -13,5 +13,5 @@ # Updated automatically by the Python release process. 
# #--start constants-- -__version__ = "3.4.0a4" +__version__ = "3.4.0b1" #--end constants-- diff --git a/Lib/idlelib/idlever.py b/Lib/idlelib/idlever.py --- a/Lib/idlelib/idlever.py +++ b/Lib/idlelib/idlever.py @@ -1,1 +1,1 @@ -IDLE_VERSION = "3.4.0a4" +IDLE_VERSION = "3.4.0b1" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -5,7 +5,7 @@ What's New in Python 3.4.0 Beta 1? ================================== -Projected release date: 2013-11-24 +Release date: 2013-11-24 Core and Builtins ----------------- diff --git a/Misc/RPM/python-3.4.spec b/Misc/RPM/python-3.4.spec --- a/Misc/RPM/python-3.4.spec +++ b/Misc/RPM/python-3.4.spec @@ -39,7 +39,7 @@ %define name python #--start constants-- -%define version 3.4.0a4 +%define version 3.4.0b1 %define libvers 3.4 #--end constants-- %define release 1pydotorg diff --git a/README b/README --- a/README +++ b/README @@ -1,5 +1,5 @@ -This is Python version 3.4.0 alpha 4 -==================================== +This is Python version 3.4.0 beta 1 +=================================== Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013 Python Software Foundation. All rights reserved. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:07:21 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 23:07:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Added_tag_v3=2E4=2E0b1_for?= =?utf-8?q?_changeset_3405dc9a6afa?= Message-ID: <3dSQTj4MBMz7Lk5@mail.python.org> http://hg.python.org/cpython/rev/b89a2430ad78 changeset: 87523:b89a2430ad78 user: Larry Hastings date: Sun Nov 24 07:01:59 2013 -0800 summary: Added tag v3.4.0b1 for changeset 3405dc9a6afa files: .hgtags | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/.hgtags b/.hgtags --- a/.hgtags +++ b/.hgtags @@ -122,3 +122,4 @@ 9265a2168e2cb2a84785d8717792acc661e6b692 v3.4.0a2 dd9cdf90a5073510877e9dd5112f8e6cf20d5e89 v3.4.0a3 e245b0d7209bb6d0e19316e1e2af1aa9c2139104 v3.4.0a4 +3405dc9a6afaa0a06dd1f6f182ec5c998dce6f5f v3.4.0b1 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:07:23 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 23:07:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merged_3=2E4=2E0b1_release_head_back_into_trunk=2E?= Message-ID: <3dSQTl60LHz7LkH@mail.python.org> http://hg.python.org/cpython/rev/0d2ac94fa389 changeset: 87524:0d2ac94fa389 parent: 87523:b89a2430ad78 parent: 87520:694e2708b4a8 user: Larry Hastings date: Sun Nov 24 14:05:57 2013 -0800 summary: Merged 3.4.0b1 release head back into trunk. 
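The release head being merged here is the v3.4.0b1 bump shown in the two changesets above (Include/patchlevel.h, Lib/distutils/__init__.py, Lib/idlelib/idlever.py, the RPM spec and README, plus the v3.4.0b1 tag). As a minimal sketch only, an editorial illustration rather than part of any checkin, this is roughly how those constants surface at runtime on an interpreter built from that revision; the expected values in the comments follow directly from the diff above:

    import sys
    import platform
    import distutils

    # PY_MAJOR/MINOR/MICRO_VERSION and PY_RELEASE_LEVEL/PY_RELEASE_SERIAL
    # from Include/patchlevel.h are exposed as sys.version_info.
    print(sys.version_info)           # expected: (3, 4, 0, 'beta', 1)

    # PY_VERSION ("3.4.0b1") is what platform.python_version() reports.
    print(platform.python_version())  # expected: '3.4.0b1'

    # The constant updated in Lib/distutils/__init__.py by the same changeset.
    print(distutils.__version__)      # expected: '3.4.0b1'
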
files: Doc/Makefile | 1 + Doc/library/email.contentmanager.rst | 4 +- Doc/library/email.message.rst | 2 +- Doc/library/gettext.rst | 6 +- Doc/library/unittest.rst | 4 +- Doc/tools/sphinxext/susp-ignored.csv | 16 +- Doc/whatsnew/3.4.rst | 2 +- Lib/_strptime.py | 2 +- Lib/asyncio/windows_events.py | 11 +- Lib/decimal.py | 1 + Lib/fileinput.py | 2 +- Lib/test/test_codecs.py | 4 +- Lib/test/test_decimal.py | 1 + Lib/test/test_fileinput.py | 8 +- Lib/test/test_gdb.py | 1 + Lib/test/test_io.py | 3 +- Lib/test/test_strptime.py | 4 + Lib/test/test_time.py | 4 + Lib/test/test_zipfile.py | 29 +- Misc/NEWS | 23 +- Modules/_decimal/_decimal.c | 8 +- Modules/_decimal/libmpdec/mpdecimal.c | 260 +++ Modules/_decimal/libmpdec/mpdecimal.h | 94 +- Modules/_elementtree.c | 21 +- Modules/_pickle.c | 977 +++++++++---- Modules/posixmodule.c | 47 +- setup.py | 2 +- 27 files changed, 1151 insertions(+), 386 deletions(-) diff --git a/Doc/Makefile b/Doc/Makefile --- a/Doc/Makefile +++ b/Doc/Makefile @@ -186,6 +186,7 @@ autobuild-dev: make update make dist SPHINXOPTS='-A daily=1 -A versionswitcher=1' + -make suspicious # for quick rebuilds (HTML only) autobuild-html: diff --git a/Doc/library/email.contentmanager.rst b/Doc/library/email.contentmanager.rst --- a/Doc/library/email.contentmanager.rst +++ b/Doc/library/email.contentmanager.rst @@ -96,7 +96,7 @@ only it when looking for candidate matches. Otherwise consider only the first (default root) part of the ``multipart/related``. - If a part has a :mailheader:``Content-Disposition`` header, only consider + If a part has a :mailheader:`Content-Disposition` header, only consider the part a candidate match if the value of the header is ``inline``. If none of the candidates matches any of the preferences in @@ -134,7 +134,7 @@ Return an iterator over all of the immediate sub-parts of the message, which will be empty for a non-``multipart``. (See also - :meth:``~email.message.walk``.) + :meth:`~email.message.walk`.) .. method:: get_content(*args, content_manager=None, **kw) diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -35,7 +35,7 @@ If *policy* is specified (it must be an instance of a :mod:`~email.policy` class) use the rules it specifies to udpate and serialize the representation - of the message. If *policy* is not set, use the :class`compat32 + of the message. If *policy* is not set, use the :class:`compat32 ` policy, which maintains backward compatibility with the Python 3.2 version of the email package. For more information see the :mod:`~email.policy` documentation. diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -476,9 +476,9 @@ :program:`xgettext`, :program:`pygettext`, and similar tools generate :file:`.po` files that are message catalogs. They are structured -:human-readable files that contain every marked string in the source -:code, along with a placeholder for the translated versions of these -:strings. +human-readable files that contain every marked string in the source +code, along with a placeholder for the translated versions of these +strings. 
Copies of these :file:`.po` files are then handed over to the individual human translators who write translations for every diff --git a/Doc/library/unittest.rst b/Doc/library/unittest.rst --- a/Doc/library/unittest.rst +++ b/Doc/library/unittest.rst @@ -901,8 +901,8 @@ | :meth:`assertWarnsRegex(warn, r, fun, *args, **kwds) | ``fun(*args, **kwds)`` raises *warn* | 3.2 | | ` | and the message matches regex *r* | | +---------------------------------------------------------+--------------------------------------+------------+ - | :meth:`assertLogs(logger, level)` | The ``with`` block logs on *logger* | 3.4 | - | ` | with minimum *level* | | + | :meth:`assertLogs(logger, level) | The ``with`` block logs on *logger* | 3.4 | + | ` | with minimum *level* | | +---------------------------------------------------------+--------------------------------------+------------+ .. method:: assertRaises(exception, callable, *args, **kwds) diff --git a/Doc/tools/sphinxext/susp-ignored.csv b/Doc/tools/sphinxext/susp-ignored.csv --- a/Doc/tools/sphinxext/susp-ignored.csv +++ b/Doc/tools/sphinxext/susp-ignored.csv @@ -141,15 +141,8 @@ library/logging.handlers,,:port,host:port library/mmap,,:i2,obj[i1:i2] library/multiprocessing,,`,# Add more tasks using `put()` -library/multiprocessing,,`,# A test file for the `multiprocessing` package -library/multiprocessing,,`,# A test of `multiprocessing.Pool` class -library/multiprocessing,,`,# `BaseManager`. -library/multiprocessing,,`,# in the original order then consider using `Pool.map()` or library/multiprocessing,,`,">>> l._callmethod('__getitem__', (20,)) # equiv to `l[20]`" library/multiprocessing,,`,">>> l._callmethod('__getslice__', (2, 7)) # equiv to `l[2:7]`" -library/multiprocessing,,`,# Not sure if we should synchronize access to `socket.accept()` method by -library/multiprocessing,,`,# object. (We import `multiprocessing.reduction` to enable this pickling.) -library/multiprocessing,,`,# `Pool.imap()` (which will save on the amount of code needed anyway). 
library/multiprocessing,,:queue,">>> QueueManager.register('get_queue', callable=lambda:queue)" library/multiprocessing,,`,# register the Foo class; make `f()` and `g()` accessible via proxy library/multiprocessing,,`,# register the Foo class; make `g()` and `_h()` accessible via proxy @@ -158,6 +151,10 @@ library/nntplib,,:lines,:lines library/optparse,,:len,"del parser.rargs[:len(value)]" library/os.path,,:foo,c:foo +library/pathlib,,:bar,">>> PureWindowsPath('c:/Windows', 'd:bar')" +library/pathlib,,:bar,PureWindowsPath('d:bar') +library/pathlib,,:Program,>>> PureWindowsPath('c:Program Files/').root +library/pathlib,,:Program,>>> PureWindowsPath('c:Program Files/').anchor library/pdb,,:lineno,filename:lineno library/pickle,,:memory,"conn = sqlite3.connect("":memory:"")" library/posix,,`,"CFLAGS=""`getconf LFS_CFLAGS`"" OPT=""-g -O2 $CFLAGS""" @@ -200,7 +197,12 @@ library/tarfile,,:xz,'w:xz' library/time,,:mm, library/time,,:ss, +library/tracemalloc,,:limit,"for index, stat in enumerate(top_stats[:limit], 1):" library/turtle,,::,Example:: +library/unittest,1412,:foo,"self.assertEqual(cm.output, ['INFO:foo:first message'," +library/unittest,1412,:first,"self.assertEqual(cm.output, ['INFO:foo:first message'," +library/unittest,1412,:foo,'ERROR:foo.bar:second message']) +library/unittest,1412,:second,'ERROR:foo.bar:second message']) library/urllib.request,,:close,Connection:close library/urllib.request,,:lang,"xmlns=""http://www.w3.org/1999/xhtml"" xml:lang=""en"" lang=""en"">\n\n\n" library/urllib.request,,:password,"""joe:password at python.org""" diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -424,7 +424,7 @@ The encoding and decoding functions in :mod:`base64` now accept any :term:`bytes-like object` in cases where it previously required a -:class:`bytes` or :class:`bytearray` instance (:issue`17839`) +:class:`bytes` or :class:`bytearray` instance (:issue:`17839`). colorsys diff --git a/Lib/_strptime.py b/Lib/_strptime.py --- a/Lib/_strptime.py +++ b/Lib/_strptime.py @@ -329,7 +329,7 @@ (bad_directive, format)) from None # IndexError only occurs when the format string is "%" except IndexError: - raise ValueError("stray %% in format '%s'" % format) + raise ValueError("stray %% in format '%s'" % format) from None _regex_cache[format] = format_regex found = format_regex.match(data_string) if not found: diff --git a/Lib/asyncio/windows_events.py b/Lib/asyncio/windows_events.py --- a/Lib/asyncio/windows_events.py +++ b/Lib/asyncio/windows_events.py @@ -327,14 +327,21 @@ handle, self._iocp, ov.address, ms) f = _WaitHandleFuture(wh, loop=self._loop) - def finish(timed_out, _, ov): + def finish(trans, key, ov): if not f.cancelled(): try: _overlapped.UnregisterWait(wh) except OSError as e: if e.winerror != _overlapped.ERROR_IO_PENDING: raise - return not timed_out + # Note that this second wait means that we should only use + # this with handles types where a successful wait has no + # effect. So events or processes are all right, but locks + # or semaphores are not. Also note if the handle is + # signalled and then quickly reset, then we may return + # False even though we have not timed out. 
+ return (_winapi.WaitForSingleObject(handle, 0) == + _winapi.WAIT_OBJECT_0) self._cache[ov.address] = (f, ov, None, finish) return f diff --git a/Lib/decimal.py b/Lib/decimal.py --- a/Lib/decimal.py +++ b/Lib/decimal.py @@ -140,6 +140,7 @@ __version__ = '1.70' # Highest version of the spec this complies with # See http://speleotrove.com/decimal/ +__libmpdec_version__ = "2.4.0" # compatible libmpdec version import copy as _copy import math as _math diff --git a/Lib/fileinput.py b/Lib/fileinput.py --- a/Lib/fileinput.py +++ b/Lib/fileinput.py @@ -224,7 +224,7 @@ "'r', 'rU', 'U' and 'rb'") if 'U' in mode: import warnings - warnings.warn("Use of 'U' mode is deprecated", + warnings.warn("'U' mode is deprecated", DeprecationWarning, 2) self._mode = mode if openhook: diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -602,7 +602,9 @@ self.addCleanup(support.unlink, support.TESTFN) with open(support.TESTFN, 'wb') as fp: fp.write(s) - with codecs.open(support.TESTFN, 'U', encoding=self.encoding) as reader: + with support.check_warnings(('', DeprecationWarning)): + reader = codecs.open(support.TESTFN, 'U', encoding=self.encoding) + with reader: self.assertEqual(reader.read(), s1) class UTF16LETest(ReadTest, unittest.TestCase): diff --git a/Lib/test/test_decimal.py b/Lib/test/test_decimal.py --- a/Lib/test/test_decimal.py +++ b/Lib/test/test_decimal.py @@ -4149,6 +4149,7 @@ self.assertTrue(P.HAVE_THREADS is True or P.HAVE_THREADS is False) self.assertEqual(C.__version__, P.__version__) + self.assertEqual(C.__libmpdec_version__, P.__libmpdec_version__) x = dir(C) y = [s for s in dir(P) if '__' in s or not s.startswith('_')] diff --git a/Lib/test/test_fileinput.py b/Lib/test/test_fileinput.py --- a/Lib/test/test_fileinput.py +++ b/Lib/test/test_fileinput.py @@ -22,7 +22,7 @@ from io import StringIO from fileinput import FileInput, hook_encoded -from test.support import verbose, TESTFN, run_unittest +from test.support import verbose, TESTFN, run_unittest, check_warnings from test.support import unlink as safe_unlink @@ -224,8 +224,10 @@ try: # try opening in universal newline mode t1 = writeTmp(1, [b"A\nB\r\nC\rD"], mode="wb") - fi = FileInput(files=t1, mode="U") - lines = list(fi) + with check_warnings(('', DeprecationWarning)): + fi = FileInput(files=t1, mode="U") + with check_warnings(('', DeprecationWarning)): + lines = list(fi) self.assertEqual(lines, ["A\n", "B\n", "C\n", "D"]) finally: remove_tempfiles(t1) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -170,6 +170,7 @@ 'Do you need "set solib-search-path" or ' '"set sysroot"?', 'warning: Source file is more recent than executable.', + 'Missing separate debuginfo for ', ) for line in errlines: if not line.startswith(ignore_patterns): diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -2777,7 +2777,8 @@ self.assertEqual(f.mode, "wb") f.close() - f = self.open(support.TESTFN, "U") + with support.check_warnings(('', DeprecationWarning)): + f = self.open(support.TESTFN, "U") self.assertEqual(f.name, support.TESTFN) self.assertEqual(f.buffer.name, support.TESTFN) self.assertEqual(f.buffer.raw.name, support.TESTFN) diff --git a/Lib/test/test_strptime.py b/Lib/test/test_strptime.py --- a/Lib/test/test_strptime.py +++ b/Lib/test/test_strptime.py @@ -223,6 +223,10 @@ with self.assertRaises(ValueError) as e: _strptime._strptime_time('', '%D') 
self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + _strptime._strptime_time('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_unconverteddata(self): # Check ValueError is raised when there is unconverted data diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -198,6 +198,10 @@ with self.assertRaises(ValueError) as e: time.strptime('', '%D') self.assertIs(e.exception.__suppress_context__, True) + # additional check for IndexError branch (issue #19545) + with self.assertRaises(ValueError) as e: + time.strptime('19', '%Y %') + self.assertIs(e.exception.__suppress_context__, True) def test_asctime(self): time.asctime(time.gmtime(self.t)) diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -14,7 +14,7 @@ from test.support import (TESTFN, findfile, unlink, requires_zlib, requires_bz2, requires_lzma, - captured_stdout) + captured_stdout, check_warnings) TESTFN2 = TESTFN + "2" TESTFNDIR = TESTFN + "d" @@ -35,6 +35,10 @@ yield f test.assertFalse(f.closed) +def openU(zipfp, fn): + with check_warnings(('', DeprecationWarning)): + return zipfp.open(fn, 'rU') + class AbstractTestsWithSourceFile: @classmethod def setUpClass(cls): @@ -875,6 +879,17 @@ data += zipfp.read(info) self.assertIn(data, {b"foobar", b"barfoo"}) + def test_universal_deprecation(self): + f = io.BytesIO() + with zipfile.ZipFile(f, "w") as zipfp: + zipfp.writestr('spam.txt', b'ababagalamaga') + + with zipfile.ZipFile(f, "r") as zipfp: + for mode in 'U', 'rU': + with self.assertWarns(DeprecationWarning): + zipopen = zipfp.open('spam.txt', mode) + zipopen.close() + def test_universal_readaheads(self): f = io.BytesIO() @@ -884,7 +899,7 @@ data2 = b'' with zipfile.ZipFile(f, 'r') as zipfp, \ - zipfp.open(TESTFN, 'rU') as zipopen: + openU(zipfp, TESTFN) as zipopen: for line in zipopen: data2 += line @@ -1613,7 +1628,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: zipdata = fp.read() self.assertEqual(self.arcdata[sep], zipdata) @@ -1627,7 +1642,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as zipopen: + with openU(zipfp, fn) as zipopen: data = b'' while True: read = zipopen.readline() @@ -1652,7 +1667,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as zipopen: + with openU(zipfp, fn) as zipopen: for line in self.line_gen: linedata = zipopen.readline() self.assertEqual(linedata, line + b'\n') @@ -1667,7 +1682,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: ziplines = fp.readlines() for line, zipline in zip(self.line_gen, ziplines): self.assertEqual(zipline, line + b'\n') @@ -1682,7 +1697,7 @@ # Read the ZIP archive with zipfile.ZipFile(f, "r") as zipfp: for sep, fn in self.arcfiles.items(): - with zipfp.open(fn, "rU") as fp: + with openU(zipfp, fn) as fp: for line, zipline in zip(self.line_gen, fp): self.assertEqual(zipline, line + b'\n') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -2,6 +2,21 @@ Python News +++++++++++ +What's New 
in Python 3.4.0 Beta 2? +================================== + +Release date: 2014-01-05 + +Core and Builtins +----------------- + +Library +------- + +- Issue #19545: Avoid chained exceptions while passing stray % to + time.strptime(). Initial patch by Claudiu Popa. + + What's New in Python 3.4.0 Beta 1? ================================== @@ -2518,7 +2533,7 @@ - Issue #14398: Fix size truncation and overflow bugs in the bz2 module. - Issue #12692: Fix resource leak in urllib.request when talking to an HTTP - server that does not include a "Connection: close" header in its responses. + server that does not include a ``Connection: close`` header in its responses. - Issue #12034: Fix bogus caching of result in check_GetFinalPathNameByHandle. Patch by Atsuo Ishimoto. @@ -6091,7 +6106,7 @@ given as a low fd, it gets overwritten. - Issue #12576: Fix urlopen behavior on sites which do not send (or obfuscates) - Connection:close header. + ``Connection: close`` header. - Issue #12560: Build libpython.so on OpenBSD. Patch by Stefan Sperling. @@ -6686,7 +6701,7 @@ - Issue #11127: Raise a TypeError when trying to pickle a socket object. -- Issue #11563: Connection:close header is sent by requests using URLOpener +- Issue #11563: ``Connection: close`` header is sent by requests using URLOpener class which helps in closing of sockets after connection is over. Patch contributions by Jeff McNeil and Nadeem Vawda. @@ -7262,7 +7277,7 @@ - Issue #11505: improves test coverage of string.py. Patch by Alicia Arlen -- Issue #11490: test_subprocess:test_leaking_fds_on_error no longer gives a +- Issue #11490: test_subprocess.test_leaking_fds_on_error no longer gives a false positive if the last directory in the path is inaccessible. - Issue #11223: Fix test_threadsignals to fail, not hang, when the diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -39,6 +39,11 @@ #include "memory.h" +#if MPD_MAJOR_VERSION != 2 + #error "libmpdec major version 2 required" +#endif + + /* * Type sizes with assertions in mpdecimal.h and pyport.h: * sizeof(size_t) == sizeof(Py_ssize_t) @@ -5730,7 +5735,8 @@ } /* Add specification version number */ - CHECK_INT(PyModule_AddStringConstant(m, "__version__", " 1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__version__", "1.70")); + CHECK_INT(PyModule_AddStringConstant(m, "__libmpdec_version__", mpd_version())); return m; diff --git a/Modules/_decimal/libmpdec/mpdecimal.c b/Modules/_decimal/libmpdec/mpdecimal.c --- a/Modules/_decimal/libmpdec/mpdecimal.c +++ b/Modules/_decimal/libmpdec/mpdecimal.c @@ -97,6 +97,8 @@ mpd_ssize_t exp); static inline mpd_ssize_t _mpd_real_size(mpd_uint_t *data, mpd_ssize_t size); +static int _mpd_cmp_abs(const mpd_t *a, const mpd_t *b); + static void _mpd_qadd(mpd_t *result, const mpd_t *a, const mpd_t *b, const mpd_context_t *ctx, uint32_t *status); static inline void _mpd_qmul(mpd_t *result, const mpd_t *a, const mpd_t *b, @@ -111,6 +113,17 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +const char * +mpd_version(void) +{ + return MPD_VERSION; +} + + +/******************************************************************************/ /* Performance critical inline functions */ /******************************************************************************/ @@ -1345,6 +1358,91 @@ return MPD_SSIZE_MAX; } +#if defined(CONFIG_32) && 
!defined(LEGACY_COMPILER) +/* + * Quietly get a uint64_t from a decimal. If the operation is impossible, + * MPD_Invalid_operation is set. + */ +static uint64_t +_c32_qget_u64(int use_sign, const mpd_t *a, uint32_t *status) +{ + MPD_NEW_STATIC(tmp,0,0,20,3); + mpd_context_t maxcontext; + uint64_t ret; + + tmp_data[0] = 709551615; + tmp_data[1] = 446744073; + tmp_data[2] = 18; + + if (mpd_isspecial(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (mpd_iszero(a)) { + return 0; + } + if (use_sign && mpd_isnegative(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + if (!_mpd_isint(a)) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + if (_mpd_cmp_abs(a, &tmp) > 0) { + *status |= MPD_Invalid_operation; + return UINT64_MAX; + } + + mpd_maxcontext(&maxcontext); + mpd_qrescale(&tmp, a, 0, &maxcontext, &maxcontext.status); + maxcontext.status &= ~MPD_Rounded; + if (maxcontext.status != 0) { + *status |= (maxcontext.status|MPD_Invalid_operation); /* GCOV_NOT_REACHED */ + return UINT64_MAX; /* GCOV_NOT_REACHED */ + } + + ret = 0; + switch (tmp.len) { + case 3: + ret += (uint64_t)tmp_data[2] * 1000000000000000000ULL; + case 2: + ret += (uint64_t)tmp_data[1] * 1000000000ULL; + case 1: + ret += tmp_data[0]; + break; + default: + abort(); /* GCOV_NOT_REACHED */ + } + + return ret; +} + +static int64_t +_c32_qget_i64(const mpd_t *a, uint32_t *status) +{ + uint64_t u; + int isneg; + + u = _c32_qget_u64(0, a, status); + if (*status&MPD_Invalid_operation) { + return INT64_MAX; + } + + isneg = mpd_isnegative(a); + if (u <= INT64_MAX) { + return isneg ? -((int64_t)u) : (int64_t)u; + } + else if (isneg && u+(INT64_MIN+INT64_MAX) == INT64_MAX) { + return INT64_MIN; + } + + *status |= MPD_Invalid_operation; + return INT64_MAX; +} +#endif /* CONFIG_32 && !LEGACY_COMPILER */ + #ifdef CONFIG_64 /* quietly get a uint64_t from a decimal */ uint64_t @@ -1359,7 +1457,57 @@ { return mpd_qget_ssize(a, status); } + +/* quietly get a uint32_t from a decimal */ +uint32_t +mpd_qget_u32(const mpd_t *a, uint32_t *status) +{ + uint64_t x = mpd_qget_uint(a, status); + + if (*status&MPD_Invalid_operation) { + return UINT32_MAX; + } + if (x > UINT32_MAX) { + *status |= MPD_Invalid_operation; + return UINT32_MAX; + } + + return (uint32_t)x; +} + +/* quietly get an int32_t from a decimal */ +int32_t +mpd_qget_i32(const mpd_t *a, uint32_t *status) +{ + int64_t x = mpd_qget_ssize(a, status); + + if (*status&MPD_Invalid_operation) { + return INT32_MAX; + } + if (x < INT32_MIN || x > INT32_MAX) { + *status |= MPD_Invalid_operation; + return INT32_MAX; + } + + return (int32_t)x; +} #else +#ifndef LEGACY_COMPILER +/* quietly get a uint64_t from a decimal */ +uint64_t +mpd_qget_u64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_u64(1, a, status); +} + +/* quietly get an int64_t from a decimal */ +int64_t +mpd_qget_i64(const mpd_t *a, uint32_t *status) +{ + return _c32_qget_i64(a, status); +} +#endif + /* quietly get a uint32_t from a decimal */ uint32_t mpd_qget_u32(const mpd_t *a, uint32_t *status) @@ -3386,6 +3534,34 @@ { mpd_qadd_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Add decimal and int64_t. */ +void +mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Add decimal and uint64_t. 
*/ +void +mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qadd(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Subtract int32_t from decimal. */ @@ -3420,6 +3596,34 @@ { mpd_qsub_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Subtract int64_t from decimal. */ +void +mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Subtract uint64_t from decimal. */ +void +mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qsub(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif @@ -3871,6 +4075,34 @@ { mpd_qdiv_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Divide decimal by int64_t. */ +void +mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Divide decimal by uint64_t. */ +void +mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qdiv(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Pad the result with trailing zeros if it has fewer digits than prec. */ @@ -5664,6 +5896,34 @@ { mpd_qmul_uint(result, a, b, ctx, status); } +#elif !defined(LEGACY_COMPILER) +/* Multiply decimal and int64_t. */ +void +mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_i64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} + +/* Multiply decimal and uint64_t. */ +void +mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, + const mpd_context_t *ctx, uint32_t *status) +{ + mpd_context_t maxcontext; + MPD_NEW_STATIC(bb,0,0,0,0); + + mpd_maxcontext(&maxcontext); + mpd_qset_u64(&bb, b, &maxcontext, status); + mpd_qmul(result, a, &bb, ctx, status); + mpd_del(&bb); +} #endif /* Like the minus operator. 
*/ diff --git a/Modules/_decimal/libmpdec/mpdecimal.h b/Modules/_decimal/libmpdec/mpdecimal.h --- a/Modules/_decimal/libmpdec/mpdecimal.h +++ b/Modules/_decimal/libmpdec/mpdecimal.h @@ -32,7 +32,6 @@ #ifdef __cplusplus extern "C" { -#define __STDC_LIMIT_MACROS #endif @@ -56,12 +55,18 @@ #define MPD_HIDE_SYMBOLS_END #define EXTINLINE extern inline #else - #ifdef HAVE_STDINT_H - #include - #endif #ifdef HAVE_INTTYPES_H #include #endif + #ifdef HAVE_STDINT_H + #if defined(__cplusplus) && !defined(__STDC_LIMIT_MACROS) + #define __STDC_LIMIT_MACROS + #include + #undef __STDC_LIMIT_MACROS + #else + #include + #endif + #endif #ifndef __GNUC_STDC_INLINE__ #define __GNUC_STDC_INLINE__ 1 #endif @@ -100,6 +105,19 @@ /******************************************************************************/ +/* Version */ +/******************************************************************************/ + +#define MPD_MAJOR_VERSION 2 +#define MPD_MINOR_VERSION 4 +#define MPD_MICRO_VERSION 0 + +#define MPD_VERSION "2.4.0" + +const char *mpd_version(void); + + +/******************************************************************************/ /* Configuration */ /******************************************************************************/ @@ -241,7 +259,7 @@ extern const char *mpd_clamp_string[MPD_CLAMP_GUARD]; -typedef struct { +typedef struct mpd_context_t { mpd_ssize_t prec; /* precision */ mpd_ssize_t emax; /* max positive exp */ mpd_ssize_t emin; /* min negative exp */ @@ -353,7 +371,7 @@ #define MPD_DATAFLAGS (MPD_STATIC_DATA|MPD_SHARED_DATA|MPD_CONST_DATA) /* mpd_t */ -typedef struct { +typedef struct mpd_t { uint8_t flags; mpd_ssize_t exp; mpd_ssize_t digits; @@ -371,7 +389,7 @@ /******************************************************************************/ /* format specification */ -typedef struct { +typedef struct mpd_spec_t { mpd_ssize_t min_width; /* minimum field width */ mpd_ssize_t prec; /* fraction digits or significant digits */ char type; /* conversion specifier */ @@ -437,6 +455,12 @@ mpd_uint_t mpd_qget_uint(const mpd_t *dec, uint32_t *status); mpd_uint_t mpd_qabs_uint(const mpd_t *dec, uint32_t *status); +int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); +uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); +#ifndef LEGACY_COMPILER +int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); +uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); +#endif /* quiet functions */ int mpd_qcheck_nan(mpd_t *nanresult, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); @@ -528,6 +552,17 @@ void mpd_qsqrt(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); void mpd_qinvroot(mpd_t *result, const mpd_t *a, const mpd_context_t *ctx, uint32_t *status); +#ifndef LEGACY_COMPILER +void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); +void mpd_qdiv_u64(mpd_t *result, const mpd_t 
*a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); +#endif + size_t mpd_sizeinbase(const mpd_t *a, uint32_t base); void mpd_qimport_u16(mpd_t *result, const uint16_t *srcdata, size_t srclen, @@ -571,6 +606,12 @@ mpd_ssize_t mpd_get_ssize(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_get_uint(const mpd_t *a, mpd_context_t *ctx); mpd_uint_t mpd_abs_uint(const mpd_t *a, mpd_context_t *ctx); +int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); +uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); +uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); +#endif void mpd_and(mpd_t *result, const mpd_t *a, const mpd_t *b, mpd_context_t *ctx); void mpd_copy(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_canonical(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); @@ -641,6 +682,17 @@ void mpd_sqrt(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); void mpd_invroot(mpd_t *result, const mpd_t *a, mpd_context_t *ctx); +#ifndef LEGACY_COMPILER +void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); +void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); +#endif + /******************************************************************************/ /* Configuration specific */ @@ -649,36 +701,8 @@ #ifdef CONFIG_64 void mpd_qsset_i64(mpd_t *result, int64_t a, const mpd_context_t *ctx, uint32_t *status); void mpd_qsset_u64(mpd_t *result, uint64_t a, const mpd_context_t *ctx, uint32_t *status); -int64_t mpd_qget_i64(const mpd_t *dec, uint32_t *status); -uint64_t mpd_qget_u64(const mpd_t *dec, uint32_t *status); - -void mpd_qadd_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qadd_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qsub_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qmul_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_i64(mpd_t *result, const mpd_t *a, int64_t b, const mpd_context_t *ctx, uint32_t *status); -void mpd_qdiv_u64(mpd_t *result, const mpd_t *a, uint64_t b, const mpd_context_t *ctx, uint32_t *status); - void mpd_sset_i64(mpd_t *result, int64_t a, mpd_context_t *ctx); void mpd_sset_u64(mpd_t *result, uint64_t a, mpd_context_t *ctx); -int64_t mpd_get_i64(const mpd_t *a, mpd_context_t *ctx); -uint64_t mpd_get_u64(const mpd_t *a, mpd_context_t *ctx); - -void mpd_add_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_add_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_sub_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void 
mpd_sub_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_div_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_div_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -void mpd_mul_i64(mpd_t *result, const mpd_t *a, int64_t b, mpd_context_t *ctx); -void mpd_mul_u64(mpd_t *result, const mpd_t *a, uint64_t b, mpd_context_t *ctx); -#else -int32_t mpd_qget_i32(const mpd_t *dec, uint32_t *status); -uint32_t mpd_qget_u32(const mpd_t *dec, uint32_t *status); -int32_t mpd_get_i32(const mpd_t *a, mpd_context_t *ctx); -uint32_t mpd_get_u32(const mpd_t *a, mpd_context_t *ctx); #endif diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -567,8 +567,9 @@ PyObject* attrib = NULL; if (!PyArg_ParseTuple(args, "O!O|O!:SubElement", &Element_Type, &parent, &tag, - &PyDict_Type, &attrib)) + &PyDict_Type, &attrib)) { return NULL; + } if (attrib) { /* attrib passed as positional arg */ @@ -652,7 +653,6 @@ } /* -------------------------------------------------------------------- */ -/* methods (in alphabetical order) */ static PyObject* element_append(ElementObject* self, PyObject* args) @@ -696,8 +696,7 @@ return NULL; element = (ElementObject*) create_new_element( - self->tag, (self->extra) ? self->extra->attrib : Py_None - ); + self->tag, (self->extra) ? self->extra->attrib : Py_None); if (!element) return NULL; @@ -710,7 +709,6 @@ Py_INCREF(JOIN_OBJ(element->tail)); if (self->extra) { - if (element_resize(element, self->extra->length) < 0) { Py_DECREF(element); return NULL; @@ -722,7 +720,6 @@ } element->extra->length = self->extra->length; - } return (PyObject*) element; @@ -779,7 +776,6 @@ element->tail = JOIN_SET(tail, JOIN_GET(self->tail)); if (self->extra) { - if (element_resize(element, self->extra->length) < 0) goto error; @@ -793,7 +789,6 @@ } element->extra->length = self->extra->length; - } /* add object to memo dictionary (so deepcopy won't visit it again) */ @@ -1141,8 +1136,8 @@ for (i = 0; i < self->extra->length; i++) { ElementObject* item = (ElementObject*) self->extra->children[i]; - if (Element_CheckExact(item) && (PyObject_RichCompareBool(item->tag, tag, Py_EQ) == 1)) { - + if (Element_CheckExact(item) && + (PyObject_RichCompareBool(item->tag, tag, Py_EQ) == 1)) { PyObject* text = element_get_text(item); if (text == Py_None) return PyUnicode_New(0, 0); @@ -1207,12 +1202,12 @@ elementtreestate *st = ET_STATE_GLOBAL; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:iterfind", kwlist, - &tag, &namespaces)) + &tag, &namespaces)) { return NULL; + } return _PyObject_CallMethodId( - st->elementpath_obj, &PyId_iterfind, "OOO", self, tag, namespaces - ); + st->elementpath_obj, &PyId_iterfind, "OOO", self, tag, namespaces); } static PyObject* diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1,6 +1,30 @@ #include "Python.h" #include "structmember.h" +/*[clinic] +module _pickle +class _pickle.Pickler +class _pickle.PicklerMemoProxy +class _pickle.Unpickler +class _pickle.UnpicklerMemoProxy +[clinic]*/ +/*[clinic checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + +/*[python] +class PicklerObject_converter(self_converter): + type = "PicklerObject *" + +class PicklerMemoProxyObject_converter(self_converter): + type = "PicklerMemoProxyObject *" + +class UnpicklerObject_converter(self_converter): + type = "UnpicklerObject *" + +class UnpicklerMemoProxyObject_converter(self_converter): + type = 
"UnpicklerMemoProxyObject *" +[python]*/ +/*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ + PyDoc_STRVAR(pickle_module_doc, "Optimized C implementation for the Python pickle module."); @@ -866,34 +890,29 @@ } static int -_Pickler_SetProtocol(PicklerObject *self, PyObject *proto_obj, - PyObject *fix_imports_obj) -{ - long proto = 0; - int fix_imports; - - if (proto_obj == NULL || proto_obj == Py_None) +_Pickler_SetProtocol(PicklerObject *self, PyObject *protocol, int fix_imports) +{ + long proto; + + if (protocol == NULL || protocol == Py_None) { proto = DEFAULT_PROTOCOL; + } else { - proto = PyLong_AsLong(proto_obj); - if (proto == -1 && PyErr_Occurred()) + proto = PyLong_AsLong(protocol); + if (proto < 0) { + if (proto == -1 && PyErr_Occurred()) + return -1; + proto = HIGHEST_PROTOCOL; + } + else if (proto > HIGHEST_PROTOCOL) { + PyErr_Format(PyExc_ValueError, "pickle protocol must be <= %d", + HIGHEST_PROTOCOL); return -1; - } - if (proto < 0) - proto = HIGHEST_PROTOCOL; - if (proto > HIGHEST_PROTOCOL) { - PyErr_Format(PyExc_ValueError, "pickle protocol must be <= %d", - HIGHEST_PROTOCOL); - return -1; - } - fix_imports = PyObject_IsTrue(fix_imports_obj); - if (fix_imports == -1) - return -1; - - self->proto = proto; + } + } + self->proto = (int)proto; self->bin = proto > 0; self->fix_imports = fix_imports && proto < 3; - return 0; } @@ -3708,16 +3727,35 @@ return 0; } -PyDoc_STRVAR(Pickler_clear_memo_doc, -"clear_memo() -> None. Clears the pickler's \"memo\"." +/*[clinic] + +_pickle.Pickler.clear_memo + + self: PicklerObject + +Clears the pickler's "memo". + +The memo is the data structure that remembers which objects the +pickler has already seen, so that shared or recursive objects are +pickled by reference and not by value. This method is useful when +re-using picklers. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler_clear_memo__doc__, +"clear_memo()\n" +"Clears the pickler\'s \"memo\".\n" "\n" "The memo is the data structure that remembers which objects the\n" "pickler has already seen, so that shared or recursive objects are\n" "pickled by reference and not by value. This method is useful when\n" "re-using picklers."); +#define _PICKLE_PICKLER_CLEAR_MEMO_METHODDEF \ + {"clear_memo", (PyCFunction)_pickle_Pickler_clear_memo, METH_NOARGS, _pickle_Pickler_clear_memo__doc__}, + static PyObject * -Pickler_clear_memo(PicklerObject *self) +_pickle_Pickler_clear_memo(PicklerObject *self) +/*[clinic checksum: 9c32be7e7a17ff82a81aae409d0d4f469033a5b2]*/ { if (self->memo) PyMemoTable_Clear(self->memo); @@ -3725,14 +3763,28 @@ Py_RETURN_NONE; } -PyDoc_STRVAR(Pickler_dump_doc, -"dump(obj) -> None. Write a pickled representation of obj to the open file."); +/*[clinic] + +_pickle.Pickler.dump + + self: PicklerObject + obj: object + / + +Write a pickled representation of the given object to the open file. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler_dump__doc__, +"dump(obj)\n" +"Write a pickled representation of the given object to the open file."); + +#define _PICKLE_PICKLER_DUMP_METHODDEF \ + {"dump", (PyCFunction)_pickle_Pickler_dump, METH_O, _pickle_Pickler_dump__doc__}, static PyObject * -Pickler_dump(PicklerObject *self, PyObject *args) -{ - PyObject *obj; - +_pickle_Pickler_dump(PicklerObject *self, PyObject *obj) +/*[clinic checksum: b72a69ec98737fabf66dae7c5a3210178bdbd3e6]*/ +{ /* Check whether the Pickler was initialized correctly (issue3664). Developers often forget to call __init__() in their subclasses, which would trigger a segfault without this check. 
*/ @@ -3743,9 +3795,6 @@ return NULL; } - if (!PyArg_ParseTuple(args, "O:dump", &obj)) - return NULL; - if (_Pickler_ClearBuffer(self) < 0) return NULL; @@ -3759,10 +3808,8 @@ } static struct PyMethodDef Pickler_methods[] = { - {"dump", (PyCFunction)Pickler_dump, METH_VARARGS, - Pickler_dump_doc}, - {"clear_memo", (PyCFunction)Pickler_clear_memo, METH_NOARGS, - Pickler_clear_memo_doc}, + _PICKLE_PICKLER_DUMP_METHODDEF + _PICKLE_PICKLER_CLEAR_MEMO_METHODDEF {NULL, NULL} /* sentinel */ }; @@ -3813,9 +3860,39 @@ } -PyDoc_STRVAR(Pickler_doc, -"Pickler(file, protocol=None)" -"\n" +/*[clinic] + +_pickle.Pickler.__init__ + + self: PicklerObject + file: object + protocol: object = NULL + fix_imports: bool = True + +This takes a binary file for writing a pickle data stream. + +The optional protocol argument tells the pickler to use the +given protocol; supported protocols are 0, 1, 2, 3 and 4. The +default protocol is 3; a backward-incompatible protocol designed for +Python 3. + +Specifying a negative protocol version selects the highest +protocol version supported. The higher the protocol used, the +more recent the version of Python needed to read the pickle +produced. + +The file argument must have a write() method that accepts a single +bytes argument. It can thus be a file object opened for binary +writing, a io.BytesIO instance, or any other custom object that +meets this interface. + +If fix_imports is True and protocol is less than 3, pickle will try to +map the new Python 3 names to the old module names used in Python 2, +so that the pickle data stream is readable with Python 2. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Pickler___init____doc__, +"__init__(file, protocol=None, fix_imports=True)\n" "This takes a binary file for writing a pickle data stream.\n" "\n" "The optional protocol argument tells the pickler to use the\n" @@ -3835,37 +3912,55 @@ "\n" "If fix_imports is True and protocol is less than 3, pickle will try to\n" "map the new Python 3 names to the old module names used in Python 2,\n" -"so that the pickle data stream is readable with Python 2.\n"); - -static int -Pickler_init(PicklerObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "protocol", "fix_imports", 0}; +"so that the pickle data stream is readable with Python 2."); + +#define _PICKLE_PICKLER___INIT___METHODDEF \ + {"__init__", (PyCFunction)_pickle_Pickler___init__, METH_VARARGS|METH_KEYWORDS, _pickle_Pickler___init____doc__}, + +static PyObject * +_pickle_Pickler___init___impl(PicklerObject *self, PyObject *file, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_Pickler___init__(PyObject *self, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "protocol", "fix_imports", NULL}; PyObject *file; - PyObject *proto_obj = NULL; - PyObject *fix_imports = Py_True; + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|Op:__init__", _keywords, + &file, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_Pickler___init___impl((PicklerObject *)self, file, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_Pickler___init___impl(PicklerObject *self, PyObject *file, PyObject *protocol, int fix_imports) +/*[clinic checksum: c99ff417bd703a74affc4b708167e56e135e8969]*/ +{ _Py_IDENTIFIER(persistent_id); _Py_IDENTIFIER(dispatch_table); - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|OO:Pickler", - kwlist, &file, &proto_obj, 
&fix_imports)) - return -1; - /* In case of multiple __init__() calls, clear previous content. */ if (self->write != NULL) (void)Pickler_clear(self); - if (_Pickler_SetProtocol(self, proto_obj, fix_imports) < 0) - return -1; + if (_Pickler_SetProtocol(self, protocol, fix_imports) < 0) + return NULL; if (_Pickler_SetOutputStream(self, file) < 0) - return -1; + return NULL; /* memo and output_buffer may have already been created in _Pickler_New */ if (self->memo == NULL) { self->memo = PyMemoTable_New(); if (self->memo == NULL) - return -1; + return NULL; } self->output_len = 0; if (self->output_buffer == NULL) { @@ -3873,7 +3968,7 @@ self->output_buffer = PyBytes_FromStringAndSize(NULL, self->max_output_len); if (self->output_buffer == NULL) - return -1; + return NULL; } self->arg = NULL; @@ -3885,14 +3980,24 @@ self->pers_func = _PyObject_GetAttrId((PyObject *)self, &PyId_persistent_id); if (self->pers_func == NULL) - return -1; + return NULL; } self->dispatch_table = NULL; if (_PyObject_HasAttrId((PyObject *)self, &PyId_dispatch_table)) { self->dispatch_table = _PyObject_GetAttrId((PyObject *)self, &PyId_dispatch_table); if (self->dispatch_table == NULL) - return -1; + return NULL; + } + return Py_None; +} + +/* XXX Slight hack to slot a Clinic generated signature in tp_init. */ +static int +Pickler_init(PyObject *self, PyObject *args, PyObject *kwargs) +{ + if (_pickle_Pickler___init__(self, args, kwargs) == NULL) { + return -1; } return 0; } @@ -3912,22 +4017,48 @@ PicklerObject *pickler; /* Pickler whose memo table we're proxying. */ } PicklerMemoProxyObject; -PyDoc_STRVAR(pmp_clear_doc, -"memo.clear() -> None. Remove all items from memo."); +/*[clinic] +_pickle.PicklerMemoProxy.clear + + self: PicklerMemoProxyObject + +Remove all items from memo. +[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy_clear__doc__, +"clear()\n" +"Remove all items from memo."); + +#define _PICKLE_PICKLERMEMOPROXY_CLEAR_METHODDEF \ + {"clear", (PyCFunction)_pickle_PicklerMemoProxy_clear, METH_NOARGS, _pickle_PicklerMemoProxy_clear__doc__}, static PyObject * -pmp_clear(PicklerMemoProxyObject *self) +_pickle_PicklerMemoProxy_clear(PicklerMemoProxyObject *self) +/*[clinic checksum: 507f13938721992e175a3e58b5ad02620045a1cc]*/ { if (self->pickler->memo) PyMemoTable_Clear(self->pickler->memo); Py_RETURN_NONE; } -PyDoc_STRVAR(pmp_copy_doc, -"memo.copy() -> new_memo. Copy the memo to a new object."); +/*[clinic] +_pickle.PicklerMemoProxy.copy + + self: PicklerMemoProxyObject + +Copy the memo to a new object. +[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy_copy__doc__, +"copy()\n" +"Copy the memo to a new object."); + +#define _PICKLE_PICKLERMEMOPROXY_COPY_METHODDEF \ + {"copy", (PyCFunction)_pickle_PicklerMemoProxy_copy, METH_NOARGS, _pickle_PicklerMemoProxy_copy__doc__}, static PyObject * -pmp_copy(PicklerMemoProxyObject *self) +_pickle_PicklerMemoProxy_copy(PicklerMemoProxyObject *self) +/*[clinic checksum: 73a5117ab354290ebdbe07bd0bf7232d0936a69d]*/ { Py_ssize_t i; PyMemoTable *memo; @@ -3964,14 +4095,27 @@ return NULL; } -PyDoc_STRVAR(pmp_reduce_doc, -"memo.__reduce__(). Pickling support."); +/*[clinic] +_pickle.PicklerMemoProxy.__reduce__ + + self: PicklerMemoProxyObject + +Implement pickle support. 
+[clinic]*/ + +PyDoc_STRVAR(_pickle_PicklerMemoProxy___reduce____doc__, +"__reduce__()\n" +"Implement pickle support."); + +#define _PICKLE_PICKLERMEMOPROXY___REDUCE___METHODDEF \ + {"__reduce__", (PyCFunction)_pickle_PicklerMemoProxy___reduce__, METH_NOARGS, _pickle_PicklerMemoProxy___reduce____doc__}, static PyObject * -pmp_reduce(PicklerMemoProxyObject *self, PyObject *args) +_pickle_PicklerMemoProxy___reduce__(PicklerMemoProxyObject *self) +/*[clinic checksum: 40f0bf7a9b161e77130674f0481bda0a0184dcce]*/ { PyObject *reduce_value, *dict_args; - PyObject *contents = pmp_copy(self); + PyObject *contents = _pickle_PicklerMemoProxy_copy(self); if (contents == NULL) return NULL; @@ -3994,9 +4138,9 @@ } static PyMethodDef picklerproxy_methods[] = { - {"clear", (PyCFunction)pmp_clear, METH_NOARGS, pmp_clear_doc}, - {"copy", (PyCFunction)pmp_copy, METH_NOARGS, pmp_copy_doc}, - {"__reduce__", (PyCFunction)pmp_reduce, METH_VARARGS, pmp_reduce_doc}, + _PICKLE_PICKLERMEMOPROXY_CLEAR_METHODDEF + _PICKLE_PICKLERMEMOPROXY_COPY_METHODDEF + _PICKLE_PICKLERMEMOPROXY___REDUCE___METHODDEF {NULL, NULL} /* sentinel */ }; @@ -4208,7 +4352,7 @@ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, - Pickler_doc, /*tp_doc*/ + _pickle_Pickler___init____doc__, /*tp_doc*/ (traverseproc)Pickler_traverse, /*tp_traverse*/ (inquiry)Pickler_clear, /*tp_clear*/ 0, /*tp_richcompare*/ @@ -4223,7 +4367,7 @@ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ - (initproc)Pickler_init, /*tp_init*/ + Pickler_init, /*tp_init*/ PyType_GenericAlloc, /*tp_alloc*/ PyType_GenericNew, /*tp_new*/ PyObject_GC_Del, /*tp_free*/ @@ -5938,57 +6082,111 @@ return value; } -PyDoc_STRVAR(Unpickler_load_doc, -"load() -> object. Load a pickle." +/*[clinic] + +_pickle.Unpickler.load + +Load a pickle. + +Read a pickled object representation from the open file object given in +the constructor, and return the reconstituted object hierarchy specified +therein. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler_load__doc__, +"load()\n" +"Load a pickle.\n" "\n" "Read a pickled object representation from the open file object given in\n" "the constructor, and return the reconstituted object hierarchy specified\n" -"therein.\n"); +"therein."); + +#define _PICKLE_UNPICKLER_LOAD_METHODDEF \ + {"load", (PyCFunction)_pickle_Unpickler_load, METH_NOARGS, _pickle_Unpickler_load__doc__}, static PyObject * -Unpickler_load(UnpicklerObject *self) -{ +_pickle_Unpickler_load(PyObject *self) +/*[clinic checksum: 9a30ba4e4d9221d4dcd705e1471ab11b2c9e3ac6]*/ +{ + UnpicklerObject *unpickler = (UnpicklerObject*)self; /* Check whether the Unpickler was initialized correctly. This prevents segfaulting if a subclass overridden __init__ with a function that does not call Unpickler.__init__(). Here, we simply ensure that self->read is not NULL. */ - if (self->read == NULL) { + if (unpickler->read == NULL) { PyErr_Format(UnpicklingError, "Unpickler.__init__() was not called by %s.__init__()", - Py_TYPE(self)->tp_name); + Py_TYPE(unpickler)->tp_name); return NULL; } - return load(self); + return load(unpickler); } /* The name of find_class() is misleading. In newer pickle protocols, this function is used for loading any global (i.e., functions), not just classes. The name is kept only for backward compatibility. 
*/ -PyDoc_STRVAR(Unpickler_find_class_doc, -"find_class(module_name, global_name) -> object.\n" +/*[clinic] + +_pickle.Unpickler.find_class + + self: UnpicklerObject + module_name: object + global_name: object + / + +Return an object from a specified module. + +If necessary, the module will be imported. Subclasses may override this +method (e.g. to restrict unpickling of arbitrary classes and functions). + +This method is called whenever a class or a function object is +needed. Both arguments passed are str objects. +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler_find_class__doc__, +"find_class(module_name, global_name)\n" +"Return an object from a specified module.\n" "\n" -"Return an object from a specified module, importing the module if\n" -"necessary. Subclasses may override this method (e.g. to restrict\n" -"unpickling of arbitrary classes and functions).\n" +"If necessary, the module will be imported. Subclasses may override this\n" +"method (e.g. to restrict unpickling of arbitrary classes and functions).\n" "\n" "This method is called whenever a class or a function object is\n" -"needed. Both arguments passed are str objects.\n"); +"needed. Both arguments passed are str objects."); + +#define _PICKLE_UNPICKLER_FIND_CLASS_METHODDEF \ + {"find_class", (PyCFunction)_pickle_Unpickler_find_class, METH_VARARGS, _pickle_Unpickler_find_class__doc__}, static PyObject * -Unpickler_find_class(UnpicklerObject *self, PyObject *args) +_pickle_Unpickler_find_class_impl(UnpicklerObject *self, PyObject *module_name, PyObject *global_name); + +static PyObject * +_pickle_Unpickler_find_class(PyObject *self, PyObject *args) +{ + PyObject *return_value = NULL; + PyObject *module_name; + PyObject *global_name; + + if (!PyArg_ParseTuple(args, + "OO:find_class", + &module_name, &global_name)) + goto exit; + return_value = _pickle_Unpickler_find_class_impl((UnpicklerObject *)self, module_name, global_name); + +exit: + return return_value; +} + +static PyObject * +_pickle_Unpickler_find_class_impl(UnpicklerObject *self, PyObject *module_name, PyObject *global_name) +/*[clinic checksum: b7d05d4dd8adc698e5780c1ac2be0f5062d33915]*/ { PyObject *global; PyObject *modules_dict; PyObject *module; - PyObject *module_name, *global_name; _Py_IDENTIFIER(modules); - if (!PyArg_UnpackTuple(args, "find_class", 2, 2, - &module_name, &global_name)) - return NULL; - /* Try to map the old names used in Python 2.x to the new ones used in Python 3.x. We do this only with old pickle protocols and when the user has not disabled the feature. */ @@ -6065,10 +6263,8 @@ } static struct PyMethodDef Unpickler_methods[] = { - {"load", (PyCFunction)Unpickler_load, METH_NOARGS, - Unpickler_load_doc}, - {"find_class", (PyCFunction)Unpickler_find_class, METH_VARARGS, - Unpickler_find_class_doc}, + _PICKLE_UNPICKLER_LOAD_METHODDEF + _PICKLE_UNPICKLER_FIND_CLASS_METHODDEF {NULL, NULL} /* sentinel */ }; @@ -6135,9 +6331,41 @@ return 0; } -PyDoc_STRVAR(Unpickler_doc, -"Unpickler(file, *, encoding='ASCII', errors='strict')" -"\n" +/*[clinic] + +_pickle.Unpickler.__init__ + + self: UnpicklerObject + file: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +This takes a binary file for reading a pickle data stream. + +The protocol version of the pickle is detected automatically, so no +proto argument is needed. + +The file-like object must have two methods, a read() method +that takes an integer argument, and a readline() method that +requires no arguments. Both methods should return bytes. 
+Thus file-like object can be a binary file object opened for +reading, a BytesIO object, or any other custom object that +meets this interface. + +Optional keyword arguments are *fix_imports*, *encoding* and *errors*, +which are used to control compatiblity support for pickle stream +generated by Python 2.x. If *fix_imports* is True, pickle will try to +map the old Python 2.x names to the new names used in Python 3.x. The +*encoding* and *errors* tell pickle how to decode 8-bit string +instances pickled by Python 2.x; these default to 'ASCII' and +'strict', respectively. + +[clinic]*/ + +PyDoc_STRVAR(_pickle_Unpickler___init____doc__, +"__init__(file, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" "This takes a binary file for reading a pickle data stream.\n" "\n" "The protocol version of the pickle is detected automatically, so no\n" @@ -6155,57 +6383,60 @@ "generated by Python 2.x. If *fix_imports* is True, pickle will try to\n" "map the old Python 2.x names to the new names used in Python 3.x. The\n" "*encoding* and *errors* tell pickle how to decode 8-bit string\n" -"instances pickled by Python 2.x; these default to 'ASCII' and\n" -"'strict', respectively.\n"); - -static int -Unpickler_init(UnpicklerObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "fix_imports", "encoding", "errors", 0}; +"instances pickled by Python 2.x; these default to \'ASCII\' and\n" +"\'strict\', respectively."); + +#define _PICKLE_UNPICKLER___INIT___METHODDEF \ + {"__init__", (PyCFunction)_pickle_Unpickler___init__, METH_VARARGS|METH_KEYWORDS, _pickle_Unpickler___init____doc__}, + +static PyObject * +_pickle_Unpickler___init___impl(UnpicklerObject *self, PyObject *file, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_Unpickler___init__(PyObject *self, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "fix_imports", "encoding", "errors", NULL}; PyObject *file; - PyObject *fix_imports = Py_True; - char *encoding = NULL; - char *errors = NULL; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:__init__", _keywords, + &file, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_Unpickler___init___impl((UnpicklerObject *)self, file, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_Unpickler___init___impl(UnpicklerObject *self, PyObject *file, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: bed0d8bbe1c647960ccc6f997b33bf33935fa56f]*/ +{ _Py_IDENTIFIER(persistent_load); - /* XXX: That is an horrible error message. But, I don't know how to do - better... */ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "%s takes exactly one positional argument (%zd given)", - Py_TYPE(self)->tp_name, Py_SIZE(args)); - return -1; - } - - /* Arguments parsing needs to be done in the __init__() method to allow - subclasses to define their own __init__() method, which may (or may - not) support Unpickler arguments. However, this means we need to be - extra careful in the other Unpickler methods, since a subclass could - forget to call Unpickler.__init__() thus breaking our internal - invariants. 
*/ - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:Unpickler", kwlist, - &file, &fix_imports, &encoding, &errors)) - return -1; - /* In case of multiple __init__() calls, clear previous content. */ if (self->read != NULL) (void)Unpickler_clear(self); if (_Unpickler_SetInputStream(self, file) < 0) - return -1; + return NULL; if (_Unpickler_SetInputEncoding(self, encoding, errors) < 0) - return -1; - - self->fix_imports = PyObject_IsTrue(fix_imports); + return NULL; + + self->fix_imports = fix_imports; if (self->fix_imports == -1) - return -1; + return NULL; if (_PyObject_HasAttrId((PyObject *)self, &PyId_persistent_load)) { self->pers_func = _PyObject_GetAttrId((PyObject *)self, &PyId_persistent_load); if (self->pers_func == NULL) - return -1; + return NULL; } else { self->pers_func = NULL; @@ -6213,16 +6444,26 @@ self->stack = (Pdata *)Pdata_New(); if (self->stack == NULL) - return -1; + return NULL; self->memo_size = 32; self->memo = _Unpickler_NewMemo(self->memo_size); if (self->memo == NULL) - return -1; + return NULL; self->arg = NULL; self->proto = 0; + return Py_None; +} + +/* XXX Slight hack to slot a Clinic generated signature in tp_init. */ +static int +Unpickler_init(PyObject *self, PyObject *args, PyObject *kwargs) +{ + if (_pickle_Unpickler___init__(self, args, kwargs) == NULL) { + return -1; + } return 0; } @@ -6244,11 +6485,24 @@ UnpicklerObject *unpickler; } UnpicklerMemoProxyObject; -PyDoc_STRVAR(ump_clear_doc, -"memo.clear() -> None. Remove all items from memo."); +/*[clinic] +_pickle.UnpicklerMemoProxy.clear + + self: UnpicklerMemoProxyObject + +Remove all items from memo. +[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy_clear__doc__, +"clear()\n" +"Remove all items from memo."); + +#define _PICKLE_UNPICKLERMEMOPROXY_CLEAR_METHODDEF \ + {"clear", (PyCFunction)_pickle_UnpicklerMemoProxy_clear, METH_NOARGS, _pickle_UnpicklerMemoProxy_clear__doc__}, static PyObject * -ump_clear(UnpicklerMemoProxyObject *self) +_pickle_UnpicklerMemoProxy_clear(UnpicklerMemoProxyObject *self) +/*[clinic checksum: 46fecf4e33c0c873124f845edf6cc3a2e9864bd5]*/ { _Unpickler_MemoCleanup(self->unpickler); self->unpickler->memo = _Unpickler_NewMemo(self->unpickler->memo_size); @@ -6257,11 +6511,24 @@ Py_RETURN_NONE; } -PyDoc_STRVAR(ump_copy_doc, -"memo.copy() -> new_memo. Copy the memo to a new object."); +/*[clinic] +_pickle.UnpicklerMemoProxy.copy + + self: UnpicklerMemoProxyObject + +Copy the memo to a new object. +[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy_copy__doc__, +"copy()\n" +"Copy the memo to a new object."); + +#define _PICKLE_UNPICKLERMEMOPROXY_COPY_METHODDEF \ + {"copy", (PyCFunction)_pickle_UnpicklerMemoProxy_copy, METH_NOARGS, _pickle_UnpicklerMemoProxy_copy__doc__}, static PyObject * -ump_copy(UnpicklerMemoProxyObject *self) +_pickle_UnpicklerMemoProxy_copy(UnpicklerMemoProxyObject *self) +/*[clinic checksum: f8856c4e8a33540886dfbb245f286af3008fa0ad]*/ { Py_ssize_t i; PyObject *new_memo = PyDict_New(); @@ -6291,15 +6558,28 @@ return NULL; } -PyDoc_STRVAR(ump_reduce_doc, -"memo.__reduce__(). Pickling support."); +/*[clinic] +_pickle.UnpicklerMemoProxy.__reduce__ + + self: UnpicklerMemoProxyObject + +Implement pickling support. 
+[clinic]*/ + +PyDoc_STRVAR(_pickle_UnpicklerMemoProxy___reduce____doc__, +"__reduce__()\n" +"Implement pickling support."); + +#define _PICKLE_UNPICKLERMEMOPROXY___REDUCE___METHODDEF \ + {"__reduce__", (PyCFunction)_pickle_UnpicklerMemoProxy___reduce__, METH_NOARGS, _pickle_UnpicklerMemoProxy___reduce____doc__}, static PyObject * -ump_reduce(UnpicklerMemoProxyObject *self, PyObject *args) +_pickle_UnpicklerMemoProxy___reduce__(UnpicklerMemoProxyObject *self) +/*[clinic checksum: ab5516a77659144e1191c7dd70a0c6c7455660bc]*/ { PyObject *reduce_value; PyObject *constructor_args; - PyObject *contents = ump_copy(self); + PyObject *contents = _pickle_UnpicklerMemoProxy_copy(self); if (contents == NULL) return NULL; @@ -6322,9 +6602,9 @@ } static PyMethodDef unpicklerproxy_methods[] = { - {"clear", (PyCFunction)ump_clear, METH_NOARGS, ump_clear_doc}, - {"copy", (PyCFunction)ump_copy, METH_NOARGS, ump_copy_doc}, - {"__reduce__", (PyCFunction)ump_reduce, METH_VARARGS, ump_reduce_doc}, + _PICKLE_UNPICKLERMEMOPROXY_CLEAR_METHODDEF + _PICKLE_UNPICKLERMEMOPROXY_COPY_METHODDEF + _PICKLE_UNPICKLERMEMOPROXY___REDUCE___METHODDEF {NULL, NULL} /* sentinel */ }; @@ -6548,7 +6828,7 @@ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, - Unpickler_doc, /*tp_doc*/ + _pickle_Unpickler___init____doc__, /*tp_doc*/ (traverseproc)Unpickler_traverse, /*tp_traverse*/ (inquiry)Unpickler_clear, /*tp_clear*/ 0, /*tp_richcompare*/ @@ -6563,21 +6843,53 @@ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ - (initproc)Unpickler_init, /*tp_init*/ + Unpickler_init, /*tp_init*/ PyType_GenericAlloc, /*tp_alloc*/ PyType_GenericNew, /*tp_new*/ PyObject_GC_Del, /*tp_free*/ 0, /*tp_is_gc*/ }; -PyDoc_STRVAR(pickle_dump_doc, -"dump(obj, file, protocol=None, *, fix_imports=True) -> None\n" +/*[clinic] + +_pickle.dump + + obj: object + file: object + protocol: object = NULL + * + fix_imports: bool = True + +Write a pickled representation of obj to the open file object file. + +This is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more +efficient. + +The optional protocol argument tells the pickler to use the given protocol +supported protocols are 0, 1, 2, 3. The default protocol is 3; a +backward-incompatible protocol designed for Python 3.0. + +Specifying a negative protocol version selects the highest protocol version +supported. The higher the protocol used, the more recent the version of +Python needed to read the pickle produced. + +The file argument must have a write() method that accepts a single bytes +argument. It can thus be a file object opened for binary writing, a +io.BytesIO instance, or any other custom object that meets this interface. + +If fix_imports is True and protocol is less than 3, pickle will try to +map the new Python 3.x names to the old module names used in Python 2.x, +so that the pickle data stream is readable with Python 2.x. +[clinic]*/ + +PyDoc_STRVAR(_pickle_dump__doc__, +"dump(obj, file, protocol=None, *, fix_imports=True)\n" +"Write a pickled representation of obj to the open file object file.\n" "\n" -"Write a pickled representation of obj to the open file object file. 
This\n" -"is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more\n" +"This is equivalent to ``Pickler(file, protocol).dump(obj)``, but may be more\n" "efficient.\n" "\n" -"The optional protocol argument tells the pickler to use the given protocol;\n" +"The optional protocol argument tells the pickler to use the given protocol\n" "supported protocols are 0, 1, 2, 3. The default protocol is 3; a\n" "backward-incompatible protocol designed for Python 3.0.\n" "\n" @@ -6591,35 +6903,44 @@ "\n" "If fix_imports is True and protocol is less than 3, pickle will try to\n" "map the new Python 3.x names to the old module names used in Python 2.x,\n" -"so that the pickle data stream is readable with Python 2.x.\n"); +"so that the pickle data stream is readable with Python 2.x."); + +#define _PICKLE_DUMP_METHODDEF \ + {"dump", (PyCFunction)_pickle_dump, METH_VARARGS|METH_KEYWORDS, _pickle_dump__doc__}, static PyObject * -pickle_dump(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"obj", "file", "protocol", "fix_imports", 0}; +_pickle_dump_impl(PyModuleDef *module, PyObject *obj, PyObject *file, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_dump(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"obj", "file", "protocol", "fix_imports", NULL}; PyObject *obj; PyObject *file; - PyObject *proto = NULL; - PyObject *fix_imports = Py_True; - PicklerObject *pickler; - - /* fix_imports is a keyword-only argument. */ - if (Py_SIZE(args) > 3) { - PyErr_Format(PyExc_TypeError, - "pickle.dump() takes at most 3 positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "OO|OO:dump", kwlist, - &obj, &file, &proto, &fix_imports)) - return NULL; - - pickler = _Pickler_New(); + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "OO|O$p:dump", _keywords, + &obj, &file, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_dump_impl(module, obj, file, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_dump_impl(PyModuleDef *module, PyObject *obj, PyObject *file, PyObject *protocol, int fix_imports) +/*[clinic checksum: e442721b16052d921b5e3fbd146d0a62e94a459e]*/ +{ + PicklerObject *pickler = _Pickler_New(); + if (pickler == NULL) return NULL; - if (_Pickler_SetProtocol(pickler, proto, fix_imports) < 0) + if (_Pickler_SetProtocol(pickler, protocol, fix_imports) < 0) goto error; if (_Pickler_SetOutputStream(pickler, file) < 0) @@ -6639,11 +6960,33 @@ return NULL; } -PyDoc_STRVAR(pickle_dumps_doc, -"dumps(obj, protocol=None, *, fix_imports=True) -> bytes\n" -"\n" -"Return the pickled representation of the object as a bytes\n" -"object, instead of writing it to a file.\n" +/*[clinic] + +_pickle.dumps + + obj: object + protocol: object = NULL + * + fix_imports: bool = True + +Return the pickled representation of the object as a bytes object. + +The optional protocol argument tells the pickler to use the given protocol; +supported protocols are 0, 1, 2, 3. The default protocol is 3; a +backward-incompatible protocol designed for Python 3.0. + +Specifying a negative protocol version selects the highest protocol version +supported. The higher the protocol used, the more recent the version of +Python needed to read the pickle produced. 
+ +If fix_imports is True and *protocol* is less than 3, pickle will try to +map the new Python 3.x names to the old module names used in Python 2.x, +so that the pickle data stream is readable with Python 2.x. +[clinic]*/ + +PyDoc_STRVAR(_pickle_dumps__doc__, +"dumps(obj, protocol=None, *, fix_imports=True)\n" +"Return the pickled representation of the object as a bytes object.\n" "\n" "The optional protocol argument tells the pickler to use the given protocol;\n" "supported protocols are 0, 1, 2, 3. The default protocol is 3; a\n" @@ -6655,35 +6998,44 @@ "\n" "If fix_imports is True and *protocol* is less than 3, pickle will try to\n" "map the new Python 3.x names to the old module names used in Python 2.x,\n" -"so that the pickle data stream is readable with Python 2.x.\n"); +"so that the pickle data stream is readable with Python 2.x."); + +#define _PICKLE_DUMPS_METHODDEF \ + {"dumps", (PyCFunction)_pickle_dumps, METH_VARARGS|METH_KEYWORDS, _pickle_dumps__doc__}, static PyObject * -pickle_dumps(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"obj", "protocol", "fix_imports", 0}; +_pickle_dumps_impl(PyModuleDef *module, PyObject *obj, PyObject *protocol, int fix_imports); + +static PyObject * +_pickle_dumps(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"obj", "protocol", "fix_imports", NULL}; PyObject *obj; - PyObject *proto = NULL; + PyObject *protocol = NULL; + int fix_imports = 1; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|O$p:dumps", _keywords, + &obj, &protocol, &fix_imports)) + goto exit; + return_value = _pickle_dumps_impl(module, obj, protocol, fix_imports); + +exit: + return return_value; +} + +static PyObject * +_pickle_dumps_impl(PyModuleDef *module, PyObject *obj, PyObject *protocol, int fix_imports) +/*[clinic checksum: df6262c4c487f537f47aec8a1709318204c1e174]*/ +{ PyObject *result; - PyObject *fix_imports = Py_True; - PicklerObject *pickler; - - /* fix_imports is a keyword-only argument. */ - if (Py_SIZE(args) > 2) { - PyErr_Format(PyExc_TypeError, - "pickle.dumps() takes at most 2 positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|OO:dumps", kwlist, - &obj, &proto, &fix_imports)) - return NULL; - - pickler = _Pickler_New(); + PicklerObject *pickler = _Pickler_New(); + if (pickler == NULL) return NULL; - if (_Pickler_SetProtocol(pickler, proto, fix_imports) < 0) + if (_Pickler_SetProtocol(pickler, protocol, fix_imports) < 0) goto error; if (dump(pickler, obj) < 0) @@ -6698,15 +7050,46 @@ return NULL; } -PyDoc_STRVAR(pickle_load_doc, -"load(file, *, fix_imports=True, encoding='ASCII', errors='strict') -> object\n" +/*[clinic] + +_pickle.load + + file: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +Return a reconstituted object from the pickle data stored in a file. + +This is equivalent to ``Unpickler(file).load()``, but may be more efficient. + +The protocol version of the pickle is detected automatically, so no protocol +argument is needed. Bytes past the pickled object's representation are +ignored. + +The argument file must have two methods, a read() method that takes an +integer argument, and a readline() method that requires no arguments. Both +methods should return bytes. Thus *file* can be a binary file object opened +for reading, a BytesIO object, or any other custom object that meets this +interface. 
+ +Optional keyword arguments are fix_imports, encoding and errors, +which are used to control compatiblity support for pickle stream generated +by Python 2.x. If fix_imports is True, pickle will try to map the old +Python 2.x names to the new names used in Python 3.x. The encoding and +errors tell pickle how to decode 8-bit string instances pickled by Python +2.x; these default to 'ASCII' and 'strict', respectively. +[clinic]*/ + +PyDoc_STRVAR(_pickle_load__doc__, +"load(file, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" +"Return a reconstituted object from the pickle data stored in a file.\n" "\n" -"Read a pickled object representation from the open file object file and\n" -"return the reconstituted object hierarchy specified therein. This is\n" -"equivalent to ``Unpickler(file).load()``, but may be more efficient.\n" +"This is equivalent to ``Unpickler(file).load()``, but may be more efficient.\n" "\n" "The protocol version of the pickle is detected automatically, so no protocol\n" -"argument is needed. Bytes past the pickled object's representation are\n" +"argument is needed. Bytes past the pickled object\'s representation are\n" "ignored.\n" "\n" "The argument file must have two methods, a read() method that takes an\n" @@ -6720,32 +7103,41 @@ "by Python 2.x. If fix_imports is True, pickle will try to map the old\n" "Python 2.x names to the new names used in Python 3.x. The encoding and\n" "errors tell pickle how to decode 8-bit string instances pickled by Python\n" -"2.x; these default to 'ASCII' and 'strict', respectively.\n"); +"2.x; these default to \'ASCII\' and \'strict\', respectively."); + +#define _PICKLE_LOAD_METHODDEF \ + {"load", (PyCFunction)_pickle_load, METH_VARARGS|METH_KEYWORDS, _pickle_load__doc__}, static PyObject * -pickle_load(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"file", "fix_imports", "encoding", "errors", 0}; +_pickle_load_impl(PyModuleDef *module, PyObject *file, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_load(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"file", "fix_imports", "encoding", "errors", NULL}; PyObject *file; - PyObject *fix_imports = Py_True; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:load", _keywords, + &file, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_load_impl(module, file, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_load_impl(PyModuleDef *module, PyObject *file, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: e10796f6765b22ce48dca6940f11b3933853ca35]*/ +{ PyObject *result; - char *encoding = NULL; - char *errors = NULL; - UnpicklerObject *unpickler; - - /* fix_imports, encoding and errors are a keyword-only argument. 
*/ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "pickle.load() takes exactly one positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:load", kwlist, - &file, &fix_imports, &encoding, &errors)) - return NULL; - - unpickler = _Unpickler_New(); + UnpicklerObject *unpickler = _Unpickler_New(); + if (unpickler == NULL) return NULL; @@ -6755,9 +7147,7 @@ if (_Unpickler_SetInputEncoding(unpickler, encoding, errors) < 0) goto error; - unpickler->fix_imports = PyObject_IsTrue(fix_imports); - if (unpickler->fix_imports == -1) - goto error; + unpickler->fix_imports = fix_imports; result = load(unpickler); Py_DECREF(unpickler); @@ -6768,14 +7158,36 @@ return NULL; } -PyDoc_STRVAR(pickle_loads_doc, -"loads(input, *, fix_imports=True, encoding='ASCII', errors='strict') -> object\n" -"\n" -"Read a pickled object hierarchy from a bytes object and return the\n" -"reconstituted object hierarchy specified therein\n" +/*[clinic] + +_pickle.loads + + data: object + * + fix_imports: bool = True + encoding: str = 'ASCII' + errors: str = 'strict' + +Return a reconstituted object from the given pickle data. + +The protocol version of the pickle is detected automatically, so no protocol +argument is needed. Bytes past the pickled object's representation are +ignored. + +Optional keyword arguments are fix_imports, encoding and errors, which +are used to control compatiblity support for pickle stream generated +by Python 2.x. If fix_imports is True, pickle will try to map the old +Python 2.x names to the new names used in Python 3.x. The encoding and +errors tell pickle how to decode 8-bit string instances pickled by Python +2.x; these default to 'ASCII' and 'strict', respectively. +[clinic]*/ + +PyDoc_STRVAR(_pickle_loads__doc__, +"loads(data, *, fix_imports=True, encoding=\'ASCII\', errors=\'strict\')\n" +"Return a reconstituted object from the given pickle data.\n" "\n" "The protocol version of the pickle is detected automatically, so no protocol\n" -"argument is needed. Bytes past the pickled object's representation are\n" +"argument is needed. Bytes past the pickled object\'s representation are\n" "ignored.\n" "\n" "Optional keyword arguments are fix_imports, encoding and errors, which\n" @@ -6783,44 +7195,51 @@ "by Python 2.x. If fix_imports is True, pickle will try to map the old\n" "Python 2.x names to the new names used in Python 3.x. 
The encoding and\n" "errors tell pickle how to decode 8-bit string instances pickled by Python\n" -"2.x; these default to 'ASCII' and 'strict', respectively.\n"); +"2.x; these default to \'ASCII\' and \'strict\', respectively."); + +#define _PICKLE_LOADS_METHODDEF \ + {"loads", (PyCFunction)_pickle_loads, METH_VARARGS|METH_KEYWORDS, _pickle_loads__doc__}, static PyObject * -pickle_loads(PyObject *self, PyObject *args, PyObject *kwds) -{ - static char *kwlist[] = {"input", "fix_imports", "encoding", "errors", 0}; - PyObject *input; - PyObject *fix_imports = Py_True; +_pickle_loads_impl(PyModuleDef *module, PyObject *data, int fix_imports, const char *encoding, const char *errors); + +static PyObject * +_pickle_loads(PyModuleDef *module, PyObject *args, PyObject *kwargs) +{ + PyObject *return_value = NULL; + static char *_keywords[] = {"data", "fix_imports", "encoding", "errors", NULL}; + PyObject *data; + int fix_imports = 1; + const char *encoding = "ASCII"; + const char *errors = "strict"; + + if (!PyArg_ParseTupleAndKeywords(args, kwargs, + "O|$pss:loads", _keywords, + &data, &fix_imports, &encoding, &errors)) + goto exit; + return_value = _pickle_loads_impl(module, data, fix_imports, encoding, errors); + +exit: + return return_value; +} + +static PyObject * +_pickle_loads_impl(PyModuleDef *module, PyObject *data, int fix_imports, const char *encoding, const char *errors) +/*[clinic checksum: 29ee725efcbf51a3533c19cb8261a8e267b7080a]*/ +{ PyObject *result; - char *encoding = NULL; - char *errors = NULL; - UnpicklerObject *unpickler; - - /* fix_imports, encoding and errors are a keyword-only argument. */ - if (Py_SIZE(args) != 1) { - PyErr_Format(PyExc_TypeError, - "pickle.loads() takes exactly one positional " - "argument (%zd given)", Py_SIZE(args)); - return NULL; - } - - if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|Oss:loads", kwlist, - &input, &fix_imports, &encoding, &errors)) - return NULL; - - unpickler = _Unpickler_New(); + UnpicklerObject *unpickler = _Unpickler_New(); + if (unpickler == NULL) return NULL; - if (_Unpickler_SetStringInput(unpickler, input) < 0) + if (_Unpickler_SetStringInput(unpickler, data) < 0) goto error; if (_Unpickler_SetInputEncoding(unpickler, encoding, errors) < 0) goto error; - unpickler->fix_imports = PyObject_IsTrue(fix_imports); - if (unpickler->fix_imports == -1) - goto error; + unpickler->fix_imports = fix_imports; result = load(unpickler); Py_DECREF(unpickler); @@ -6833,14 +7252,10 @@ static struct PyMethodDef pickle_methods[] = { - {"dump", (PyCFunction)pickle_dump, METH_VARARGS|METH_KEYWORDS, - pickle_dump_doc}, - {"dumps", (PyCFunction)pickle_dumps, METH_VARARGS|METH_KEYWORDS, - pickle_dumps_doc}, - {"load", (PyCFunction)pickle_load, METH_VARARGS|METH_KEYWORDS, - pickle_load_doc}, - {"loads", (PyCFunction)pickle_loads, METH_VARARGS|METH_KEYWORDS, - pickle_loads_doc}, + _PICKLE_DUMP_METHODDEF + _PICKLE_DUMPS_METHODDEF + _PICKLE_LOAD_METHODDEF + _PICKLE_LOADS_METHODDEF {NULL, NULL} /* sentinel */ }; diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -385,6 +385,8 @@ #endif #endif +#define DWORD_MAX 4294967295U + #ifdef MS_WINDOWS static int @@ -897,7 +899,7 @@ length = PyBytes_GET_SIZE(bytes); #ifdef MS_WINDOWS - if (length > MAX_PATH) { + if (length > MAX_PATH-1) { FORMAT_EXCEPTION(PyExc_ValueError, "%s too long for Windows"); Py_DECREF(bytes); return 0; @@ -1376,18 +1378,18 @@ static BOOL __stdcall win32_chdir(LPCSTR path) { - char new_path[MAX_PATH+1]; + char 
new_path[MAX_PATH]; int result; char env[4] = "=x:"; if(!SetCurrentDirectoryA(path)) return FALSE; - result = GetCurrentDirectoryA(MAX_PATH+1, new_path); + result = GetCurrentDirectoryA(Py_ARRAY_LENGTH(new_path), new_path); if (!result) return FALSE; /* In the ANSI API, there should not be any paths longer - than MAX_PATH. */ - assert(result <= MAX_PATH+1); + than MAX_PATH-1 (not including the final null character). */ + assert(result < Py_ARRAY_LENGTH(new_path)); if (strncmp(new_path, "\\\\", 2) == 0 || strncmp(new_path, "//", 2) == 0) /* UNC path, nothing to do. */ @@ -1401,16 +1403,16 @@ static BOOL __stdcall win32_wchdir(LPCWSTR path) { - wchar_t _new_path[MAX_PATH+1], *new_path = _new_path; + wchar_t _new_path[MAX_PATH], *new_path = _new_path; int result; wchar_t env[4] = L"=x:"; if(!SetCurrentDirectoryW(path)) return FALSE; - result = GetCurrentDirectoryW(MAX_PATH+1, new_path); + result = GetCurrentDirectoryW(Py_ARRAY_LENGTH(new_path), new_path); if (!result) return FALSE; - if (result > MAX_PATH+1) { + if (result > Py_ARRAY_LENGTH(new_path)) { new_path = PyMem_RawMalloc(result * sizeof(wchar_t)); if (!new_path) { SetLastError(ERROR_OUTOFMEMORY); @@ -3396,11 +3398,11 @@ PyObject *resobj; DWORD len; Py_BEGIN_ALLOW_THREADS - len = GetCurrentDirectoryW(sizeof wbuf/ sizeof wbuf[0], wbuf); + len = GetCurrentDirectoryW(Py_ARRAY_LENGTH(wbuf), wbuf); /* If the buffer is large enough, len does not include the terminating \0. If the buffer is too small, len includes the space needed for the terminator. */ - if (len >= sizeof wbuf/ sizeof wbuf[0]) { + if (len >= Py_ARRAY_LENGTH(wbuf)) { wbuf2 = PyMem_RawMalloc(len * sizeof(wchar_t)); if (wbuf2) len = GetCurrentDirectoryW(len, wbuf2); @@ -3581,10 +3583,10 @@ HANDLE hFindFile = INVALID_HANDLE_VALUE; BOOL result; WIN32_FIND_DATA FileData; - char namebuf[MAX_PATH+5]; /* Overallocate for \\*.*\0 */ + char namebuf[MAX_PATH+4]; /* Overallocate for "\*.*" */ char *bufptr = namebuf; /* only claim to have space for MAX_PATH */ - Py_ssize_t len = sizeof(namebuf)-5; + Py_ssize_t len = Py_ARRAY_LENGTH(namebuf)-4; PyObject *po = NULL; wchar_t *wnamebuf = NULL; @@ -3873,14 +3875,14 @@ posix__getfullpathname(PyObject *self, PyObject *args) { const char *path; - char outbuf[MAX_PATH*2]; + char outbuf[MAX_PATH]; char *temp; PyObject *po; if (PyArg_ParseTuple(args, "U|:_getfullpathname", &po)) { wchar_t *wpath; - wchar_t woutbuf[MAX_PATH*2], *woutbufp = woutbuf; + wchar_t woutbuf[MAX_PATH], *woutbufp = woutbuf; wchar_t *wtemp; DWORD result; PyObject *v; @@ -4039,24 +4041,31 @@ { PyObject *po, *result; wchar_t *path, *mountpath=NULL; - size_t bufsize; + size_t buflen; BOOL ret; if (!PyArg_ParseTuple(args, "U|:_getvolumepathname", &po)) return NULL; - path = PyUnicode_AsUnicode(po); + path = PyUnicode_AsUnicodeAndSize(po, &buflen); if (path == NULL) return NULL; + buflen += 1; /* Volume path should be shorter than entire path */ - bufsize = max(MAX_PATH, wcslen(path) * 2 * sizeof(wchar_t)+1); - mountpath = (wchar_t *)PyMem_Malloc(bufsize); + buflen = Py_MAX(buflen, MAX_PATH); + + if (buflen > DWORD_MAX) { + PyErr_SetString(PyExc_OverflowError, "path too long"); + return NULL; + } + + mountpath = (wchar_t *)PyMem_Malloc(buflen * sizeof(wchar_t)); if (mountpath == NULL) return PyErr_NoMemory(); Py_BEGIN_ALLOW_THREADS ret = GetVolumePathNameW(path, mountpath, - Py_SAFE_DOWNCAST(bufsize, size_t, DWORD)); + Py_SAFE_DOWNCAST(buflen, size_t, DWORD)); Py_END_ALLOW_THREADS if (!ret) { diff --git a/setup.py b/setup.py --- a/setup.py +++ b/setup.py @@ -1956,7 +1956,7 @@ 
undef_macros = [] if '--with-system-libmpdec' in sysconfig.get_config_var("CONFIG_ARGS"): include_dirs = [] - libraries = ['mpdec'] + libraries = [':libmpdec.so.2'] sources = ['_decimal/_decimal.c'] depends = ['_decimal/docstrings.h'] else: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:14:26 2013 From: python-checkins at python.org (christian.heimes) Date: Sun, 24 Nov 2013 23:14:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319758=3A_silence_?= =?utf-8?q?PendingDeprecationWarnings_in_test=5Fhmac?= Message-ID: <3dSQdt1jspz7LjN@mail.python.org> http://hg.python.org/cpython/rev/53ba43ed7f27 changeset: 87525:53ba43ed7f27 user: Christian Heimes date: Sun Nov 24 23:14:16 2013 +0100 summary: Issue #19758: silence PendingDeprecationWarnings in test_hmac I also removed some bare excepts from the tests. files: Lib/test/test_hmac.py | 47 +++++++++++++++++++++--------- 1 files changed, 33 insertions(+), 14 deletions(-) diff --git a/Lib/test/test_hmac.py b/Lib/test/test_hmac.py --- a/Lib/test/test_hmac.py +++ b/Lib/test/test_hmac.py @@ -1,9 +1,21 @@ +import functools import hmac import hashlib import unittest import warnings from test import support + +def ignore_warning(func): + @functools.wraps(func) + def wrapper(*args, **kwargs): + with warnings.catch_warnings(): + warnings.filterwarnings("ignore", + category=PendingDeprecationWarning) + return func(*args, **kwargs) + return wrapper + + class TestVectorsTestCase(unittest.TestCase): def test_md5_vectors(self): @@ -264,56 +276,63 @@ class ConstructorTestCase(unittest.TestCase): + @ignore_warning def test_normal(self): # Standard constructor call. failed = 0 try: h = hmac.HMAC(b"key") - except: + except Exception: self.fail("Standard constructor call raised exception.") + @ignore_warning def test_with_str_key(self): # Pass a key of type str, which is an error, because it expects a key # of type bytes with self.assertRaises(TypeError): h = hmac.HMAC("key") + @ignore_warning def test_dot_new_with_str_key(self): # Pass a key of type str, which is an error, because it expects a key # of type bytes with self.assertRaises(TypeError): h = hmac.new("key") + @ignore_warning def test_withtext(self): # Constructor call with text. try: h = hmac.HMAC(b"key", b"hash this!") - except: + except Exception: self.fail("Constructor call with text argument raised exception.") + self.assertEqual(h.hexdigest(), '34325b639da4cfd95735b381e28cb864') def test_with_bytearray(self): try: - h = hmac.HMAC(bytearray(b"key"), bytearray(b"hash this!")) - self.assertEqual(h.hexdigest(), '34325b639da4cfd95735b381e28cb864') - except: + h = hmac.HMAC(bytearray(b"key"), bytearray(b"hash this!"), + digestmod="md5") + except Exception: self.fail("Constructor call with bytearray arguments raised exception.") + self.assertEqual(h.hexdigest(), '34325b639da4cfd95735b381e28cb864') def test_with_memoryview_msg(self): try: - h = hmac.HMAC(b"key", memoryview(b"hash this!")) - self.assertEqual(h.hexdigest(), '34325b639da4cfd95735b381e28cb864') - except: + h = hmac.HMAC(b"key", memoryview(b"hash this!"), digestmod="md5") + except Exception: self.fail("Constructor call with memoryview msg raised exception.") + self.assertEqual(h.hexdigest(), '34325b639da4cfd95735b381e28cb864') def test_withmodule(self): # Constructor call with text and digest module. 
try: h = hmac.HMAC(b"key", b"", hashlib.sha1) - except: + except Exception: self.fail("Constructor call with hashlib.sha1 raised exception.") class SanityTestCase(unittest.TestCase): + @ignore_warning def test_default_is_md5(self): # Testing if HMAC defaults to MD5 algorithm. # NOTE: this whitebox test depends on the hmac class internals @@ -324,19 +343,19 @@ # Exercising all methods once. # This must not raise any exceptions try: - h = hmac.HMAC(b"my secret key") + h = hmac.HMAC(b"my secret key", digestmod="md5") h.update(b"compute the hash of this text!") dig = h.digest() dig = h.hexdigest() h2 = h.copy() - except: + except Exception: self.fail("Exception raised during normal usage of HMAC class.") class CopyTestCase(unittest.TestCase): def test_attributes(self): # Testing if attributes are of same type. - h1 = hmac.HMAC(b"key") + h1 = hmac.HMAC(b"key", digestmod="md5") h2 = h1.copy() self.assertTrue(h1.digest_cons == h2.digest_cons, "digest constructors don't match.") @@ -347,7 +366,7 @@ def test_realcopy(self): # Testing if the copy method created a real copy. - h1 = hmac.HMAC(b"key") + h1 = hmac.HMAC(b"key", digestmod="md5") h2 = h1.copy() # Using id() in case somebody has overridden __eq__/__ne__. self.assertTrue(id(h1) != id(h2), "No real copy of the HMAC instance.") @@ -358,7 +377,7 @@ def test_equality(self): # Testing if the copy has the same digests. - h1 = hmac.HMAC(b"key") + h1 = hmac.HMAC(b"key", digestmod="md5") h1.update(b"some random text") h2 = h1.copy() self.assertEqual(h1.digest(), h2.digest(), -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:21:33 2013 From: python-checkins at python.org (larry.hastings) Date: Sun, 24 Nov 2013 23:21:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Updated_PEP_429=2C_the_3=2E4_?= =?utf-8?q?release_schedule=2E?= Message-ID: <3dSQp51r3sz7Lk8@mail.python.org> http://hg.python.org/peps/rev/8261770a386f changeset: 5319:8261770a386f user: Larry Hastings date: Sun Nov 24 14:21:08 2013 -0800 summary: Updated PEP 429, the 3.4 release schedule. files: pep-0429.txt | 41 ++++++++++++++------------------------- 1 files changed, 15 insertions(+), 26 deletions(-) diff --git a/pep-0429.txt b/pep-0429.txt --- a/pep-0429.txt +++ b/pep-0429.txt @@ -40,13 +40,12 @@ - 3.4.0 alpha 2: September 9, 2013 - 3.4.0 alpha 3: September 29, 2013 - 3.4.0 alpha 4: October 20, 2013 - -The anticipated schedule for future releases: - - 3.4.0 beta 1: November 24, 2013 (Beta 1 is also "feature freeze"--no new features beyond this point.) 
+The anticipated schedule for future releases: + - 3.4.0 beta 2: January 5, 2014 - 3.4.0 candidate 1: January 19, 2014 - 3.4.0 candidate 2: February 2, 2014 @@ -67,41 +66,31 @@ Implemented / Final PEPs: -* PEP 428, the pathlib module -- object-oriented filesystem paths +* PEP 428, a "pathlib" module providing object-oriented filesystem paths * PEP 435, a standardized "enum" module +* PEP 436, a build enhancement that will help generate introspection + information for builtins * PEP 442, improved semantics for object finalization * PEP 443, adding single-dispatch generic functions to the standard library * PEP 445, a new C API for implementing custom memory allocators -* PEP 446, make newly created file descriptors not inherited by default -* PEP 450, basic statistics module for the standard library -* PEP 451, a ModuleSpec Type for the Import System -* PEP 453, pip bootstrapping/bundling with CPython -* PEP 454, the tracemalloc module for tracing Python memory allocations -* PEP 456, secure and interchangeable hash algorithm -* PEP 3154, Pickle protocol revision 4 -* PEP 3156, improved asynchronous IO support +* PEP 446, changing file descriptors to not be inherited by default + in subprocesses +* PEP 450, a new "statistics" module +* PEP 451, standardizing module metadata for Python's module import system +* PEP 453, a bundled installer for the *pip* package manager +* PEP 454, a new "tracemalloc" module for tracing Python memory allocations +* PEP 456, a new hash algorithm for Python strings and binary data +* PEP 3154, a new and improved protocol for pickled objects +* PEP 3156, a new "asyncio" module, a new framework for asynchronous I/O -Other final large-scale changes: - -* None so far - -Candidate PEPs: +Deferred to post-3.4: * PEP 431, improved support for time zone databases -* PEP 436, a build-time preprocessor for builtin argument parsing * PEP 441, improved Python zip application support * PEP 447, support for __locallookup__ metaclass method * PEP 448, additional unpacking generalizations * PEP 455, key transforming dictionary -Other proposed large-scale changes: - -* Introspection information for builtins -* Email version 6 - -Deferred to post-3.4: - -* None so far Copyright ========= -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Sun Nov 24 23:43:11 2013 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 24 Nov 2013 23:43:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Document_that_?= =?utf-8?q?=40property_can_incorporate_a_docstring_from_the_getter_method?= =?utf-8?q?=2E?= Message-ID: <3dSRH36kzyz7LjY@mail.python.org> http://hg.python.org/cpython/rev/0a148a029328 changeset: 87526:0a148a029328 branch: 2.7 parent: 87512:77e3e395f446 user: Raymond Hettinger date: Sun Nov 24 14:43:04 2013 -0800 summary: Document that @property can incorporate a docstring from the getter method. Improve readabilty with additional whitespace. files: Objects/descrobject.c | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1363,21 +1363,25 @@ "\n" "fget is a function to be used for getting an attribute value, and likewise\n" "fset is a function for setting, and fdel a function for del'ing, an\n" -"attribute. Typical use is to define a managed attribute x:\n" +"attribute. 
Typical use is to define a managed attribute x:\n\n" "class C(object):\n" " def getx(self): return self._x\n" " def setx(self, value): self._x = value\n" " def delx(self): del self._x\n" " x = property(getx, setx, delx, \"I'm the 'x' property.\")\n" "\n" -"Decorators make defining new properties or modifying existing ones easy:\n" +"Decorators make defining new properties or modifying existing ones easy:\n\n" "class C(object):\n" " @property\n" -" def x(self): return self._x\n" +" def x(self):\n" +" \"\I am the 'x' property.\"\n" +" return self._x\n" " @x.setter\n" -" def x(self, value): self._x = value\n" +" def x(self, value):\n" +" self._x = value\n" " @x.deleter\n" -" def x(self): del self._x\n" +" def x(self):\n" +" del self._x\n" ); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:54:15 2013 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 24 Nov 2013 23:54:15 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Document_that_?= =?utf-8?q?=40property_can_incorporate_a_docstring_from_the_getter_method?= =?utf-8?q?=2E?= Message-ID: <3dSRWq4lv2z7LlD@mail.python.org> http://hg.python.org/cpython/rev/5ea2e6e3e3d3 changeset: 87527:5ea2e6e3e3d3 branch: 3.3 parent: 87517:cbd78679080b user: Raymond Hettinger date: Sun Nov 24 14:53:29 2013 -0800 summary: Document that @property can incorporate a docstring from the getter method. Improve readabilty with additional whitespace. files: Objects/descrobject.c | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1512,21 +1512,25 @@ "\n" "fget is a function to be used for getting an attribute value, and likewise\n" "fset is a function for setting, and fdel a function for del'ing, an\n" -"attribute. Typical use is to define a managed attribute x:\n" +"attribute. 
Typical use is to define a managed attribute x:\n\n" "class C(object):\n" " def getx(self): return self._x\n" " def setx(self, value): self._x = value\n" " def delx(self): del self._x\n" " x = property(getx, setx, delx, \"I'm the 'x' property.\")\n" "\n" -"Decorators make defining new properties or modifying existing ones easy:\n" +"Decorators make defining new properties or modifying existing ones easy:\n\n" "class C(object):\n" " @property\n" -" def x(self): return self._x\n" +" def x(self):\n" +" \"\I am the 'x' property.\"\n" +" return self._x\n" " @x.setter\n" -" def x(self, value): self._x = value\n" +" def x(self, value):\n" +" self._x = value\n" " @x.deleter\n" -" def x(self): del self._x\n" +" def x(self):\n" +" del self._x\n" ); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sun Nov 24 23:54:16 2013 From: python-checkins at python.org (raymond.hettinger) Date: Sun, 24 Nov 2013 23:54:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge?= Message-ID: <3dSRWr6X7Rz7LkQ@mail.python.org> http://hg.python.org/cpython/rev/d4ad2b523770 changeset: 87528:d4ad2b523770 parent: 87525:53ba43ed7f27 parent: 87527:5ea2e6e3e3d3 user: Raymond Hettinger date: Sun Nov 24 14:53:54 2013 -0800 summary: merge files: Objects/descrobject.c | 14 +++++++++----- 1 files changed, 9 insertions(+), 5 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1547,21 +1547,25 @@ "\n" "fget is a function to be used for getting an attribute value, and likewise\n" "fset is a function for setting, and fdel a function for del'ing, an\n" -"attribute. Typical use is to define a managed attribute x:\n" +"attribute. Typical use is to define a managed attribute x:\n\n" "class C(object):\n" " def getx(self): return self._x\n" " def setx(self, value): self._x = value\n" " def delx(self): del self._x\n" " x = property(getx, setx, delx, \"I'm the 'x' property.\")\n" "\n" -"Decorators make defining new properties or modifying existing ones easy:\n" +"Decorators make defining new properties or modifying existing ones easy:\n\n" "class C(object):\n" " @property\n" -" def x(self): return self._x\n" +" def x(self):\n" +" \"\I am the 'x' property.\"\n" +" return self._x\n" " @x.setter\n" -" def x(self, value): self._x = value\n" +" def x(self, value):\n" +" self._x = value\n" " @x.deleter\n" -" def x(self): del self._x\n" +" def x(self):\n" +" del self._x\n" ); static int -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 04:16:28 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 04:16:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogIzE5NjIwOiBGaXgg?= =?utf-8?q?typo_in_docstring_=28noticed_by_Christopher_Welborn=29=2E?= Message-ID: <3dSYLN0Lvdz7Llm@mail.python.org> http://hg.python.org/cpython/rev/9b170d74a0a2 changeset: 87529:9b170d74a0a2 branch: 2.7 parent: 87526:0a148a029328 user: Ezio Melotti date: Mon Nov 25 05:14:51 2013 +0200 summary: #19620: Fix typo in docstring (noticed by Christopher Welborn). 
files: Lib/lib2to3/pgen2/tokenize.py | 4 ++-- Lib/tokenize.py | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/lib2to3/pgen2/tokenize.py b/Lib/lib2to3/pgen2/tokenize.py --- a/Lib/lib2to3/pgen2/tokenize.py +++ b/Lib/lib2to3/pgen2/tokenize.py @@ -252,7 +252,7 @@ def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should - be used to decode a Python source file. It requires one argment, readline, + be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used @@ -343,7 +343,7 @@ def generate_tokens(readline): """ - The generate_tokens() generator requires one argment, readline, which + The generate_tokens() generator requires one argument, readline, which must be a callable object which provides the same interface as the readline() method of built-in file objects. Each call to the function should return one line of input as a string. Alternately, readline diff --git a/Lib/tokenize.py b/Lib/tokenize.py --- a/Lib/tokenize.py +++ b/Lib/tokenize.py @@ -263,7 +263,7 @@ def generate_tokens(readline): """ - The generate_tokens() generator requires one argment, readline, which + The generate_tokens() generator requires one argument, readline, which must be a callable object which provides the same interface as the readline() method of built-in file objects. Each call to the function should return one line of input as a string. Alternately, readline -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 04:16:29 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 04:16:29 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogIzE5NjIwOiBGaXgg?= =?utf-8?q?typo_in_docstring_=28noticed_by_Christopher_Welborn=29=2E?= Message-ID: <3dSYLP282fz7LmY@mail.python.org> http://hg.python.org/cpython/rev/3f99564b712e changeset: 87530:3f99564b712e branch: 3.3 parent: 87527:5ea2e6e3e3d3 user: Ezio Melotti date: Mon Nov 25 05:14:51 2013 +0200 summary: #19620: Fix typo in docstring (noticed by Christopher Welborn). files: Lib/lib2to3/pgen2/tokenize.py | 4 ++-- Lib/tokenize.py | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/lib2to3/pgen2/tokenize.py b/Lib/lib2to3/pgen2/tokenize.py --- a/Lib/lib2to3/pgen2/tokenize.py +++ b/Lib/lib2to3/pgen2/tokenize.py @@ -252,7 +252,7 @@ def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should - be used to decode a Python source file. It requires one argment, readline, + be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used @@ -343,7 +343,7 @@ def generate_tokens(readline): """ - The generate_tokens() generator requires one argment, readline, which + The generate_tokens() generator requires one argument, readline, which must be a callable object which provides the same interface as the readline() method of built-in file objects. Each call to the function should return one line of input as a string. Alternately, readline diff --git a/Lib/tokenize.py b/Lib/tokenize.py --- a/Lib/tokenize.py +++ b/Lib/tokenize.py @@ -333,7 +333,7 @@ def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should - be used to decode a Python source file. 
It requires one argment, readline, + be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 04:16:30 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 04:16:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogIzE5NjIwOiBtZXJnZSB3aXRoIDMuMy4=?= Message-ID: <3dSYLQ3xPTz7LmM@mail.python.org> http://hg.python.org/cpython/rev/0d5da548b80a changeset: 87531:0d5da548b80a parent: 87528:d4ad2b523770 parent: 87530:3f99564b712e user: Ezio Melotti date: Mon Nov 25 05:16:09 2013 +0200 summary: #19620: merge with 3.3. files: Lib/lib2to3/pgen2/tokenize.py | 4 ++-- Lib/tokenize.py | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/lib2to3/pgen2/tokenize.py b/Lib/lib2to3/pgen2/tokenize.py --- a/Lib/lib2to3/pgen2/tokenize.py +++ b/Lib/lib2to3/pgen2/tokenize.py @@ -252,7 +252,7 @@ def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should - be used to decode a Python source file. It requires one argment, readline, + be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used @@ -343,7 +343,7 @@ def generate_tokens(readline): """ - The generate_tokens() generator requires one argment, readline, which + The generate_tokens() generator requires one argument, readline, which must be a callable object which provides the same interface as the readline() method of built-in file objects. Each call to the function should return one line of input as a string. Alternately, readline diff --git a/Lib/tokenize.py b/Lib/tokenize.py --- a/Lib/tokenize.py +++ b/Lib/tokenize.py @@ -333,7 +333,7 @@ def detect_encoding(readline): """ The detect_encoding() function is used to detect the encoding that should - be used to decode a Python source file. It requires one argment, readline, + be used to decode a Python source file. It requires one argument, readline, in the same way as the tokenize() generator. It will call readline a maximum of twice, and return the encoding used -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 04:42:36 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 04:42:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogRml4IHRlc3QudGVz?= =?utf-8?q?t=5Fsupport=2Ebind=5Fport=28=29_to_not_cause_an_error_when_Pyth?= =?utf-8?q?on_was?= Message-ID: <3dSYwX3N4Dz7LjM@mail.python.org> http://hg.python.org/cpython/rev/01788f8477a5 changeset: 87532:01788f8477a5 branch: 2.7 parent: 87529:9b170d74a0a2 user: Gregory P. Smith date: Sun Nov 24 19:42:15 2013 -0800 summary: Fix test.test_support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that new socket option. 
files: Lib/test/test_support.py | 12 +++++++++--- Misc/NEWS | 4 ++++ 2 files changed, 13 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -409,9 +409,15 @@ raise TestFailed("tests should never set the SO_REUSEADDR " \ "socket option on TCP/IP sockets!") if hasattr(socket, 'SO_REUSEPORT'): - if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: - raise TestFailed("tests should never set the SO_REUSEPORT " \ - "socket option on TCP/IP sockets!") + try: + if sock.getsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT) == 1: + raise TestFailed("tests should never set the SO_REUSEPORT " \ + "socket option on TCP/IP sockets!") + except EnvironmentError: + # Python's socket module was compiled using modern headers + # thus defining SO_REUSEPORT but this process is running + # under an older kernel that does not support SO_REUSEPORT. + pass if hasattr(socket, 'SO_EXCLUSIVEADDRUSE'): sock.setsockopt(socket.SOL_SOCKET, socket.SO_EXCLUSIVEADDRUSE, 1) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -12,6 +12,10 @@ Library ------- +- Fix test.test_support.bind_port() to not cause an error when Python was + compiled on a system with SO_REUSEPORT defined in the headers but run on + a system with an OS kernel that does not support that new socket option. + - Issue #19633: Fixed writing not compressed 16- and 32-bit wave files on big-endian platforms. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 05:19:02 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 05:19:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Add_whatsnew_section_about?= =?utf-8?q?_the_html_package=2E?= Message-ID: <3dSZkZ0LSKzSV8@mail.python.org> http://hg.python.org/cpython/rev/b2e917a6eb7b changeset: 87533:b2e917a6eb7b parent: 87531:0d5da548b80a user: Ezio Melotti date: Mon Nov 25 06:18:47 2013 +0200 summary: Add whatsnew section about the html package. files: Doc/whatsnew/3.4.rst | 21 +++++++++++++++++++++ 1 files changed, 21 insertions(+), 0 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -553,6 +553,27 @@ (Contributed by Christian Heimes in :issue:`18582`) +html +---- + +Added a new :func:`html.unescape` function that converts HTML5 character +references to the corresponding Unicode characters. + +(Contributed by Ezio Melotti in :issue:`2927`) + +Added a new *convert_charrefs* keyword argument to +:class:`~html.parser.HTMLParser` that, when ``True``, automatically converts +all character references. For backward-compatibility, its value defaults +to ``False``, but it will change to ``True`` in future versions, so you +are invited to set it explicitly and update your code to use this new feature. + +(Contributed by Ezio Melotti in :issue:`13633`) + +The *strict* argument of :class:`~html.parser.HTMLParser` is now deprecated. 
+ +(Contributed by Ezio Melotti in :issue:`15114`) + + inspect ------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 05:30:33 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 05:30:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_fix_docstring?= =?utf-8?q?=2E_extra_=5C=2E?= Message-ID: <3dSZzs726Cz7LkT@mail.python.org> http://hg.python.org/cpython/rev/3981e57a7bdc changeset: 87534:3981e57a7bdc branch: 2.7 parent: 87532:01788f8477a5 user: Gregory P. Smith date: Mon Nov 25 04:30:00 2013 +0000 summary: fix docstring. extra \. files: Objects/descrobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1374,7 +1374,7 @@ "class C(object):\n" " @property\n" " def x(self):\n" -" \"\I am the 'x' property.\"\n" +" \"I am the 'x' property.\"\n" " return self._x\n" " @x.setter\n" " def x(self, value):\n" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 05:41:18 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 05:41:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_broken_lin?= =?utf-8?q?k_in_html=2Eentities_docs=2E?= Message-ID: <3dSbDG0H5tz7LjM@mail.python.org> http://hg.python.org/cpython/rev/7787a651e8b6 changeset: 87535:7787a651e8b6 branch: 3.3 parent: 87530:3f99564b712e user: Ezio Melotti date: Mon Nov 25 06:40:12 2013 +0200 summary: Fix broken link in html.entities docs. files: Doc/library/html.entities.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/html.entities.rst b/Doc/library/html.entities.rst --- a/Doc/library/html.entities.rst +++ b/Doc/library/html.entities.rst @@ -42,4 +42,4 @@ .. rubric:: Footnotes -.. [#] See http://www.w3.org/TR/html5/named-character-references.html +.. [#] See http://www.w3.org/TR/html5/syntax.html#named-character-references -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 05:41:19 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 05:41:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_broken_link_fix_from_3=2E3=2E?= Message-ID: <3dSbDH2L5rz7LkV@mail.python.org> http://hg.python.org/cpython/rev/3f4c8e3e00ae changeset: 87536:3f4c8e3e00ae parent: 87533:b2e917a6eb7b parent: 87535:7787a651e8b6 user: Ezio Melotti date: Mon Nov 25 06:41:00 2013 +0200 summary: Merge broken link fix from 3.3. files: Doc/library/html.entities.rst | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Doc/library/html.entities.rst b/Doc/library/html.entities.rst --- a/Doc/library/html.entities.rst +++ b/Doc/library/html.entities.rst @@ -43,4 +43,4 @@ .. rubric:: Footnotes -.. [#] See http://www.w3.org/TR/html5/named-character-references.html +.. 
[#] See http://www.w3.org/TR/html5/syntax.html#named-character-references -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 05:45:50 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 05:45:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogRml4IHRlc3RfZmNu?= =?utf-8?q?tl_to_run_properly_on_systems_that_do_not_support_the_flags?= Message-ID: <3dSbKV48lqz7Ljx@mail.python.org> http://hg.python.org/cpython/rev/cac7319c5972 changeset: 87537:cac7319c5972 branch: 2.7 parent: 87534:3981e57a7bdc user: Gregory P. Smith date: Mon Nov 25 04:45:27 2013 +0000 summary: Fix test_fcntl to run properly on systems that do not support the flags used in the "does the value get passed in properly" test. files: Lib/test/test_fcntl.py | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_fcntl.py b/Lib/test/test_fcntl.py --- a/Lib/test/test_fcntl.py +++ b/Lib/test/test_fcntl.py @@ -113,7 +113,10 @@ self.skipTest("F_NOTIFY or DN_MULTISHOT unavailable") fd = os.open(os.path.dirname(os.path.abspath(TESTFN)), os.O_RDONLY) try: + # This will raise OverflowError if issue1309352 is present. fcntl.fcntl(fd, cmd, flags) + except IOError: + pass # Running on a system that doesn't support these flags. finally: os.close(fd) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 06:41:12 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Mon, 25 Nov 2013 06:41:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Simplify_save=5Fbool_in_cp?= =?utf-8?q?ickle=2E?= Message-ID: <3dScYN06F2z7Ljx@mail.python.org> http://hg.python.org/cpython/rev/f27d6721f446 changeset: 87538:f27d6721f446 parent: 87536:3f4c8e3e00ae user: Alexandre Vassalotti date: Sun Nov 24 21:40:18 2013 -0800 summary: Simplify save_bool in cpickle. files: Modules/_pickle.c | 29 +++++++++++------------------ 1 files changed, 11 insertions(+), 18 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -110,16 +110,6 @@ FRAME = '\x95' }; -/* These aren't opcodes -- they're ways to pickle bools before protocol 2 - * so that unpicklers written before bools were introduced unpickle them - * as ints, but unpicklers after can recognize that bools were intended. - * Note that protocol 2 added direct ways to pickle bools. - */ -#undef TRUE -#define TRUE "I01\n" -#undef FALSE -#define FALSE "I00\n" - enum { /* Keep in synch with pickle.Pickler._BATCHSIZE. This is how many elements batch_list/dict() pumps out before doing APPENDS/SETITEMS. Nothing will @@ -1619,18 +1609,21 @@ static int save_bool(PicklerObject *self, PyObject *obj) { - static const char *buf[2] = { FALSE, TRUE }; - const char len[2] = {sizeof(FALSE) - 1, sizeof(TRUE) - 1}; - int p = (obj == Py_True); - if (self->proto >= 2) { - const char bool_op = p ? NEWTRUE : NEWFALSE; + const char bool_op = (obj == Py_True) ? NEWTRUE : NEWFALSE; if (_Pickler_Write(self, &bool_op, 1) < 0) return -1; } - else if (_Pickler_Write(self, buf[p], len[p]) < 0) - return -1; - + else { + /* These aren't opcodes -- they're ways to pickle bools before protocol 2 + * so that unpicklers written before bools were introduced unpickle them + * as ints, but unpicklers after can recognize that bools were intended. + * Note that protocol 2 added direct ways to pickle bools. + */ + const char *bool_str = (obj == Py_True) ? 
"I01\n" : "I00\n"; + if (_Pickler_Write(self, bool_str, strlen(bool_str)) < 0) + return -1; + } return 0; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 07:32:18 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 07:32:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_BoundedSema?= =?utf-8?q?phore_to_export_list_in_locks=2E=5F=5Fall=5F=5F=2E?= Message-ID: <3dSdhL2h69z7LjV@mail.python.org> http://hg.python.org/cpython/rev/a126a7ec2298 changeset: 87539:a126a7ec2298 user: Guido van Rossum date: Sun Nov 24 22:32:09 2013 -0800 summary: asyncio: Add BoundedSemaphore to export list in locks.__all__. files: Lib/asyncio/locks.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py --- a/Lib/asyncio/locks.py +++ b/Lib/asyncio/locks.py @@ -1,6 +1,6 @@ """Synchronization primitives.""" -__all__ = ['Lock', 'Event', 'Condition', 'Semaphore'] +__all__ = ['Lock', 'Event', 'Condition', 'Semaphore', 'BoundedSemaphore'] import collections -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 07:41:22 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Mon, 25 Nov 2013 07:41:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Merge_save=5Fint_into_save?= =?utf-8?q?=5Flong_in_cpickle_to_remove_redundant_code=2E?= Message-ID: <3dSdtp3xR5z7LjV@mail.python.org> http://hg.python.org/cpython/rev/14f2776686b3 changeset: 87540:14f2776686b3 user: Alexandre Vassalotti date: Sun Nov 24 22:41:13 2013 -0800 summary: Merge save_int into save_long in cpickle to remove redundant code. Also, replace unnessary uses of the #if preprocessor directive. files: Modules/_pickle.c | 92 +++++++++++++--------------------- 1 files changed, 36 insertions(+), 56 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -714,14 +714,14 @@ out[1] = (unsigned char)((value >> 8) & 0xff); out[2] = (unsigned char)((value >> 16) & 0xff); out[3] = (unsigned char)((value >> 24) & 0xff); -#if SIZEOF_SIZE_T >= 8 - out[4] = (unsigned char)((value >> 32) & 0xff); - out[5] = (unsigned char)((value >> 40) & 0xff); - out[6] = (unsigned char)((value >> 48) & 0xff); - out[7] = (unsigned char)((value >> 56) & 0xff); -#else - out[4] = out[5] = out[6] = out[7] = 0; -#endif + if (sizeof(size_t) >= 8) { + out[4] = (unsigned char)((value >> 32) & 0xff); + out[5] = (unsigned char)((value >> 40) & 0xff); + out[6] = (unsigned char)((value >> 48) & 0xff); + out[7] = (unsigned char)((value >> 56) & 0xff); + } else { + out[4] = out[5] = out[6] = out[7] = 0; + } } static void @@ -1628,30 +1628,31 @@ } static int -save_int(PicklerObject *self, long x) -{ - char pdata[32]; - Py_ssize_t len = 0; - - if (!self->bin -#if SIZEOF_LONG > 4 - || x > 0x7fffffffL || x < -0x80000000L -#endif - ) { - /* Text-mode pickle, or long too big to fit in the 4-byte - * signed BININT format: store as a string. - */ - pdata[0] = LONG; /* use LONG for consistency with pickle.py */ - PyOS_snprintf(pdata + 1, sizeof(pdata) - 1, "%ldL\n", x); - if (_Pickler_Write(self, pdata, strlen(pdata)) < 0) - return -1; - } - else { - /* Binary pickle and x fits in a signed 4-byte int. 
*/ - pdata[1] = (unsigned char)(x & 0xff); - pdata[2] = (unsigned char)((x >> 8) & 0xff); - pdata[3] = (unsigned char)((x >> 16) & 0xff); - pdata[4] = (unsigned char)((x >> 24) & 0xff); +save_long(PicklerObject *self, PyObject *obj) +{ + PyObject *repr = NULL; + Py_ssize_t size; + long val; + int status = 0; + + const char long_op = LONG; + + val= PyLong_AsLong(obj); + if (val == -1 && PyErr_Occurred()) { + /* out of range for int pickling */ + PyErr_Clear(); + } + else if (self->bin && + (sizeof(long) <= 4 || + (val <= 0x7fffffffL && val >= -0x80000000L))) { + /* result fits in a signed 4-byte integer */ + char pdata[32]; + Py_ssize_t len = 0; + + pdata[1] = (unsigned char)(val & 0xff); + pdata[2] = (unsigned char)((val >> 8) & 0xff); + pdata[3] = (unsigned char)((val >> 16) & 0xff); + pdata[4] = (unsigned char)((val >> 24) & 0xff); if ((pdata[4] == 0) && (pdata[3] == 0)) { if (pdata[2] == 0) { @@ -1670,30 +1671,9 @@ if (_Pickler_Write(self, pdata, len) < 0) return -1; - } - - return 0; -} - -static int -save_long(PicklerObject *self, PyObject *obj) -{ - PyObject *repr = NULL; - Py_ssize_t size; - long val = PyLong_AsLong(obj); - int status = 0; - - const char long_op = LONG; - - if (val == -1 && PyErr_Occurred()) { - /* out of range for int pickling */ - PyErr_Clear(); - } - else -#if SIZEOF_LONG > 4 - if (val <= 0x7fffffffL && val >= -0x80000000L) -#endif - return save_int(self, val); + + return 0; + } if (self->proto >= 2) { /* Linear-time pickling. */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 07:42:13 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 07:42:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Fix_docstring_o?= =?utf-8?b?ZiBnZXRfbm93YWl0KCku?= Message-ID: <3dSdvn3wqvz7LjV@mail.python.org> http://hg.python.org/cpython/rev/8d0206f97439 changeset: 87541:8d0206f97439 user: Guido van Rossum date: Sun Nov 24 22:41:35 2013 -0800 summary: asyncio: Fix docstring of get_nowait(). files: Lib/asyncio/queues.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/queues.py b/Lib/asyncio/queues.py --- a/Lib/asyncio/queues.py +++ b/Lib/asyncio/queues.py @@ -184,7 +184,7 @@ def get_nowait(self): """Remove and return an item from the queue. - Return an item if one is immediately available, else raise Full. + Return an item if one is immediately available, else raise Empty. """ self._consume_done_putters() if self._putters: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 07:43:42 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 07:43:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Took_care_of_a_few_TODOs_in_T?= =?utf-8?q?ulip_PEP=2E?= Message-ID: <3dSdxV26s6z7Ljx@mail.python.org> http://hg.python.org/peps/rev/b153272def9b changeset: 5320:b153272def9b user: Guido van Rossum date: Sun Nov 24 22:43:36 2013 -0800 summary: Took care of a few TODOs in Tulip PEP. files: pep-3156.txt | 139 ++++++++++++++++++++++++++++++++++---- 1 files changed, 123 insertions(+), 16 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -286,6 +286,25 @@ ``DefaultEventLoopPolicy``. The current event loop policy object can be retrieved by calling ``get_event_loop_policy()``. 
+Passing an Event Loop Around Explicitly +''''''''''''''''''''''''''''''''''''''' + +It is possible to write code that uses an event loop without relying +on a global or per-thread default event loop. For this purpose, all +APIs that need access to the current event loop (and aren't methods on +an event class) take an optional keyword argument named ``loop``. If +this argument is ``None`` or unspecified, such APIs will call +``get_event_loop()`` to get the default event loop, but if the +``loop`` keyword argument is set to an event loop object, they will +use that event loop, and pass it along to any other such APIs they +call. For example, ``Future(loop=my_loop)`` will create a Future tied +to the event loop ``my_loop``. When the default current event is +``None``, the ``loop`` keyword argument is effectively mandatory. + +Note that an explicitly passed event loop must still belong to the +current thread; the ``loop`` keyword argument does not magically +change the constraints on how an event loop can be used. + Specifying Times ---------------- @@ -967,6 +986,7 @@ nothing and return ``False``. Otherwise, this attempts to cancel the Future and returns ``True``. If the the cancellation attempt is successful, eventually the Future's state will change to cancelled + (so that ``cancelled()`` will return ``True``) and the callbacks will be scheduled. For regular Futures, cancellation will always succeed immediately; but for Tasks (see below) the task may ignore or delay the cancellation attempt. @@ -1583,6 +1603,21 @@ is exactly the same as the argument; however, if the returned Future is cancelled, the argument Future is unaffected. + A use case for this function would be a coroutine that caches a + query result for a coroutine that handles a request in an HTTP + server. When the request is cancelled by the client, we could + (arguably) want the query-caching coroutine to continue to run, so + that when the client reconnects, the query result is (hopefully) + cached. This could be written e.g. as follows:: + + @asyncio.coroutine + def handle_request(self, request): + ... + cached_query = self.get_cache(...) + if cached_query is None: + cached_query = yield from asyncio.shield(self.fill_cache(...)) + ... + Sleeping -------- @@ -1601,8 +1636,9 @@ Cancelling a task that's not done yet throws an ``asyncio.CancelledError`` exception into the coroutine. If the coroutine doesn't catch this (or if it re-raises it) the task will be -marked as cancelled; but if the coroutine somehow catches and ignores -the exception it may continue to execute. +marked as cancelled (i.e., ``cancelled()`` will return ``True``; but +if the coroutine somehow catches and ignores the exception it may +continue to execute (and ``cancelled()`` will return ``False``. Tasks are also useful for interoperating between coroutines and callback-based frameworks like Twisted. After converting a coroutine @@ -1645,31 +1681,98 @@ off the coroutine when ``connection_made()`` is called. +Synchronization +=============== + +Locks, events, conditions and semaphores modeled after those in the +``threading`` module are implemented and can be accessed by importing +the ``asyncio.locks`` submodule. Queus modeled after those in the +``queue`` module are implemented and can be accessed by importing the +``asyncio.queues`` submodule. + +In general these have a close correspondence to their threaded +counterparts, however, blocking methods (e.g. 
``aqcuire()`` on locks, +``put()`` and ``get()`` on queues) are coroutines, and timeout +parameters are not provided (you can use ``asyncio.wait_for()`` to add +a timeout to a blocking call, however). + +The docstrings in the modules provide more complete documentation. + +Locks +----- + +The following classes are provided by ``asyncio.locks``. For all +these except ``Event``, the ``with`` statement may be used in +combination with ``yield from`` to acquire the lock and ensure that +the lock is released regardless of how the ``with`` block is left, as +follows:: + + with (yield from my_lock): + ... + + +- ``Lock``: a basic mutex, with methods ``acquire()`` (a coroutine), + ``locked()``, and ``release()``. + +- ``Event``: an event variable, with methods ``wait()`` (a coroutine), + ``set()``, ``clear()``, and ``is_set()``. + +- ``Condition``: a condition variable, with methods ``acquire()``, + ``wait()``, ``wait_for(predicate)`` (all three coroutines), + ``locked()``, ``release()``, ``notify()``, and ``notify_all()``. + +- ``Semaphore``: a semaphore, with methods ``acquire()`` (a + coroutine), ``locked()``, and ``release()``. The constructor + argument is the initial value (default ``1``). + +- ``BoundedSemaphore``: a bounded semaphore; this is similar to + ``Semaphore`` but the initial value is also the maximum value. + +Queues +------ + +The following classes and exceptions are provided by ``asyncio.queues``. + +- ``Queue``: a standard queue, with methods ``get()``, ``put()`` (both + coroutines), ``get_nowait()``, ''put_nowait()'', ``empty()``, + ``full()``, ``qsize()``, and ``maxsize()``. + +- ``PriorityQueue``: a subclass of ``Queue`` that retrieves entries + in priority order (lowest first). + +- ``LifoQueue``: a aubclass of ``Queue`` that retrieves the most + recently added entries first. + +- ``JoinableQueue``: a subclass of ``Queue`` with ``task_done()`` and + ``join()`` methods (the latter a coroutine). + +- ``Empty``, ``Full``: exceptions raised when ``get_nowait()`` or + ``put_nowait()`` is called on a queue that is empty or full, + respectively. + + TO DO ===== -- Document cancellation in more detail. +- Document ``StreamReader``, ``StreamWriter``, ``open_connection()``, + and ``start_server()``. -- Document locks and queues. +- Document all places that take a ``loop`` keyword argument. -- Document StreamReader, StreamWriter and open_connection(). +- Document ``logger`` object. -- Document passing 'loop=...' everywhere. - -- Document logger object. - -- Document SIGCHILD handling API. - -- Compare all APIs with the source code to be sure there aren't any - undocumented or unimplemented features. +- Document ``SIGCHILD`` handling API. Wish List ========= -- An open_server() helper. It should take a callback which is called - for each accepted connection with a reader and writer; the callback - may be a coroutine (then it is wrapped in a task). +(There is agreement that these features are desirable, but no +implementation was available when Python 3.4 beta 1 was released, and +the feature freeze for the rest of the Python 3.4 release cycle +prohibits adding them in this late stage. However, they will +hopefully be added in Python 3.5, and perhaps earlier in the PyPI +distribution.) - Support a "start TLS" operation to upgrade a TCP socket to SSL/TLS. @@ -1681,6 +1784,10 @@ Open Issues =========== +(Note that these have been resolved de facto in favor of the status +quo by the acceptance of the PEP. 
However, the PEP's provisional +status allows revising these decisions for Python 3.5.) + - Why do ``create_connection()`` and ``create_datagram_endpoint()`` have a ``proto`` argument but not ``create_server()``? And why are the family, flag, proto arguments for ``getaddrinfo()`` sometimes -- Repository URL: http://hg.python.org/peps From solipsis at pitrou.net Mon Nov 25 07:47:43 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Mon, 25 Nov 2013 07:47:43 +0100 Subject: [Python-checkins] Daily reference leaks (d4ad2b523770): sum=0 Message-ID: results for d4ad2b523770 on branch "default" -------------------------------------------- test_site leaked [2, 0, -2] references, sum=0 test_site leaked [2, 0, -2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogCSttA0', '-x'] From python-checkins at python.org Mon Nov 25 08:29:59 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 25 Nov 2013 08:29:59 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_doc_markup?= =?utf-8?q?_error=2E?= Message-ID: <3dSfyv54Vvz7LjM@mail.python.org> http://hg.python.org/cpython/rev/90cfdff0f8c9 changeset: 87542:90cfdff0f8c9 branch: 3.3 parent: 87535:7787a651e8b6 user: Georg Brandl date: Mon Nov 25 08:29:44 2013 +0100 summary: Fix doc markup error. files: Doc/library/gettext.rst | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -476,9 +476,9 @@ :program:`xgettext`, :program:`pygettext`, and similar tools generate :file:`.po` files that are message catalogs. They are structured -:human-readable files that contain every marked string in the source -:code, along with a placeholder for the translated versions of these -:strings. +human-readable files that contain every marked string in the source +code, along with a placeholder for the translated versions of these +strings. Copies of these :file:`.po` files are then handed over to the individual human translators who write translations for every -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 08:30:00 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 25 Nov 2013 08:30:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dSfyw6qNLz7LlT@mail.python.org> http://hg.python.org/cpython/rev/a6ea3714774f changeset: 87543:a6ea3714774f parent: 87541:8d0206f97439 parent: 87542:90cfdff0f8c9 user: Georg Brandl date: Mon Nov 25 08:29:54 2013 +0100 summary: merge with 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 08:34:30 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 25 Nov 2013 08:34:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_suspicious?= =?utf-8?q?_markup_and_sort_ignorelist=2E?= Message-ID: <3dSg465T9lz7LjM@mail.python.org> http://hg.python.org/cpython/rev/74b1b3ebf454 changeset: 87544:74b1b3ebf454 branch: 2.7 parent: 87537:cac7319c5972 user: Georg Brandl date: Mon Nov 25 08:34:24 2013 +0100 summary: Fix suspicious markup and sort ignorelist. 
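The hunk below edits Doc/tools/sphinxext/susp-ignored.csv, the ignore list for the Sphinx "suspicious markup" checker. Judging from the entries themselves (the column meanings are inferred here, not stated in the commit), each row is plain CSV with four fields: document name, optional line number, the flagged construct, and the source line it occurs on. A minimal sketch of reading such rows with the standard csv module:

    import csv

    # Two rows in the susp-ignored.csv format, copied from the hunk below.
    sample = [
        "library/bisect,,:hi,all(val >= x for val in a[i:hi])",
        "library/urllib2,,:close,Connection:close",
    ]

    for doc, lineno, construct, context in csv.reader(sample):
        print("%s (line %s): %s flagged in %r" % (doc, lineno or "?", construct, context))
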
files: Doc/license.rst | 4 +- Doc/tools/sphinxext/susp-ignored.csv | 35 ++++++++------- 2 files changed, 20 insertions(+), 19 deletions(-) diff --git a/Doc/license.rst b/Doc/license.rst --- a/Doc/license.rst +++ b/Doc/license.rst @@ -746,8 +746,8 @@ * */ - Original SSLeay License - ----------------------- + Original SSLeay License + ----------------------- /* Copyright (C) 1995-1998 Eric Young (eay at cryptsoft.com) * All rights reserved. diff --git a/Doc/tools/sphinxext/susp-ignored.csv b/Doc/tools/sphinxext/susp-ignored.csv --- a/Doc/tools/sphinxext/susp-ignored.csv +++ b/Doc/tools/sphinxext/susp-ignored.csv @@ -8,6 +8,10 @@ extending/extending,,:myfunction,"PyArg_ParseTuple(args, ""D:myfunction"", &c);" extending/newtypes,,:call,"if (!PyArg_ParseTuple(args, ""sss:call"", &arg1, &arg2, &arg3)) {" extending/windows,,:initspam,/export:initspam +faq/programming,,:reduce,"print (lambda Ru,Ro,Iu,Io,IM,Sx,Sy:reduce(lambda x,y:x+y,map(lambda y," +faq/programming,,:reduce,"Sx=Sx,Sy=Sy:reduce(lambda x,y:x+y,map(lambda x,xc=Ru,yc=yc,Ru=Ru,Ro=Ro," +faq/programming,,:chr,">=4.0) or 1+f(xc,yc,x*x-y*y+xc,2.0*x*y+yc,k-1,f):f(xc,yc,x,y,k,f):chr(" +faq/programming,,::,for x in sequence[::-1]: howto/cporting,,:encode,"if (!PyArg_ParseTuple(args, ""O:encode_object"", &myobj))" howto/cporting,,:say,"if (!PyArg_ParseTuple(args, ""U:say_hello"", &name))" howto/curses,,:black,"They are: 0:black, 1:red, 2:green, 3:yellow, 4:blue, 5:magenta, 6:cyan, and" @@ -39,10 +43,15 @@ howto/logging,,:logger,severity:logger name:message howto/logging,,:message,severity:logger name:message howto/logging,,:This,DEBUG:root:This message should go to the log file +howto/pyporting,75,::,# make sure to use :: Python *and* :: Python :: 3 so +howto/pyporting,75,::,"'Programming Language :: Python'," +howto/pyporting,75,::,'Programming Language :: Python :: 3' howto/regex,,::, howto/regex,,:foo,(?:foo) howto/urllib2,,:example,"for example ""joe at password:example.com""" library/audioop,,:ipos,"# factor = audioop.findfactor(in_test[ipos*2:ipos*2+len(out_test)]," +library/bisect,,:hi,all(val >= x for val in a[i:hi]) +library/bisect,,:hi,all(val > x for val in a[i:hi]) library/cookie,,`,!#$%&'*+-.^_`|~ library/datetime,,:MM, library/datetime,,:SS, @@ -92,8 +101,9 @@ library/profile,,:lineno,filename:lineno(function) library/pyexpat,,:elem1, library/pyexpat,,:py,"xmlns:py = ""http://www.python.org/ns/"">" -library/smtplib,,:port,"as well as a regular host:port server." 
+library/smtplib,,:port,method must support that as well as a regular host:port library/socket,,::,'5aef:2b::8' +library/socket,,::,"(10, 1, 6, '', ('2001:888:2000:d::a2', 80, 0, 0))]" library/sqlite3,,:memory, library/sqlite3,,:who,"cur.execute(""select * from people where name_last=:who and age=:age"", {""who"": who, ""age"": age})" library/sqlite3,,:age,"cur.execute(""select * from people where name_last=:who and age=:age"", {""who"": who, ""age"": age})" @@ -105,6 +115,7 @@ library/ssl,,:Some,"Locality Name (eg, city) []:Some City" library/ssl,,:US,Country Name (2 letter code) [AU]:US library/stdtypes,,:len,s[len(s):len(s)] +library/stdtypes,,:end,s[start:end] library/string,,:end,s[start:end] library/subprocess,,`,"output=`mycmd myarg`" library/subprocess,,`,"output=`dmesg | grep hda`" @@ -116,6 +127,7 @@ library/turtle,,::,Example:: library/urllib,,:port,:port library/urllib2,,:password,"""joe:password at python.org""" +library/urllib2,,:close,Connection:close library/uuid,,:uuid,urn:uuid:12345678-1234-5678-1234-567812345678 library/xmlrpclib,,:pass,http://user:pass at host:port/path library/xmlrpclib,,:pass,user:pass @@ -123,6 +135,10 @@ license,,`,THIS SOFTWARE IS PROVIDED BY THE PROJECT AND CONTRIBUTORS ``AS IS'' AND license,,:zooko,mailto:zooko at zooko.com license,,`,THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND +license,,`,* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY +license,,`,* THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND +license,,`,"``Software''), to deal in the Software without restriction, including" +license,,`,"THE SOFTWARE IS PROVIDED ``AS IS'', WITHOUT WARRANTY OF ANY KIND," reference/datamodel,,:step,a[i:j:step] reference/datamodel,,:max, reference/expressions,,:index,x[index:index] @@ -150,6 +166,7 @@ using/cmdline,,:message,action:message:category:module:line using/cmdline,,:module,action:message:category:module:line using/cmdline,,:errorhandler,:errorhandler +using/unix,,:Packaging,http://en.opensuse.org/Portal:Packaging whatsnew/2.0,418,:len, whatsnew/2.3,,::, whatsnew/2.3,,:config, @@ -163,24 +180,8 @@ whatsnew/2.5,,:memory,:memory: whatsnew/2.5,,:step,[start:stop:step] whatsnew/2.5,,:stop,[start:stop:step] -faq/programming,,:reduce,"print (lambda Ru,Ro,Iu,Io,IM,Sx,Sy:reduce(lambda x,y:x+y,map(lambda y," -faq/programming,,:reduce,"Sx=Sx,Sy=Sy:reduce(lambda x,y:x+y,map(lambda x,xc=Ru,yc=yc,Ru=Ru,Ro=Ro," -faq/programming,,:chr,">=4.0) or 1+f(xc,yc,x*x-y*y+xc,2.0*x*y+yc,k-1,f):f(xc,yc,x,y,k,f):chr(" -faq/programming,,::,for x in sequence[::-1]: -library/bisect,,:hi,all(val >= x for val in a[i:hi]) -library/bisect,,:hi,all(val > x for val in a[i:hi]) -library/socket,,::,"(10, 1, 6, '', ('2001:888:2000:d::a2', 80, 0, 0))]" -library/stdtypes,,:end,s[start:end] -license,,`,* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY -license,,`,* THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND -license,,`,"``Software''), to deal in the Software without restriction, including" -license,,`,"THE SOFTWARE IS PROVIDED ``AS IS'', WITHOUT WARRANTY OF ANY KIND," whatsnew/2.7,735,:Sunday,'2009:4:Sunday' whatsnew/2.7,862,::,"export PYTHONWARNINGS=all,error:::Cookie:0" whatsnew/2.7,862,:Cookie,"export PYTHONWARNINGS=all,error:::Cookie:0" whatsnew/2.7,,::,>>> urlparse.urlparse('http://[1080::8:800:200C:417A]/foo') whatsnew/2.7,,::,"ParseResult(scheme='http', netloc='[1080::8:800:200C:417A]'," -howto/pyporting,75,::,# make sure to use :: Python *and* :: Python :: 3 so -howto/pyporting,75,::,"'Programming 
Language :: Python'," -howto/pyporting,75,::,'Programming Language :: Python :: 3' -library/urllib2,67,:close,Connection:close -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 08:52:31 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 25 Nov 2013 08:52:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogQ2xvc2VzICMxOTYy?= =?utf-8?q?2=3A_clarify_message_about_bufsize_changes_in_3=2E2=2E4_and_3?= =?utf-8?b?LjMuMS4=?= Message-ID: <3dSgSv6CKfz7Lmb@mail.python.org> http://hg.python.org/cpython/rev/0f0dc0276a7c changeset: 87545:0f0dc0276a7c branch: 3.3 parent: 87542:90cfdff0f8c9 user: Georg Brandl date: Mon Nov 25 08:48:37 2013 +0100 summary: Closes #19622: clarify message about bufsize changes in 3.2.4 and 3.3.1. files: Doc/library/subprocess.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -441,13 +441,13 @@ approximately that size. A negative bufsize (the default) means the system default of io.DEFAULT_BUFFER_SIZE will be used. - .. versionchanged:: 3.2.4, 3.3.1 + .. versionchanged:: 3.3.1 *bufsize* now defaults to -1 to enable buffering by default to match the - behavior that most code expects. In 3.2.0 through 3.2.3 and 3.3.0 it - incorrectly defaulted to :const:`0` which was unbuffered and allowed - short reads. This was unintentional and did not match the behavior of - Python 2 as most code expected. + behavior that most code expects. In versions prior to Python 3.2.4 and + 3.3.1 it incorrectly defaulted to :const:`0` which was unbuffered + and allowed short reads. This was unintentional and did not match the + behavior of Python 2 as most code expected. The *executable* argument specifies a replacement program to execute. It is very seldom needed. When ``shell=False``, *executable* replaces the -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 08:52:33 2013 From: python-checkins at python.org (georg.brandl) Date: Mon, 25 Nov 2013 08:52:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dSgSx0tq7z7Lnc@mail.python.org> http://hg.python.org/cpython/rev/4473a7067fd8 changeset: 87546:4473a7067fd8 parent: 87543:a6ea3714774f parent: 87545:0f0dc0276a7c user: Georg Brandl date: Mon Nov 25 08:52:24 2013 +0100 summary: merge with 3.3 files: Doc/library/subprocess.rst | 10 +++++----- 1 files changed, 5 insertions(+), 5 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -451,13 +451,13 @@ approximately that size. A negative bufsize (the default) means the system default of io.DEFAULT_BUFFER_SIZE will be used. - .. versionchanged:: 3.2.4, 3.3.1 + .. versionchanged:: 3.3.1 *bufsize* now defaults to -1 to enable buffering by default to match the - behavior that most code expects. In 3.2.0 through 3.2.3 and 3.3.0 it - incorrectly defaulted to :const:`0` which was unbuffered and allowed - short reads. This was unintentional and did not match the behavior of - Python 2 as most code expected. + behavior that most code expects. In versions prior to Python 3.2.4 and + 3.3.1 it incorrectly defaulted to :const:`0` which was unbuffered + and allowed short reads. 
This was unintentional and did not match the + behavior of Python 2 as most code expected. The *executable* argument specifies a replacement program to execute. It is very seldom needed. When ``shell=False``, *executable* replaces the -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:30:30 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 09:30:30 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Cleanup_test=5Ftracemalloc?= =?utf-8?q?=2Epy=2E_Patch_written_by_Vajrasky_Kok=2E?= Message-ID: <3dShJk2S2HzQLH@mail.python.org> http://hg.python.org/cpython/rev/841dec769a04 changeset: 87547:841dec769a04 user: Victor Stinner date: Mon Nov 25 09:29:45 2013 +0100 summary: Cleanup test_tracemalloc.py. Patch written by Vajrasky Kok. files: Lib/test/test_tracemalloc.py | 6 ++---- 1 files changed, 2 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -1,6 +1,4 @@ -import _tracemalloc import contextlib -import datetime import os import sys import tracemalloc @@ -207,7 +205,7 @@ size, max_size = tracemalloc.get_traced_memory() self.assertGreater(size, 0) - # stop() rests also traced memory counters + # stop() also resets traced memory counters tracemalloc.stop() self.assertEqual(tracemalloc.get_traced_memory(), (0, 0)) @@ -729,7 +727,7 @@ stdout = stdout.rstrip() self.assertEqual(stdout, b'False') - # PYTHON* environment varibles must be ignored when -E option is + # PYTHON* environment variables must be ignored when -E option is # present code = 'import tracemalloc; print(tracemalloc.is_tracing())' ok, stdout, stderr = assert_python_ok('-E', '-c', code, PYTHONTRACEMALLOC='1') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:32:36 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 09:32:36 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Remove_an_erra?= =?utf-8?q?nt_extra_=5C_within_a_docstring=2E?= Message-ID: <3dShM862stz7Lkv@mail.python.org> http://hg.python.org/cpython/rev/6cc605e8c270 changeset: 87548:6cc605e8c270 branch: 3.3 parent: 87545:0f0dc0276a7c user: Gregory P. Smith date: Mon Nov 25 00:30:56 2013 -0800 summary: Remove an errant extra \ within a docstring. files: Objects/descrobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1523,7 +1523,7 @@ "class C(object):\n" " @property\n" " def x(self):\n" -" \"\I am the 'x' property.\"\n" +" \"I am the 'x' property.\"\n" " return self._x\n" " @x.setter\n" " def x(self, value):\n" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:32:38 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 09:32:38 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Remove_an_errant_extra_=5C_within_a_docstring=2E?= Message-ID: <3dShMB0w2Dz7LkT@mail.python.org> http://hg.python.org/cpython/rev/89078a969796 changeset: 87549:89078a969796 parent: 87546:4473a7067fd8 parent: 87548:6cc605e8c270 user: Gregory P. Smith date: Mon Nov 25 00:31:31 2013 -0800 summary: Remove an errant extra \ within a docstring. 
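The one-character fix below (and its 2.7/3.3 counterparts earlier in this batch) removes a stray backslash from the C docstring of property() in Objects/descrobject.c. For reference, the Python-level example that docstring spells out reads roughly as follows; the setter body and the deleter are filled in from the usual property example and are not part of the hunk:

    class C(object):
        @property
        def x(self):
            "I am the 'x' property."
            return self._x

        @x.setter
        def x(self, value):
            self._x = value

        @x.deleter
        def x(self):
            del self._x

    c = C()
    c.x = 10        # runs the setter
    print(c.x)      # runs the getter, prints 10
    del c.x         # runs the deleter
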
files: Objects/descrobject.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Objects/descrobject.c b/Objects/descrobject.c --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -1558,7 +1558,7 @@ "class C(object):\n" " @property\n" " def x(self):\n" -" \"\I am the 'x' property.\"\n" +" \"I am the 'x' property.\"\n" " return self._x\n" " @x.setter\n" " def x(self, value):\n" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:32:39 2013 From: python-checkins at python.org (gregory.p.smith) Date: Mon, 25 Nov 2013 09:32:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_merge_heads?= Message-ID: <3dShMC2YrXz7Lkx@mail.python.org> http://hg.python.org/cpython/rev/f1a0c7eb491d changeset: 87550:f1a0c7eb491d parent: 87549:89078a969796 parent: 87547:841dec769a04 user: Gregory P. Smith date: Mon Nov 25 00:32:20 2013 -0800 summary: merge heads files: Lib/test/test_tracemalloc.py | 6 ++---- 1 files changed, 2 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -1,6 +1,4 @@ -import _tracemalloc import contextlib -import datetime import os import sys import tracemalloc @@ -207,7 +205,7 @@ size, max_size = tracemalloc.get_traced_memory() self.assertGreater(size, 0) - # stop() rests also traced memory counters + # stop() also resets traced memory counters tracemalloc.stop() self.assertEqual(tracemalloc.get_traced_memory(), (0, 0)) @@ -729,7 +727,7 @@ stdout = stdout.rstrip() self.assertEqual(stdout, b'False') - # PYTHON* environment varibles must be ignored when -E option is + # PYTHON* environment variables must be ignored when -E option is # present code = 'import tracemalloc; print(tracemalloc.is_tracing())' ok, stdout, stderr = assert_python_ok('-E', '-c', code, PYTHONTRACEMALLOC='1') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:34:47 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 09:34:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319762=3A_Fix_name?= =?utf-8?b?IG9mIF9nZXRfdHJhY2VzKCkgYW5kIF9nZXRfb2JqZWN0X3RyYWNlYmFjaygp?= =?utf-8?q?_function?= Message-ID: <3dShPg2yw6z7LmL@mail.python.org> http://hg.python.org/cpython/rev/2e2ec595dc58 changeset: 87551:2e2ec595dc58 user: Victor Stinner date: Mon Nov 25 09:33:18 2013 +0100 summary: Close #19762: Fix name of _get_traces() and _get_object_traceback() function name in their docstring. Patch written by Vajrasky Kok. 
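The docstring fix below concerns the private C helpers behind the public tracemalloc API: get_object_traceback() wraps _get_object_traceback(), and take_snapshot() is built on _get_traces(). A minimal usage sketch, assuming the Python 3.4 tracemalloc module:

    import tracemalloc

    tracemalloc.start()                      # hook the Python memory allocators
    data = [bytes(1000) for _ in range(100)]

    # Public wrapper around the C-level _get_object_traceback()
    tb = tracemalloc.get_object_traceback(data[0])
    if tb is not None:
        print(tb)                            # where data[0] was allocated

    # take_snapshot() collects the traces gathered by _get_traces()
    snapshot = tracemalloc.take_snapshot()
    for stat in snapshot.statistics('lineno')[:3]:
        print(stat)

    tracemalloc.stop()
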
files: Modules/_tracemalloc.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1018,7 +1018,7 @@ } PyDoc_STRVAR(tracemalloc_get_traces_doc, - "get_traces() -> list\n" + "_get_traces() -> list\n" "\n" "Get traces of all memory blocks allocated by Python.\n" "Return a list of (size: int, traceback: tuple) tuples.\n" @@ -1083,7 +1083,7 @@ } PyDoc_STRVAR(tracemalloc_get_object_traceback_doc, - "get_object_traceback(obj)\n" + "_get_object_traceback(obj)\n" "\n" "Get the traceback where the Python object obj was allocated.\n" "Return a tuple of (filename: str, lineno: int) tuples.\n" -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 09:40:37 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 09:40:37 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Mention_the_new_tracemallo?= =?utf-8?q?c_module_in_the_What=27s_New_in_Python_3=2E4_document?= Message-ID: <3dShXP4spqzS6T@mail.python.org> http://hg.python.org/cpython/rev/13e47495a7be changeset: 87552:13e47495a7be user: Victor Stinner date: Mon Nov 25 09:40:27 2013 +0100 summary: Mention the new tracemalloc module in the What's New in Python 3.4 document files: Doc/whatsnew/3.4.rst | 18 ++++++++++++++++++ 1 files changed, 18 insertions(+), 0 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -97,6 +97,7 @@ * :mod:`selectors`: High-level and efficient I/O multiplexing, built upon the :mod:`select` module primitives. * :mod:`statistics`: A basic numerically stable statistics library (:pep:`450`). +* :mod:`tracemalloc`: Trace Python memory allocations (:pep:`454`). New expected features for Python implementations: @@ -400,6 +401,23 @@ PEP written and implemented by Steven D'Aprano +tracemalloc +----------- + +The new :mod:`tracemalloc` module (defined in :pep:`454`) is a debug tool to +trace memory blocks allocated by Python. It provides the following information: + +* Traceback where an object was allocated +* Statistics on allocated memory blocks per filename and per line number: + total size, number and average size of allocated memory blocks +* Compute the differences between two snapshots to detect memory leaks + +.. seealso:: + + :pep:`454` - Add a new tracemalloc module to trace Python memory allocations + PEP written and implemented by Victor Stinner + + Improved Modules ================ -- Repository URL: http://hg.python.org/cpython From ncoghlan at gmail.com Mon Nov 25 09:46:26 2013 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 25 Nov 2013 18:46:26 +1000 Subject: [Python-checkins] cpython (2.7): Fix test_fcntl to run properly on systems that do not support the flags In-Reply-To: <3dSbKV48lqz7Ljx@mail.python.org> References: <3dSbKV48lqz7Ljx@mail.python.org> Message-ID: On 25 Nov 2013 14:46, "gregory.p.smith" wrote: > > http://hg.python.org/cpython/rev/cac7319c5972 > changeset: 87537:cac7319c5972 > branch: 2.7 > parent: 87534:3981e57a7bdc > user: Gregory P. Smith > date: Mon Nov 25 04:45:27 2013 +0000 > summary: > Fix test_fcntl to run properly on systems that do not support the flags > used in the "does the value get passed in properly" test. 
> > files: > Lib/test/test_fcntl.py | 3 +++ > 1 files changed, 3 insertions(+), 0 deletions(-) > > > diff --git a/Lib/test/test_fcntl.py b/Lib/test/test_fcntl.py > --- a/Lib/test/test_fcntl.py > +++ b/Lib/test/test_fcntl.py > @@ -113,7 +113,10 @@ > self.skipTest("F_NOTIFY or DN_MULTISHOT unavailable") > fd = os.open(os.path.dirname(os.path.abspath(TESTFN)), os.O_RDONLY) > try: > + # This will raise OverflowError if issue1309352 is present. > fcntl.fcntl(fd, cmd, flags) > + except IOError: > + pass # Running on a system that doesn't support these flags. > finally: > os.close(fd) Raising a skip is generally preferred to marking a test that can't be executed as passing. Cheers, Nick. > > > -- > Repository URL: http://hg.python.org/cpython > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... URL: From python-checkins at python.org Mon Nov 25 10:44:48 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 10:44:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319753=3A_New_try_?= =?utf-8?q?to_fix_test=5Fgdb_on_System_Z_buildbot?= Message-ID: <3dSjyS2MNMz7LjQ@mail.python.org> http://hg.python.org/cpython/rev/6ec6facb69ca changeset: 87553:6ec6facb69ca user: Victor Stinner date: Mon Nov 25 10:43:59 2013 +0100 summary: Issue #19753: New try to fix test_gdb on System Z buildbot files: Lib/test/test_gdb.py | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py --- a/Lib/test/test_gdb.py +++ b/Lib/test/test_gdb.py @@ -170,7 +170,9 @@ 'Do you need "set solib-search-path" or ' '"set sysroot"?', 'warning: Source file is more recent than executable.', + # Issue #19753: missing symbols on System Z 'Missing separate debuginfo for ', + 'Try: zypper install -C ', ) for line in errlines: if not line.startswith(ignore_patterns): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 11:53:10 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 11:53:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_PEP_454=3A_Remove_set=5Ftrace?= =?utf-8?q?back=5Flimit=28=29=3B_update_the_PEP_to_reflect_the?= Message-ID: <3dSlTL3zzdzQrN@mail.python.org> http://hg.python.org/peps/rev/8d1157cf101e changeset: 5321:8d1157cf101e user: Victor Stinner date: Mon Nov 25 11:53:03 2013 +0100 summary: PEP 454: Remove set_traceback_limit(); update the PEP to reflect the implementation files: pep-0454.txt | 43 ++++++++++++++++++--------------------- 1 files changed, 20 insertions(+), 23 deletions(-) diff --git a/pep-0454.txt b/pep-0454.txt --- a/pep-0454.txt +++ b/pep-0454.txt @@ -126,10 +126,10 @@ Get the maximum number of frames stored in the traceback of a trace. - By default, a trace of a memory block only stores the most recent - frame: the limit is ``1``. + The ``tracemalloc`` module must be tracing memory allocations to get + the limit, otherwise an exception is raised. - Use the ``set_traceback_limit()`` function to change the limit. + The limit is set by the ``start()`` function. ``get_traced_memory()`` function: @@ -152,33 +152,29 @@ See also ``start()`` and ``stop()`` functions. -``set_traceback_limit(nframe: int)`` function: +``start(nframe: int=1)`` function: - Set the maximum number of frames stored in the traceback of a trace. 
- *nframe* must be greater or equal to ``1``. + Start tracing Python memory allocations: install hooks on Python + memory allocators. Collected tracebacks of traces will be limited to + *nframe* frames. By default, a trace of a memory block only stores + the most recent frame: the limit is ``1``. *nframe* must be greater + or equal to ``1``. - Storing more frames allocates more memory of the ``tracemalloc`` - module and makes Python slower. Use the ``get_tracemalloc_memory()`` + Storing more than ``1`` frame is only useful to compute statistics + grouped by ``'traceback'`` or to compute cumulative statistics: see + the ``Snapshot.compare_to()`` and ``Snapshot.statistics()`` methods. + + Storing more frames increases the memory and CPU overhead of the + ``tracemalloc`` module. Use the ``get_tracemalloc_memory()`` function to measure how much memory is used by the ``tracemalloc`` module. - Storing more than ``1`` frame is only useful to compute statistics - grouped by ``'traceback'`` or to compute cumulative statistics: see - the ``Snapshot.statistics()`` method. - - Use the ``get_traceback_limit()`` function to get the current limit. - The ``PYTHONTRACEMALLOC`` environment variable (``PYTHONTRACEMALLOC=NFRAME``) and the ``-X`` ``tracemalloc=NFRAME`` - command line option can be used to set the limit at startup. + command line option can be used to start tracing at startup. - -``start()`` function: - - Start tracing Python memory allocations: install hooks on Python - memory allocators. - - See also ``stop()`` and ``is_tracing()`` functions. + See also ``stop()``, ``is_tracing()`` and ``get_traceback_limit()`` + functions. ``stop()`` function: @@ -202,7 +198,8 @@ ``tracemalloc`` module started to trace memory allocations. Tracebacks of traces are limited to ``get_traceback_limit()`` - frames. Use ``set_traceback_limit()`` to store more frames. + frames. Use the *nframe* parameter of the ``start()`` function to + store more frames. The ``tracemalloc`` module must be tracing memory allocations to take a snapshot, see the the ``start()`` function. -- Repository URL: http://hg.python.org/peps From jimjjewett at gmail.com Mon Nov 25 16:04:21 2013 From: jimjjewett at gmail.com (Jim Jewett) Date: Mon, 25 Nov 2013 10:04:21 -0500 Subject: [Python-checkins] cpython: Close #19762: Fix name of _get_traces() and _get_object_traceback() function In-Reply-To: <3dShPg2yw6z7LmL@mail.python.org> References: <3dShPg2yw6z7LmL@mail.python.org> Message-ID: Why are these functions (get_traces and get_object_traceback) private? (1) Is the whole module provisional? At one point, I had thought so, but I don't see that in the PEP or implementation. (I'm not sure that it should be provisional, but I want to be sure that the decision is intentional.) (2) This implementation does lock in certain choices about the nature of traces. (What data to include for analysis vs excluding to save memory; which events are tracked separately and which combined into a single total; organizing the data that is saved in a hash by certain keys; etc) While I would prefer more flexibility, the existing code provides a reasonable default, and I can't forsee changing traces so much that these functions *can't* be reasonably supported unless the rest of the module API changes too. (3) get_object_traceback is the killer app that justifies the specific data-collection choices Victor made; if it isn't public, the implementation starts to look overbuilt. 
(4) get_traces is about the only way to get at even the all the data that *is* stored, prior to additional summarization. If it isn't public, those default summarization options become even more locked in.. -jJ On Mon, Nov 25, 2013 at 3:34 AM, victor.stinner wrote: > http://hg.python.org/cpython/rev/2e2ec595dc58 > changeset: 87551:2e2ec595dc58 > user: Victor Stinner > date: Mon Nov 25 09:33:18 2013 +0100 > summary: > Close #19762: Fix name of _get_traces() and _get_object_traceback() > function > name in their docstring. Patch written by Vajrasky Kok. > > files: > Modules/_tracemalloc.c | 4 ++-- > 1 files changed, 2 insertions(+), 2 deletions(-) > > > diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c > --- a/Modules/_tracemalloc.c > +++ b/Modules/_tracemalloc.c > @@ -1018,7 +1018,7 @@ > } > > PyDoc_STRVAR(tracemalloc_get_traces_doc, > - "get_traces() -> list\n" > + "_get_traces() -> list\n" > "\n" > "Get traces of all memory blocks allocated by Python.\n" > "Return a list of (size: int, traceback: tuple) tuples.\n" > @@ -1083,7 +1083,7 @@ > } > > PyDoc_STRVAR(tracemalloc_get_object_traceback_doc, > - "get_object_traceback(obj)\n" > + "_get_object_traceback(obj)\n" > "\n" > "Get the traceback where the Python object obj was allocated.\n" > "Return a tuple of (filename: str, lineno: int) tuples.\n" > > -- > Repository URL: http://hg.python.org/cpython > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From python-checkins at python.org Mon Nov 25 18:44:58 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 18:44:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Change_mock_pip?= =?utf-8?q?e_to_mock_socket=2E_Hope_to_fix_issue_19750=2E?= Message-ID: <3dSwcV2spVz7LjP@mail.python.org> http://hg.python.org/cpython/rev/871d496fa06c changeset: 87554:871d496fa06c user: Guido van Rossum date: Mon Nov 25 09:43:52 2013 -0800 summary: asyncio: Change mock pipe to mock socket. Hope to fix issue 19750. files: Lib/test/test_asyncio/test_unix_events.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -379,7 +379,7 @@ fstat_patcher = unittest.mock.patch('os.fstat') m_fstat = fstat_patcher.start() st = unittest.mock.Mock() - st.st_mode = stat.S_IFIFO + st.st_mode = stat.S_IFSOCK m_fstat.return_value = st self.addCleanup(fstat_patcher.stop) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 19:06:44 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 19:06:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Hopeful_fix_for?= =?utf-8?q?_issue_19765=2E?= Message-ID: <3dSx5c3yHvz7LjM@mail.python.org> http://hg.python.org/cpython/rev/368b74823c76 changeset: 87555:368b74823c76 user: Guido van Rossum date: Mon Nov 25 10:06:34 2013 -0800 summary: asyncio: Hopeful fix for issue 19765. 
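The one-line fix below makes the test wait until the protocol object actually exists instead of relying on a single run_briefly() pass. A hypothetical, simplified version of the test_utils.run_until() helper it calls (the real helper lives in Lib/test/test_asyncio/test_utils.py; this sketch only illustrates the polling idea):

    import asyncio
    import time

    def run_until(loop, predicate, timeout=10):
        # Keep spinning the event loop until the predicate holds or we time out.
        deadline = time.monotonic() + timeout
        while not predicate():
            if time.monotonic() > deadline:
                raise TimeoutError("condition not met within %r seconds" % timeout)
            loop.run_until_complete(asyncio.sleep(0.001))
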
files: Lib/test/test_asyncio/test_events.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -560,6 +560,7 @@ client.connect(('127.0.0.1', port)) client.sendall(b'xxx') test_utils.run_briefly(self.loop) + test_utils.run_until(self.loop, lambda: proto is not None, 10) self.assertIsInstance(proto, MyProto) self.assertEqual('INITIAL', proto.state) test_utils.run_briefly(self.loop) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 19:08:40 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 25 Nov 2013 19:08:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Njkx?= =?utf-8?q?=3A_remove_outdated_mention_about_RuntimeError?= Message-ID: <3dSx7r13gKz7Llr@mail.python.org> http://hg.python.org/cpython/rev/313d9bb253bf changeset: 87556:313d9bb253bf branch: 2.7 parent: 87544:74b1b3ebf454 user: Antoine Pitrou date: Mon Nov 25 19:08:32 2013 +0100 summary: Issue #19691: remove outdated mention about RuntimeError files: Doc/library/exceptions.rst | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Doc/library/exceptions.rst b/Doc/library/exceptions.rst --- a/Doc/library/exceptions.rst +++ b/Doc/library/exceptions.rst @@ -286,8 +286,7 @@ Raised when an error is detected that doesn't fall in any of the other categories. The associated value is a string indicating what precisely went - wrong. (This exception is mostly a relic from a previous version of the - interpreter; it is not used very much any more.) + wrong. .. exception:: StopIteration -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 19:12:53 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 25 Nov 2013 19:12:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Njkx?= =?utf-8?q?=3A_remove_outdated_mention_about_RuntimeError?= Message-ID: <3dSxDj5r63zN4d@mail.python.org> http://hg.python.org/cpython/rev/6aeaaa614a19 changeset: 87557:6aeaaa614a19 branch: 3.3 parent: 87548:6cc605e8c270 user: Antoine Pitrou date: Mon Nov 25 19:08:32 2013 +0100 summary: Issue #19691: remove outdated mention about RuntimeError files: Doc/library/exceptions.rst | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Doc/library/exceptions.rst b/Doc/library/exceptions.rst --- a/Doc/library/exceptions.rst +++ b/Doc/library/exceptions.rst @@ -281,8 +281,7 @@ Raised when an error is detected that doesn't fall in any of the other categories. The associated value is a string indicating what precisely went - wrong. (This exception is mostly a relic from a previous version of the - interpreter; it is not used very much any more.) + wrong. .. 
exception:: StopIteration -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 19:12:55 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 25 Nov 2013 19:12:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319691=3A_remove_outdated_mention_about_RuntimeE?= =?utf-8?q?rror?= Message-ID: <3dSxDl0dKyz7LkR@mail.python.org> http://hg.python.org/cpython/rev/f4de1c5e381d changeset: 87558:f4de1c5e381d parent: 87555:368b74823c76 parent: 87557:6aeaaa614a19 user: Antoine Pitrou date: Mon Nov 25 19:11:07 2013 +0100 summary: Issue #19691: remove outdated mention about RuntimeError files: Doc/library/exceptions.rst | 3 +-- 1 files changed, 1 insertions(+), 2 deletions(-) diff --git a/Doc/library/exceptions.rst b/Doc/library/exceptions.rst --- a/Doc/library/exceptions.rst +++ b/Doc/library/exceptions.rst @@ -287,8 +287,7 @@ Raised when an error is detected that doesn't fall in any of the other categories. The associated value is a string indicating what precisely went - wrong. (This exception is mostly a relic from a previous version of the - interpreter; it is not used very much any more.) + wrong. .. exception:: StopIteration -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 19:52:00 2013 From: python-checkins at python.org (antoine.pitrou) Date: Mon, 25 Nov 2013 19:52:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319742=3A_fix_a_te?= =?utf-8?q?st=5Fpathlib_failure_when_a_file_owner_or_group_isn=27t_in?= Message-ID: <3dSy5r3FwMz7Ljm@mail.python.org> http://hg.python.org/cpython/rev/b58b58948c27 changeset: 87559:b58b58948c27 user: Antoine Pitrou date: Mon Nov 25 19:51:53 2013 +0100 summary: Issue #19742: fix a test_pathlib failure when a file owner or group isn't in the system database files: Lib/test/test_pathlib.py | 12 ++++++++++-- 1 files changed, 10 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1322,14 +1322,22 @@ def test_owner(self): p = self.cls(BASE) / 'fileA' uid = p.stat().st_uid - name = pwd.getpwuid(uid).pw_name + try: + name = pwd.getpwuid(uid).pw_name + except KeyError: + self.skipTest( + "user %d doesn't have an entry in the system database" % uid) self.assertEqual(name, p.owner()) @unittest.skipUnless(grp, "the grp module is needed for this test") def test_group(self): p = self.cls(BASE) / 'fileA' gid = p.stat().st_gid - name = grp.getgrgid(gid).gr_name + try: + name = grp.getgrgid(gid).gr_name + except KeyError: + self.skipTest( + "group %d doesn't have an entry in the system database" % gid) self.assertEqual(name, p.group()) def test_unlink(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 20:36:17 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Mon, 25 Nov 2013 20:36:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319739=3A_Try_to_f?= =?utf-8?q?ix_compiler_warnings_on_32-bit_Windows=2E?= Message-ID: <3dSz4x6BXRz7LmB@mail.python.org> http://hg.python.org/cpython/rev/f8ac01a762c1 changeset: 87560:f8ac01a762c1 user: Alexandre Vassalotti date: Mon Nov 25 11:35:46 2013 -0800 summary: Issue #19739: Try to fix compiler warnings on 32-bit Windows. 
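The hunk below reworks _write_size64() into a byte-by-byte loop, presumably because shifting a 32-bit size_t by 32 or more bits is what triggers the MSVC warnings. In Python terms, the 8-byte little-endian size header it emits corresponds to struct.pack('<Q', n); a small sketch (an illustration of the layout, not the C code itself):

    import struct

    def write_size64(value):
        # Same layout _write_size64() produces: 8 bytes, least-significant byte first.
        return struct.pack('<Q', value)

    assert write_size64(0x12345678) == bytes([0x78, 0x56, 0x34, 0x12, 0, 0, 0, 0])
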
files: Modules/_pickle.c | 43 +++++++++++++++++----------------- 1 files changed, 21 insertions(+), 22 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -710,17 +710,15 @@ static void _write_size64(char *out, size_t value) { - out[0] = (unsigned char)(value & 0xff); - out[1] = (unsigned char)((value >> 8) & 0xff); - out[2] = (unsigned char)((value >> 16) & 0xff); - out[3] = (unsigned char)((value >> 24) & 0xff); - if (sizeof(size_t) >= 8) { - out[4] = (unsigned char)((value >> 32) & 0xff); - out[5] = (unsigned char)((value >> 40) & 0xff); - out[6] = (unsigned char)((value >> 48) & 0xff); - out[7] = (unsigned char)((value >> 56) & 0xff); - } else { - out[4] = out[5] = out[6] = out[7] = 0; + int i; + + assert(sizeof(size_t) <= 8); + + for (i = 0; i < sizeof(size_t); i++) { + out[i] = (unsigned char)((value >> (8 * i)) & 0xff); + } + for (i = sizeof(size_t); i < 8; i++) { + out[i] = 0; } } @@ -1644,8 +1642,16 @@ } else if (self->bin && (sizeof(long) <= 4 || - (val <= 0x7fffffffL && val >= -0x80000000L))) { - /* result fits in a signed 4-byte integer */ + (val <= 0x7fffffffL && val >= (-0x7fffffffL - 1)))) { + /* result fits in a signed 4-byte integer. + + Note: we can't use -0x80000000L in the above condition because some + compilers (e.g., MSVC) will promote 0x80000000L to an unsigned type + before applying the unary minus when sizeof(long) <= 4. The + resulting value stays unsigned which is commonly not what we want, + so MSVC happily warns us about it. However, that result would have + been fine because we guard for sizeof(long) <= 4 which turns the + condition true in that particular case. */ char pdata[32]; Py_ssize_t len = 0; @@ -1904,11 +1910,8 @@ len = 5; } else if (self->proto >= 4) { - int i; header[0] = BINBYTES8; - for (i = 0; i < 8; i++) { - _write_size64(header + 1, size); - } + _write_size64(header + 1, size); len = 8; } else { @@ -2017,12 +2020,8 @@ len = 5; } else if (self->proto >= 4) { - int i; - header[0] = BINUNICODE8; - for (i = 0; i < 8; i++) { - _write_size64(header + 1, size); - } + _write_size64(header + 1, size); len = 9; } else { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 20:45:05 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 20:45:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Document_open=5Fconnection=28?= =?utf-8?q?=29=2C_start=5Fserver=28=29=2C_and_related_Stream*_classes=2E?= Message-ID: <3dSzH55pTTzPK0@mail.python.org> http://hg.python.org/peps/rev/4fef14a5f863 changeset: 5322:4fef14a5f863 user: Guido van Rossum date: Mon Nov 25 11:44:59 2013 -0800 summary: Document open_connection(), start_server(), and related Stream* classes. files: pep-3156.txt | 94 ++++++++++++++++++++++++++++++++++++++- 1 files changed, 91 insertions(+), 3 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1680,6 +1680,97 @@ protocol implementation should be provided that sets this up and kicks off the coroutine when ``connection_made()`` is called. +Convenience Utilities +--------------------- + +A few functions and classes are provided to simplify the writing of +basic stream-based clients and servers. Thes are: + +- ``asyncio.open_connection(host, port)``: A wrapper for + ``EventLoop.create_connection()`` that does not require you to + provide a ``Protocol`` factory or class. 
This is a coroutine that + returns a ``(reader, writer)`` pair, where ``reader`` is an instance + of ``StreamReader`` and ``writer`` is an instance of + ``StreamWriter`` (both described below). + +- ``asyncio.start_server(client_connected_cb, host, port)``: A wrapper + for ``EventLoop.create_server()`` that takes a simple callback + function rather than a ``Protocol`` factory or class. This is a + coroutine that returns a ``Server`` object just as + ``create_server()`` does. Each time a client connection is + accepted, ``client_connected_cb(reader, writer)`` is called, where + ``reader`` is an instance of ``StreamReader`` and ``writer`` is an + instance of ``StreamWriter`` (both described below). If the result + returned by ``client_connected_cb()`` is a coroutine, it is + automatically wrapped in a ``Task``. + +- ``StreamReader``: A class offering an interface not unlike that of a + read-only binary stream, except that the various reading methods are + coroutines. It is normally driven by a ``StreamReaderProtocol`` + instance. Note that there should be only one reader. The interface + for the reader is: + + - ``readline()``: A coroutine that reads a string of bytes + representing a line of text ending in ``'\n'``, or until the end + of the stream, whichever comes first. + + - ``read(n)``: A coroutine that reads up to ``n`` bytes. If ``n`` + is omitted or negative, it reads until the end of the stream. + + - ``readexactly(n)``: A coroutine that reads exactly ``n`` bytes, or + until the end of the stream, whichever comes first. + + - ``exception()``: Return the exception that has been set on the + stream using ``set_exception()``, or None if no exception is set. + + The interface for the driver is: + + - ``feed_data(data)``: Append ``data`` (a ``bytes`` object) to the + internal buffer. This unblocks a blocked reading coroutine if it + provides sufficient data to fulfill the reader's contract. + + - ``feed_eof()``: Signal the end of the buffer. This unclocks a + blocked reading coroutine. No more data should be fed to the + reader after this call. + + - ``set_exception(exc)``: Set an exception on the stream. All + subsequent reading methods will raise this exception. No more + data should be fed to the reader after this call. + +- ``StreamWriter``: A class offering an interface not unlike that of a + write-only binary stream. It wraps a transport. The interface is + an extended subset of the transport interface: the following methods + behave the same as the corresponding transport methods: ``write()``, + ``writelines()``, ``write_eof()``, ``can_write_eof()``, + ``get_extra_info()``, ``close()``. Note that the writing methods + are _not_ coroutines (this is the same as for transports, but + different from the ``StreamReader`` class). The following method is + in addition to the transport interface: + + - ``drain()``: This should be called with ``yield from`` after + writing significant data, for the purpose of flow control. The + intended use is like this:: + + writer.write(data) + yield from writer.drain() + + Note that this is not technically a coroutine: it returns either a + Future or an empty tuple (both can be passed to ``yield from``). + Use of this method is optional. However, when it is not used, the + internal buffer of the transport underlying the ``StreamWriter`` + may fill up with all data that was ever written to the writer. 
If + an app does not have a strict limit on how much data it writes, it + _should_ call ``yield from drain()`` occasionally to avoid filling + up the transport buffer. + +- ``StreamReaderProtocol``: A protocol implementation used as an + adapter between the bidirectional stream transport/protocol + interface and the ``StreamReader`` and ``StreamWriter`` classes. It + acts as a driver for a specific ``StreamReader`` instance, calling + its methods ``feed_data()``, ``feed_eof()``, and ``set_exception()`` + in response to various protocol callbacks. It also controls the + behavior of the ``drain()`` method of the ``StreamWriter`` instance. + Synchronization =============== @@ -1754,9 +1845,6 @@ TO DO ===== -- Document ``StreamReader``, ``StreamWriter``, ``open_connection()``, - and ``start_server()``. - - Document all places that take a ``loop`` keyword argument. - Document ``logger`` object. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Mon Nov 25 21:47:19 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 21:47:19 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_=2319778=3A_fix_a_couple_o?= =?utf-8?q?f_re_reprs_in_the_documentation=2E?= Message-ID: <3dT0fv6dSHz7Ljf@mail.python.org> http://hg.python.org/cpython/rev/5237e7da8a76 changeset: 87561:5237e7da8a76 user: Ezio Melotti date: Mon Nov 25 22:47:01 2013 +0200 summary: #19778: fix a couple of re reprs in the documentation. files: Doc/howto/regex.rst | 8 ++++---- 1 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Doc/howto/regex.rst b/Doc/howto/regex.rst --- a/Doc/howto/regex.rst +++ b/Doc/howto/regex.rst @@ -271,8 +271,8 @@ >>> import re >>> p = re.compile('ab*') - >>> p #doctest: +ELLIPSIS - <_sre.SRE_Pattern object at 0x...> + >>> p + re.compile('ab*') :func:`re.compile` also accepts an optional *flags* argument, used to enable various special features and syntax variations. We'll go over the available @@ -383,8 +383,8 @@ >>> import re >>> p = re.compile('[a-z]+') - >>> p #doctest: +ELLIPSIS - <_sre.SRE_Pattern object at 0x...> + >>> p + re.compile('[a-z]+') Now, you can try matching various strings against the RE ``[a-z]+``. An empty string shouldn't match at all, since ``+`` means 'one or more repetitions'. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 22:03:46 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Mon, 25 Nov 2013 22:03:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Combine_=5FPickler=5FFastC?= =?utf-8?q?all_and_=5FUnpickler=5FFastCall_in_cpickle=2E?= Message-ID: <3dT11t5K49z7Lkf@mail.python.org> http://hg.python.org/cpython/rev/e39db21df580 changeset: 87562:e39db21df580 user: Alexandre Vassalotti date: Mon Nov 25 13:03:32 2013 -0800 summary: Combine _Pickler_FastCall and _Unpickler_FastCall in cpickle. files: Modules/_pickle.c | 129 +++++++++------------------------ 1 files changed, 36 insertions(+), 93 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -346,7 +346,6 @@ pickling. */ PyObject *pers_func; /* persistent_id() method, can be NULL */ PyObject *dispatch_table; /* private dispatch_table, can be NULL */ - PyObject *arg; PyObject *write; /* write() method of the output stream. 
*/ PyObject *output_buffer; /* Write into a local bytearray buffer before @@ -383,7 +382,6 @@ Py_ssize_t memo_size; /* Capacity of the memo array */ Py_ssize_t memo_len; /* Number of objects in the memo */ - PyObject *arg; PyObject *pers_func; /* persistent_load() method, can be NULL. */ Py_buffer buffer; @@ -639,58 +637,29 @@ /*************************************************************************/ -/* Helpers for creating the argument tuple passed to functions. This has the - performance advantage of calling PyTuple_New() only once. - - XXX(avassalotti): Inline directly in _Pickler_FastCall() and - _Unpickler_FastCall(). */ -#define ARG_TUP(self, obj) do { \ - if ((self)->arg || ((self)->arg=PyTuple_New(1))) { \ - Py_XDECREF(PyTuple_GET_ITEM((self)->arg, 0)); \ - PyTuple_SET_ITEM((self)->arg, 0, (obj)); \ - } \ - else { \ - Py_DECREF((obj)); \ - } \ - } while (0) - -#define FREE_ARG_TUP(self) do { \ - if ((self)->arg->ob_refcnt > 1) \ - Py_CLEAR((self)->arg); \ - } while (0) - -/* A temporary cleaner API for fast single argument function call. - - XXX: Does caching the argument tuple provides any real performance benefits? - - A quick benchmark, on a 2.0GHz Athlon64 3200+ running Linux 2.6.24 with - glibc 2.7, tells me that it takes roughly 20,000,000 PyTuple_New(1) calls - when the tuple is retrieved from the freelist (i.e, call PyTuple_New() then - immediately DECREF it) and 1,200,000 calls when allocating brand new tuples - (i.e, call PyTuple_New() and store the returned value in an array), to save - one second (wall clock time). Either ways, the loading time a pickle stream - large enough to generate this number of calls would be massively - overwhelmed by other factors, like I/O throughput, the GC traversal and - object allocation overhead. So, I really doubt these functions provide any - real benefits. - - On the other hand, oprofile reports that pickle spends a lot of time in - these functions. But, that is probably more related to the function call - overhead, than the argument tuple allocation. - - XXX: And, what is the reference behavior of these? Steal, borrow? At first - glance, it seems to steal the reference of 'arg' and borrow the reference - of 'func'. */ +/* Helper for calling a function with a single argument quickly. + + This has the performance advantage of reusing the argument tuple. This + provides a nice performance boost with older pickle protocols where many + unbuffered reads occurs. + + This function steals the reference of the given argument. */ static PyObject * -_Pickler_FastCall(PicklerObject *self, PyObject *func, PyObject *arg) -{ - PyObject *result = NULL; - - ARG_TUP(self, arg); - if (self->arg) { - result = PyObject_Call(func, self->arg, NULL); - FREE_ARG_TUP(self); - } +_Pickle_FastCall(PyObject *func, PyObject *arg) +{ + static PyObject *arg_tuple = NULL; + PyObject *result; + + if (arg_tuple == NULL) { + arg_tuple = PyTuple_New(1); + if (arg_tuple == NULL) { + Py_DECREF(arg); + return NULL; + } + } + PyTuple_SET_ITEM(arg_tuple, 0, arg); + result = PyObject_Call(func, arg_tuple, NULL); + Py_DECREF(arg); return result; } @@ -787,7 +756,7 @@ if (output == NULL) return -1; - result = _Pickler_FastCall(self, self->write, output); + result = _Pickle_FastCall(self->write, output); Py_XDECREF(result); return (result == NULL) ? 
-1 : 0; } @@ -853,7 +822,6 @@ self->pers_func = NULL; self->dispatch_table = NULL; - self->arg = NULL; self->write = NULL; self->proto = 0; self->bin = 0; @@ -922,20 +890,6 @@ return 0; } -/* See documentation for _Pickler_FastCall(). */ -static PyObject * -_Unpickler_FastCall(UnpicklerObject *self, PyObject *func, PyObject *arg) -{ - PyObject *result = NULL; - - ARG_TUP(self, arg); - if (self->arg) { - result = PyObject_Call(func, self->arg, NULL); - FREE_ARG_TUP(self); - } - return result; -} - /* Returns the size of the input on success, -1 on failure. This takes its own reference to `input`. */ static Py_ssize_t @@ -1006,7 +960,7 @@ PyObject *len = PyLong_FromSsize_t(n); if (len == NULL) return -1; - data = _Unpickler_FastCall(self, self->read, len); + data = _Pickle_FastCall(self->read, len); } if (data == NULL) return -1; @@ -1019,7 +973,7 @@ Py_DECREF(data); return -1; } - prefetched = _Unpickler_FastCall(self, self->peek, len); + prefetched = _Pickle_FastCall(self->peek, len); if (prefetched == NULL) { if (PyErr_ExceptionMatches(PyExc_NotImplementedError)) { /* peek() is probably not supported by the given file object */ @@ -1229,7 +1183,6 @@ if (self == NULL) return NULL; - self->arg = NULL; self->pers_func = NULL; self->input_buffer = NULL; self->input_line = NULL; @@ -3172,7 +3125,7 @@ const char binpersid_op = BINPERSID; Py_INCREF(obj); - pid = _Pickler_FastCall(self, func, obj); + pid = _Pickle_FastCall(func, obj); if (pid == NULL) return -1; @@ -3600,7 +3553,7 @@ } if (reduce_func != NULL) { Py_INCREF(obj); - reduce_value = _Pickler_FastCall(self, reduce_func, obj); + reduce_value = _Pickle_FastCall(reduce_func, obj); } else if (PyType_IsSubtype(type, &PyType_Type)) { status = save_global(self, obj, NULL); @@ -3625,7 +3578,7 @@ PyObject *proto; proto = PyLong_FromLong(self->proto); if (proto != NULL) { - reduce_value = _Pickler_FastCall(self, reduce_func, proto); + reduce_value = _Pickle_FastCall(reduce_func, proto); } } else { @@ -3794,7 +3747,6 @@ Py_XDECREF(self->write); Py_XDECREF(self->pers_func); Py_XDECREF(self->dispatch_table); - Py_XDECREF(self->arg); Py_XDECREF(self->fast_memo); PyMemoTable_Del(self->memo); @@ -3808,7 +3760,6 @@ Py_VISIT(self->write); Py_VISIT(self->pers_func); Py_VISIT(self->dispatch_table); - Py_VISIT(self->arg); Py_VISIT(self->fast_memo); return 0; } @@ -3820,7 +3771,6 @@ Py_CLEAR(self->write); Py_CLEAR(self->pers_func); Py_CLEAR(self->dispatch_table); - Py_CLEAR(self->arg); Py_CLEAR(self->fast_memo); if (self->memo != NULL) { @@ -3943,7 +3893,6 @@ return NULL; } - self->arg = NULL; self->fast = 0; self->fast_nesting = 0; self->fast_memo = NULL; @@ -5208,9 +5157,9 @@ if (pid == NULL) return -1; - /* Ugh... this does not leak since _Unpickler_FastCall() steals the - reference to pid first. */ - pid = _Unpickler_FastCall(self, self->pers_func, pid); + /* This does not leak since _Pickle_FastCall() steals the reference + to pid first. */ + pid = _Pickle_FastCall(self->pers_func, pid); if (pid == NULL) return -1; @@ -5235,9 +5184,9 @@ if (pid == NULL) return -1; - /* Ugh... this does not leak since _Unpickler_FastCall() steals the + /* This does not leak since _Pickle_FastCall() steals the reference to pid first. 
*/ - pid = _Unpickler_FastCall(self, self->pers_func, pid); + pid = _Pickle_FastCall(self->pers_func, pid); if (pid == NULL) return -1; @@ -5585,7 +5534,7 @@ PyObject *result; value = self->stack->data[i]; - result = _Unpickler_FastCall(self, append_func, value); + result = _Pickle_FastCall(append_func, value); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = x; @@ -5700,7 +5649,7 @@ PyObject *item; item = self->stack->data[i]; - result = _Unpickler_FastCall(self, add_func, item); + result = _Pickle_FastCall(add_func, item); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = mark; @@ -5747,9 +5696,7 @@ PyObject *result; /* The explicit __setstate__ is responsible for everything. */ - /* Ugh... this does not leak since _Unpickler_FastCall() steals the - reference to state first. */ - result = _Unpickler_FastCall(self, setstate, state); + result = _Pickle_FastCall(setstate, state); Py_DECREF(setstate); if (result == NULL) return -1; @@ -6249,7 +6196,6 @@ Py_XDECREF(self->peek); Py_XDECREF(self->stack); Py_XDECREF(self->pers_func); - Py_XDECREF(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6272,7 +6218,6 @@ Py_VISIT(self->peek); Py_VISIT(self->stack); Py_VISIT(self->pers_func); - Py_VISIT(self->arg); return 0; } @@ -6284,7 +6229,6 @@ Py_CLEAR(self->peek); Py_CLEAR(self->stack); Py_CLEAR(self->pers_func); - Py_CLEAR(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6423,7 +6367,6 @@ if (self->memo == NULL) return NULL; - self->arg = NULL; self->proto = 0; return Py_None; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 22:20:39 2013 From: python-checkins at python.org (ezio.melotti) Date: Mon, 25 Nov 2013 22:20:39 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbjogIzEzNTkyLCAjMTcwODc6IGFk?= =?utf-8?q?d_whatsnew_entry_about_regex/match_object_repr_improvements=2E?= Message-ID: <3dT1PM4604z7LmP@mail.python.org> http://hg.python.org/cpython/rev/4ba7a29fe02c changeset: 87563:4ba7a29fe02c user: Ezio Melotti date: Mon Nov 25 23:20:20 2013 +0200 summary: #13592, #17087: add whatsnew entry about regex/match object repr improvements. files: Doc/whatsnew/3.4.rst | 7 +++++++ 1 files changed, 7 insertions(+), 0 deletions(-) diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -693,6 +693,13 @@ which anchor the pattern at both ends of the string to match. (Contributed by Matthew Barnett in :issue:`16203`.) +The repr of :ref:`regex objects ` now includes the pattern +and the flags; the repr of :ref:`match objects ` now +includes the start, end, and the part of the string that matched. + +(Contributed by Serhiy Storchaka in :issue:`13592` and :issue:`17087`.) + + resource -------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 22:25:31 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Mon, 25 Nov 2013 22:25:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Reverting_e39db21df580_eag?= =?utf-8?q?erly_due_to_buildbot_failures=2E?= Message-ID: <3dT1Vz3SXHz7LlL@mail.python.org> http://hg.python.org/cpython/rev/2a1de922651a changeset: 87564:2a1de922651a user: Alexandre Vassalotti date: Mon Nov 25 13:25:12 2013 -0800 summary: Reverting e39db21df580 eagerly due to buildbot failures. 
files: Modules/_pickle.c | 129 ++++++++++++++++++++++++--------- 1 files changed, 93 insertions(+), 36 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -346,6 +346,7 @@ pickling. */ PyObject *pers_func; /* persistent_id() method, can be NULL */ PyObject *dispatch_table; /* private dispatch_table, can be NULL */ + PyObject *arg; PyObject *write; /* write() method of the output stream. */ PyObject *output_buffer; /* Write into a local bytearray buffer before @@ -382,6 +383,7 @@ Py_ssize_t memo_size; /* Capacity of the memo array */ Py_ssize_t memo_len; /* Number of objects in the memo */ + PyObject *arg; PyObject *pers_func; /* persistent_load() method, can be NULL. */ Py_buffer buffer; @@ -637,29 +639,58 @@ /*************************************************************************/ -/* Helper for calling a function with a single argument quickly. - - This has the performance advantage of reusing the argument tuple. This - provides a nice performance boost with older pickle protocols where many - unbuffered reads occurs. - - This function steals the reference of the given argument. */ +/* Helpers for creating the argument tuple passed to functions. This has the + performance advantage of calling PyTuple_New() only once. + + XXX(avassalotti): Inline directly in _Pickler_FastCall() and + _Unpickler_FastCall(). */ +#define ARG_TUP(self, obj) do { \ + if ((self)->arg || ((self)->arg=PyTuple_New(1))) { \ + Py_XDECREF(PyTuple_GET_ITEM((self)->arg, 0)); \ + PyTuple_SET_ITEM((self)->arg, 0, (obj)); \ + } \ + else { \ + Py_DECREF((obj)); \ + } \ + } while (0) + +#define FREE_ARG_TUP(self) do { \ + if ((self)->arg->ob_refcnt > 1) \ + Py_CLEAR((self)->arg); \ + } while (0) + +/* A temporary cleaner API for fast single argument function call. + + XXX: Does caching the argument tuple provides any real performance benefits? + + A quick benchmark, on a 2.0GHz Athlon64 3200+ running Linux 2.6.24 with + glibc 2.7, tells me that it takes roughly 20,000,000 PyTuple_New(1) calls + when the tuple is retrieved from the freelist (i.e, call PyTuple_New() then + immediately DECREF it) and 1,200,000 calls when allocating brand new tuples + (i.e, call PyTuple_New() and store the returned value in an array), to save + one second (wall clock time). Either ways, the loading time a pickle stream + large enough to generate this number of calls would be massively + overwhelmed by other factors, like I/O throughput, the GC traversal and + object allocation overhead. So, I really doubt these functions provide any + real benefits. + + On the other hand, oprofile reports that pickle spends a lot of time in + these functions. But, that is probably more related to the function call + overhead, than the argument tuple allocation. + + XXX: And, what is the reference behavior of these? Steal, borrow? At first + glance, it seems to steal the reference of 'arg' and borrow the reference + of 'func'. 
*/ static PyObject * -_Pickle_FastCall(PyObject *func, PyObject *arg) -{ - static PyObject *arg_tuple = NULL; - PyObject *result; - - if (arg_tuple == NULL) { - arg_tuple = PyTuple_New(1); - if (arg_tuple == NULL) { - Py_DECREF(arg); - return NULL; - } - } - PyTuple_SET_ITEM(arg_tuple, 0, arg); - result = PyObject_Call(func, arg_tuple, NULL); - Py_DECREF(arg); +_Pickler_FastCall(PicklerObject *self, PyObject *func, PyObject *arg) +{ + PyObject *result = NULL; + + ARG_TUP(self, arg); + if (self->arg) { + result = PyObject_Call(func, self->arg, NULL); + FREE_ARG_TUP(self); + } return result; } @@ -756,7 +787,7 @@ if (output == NULL) return -1; - result = _Pickle_FastCall(self->write, output); + result = _Pickler_FastCall(self, self->write, output); Py_XDECREF(result); return (result == NULL) ? -1 : 0; } @@ -822,6 +853,7 @@ self->pers_func = NULL; self->dispatch_table = NULL; + self->arg = NULL; self->write = NULL; self->proto = 0; self->bin = 0; @@ -890,6 +922,20 @@ return 0; } +/* See documentation for _Pickler_FastCall(). */ +static PyObject * +_Unpickler_FastCall(UnpicklerObject *self, PyObject *func, PyObject *arg) +{ + PyObject *result = NULL; + + ARG_TUP(self, arg); + if (self->arg) { + result = PyObject_Call(func, self->arg, NULL); + FREE_ARG_TUP(self); + } + return result; +} + /* Returns the size of the input on success, -1 on failure. This takes its own reference to `input`. */ static Py_ssize_t @@ -960,7 +1006,7 @@ PyObject *len = PyLong_FromSsize_t(n); if (len == NULL) return -1; - data = _Pickle_FastCall(self->read, len); + data = _Unpickler_FastCall(self, self->read, len); } if (data == NULL) return -1; @@ -973,7 +1019,7 @@ Py_DECREF(data); return -1; } - prefetched = _Pickle_FastCall(self->peek, len); + prefetched = _Unpickler_FastCall(self, self->peek, len); if (prefetched == NULL) { if (PyErr_ExceptionMatches(PyExc_NotImplementedError)) { /* peek() is probably not supported by the given file object */ @@ -1183,6 +1229,7 @@ if (self == NULL) return NULL; + self->arg = NULL; self->pers_func = NULL; self->input_buffer = NULL; self->input_line = NULL; @@ -3125,7 +3172,7 @@ const char binpersid_op = BINPERSID; Py_INCREF(obj); - pid = _Pickle_FastCall(func, obj); + pid = _Pickler_FastCall(self, func, obj); if (pid == NULL) return -1; @@ -3553,7 +3600,7 @@ } if (reduce_func != NULL) { Py_INCREF(obj); - reduce_value = _Pickle_FastCall(reduce_func, obj); + reduce_value = _Pickler_FastCall(self, reduce_func, obj); } else if (PyType_IsSubtype(type, &PyType_Type)) { status = save_global(self, obj, NULL); @@ -3578,7 +3625,7 @@ PyObject *proto; proto = PyLong_FromLong(self->proto); if (proto != NULL) { - reduce_value = _Pickle_FastCall(reduce_func, proto); + reduce_value = _Pickler_FastCall(self, reduce_func, proto); } } else { @@ -3747,6 +3794,7 @@ Py_XDECREF(self->write); Py_XDECREF(self->pers_func); Py_XDECREF(self->dispatch_table); + Py_XDECREF(self->arg); Py_XDECREF(self->fast_memo); PyMemoTable_Del(self->memo); @@ -3760,6 +3808,7 @@ Py_VISIT(self->write); Py_VISIT(self->pers_func); Py_VISIT(self->dispatch_table); + Py_VISIT(self->arg); Py_VISIT(self->fast_memo); return 0; } @@ -3771,6 +3820,7 @@ Py_CLEAR(self->write); Py_CLEAR(self->pers_func); Py_CLEAR(self->dispatch_table); + Py_CLEAR(self->arg); Py_CLEAR(self->fast_memo); if (self->memo != NULL) { @@ -3893,6 +3943,7 @@ return NULL; } + self->arg = NULL; self->fast = 0; self->fast_nesting = 0; self->fast_memo = NULL; @@ -5157,9 +5208,9 @@ if (pid == NULL) return -1; - /* This does not leak since _Pickle_FastCall() steals the 
reference - to pid first. */ - pid = _Pickle_FastCall(self->pers_func, pid); + /* Ugh... this does not leak since _Unpickler_FastCall() steals the + reference to pid first. */ + pid = _Unpickler_FastCall(self, self->pers_func, pid); if (pid == NULL) return -1; @@ -5184,9 +5235,9 @@ if (pid == NULL) return -1; - /* This does not leak since _Pickle_FastCall() steals the + /* Ugh... this does not leak since _Unpickler_FastCall() steals the reference to pid first. */ - pid = _Pickle_FastCall(self->pers_func, pid); + pid = _Unpickler_FastCall(self, self->pers_func, pid); if (pid == NULL) return -1; @@ -5534,7 +5585,7 @@ PyObject *result; value = self->stack->data[i]; - result = _Pickle_FastCall(append_func, value); + result = _Unpickler_FastCall(self, append_func, value); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = x; @@ -5649,7 +5700,7 @@ PyObject *item; item = self->stack->data[i]; - result = _Pickle_FastCall(add_func, item); + result = _Unpickler_FastCall(self, add_func, item); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = mark; @@ -5696,7 +5747,9 @@ PyObject *result; /* The explicit __setstate__ is responsible for everything. */ - result = _Pickle_FastCall(setstate, state); + /* Ugh... this does not leak since _Unpickler_FastCall() steals the + reference to state first. */ + result = _Unpickler_FastCall(self, setstate, state); Py_DECREF(setstate); if (result == NULL) return -1; @@ -6196,6 +6249,7 @@ Py_XDECREF(self->peek); Py_XDECREF(self->stack); Py_XDECREF(self->pers_func); + Py_XDECREF(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6218,6 +6272,7 @@ Py_VISIT(self->peek); Py_VISIT(self->stack); Py_VISIT(self->pers_func); + Py_VISIT(self->arg); return 0; } @@ -6229,6 +6284,7 @@ Py_CLEAR(self->peek); Py_CLEAR(self->stack); Py_CLEAR(self->pers_func); + Py_CLEAR(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6367,6 +6423,7 @@ if (self->memo == NULL) return NULL; + self->arg = NULL; self->proto = 0; return Py_None; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 23:20:16 2013 From: python-checkins at python.org (victor.stinner) Date: Mon, 25 Nov 2013 23:20:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319752=3A_Fix_=22H?= =?utf-8?q?AVE=5FDEV=5FPTMX=22_implementation_of_os=2Eopenpty=28=29?= Message-ID: <3dT2k82bPQz7LjV@mail.python.org> http://hg.python.org/cpython/rev/63c1fbc4de4b changeset: 87565:63c1fbc4de4b user: Victor Stinner date: Mon Nov 25 23:19:58 2013 +0100 summary: Issue #19752: Fix "HAVE_DEV_PTMX" implementation of os.openpty() Regression introduced by the implementation of the PEP 446 (non-inheritable file descriptors by default). master_fd must be set non-inheritable after the creation of the slave_fd, otherwise grantpt(master_fd) fails with EPERM (errno 13). 
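A minimal sketch (not part of the patch) of the behaviour the fix preserves: os.openpty() must still hand back non-inheritable descriptors per PEP 446, it just defers flagging master_fd until after the slave is opened so that grantpt() no longer fails with EPERM. Assuming a POSIX build where os.get_inheritable() is available::

    import os

    # Both ends of the pty should remain non-inheritable (PEP 446 default),
    # even though master_fd is now made non-inheritable only after the
    # slave descriptor has been opened.
    master_fd, slave_fd = os.openpty()
    try:
        assert os.get_inheritable(master_fd) is False
        assert os.get_inheritable(slave_fd) is False
    finally:
        os.close(slave_fd)
        os.close(master_fd)
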
files: Modules/posixmodule.c | 6 +++++- 1 files changed, 5 insertions(+), 1 deletions(-) diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -6014,7 +6014,7 @@ goto posix_error; #else - master_fd = _Py_open(DEV_PTY_FILE, O_RDWR | O_NOCTTY); /* open master */ + master_fd = open(DEV_PTY_FILE, O_RDWR | O_NOCTTY); /* open master */ if (master_fd < 0) goto posix_error; @@ -6041,6 +6041,10 @@ slave_fd = _Py_open(slave_name, O_RDWR | O_NOCTTY); /* open slave */ if (slave_fd < 0) goto posix_error; + + if (_Py_set_inheritable(master_fd, 0, NULL) < 0) + goto posix_error; + #if !defined(__CYGWIN__) && !defined(HAVE_DEV_PTC) ioctl(slave_fd, I_PUSH, "ptem"); /* push ptem */ ioctl(slave_fd, I_PUSH, "ldterm"); /* push ldterm */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Mon Nov 25 23:54:34 2013 From: python-checkins at python.org (guido.van.rossum) Date: Mon, 25 Nov 2013 23:54:34 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Get_rid_of_final_TO_DO_items?= =?utf-8?q?=2E__Remove_now-redundant_section_on_coroutines_and?= Message-ID: <3dT3Tk4G7Pz7Lmt@mail.python.org> http://hg.python.org/peps/rev/009c6279ea5c changeset: 5323:009c6279ea5c user: Guido van Rossum date: Mon Nov 25 14:54:28 2013 -0800 summary: Get rid of final TO DO items. Remove now-redundant section on coroutines and protocols. files: pep-3156.txt | 58 ++++++++++++++++++++++++--------------- 1 files changed, 36 insertions(+), 22 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1664,27 +1664,12 @@ public interface of the event loop, so it will work with third-party event loop implementations, too. -Coroutines and Protocols ------------------------- - -The best way to use coroutines to implement an Internet protocol such -as FTP is probably to use a streaming buffer that gets filled by -``data_received()`` and can be read asynchronously using methods like -``read(n)`` and ``readline()`` that are coroutines or return a Future. -When the connection is closed, ``read()`` should eventually produce -``b''``, or raise an exception if ``connection_closed()`` is called -with an exception. - -To write a response, the ``write()`` method (and friends) on the -transport can be used -- these do not return Futures. A standard -protocol implementation should be provided that sets this up and kicks -off the coroutine when ``connection_made()`` is called. - Convenience Utilities --------------------- A few functions and classes are provided to simplify the writing of -basic stream-based clients and servers. Thes are: +basic stream-based clients and servers, such as FTP or HTTP. Thes +are: - ``asyncio.open_connection(host, port)``: A wrapper for ``EventLoop.create_connection()`` that does not require you to @@ -1842,14 +1827,43 @@ respectively. -TO DO -===== +Miscellaneous +============= -- Document all places that take a ``loop`` keyword argument. +Logging +------- -- Document ``logger`` object. +All logging performed by the ``asyncio`` package uses a single +``logging.Logger`` object, ``asyncio.logger``. To customize logging +you can use the standard ``Logger`` API on this object. (Do not +replace the object though.) -- Document ``SIGCHILD`` handling API. +``SIGCHLD`` handling on UNIX +---------------------------- + +Efficient implementation of the ``process_exited()`` method on +subprocess protocols requires a ``SIGCHLD`` signal handler. 
However, +signal handlers can only be set on the event loop associated with the +main thread. In order to support spawning subprocesses from event +loops running in other threads, a mechanism exists to allow sharing a +``SIGCHLD`` handler between multiple event loops. There are two +additional functions, ``asyncio.get_child_watcher()`` and +``asyncio.set_child_watcher()``, and corresponding methods on the +event loop policy. + +There are two child watcher implementation classes, +``FastChildWatcher`` and ``SafeChildWatcher``. Both use ``SIGCHLD``. +The ``SafeChildWatcher`` class is used by default; it is inefficient +when many subprocesses exist simultaneously. The ``FastChildWatcher`` +class is efficient, but it may interfere with other code (either C +code or Python code) that spawns subprocesses without using an +``asyncio`` event loop. If you are sure you are not using other code +that spawns subprocesses, to use the fast implementation, run the +following in your main thread:: + + watcher = asyncio.FastChildWatcher() + asyncio.set_child_watcher(watcher) + # TBD: Is a call to watcher.attach_loop() needed? Wish List -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Tue Nov 26 00:07:25 2013 From: python-checkins at python.org (guido.van.rossum) Date: Tue, 26 Nov 2013 00:07:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_StreamReade?= =?utf-8?b?clByb3RvY29sIHRvIF9fYWxsX18u?= Message-ID: <3dT3mY1fVqz7Ll5@mail.python.org> http://hg.python.org/cpython/rev/3ac8f9d5c2d8 changeset: 87566:3ac8f9d5c2d8 user: Guido van Rossum date: Mon Nov 25 15:07:18 2013 -0800 summary: asyncio: Add StreamReaderProtocol to __all__. files: Lib/asyncio/streams.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -1,6 +1,6 @@ """Stream-related things.""" -__all__ = ['StreamReader', 'StreamReaderProtocol', +__all__ = ['StreamReader', 'StreamWriter', 'StreamReaderProtocol', 'open_connection', 'start_server', ] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 00:41:56 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 00:41:56 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_tracemal?= =?utf-8?q?loc=3A_explain_the_purpose_of_get=5Ftraces=2Etracebacks_in_a?= Message-ID: <3dT4XN0K6Tz7Lmq@mail.python.org> http://hg.python.org/cpython/rev/2da28004dfac changeset: 87567:2da28004dfac user: Victor Stinner date: Tue Nov 26 00:26:23 2013 +0100 summary: Issue #18874: tracemalloc: explain the purpose of get_traces.tracebacks in a comment files: Modules/_tracemalloc.c | 2 ++ 1 files changed, 2 insertions(+), 0 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1041,6 +1041,8 @@ if (!tracemalloc_config.tracing) return get_traces.list; + /* the traceback hash table is used temporarily to intern traceback tuple + of (filename, lineno) tuples */ get_traces.tracebacks = hashtable_new(sizeof(PyObject *), _Py_hashtable_hash_ptr, _Py_hashtable_compare_direct); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 00:41:57 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 00:41:57 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_apply_Ji?= 
=?utf-8?q?m_Jewett=27s_patch_on_tracemalloc_doc?= Message-ID: <3dT4XP2RGNz7Lnx@mail.python.org> http://hg.python.org/cpython/rev/b6414aa8cf77 changeset: 87568:b6414aa8cf77 user: Victor Stinner date: Tue Nov 26 00:40:10 2013 +0100 summary: Issue #18874: apply Jim Jewett's patch on tracemalloc doc files: Doc/library/tracemalloc.rst | 22 +++++++++++++--------- 1 files changed, 13 insertions(+), 9 deletions(-) diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst --- a/Doc/library/tracemalloc.rst +++ b/Doc/library/tracemalloc.rst @@ -102,9 +102,11 @@ /usr/lib/python3.4/urllib/parse.py:476: size=71.8 KiB (+71.8 KiB), count=969 (+969), average=76 B /usr/lib/python3.4/contextlib.py:38: size=67.2 KiB (+67.2 KiB), count=126 (+126), average=546 B -We can see that Python loaded ``4.4 MiB`` of new data (bytecode and constants) -from modules (on of total of ``8.2 MiB``) and that the :mod:`linecache` module -cached ``940 KiB`` of Python source code to format tracebacks. +We can see that Python has loaded ``8.2 MiB`` of module data (bytecode and +constants), and that this is ``4.4 MiB`` more than had been loaded before the +tests, when the previous snapshot was taken. Similarly, the :mod:`linecache` +module has cached ``940 KiB`` of Python source code to format tracebacks, all +of it since the previous snapshot. If the system has little free memory, snapshots can be written on disk using the :meth:`Snapshot.dump` method to analyze the snapshot offline. Then use the @@ -174,11 +176,11 @@ File "/usr/lib/python3.4/runpy.py", line 160 "__main__", fname, loader, pkg_name) -We can see that most memory was allocated in the :mod:`importlib` module to +We can see that the most memory was allocated in the :mod:`importlib` module to load data (bytecode and constants) from modules: ``870 KiB``. The traceback is -where the :mod:`importlib` loaded data for the the last time: on the ``import -pdb`` line of the :mod:`doctest` module. The traceback may change if a new -module is loaded. +where the :mod:`importlib` loaded data most recently: on the ``import pdb`` +line of the :mod:`doctest` module. The traceback may change if a new module is +loaded. Pretty top @@ -319,12 +321,14 @@ .. function:: stop() Stop tracing Python memory allocations: uninstall hooks on Python memory - allocators. Clear also traces of memory blocks allocated by Python + allocators. Also clears all previously collected traces of memory blocks + allocated by Python. Call :func:`take_snapshot` function to take a snapshot of traces before clearing them. - See also :func:`start` and :func:`is_tracing` functions. + See also :func:`start`, :func:`is_tracing` and :func:`clear_traces` + functions. .. 
function:: take_snapshot() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 00:46:01 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 00:46:01 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_allow_to?= =?utf-8?q?_call_tracemalloc=2ESnapshot=2Estatistics=28cumulative=3DTrue?= =?utf-8?q?=29?= Message-ID: <3dT4d55Nswz7Lnk@mail.python.org> http://hg.python.org/cpython/rev/a2811425dbde changeset: 87569:a2811425dbde user: Victor Stinner date: Tue Nov 26 00:45:47 2013 +0100 summary: Issue #18874: allow to call tracemalloc.Snapshot.statistics(cumulative=True) with traceback_limit=1 files: Doc/library/tracemalloc.rst | 3 +-- Lib/tracemalloc.py | 4 ---- 2 files changed, 1 insertions(+), 6 deletions(-) diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst --- a/Doc/library/tracemalloc.rst +++ b/Doc/library/tracemalloc.rst @@ -479,8 +479,7 @@ If *cumulative* is ``True``, cumulate size and count of memory blocks of all frames of the traceback of a trace, not only the most recent frame. The cumulative mode can only be used with *group_by* equals to - ``'filename'`` and ``'lineno'`` and :attr:`traceback_limit` greater than - ``1``. + ``'filename'`` and ``'lineno'``. The result is sorted from the biggest to the smallest by: :attr:`Statistic.size`, :attr:`Statistic.count` and then by diff --git a/Lib/tracemalloc.py b/Lib/tracemalloc.py --- a/Lib/tracemalloc.py +++ b/Lib/tracemalloc.py @@ -380,10 +380,6 @@ if cumulative and key_type not in ('lineno', 'filename'): raise ValueError("cumulative mode cannot by used " "with key type %r" % key_type) - if cumulative and self.traceback_limit < 2: - raise ValueError("cumulative mode needs tracebacks with at least " - "2 frames, traceback limit is %s" - % self.traceback_limit) stats = {} tracebacks = {} -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 01:21:52 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 01:21:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_tracemal?= =?utf-8?q?loc=3A_Comment_the_trace=5Ft_structure?= Message-ID: <3dT5QS256Jz7LpW@mail.python.org> http://hg.python.org/cpython/rev/cf2c4cd08dd9 changeset: 87570:cf2c4cd08dd9 user: Victor Stinner date: Tue Nov 26 01:06:02 2013 +0100 summary: Issue #18874: tracemalloc: Comment the trace_t structure files: Modules/_tracemalloc.c | 4 ++++ 1 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -93,8 +93,12 @@ static PyObject *unknown_filename = NULL; static traceback_t tracemalloc_empty_traceback; +/* Trace of a memory block */ typedef struct { + /* Size of the memory block in bytes */ size_t size; + + /* Traceback where the memory block was allocated */ traceback_t *traceback; } trace_t; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 01:21:53 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 01:21:53 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_make_it_?= =?utf-8?q?more_explicit_than_set=5Freentrant=28=29_only_accept_0_or_1?= Message-ID: <3dT5QT3szjz7Lpk@mail.python.org> http://hg.python.org/cpython/rev/f06a50c2bf85 changeset: 87571:f06a50c2bf85 user: Victor Stinner date: Tue Nov 26 01:08:53 2013 +0100 summary: Issue #18874: make it more explicit than 
set_reentrant() only accept 0 or 1 files: Modules/_tracemalloc.c | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -168,6 +168,7 @@ static void set_reentrant(int reentrant) { + assert(reentrant == 0 || reentrant == 1); if (reentrant) { assert(PyThread_get_key_value(tracemalloc_reentrant_key) == NULL); PyThread_set_key_value(tracemalloc_reentrant_key, -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 01:21:54 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 01:21:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2318874=3A_Fix_typo?= Message-ID: <3dT5QV5lqPz7LpH@mail.python.org> http://hg.python.org/cpython/rev/02668e24d72d changeset: 87572:02668e24d72d user: Victor Stinner date: Tue Nov 26 01:18:52 2013 +0100 summary: Issue #18874: Fix typo files: Modules/_tracemalloc.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -284,7 +284,7 @@ code = pyframe->f_code; if (code == NULL) { #ifdef TRACE_DEBUG - tracemalloc_error("failed to get the code object of the a frame"); + tracemalloc_error("failed to get the code object of the frame"); #endif return; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 01:33:09 2013 From: python-checkins at python.org (christian.heimes) Date: Tue, 26 Nov 2013 01:33:09 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_suspicious_test_case?= Message-ID: <3dT5gT2bLzzP2P@mail.python.org> http://hg.python.org/cpython/rev/976e41a50022 changeset: 87573:976e41a50022 user: Christian Heimes date: Tue Nov 26 01:32:15 2013 +0100 summary: Fix suspicious test case files: Lib/test/test_statistics.py | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_statistics.py b/Lib/test/test_statistics.py --- a/Lib/test/test_statistics.py +++ b/Lib/test/test_statistics.py @@ -271,9 +271,9 @@ # Test that approx_equal(a, b) == approx_equal(b, a) args = [-23, -2, 5, 107, 93568] delta = 2 - for x in args: + for a in args: for type_ in (int, float, Decimal, Fraction): - x = type_(x)*100 + x = type_(a)*100 y = x + delta r = abs(delta/max(x, y)) # There are five cases to check: -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Tue Nov 26 07:46:10 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Tue, 26 Nov 2013 07:46:10 +0100 Subject: [Python-checkins] Daily reference leaks (976e41a50022): sum=-4 Message-ID: results for 976e41a50022 on branch "default" -------------------------------------------- test_site leaked [-2, 2, -2] references, sum=-2 test_site leaked [-2, 2, -2] memory blocks, sum=-2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogPd3Zq3', '-x'] From greg at krypto.org Tue Nov 26 08:07:44 2013 From: greg at krypto.org (Gregory P. 
Smith) Date: Mon, 25 Nov 2013 23:07:44 -0800 Subject: [Python-checkins] cpython (2.7): Fix test_fcntl to run properly on systems that do not support the flags In-Reply-To: References: <3dSbKV48lqz7Ljx@mail.python.org> Message-ID: On Mon, Nov 25, 2013 at 12:46 AM, Nick Coghlan wrote: > > On 25 Nov 2013 14:46, "gregory.p.smith" > wrote: > > > > http://hg.python.org/cpython/rev/cac7319c5972 > > changeset: 87537:cac7319c5972 > > branch: 2.7 > > parent: 87534:3981e57a7bdc > > user: Gregory P. Smith > > date: Mon Nov 25 04:45:27 2013 +0000 > > summary: > > Fix test_fcntl to run properly on systems that do not support the flags > > used in the "does the value get passed in properly" test. > > > > files: > > Lib/test/test_fcntl.py | 3 +++ > > 1 files changed, 3 insertions(+), 0 deletions(-) > > > > > > diff --git a/Lib/test/test_fcntl.py b/Lib/test/test_fcntl.py > > --- a/Lib/test/test_fcntl.py > > +++ b/Lib/test/test_fcntl.py > > @@ -113,7 +113,10 @@ > > self.skipTest("F_NOTIFY or DN_MULTISHOT unavailable") > > fd = os.open(os.path.dirname(os.path.abspath(TESTFN)), > os.O_RDONLY) > > try: > > + # This will raise OverflowError if issue1309352 is present. > > fcntl.fcntl(fd, cmd, flags) > > + except IOError: > > + pass # Running on a system that doesn't support these > flags. > > finally: > > os.close(fd) > > Raising a skip is generally preferred to marking a test that can't be > executed as passing. > For this test this is still a pass because no OverflowError was raised. -------------- next part -------------- An HTML attachment was scrubbed... URL: From python-checkins at python.org Tue Nov 26 08:25:48 2013 From: python-checkins at python.org (georg.brandl) Date: Tue, 26 Nov 2013 08:25:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Markup_fix=2E?= Message-ID: <3dTGqc3Gpgz7Ll6@mail.python.org> http://hg.python.org/cpython/rev/a5e590151034 changeset: 87574:a5e590151034 branch: 3.3 parent: 87557:6aeaaa614a19 user: Georg Brandl date: Tue Nov 26 08:25:24 2013 +0100 summary: Markup fix. files: Doc/library/subprocess.rst | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -442,7 +442,6 @@ the system default of io.DEFAULT_BUFFER_SIZE will be used. .. versionchanged:: 3.3.1 - *bufsize* now defaults to -1 to enable buffering by default to match the behavior that most code expects. In versions prior to Python 3.2.4 and 3.3.1 it incorrectly defaulted to :const:`0` which was unbuffered -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 08:25:49 2013 From: python-checkins at python.org (georg.brandl) Date: Tue, 26 Nov 2013 08:25:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_merge_with_3=2E3?= Message-ID: <3dTGqd52qfz7LmY@mail.python.org> http://hg.python.org/cpython/rev/1640b3bdd5b6 changeset: 87575:1640b3bdd5b6 parent: 87573:976e41a50022 parent: 87574:a5e590151034 user: Georg Brandl date: Tue Nov 26 08:25:45 2013 +0100 summary: merge with 3.3 files: Doc/library/subprocess.rst | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -452,7 +452,6 @@ the system default of io.DEFAULT_BUFFER_SIZE will be used. .. 
versionchanged:: 3.3.1 - *bufsize* now defaults to -1 to enable buffering by default to match the behavior that most code expects. In versions prior to Python 3.2.4 and 3.3.1 it incorrectly defaulted to :const:`0` which was unbuffered -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 10:16:45 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 10:16:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_tracemalloc=3A_Fix_hash_me?= =?utf-8?q?thods_of_Statistic_and_StatisticDiff?= Message-ID: <3dTKHd5B4Jz7LkB@mail.python.org> http://hg.python.org/cpython/rev/9eae3a8181bc changeset: 87576:9eae3a8181bc user: Victor Stinner date: Tue Nov 26 10:16:25 2013 +0100 summary: tracemalloc: Fix hash methods of Statistic and StatisticDiff files: Lib/tracemalloc.py | 7 +++---- 1 files changed, 3 insertions(+), 4 deletions(-) diff --git a/Lib/tracemalloc.py b/Lib/tracemalloc.py --- a/Lib/tracemalloc.py +++ b/Lib/tracemalloc.py @@ -39,7 +39,7 @@ self.count = count def __hash__(self): - return (self.traceback, self.size, self.count) + return hash((self.traceback, self.size, self.count)) def __eq__(self, other): return (self.traceback == other.traceback @@ -79,8 +79,8 @@ self.count_diff = count_diff def __hash__(self): - return (self.traceback, self.size, self.size_diff, - self.count, self.count_diff) + return hash((self.traceback, self.size, self.size_diff, + self.count, self.count_diff)) def __eq__(self, other): return (self.traceback == other.traceback @@ -104,7 +104,6 @@ def __repr__(self): return ('' % (self.traceback, self.size, self.size_diff, - self.count, self.count_diff)) def _sort_key(self): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 10:46:26 2013 From: python-checkins at python.org (victor.stinner) Date: Tue, 26 Nov 2013 10:46:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_tracemalloc=3A_fix_get=5Ft?= =?utf-8?q?raced=5Fmemory=28=29_docstring_for_result_type?= Message-ID: <3dTKxt5SbPz7LnZ@mail.python.org> http://hg.python.org/cpython/rev/1fed50c8ea62 changeset: 87577:1fed50c8ea62 user: Victor Stinner date: Tue Nov 26 10:46:06 2013 +0100 summary: tracemalloc: fix get_traced_memory() docstring for result type files: Modules/_tracemalloc.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -1264,7 +1264,7 @@ } PyDoc_STRVAR(tracemalloc_get_traced_memory_doc, - "get_traced_memory() -> int\n" + "get_traced_memory() -> (int, int)\n" "\n" "Get the current size and maximum size of memory blocks traced\n" "by the tracemalloc module as a tuple: (size: int, max_size: int)."); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 16:08:51 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 16:08:51 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319760=3A_Silence_?= =?utf-8?q?sysconfig=27s_=27SO=27_key_deprecation_warnings_in_tests=2E?= Message-ID: <3dTT5v4NjKz7Lp2@mail.python.org> http://hg.python.org/cpython/rev/25f856d500e3 changeset: 87578:25f856d500e3 user: Serhiy Storchaka date: Tue Nov 26 17:08:24 2013 +0200 summary: Issue #19760: Silence sysconfig's 'SO' key deprecation warnings in tests. Change stacklevel in warnings.warn() for 'SO' key to 2. 
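An illustrative snippet (not part of the patch) of the warning the tests now silence; the real tests use test.support.check_warnings, while this sketch sticks to the stdlib warnings module to stay self-contained, and it assumes a Python of this era where the deprecated 'SO' key still exists::

    import warnings
    import sysconfig

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        # Deprecated spelling: emits a DeprecationWarning which, with
        # stacklevel=2, is attributed to this caller's line rather than
        # to a line inside sysconfig.py.
        so = sysconfig.get_config_var('SO')

    assert so == sysconfig.get_config_var('EXT_SUFFIX')
    assert any(issubclass(w.category, DeprecationWarning) for w in caught)
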
files: Lib/distutils/sysconfig.py | 2 +- Lib/distutils/tests/test_sysconfig.py | 7 ++++--- Lib/sysconfig.py | 2 +- Lib/test/test_sysconfig.py | 7 ++++--- 4 files changed, 10 insertions(+), 8 deletions(-) diff --git a/Lib/distutils/sysconfig.py b/Lib/distutils/sysconfig.py --- a/Lib/distutils/sysconfig.py +++ b/Lib/distutils/sysconfig.py @@ -575,5 +575,5 @@ """ if name == 'SO': import warnings - warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning, 2) return get_config_vars().get(name) diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -6,7 +6,7 @@ from distutils import sysconfig from distutils.ccompiler import get_default_compiler from distutils.tests import support -from test.support import TESTFN, run_unittest +from test.support import TESTFN, run_unittest, check_warnings class SysconfigTestCase(support.EnvironGuard, unittest.TestCase): def setUp(self): @@ -166,8 +166,9 @@ @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, 'EXT_SUFFIX required for this test') def test_SO_value(self): - self.assertEqual(sysconfig.get_config_var('SO'), - sysconfig.get_config_var('EXT_SUFFIX')) + with check_warnings(('', DeprecationWarning)): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, 'EXT_SUFFIX required for this test') diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py --- a/Lib/sysconfig.py +++ b/Lib/sysconfig.py @@ -585,7 +585,7 @@ """ if name == 'SO': import warnings - warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning) + warnings.warn('SO is deprecated, use EXT_SUFFIX', DeprecationWarning, 2) return get_config_vars().get(name) diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -5,7 +5,7 @@ import shutil from copy import copy -from test.support import (run_unittest, TESTFN, unlink, +from test.support import (run_unittest, TESTFN, unlink, check_warnings, captured_stdout, skip_unless_symlink) import sysconfig @@ -378,8 +378,9 @@ @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, 'EXT_SUFFIX required for this test') def test_SO_value(self): - self.assertEqual(sysconfig.get_config_var('SO'), - sysconfig.get_config_var('EXT_SUFFIX')) + with check_warnings(('', DeprecationWarning)): + self.assertEqual(sysconfig.get_config_var('SO'), + sysconfig.get_config_var('EXT_SUFFIX')) @unittest.skipIf(sysconfig.get_config_var('EXT_SUFFIX') is None, 'EXT_SUFFIX required for this test') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 16:33:52 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 16:33:52 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Nzk0?= =?utf-8?q?=3A_Improved_markup_for_True/False_constants=2E?= Message-ID: <3dTTfm6tsxz7Lq4@mail.python.org> http://hg.python.org/cpython/rev/04c4f141874b changeset: 87579:04c4f141874b branch: 2.7 parent: 87556:313d9bb253bf user: Serhiy Storchaka date: Tue Nov 26 17:32:03 2013 +0200 summary: Issue #19794: Improved markup for True/False constants. 
files: Doc/library/decimal.rst | 22 +++++++++++----------- 1 files changed, 11 insertions(+), 11 deletions(-) diff --git a/Doc/library/decimal.rst b/Doc/library/decimal.rst --- a/Doc/library/decimal.rst +++ b/Doc/library/decimal.rst @@ -1226,52 +1226,52 @@ .. method:: is_canonical(x) - Returns True if *x* is canonical; otherwise returns False. + Returns ``True`` if *x* is canonical; otherwise returns ``False``. .. method:: is_finite(x) - Returns True if *x* is finite; otherwise returns False. + Returns ``True`` if *x* is finite; otherwise returns ``False``. .. method:: is_infinite(x) - Returns True if *x* is infinite; otherwise returns False. + Returns ``True`` if *x* is infinite; otherwise returns ``False``. .. method:: is_nan(x) - Returns True if *x* is a qNaN or sNaN; otherwise returns False. + Returns ``True`` if *x* is a qNaN or sNaN; otherwise returns ``False``. .. method:: is_normal(x) - Returns True if *x* is a normal number; otherwise returns False. + Returns ``True`` if *x* is a normal number; otherwise returns ``False``. .. method:: is_qnan(x) - Returns True if *x* is a quiet NaN; otherwise returns False. + Returns ``True`` if *x* is a quiet NaN; otherwise returns ``False``. .. method:: is_signed(x) - Returns True if *x* is negative; otherwise returns False. + Returns ``True`` if *x* is negative; otherwise returns ``False``. .. method:: is_snan(x) - Returns True if *x* is a signaling NaN; otherwise returns False. + Returns ``True`` if *x* is a signaling NaN; otherwise returns ``False``. .. method:: is_subnormal(x) - Returns True if *x* is subnormal; otherwise returns False. + Returns ``True`` if *x* is subnormal; otherwise returns ``False``. .. method:: is_zero(x) - Returns True if *x* is a zero; otherwise returns False. + Returns ``True`` if *x* is a zero; otherwise returns ``False``. .. method:: ln(x) @@ -1431,7 +1431,7 @@ .. method:: same_quantum(x, y) - Returns True if the two operands have the same exponent. + Returns ``True`` if the two operands have the same exponent. .. method:: scaleb (x, y) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 16:33:54 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 16:33:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Nzk0?= =?utf-8?q?=3A_Improved_markup_for_True/False_constants=2E?= Message-ID: <3dTTfp2Mqpz7Lr5@mail.python.org> http://hg.python.org/cpython/rev/c1d163203f21 changeset: 87580:c1d163203f21 branch: 3.3 parent: 87574:a5e590151034 user: Serhiy Storchaka date: Tue Nov 26 17:32:16 2013 +0200 summary: Issue #19794: Improved markup for True/False constants. files: Doc/library/decimal.rst | 26 +++++++++++++------------- 1 files changed, 13 insertions(+), 13 deletions(-) diff --git a/Doc/library/decimal.rst b/Doc/library/decimal.rst --- a/Doc/library/decimal.rst +++ b/Doc/library/decimal.rst @@ -1183,52 +1183,52 @@ .. method:: is_canonical(x) - Returns True if *x* is canonical; otherwise returns False. + Returns ``True`` if *x* is canonical; otherwise returns ``False``. .. method:: is_finite(x) - Returns True if *x* is finite; otherwise returns False. + Returns ``True`` if *x* is finite; otherwise returns ``False``. .. method:: is_infinite(x) - Returns True if *x* is infinite; otherwise returns False. + Returns ``True`` if *x* is infinite; otherwise returns ``False``. .. method:: is_nan(x) - Returns True if *x* is a qNaN or sNaN; otherwise returns False. 
+ Returns ``True`` if *x* is a qNaN or sNaN; otherwise returns ``False``. .. method:: is_normal(x) - Returns True if *x* is a normal number; otherwise returns False. + Returns ``True`` if *x* is a normal number; otherwise returns ``False``. .. method:: is_qnan(x) - Returns True if *x* is a quiet NaN; otherwise returns False. + Returns ``True`` if *x* is a quiet NaN; otherwise returns ``False``. .. method:: is_signed(x) - Returns True if *x* is negative; otherwise returns False. + Returns ``True`` if *x* is negative; otherwise returns ``False``. .. method:: is_snan(x) - Returns True if *x* is a signaling NaN; otherwise returns False. + Returns ``True`` if *x* is a signaling NaN; otherwise returns ``False``. .. method:: is_subnormal(x) - Returns True if *x* is subnormal; otherwise returns False. + Returns ``True`` if *x* is subnormal; otherwise returns ``False``. .. method:: is_zero(x) - Returns True if *x* is a zero; otherwise returns False. + Returns ``True`` if *x* is a zero; otherwise returns ``False``. .. method:: ln(x) @@ -1390,7 +1390,7 @@ .. method:: same_quantum(x, y) - Returns True if the two operands have the same exponent. + Returns ``True`` if the two operands have the same exponent. .. method:: scaleb (x, y) @@ -1452,9 +1452,9 @@ .. data:: HAVE_THREADS - The default value is True. If Python is compiled without threads, the + The default value is ``True``. If Python is compiled without threads, the C version automatically disables the expensive thread local context - machinery. In this case, the value is False. + machinery. In this case, the value is ``False``. Rounding modes -------------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 16:33:55 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 16:33:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319794=3A_Improved_markup_for_True/False_constan?= =?utf-8?b?dHMu?= Message-ID: <3dTTfq5TCyz7Lsb@mail.python.org> http://hg.python.org/cpython/rev/67e642f5ab5d changeset: 87581:67e642f5ab5d parent: 87578:25f856d500e3 parent: 87580:c1d163203f21 user: Serhiy Storchaka date: Tue Nov 26 17:33:13 2013 +0200 summary: Issue #19794: Improved markup for True/False constants. files: Doc/library/decimal.rst | 26 +++++++++++++------------- 1 files changed, 13 insertions(+), 13 deletions(-) diff --git a/Doc/library/decimal.rst b/Doc/library/decimal.rst --- a/Doc/library/decimal.rst +++ b/Doc/library/decimal.rst @@ -1183,52 +1183,52 @@ .. method:: is_canonical(x) - Returns True if *x* is canonical; otherwise returns False. + Returns ``True`` if *x* is canonical; otherwise returns ``False``. .. method:: is_finite(x) - Returns True if *x* is finite; otherwise returns False. + Returns ``True`` if *x* is finite; otherwise returns ``False``. .. method:: is_infinite(x) - Returns True if *x* is infinite; otherwise returns False. + Returns ``True`` if *x* is infinite; otherwise returns ``False``. .. method:: is_nan(x) - Returns True if *x* is a qNaN or sNaN; otherwise returns False. + Returns ``True`` if *x* is a qNaN or sNaN; otherwise returns ``False``. .. method:: is_normal(x) - Returns True if *x* is a normal number; otherwise returns False. + Returns ``True`` if *x* is a normal number; otherwise returns ``False``. .. method:: is_qnan(x) - Returns True if *x* is a quiet NaN; otherwise returns False. + Returns ``True`` if *x* is a quiet NaN; otherwise returns ``False``. .. 
method:: is_signed(x) - Returns True if *x* is negative; otherwise returns False. + Returns ``True`` if *x* is negative; otherwise returns ``False``. .. method:: is_snan(x) - Returns True if *x* is a signaling NaN; otherwise returns False. + Returns ``True`` if *x* is a signaling NaN; otherwise returns ``False``. .. method:: is_subnormal(x) - Returns True if *x* is subnormal; otherwise returns False. + Returns ``True`` if *x* is subnormal; otherwise returns ``False``. .. method:: is_zero(x) - Returns True if *x* is a zero; otherwise returns False. + Returns ``True`` if *x* is a zero; otherwise returns ``False``. .. method:: ln(x) @@ -1390,7 +1390,7 @@ .. method:: same_quantum(x, y) - Returns True if the two operands have the same exponent. + Returns ``True`` if the two operands have the same exponent. .. method:: scaleb (x, y) @@ -1452,9 +1452,9 @@ .. data:: HAVE_THREADS - The default value is True. If Python is compiled without threads, the + The default value is ``True``. If Python is compiled without threads, the C version automatically disables the expensive thread local context - machinery. In this case, the value is False. + machinery. In this case, the value is ``False``. Rounding modes -------------- -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 17:19:58 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 17:19:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NjM4?= =?utf-8?q?=3A_Raise_ValueError_instead_of_crashing_when_converting_billio?= =?utf-8?q?n?= Message-ID: <3dTVgy4L4szMnX@mail.python.org> http://hg.python.org/cpython/rev/eac133e13bb5 changeset: 87582:eac133e13bb5 branch: 3.3 parent: 87580:c1d163203f21 user: Mark Dickinson date: Tue Nov 26 16:19:13 2013 +0000 summary: Issue #19638: Raise ValueError instead of crashing when converting billion character strings to float. files: Lib/test/test_strtod.py | 31 ++++++++++++++++ Misc/NEWS | 3 + Python/dtoa.c | 55 ++++++++++++++++++++++------ 3 files changed, 77 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -248,6 +248,37 @@ else: assert False, "expected ValueError" + @test.support.bigmemtest(size=5 * test.support._1G, memuse=1, dry_run=False) + def test_oversized_digit_strings(self, maxsize): + # Input string whose length doesn't fit in an INT. + s = "1." + "1" * int(2.2e9) + with self.assertRaises(ValueError): + float(s) + del s + + s = "0." + "0" * int(2.2e9) + "1" + with self.assertRaises(ValueError): + float(s) + del s + + def test_large_exponents(self): + # Verify that the clipping of the exponent in strtod doesn't affect the + # output values. 
+ def positive_exp(n): + """ Long string with value 1.0 and exponent n""" + return '0.{}1e+{}'.format('0'*(n-1), n) + + def negative_exp(n): + """ Long string with value 1.0 and exponent -n""" + return '1{}e-{}'.format('0'*n, n) + + self.assertEqual(float(positive_exp(10000)), 1.0) + self.assertEqual(float(positive_exp(20000)), 1.0) + self.assertEqual(float(positive_exp(30000)), 1.0) + self.assertEqual(float(negative_exp(10000)), 1.0) + self.assertEqual(float(negative_exp(20000)), 1.0) + self.assertEqual(float(negative_exp(30000)), 1.0) + def test_particular(self): # inputs that produced crashes or incorrectly rounded results with # previous versions of dtoa.c, for various reasons diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #19638: Fix possible crash / undefined behaviour from huge (more than 2 + billion characters) input strings in _Py_dg_strtod. + Library ------- diff --git a/Python/dtoa.c b/Python/dtoa.c --- a/Python/dtoa.c +++ b/Python/dtoa.c @@ -204,7 +204,24 @@ MAX_ABS_EXP in absolute value get truncated to +-MAX_ABS_EXP. MAX_ABS_EXP should fit into an int. */ #ifndef MAX_ABS_EXP -#define MAX_ABS_EXP 19999U +#define MAX_ABS_EXP 1100000000U +#endif +/* Bound on length of pieces of input strings in _Py_dg_strtod; specifically, + this is used to bound the total number of digits ignoring leading zeros and + the number of digits that follow the decimal point. Ideally, MAX_DIGITS + should satisfy MAX_DIGITS + 400 < MAX_ABS_EXP; that ensures that the + exponent clipping in _Py_dg_strtod can't affect the value of the output. */ +#ifndef MAX_DIGITS +#define MAX_DIGITS 1000000000U +#endif + +/* Guard against trying to use the above values on unusual platforms with ints + * of width less than 32 bits. */ +#if MAX_ABS_EXP > INT_MAX +#error "MAX_ABS_EXP should fit in an int" +#endif +#if MAX_DIGITS > INT_MAX +#error "MAX_DIGITS should fit in an int" #endif /* The following definition of Storeinc is appropriate for MIPS processors. @@ -1538,6 +1555,7 @@ Long L; BCinfo bc; Bigint *bb, *bb1, *bd, *bd0, *bs, *delta; + size_t ndigits, fraclen; dval(&rv) = 0.; @@ -1560,40 +1578,53 @@ c = *++s; lz = s != s1; - /* Point s0 at the first nonzero digit (if any). nd0 will be the position - of the point relative to s0. nd will be the total number of digits - ignoring leading zeros. */ + /* Point s0 at the first nonzero digit (if any). fraclen will be the + number of digits between the decimal point and the end of the + digit string. ndigits will be the total number of digits ignoring + leading zeros. */ s0 = s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd0 = nd = s - s1; + ndigits = s - s1; + fraclen = 0; /* Parse decimal point and following digits. */ if (c == '.') { c = *++s; - if (!nd) { + if (!ndigits) { s1 = s; while (c == '0') c = *++s; lz = lz || s != s1; - nd0 -= s - s1; + fraclen += (s - s1); s0 = s; } s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd += s - s1; + ndigits += s - s1; + fraclen += s - s1; } - /* Now lz is true if and only if there were leading zero digits, and nd - gives the total number of digits ignoring leading zeros. A valid input - must have at least one digit. */ - if (!nd && !lz) { + /* Now lz is true if and only if there were leading zero digits, and + ndigits gives the total number of digits ignoring leading zeros. A + valid input must have at least one digit. 
*/ + if (!ndigits && !lz) { if (se) *se = (char *)s00; goto parse_error; } + /* Range check ndigits and fraclen to make sure that they, and values + computed with them, can safely fit in an int. */ + if (ndigits > MAX_DIGITS || fraclen > MAX_DIGITS) { + if (se) + *se = (char *)s00; + goto parse_error; + } + nd = (int)ndigits; + nd0 = (int)ndigits - (int)fraclen; + /* Parse exponent. */ e = 0; if (c == 'e' || c == 'E') { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 17:20:00 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 17:20:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319638=3A_Merge_from_3=2E3?= Message-ID: <3dTVh00Bn3zSTs@mail.python.org> http://hg.python.org/cpython/rev/9c7ab3e68243 changeset: 87583:9c7ab3e68243 parent: 87581:67e642f5ab5d parent: 87582:eac133e13bb5 user: Mark Dickinson date: Tue Nov 26 16:19:38 2013 +0000 summary: Issue #19638: Merge from 3.3 files: Lib/test/test_strtod.py | 31 ++++++++++++++++ Misc/NEWS | 3 + Python/dtoa.c | 55 ++++++++++++++++++++++------ 3 files changed, 77 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -248,6 +248,37 @@ else: assert False, "expected ValueError" + @test.support.bigmemtest(size=5 * test.support._1G, memuse=1, dry_run=False) + def test_oversized_digit_strings(self, maxsize): + # Input string whose length doesn't fit in an INT. + s = "1." + "1" * int(2.2e9) + with self.assertRaises(ValueError): + float(s) + del s + + s = "0." + "0" * int(2.2e9) + "1" + with self.assertRaises(ValueError): + float(s) + del s + + def test_large_exponents(self): + # Verify that the clipping of the exponent in strtod doesn't affect the + # output values. + def positive_exp(n): + """ Long string with value 1.0 and exponent n""" + return '0.{}1e+{}'.format('0'*(n-1), n) + + def negative_exp(n): + """ Long string with value 1.0 and exponent -n""" + return '1{}e-{}'.format('0'*n, n) + + self.assertEqual(float(positive_exp(10000)), 1.0) + self.assertEqual(float(positive_exp(20000)), 1.0) + self.assertEqual(float(positive_exp(30000)), 1.0) + self.assertEqual(float(negative_exp(10000)), 1.0) + self.assertEqual(float(negative_exp(20000)), 1.0) + self.assertEqual(float(negative_exp(30000)), 1.0) + def test_particular(self): # inputs that produced crashes or incorrectly rounded results with # previous versions of dtoa.c, for various reasons diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,9 @@ Core and Builtins ----------------- +- Issue #19638: Fix possible crash / undefined behaviour from huge (more than 2 + billion characters) input strings in _Py_dg_strtod. + Library ------- diff --git a/Python/dtoa.c b/Python/dtoa.c --- a/Python/dtoa.c +++ b/Python/dtoa.c @@ -204,7 +204,24 @@ MAX_ABS_EXP in absolute value get truncated to +-MAX_ABS_EXP. MAX_ABS_EXP should fit into an int. */ #ifndef MAX_ABS_EXP -#define MAX_ABS_EXP 19999U +#define MAX_ABS_EXP 1100000000U +#endif +/* Bound on length of pieces of input strings in _Py_dg_strtod; specifically, + this is used to bound the total number of digits ignoring leading zeros and + the number of digits that follow the decimal point. Ideally, MAX_DIGITS + should satisfy MAX_DIGITS + 400 < MAX_ABS_EXP; that ensures that the + exponent clipping in _Py_dg_strtod can't affect the value of the output. 
*/ +#ifndef MAX_DIGITS +#define MAX_DIGITS 1000000000U +#endif + +/* Guard against trying to use the above values on unusual platforms with ints + * of width less than 32 bits. */ +#if MAX_ABS_EXP > INT_MAX +#error "MAX_ABS_EXP should fit in an int" +#endif +#if MAX_DIGITS > INT_MAX +#error "MAX_DIGITS should fit in an int" #endif /* The following definition of Storeinc is appropriate for MIPS processors. @@ -1538,6 +1555,7 @@ Long L; BCinfo bc; Bigint *bb, *bb1, *bd, *bd0, *bs, *delta; + size_t ndigits, fraclen; dval(&rv) = 0.; @@ -1560,40 +1578,53 @@ c = *++s; lz = s != s1; - /* Point s0 at the first nonzero digit (if any). nd0 will be the position - of the point relative to s0. nd will be the total number of digits - ignoring leading zeros. */ + /* Point s0 at the first nonzero digit (if any). fraclen will be the + number of digits between the decimal point and the end of the + digit string. ndigits will be the total number of digits ignoring + leading zeros. */ s0 = s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd0 = nd = s - s1; + ndigits = s - s1; + fraclen = 0; /* Parse decimal point and following digits. */ if (c == '.') { c = *++s; - if (!nd) { + if (!ndigits) { s1 = s; while (c == '0') c = *++s; lz = lz || s != s1; - nd0 -= s - s1; + fraclen += (s - s1); s0 = s; } s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd += s - s1; + ndigits += s - s1; + fraclen += s - s1; } - /* Now lz is true if and only if there were leading zero digits, and nd - gives the total number of digits ignoring leading zeros. A valid input - must have at least one digit. */ - if (!nd && !lz) { + /* Now lz is true if and only if there were leading zero digits, and + ndigits gives the total number of digits ignoring leading zeros. A + valid input must have at least one digit. */ + if (!ndigits && !lz) { if (se) *se = (char *)s00; goto parse_error; } + /* Range check ndigits and fraclen to make sure that they, and values + computed with them, can safely fit in an int. */ + if (ndigits > MAX_DIGITS || fraclen > MAX_DIGITS) { + if (se) + *se = (char *)s00; + goto parse_error; + } + nd = (int)ndigits; + nd0 = (int)ndigits - (int)fraclen; + /* Parse exponent. */ e = 0; if (c == 'e' || c == 'E') { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 17:38:42 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 17:38:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjM4?= =?utf-8?q?=3A_Raise_ValueError_instead_of_crashing_when_converting_billio?= =?utf-8?q?n?= Message-ID: <3dTW5Z72XLz7Llx@mail.python.org> http://hg.python.org/cpython/rev/6d6a018a3bb0 changeset: 87584:6d6a018a3bb0 branch: 2.7 parent: 87579:04c4f141874b user: Mark Dickinson date: Tue Nov 26 16:38:25 2013 +0000 summary: Issue #19638: Raise ValueError instead of crashing when converting billion character strings to float. files: Lib/test/test_strtod.py | 31 +++++++++++++ Misc/NEWS | 3 + Python/dtoa.c | 65 +++++++++++++++++++++------- 3 files changed, 82 insertions(+), 17 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -249,6 +249,37 @@ else: assert False, "expected ValueError" + @test_support.bigmemtest(minsize=5 * test_support._1G, memuse=1) + def test_oversized_digit_strings(self, maxsize): + # Input string whose length doesn't fit in an INT. + s = "1." + "1" * int(2.2e9) + with self.assertRaises(ValueError): + float(s) + del s + + s = "0." 
+ "0" * int(2.2e9) + "1" + with self.assertRaises(ValueError): + float(s) + del s + + def test_large_exponents(self): + # Verify that the clipping of the exponent in strtod doesn't affect the + # output values. + def positive_exp(n): + """ Long string with value 1.0 and exponent n""" + return '0.{}1e+{}'.format('0'*(n-1), n) + + def negative_exp(n): + """ Long string with value 1.0 and exponent -n""" + return '1{}e-{}'.format('0'*n, n) + + self.assertEqual(float(positive_exp(10000)), 1.0) + self.assertEqual(float(positive_exp(20000)), 1.0) + self.assertEqual(float(positive_exp(30000)), 1.0) + self.assertEqual(float(negative_exp(10000)), 1.0) + self.assertEqual(float(negative_exp(20000)), 1.0) + self.assertEqual(float(negative_exp(30000)), 1.0) + def test_particular(self): # inputs that produced crashes or incorrectly rounded results with # previous versions of dtoa.c, for various reasons diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -9,6 +9,9 @@ Core and Builtins ----------------- +- Issue #19638: Fix possible crash / undefined behaviour from huge (more than 2 + billion characters) input strings in _Py_dg_strtod. + Library ------- diff --git a/Python/dtoa.c b/Python/dtoa.c --- a/Python/dtoa.c +++ b/Python/dtoa.c @@ -204,7 +204,24 @@ MAX_ABS_EXP in absolute value get truncated to +-MAX_ABS_EXP. MAX_ABS_EXP should fit into an int. */ #ifndef MAX_ABS_EXP -#define MAX_ABS_EXP 19999U +#define MAX_ABS_EXP 1100000000U +#endif +/* Bound on length of pieces of input strings in _Py_dg_strtod; specifically, + this is used to bound the total number of digits ignoring leading zeros and + the number of digits that follow the decimal point. Ideally, MAX_DIGITS + should satisfy MAX_DIGITS + 400 < MAX_ABS_EXP; that ensures that the + exponent clipping in _Py_dg_strtod can't affect the value of the output. */ +#ifndef MAX_DIGITS +#define MAX_DIGITS 1000000000U +#endif + +/* Guard against trying to use the above values on unusual platforms with ints + * of width less than 32 bits. */ +#if MAX_ABS_EXP > INT_MAX +#error "MAX_ABS_EXP should fit in an int" +#endif +#if MAX_DIGITS > INT_MAX +#error "MAX_DIGITS should fit in an int" #endif /* The following definition of Storeinc is appropriate for MIPS processors. @@ -1498,6 +1515,7 @@ Long L; BCinfo bc; Bigint *bb, *bb1, *bd, *bd0, *bs, *delta; + size_t ndigits, fraclen; dval(&rv) = 0.; @@ -1520,40 +1538,53 @@ c = *++s; lz = s != s1; - /* Point s0 at the first nonzero digit (if any). nd0 will be the position - of the point relative to s0. nd will be the total number of digits - ignoring leading zeros. */ + /* Point s0 at the first nonzero digit (if any). fraclen will be the + number of digits between the decimal point and the end of the + digit string. ndigits will be the total number of digits ignoring + leading zeros. */ s0 = s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd0 = nd = s - s1; + ndigits = s - s1; + fraclen = 0; /* Parse decimal point and following digits. */ if (c == '.') { c = *++s; - if (!nd) { + if (!ndigits) { s1 = s; while (c == '0') c = *++s; lz = lz || s != s1; - nd0 -= s - s1; + fraclen += (s - s1); s0 = s; } s1 = s; while ('0' <= c && c <= '9') c = *++s; - nd += s - s1; + ndigits += s - s1; + fraclen += s - s1; } - /* Now lz is true if and only if there were leading zero digits, and nd - gives the total number of digits ignoring leading zeros. A valid input - must have at least one digit. 
*/ - if (!nd && !lz) { + /* Now lz is true if and only if there were leading zero digits, and + ndigits gives the total number of digits ignoring leading zeros. A + valid input must have at least one digit. */ + if (!ndigits && !lz) { if (se) *se = (char *)s00; goto parse_error; } + /* Range check ndigits and fraclen to make sure that they, and values + computed with them, can safely fit in an int. */ + if (ndigits > MAX_DIGITS || fraclen > MAX_DIGITS) { + if (se) + *se = (char *)s00; + goto parse_error; + } + nd = (int)ndigits; + nd0 = (int)ndigits - (int)fraclen; + /* Parse exponent. */ e = 0; if (c == 'e' || c == 'E') { @@ -1886,20 +1917,20 @@ bd2++; /* At this stage bd5 - bb5 == e == bd2 - bb2 + bbe, bb2 - bs2 == 1, - and bs == 1, so: + and bs == 1, so: tdv == bd * 10**e = bd * 2**(bbe - bb2 + bd2) * 5**(bd5 - bb5) srv == bb * 2**bbe = bb * 2**(bbe - bb2 + bb2) - 0.5 ulp(srv) == 2**(bbe-1) = bs * 2**(bbe - bb2 + bs2) + 0.5 ulp(srv) == 2**(bbe-1) = bs * 2**(bbe - bb2 + bs2) - It follows that: + It follows that: M * tdv = bd * 2**bd2 * 5**bd5 M * srv = bb * 2**bb2 * 5**bb5 M * 0.5 ulp(srv) = bs * 2**bs2 * 5**bb5 - for some constant M. (Actually, M == 2**(bb2 - bbe) * 5**bb5, but - this fact is not needed below.) + for some constant M. (Actually, M == 2**(bb2 - bbe) * 5**bb5, but + this fact is not needed below.) */ /* Remove factor of 2**i, where i = min(bb2, bd2, bs2). */ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 18:03:08 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 18:03:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NjM4?= =?utf-8?q?=3A_Skip_large_digit_string_tests_on_32-bit_platforms=2E?= Message-ID: <3dTWdm0lN4z7Ltf@mail.python.org> http://hg.python.org/cpython/rev/4c523ea2b429 changeset: 87585:4c523ea2b429 branch: 2.7 user: Mark Dickinson date: Tue Nov 26 17:02:46 2013 +0000 summary: Issue #19638: Skip large digit string tests on 32-bit platforms. files: Lib/test/test_strtod.py | 3 +++ 1 files changed, 3 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -251,6 +251,9 @@ @test_support.bigmemtest(minsize=5 * test_support._1G, memuse=1) def test_oversized_digit_strings(self, maxsize): + if sys.maxsize <= 0x7fffffff: + self.skipTest("Only applies on 64-bit systems.") + # Input string whose length doesn't fit in an INT. s = "1." 
+ "1" * int(2.2e9) with self.assertRaises(ValueError): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 20:00:40 2013 From: python-checkins at python.org (antoine.pitrou) Date: Tue, 26 Nov 2013 20:00:40 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_misuse_of_?= =?utf-8?q?the_bigmemtest_decorator?= Message-ID: <3dTZFN4k1wz7LjY@mail.python.org> http://hg.python.org/cpython/rev/461a4d10753a changeset: 87586:461a4d10753a branch: 2.7 user: Antoine Pitrou date: Tue Nov 26 20:00:34 2013 +0100 summary: Fix misuse of the bigmemtest decorator files: Lib/test/test_strtod.py | 6 ++---- 1 files changed, 2 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -249,11 +249,9 @@ else: assert False, "expected ValueError" - @test_support.bigmemtest(minsize=5 * test_support._1G, memuse=1) + @test_support.precisionbigmemtest(size=test_support._2G, memuse=3, + dry_run=False) def test_oversized_digit_strings(self, maxsize): - if sys.maxsize <= 0x7fffffff: - self.skipTest("Only applies on 64-bit systems.") - # Input string whose length doesn't fit in an INT. s = "1." + "1" * int(2.2e9) with self.assertRaises(ValueError): -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 20:33:43 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 20:33:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzExNDg5?= =?utf-8?q?=3A_JSON_decoder_now_accepts_lone_surrogates=2E?= Message-ID: <3dTZzW5jB6z7LpK@mail.python.org> http://hg.python.org/cpython/rev/c85305a54e6d changeset: 87587:c85305a54e6d branch: 2.7 user: Serhiy Storchaka date: Tue Nov 26 21:25:15 2013 +0200 summary: Issue #11489: JSON decoder now accepts lone surrogates. files: Lib/json/decoder.py | 35 ++++++----- Lib/json/tests/test_scanstring.py | 55 +++++++++++++++++- Misc/NEWS | 2 + Modules/_json.c | 49 ++++------------ 4 files changed, 84 insertions(+), 57 deletions(-) diff --git a/Lib/json/decoder.py b/Lib/json/decoder.py --- a/Lib/json/decoder.py +++ b/Lib/json/decoder.py @@ -62,6 +62,16 @@ DEFAULT_ENCODING = "utf-8" +def _decode_uXXXX(s, pos): + esc = s[pos + 1:pos + 5] + if len(esc) == 4 and esc[1] not in 'xX': + try: + return int(esc, 16) + except ValueError: + pass + msg = "Invalid \\uXXXX escape" + raise ValueError(errmsg(msg, s, pos)) + def py_scanstring(s, end, encoding=None, strict=True, _b=BACKSLASH, _m=STRINGCHUNK.match): """Scan the string s for a JSON string. 
End is the index of the @@ -116,25 +126,16 @@ end += 1 else: # Unicode escape sequence - esc = s[end + 1:end + 5] - next_end = end + 5 - if len(esc) != 4: - msg = "Invalid \\uXXXX escape" - raise ValueError(errmsg(msg, s, end)) - uni = int(esc, 16) + uni = _decode_uXXXX(s, end) + end += 5 # Check for surrogate pair on UCS-4 systems - if 0xd800 <= uni <= 0xdbff and sys.maxunicode > 65535: - msg = "Invalid \\uXXXX\\uXXXX surrogate pair" - if not s[end + 5:end + 7] == '\\u': - raise ValueError(errmsg(msg, s, end)) - esc2 = s[end + 7:end + 11] - if len(esc2) != 4: - raise ValueError(errmsg(msg, s, end)) - uni2 = int(esc2, 16) - uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) - next_end += 6 + if sys.maxunicode > 65535 and \ + 0xd800 <= uni <= 0xdbff and s[end:end + 2] == '\\u': + uni2 = _decode_uXXXX(s, end + 1) + if 0xdc00 <= uni2 <= 0xdfff: + uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) + end += 6 char = unichr(uni) - end = next_end # Append the unescaped character _append(char) return u''.join(chunks), end diff --git a/Lib/json/tests/test_scanstring.py b/Lib/json/tests/test_scanstring.py --- a/Lib/json/tests/test_scanstring.py +++ b/Lib/json/tests/test_scanstring.py @@ -5,10 +5,6 @@ class TestScanstring(object): def test_scanstring(self): scanstring = self.json.decoder.scanstring - self.assertEqual( - scanstring('"z\\ud834\\udd20x"', 1, None, True), - (u'z\U0001d120x', 16)) - if sys.maxunicode == 65535: self.assertEqual( scanstring(u'"z\U0001d120x"', 1, None, True), @@ -94,6 +90,57 @@ scanstring('["Bad value", truth]', 2, None, True), (u'Bad value', 12)) + def test_surrogates(self): + scanstring = self.json.decoder.scanstring + def assertScan(given, expect): + self.assertEqual(scanstring(given, 1, None, True), + (expect, len(given))) + if not isinstance(given, unicode): + given = unicode(given) + self.assertEqual(scanstring(given, 1, None, True), + (expect, len(given))) + + assertScan('"z\\ud834\\u0079x"', u'z\ud834yx') + assertScan('"z\\ud834\\udd20x"', u'z\U0001d120x') + assertScan('"z\\ud834\\ud834\\udd20x"', u'z\ud834\U0001d120x') + assertScan('"z\\ud834x"', u'z\ud834x') + assertScan(u'"z\\ud834\udd20x12345"', u'z\ud834\udd20x12345') + assertScan('"z\\udd20x"', u'z\udd20x') + assertScan(u'"z\ud834\udd20x"', u'z\ud834\udd20x') + assertScan(u'"z\ud834\\udd20x"', u'z\ud834\udd20x') + assertScan(u'"z\ud834x"', u'z\ud834x') + + def test_bad_escapes(self): + scanstring = self.json.decoder.scanstring + bad_escapes = [ + '"\\"', + '"\\x"', + '"\\u"', + '"\\u0"', + '"\\u01"', + '"\\u012"', + '"\\uz012"', + '"\\u0z12"', + '"\\u01z2"', + '"\\u012z"', + '"\\u0x12"', + '"\\u0X12"', + '"\\ud834\\"', + '"\\ud834\\u"', + '"\\ud834\\ud"', + '"\\ud834\\udd"', + '"\\ud834\\udd2"', + '"\\ud834\\uzdd2"', + '"\\ud834\\udzd2"', + '"\\ud834\\uddz2"', + '"\\ud834\\udd2z"', + '"\\ud834\\u0x20"', + '"\\ud834\\u0X20"', + ] + for s in bad_escapes: + with self.assertRaises(ValueError): + scanstring(s, 1, None, True) + def test_issue3623(self): self.assertRaises(ValueError, self.json.decoder.scanstring, b"xxx", 1, "xxx") diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,8 @@ Library ------- +- Issue #11489: JSON decoder now accepts lone surrogates. + - Fix test.test_support.bind_port() to not cause an error when Python was compiled on a system with SO_REUSEPORT defined in the headers but run on a system with an OS kernel that does not support that new socket option. 
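[Illustration, not part of any commit in this batch] The user-visible effect of the Issue #19638 dtoa.c change shown earlier (exponent clipping bounded by MAX_ABS_EXP, digit counts bounded by MAX_DIGITS) can be checked from Python, mirroring the new test_large_exponents and test_oversized_digit_strings tests; the helper names below come straight from those tests, and the huge-string case is only sketched in a comment because building the input needs several gigabytes of memory.

    def positive_exp(n):
        # value 1.0 spelled with a huge positive exponent, as in test_large_exponents
        return '0.{}1e+{}'.format('0' * (n - 1), n)

    def negative_exp(n):
        # value 1.0 spelled with a huge negative exponent
        return '1{}e-{}'.format('0' * n, n)

    # Exponent clipping in _Py_dg_strtod does not change the parsed value.
    assert float(positive_exp(30000)) == 1.0
    assert float(negative_exp(30000)) == 1.0

    # Digit strings longer than MAX_DIGITS now raise ValueError instead of
    # crashing; building one needs gigabytes of RAM, so only the shape is shown:
    #     s = "1." + "1" * (2 * 10 ** 9)
    #     float(s)    # ValueError on an interpreter with the fix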
diff --git a/Modules/_json.c b/Modules/_json.c --- a/Modules/_json.c +++ b/Modules/_json.c @@ -524,16 +524,10 @@ } #ifdef Py_UNICODE_WIDE /* Surrogate pair */ - if ((c & 0xfc00) == 0xd800) { + if ((c & 0xfc00) == 0xd800 && end + 6 < len && + buf[next++] == '\\' && + buf[next++] == 'u') { Py_UNICODE c2 = 0; - if (end + 6 >= len) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - if (buf[next++] != '\\' || buf[next++] != 'u') { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } end += 6; /* Decode 4 hex digits */ for (; next < end; next++) { @@ -554,15 +548,10 @@ goto bail; } } - if ((c2 & 0xfc00) != 0xdc00) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - c = 0x10000 + (((c - 0xd800) << 10) | (c2 - 0xdc00)); - } - else if ((c & 0xfc00) == 0xdc00) { - raise_errmsg("Unpaired low surrogate", pystr, end - 5); - goto bail; + if ((c2 & 0xfc00) == 0xdc00) + c = 0x10000 + (((c - 0xd800) << 10) | (c2 - 0xdc00)); + else + end -= 6; } #endif } @@ -703,16 +692,9 @@ } #ifdef Py_UNICODE_WIDE /* Surrogate pair */ - if ((c & 0xfc00) == 0xd800) { + if ((c & 0xfc00) == 0xd800 && end + 6 < len && + buf[next++] == '\\' && buf[next++] == 'u') { Py_UNICODE c2 = 0; - if (end + 6 >= len) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - if (buf[next++] != '\\' || buf[next++] != 'u') { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } end += 6; /* Decode 4 hex digits */ for (; next < end; next++) { @@ -733,15 +715,10 @@ goto bail; } } - if ((c2 & 0xfc00) != 0xdc00) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - c = 0x10000 + (((c - 0xd800) << 10) | (c2 - 0xdc00)); - } - else if ((c & 0xfc00) == 0xdc00) { - raise_errmsg("Unpaired low surrogate", pystr, end - 5); - goto bail; + if ((c2 & 0xfc00) == 0xdc00) + c = 0x10000 + (((c - 0xd800) << 10) | (c2 - 0xdc00)); + else + end -= 6; } #endif } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 20:33:45 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 20:33:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzExNDg5?= =?utf-8?q?=3A_JSON_decoder_now_accepts_lone_surrogates=2E?= Message-ID: <3dTZzY1tpFz7LpK@mail.python.org> http://hg.python.org/cpython/rev/8abbdbe86c01 changeset: 87588:8abbdbe86c01 branch: 3.3 parent: 87582:eac133e13bb5 user: Serhiy Storchaka date: Tue Nov 26 21:25:28 2013 +0200 summary: Issue #11489: JSON decoder now accepts lone surrogates. files: Lib/json/decoder.py | 35 +++++----- Lib/test/test_json/test_scanstring.py | 51 +++++++++++++- Misc/NEWS | 2 + Modules/_json.c | 26 ++----- 4 files changed, 73 insertions(+), 41 deletions(-) diff --git a/Lib/json/decoder.py b/Lib/json/decoder.py --- a/Lib/json/decoder.py +++ b/Lib/json/decoder.py @@ -66,6 +66,16 @@ 'b': '\b', 'f': '\f', 'n': '\n', 'r': '\r', 't': '\t', } +def _decode_uXXXX(s, pos): + esc = s[pos + 1:pos + 5] + if len(esc) == 4 and esc[1] not in 'xX': + try: + return int(esc, 16) + except ValueError: + pass + msg = "Invalid \\uXXXX escape" + raise ValueError(errmsg(msg, s, pos)) + def py_scanstring(s, end, strict=True, _b=BACKSLASH, _m=STRINGCHUNK.match): """Scan the string s for a JSON string. 
End is the index of the @@ -115,25 +125,14 @@ raise ValueError(errmsg(msg, s, end)) end += 1 else: - esc = s[end + 1:end + 5] - next_end = end + 5 - if len(esc) != 4: - msg = "Invalid \\uXXXX escape" - raise ValueError(errmsg(msg, s, end)) - uni = int(esc, 16) - if 0xd800 <= uni <= 0xdbff: - msg = "Invalid \\uXXXX\\uXXXX surrogate pair" - if not s[end + 5:end + 7] == '\\u': - raise ValueError(errmsg(msg, s, end)) - esc2 = s[end + 7:end + 11] - if len(esc2) != 4: - raise ValueError(errmsg(msg, s, end)) - uni2 = int(esc2, 16) - uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) - next_end += 6 + uni = _decode_uXXXX(s, end) + end += 5 + if 0xd800 <= uni <= 0xdbff and s[end:end + 2] == '\\u': + uni2 = _decode_uXXXX(s, end + 1) + if 0xdc00 <= uni2 <= 0xdfff: + uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) + end += 6 char = chr(uni) - - end = next_end _append(char) return ''.join(chunks), end diff --git a/Lib/test/test_json/test_scanstring.py b/Lib/test/test_json/test_scanstring.py --- a/Lib/test/test_json/test_scanstring.py +++ b/Lib/test/test_json/test_scanstring.py @@ -6,10 +6,6 @@ def test_scanstring(self): scanstring = self.json.decoder.scanstring self.assertEqual( - scanstring('"z\\ud834\\udd20x"', 1, True), - ('z\U0001d120x', 16)) - - self.assertEqual( scanstring('"z\U0001d120x"', 1, True), ('z\U0001d120x', 5)) @@ -89,6 +85,53 @@ scanstring('["Bad value", truth]', 2, True), ('Bad value', 12)) + def test_surrogates(self): + scanstring = self.json.decoder.scanstring + def assertScan(given, expect): + self.assertEqual(scanstring(given, 1, True), + (expect, len(given))) + + assertScan('"z\\ud834\\u0079x"', 'z\ud834yx') + assertScan('"z\\ud834\\udd20x"', 'z\U0001d120x') + assertScan('"z\\ud834\\ud834\\udd20x"', 'z\ud834\U0001d120x') + assertScan('"z\\ud834x"', 'z\ud834x') + assertScan('"z\\ud834\udd20x12345"', 'z\ud834\udd20x12345') + assertScan('"z\\udd20x"', 'z\udd20x') + assertScan('"z\ud834\udd20x"', 'z\ud834\udd20x') + assertScan('"z\ud834\\udd20x"', 'z\ud834\udd20x') + assertScan('"z\ud834x"', 'z\ud834x') + + def test_bad_escapes(self): + scanstring = self.json.decoder.scanstring + bad_escapes = [ + '"\\"', + '"\\x"', + '"\\u"', + '"\\u0"', + '"\\u01"', + '"\\u012"', + '"\\uz012"', + '"\\u0z12"', + '"\\u01z2"', + '"\\u012z"', + '"\\u0x12"', + '"\\u0X12"', + '"\\ud834\\"', + '"\\ud834\\u"', + '"\\ud834\\ud"', + '"\\ud834\\udd"', + '"\\ud834\\udd2"', + '"\\ud834\\uzdd2"', + '"\\ud834\\udzd2"', + '"\\ud834\\uddz2"', + '"\\ud834\\udd2z"', + '"\\ud834\\u0x20"', + '"\\ud834\\u0X20"', + ] + for s in bad_escapes: + with self.assertRaises(ValueError, msg=s): + scanstring(s, 1, True) + def test_overflow(self): with self.assertRaises(OverflowError): self.json.decoder.scanstring(b"xxx", sys.maxsize+1) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,8 @@ Library ------- +- Issue #11489: JSON decoder now accepts lone surrogates. + - Issue #19545: Avoid chained exceptions while passing stray % to time.strptime(). Initial patch by Claudiu Popa. 
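[Illustration, not part of the commit] The surrogate-pair arithmetic that the patched decoder applies when a high surrogate escape is immediately followed by a low one is easy to check by hand; the values below are the U+1D120 case used in the new test_surrogates data.

    hi, lo = 0xd834, 0xdd20
    assert 0xd800 <= hi <= 0xdbff          # high (leading) surrogate range
    assert 0xdc00 <= lo <= 0xdfff          # low (trailing) surrogate range
    uni = 0x10000 + (((hi - 0xd800) << 10) | (lo - 0xdc00))
    assert uni == 0x1d120                  # the code point used in the tests
    assert chr(uni) == '\U0001d120'        # Python 3; unichr() on 2.7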
diff --git a/Modules/_json.c b/Modules/_json.c --- a/Modules/_json.c +++ b/Modules/_json.c @@ -433,17 +433,10 @@ } } /* Surrogate pair */ - if ((c & 0xfc00) == 0xd800) { + if (Py_UNICODE_IS_HIGH_SURROGATE(c) && end + 6 < len && + PyUnicode_READ(kind, buf, next++) == '\\' && + PyUnicode_READ(kind, buf, next++) == 'u') { Py_UCS4 c2 = 0; - if (end + 6 >= len) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - if (PyUnicode_READ(kind, buf, next++) != '\\' || - PyUnicode_READ(kind, buf, next++) != 'u') { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } end += 6; /* Decode 4 hex digits */ for (; next < end; next++) { @@ -464,15 +457,10 @@ goto bail; } } - if ((c2 & 0xfc00) != 0xdc00) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - c = 0x10000 + (((c - 0xd800) << 10) | (c2 - 0xdc00)); - } - else if ((c & 0xfc00) == 0xdc00) { - raise_errmsg("Unpaired low surrogate", pystr, end - 5); - goto bail; + if (Py_UNICODE_IS_LOW_SURROGATE(c2)) + c = Py_UNICODE_JOIN_SURROGATES(c, c2); + else + end -= 6; } } APPEND_OLD_CHUNK -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 20:33:46 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 20:33:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2311489=3A_JSON_decoder_now_accepts_lone_surrogat?= =?utf-8?b?ZXMu?= Message-ID: <3dTZzZ5JSZz7Lr1@mail.python.org> http://hg.python.org/cpython/rev/5f7326ed850f changeset: 87589:5f7326ed850f parent: 87583:9c7ab3e68243 parent: 87588:8abbdbe86c01 user: Serhiy Storchaka date: Tue Nov 26 21:27:11 2013 +0200 summary: Issue #11489: JSON decoder now accepts lone surrogates. files: Lib/json/decoder.py | 35 +++++----- Lib/test/test_json/test_scanstring.py | 51 +++++++++++++- Misc/NEWS | 2 + Modules/_json.c | 26 ++----- 4 files changed, 73 insertions(+), 41 deletions(-) diff --git a/Lib/json/decoder.py b/Lib/json/decoder.py --- a/Lib/json/decoder.py +++ b/Lib/json/decoder.py @@ -58,6 +58,16 @@ 'b': '\b', 'f': '\f', 'n': '\n', 'r': '\r', 't': '\t', } +def _decode_uXXXX(s, pos): + esc = s[pos + 1:pos + 5] + if len(esc) == 4 and esc[1] not in 'xX': + try: + return int(esc, 16) + except ValueError: + pass + msg = "Invalid \\uXXXX escape" + raise ValueError(errmsg(msg, s, pos)) + def py_scanstring(s, end, strict=True, _b=BACKSLASH, _m=STRINGCHUNK.match): """Scan the string s for a JSON string. 
End is the index of the @@ -107,25 +117,14 @@ raise ValueError(errmsg(msg, s, end)) end += 1 else: - esc = s[end + 1:end + 5] - next_end = end + 5 - if len(esc) != 4: - msg = "Invalid \\uXXXX escape" - raise ValueError(errmsg(msg, s, end)) - uni = int(esc, 16) - if 0xd800 <= uni <= 0xdbff: - msg = "Invalid \\uXXXX\\uXXXX surrogate pair" - if not s[end + 5:end + 7] == '\\u': - raise ValueError(errmsg(msg, s, end)) - esc2 = s[end + 7:end + 11] - if len(esc2) != 4: - raise ValueError(errmsg(msg, s, end)) - uni2 = int(esc2, 16) - uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) - next_end += 6 + uni = _decode_uXXXX(s, end) + end += 5 + if 0xd800 <= uni <= 0xdbff and s[end:end + 2] == '\\u': + uni2 = _decode_uXXXX(s, end + 1) + if 0xdc00 <= uni2 <= 0xdfff: + uni = 0x10000 + (((uni - 0xd800) << 10) | (uni2 - 0xdc00)) + end += 6 char = chr(uni) - - end = next_end _append(char) return ''.join(chunks), end diff --git a/Lib/test/test_json/test_scanstring.py b/Lib/test/test_json/test_scanstring.py --- a/Lib/test/test_json/test_scanstring.py +++ b/Lib/test/test_json/test_scanstring.py @@ -6,10 +6,6 @@ def test_scanstring(self): scanstring = self.json.decoder.scanstring self.assertEqual( - scanstring('"z\\ud834\\udd20x"', 1, True), - ('z\U0001d120x', 16)) - - self.assertEqual( scanstring('"z\U0001d120x"', 1, True), ('z\U0001d120x', 5)) @@ -89,6 +85,53 @@ scanstring('["Bad value", truth]', 2, True), ('Bad value', 12)) + def test_surrogates(self): + scanstring = self.json.decoder.scanstring + def assertScan(given, expect): + self.assertEqual(scanstring(given, 1, True), + (expect, len(given))) + + assertScan('"z\\ud834\\u0079x"', 'z\ud834yx') + assertScan('"z\\ud834\\udd20x"', 'z\U0001d120x') + assertScan('"z\\ud834\\ud834\\udd20x"', 'z\ud834\U0001d120x') + assertScan('"z\\ud834x"', 'z\ud834x') + assertScan('"z\\ud834\udd20x12345"', 'z\ud834\udd20x12345') + assertScan('"z\\udd20x"', 'z\udd20x') + assertScan('"z\ud834\udd20x"', 'z\ud834\udd20x') + assertScan('"z\ud834\\udd20x"', 'z\ud834\udd20x') + assertScan('"z\ud834x"', 'z\ud834x') + + def test_bad_escapes(self): + scanstring = self.json.decoder.scanstring + bad_escapes = [ + '"\\"', + '"\\x"', + '"\\u"', + '"\\u0"', + '"\\u01"', + '"\\u012"', + '"\\uz012"', + '"\\u0z12"', + '"\\u01z2"', + '"\\u012z"', + '"\\u0x12"', + '"\\u0X12"', + '"\\ud834\\"', + '"\\ud834\\u"', + '"\\ud834\\ud"', + '"\\ud834\\udd"', + '"\\ud834\\udd2"', + '"\\ud834\\uzdd2"', + '"\\ud834\\udzd2"', + '"\\ud834\\uddz2"', + '"\\ud834\\udd2z"', + '"\\ud834\\u0x20"', + '"\\ud834\\u0X20"', + ] + for s in bad_escapes: + with self.assertRaises(ValueError, msg=s): + scanstring(s, 1, True) + def test_overflow(self): with self.assertRaises(OverflowError): self.json.decoder.scanstring(b"xxx", sys.maxsize+1) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,8 @@ Library ------- +- Issue #11489: JSON decoder now accepts lone surrogates. + - Issue #19545: Avoid chained exceptions while passing stray % to time.strptime(). Initial patch by Claudiu Popa. 
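[Illustration, not part of the commit] A minimal sketch of the end-user behaviour after this change, assuming an interpreter that includes the patch; the escape sequences are taken from the new test_surrogates and test_bad_escapes cases.

    import json

    # A well-formed escaped pair still decodes to a single code point ...
    assert json.loads('"z\\ud834\\udd20x"') == 'z\U0001d120x'
    # ... while lone surrogate escapes now pass through instead of raising
    # "Unpaired high surrogate" / "Unpaired low surrogate".
    assert json.loads('"z\\ud834x"') == 'z\ud834x'
    assert json.loads('"z\\udd20x"') == 'z\udd20x'

    # Malformed \uXXXX escapes are still rejected (ValueError, or its
    # JSONDecodeError subclass on later versions).
    for bad in ['"\\u012"', '"\\u0x12"', '"\\ud834\\u0X20"']:
        try:
            json.loads(bad)
        except ValueError:
            pass
        else:
            raise AssertionError('expected ValueError for %r' % bad)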
diff --git a/Modules/_json.c b/Modules/_json.c --- a/Modules/_json.c +++ b/Modules/_json.c @@ -409,17 +409,10 @@ } } /* Surrogate pair */ - if (Py_UNICODE_IS_HIGH_SURROGATE(c)) { + if (Py_UNICODE_IS_HIGH_SURROGATE(c) && end + 6 < len && + PyUnicode_READ(kind, buf, next++) == '\\' && + PyUnicode_READ(kind, buf, next++) == 'u') { Py_UCS4 c2 = 0; - if (end + 6 >= len) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - if (PyUnicode_READ(kind, buf, next++) != '\\' || - PyUnicode_READ(kind, buf, next++) != 'u') { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } end += 6; /* Decode 4 hex digits */ for (; next < end; next++) { @@ -440,15 +433,10 @@ goto bail; } } - if (!Py_UNICODE_IS_LOW_SURROGATE(c2)) { - raise_errmsg("Unpaired high surrogate", pystr, end - 5); - goto bail; - } - c = Py_UNICODE_JOIN_SURROGATES(c, c2); - } - else if (Py_UNICODE_IS_LOW_SURROGATE(c)) { - raise_errmsg("Unpaired low surrogate", pystr, end - 5); - goto bail; + if (Py_UNICODE_IS_LOW_SURROGATE(c2)) + c = Py_UNICODE_JOIN_SURROGATES(c, c2); + else + end -= 6; } } APPEND_OLD_CHUNK -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 20:37:44 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 20:37:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319793=3A_Improved?= =?utf-8?q?_markup_for_True/False_constants_in_pathlib_documentation=2E?= Message-ID: <3dTb485f7GzQN7@mail.python.org> http://hg.python.org/cpython/rev/835007ccf2b0 changeset: 87590:835007ccf2b0 user: Serhiy Storchaka date: Tue Nov 26 21:37:12 2013 +0200 summary: Issue #19793: Improved markup for True/False constants in pathlib documentation. files: Doc/library/pathlib.rst | 56 ++++++++++++++-------------- 1 files changed, 28 insertions(+), 28 deletions(-) diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -439,9 +439,9 @@ .. method:: PurePath.is_reserved() - With :class:`PureWindowsPath`, return True if the path is considered - reserved under Windows, False otherwise. With :class:`PurePosixPath`, - False is always returned. + With :class:`PureWindowsPath`, return ``True`` if the path is considered + reserved under Windows, ``False`` otherwise. With :class:`PurePosixPath`, + ``False`` is always returned. >>> PureWindowsPath('nul').is_reserved() True @@ -469,8 +469,8 @@ .. method:: PurePath.match(pattern) - Match this path against the provided glob-style pattern. Return True - if matching is successful, False otherwise. + Match this path against the provided glob-style pattern. Return ``True`` + if matching is successful, ``False`` otherwise. If *pattern* is relative, the path can be either relative or absolute, and matching is done from the right:: @@ -661,63 +661,63 @@ .. method:: Path.is_dir() - Return True if the path points to a directory (or a symbolic link - pointing to a directory), False if it points to another kind of file. + Return ``True`` if the path points to a directory (or a symbolic link + pointing to a directory), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. .. 
method:: Path.is_file() - Return True if the path points to a regular file (or a symbolic link - pointing to a regular file), False if it points to another kind of file. + Return ``True`` if the path points to a regular file (or a symbolic link + pointing to a regular file), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. .. method:: Path.is_symlink() - Return True if the path points to a symbolic link, False otherwise. + Return ``True`` if the path points to a symbolic link, ``False`` otherwise. - False is also returned if the path doesn't exist; other errors (such + ``False`` is also returned if the path doesn't exist; other errors (such as permission errors) are propagated. .. method:: Path.is_socket() - Return True if the path points to a Unix socket (or a symbolic link - pointing to a Unix socket), False if it points to another kind of file. + Return ``True`` if the path points to a Unix socket (or a symbolic link + pointing to a Unix socket), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. .. method:: Path.is_fifo() - Return True if the path points to a FIFO (or a symbolic link - pointing to a FIFO), False if it points to another kind of file. + Return ``True`` if the path points to a FIFO (or a symbolic link + pointing to a FIFO), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. .. method:: Path.is_block_device() - Return True if the path points to a block device (or a symbolic link - pointing to a block device), False if it points to another kind of file. + Return ``True`` if the path points to a block device (or a symbolic link + pointing to a block device), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. .. method:: Path.is_char_device() - Return True if the path points to a character device (or a symbolic link - pointing to a character device), False if it points to another kind of file. + Return ``True`` if the path points to a character device (or a symbolic link + pointing to a character device), ``False`` if it points to another kind of file. - False is also returned if the path doesn't exist or is a broken symlink; + ``False`` is also returned if the path doesn't exist or is a broken symlink; other errors (such as permission errors) are propagated. @@ -755,8 +755,8 @@ combined with the process' ``umask`` value to determine the file mode and access flags. If the path already exists, :exc:`OSError` is raised. - If *parents* is True, any missing parents of this path are created - as needed. If *parents* is False (the default), a missing parent raises + If *parents* is true, any missing parents of this path are created + as needed. If *parents* is false (the default), a missing parent raises :exc:`OSError`. @@ -841,7 +841,7 @@ .. 
method:: Path.symlink_to(target, target_is_directory=False) Make this path a symbolic link to *target*. Under Windows, - *target_is_directory* must be True (default False) if the link's target + *target_is_directory* must be true (default ``False``) if the link's target is a directory. Under POSIX, *target_is_directory*'s value is ignored. >>> p = Path('mylink') -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:29:25 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 21:29:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogVXNlIEBiaWdtZW10?= =?utf-8?q?est_more_accurately=2E?= Message-ID: <3dTcCn3Msfz7Lqc@mail.python.org> http://hg.python.org/cpython/rev/8662734296a0 changeset: 87591:8662734296a0 branch: 3.3 parent: 87588:8abbdbe86c01 user: Mark Dickinson date: Tue Nov 26 20:28:29 2013 +0000 summary: Use @bigmemtest more accurately. files: Lib/test/test_strtod.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -248,15 +248,15 @@ else: assert False, "expected ValueError" - @test.support.bigmemtest(size=5 * test.support._1G, memuse=1, dry_run=False) + @test.support.bigmemtest(size=test.support._2G+10, memuse=4, dry_run=False) def test_oversized_digit_strings(self, maxsize): # Input string whose length doesn't fit in an INT. - s = "1." + "1" * int(2.2e9) + s = "1." + "1" * maxsize with self.assertRaises(ValueError): float(s) del s - s = "0." + "0" * int(2.2e9) + "1" + s = "0." + "0" * maxsize + "1" with self.assertRaises(ValueError): float(s) del s -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:29:26 2013 From: python-checkins at python.org (mark.dickinson) Date: Tue, 26 Nov 2013 21:29:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Use_=40bigmemtest_more_accurately=2E?= Message-ID: <3dTcCp5y6Fz7Ltm@mail.python.org> http://hg.python.org/cpython/rev/08bcbc210a0f changeset: 87592:08bcbc210a0f parent: 87590:835007ccf2b0 parent: 87591:8662734296a0 user: Mark Dickinson date: Tue Nov 26 20:29:06 2013 +0000 summary: Use @bigmemtest more accurately. files: Lib/test/test_strtod.py | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_strtod.py b/Lib/test/test_strtod.py --- a/Lib/test/test_strtod.py +++ b/Lib/test/test_strtod.py @@ -248,15 +248,15 @@ else: assert False, "expected ValueError" - @test.support.bigmemtest(size=5 * test.support._1G, memuse=1, dry_run=False) + @test.support.bigmemtest(size=test.support._2G+10, memuse=3, dry_run=False) def test_oversized_digit_strings(self, maxsize): # Input string whose length doesn't fit in an INT. - s = "1." + "1" * int(2.2e9) + s = "1." + "1" * maxsize with self.assertRaises(ValueError): float(s) del s - s = "0." + "0" * int(2.2e9) + "1" + s = "0." 
+ "0" * maxsize + "1" with self.assertRaises(ValueError): float(s) del s -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:50:03 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 21:50:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzExNTA4?= =?utf-8?q?=3A_Fixed_uuid=2Egetnode=28=29_and_uuid=2Euuid1=28=29_on_enviro?= =?utf-8?q?nment_with?= Message-ID: <3dTcgb2PdDz7Lk0@mail.python.org> http://hg.python.org/cpython/rev/72951ffbdc76 changeset: 87593:72951ffbdc76 branch: 2.7 parent: 87587:c85305a54e6d user: Serhiy Storchaka date: Tue Nov 26 22:47:05 2013 +0200 summary: Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. files: Lib/test/test_uuid.py | 20 ++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 34 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,6 @@ import unittest from test import test_support +import io import os import uuid @@ -346,6 +347,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.BytesIO(data) + + with test_support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -307,8 +307,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except IOError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -324,6 +324,7 @@ John Fouhy Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Fix test.test_support.bind_port() to not cause an error when Python was -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:50:04 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 21:50:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzExNTA4?= =?utf-8?q?=3A_Fixed_uuid=2Egetnode=28=29_and_uuid=2Euuid1=28=29_on_enviro?= =?utf-8?q?nment_with?= Message-ID: <3dTcgc5Wsdz7Ls2@mail.python.org> http://hg.python.org/cpython/rev/a5c26e57f429 changeset: 87594:a5c26e57f429 branch: 3.3 parent: 87591:8662734296a0 user: Serhiy Storchaka date: Tue Nov 26 22:47:16 2013 +0200 summary: Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. files: Lib/test/test_uuid.py | 21 +++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 35 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,7 @@ import unittest +from test import support import builtins +import io import os import uuid @@ -356,6 +358,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.StringIO(data) + + with support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -327,8 +327,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except IOError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -394,6 +394,7 @@ Andrew Francis Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Issue #19545: Avoid chained exceptions while passing stray % to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:50:06 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Tue, 26 Nov 2013 21:50:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2311508=3A_Fixed_uuid=2Egetnode=28=29_and_uuid=2E?= =?utf-8?q?uuid1=28=29_on_environment_with?= Message-ID: <3dTcgf15Pmz7LtZ@mail.python.org> http://hg.python.org/cpython/rev/efd0e6d675ff changeset: 87595:efd0e6d675ff parent: 87592:08bcbc210a0f parent: 87594:a5c26e57f429 user: Serhiy Storchaka date: Tue Nov 26 22:49:36 2013 +0200 summary: Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. files: Lib/test/test_uuid.py | 21 +++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 35 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,7 @@ import unittest +from test import support import builtins +import io import os import uuid @@ -356,6 +358,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.StringIO(data) + + with support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -327,8 +327,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except OSError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -407,6 +407,7 @@ Andrew Francis Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Issue #19545: Avoid chained exceptions while passing stray % to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:18 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:18 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTg4?= =?utf-8?q?=3A_Fixed_tests_in_test=5Frandom_that_were_silently_skipped_mos?= =?utf-8?q?t?= Message-ID: <3dTctG6N6jz7Lk0@mail.python.org> http://hg.python.org/cpython/rev/c8e138646be1 changeset: 87596:c8e138646be1 branch: 2.7 parent: 87587:c85305a54e6d user: Zachary Ware date: Tue Nov 26 14:49:42 2013 -0600 summary: Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. files: Lib/test/test_random.py | 12 ++++++------ Misc/ACKS | 1 + Misc/NEWS | 3 +++ 3 files changed, 10 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_random.py b/Lib/test/test_random.py --- a/Lib/test/test_random.py +++ b/Lib/test/test_random.py @@ -251,10 +251,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): @@ -403,10 +403,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -362,6 +362,7 @@ Jonathan Giddy Johannes Gijsbers Michael Gilfix +Julian Gindi Wim Glenn Christoph Gohlke Tim Golden diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -50,6 +50,9 @@ Tests ----- +- Issue #19588: Fixed tests in test_random that were silently skipped most + of the time. Patch by Julian Gindi. + - Issue #17883: Tweak test_tcl testLoadWithUNC to skip the test in the event of a permission error on Windows and to properly report other skip conditions. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:20 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:20 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTg4?= =?utf-8?q?=3A_Fixed_tests_in_test=5Frandom_that_were_silently_skipped_mos?= =?utf-8?q?t?= Message-ID: <3dTctJ1B1Jz7LqR@mail.python.org> http://hg.python.org/cpython/rev/c65882d79c5f changeset: 87597:c65882d79c5f branch: 3.3 parent: 87591:8662734296a0 user: Zachary Ware date: Tue Nov 26 14:50:10 2013 -0600 summary: Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. 
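[Illustration, not part of the commit] A rough sketch of why the original loop in the diff below almost never reached its assertion: with start drawn from the full 2**i range and stop from the much smaller 2**(i-2) range, stop <= start holds roughly 7 times out of 8 already at i = 40, and the old "return" then ended the whole test. The seed and trial count here are arbitrary choices for the estimate.

    import random

    gen = random.Random(12345)
    i, trials = 40, 10000
    bailed = sum(gen.randrange(2 ** (i - 2)) <= gen.randrange(2 ** i)  # old: stop <= start
                 for _ in range(trials))
    print(bailed / float(trials))      # typically around 0.875

    # The fixed test draws start from the smaller range and stop from the
    # larger one, and uses "continue" so the remaining sizes still run:
    start = gen.randrange(2 ** (i - 2))
    stop = gen.randrange(2 ** i)
    if start < stop:
        assert start <= gen.randrange(start, stop) < stop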
files: Lib/test/test_random.py | 12 ++++++------ Misc/ACKS | 1 + Misc/NEWS | 3 +++ 3 files changed, 10 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_random.py b/Lib/test/test_random.py --- a/Lib/test/test_random.py +++ b/Lib/test/test_random.py @@ -194,10 +194,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): @@ -357,10 +357,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -435,6 +435,7 @@ Jonathan Giddy Johannes Gijsbers Michael Gilfix +Julian Gindi Yannick Gingras Matt Giuca Wim Glenn diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -65,6 +65,9 @@ Tests ----- +- Issue #19588: Fixed tests in test_random that were silently skipped most + of the time. Patch by Julian Gindi. + - Issue #19596: Set untestable tests in test_importlib to None to avoid reporting success on empty tests. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:21 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319588=3A_Merge_with_3=2E3?= Message-ID: <3dTctK3KJcz7Ls2@mail.python.org> http://hg.python.org/cpython/rev/28ec217ce510 changeset: 87598:28ec217ce510 parent: 87592:08bcbc210a0f parent: 87597:c65882d79c5f user: Zachary Ware date: Tue Nov 26 14:54:21 2013 -0600 summary: Issue #19588: Merge with 3.3 files: Lib/test/test_random.py | 12 ++++++------ Misc/ACKS | 1 + Misc/NEWS | 6 ++++++ 3 files changed, 13 insertions(+), 6 deletions(-) diff --git a/Lib/test/test_random.py b/Lib/test/test_random.py --- a/Lib/test/test_random.py +++ b/Lib/test/test_random.py @@ -240,10 +240,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): @@ -432,10 +432,10 @@ def test_bigrand_ranges(self): for i in [40,80, 160, 200, 211, 250, 375, 512, 550]: - start = self.gen.randrange(2 ** i) - stop = self.gen.randrange(2 ** (i-2)) + start = self.gen.randrange(2 ** (i-2)) + stop = self.gen.randrange(2 ** i) if stop <= start: - return + continue self.assertTrue(start <= self.gen.randrange(start, stop) < stop) def test_rangelimits(self): diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -449,6 +449,7 @@ Jonathan Giddy Johannes Gijsbers Michael Gilfix +Julian Gindi Yannick Gingras Matt Giuca Wim Glenn diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -21,6 +21,12 @@ - Issue #19545: Avoid chained exceptions while passing stray % to time.strptime(). 
Initial patch by Claudiu Popa. +Tests +----- + +- Issue #19588: Fixed tests in test_random that were silently skipped most + of the time. Patch by Julian Gindi. + What's New in Python 3.4.0 Beta 1? ================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:22 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMi43IC0+IDIuNyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dTctL5n8Gz7LsH@mail.python.org> http://hg.python.org/cpython/rev/cbf30e9aa586 changeset: 87599:cbf30e9aa586 branch: 2.7 parent: 87596:c8e138646be1 parent: 87593:72951ffbdc76 user: Zachary Ware date: Tue Nov 26 14:55:46 2013 -0600 summary: Merge heads files: Lib/test/test_uuid.py | 20 ++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 34 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,6 @@ import unittest from test import test_support +import io import os import uuid @@ -346,6 +347,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.BytesIO(data) + + with test_support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -307,8 +307,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except IOError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -324,6 +324,7 @@ John Fouhy Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Fix test.test_support.bind_port() to not cause an error when Python was -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:24 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAobWVyZ2UgMy4zIC0+IDMuMyk6?= =?utf-8?q?_Merge_heads?= Message-ID: <3dTctN1PRDz7Lsd@mail.python.org> http://hg.python.org/cpython/rev/b8d606c62d68 changeset: 87600:b8d606c62d68 branch: 3.3 parent: 87597:c65882d79c5f parent: 87594:a5c26e57f429 user: Zachary Ware date: Tue Nov 26 14:57:10 2013 -0600 summary: Merge heads files: Lib/test/test_uuid.py | 21 +++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 35 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,7 @@ import unittest +from test import support import builtins +import io import os import uuid @@ -356,6 +358,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.StringIO(data) + + with support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -327,8 +327,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except IOError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -394,6 +394,7 @@ Andrew Francis Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Issue #19545: Avoid chained exceptions while passing stray % to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:25 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_default_-=3E_default?= =?utf-8?q?=29=3A_Merge_heads?= Message-ID: <3dTctP4mV4z7Lqh@mail.python.org> http://hg.python.org/cpython/rev/eb307288ffb8 changeset: 87601:eb307288ffb8 parent: 87598:28ec217ce510 parent: 87595:efd0e6d675ff user: Zachary Ware date: Tue Nov 26 14:57:45 2013 -0600 summary: Merge heads files: Lib/test/test_uuid.py | 21 +++++++++++++++++++++ Lib/uuid.py | 12 ++++++++++-- Misc/ACKS | 1 + Misc/NEWS | 3 +++ 4 files changed, 35 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -1,5 +1,7 @@ import unittest +from test import support import builtins +import io import os import uuid @@ -356,6 +358,25 @@ self.assertEqual(node1, node2) + def test_find_mac(self): + data = '''\ + +fake hwaddr +cscotun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 +eth0 Link encap:Ethernet HWaddr 12:34:56:78:90:ab +''' + def mock_popen(cmd): + return io.StringIO(data) + + with support.swap_attr(os, 'popen', mock_popen): + mac = uuid._find_mac( + command='ifconfig', + args='', + hw_identifiers=['hwaddr'], + get_index=lambda x: x + 1, + ) + self.assertEqual(mac, 0x1234567890ab) + @unittest.skipUnless(importable('ctypes'), 'requires ctypes') def test_uuid1(self): equal = self.assertEqual diff --git a/Lib/uuid.py b/Lib/uuid.py --- a/Lib/uuid.py +++ b/Lib/uuid.py @@ -327,8 +327,16 @@ words = line.lower().split() for i in range(len(words)): if words[i] in hw_identifiers: - return int( - words[get_index(i)].replace(':', ''), 16) + try: + return int( + words[get_index(i)].replace(':', ''), 16) + except (ValueError, IndexError): + # Virtual interfaces, such as those provided by + # VPNs, do not have a colon-delimited MAC address + # as expected, but a 16-byte HWAddr separated by + # dashes. These should be ignored in favor of a + # real MAC address + pass except OSError: continue return None diff --git a/Misc/ACKS b/Misc/ACKS --- a/Misc/ACKS +++ b/Misc/ACKS @@ -407,6 +407,7 @@ Andrew Francis Stefan Franke Martin Franklin +Kent Frazier Bruce Frederiksen Robin Friedrich Bradley Froehle diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -16,6 +16,9 @@ Library ------- +- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with + virtual interface. Original patch by Kent Frazier. + - Issue #11489: JSON decoder now accepts lone surrogates. 
- Issue #19545: Avoid chained exceptions while passing stray % to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 21:59:27 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 21:59:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Merge_with_3=2E3?= Message-ID: <3dTctR0YxCz7LqR@mail.python.org> http://hg.python.org/cpython/rev/dc1eeca2dfb0 changeset: 87602:dc1eeca2dfb0 parent: 87601:eb307288ffb8 parent: 87600:b8d606c62d68 user: Zachary Ware date: Tue Nov 26 14:58:10 2013 -0600 summary: Merge with 3.3 files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 23:35:03 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 23:35:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Nzg4?= =?utf-8?q?=3A_kill=5Fpython=28=5Fd=29=2Eexe_is_now_run_as_a_PreBuildEvent?= =?utf-8?q?_on_the?= Message-ID: <3dTg0l2G8Yz7LvJ@mail.python.org> http://hg.python.org/cpython/rev/ca600150205a changeset: 87603:ca600150205a branch: 3.3 parent: 87600:b8d606c62d68 user: Zachary Ware date: Tue Nov 26 16:32:59 2013 -0600 summary: Issue #19788: kill_python(_d).exe is now run as a PreBuildEvent on the pythoncore sub-project. This should prevent build errors due a previous build's python(_d).exe still running. files: Misc/NEWS | 4 + PCbuild/pythoncore.vcxproj | 48 ++++++++++++++++++++++ Tools/buildbot/build-amd64.bat | 3 +- Tools/buildbot/build.bat | 3 +- 4 files changed, 54 insertions(+), 4 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -92,6 +92,10 @@ Build ----- +- Issue #19788: kill_python(_d).exe is now run as a PreBuildEvent on the + pythoncore sub-project. This should prevent build errors due a previous + build's python(_d).exe still running. + - Add workaround for VS 2010 nmake clean issue. VS 2010 doesn't set up PATH for nmake.exe correctly. diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -181,6 +181,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -206,6 +212,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -231,6 +243,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -259,6 +277,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -282,6 +306,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -308,6 +338,12 @@ 0x1e000000 MachineX64 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -331,6 +367,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -357,6 +399,12 @@ 0x1e000000 MachineX64 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... 
+ diff --git a/Tools/buildbot/build-amd64.bat b/Tools/buildbot/build-amd64.bat --- a/Tools/buildbot/build-amd64.bat +++ b/Tools/buildbot/build-amd64.bat @@ -2,6 +2,5 @@ cmd /c Tools\buildbot\external-amd64.bat call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 cmd /c Tools\buildbot\clean-amd64.bat -msbuild PCbuild\kill_python.vcxproj /p:Configuration=Debug /p:PlatformTarget=x64 -PCbuild\amd64\kill_python_d.exe + msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=x64 diff --git a/Tools/buildbot/build.bat b/Tools/buildbot/build.bat --- a/Tools/buildbot/build.bat +++ b/Tools/buildbot/build.bat @@ -2,7 +2,6 @@ cmd /c Tools\buildbot\external.bat call "%VS100COMNTOOLS%vsvars32.bat" cmd /c Tools\buildbot\clean.bat -msbuild PCbuild\kill_python.vcxproj /p:Configuration=Debug /p:PlatformTarget=x86 -PCbuild\kill_python_d.exe + msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=Win32 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Tue Nov 26 23:35:04 2013 From: python-checkins at python.org (zach.ware) Date: Tue, 26 Nov 2013 23:35:04 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogSXNzdWUgIzE5Nzg4OiBraWxsX3B5dGhvbihfZCkuZXhlIGlzIG5vdyBy?= =?utf-8?q?un_as_a_PreBuildEvent_on_the?= Message-ID: <3dTg0m5dDbz7Lts@mail.python.org> http://hg.python.org/cpython/rev/2c1e041cb504 changeset: 87604:2c1e041cb504 parent: 87602:dc1eeca2dfb0 parent: 87603:ca600150205a user: Zachary Ware date: Tue Nov 26 16:34:45 2013 -0600 summary: Issue #19788: kill_python(_d).exe is now run as a PreBuildEvent on the pythoncore sub-project. This should prevent build errors due a previous build's python(_d).exe still running. files: Misc/NEWS | 6 ++ PCbuild/pythoncore.vcxproj | 48 ++++++++++++++++++++++ Tools/buildbot/build-amd64.bat | 3 +- Tools/buildbot/build.bat | 3 +- 4 files changed, 56 insertions(+), 4 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -30,6 +30,12 @@ - Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. +Build +----- + +- Issue #19788: kill_python(_d).exe is now run as a PreBuildEvent on the + pythoncore sub-project. This should prevent build errors due a previous + build's python(_d).exe still running. What's New in Python 3.4.0 Beta 1? ================================== diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -181,6 +181,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -206,6 +212,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -231,6 +243,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -259,6 +277,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -282,6 +306,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -308,6 +338,12 @@ 0x1e000000 MachineX64 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + @@ -331,6 +367,12 @@ libc;%(IgnoreSpecificDefaultLibraries) 0x1e000000 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... 
+ @@ -357,6 +399,12 @@ 0x1e000000 MachineX64 + + $(KillPythonExe) + + + Killing any running $(PythonExe) instances... + diff --git a/Tools/buildbot/build-amd64.bat b/Tools/buildbot/build-amd64.bat --- a/Tools/buildbot/build-amd64.bat +++ b/Tools/buildbot/build-amd64.bat @@ -2,6 +2,5 @@ cmd /c Tools\buildbot\external-amd64.bat call "%VS100COMNTOOLS%\..\..\VC\vcvarsall.bat" x86_amd64 cmd /c Tools\buildbot\clean-amd64.bat -msbuild PCbuild\kill_python.vcxproj /p:Configuration=Debug /p:PlatformTarget=x64 -PCbuild\amd64\kill_python_d.exe + msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=x64 diff --git a/Tools/buildbot/build.bat b/Tools/buildbot/build.bat --- a/Tools/buildbot/build.bat +++ b/Tools/buildbot/build.bat @@ -2,7 +2,6 @@ cmd /c Tools\buildbot\external.bat call "%VS100COMNTOOLS%vsvars32.bat" cmd /c Tools\buildbot\clean.bat -msbuild PCbuild\kill_python.vcxproj /p:Configuration=Debug /p:PlatformTarget=x86 -PCbuild\kill_python_d.exe + msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=Win32 -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 02:24:13 2013 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 27 Nov 2013 02:24:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_fix_format_spe?= =?utf-8?q?c_recursive_expansion_=28closes_=2319729=29?= Message-ID: <3dTklx3YgVz7Ljg@mail.python.org> http://hg.python.org/cpython/rev/bab7dc2ffc16 changeset: 87605:bab7dc2ffc16 branch: 3.3 parent: 87603:ca600150205a user: Benjamin Peterson date: Tue Nov 26 19:22:36 2013 -0600 summary: fix format spec recursive expansion (closes #19729) files: Lib/test/test_unicode.py | 1 + Misc/NEWS | 2 ++ Objects/stringlib/unicode_format.h | 6 ++++-- 3 files changed, 7 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -955,6 +955,7 @@ '') self.assertEqual("{[{}]}".format({"{}": 5}), "5") + self.assertEqual("0x{:0{:d}X}".format(0x0,16), "0x0000000000000000") def test_format_map(self): self.assertEqual(''.format_map({}), '') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #19729: In str.format(), fix recursive expansion in format spec. + - Issue #19638: Fix possible crash / undefined behaviour from huge (more than 2 billion characters) input strings in _Py_dg_strtod. diff --git a/Objects/stringlib/unicode_format.h b/Objects/stringlib/unicode_format.h --- a/Objects/stringlib/unicode_format.h +++ b/Objects/stringlib/unicode_format.h @@ -727,8 +727,10 @@ while (self->str.start < self->str.end) { switch (c = PyUnicode_READ_CHAR(self->str.str, self->str.start++)) { case ':': - hit_format_spec = 1; - count = 1; + if (!hit_format_spec) { + count = 1; + hit_format_spec = 1; + } break; case '{': /* the format spec needs to be recursively expanded. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 02:24:14 2013 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 27 Nov 2013 02:24:14 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy4zICgjMTk3Mjkp?= Message-ID: <3dTkly5YBtz7Ll9@mail.python.org> http://hg.python.org/cpython/rev/e27684eed3b6 changeset: 87606:e27684eed3b6 parent: 87604:2c1e041cb504 parent: 87605:bab7dc2ffc16 user: Benjamin Peterson date: Tue Nov 26 19:24:01 2013 -0600 summary: merge 3.3 (#19729) files: Lib/test/test_unicode.py | 2 ++ Misc/NEWS | 2 ++ 2 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -973,6 +973,8 @@ self.assertRaises(ValueError, "{a{b}".format, 42) self.assertRaises(ValueError, "{[}".format, 42) + self.assertEqual("0x{:0{:d}X}".format(0x0,16), "0x0000000000000000") + def test_format_map(self): self.assertEqual(''.format_map({}), '') self.assertEqual('a'.format_map({}), 'a') diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -10,6 +10,8 @@ Core and Builtins ----------------- +- Issue #19729: In str.format(), fix recursive expansion in format spec. + - Issue #19638: Fix possible crash / undefined behaviour from huge (more than 2 billion characters) input strings in _Py_dg_strtod. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 06:06:27 2013 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 27 Nov 2013 06:06:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_recommend_Orde?= =?utf-8?q?redDict_for_this_FAQ_=28closes_=2319805=29?= Message-ID: <3dTqhM3MVMz7Lk0@mail.python.org> http://hg.python.org/cpython/rev/22e514e41fac changeset: 87607:22e514e41fac branch: 3.3 parent: 87605:bab7dc2ffc16 user: Benjamin Peterson date: Tue Nov 26 23:05:25 2013 -0600 summary: recommend OrderedDict for this FAQ (closes #19805) files: Doc/faq/programming.rst | 30 ++-------------------------- 1 files changed, 3 insertions(+), 27 deletions(-) diff --git a/Doc/faq/programming.rst b/Doc/faq/programming.rst --- a/Doc/faq/programming.rst +++ b/Doc/faq/programming.rst @@ -1193,34 +1193,10 @@ Dictionaries ============ -How can I get a dictionary to display its keys in a consistent order? ---------------------------------------------------------------------- +How can I get a dictionary to store and display its keys in a consistent order? +------------------------------------------------------------------------------- -You can't. Dictionaries store their keys in an unpredictable order, so the -display order of a dictionary's elements will be similarly unpredictable. - -This can be frustrating if you want to save a printable version to a file, make -some changes and then compare it with some other printed dictionary. In this -case, use the ``pprint`` module to pretty-print the dictionary; the items will -be presented in order sorted by the key. - -A more complicated solution is to subclass ``dict`` to create a -``SortedDict`` class that prints itself in a predictable order. 
Here's one -simpleminded implementation of such a class:: - - class SortedDict(dict): - def __repr__(self): - keys = sorted(self.keys()) - result = ("{!r}: {!r}".format(k, self[k]) for k in keys) - return "{{{}}}".format(", ".join(result)) - - __str__ = __repr__ - -This will work for many common situations you might encounter, though it's far -from a perfect solution. The largest flaw is that if some values in the -dictionary are also dictionaries, their values won't be presented in any -particular order. - +Use :class:`collections.OrderedDict`. I want to do a complicated sort: can you do a Schwartzian Transform in Python? ------------------------------------------------------------------------------ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 06:06:28 2013 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 27 Nov 2013 06:06:28 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogbWVyZ2UgMy4zICgjMTk4MDUp?= Message-ID: <3dTqhN576Fz7Lkt@mail.python.org> http://hg.python.org/cpython/rev/ddbcb86afbdc changeset: 87608:ddbcb86afbdc parent: 87606:e27684eed3b6 parent: 87607:22e514e41fac user: Benjamin Peterson date: Tue Nov 26 23:05:37 2013 -0600 summary: merge 3.3 (#19805) files: Doc/faq/programming.rst | 30 ++-------------------------- 1 files changed, 3 insertions(+), 27 deletions(-) diff --git a/Doc/faq/programming.rst b/Doc/faq/programming.rst --- a/Doc/faq/programming.rst +++ b/Doc/faq/programming.rst @@ -1193,34 +1193,10 @@ Dictionaries ============ -How can I get a dictionary to display its keys in a consistent order? ---------------------------------------------------------------------- +How can I get a dictionary to store and display its keys in a consistent order? +------------------------------------------------------------------------------- -You can't. Dictionaries store their keys in an unpredictable order, so the -display order of a dictionary's elements will be similarly unpredictable. - -This can be frustrating if you want to save a printable version to a file, make -some changes and then compare it with some other printed dictionary. In this -case, use the ``pprint`` module to pretty-print the dictionary; the items will -be presented in order sorted by the key. - -A more complicated solution is to subclass ``dict`` to create a -``SortedDict`` class that prints itself in a predictable order. Here's one -simpleminded implementation of such a class:: - - class SortedDict(dict): - def __repr__(self): - keys = sorted(self.keys()) - result = ("{!r}: {!r}".format(k, self[k]) for k in keys) - return "{{{}}}".format(", ".join(result)) - - __str__ = __repr__ - -This will work for many common situations you might encounter, though it's far -from a perfect solution. The largest flaw is that if some values in the -dictionary are also dictionaries, their values won't be presented in any -particular order. - +Use :class:`collections.OrderedDict`. I want to do a complicated sort: can you do a Schwartzian Transform in Python? 
------------------------------------------------------------------------------ -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Wed Nov 27 07:45:48 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Wed, 27 Nov 2013 07:45:48 +0100 Subject: [Python-checkins] Daily reference leaks (e27684eed3b6): sum=4 Message-ID: results for e27684eed3b6 on branch "default" -------------------------------------------- test_site leaked [0, 0, 2] references, sum=2 test_site leaked [0, 0, 2] memory blocks, sum=2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogQwLvSf', '-x'] From python-checkins at python.org Wed Nov 27 07:58:47 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 27 Nov 2013 07:58:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogU2tpcCB0ZXN0X2Zp?= =?utf-8?q?nd=5Fmac_on_Windows_=28issue_=2319804=29=2E?= Message-ID: <3dTt9z4f6sz7Lk2@mail.python.org> http://hg.python.org/cpython/rev/e5b7f140ae30 changeset: 87609:e5b7f140ae30 branch: 2.7 parent: 87599:cbf30e9aa586 user: Serhiy Storchaka date: Wed Nov 27 08:57:33 2013 +0200 summary: Skip test_find_mac on Windows (issue #19804). This test requires the ifconfig executable on $PATH, /sbin/, or /usr/sbin. files: Lib/test/test_uuid.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -347,6 +347,7 @@ self.assertEqual(node1, node2) + @unittest.skipUnless(os.name == 'posix', 'requires Posix') def test_find_mac(self): data = '''\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 07:58:48 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 27 Nov 2013 07:58:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogU2tpcCB0ZXN0X2Zp?= =?utf-8?q?nd=5Fmac_on_Windows_=28issue_=2319804=29=2E?= Message-ID: <3dTtB06RYmz7Ll5@mail.python.org> http://hg.python.org/cpython/rev/dfadce7635ce changeset: 87610:dfadce7635ce branch: 3.3 parent: 87607:22e514e41fac user: Serhiy Storchaka date: Wed Nov 27 08:57:51 2013 +0200 summary: Skip test_find_mac on Windows (issue #19804). This test requires the ifconfig executable on $PATH, /sbin/, or /usr/sbin. files: Lib/test/test_uuid.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -358,6 +358,7 @@ self.assertEqual(node1, node2) + @unittest.skipUnless(os.name == 'posix', 'requires Posix') def test_find_mac(self): data = '''\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 07:58:50 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Wed, 27 Nov 2013 07:58:50 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Skip_test=5Ffind=5Fmac_on_Windows_=28issue_=2319804=29?= =?utf-8?q?=2E?= Message-ID: <3dTtB211C9z7LlQ@mail.python.org> http://hg.python.org/cpython/rev/4d5c3cb08170 changeset: 87611:4d5c3cb08170 parent: 87608:ddbcb86afbdc parent: 87610:dfadce7635ce user: Serhiy Storchaka date: Wed Nov 27 08:58:13 2013 +0200 summary: Skip test_find_mac on Windows (issue #19804). This test requires the ifconfig executable on $PATH, /sbin/, or /usr/sbin. 
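The decorator added in these three changesets is the stock unittest idiom for platform-dependent tests. A small self-contained sketch of the same guard follows; the test body is illustrative only (it parses literal strings rather than calling the private uuid._find_mac(), and the replace() call simply mirrors the line in the uuid patch merged earlier):

    import os
    import unittest


    class PlatformDependentTests(unittest.TestCase):

        @unittest.skipUnless(os.name == 'posix', 'requires Posix')
        def test_parse_ifconfig_style_hwaddr(self):
            # A colon-delimited MAC, as ifconfig prints for a real interface,
            # converts cleanly to the integer uuid.getnode() should return.
            self.assertEqual(int('12:34:56:78:90:ab'.replace(':', ''), 16),
                             0x1234567890ab)
            # The dash-separated HWAddr shown by some virtual (VPN)
            # interfaces does not convert (replace() is a no-op here), which
            # is the failure the try/except added to uuid._find_mac() in the
            # merges above is meant to swallow.
            with self.assertRaises(ValueError):
                int('00-00-00-00-00-00-00-00'.replace(':', ''), 16)


    if __name__ == '__main__':
        unittest.main()

Skipped tests are reported as skips in the test output, unlike the silent early return fixed in the test_random change above.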
files: Lib/test/test_uuid.py | 1 + 1 files changed, 1 insertions(+), 0 deletions(-) diff --git a/Lib/test/test_uuid.py b/Lib/test/test_uuid.py --- a/Lib/test/test_uuid.py +++ b/Lib/test/test_uuid.py @@ -358,6 +358,7 @@ self.assertEqual(node1, node2) + @unittest.skipUnless(os.name == 'posix', 'requires Posix') def test_find_mac(self): data = '''\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 11:27:06 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Wed, 27 Nov 2013 11:27:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Combine_the_FastCall_funct?= =?utf-8?q?ions_in_cpickle=2E?= Message-ID: <3dTypL2TVmz7Lmp@mail.python.org> http://hg.python.org/cpython/rev/23459df0753e changeset: 87612:23459df0753e user: Alexandre Vassalotti date: Wed Nov 27 02:26:54 2013 -0800 summary: Combine the FastCall functions in cpickle. I fixed the bug that was in my previous attempt of this cleanup. I ran the full test suite to verify I didn't introduce any obvious bugs. files: Modules/_pickle.c | 136 +++++++++++----------------------- 1 files changed, 44 insertions(+), 92 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -346,7 +346,6 @@ pickling. */ PyObject *pers_func; /* persistent_id() method, can be NULL */ PyObject *dispatch_table; /* private dispatch_table, can be NULL */ - PyObject *arg; PyObject *write; /* write() method of the output stream. */ PyObject *output_buffer; /* Write into a local bytearray buffer before @@ -383,7 +382,6 @@ Py_ssize_t memo_size; /* Capacity of the memo array */ Py_ssize_t memo_len; /* Number of objects in the memo */ - PyObject *arg; PyObject *pers_func; /* persistent_load() method, can be NULL. */ Py_buffer buffer; @@ -639,57 +637,37 @@ /*************************************************************************/ -/* Helpers for creating the argument tuple passed to functions. This has the - performance advantage of calling PyTuple_New() only once. - - XXX(avassalotti): Inline directly in _Pickler_FastCall() and - _Unpickler_FastCall(). */ -#define ARG_TUP(self, obj) do { \ - if ((self)->arg || ((self)->arg=PyTuple_New(1))) { \ - Py_XDECREF(PyTuple_GET_ITEM((self)->arg, 0)); \ - PyTuple_SET_ITEM((self)->arg, 0, (obj)); \ - } \ - else { \ - Py_DECREF((obj)); \ - } \ - } while (0) - -#define FREE_ARG_TUP(self) do { \ - if ((self)->arg->ob_refcnt > 1) \ - Py_CLEAR((self)->arg); \ - } while (0) - -/* A temporary cleaner API for fast single argument function call. - - XXX: Does caching the argument tuple provides any real performance benefits? - - A quick benchmark, on a 2.0GHz Athlon64 3200+ running Linux 2.6.24 with - glibc 2.7, tells me that it takes roughly 20,000,000 PyTuple_New(1) calls - when the tuple is retrieved from the freelist (i.e, call PyTuple_New() then - immediately DECREF it) and 1,200,000 calls when allocating brand new tuples - (i.e, call PyTuple_New() and store the returned value in an array), to save - one second (wall clock time). Either ways, the loading time a pickle stream - large enough to generate this number of calls would be massively - overwhelmed by other factors, like I/O throughput, the GC traversal and - object allocation overhead. So, I really doubt these functions provide any - real benefits. - - On the other hand, oprofile reports that pickle spends a lot of time in - these functions. But, that is probably more related to the function call - overhead, than the argument tuple allocation. 
- - XXX: And, what is the reference behavior of these? Steal, borrow? At first - glance, it seems to steal the reference of 'arg' and borrow the reference - of 'func'. */ +/* Helper for calling a function with a single argument quickly. + + This has the performance advantage of reusing the argument tuple. This + provides a nice performance boost with older pickle protocols where many + unbuffered reads occurs. + + This function steals the reference of the given argument. */ static PyObject * -_Pickler_FastCall(PicklerObject *self, PyObject *func, PyObject *arg) -{ - PyObject *result = NULL; - - ARG_TUP(self, arg); - if (self->arg) { - result = PyObject_Call(func, self->arg, NULL); - FREE_ARG_TUP(self); +_Pickle_FastCall(PyObject *func, PyObject *obj) +{ + static PyObject *arg_tuple = NULL; + PyObject *result; + + if (arg_tuple == NULL) { + arg_tuple = PyTuple_New(1); + if (arg_tuple == NULL) { + Py_DECREF(obj); + return NULL; + } + } + assert(arg_tuple->ob_refcnt == 1); + assert(PyTuple_GET_ITEM(arg_tuple, 0) == NULL); + + PyTuple_SET_ITEM(arg_tuple, 0, obj); + result = PyObject_Call(func, arg_tuple, NULL); + + Py_CLEAR(PyTuple_GET_ITEM(arg_tuple, 0)); + if (arg_tuple->ob_refcnt > 1) { + /* The function called saved a reference to our argument tuple. + This means we cannot reuse it anymore. */ + Py_CLEAR(arg_tuple); } return result; } @@ -787,7 +765,7 @@ if (output == NULL) return -1; - result = _Pickler_FastCall(self, self->write, output); + result = _Pickle_FastCall(self->write, output); Py_XDECREF(result); return (result == NULL) ? -1 : 0; } @@ -853,7 +831,6 @@ self->pers_func = NULL; self->dispatch_table = NULL; - self->arg = NULL; self->write = NULL; self->proto = 0; self->bin = 0; @@ -922,20 +899,6 @@ return 0; } -/* See documentation for _Pickler_FastCall(). */ -static PyObject * -_Unpickler_FastCall(UnpicklerObject *self, PyObject *func, PyObject *arg) -{ - PyObject *result = NULL; - - ARG_TUP(self, arg); - if (self->arg) { - result = PyObject_Call(func, self->arg, NULL); - FREE_ARG_TUP(self); - } - return result; -} - /* Returns the size of the input on success, -1 on failure. This takes its own reference to `input`. 
*/ static Py_ssize_t @@ -1006,7 +969,7 @@ PyObject *len = PyLong_FromSsize_t(n); if (len == NULL) return -1; - data = _Unpickler_FastCall(self, self->read, len); + data = _Pickle_FastCall(self->read, len); } if (data == NULL) return -1; @@ -1019,7 +982,7 @@ Py_DECREF(data); return -1; } - prefetched = _Unpickler_FastCall(self, self->peek, len); + prefetched = _Pickle_FastCall(self->peek, len); if (prefetched == NULL) { if (PyErr_ExceptionMatches(PyExc_NotImplementedError)) { /* peek() is probably not supported by the given file object */ @@ -1229,7 +1192,6 @@ if (self == NULL) return NULL; - self->arg = NULL; self->pers_func = NULL; self->input_buffer = NULL; self->input_line = NULL; @@ -3172,7 +3134,7 @@ const char binpersid_op = BINPERSID; Py_INCREF(obj); - pid = _Pickler_FastCall(self, func, obj); + pid = _Pickle_FastCall(func, obj); if (pid == NULL) return -1; @@ -3600,7 +3562,7 @@ } if (reduce_func != NULL) { Py_INCREF(obj); - reduce_value = _Pickler_FastCall(self, reduce_func, obj); + reduce_value = _Pickle_FastCall(reduce_func, obj); } else if (PyType_IsSubtype(type, &PyType_Type)) { status = save_global(self, obj, NULL); @@ -3625,7 +3587,7 @@ PyObject *proto; proto = PyLong_FromLong(self->proto); if (proto != NULL) { - reduce_value = _Pickler_FastCall(self, reduce_func, proto); + reduce_value = _Pickle_FastCall(reduce_func, proto); } } else { @@ -3794,7 +3756,6 @@ Py_XDECREF(self->write); Py_XDECREF(self->pers_func); Py_XDECREF(self->dispatch_table); - Py_XDECREF(self->arg); Py_XDECREF(self->fast_memo); PyMemoTable_Del(self->memo); @@ -3808,7 +3769,6 @@ Py_VISIT(self->write); Py_VISIT(self->pers_func); Py_VISIT(self->dispatch_table); - Py_VISIT(self->arg); Py_VISIT(self->fast_memo); return 0; } @@ -3820,7 +3780,6 @@ Py_CLEAR(self->write); Py_CLEAR(self->pers_func); Py_CLEAR(self->dispatch_table); - Py_CLEAR(self->arg); Py_CLEAR(self->fast_memo); if (self->memo != NULL) { @@ -3943,7 +3902,6 @@ return NULL; } - self->arg = NULL; self->fast = 0; self->fast_nesting = 0; self->fast_memo = NULL; @@ -5208,9 +5166,9 @@ if (pid == NULL) return -1; - /* Ugh... this does not leak since _Unpickler_FastCall() steals the - reference to pid first. */ - pid = _Unpickler_FastCall(self, self->pers_func, pid); + /* This does not leak since _Pickle_FastCall() steals the reference + to pid first. */ + pid = _Pickle_FastCall(self->pers_func, pid); if (pid == NULL) return -1; @@ -5235,9 +5193,9 @@ if (pid == NULL) return -1; - /* Ugh... this does not leak since _Unpickler_FastCall() steals the + /* This does not leak since _Pickle_FastCall() steals the reference to pid first. */ - pid = _Unpickler_FastCall(self, self->pers_func, pid); + pid = _Pickle_FastCall(self->pers_func, pid); if (pid == NULL) return -1; @@ -5585,7 +5543,7 @@ PyObject *result; value = self->stack->data[i]; - result = _Unpickler_FastCall(self, append_func, value); + result = _Pickle_FastCall(append_func, value); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = x; @@ -5700,7 +5658,7 @@ PyObject *item; item = self->stack->data[i]; - result = _Unpickler_FastCall(self, add_func, item); + result = _Pickle_FastCall(add_func, item); if (result == NULL) { Pdata_clear(self->stack, i + 1); Py_SIZE(self->stack) = mark; @@ -5747,9 +5705,7 @@ PyObject *result; /* The explicit __setstate__ is responsible for everything. */ - /* Ugh... this does not leak since _Unpickler_FastCall() steals the - reference to state first. 
*/ - result = _Unpickler_FastCall(self, setstate, state); + result = _Pickle_FastCall(setstate, state); Py_DECREF(setstate); if (result == NULL) return -1; @@ -6249,7 +6205,6 @@ Py_XDECREF(self->peek); Py_XDECREF(self->stack); Py_XDECREF(self->pers_func); - Py_XDECREF(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6272,7 +6227,6 @@ Py_VISIT(self->peek); Py_VISIT(self->stack); Py_VISIT(self->pers_func); - Py_VISIT(self->arg); return 0; } @@ -6284,7 +6238,6 @@ Py_CLEAR(self->peek); Py_CLEAR(self->stack); Py_CLEAR(self->pers_func); - Py_CLEAR(self->arg); if (self->buffer.buf != NULL) { PyBuffer_Release(&self->buffer); self->buffer.buf = NULL; @@ -6423,7 +6376,6 @@ if (self->memo == NULL) return NULL; - self->arg = NULL; self->proto = 0; return Py_None; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 16:19:45 2013 From: python-checkins at python.org (benjamin.peterson) Date: Wed, 27 Nov 2013 16:19:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_add_SO=5FPRIORITY_=28close?= =?utf-8?b?cyAjMTk4MDIp?= Message-ID: <3dV5J13x8Gz7LqZ@mail.python.org> http://hg.python.org/cpython/rev/9bbada125b3f changeset: 87613:9bbada125b3f user: Benjamin Peterson date: Wed Nov 27 09:18:54 2013 -0600 summary: add SO_PRIORITY (closes #19802) Patch by Claudiu Popa. files: Misc/NEWS | 2 ++ Modules/socketmodule.c | 3 +++ 2 files changed, 5 insertions(+), 0 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,8 @@ Library ------- +- Issue #19802: Add socket.SO_PRIORITY. + - Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -6241,6 +6241,9 @@ #ifdef SO_BINDTODEVICE PyModule_AddIntMacro(m, SO_BINDTODEVICE); #endif +#ifdef SO_PRIORITY + PyModule_AddIntMacro(m, SO_PRIORITY); +#endif /* Maximum number of connections for "listen" */ #ifdef SOMAXCONN -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 19:22:47 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 27 Nov 2013 19:22:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Remove_question_to_myself_abo?= =?utf-8?q?ut_set=5Fchild=5Fwatcher=28=29_now_the_issue_is_fixed=2E?= Message-ID: <3dV9MC2Yrwz7Llh@mail.python.org> http://hg.python.org/peps/rev/ee7c8a3b3b6a changeset: 5324:ee7c8a3b3b6a user: Guido van Rossum date: Wed Nov 27 10:22:42 2013 -0800 summary: Remove question to myself about set_child_watcher() now the issue is fixed. files: pep-3156.txt | 1 - 1 files changed, 0 insertions(+), 1 deletions(-) diff --git a/pep-3156.txt b/pep-3156.txt --- a/pep-3156.txt +++ b/pep-3156.txt @@ -1863,7 +1863,6 @@ watcher = asyncio.FastChildWatcher() asyncio.set_child_watcher(watcher) - # TBD: Is a call to watcher.attach_loop() needed? 
Wish List -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Wed Nov 27 19:37:21 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 27 Nov 2013 19:37:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Fix_get=5Fevent?= =?utf-8?q?=5Floop=28=29_to_call_set=5Fevent=5Floop=28=29_when_setting_the?= =?utf-8?q?_loop=2E?= Message-ID: <3dV9h11PjxzS9H@mail.python.org> http://hg.python.org/cpython/rev/5c9af8194d3b changeset: 87614:5c9af8194d3b user: Guido van Rossum date: Wed Nov 27 10:37:13 2013 -0800 summary: asyncio: Fix get_event_loop() to call set_event_loop() when setting the loop. By Anthony Baire. files: Lib/asyncio/events.py | 2 +- Lib/test/test_asyncio/test_events.py | 16 ++++++++++++++++ 2 files changed, 17 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/events.py b/Lib/asyncio/events.py --- a/Lib/asyncio/events.py +++ b/Lib/asyncio/events.py @@ -360,7 +360,7 @@ if (self._local._loop is None and not self._local._set_called and isinstance(threading.current_thread(), threading._MainThread)): - self._local._loop = self.new_event_loop() + self.set_event_loop(self.new_event_loop()) assert self._local._loop is not None, \ ('There is no current event loop in thread %r.' % threading.current_thread().name) diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -1599,6 +1599,22 @@ self.assertIs(loop, policy.get_event_loop()) loop.close() + def test_get_event_loop_calls_set_event_loop(self): + policy = self.create_policy() + + with unittest.mock.patch.object( + policy, "set_event_loop", + wraps=policy.set_event_loop) as m_set_event_loop: + + loop = policy.get_event_loop() + + # policy._local._loop must be set through .set_event_loop() + # (the unix DefaultEventLoopPolicy needs this call to attach + # the child watcher correctly) + m_set_event_loop.assert_called_with(loop) + + loop.close() + def test_get_event_loop_after_set_none(self): policy = self.create_policy() policy.set_event_loop(None) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 21:40:12 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 27 Nov 2013 21:40:12 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Close_=2319798=3A_replace_?= =?utf-8?q?=22maximum=22_term_with_=22peak=22_in_get=5Ftraced=5Fmemory=28?= =?utf-8?q?=29?= Message-ID: <3dVDPm0n6gzRnh@mail.python.org> http://hg.python.org/cpython/rev/553144bd7bf1 changeset: 87615:553144bd7bf1 user: Victor Stinner date: Wed Nov 27 21:39:49 2013 +0100 summary: Close #19798: replace "maximum" term with "peak" in get_traced_memory() documentation. Use also the term "current" for the current size. files: Doc/library/tracemalloc.rst | 4 ++-- Lib/test/test_tracemalloc.py | 14 +++++++------- Modules/_tracemalloc.c | 24 ++++++++++++------------ 3 files changed, 21 insertions(+), 21 deletions(-) diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst --- a/Doc/library/tracemalloc.rst +++ b/Doc/library/tracemalloc.rst @@ -276,8 +276,8 @@ .. function:: get_traced_memory() - Get the current size and maximum size of memory blocks traced by the - :mod:`tracemalloc` module as a tuple: ``(size: int, max_size: int)``. + Get the current size and peak size of memory blocks traced by the + :mod:`tracemalloc` module as a tuple: ``(current: int, peak: int)``. .. 
function:: get_tracemalloc_memory() diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -182,19 +182,19 @@ obj_size = 1024 * 1024 tracemalloc.clear_traces() obj, obj_traceback = allocate_bytes(obj_size) - size, max_size = tracemalloc.get_traced_memory() + size, peak_size = tracemalloc.get_traced_memory() self.assertGreaterEqual(size, obj_size) - self.assertGreaterEqual(max_size, size) + self.assertGreaterEqual(peak_size, size) self.assertLessEqual(size - obj_size, max_error) - self.assertLessEqual(max_size - size, max_error) + self.assertLessEqual(peak_size - size, max_error) # destroy the object obj = None - size2, max_size2 = tracemalloc.get_traced_memory() + size2, peak_size2 = tracemalloc.get_traced_memory() self.assertLess(size2, size) self.assertGreaterEqual(size - size2, obj_size - max_error) - self.assertGreaterEqual(max_size2, max_size) + self.assertGreaterEqual(peak_size2, peak_size) # clear_traces() must reset traced memory counters tracemalloc.clear_traces() @@ -202,8 +202,8 @@ # allocate another object obj, obj_traceback = allocate_bytes(obj_size) - size, max_size = tracemalloc.get_traced_memory() - self.assertGreater(size, 0) + size, peak_size = tracemalloc.get_traced_memory() + self.assertGreaterEqual(size, obj_size) # stop() also resets traced memory counters tracemalloc.stop() diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -106,9 +106,9 @@ Protected by TABLES_LOCK(). */ static size_t tracemalloc_traced_memory = 0; -/* Maximum size in bytes of traced memory. +/* Peak size in bytes of traced memory. Protected by TABLES_LOCK(). */ -static size_t tracemalloc_max_traced_memory = 0; +static size_t tracemalloc_peak_traced_memory = 0; /* Hash table used as a set to to intern filenames: PyObject* => PyObject*. 
@@ -464,8 +464,8 @@ if (res == 0) { assert(tracemalloc_traced_memory <= PY_SIZE_MAX - size); tracemalloc_traced_memory += size; - if (tracemalloc_traced_memory > tracemalloc_max_traced_memory) - tracemalloc_max_traced_memory = tracemalloc_traced_memory; + if (tracemalloc_traced_memory > tracemalloc_peak_traced_memory) + tracemalloc_peak_traced_memory = tracemalloc_traced_memory; } TABLES_UNLOCK(); @@ -674,7 +674,7 @@ TABLES_LOCK(); _Py_hashtable_clear(tracemalloc_traces); tracemalloc_traced_memory = 0; - tracemalloc_max_traced_memory = 0; + tracemalloc_peak_traced_memory = 0; TABLES_UNLOCK(); _Py_hashtable_foreach(tracemalloc_tracebacks, traceback_free_traceback, NULL); @@ -1266,26 +1266,26 @@ PyDoc_STRVAR(tracemalloc_get_traced_memory_doc, "get_traced_memory() -> (int, int)\n" "\n" - "Get the current size and maximum size of memory blocks traced\n" - "by the tracemalloc module as a tuple: (size: int, max_size: int)."); + "Get the current size and peak size of memory blocks traced\n" + "by the tracemalloc module as a tuple: (current: int, peak: int)."); static PyObject* tracemalloc_get_traced_memory(PyObject *self) { - Py_ssize_t size, max_size; - PyObject *size_obj, *max_size_obj; + Py_ssize_t size, peak_size; + PyObject *size_obj, *peak_size_obj; if (!tracemalloc_config.tracing) return Py_BuildValue("ii", 0, 0); TABLES_LOCK(); size = tracemalloc_traced_memory; - max_size = tracemalloc_max_traced_memory; + peak_size = tracemalloc_peak_traced_memory; TABLES_UNLOCK(); size_obj = PyLong_FromSize_t(size); - max_size_obj = PyLong_FromSize_t(max_size); - return Py_BuildValue("NN", size_obj, max_size_obj); + peak_size_obj = PyLong_FromSize_t(peak_size); + return Py_BuildValue("NN", size_obj, peak_size_obj); } static PyMethodDef module_methods[] = { -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 22:30:58 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 27 Nov 2013 22:30:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Closes_=2319786=3A_tracema?= =?utf-8?q?lloc=2C_remove_the_arbitrary_limit_of_100_frames?= Message-ID: <3dVFXL08mvz7LmM@mail.python.org> http://hg.python.org/cpython/rev/eead17ba32d8 changeset: 87616:eead17ba32d8 user: Victor Stinner date: Wed Nov 27 22:27:13 2013 +0100 summary: Closes #19786: tracemalloc, remove the arbitrary limit of 100 frames The limit is now 178,956,969 on 64 bit (it is greater on 32 bit because structures are smaller). Use int instead of Py_ssize_t to store the number of frames to have smaller traceback_t objects. 
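To see where the 178,956,969 figure comes from: it falls out of the MAX_NFRAME macro added in the diff that follows. The struct sizes in the sketch below are assumptions for a typical 64-bit build (they are not read from the C source) and are chosen only to reproduce the quoted number; on a 32-bit build the structures are smaller, so the same quotient is larger, as the summary notes. INT_MAX enters because the frame count is now stored in a plain int to keep traceback_t small.

    # Back-of-the-envelope check of the new macro from the diff below:
    #   #define MAX_NFRAME ((INT_MAX - sizeof(traceback_t)) / sizeof(frame_t) + 1)
    INT_MAX = 2 ** 31 - 1          # upper bound of a 32-bit signed int
    SIZEOF_FRAME_T = 12            # assumed frame_t size on a 64-bit build
    SIZEOF_TRACEBACK_T = 24        # assumed traceback_t size on a 64-bit build

    max_nframe = (INT_MAX - SIZEOF_TRACEBACK_T) // SIZEOF_FRAME_T + 1
    print(max_nframe)              # 178956969, the figure quoted in the summary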
files: Lib/test/test_tracemalloc.py | 12 ++-- Modules/_tracemalloc.c | 55 ++++++++++++++--------- 2 files changed, 40 insertions(+), 27 deletions(-) diff --git a/Lib/test/test_tracemalloc.py b/Lib/test/test_tracemalloc.py --- a/Lib/test/test_tracemalloc.py +++ b/Lib/test/test_tracemalloc.py @@ -747,14 +747,14 @@ self.assertEqual(stdout, b'10') def test_env_var_invalid(self): - for nframe in (-1, 0, 5000): + for nframe in (-1, 0, 2**30): with self.subTest(nframe=nframe): with support.SuppressCrashReport(): ok, stdout, stderr = assert_python_failure( '-c', 'pass', PYTHONTRACEMALLOC=str(nframe)) - self.assertIn(b'PYTHONTRACEMALLOC must be an integer ' - b'in range [1; 100]', + self.assertIn(b'PYTHONTRACEMALLOC: invalid ' + b'number of frames', stderr) def test_sys_xoptions(self): @@ -770,13 +770,13 @@ self.assertEqual(stdout, str(nframe).encode('ascii')) def test_sys_xoptions_invalid(self): - for nframe in (-1, 0, 5000): + for nframe in (-1, 0, 2**30): with self.subTest(nframe=nframe): with support.SuppressCrashReport(): args = ('-X', 'tracemalloc=%s' % nframe, '-c', 'pass') ok, stdout, stderr = assert_python_failure(*args) - self.assertIn(b'-X tracemalloc=NFRAME: number of frame must ' - b'be an integer in range [1; 100]', + self.assertIn(b'-X tracemalloc=NFRAME: invalid ' + b'number of frames', stderr) diff --git a/Modules/_tracemalloc.c b/Modules/_tracemalloc.c --- a/Modules/_tracemalloc.c +++ b/Modules/_tracemalloc.c @@ -27,11 +27,6 @@ PyMemAllocator obj; } allocators; -/* Arbitrary limit of the number of frames in a traceback. The value was chosen - to not allocate too much memory on the stack (see TRACEBACK_STACK_SIZE - below). */ -#define MAX_NFRAME 100 - static struct { /* Module initialized? Variable protected by the GIL */ @@ -88,7 +83,9 @@ #define TRACEBACK_SIZE(NFRAME) \ (sizeof(traceback_t) + sizeof(frame_t) * (NFRAME - 1)) -#define TRACEBACK_STACK_SIZE TRACEBACK_SIZE(MAX_NFRAME) + +#define MAX_NFRAME \ + ((INT_MAX - sizeof(traceback_t)) / sizeof(frame_t) + 1) static PyObject *unknown_filename = NULL; static traceback_t tracemalloc_empty_traceback; @@ -115,6 +112,10 @@ Protected by the GIL */ static _Py_hashtable_t *tracemalloc_filenames = NULL; +/* Buffer to store a new traceback in traceback_new(). + Protected by the GIL. 
*/ +static traceback_t *tracemalloc_traceback = NULL; + /* Hash table used as a set to intern tracebacks: traceback_t* => traceback_t* Protected by the GIL */ @@ -394,8 +395,7 @@ static traceback_t * traceback_new(void) { - char stack_buffer[TRACEBACK_STACK_SIZE]; - traceback_t *traceback = (traceback_t *)stack_buffer; + traceback_t *traceback; _Py_hashtable_entry_t *entry; #ifdef WITH_THREAD @@ -403,6 +403,7 @@ #endif /* get frames */ + traceback = tracemalloc_traceback; traceback->nframe = 0; traceback_get_frames(traceback); if (traceback->nframe == 0) @@ -788,9 +789,10 @@ } static int -tracemalloc_start(void) +tracemalloc_start(int max_nframe) { PyMemAllocator alloc; + size_t size; if (tracemalloc_init() < 0) return -1; @@ -803,6 +805,18 @@ if (tracemalloc_atexit_register() < 0) return -1; + assert(1 <= max_nframe && max_nframe <= MAX_NFRAME); + tracemalloc_config.max_nframe = max_nframe; + + /* allocate a buffer to store a new traceback */ + size = TRACEBACK_SIZE(max_nframe); + assert(tracemalloc_traceback == NULL); + tracemalloc_traceback = raw_malloc(size); + if (tracemalloc_traceback == NULL) { + PyErr_NoMemory(); + return -1; + } + #ifdef TRACE_RAW_MALLOC alloc.malloc = tracemalloc_raw_malloc; alloc.realloc = tracemalloc_raw_realloc; @@ -854,9 +868,10 @@ /* release memory */ tracemalloc_clear_traces(); + raw_free(tracemalloc_traceback); + tracemalloc_traceback = NULL; } - static PyObject* lineno_as_obj(int lineno) { @@ -1194,6 +1209,7 @@ py_tracemalloc_start(PyObject *self, PyObject *args) { Py_ssize_t nframe = 1; + int nframe_int; if (!PyArg_ParseTuple(args, "|n:start", &nframe)) return NULL; @@ -1201,12 +1217,12 @@ if (nframe < 1 || nframe > MAX_NFRAME) { PyErr_Format(PyExc_ValueError, "the number of frames must be in range [1; %i]", - MAX_NFRAME); + (int)MAX_NFRAME); return NULL; } - tracemalloc_config.max_nframe = Py_SAFE_DOWNCAST(nframe, Py_ssize_t, int); + nframe_int = Py_SAFE_DOWNCAST(nframe, Py_ssize_t, int); - if (tracemalloc_start() < 0) + if (tracemalloc_start(nframe_int) < 0) return NULL; Py_RETURN_NONE; @@ -1378,16 +1394,15 @@ if ((p = Py_GETENV("PYTHONTRACEMALLOC")) && *p != '\0') { char *endptr = p; - unsigned long value; + long value; - value = strtoul(p, &endptr, 10); + value = strtol(p, &endptr, 10); if (*endptr != '\0' || value < 1 || value > MAX_NFRAME || (errno == ERANGE && value == ULONG_MAX)) { - Py_FatalError("PYTHONTRACEMALLOC must be an integer " - "in range [1; " STR(MAX_NFRAME) "]"); + Py_FatalError("PYTHONTRACEMALLOC: invalid number of frames"); return -1; } @@ -1417,12 +1432,10 @@ nframe = parse_sys_xoptions(value); Py_DECREF(value); if (nframe < 0) { - Py_FatalError("-X tracemalloc=NFRAME: number of frame must be " - "an integer in range [1; " STR(MAX_NFRAME) "]"); + Py_FatalError("-X tracemalloc=NFRAME: invalid number of frames"); } } - tracemalloc_config.max_nframe = nframe; - return tracemalloc_start(); + return tracemalloc_start(nframe); } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 23:12:55 2013 From: python-checkins at python.org (guido.van.rossum) Date: Wed, 27 Nov 2013 23:12:55 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Change_write_bu?= =?utf-8?q?ffer_use_to_avoid_O=28N**2=29=2E_Make_write=28=29/sendto=28=29_?= =?utf-8?q?accept?= Message-ID: <3dVGSl4Tl3zRNM@mail.python.org> http://hg.python.org/cpython/rev/80e0040d910c changeset: 87617:80e0040d910c user: Guido van Rossum date: Wed Nov 27 14:12:48 2013 -0800 summary: asyncio: Change write buffer use to avoid O(N**2). 
Make write()/sendto() accept bytearray/memoryview too. Change some asserts with proper exceptions. files: Lib/asyncio/selector_events.py | 82 ++- Lib/test/test_asyncio/test_selector_events.py | 203 +++++++-- 2 files changed, 207 insertions(+), 78 deletions(-) diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py --- a/Lib/asyncio/selector_events.py +++ b/Lib/asyncio/selector_events.py @@ -340,6 +340,8 @@ max_size = 256 * 1024 # Buffer size passed to recv(). + _buffer_factory = bytearray # Constructs initial value for self._buffer. + def __init__(self, loop, sock, protocol, extra, server=None): super().__init__(extra) self._extra['socket'] = sock @@ -354,7 +356,7 @@ self._sock_fd = sock.fileno() self._protocol = protocol self._server = server - self._buffer = collections.deque() + self._buffer = self._buffer_factory() self._conn_lost = 0 # Set when call to connection_lost scheduled. self._closing = False # Set when close() called. self._protocol_paused = False @@ -433,12 +435,14 @@ high = 4*low if low is None: low = high // 4 - assert 0 <= low <= high, repr((low, high)) + if not high >= low >= 0: + raise ValueError('high (%r) must be >= low (%r) must be >= 0' % + (high, low)) self._high_water = high self._low_water = low def get_write_buffer_size(self): - return sum(len(data) for data in self._buffer) + return len(self._buffer) class _SelectorSocketTransport(_SelectorTransport): @@ -455,13 +459,16 @@ self._loop.call_soon(waiter.set_result, None) def pause_reading(self): - assert not self._closing, 'Cannot pause_reading() when closing' - assert not self._paused, 'Already paused' + if self._closing: + raise RuntimeError('Cannot pause_reading() when closing') + if self._paused: + raise RuntimeError('Already paused') self._paused = True self._loop.remove_reader(self._sock_fd) def resume_reading(self): - assert self._paused, 'Not paused' + if not self._paused: + raise RuntimeError('Not paused') self._paused = False if self._closing: return @@ -488,8 +495,11 @@ self.close() def write(self, data): - assert isinstance(data, bytes), repr(type(data)) - assert not self._eof, 'Cannot call write() after write_eof()' + if not isinstance(data, (bytes, bytearray, memoryview)): + raise TypeError('data argument must be byte-ish (%r)', + type(data)) + if self._eof: + raise RuntimeError('Cannot call write() after write_eof()') if not data: return @@ -516,25 +526,23 @@ self._loop.add_writer(self._sock_fd, self._write_ready) # Add it to the buffer. - self._buffer.append(data) + self._buffer.extend(data) self._maybe_pause_protocol() def _write_ready(self): - data = b''.join(self._buffer) - assert data, 'Data should not be empty' + assert self._buffer, 'Data should not be empty' - self._buffer.clear() # Optimistically; may have to put it back later. try: - n = self._sock.send(data) + n = self._sock.send(self._buffer) except (BlockingIOError, InterruptedError): - self._buffer.append(data) # Still need to write this. + pass except Exception as exc: self._loop.remove_writer(self._sock_fd) + self._buffer.clear() self._fatal_error(exc) else: - data = data[n:] - if data: - self._buffer.append(data) # Still need to write this. + if n: + del self._buffer[:n] self._maybe_resume_protocol() # May append to buffer. 
if not self._buffer: self._loop.remove_writer(self._sock_fd) @@ -556,6 +564,8 @@ class _SelectorSslTransport(_SelectorTransport): + _buffer_factory = bytearray + def __init__(self, loop, rawsock, protocol, sslcontext, waiter=None, server_side=False, server_hostname=None, extra=None, server=None): @@ -661,13 +671,16 @@ # accept more data for the buffer and eventually the app will # call resume_reading() again, and things will flow again. - assert not self._closing, 'Cannot pause_reading() when closing' - assert not self._paused, 'Already paused' + if self._closing: + raise RuntimeError('Cannot pause_reading() when closing') + if self._paused: + raise RuntimeError('Already paused') self._paused = True self._loop.remove_reader(self._sock_fd) def resume_reading(self): - assert self._paused, 'Not paused' + if not self._paused: + raise ('Not paused') self._paused = False if self._closing: return @@ -712,10 +725,8 @@ self._loop.add_reader(self._sock_fd, self._read_ready) if self._buffer: - data = b''.join(self._buffer) - self._buffer.clear() try: - n = self._sock.send(data) + n = self._sock.send(self._buffer) except (BlockingIOError, InterruptedError, ssl.SSLWantWriteError): n = 0 @@ -725,11 +736,12 @@ self._write_wants_read = True except Exception as exc: self._loop.remove_writer(self._sock_fd) + self._buffer.clear() self._fatal_error(exc) return - if n < len(data): - self._buffer.append(data[n:]) + if n: + del self._buffer[:n] self._maybe_resume_protocol() # May append to buffer. @@ -739,7 +751,9 @@ self._call_connection_lost(None) def write(self, data): - assert isinstance(data, bytes), repr(type(data)) + if not isinstance(data, (bytes, bytearray, memoryview)): + raise TypeError('data argument must be byte-ish (%r)', + type(data)) if not data: return @@ -753,7 +767,7 @@ self._loop.add_writer(self._sock_fd, self._write_ready) # Add it to the buffer. - self._buffer.append(data) + self._buffer.extend(data) self._maybe_pause_protocol() def can_write_eof(self): @@ -762,6 +776,8 @@ class _SelectorDatagramTransport(_SelectorTransport): + _buffer_factory = collections.deque + def __init__(self, loop, sock, protocol, address=None, extra=None): super().__init__(loop, sock, protocol, extra) self._address = address @@ -784,12 +800,15 @@ self._protocol.datagram_received(data, addr) def sendto(self, data, addr=None): - assert isinstance(data, bytes), repr(type(data)) + if not isinstance(data, (bytes, bytearray, memoryview)): + raise TypeError('data argument must be byte-ish (%r)', + type(data)) if not data: return - if self._address: - assert addr in (None, self._address) + if self._address and addr not in (None, self._address): + raise ValueError('Invalid address: must be None or %s' % + (self._address,)) if self._conn_lost and self._address: if self._conn_lost >= constants.LOG_THRESHOLD_FOR_CONNLOST_WRITES: @@ -814,7 +833,8 @@ self._fatal_error(exc) return - self._buffer.append((data, addr)) + # Ensure that what we buffer is immutable. 
+ self._buffer.append((bytes(data), addr)) self._maybe_pause_protocol() def _sendto_ready(self): diff --git a/Lib/test/test_asyncio/test_selector_events.py b/Lib/test/test_asyncio/test_selector_events.py --- a/Lib/test/test_asyncio/test_selector_events.py +++ b/Lib/test/test_asyncio/test_selector_events.py @@ -32,6 +32,10 @@ self._internal_fds += 1 +def list_to_buffer(l=()): + return bytearray().join(l) + + class BaseSelectorEventLoopTests(unittest.TestCase): def setUp(self): @@ -613,7 +617,7 @@ def test_close_write_buffer(self): tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) - tr._buffer.append(b'data') + tr._buffer.extend(b'data') tr.close() self.assertFalse(self.loop.readers) @@ -622,13 +626,13 @@ def test_force_close(self): tr = _SelectorTransport(self.loop, self.sock, self.protocol, None) - tr._buffer.append(b'1') + tr._buffer.extend(b'1') self.loop.add_reader(7, unittest.mock.sentinel) self.loop.add_writer(7, unittest.mock.sentinel) tr._force_close(None) self.assertTrue(tr._closing) - self.assertEqual(tr._buffer, collections.deque()) + self.assertEqual(tr._buffer, list_to_buffer()) self.assertFalse(self.loop.readers) self.assertFalse(self.loop.writers) @@ -783,21 +787,40 @@ transport.write(data) self.sock.send.assert_called_with(data) + def test_write_bytearray(self): + data = bytearray(b'data') + self.sock.send.return_value = len(data) + + transport = _SelectorSocketTransport( + self.loop, self.sock, self.protocol) + transport.write(data) + self.sock.send.assert_called_with(data) + self.assertEqual(data, bytearray(b'data')) # Hasn't been mutated. + + def test_write_memoryview(self): + data = memoryview(b'data') + self.sock.send.return_value = len(data) + + transport = _SelectorSocketTransport( + self.loop, self.sock, self.protocol) + transport.write(data) + self.sock.send.assert_called_with(data) + def test_write_no_data(self): transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer.append(b'data') + transport._buffer.extend(b'data') transport.write(b'') self.assertFalse(self.sock.send.called) - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_buffer(self): transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer.append(b'data1') + transport._buffer.extend(b'data1') transport.write(b'data2') self.assertFalse(self.sock.send.called) - self.assertEqual(collections.deque([b'data1', b'data2']), + self.assertEqual(list_to_buffer([b'data1', b'data2']), transport._buffer) def test_write_partial(self): @@ -809,7 +832,30 @@ transport.write(data) self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'ta']), transport._buffer) + self.assertEqual(list_to_buffer([b'ta']), transport._buffer) + + def test_write_partial_bytearray(self): + data = bytearray(b'data') + self.sock.send.return_value = 2 + + transport = _SelectorSocketTransport( + self.loop, self.sock, self.protocol) + transport.write(data) + + self.loop.assert_writer(7, transport._write_ready) + self.assertEqual(list_to_buffer([b'ta']), transport._buffer) + self.assertEqual(data, bytearray(b'data')) # Hasn't been mutated. 
+ + def test_write_partial_memoryview(self): + data = memoryview(b'data') + self.sock.send.return_value = 2 + + transport = _SelectorSocketTransport( + self.loop, self.sock, self.protocol) + transport.write(data) + + self.loop.assert_writer(7, transport._write_ready) + self.assertEqual(list_to_buffer([b'ta']), transport._buffer) def test_write_partial_none(self): data = b'data' @@ -821,7 +867,7 @@ transport.write(data) self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_tryagain(self): self.sock.send.side_effect = BlockingIOError @@ -832,7 +878,7 @@ transport.write(data) self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) @unittest.mock.patch('asyncio.selector_events.logger') def test_write_exception(self, m_log): @@ -859,7 +905,7 @@ def test_write_str(self): transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - self.assertRaises(AssertionError, transport.write, 'str') + self.assertRaises(TypeError, transport.write, 'str') def test_write_closing(self): transport = _SelectorSocketTransport( @@ -875,11 +921,10 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer.append(data) + transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() self.assertTrue(self.sock.send.called) - self.assertEqual(self.sock.send.call_args[0], (data,)) self.assertFalse(self.loop.writers) def test_write_ready_closing(self): @@ -889,10 +934,10 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) transport._closing = True - transport._buffer.append(data) + transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() - self.sock.send.assert_called_with(data) + self.assertTrue(self.sock.send.called) self.assertFalse(self.loop.writers) self.sock.close.assert_called_with() self.protocol.connection_lost.assert_called_with(None) @@ -900,6 +945,7 @@ def test_write_ready_no_data(self): transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) + # This is an internal error. 
self.assertRaises(AssertionError, transport._write_ready) def test_write_ready_partial(self): @@ -908,11 +954,11 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer.append(data) + transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'ta']), transport._buffer) + self.assertEqual(list_to_buffer([b'ta']), transport._buffer) def test_write_ready_partial_none(self): data = b'data' @@ -920,23 +966,23 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer.append(data) + transport._buffer.extend(data) self.loop.add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_ready_tryagain(self): self.sock.send.side_effect = BlockingIOError transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) - transport._buffer = collections.deque([b'data1', b'data2']) + transport._buffer = list_to_buffer([b'data1', b'data2']) self.loop.add_writer(7, transport._write_ready) transport._write_ready() self.loop.assert_writer(7, transport._write_ready) - self.assertEqual(collections.deque([b'data1data2']), transport._buffer) + self.assertEqual(list_to_buffer([b'data1data2']), transport._buffer) def test_write_ready_exception(self): err = self.sock.send.side_effect = OSError() @@ -944,7 +990,7 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) transport._fatal_error = unittest.mock.Mock() - transport._buffer.append(b'data') + transport._buffer.extend(b'data') transport._write_ready() transport._fatal_error.assert_called_with(err) @@ -956,7 +1002,7 @@ transport = _SelectorSocketTransport( self.loop, self.sock, self.protocol) transport.close() - transport._buffer.append(b'data') + transport._buffer.extend(b'data') transport._write_ready() remove_writer.assert_called_with(self.sock_fd) @@ -976,12 +1022,12 @@ self.sock.send.side_effect = BlockingIOError tr.write(b'data') tr.write_eof() - self.assertEqual(tr._buffer, collections.deque([b'data'])) + self.assertEqual(tr._buffer, list_to_buffer([b'data'])) self.assertTrue(tr._eof) self.assertFalse(self.sock.shutdown.called) self.sock.send.side_effect = lambda _: 4 tr._write_ready() - self.sock.send.assert_called_with(b'data') + self.assertTrue(self.sock.send.called) self.sock.shutdown.assert_called_with(socket.SHUT_WR) tr.close() @@ -1065,15 +1111,34 @@ self.assertFalse(tr._paused) self.loop.assert_reader(1, tr._read_ready) + def test_write(self): + transport = self._make_one() + transport.write(b'data') + self.assertEqual(list_to_buffer([b'data']), transport._buffer) + + def test_write_bytearray(self): + transport = self._make_one() + data = bytearray(b'data') + transport.write(data) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) + self.assertEqual(data, bytearray(b'data')) # Hasn't been mutated. + self.assertIsNot(data, transport._buffer) # Hasn't been incorporated. 
+ + def test_write_memoryview(self): + transport = self._make_one() + data = memoryview(b'data') + transport.write(data) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) + def test_write_no_data(self): transport = self._make_one() - transport._buffer.append(b'data') + transport._buffer.extend(b'data') transport.write(b'') - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_str(self): transport = self._make_one() - self.assertRaises(AssertionError, transport.write, 'str') + self.assertRaises(TypeError, transport.write, 'str') def test_write_closing(self): transport = self._make_one() @@ -1087,7 +1152,7 @@ transport = self._make_one() transport._conn_lost = 1 transport.write(b'data') - self.assertEqual(transport._buffer, collections.deque()) + self.assertEqual(transport._buffer, list_to_buffer()) transport.write(b'data') transport.write(b'data') transport.write(b'data') @@ -1107,7 +1172,7 @@ transport = self._make_one() transport._write_wants_read = True transport._write_ready = unittest.mock.Mock() - transport._buffer.append(b'data') + transport._buffer.extend(b'data') transport._read_ready() self.assertFalse(transport._write_wants_read) @@ -1168,31 +1233,31 @@ def test_write_ready_send(self): self.sslsock.send.return_value = 4 transport = self._make_one() - transport._buffer = collections.deque([b'data']) + transport._buffer = list_to_buffer([b'data']) transport._write_ready() - self.assertEqual(collections.deque(), transport._buffer) + self.assertEqual(list_to_buffer(), transport._buffer) self.assertTrue(self.sslsock.send.called) def test_write_ready_send_none(self): self.sslsock.send.return_value = 0 transport = self._make_one() - transport._buffer = collections.deque([b'data1', b'data2']) + transport._buffer = list_to_buffer([b'data1', b'data2']) transport._write_ready() self.assertTrue(self.sslsock.send.called) - self.assertEqual(collections.deque([b'data1data2']), transport._buffer) + self.assertEqual(list_to_buffer([b'data1data2']), transport._buffer) def test_write_ready_send_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() - transport._buffer = collections.deque([b'data1', b'data2']) + transport._buffer = list_to_buffer([b'data1', b'data2']) transport._write_ready() self.assertTrue(self.sslsock.send.called) - self.assertEqual(collections.deque([b'ta1data2']), transport._buffer) + self.assertEqual(list_to_buffer([b'ta1data2']), transport._buffer) def test_write_ready_send_closing_partial(self): self.sslsock.send.return_value = 2 transport = self._make_one() - transport._buffer = collections.deque([b'data1', b'data2']) + transport._buffer = list_to_buffer([b'data1', b'data2']) transport._write_ready() self.assertTrue(self.sslsock.send.called) self.assertFalse(self.sslsock.close.called) @@ -1201,7 +1266,7 @@ self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() - transport._buffer = collections.deque([b'data']) + transport._buffer = list_to_buffer([b'data']) transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) @@ -1210,26 +1275,26 @@ self.sslsock.send.return_value = 4 transport = self._make_one() transport.close() - transport._buffer = collections.deque() + transport._buffer = list_to_buffer() transport._write_ready() self.assertFalse(self.loop.writers) self.protocol.connection_lost.assert_called_with(None) def test_write_ready_send_retry(self): transport 
= self._make_one() - transport._buffer = collections.deque([b'data']) + transport._buffer = list_to_buffer([b'data']) self.sslsock.send.side_effect = ssl.SSLWantWriteError transport._write_ready() - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) self.sslsock.send.side_effect = BlockingIOError() transport._write_ready() - self.assertEqual(collections.deque([b'data']), transport._buffer) + self.assertEqual(list_to_buffer([b'data']), transport._buffer) def test_write_ready_send_read(self): transport = self._make_one() - transport._buffer = collections.deque([b'data']) + transport._buffer = list_to_buffer([b'data']) self.loop.remove_writer = unittest.mock.Mock() self.sslsock.send.side_effect = ssl.SSLWantReadError @@ -1242,11 +1307,11 @@ err = self.sslsock.send.side_effect = OSError() transport = self._make_one() - transport._buffer = collections.deque([b'data']) + transport._buffer = list_to_buffer([b'data']) transport._fatal_error = unittest.mock.Mock() transport._write_ready() transport._fatal_error.assert_called_with(err) - self.assertEqual(collections.deque(), transport._buffer) + self.assertEqual(list_to_buffer(), transport._buffer) def test_write_ready_read_wants_write(self): self.loop.add_reader = unittest.mock.Mock() @@ -1355,6 +1420,24 @@ self.assertEqual( self.sock.sendto.call_args[0], (data, ('0.0.0.0', 1234))) + def test_sendto_bytearray(self): + data = bytearray(b'data') + transport = _SelectorDatagramTransport( + self.loop, self.sock, self.protocol) + transport.sendto(data, ('0.0.0.0', 1234)) + self.assertTrue(self.sock.sendto.called) + self.assertEqual( + self.sock.sendto.call_args[0], (data, ('0.0.0.0', 1234))) + + def test_sendto_memoryview(self): + data = memoryview(b'data') + transport = _SelectorDatagramTransport( + self.loop, self.sock, self.protocol) + transport.sendto(data, ('0.0.0.0', 1234)) + self.assertTrue(self.sock.sendto.called) + self.assertEqual( + self.sock.sendto.call_args[0], (data, ('0.0.0.0', 1234))) + def test_sendto_no_data(self): transport = _SelectorDatagramTransport( self.loop, self.sock, self.protocol) @@ -1375,6 +1458,32 @@ (b'data2', ('0.0.0.0', 12345))], list(transport._buffer)) + def test_sendto_buffer_bytearray(self): + data2 = bytearray(b'data2') + transport = _SelectorDatagramTransport( + self.loop, self.sock, self.protocol) + transport._buffer.append((b'data1', ('0.0.0.0', 12345))) + transport.sendto(data2, ('0.0.0.0', 12345)) + self.assertFalse(self.sock.sendto.called) + self.assertEqual( + [(b'data1', ('0.0.0.0', 12345)), + (b'data2', ('0.0.0.0', 12345))], + list(transport._buffer)) + self.assertIsInstance(transport._buffer[1][0], bytes) + + def test_sendto_buffer_memoryview(self): + data2 = memoryview(b'data2') + transport = _SelectorDatagramTransport( + self.loop, self.sock, self.protocol) + transport._buffer.append((b'data1', ('0.0.0.0', 12345))) + transport.sendto(data2, ('0.0.0.0', 12345)) + self.assertFalse(self.sock.sendto.called) + self.assertEqual( + [(b'data1', ('0.0.0.0', 12345)), + (b'data2', ('0.0.0.0', 12345))], + list(transport._buffer)) + self.assertIsInstance(transport._buffer[1][0], bytes) + def test_sendto_tryagain(self): data = b'data' @@ -1439,13 +1548,13 @@ def test_sendto_str(self): transport = _SelectorDatagramTransport( self.loop, self.sock, self.protocol) - self.assertRaises(AssertionError, transport.sendto, 'str', ()) + self.assertRaises(TypeError, transport.sendto, 'str', ()) def test_sendto_connected_addr(self): transport = 
_SelectorDatagramTransport( self.loop, self.sock, self.protocol, ('0.0.0.0', 1)) self.assertRaises( - AssertionError, transport.sendto, b'str', ('0.0.0.0', 2)) + ValueError, transport.sendto, b'str', ('0.0.0.0', 2)) def test_sendto_closing(self): transport = _SelectorDatagramTransport( -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 23:41:54 2013 From: python-checkins at python.org (victor.stinner) Date: Wed, 27 Nov 2013 23:41:54 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319818=3A_tracemal?= =?utf-8?q?loc=2C_the_number_of_frame_limit_cannot_be_zero_anymore?= Message-ID: <3dVH6B4qRrz7Ln5@mail.python.org> http://hg.python.org/cpython/rev/8df54b9b99ef changeset: 87618:8df54b9b99ef user: Victor Stinner date: Wed Nov 27 23:39:55 2013 +0100 summary: Issue #19818: tracemalloc, the number of frame limit cannot be zero anymore files: Doc/library/tracemalloc.rst | 6 +++--- 1 files changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/tracemalloc.rst b/Doc/library/tracemalloc.rst --- a/Doc/library/tracemalloc.rst +++ b/Doc/library/tracemalloc.rst @@ -391,9 +391,9 @@ If *all_frames* is ``True``, all frames of the traceback are checked. If *all_frames* is ``False``, only the most recent frame is checked. - This attribute is ignored if the traceback limit is less than ``2``. See - the :func:`get_traceback_limit` function and - :attr:`Snapshot.traceback_limit` attribute. + This attribute has no effect if the traceback limit is ``1``. See the + :func:`get_traceback_limit` function and :attr:`Snapshot.traceback_limit` + attribute. Frame -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Wed Nov 27 23:43:32 2013 From: python-checkins at python.org (ned.deily) Date: Wed, 27 Nov 2013 23:43:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Change_pathlib_documentati?= =?utf-8?q?on_to_use_=22raise=22_instead_of_=22throw=22=2E?= Message-ID: <3dVH840F7qzNWm@mail.python.org> http://hg.python.org/cpython/rev/acabd3f035fe changeset: 87619:acabd3f035fe user: Ned Deily date: Wed Nov 27 14:42:55 2013 -0800 summary: Change pathlib documentation to use "raise" instead of "throw". files: Doc/library/pathlib.rst | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -655,7 +655,7 @@ .. method:: Path.group() - Return the name of the group owning the file. :exc:`KeyError` is thrown + Return the name of the group owning the file. :exc:`KeyError` is raised if the file's gid isn't found in the system database. @@ -774,7 +774,7 @@ .. method:: Path.owner() - Return the name of the user owning the file. :exc:`KeyError` is thrown + Return the name of the user owning the file. :exc:`KeyError` is raised if the file's uid isn't found in the system database. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 04:37:05 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Thu, 28 Nov 2013 04:37:05 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Encapsulate_cpickle_global?= =?utf-8?q?_state_in_a_dedicated_object=2E?= Message-ID: <3dVPfn6S28zRGY@mail.python.org> http://hg.python.org/cpython/rev/64c6d52793be changeset: 87620:64c6d52793be user: Alexandre Vassalotti date: Wed Nov 27 19:36:52 2013 -0800 summary: Encapsulate cpickle global state in a dedicated object. This implements PEP 3121 module finalization as well. 
This change does not cause any significant impact on performance. files: Modules/_pickle.c | 716 ++++++++++++++++++++------------- 1 files changed, 436 insertions(+), 280 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1,6 +1,9 @@ #include "Python.h" #include "structmember.h" +PyDoc_STRVAR(pickle_module_doc, +"Optimized C implementation for the Python pickle module."); + /*[clinic] module _pickle class _pickle.Pickler @@ -25,9 +28,6 @@ [python]*/ /*[python checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ -PyDoc_STRVAR(pickle_module_doc, -"Optimized C implementation for the Python pickle module."); - /* Bump this when new opcodes are added to the pickle protocol. */ enum { HIGHEST_PROTOCOL = 4, @@ -132,40 +132,260 @@ FRAME_HEADER_SIZE = 9 }; -/* Exception classes for pickle. These should override the ones defined in - pickle.py, when the C-optimized Pickler and Unpickler are used. */ -static PyObject *PickleError = NULL; -static PyObject *PicklingError = NULL; -static PyObject *UnpicklingError = NULL; - -/* copyreg.dispatch_table, {type_object: pickling_function} */ -static PyObject *dispatch_table = NULL; -/* For EXT[124] opcodes. */ -/* copyreg._extension_registry, {(module_name, function_name): code} */ -static PyObject *extension_registry = NULL; -/* copyreg._inverted_registry, {code: (module_name, function_name)} */ -static PyObject *inverted_registry = NULL; -/* copyreg._extension_cache, {code: object} */ -static PyObject *extension_cache = NULL; - -/* _compat_pickle.NAME_MAPPING, {(oldmodule, oldname): (newmodule, newname)} */ -static PyObject *name_mapping_2to3 = NULL; -/* _compat_pickle.IMPORT_MAPPING, {oldmodule: newmodule} */ -static PyObject *import_mapping_2to3 = NULL; -/* Same, but with REVERSE_NAME_MAPPING / REVERSE_IMPORT_MAPPING */ -static PyObject *name_mapping_3to2 = NULL; -static PyObject *import_mapping_3to2 = NULL; - -/* XXX: Are these really nescessary? */ -/* As the name says, an empty tuple. */ -static PyObject *empty_tuple = NULL; -/* For looking up name pairs in copyreg._extension_registry. */ -static PyObject *two_tuple = NULL; +/*************************************************************************/ + +/* State of the pickle module, per PEP 3121. */ +typedef struct { + /* Exception classes for pickle. */ + PyObject *PickleError; + PyObject *PicklingError; + PyObject *UnpicklingError; + + /* copyreg.dispatch_table, {type_object: pickling_function} */ + PyObject *dispatch_table; + + /* For the extension opcodes EXT1, EXT2 and EXT4. */ + + /* copyreg._extension_registry, {(module_name, function_name): code} */ + PyObject *extension_registry; + /* copyreg._extension_cache, {code: object} */ + PyObject *extension_cache; + /* copyreg._inverted_registry, {code: (module_name, function_name)} */ + PyObject *inverted_registry; + + /* Import mappings for compatibility with Python 2.x */ + + /* _compat_pickle.NAME_MAPPING, + {(oldmodule, oldname): (newmodule, newname)} */ + PyObject *name_mapping_2to3; + /* _compat_pickle.IMPORT_MAPPING, {oldmodule: newmodule} */ + PyObject *import_mapping_2to3; + /* Same, but with REVERSE_NAME_MAPPING / REVERSE_IMPORT_MAPPING */ + PyObject *name_mapping_3to2; + PyObject *import_mapping_3to2; + + /* codecs.encode, used for saving bytes in older protocols */ + PyObject *codecs_encode; + + /* As the name says, an empty tuple. 
*/ + PyObject *empty_tuple; + /* Single argument tuple used by _Pickle_FastCall */ + PyObject *arg_tuple; +} PickleState; + +/* Forward declaration of the _pickle module definition. */ +static struct PyModuleDef _picklemodule; + +/* Given a module object, get its per-module state. */ +static PickleState * +_Pickle_GetState(PyObject *module) +{ + return (PickleState *)PyModule_GetState(module); +} + +/* Find the module instance imported in the currently running sub-interpreter + and get its state. */ +static PickleState * +_Pickle_GetGlobalState(void) +{ + return _Pickle_GetState(PyState_FindModule(&_picklemodule)); +} + +/* Clear the given pickle module state. */ +static void +_Pickle_ClearState(PickleState *st) +{ + Py_CLEAR(st->PickleError); + Py_CLEAR(st->PicklingError); + Py_CLEAR(st->UnpicklingError); + Py_CLEAR(st->dispatch_table); + Py_CLEAR(st->extension_registry); + Py_CLEAR(st->extension_cache); + Py_CLEAR(st->inverted_registry); + Py_CLEAR(st->name_mapping_2to3); + Py_CLEAR(st->import_mapping_2to3); + Py_CLEAR(st->name_mapping_3to2); + Py_CLEAR(st->import_mapping_3to2); + Py_CLEAR(st->codecs_encode); + Py_CLEAR(st->empty_tuple); + Py_CLEAR(st->arg_tuple); +} + +/* Initialize the given pickle module state. */ +static int +_Pickle_InitState(PickleState *st) +{ + PyObject *copyreg = NULL; + PyObject *compat_pickle = NULL; + PyObject *codecs = NULL; + + copyreg = PyImport_ImportModule("copyreg"); + if (!copyreg) + goto error; + st->dispatch_table = PyObject_GetAttrString(copyreg, "dispatch_table"); + if (!st->dispatch_table) + goto error; + if (!PyDict_CheckExact(st->dispatch_table)) { + PyErr_Format(PyExc_RuntimeError, + "copyreg.dispatch_table should be a dict, not %.200s", + Py_TYPE(st->dispatch_table)->tp_name); + goto error; + } + st->extension_registry = \ + PyObject_GetAttrString(copyreg, "_extension_registry"); + if (!st->extension_registry) + goto error; + if (!PyDict_CheckExact(st->extension_registry)) { + PyErr_Format(PyExc_RuntimeError, + "copyreg._extension_registry should be a dict, " + "not %.200s", Py_TYPE(st->extension_registry)->tp_name); + goto error; + } + st->inverted_registry = \ + PyObject_GetAttrString(copyreg, "_inverted_registry"); + if (!st->inverted_registry) + goto error; + if (!PyDict_CheckExact(st->inverted_registry)) { + PyErr_Format(PyExc_RuntimeError, + "copyreg._inverted_registry should be a dict, " + "not %.200s", Py_TYPE(st->inverted_registry)->tp_name); + goto error; + } + st->extension_cache = PyObject_GetAttrString(copyreg, "_extension_cache"); + if (!st->extension_cache) + goto error; + if (!PyDict_CheckExact(st->extension_cache)) { + PyErr_Format(PyExc_RuntimeError, + "copyreg._extension_cache should be a dict, " + "not %.200s", Py_TYPE(st->extension_cache)->tp_name); + goto error; + } + Py_CLEAR(copyreg); + + /* Load the 2.x -> 3.x stdlib module mapping tables */ + compat_pickle = PyImport_ImportModule("_compat_pickle"); + if (!compat_pickle) + goto error; + st->name_mapping_2to3 = \ + PyObject_GetAttrString(compat_pickle, "NAME_MAPPING"); + if (!st->name_mapping_2to3) + goto error; + if (!PyDict_CheckExact(st->name_mapping_2to3)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.NAME_MAPPING should be a dict, not %.200s", + Py_TYPE(st->name_mapping_2to3)->tp_name); + goto error; + } + st->import_mapping_2to3 = \ + PyObject_GetAttrString(compat_pickle, "IMPORT_MAPPING"); + if (!st->import_mapping_2to3) + goto error; + if (!PyDict_CheckExact(st->import_mapping_2to3)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.IMPORT_MAPPING 
should be a dict, " + "not %.200s", Py_TYPE(st->import_mapping_2to3)->tp_name); + goto error; + } + /* ... and the 3.x -> 2.x mapping tables */ + st->name_mapping_3to2 = \ + PyObject_GetAttrString(compat_pickle, "REVERSE_NAME_MAPPING"); + if (!st->name_mapping_3to2) + goto error; + if (!PyDict_CheckExact(st->name_mapping_3to2)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.REVERSE_NAME_MAPPING should be a dict, " + "not %.200s", Py_TYPE(st->name_mapping_3to2)->tp_name); + goto error; + } + st->import_mapping_3to2 = \ + PyObject_GetAttrString(compat_pickle, "REVERSE_IMPORT_MAPPING"); + if (!st->import_mapping_3to2) + goto error; + if (!PyDict_CheckExact(st->import_mapping_3to2)) { + PyErr_Format(PyExc_RuntimeError, + "_compat_pickle.REVERSE_IMPORT_MAPPING should be a dict, " + "not %.200s", Py_TYPE(st->import_mapping_3to2)->tp_name); + goto error; + } + Py_CLEAR(compat_pickle); + + codecs = PyImport_ImportModule("codecs"); + if (codecs == NULL) + goto error; + st->codecs_encode = PyObject_GetAttrString(codecs, "encode"); + if (st->codecs_encode == NULL) { + goto error; + } + if (!PyCallable_Check(st->codecs_encode)) { + PyErr_Format(PyExc_RuntimeError, + "codecs.encode should be a callable, not %.200s", + Py_TYPE(st->codecs_encode)->tp_name); + goto error; + } + Py_CLEAR(codecs); + + st->empty_tuple = PyTuple_New(0); + if (st->empty_tuple == NULL) + goto error; + + st->arg_tuple = NULL; + + return 0; + + error: + Py_CLEAR(copyreg); + Py_CLEAR(compat_pickle); + Py_CLEAR(codecs); + _Pickle_ClearState(st); + return -1; +} + +/* Helper for calling a function with a single argument quickly. + + This has the performance advantage of reusing the argument tuple. This + provides a nice performance boost with older pickle protocols where many + unbuffered reads occurs. + + This function steals the reference of the given argument. */ +static PyObject * +_Pickle_FastCall(PyObject *func, PyObject *obj) +{ + PickleState *st = _Pickle_GetGlobalState(); + PyObject *arg_tuple; + PyObject *result; + + arg_tuple = st->arg_tuple; + if (arg_tuple == NULL) { + arg_tuple = PyTuple_New(1); + if (arg_tuple == NULL) { + Py_DECREF(obj); + return NULL; + } + } + assert(arg_tuple->ob_refcnt == 1); + assert(PyTuple_GET_ITEM(arg_tuple, 0) == NULL); + + PyTuple_SET_ITEM(arg_tuple, 0, obj); + result = PyObject_Call(func, arg_tuple, NULL); + + Py_CLEAR(PyTuple_GET_ITEM(arg_tuple, 0)); + if (arg_tuple->ob_refcnt > 1) { + /* The function called saved a reference to our argument tuple. + This means we cannot reuse it anymore. */ + Py_CLEAR(arg_tuple); + } + st->arg_tuple = arg_tuple; + + return result; +} + +/*************************************************************************/ static int stack_underflow(void) { - PyErr_SetString(UnpicklingError, "unpickling stack underflow"); + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "unpickling stack underflow"); return -1; } @@ -266,8 +486,9 @@ static PyObject * Pdata_pop(Pdata *self) { + PickleState *st = _Pickle_GetGlobalState(); if (Py_SIZE(self) == 0) { - PyErr_SetString(UnpicklingError, "bad pickle data"); + PyErr_SetString(st->UnpicklingError, "bad pickle data"); return NULL; } return self->data[--Py_SIZE(self)]; @@ -637,40 +858,6 @@ /*************************************************************************/ -/* Helper for calling a function with a single argument quickly. - - This has the performance advantage of reusing the argument tuple. 
This - provides a nice performance boost with older pickle protocols where many - unbuffered reads occurs. - - This function steals the reference of the given argument. */ -static PyObject * -_Pickle_FastCall(PyObject *func, PyObject *obj) -{ - static PyObject *arg_tuple = NULL; - PyObject *result; - - if (arg_tuple == NULL) { - arg_tuple = PyTuple_New(1); - if (arg_tuple == NULL) { - Py_DECREF(obj); - return NULL; - } - } - assert(arg_tuple->ob_refcnt == 1); - assert(PyTuple_GET_ITEM(arg_tuple, 0) == NULL); - - PyTuple_SET_ITEM(arg_tuple, 0, obj); - result = PyObject_Call(func, arg_tuple, NULL); - - Py_CLEAR(PyTuple_GET_ITEM(arg_tuple, 0)); - if (arg_tuple->ob_refcnt > 1) { - /* The function called saved a reference to our argument tuple. - This means we cannot reuse it anymore. */ - Py_CLEAR(arg_tuple); - } - return result; -} static int _Pickler_ClearBuffer(PicklerObject *self) @@ -963,8 +1150,10 @@ if (_Unpickler_SkipConsumed(self) < 0) return -1; - if (n == READ_WHOLE_LINE) - data = PyObject_Call(self->readline, empty_tuple, NULL); + if (n == READ_WHOLE_LINE) { + PickleState *st = _Pickle_GetGlobalState(); + data = PyObject_Call(self->readline, st->empty_tuple, NULL); + } else { PyObject *len = PyLong_FromSsize_t(n); if (len == NULL) @@ -1308,7 +1497,8 @@ len = 5; } else { /* unlikely */ - PyErr_SetString(PicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->PicklingError, "memo id too large for LONG_BINGET"); return -1; } @@ -1364,7 +1554,8 @@ len = 5; } else { /* unlikely */ - PyErr_SetString(PicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->PicklingError, "memo id too large for LONG_BINPUT"); return -1; } @@ -1807,26 +1998,14 @@ Python 2 *and* the appropriate 'bytes' object when unpickled using Python 3. Again this is a hack and we don't need to do this with newer protocols. 
*/ - static PyObject *codecs_encode = NULL; PyObject *reduce_value = NULL; int status; - if (codecs_encode == NULL) { - PyObject *codecs_module = PyImport_ImportModule("codecs"); - if (codecs_module == NULL) { - return -1; - } - codecs_encode = PyObject_GetAttrString(codecs_module, "encode"); - Py_DECREF(codecs_module); - if (codecs_encode == NULL) { - return -1; - } - } - if (PyBytes_GET_SIZE(obj) == 0) { reduce_value = Py_BuildValue("(O())", (PyObject*)&PyBytes_Type); } else { + PickleState *st = _Pickle_GetGlobalState(); PyObject *unicode_str = PyUnicode_DecodeLatin1(PyBytes_AS_STRING(obj), PyBytes_GET_SIZE(obj), @@ -1836,7 +2015,7 @@ if (unicode_str == NULL) return -1; reduce_value = Py_BuildValue("(O(OO))", - codecs_encode, unicode_str, + st->codecs_encode, unicode_str, _PyUnicode_FromId(&PyId_latin1)); Py_DECREF(unicode_str); } @@ -2840,11 +3019,12 @@ { PyObject *key; PyObject *item; + PickleState *st = _Pickle_GetGlobalState(); key = PyTuple_Pack(2, *module_name, *global_name); if (key == NULL) return -1; - item = PyDict_GetItemWithError(name_mapping_3to2, key); + item = PyDict_GetItemWithError(st->name_mapping_3to2, key); Py_DECREF(key); if (item) { PyObject *fixed_module_name; @@ -2880,7 +3060,7 @@ return -1; } - item = PyDict_GetItemWithError(import_mapping_3to2, *module_name); + item = PyDict_GetItemWithError(st->import_mapping_3to2, *module_name); if (item) { if (!PyUnicode_Check(item)) { PyErr_Format(PyExc_RuntimeError, @@ -2907,6 +3087,7 @@ PyObject *module_name = NULL; PyObject *module = NULL; PyObject *cls; + PickleState *st = _Pickle_GetGlobalState(); int status = 0; _Py_IDENTIFIER(__name__); _Py_IDENTIFIER(__qualname__); @@ -2947,21 +3128,21 @@ extra parameters of __import__ to fix that. */ module = PyImport_Import(module_name); if (module == NULL) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "Can't pickle %R: import of module %R failed", obj, module_name); goto error; } cls = getattribute(module, global_name, self->proto >= 4); if (cls == NULL) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "Can't pickle %R: attribute lookup %S on %S failed", obj, global_name, module_name); goto error; } if (cls != obj) { Py_DECREF(cls); - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "Can't pickle %R: it's not the same object as %S.%S", obj, module_name, global_name); goto error; @@ -2972,14 +3153,18 @@ /* See whether this is in the extension registry, and if * so generate an EXT opcode. */ + PyObject *extension_key; PyObject *code_obj; /* extension code as Python object */ long code; /* extension code as C value */ char pdata[5]; Py_ssize_t n; - PyTuple_SET_ITEM(two_tuple, 0, module_name); - PyTuple_SET_ITEM(two_tuple, 1, global_name); - code_obj = PyDict_GetItem(extension_registry, two_tuple); + extension_key = PyTuple_Pack(2, module_name, global_name); + if (extension_key == NULL) { + goto error; + } + code_obj = PyDict_GetItem(st->extension_registry, extension_key); + Py_DECREF(extension_key); /* The object is not registered in the extension registry. This is the most likely code path. */ if (code_obj == NULL) @@ -2991,7 +3176,7 @@ /* Verify code_obj has the right type and value. 
*/ if (!PyLong_Check(code_obj)) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "Can't pickle %R: extension code %R isn't an integer", obj, code_obj); goto error; @@ -2999,9 +3184,8 @@ code = PyLong_AS_LONG(code_obj); if (code <= 0 || code > 0x7fffffffL) { if (!PyErr_Occurred()) - PyErr_Format(PicklingError, - "Can't pickle %R: extension code %ld is out of range", - obj, code); + PyErr_Format(st->PicklingError, "Can't pickle %R: extension " + "code %ld is out of range", obj, code); goto error; } @@ -3074,7 +3258,7 @@ encoded = unicode_encoder(module_name); if (encoded == NULL) { if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "can't pickle module identifier '%S' using " "pickle protocol %i", module_name, self->proto); @@ -3093,7 +3277,7 @@ encoded = unicode_encoder(global_name); if (encoded == NULL) { if (PyErr_ExceptionMatches(PyExc_UnicodeEncodeError)) - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "can't pickle global identifier '%S' using " "pickle protocol %i", global_name, self->proto); @@ -3206,6 +3390,7 @@ PyObject *state = NULL; PyObject *listitems = Py_None; PyObject *dictitems = Py_None; + PickleState *st = _Pickle_GetGlobalState(); Py_ssize_t size; int use_newobj = 0, use_newobj_ex = 0; @@ -3216,7 +3401,7 @@ size = PyTuple_Size(args); if (size < 2 || size > 5) { - PyErr_SetString(PicklingError, "tuple returned by " + PyErr_SetString(st->PicklingError, "tuple returned by " "__reduce__ must contain 2 through 5 elements"); return -1; } @@ -3226,12 +3411,12 @@ return -1; if (!PyCallable_Check(callable)) { - PyErr_SetString(PicklingError, "first item of the tuple " + PyErr_SetString(st->PicklingError, "first item of the tuple " "returned by __reduce__ must be callable"); return -1; } if (!PyTuple_Check(argtup)) { - PyErr_SetString(PicklingError, "second item of the tuple " + PyErr_SetString(st->PicklingError, "second item of the tuple " "returned by __reduce__ must be a tuple"); return -1; } @@ -3242,7 +3427,7 @@ if (listitems == Py_None) listitems = NULL; else if (!PyIter_Check(listitems)) { - PyErr_Format(PicklingError, "fourth element of the tuple " + PyErr_Format(st->PicklingError, "fourth element of the tuple " "returned by __reduce__ must be an iterator, not %s", Py_TYPE(listitems)->tp_name); return -1; @@ -3251,7 +3436,7 @@ if (dictitems == Py_None) dictitems = NULL; else if (!PyIter_Check(dictitems)) { - PyErr_Format(PicklingError, "fifth element of the tuple " + PyErr_Format(st->PicklingError, "fifth element of the tuple " "returned by __reduce__ must be an iterator, not %s", Py_TYPE(dictitems)->tp_name); return -1; @@ -3290,7 +3475,7 @@ PyObject *kwargs; if (Py_SIZE(argtup) != 3) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "length of the NEWOBJ_EX argument tuple must be " "exactly 3, not %zd", Py_SIZE(argtup)); return -1; @@ -3298,21 +3483,21 @@ cls = PyTuple_GET_ITEM(argtup, 0); if (!PyType_Check(cls)) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "first item from NEWOBJ_EX argument tuple must " "be a class, not %.200s", Py_TYPE(cls)->tp_name); return -1; } args = PyTuple_GET_ITEM(argtup, 1); if (!PyTuple_Check(args)) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, "second item from NEWOBJ_EX argument tuple must " "be a tuple, not %.200s", Py_TYPE(args)->tp_name); return -1; } kwargs = PyTuple_GET_ITEM(argtup, 2); if (!PyDict_Check(kwargs)) { - PyErr_Format(PicklingError, + PyErr_Format(st->PicklingError, 
"third item from NEWOBJ_EX argument tuple must " "be a dict, not %.200s", Py_TYPE(kwargs)->tp_name); return -1; @@ -3333,13 +3518,13 @@ /* Sanity checks. */ if (Py_SIZE(argtup) < 1) { - PyErr_SetString(PicklingError, "__newobj__ arglist is empty"); + PyErr_SetString(st->PicklingError, "__newobj__ arglist is empty"); return -1; } cls = PyTuple_GET_ITEM(argtup, 0); if (!PyType_Check(cls)) { - PyErr_SetString(PicklingError, "args[0] from " + PyErr_SetString(st->PicklingError, "args[0] from " "__newobj__ args is not a type"); return -1; } @@ -3349,7 +3534,7 @@ p = obj_class != cls; /* true iff a problem */ Py_DECREF(obj_class); if (p) { - PyErr_SetString(PicklingError, "args[0] from " + PyErr_SetString(st->PicklingError, "args[0] from " "__newobj__ args has the wrong class"); return -1; } @@ -3547,7 +3732,8 @@ * __reduce_ex__ method, or the object's __reduce__ method. */ if (self->dispatch_table == NULL) { - reduce_func = PyDict_GetItem(dispatch_table, (PyObject *)type); + PickleState *st = _Pickle_GetGlobalState(); + reduce_func = PyDict_GetItem(st->dispatch_table, (PyObject *)type); /* PyDict_GetItem() unlike PyObject_GetItem() and PyObject_GetAttr() returns a borrowed ref */ Py_XINCREF(reduce_func); @@ -3591,17 +3777,23 @@ } } else { - if (PyErr_ExceptionMatches(PyExc_AttributeError)) + PickleState *st = _Pickle_GetGlobalState(); + + if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); - else + } + else { goto error; + } /* Check for a __reduce__ method. */ reduce_func = _PyObject_GetAttrId(obj, &PyId___reduce__); if (reduce_func != NULL) { - reduce_value = PyObject_Call(reduce_func, empty_tuple, NULL); + reduce_value = PyObject_Call(reduce_func, st->empty_tuple, + NULL); } else { - PyErr_Format(PicklingError, "can't pickle '%.200s' object: %R", + PyErr_Format(st->PicklingError, + "can't pickle '%.200s' object: %R", type->tp_name, obj); goto error; } @@ -3617,7 +3809,8 @@ } if (!PyTuple_Check(reduce_value)) { - PyErr_SetString(PicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->PicklingError, "__reduce__ must return a string or tuple"); goto error; } @@ -3723,7 +3916,8 @@ Developers often forget to call __init__() in their subclasses, which would trigger a segfault without this check. */ if (self->write == NULL) { - PyErr_Format(PicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_Format(st->PicklingError, "Pickler.__init__() was not called by %s.__init__()", Py_TYPE(self)->tp_name); return NULL; @@ -3919,16 +4113,19 @@ if (self->dispatch_table == NULL) return NULL; } - return Py_None; -} - -/* XXX Slight hack to slot a Clinic generated signature in tp_init. */ + + Py_RETURN_NONE; +} + +/* Wrap the Clinic generated signature to slot it in tp_init. 
*/ static int Pickler_init(PyObject *self, PyObject *args, PyObject *kwargs) { - if (_pickle_Pickler___init__(self, args, kwargs) == NULL) { - return -1; - } + PyObject *result = _pickle_Pickler___init__(self, args, kwargs); + if (result == NULL) { + return -1; + } + Py_DECREF(result); return 0; } @@ -4323,8 +4520,9 @@ static Py_ssize_t marker(UnpicklerObject *self) { + PickleState *st = _Pickle_GetGlobalState(); if (self->num_marks < 1) { - PyErr_SetString(UnpicklingError, "could not find MARK"); + PyErr_SetString(st->UnpicklingError, "could not find MARK"); return -1; } @@ -4341,7 +4539,8 @@ static int bad_readline(void) { - PyErr_SetString(UnpicklingError, "pickle data was truncated"); + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "pickle data was truncated"); return -1; } @@ -4536,8 +4735,9 @@ size = calc_binint(nbytes, size); if (size < 0) { + PickleState *st = _Pickle_GetGlobalState(); /* Corrupt or hostile pickle -- we never write one like this */ - PyErr_SetString(UnpicklingError, + PyErr_SetString(st->UnpicklingError, "LONG pickle has negative byte count"); return -1; } @@ -4625,7 +4825,8 @@ len -= 2; } else { - PyErr_SetString(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "the STRING opcode argument must be quoted"); return -1; } @@ -4686,7 +4887,8 @@ size = calc_binsize(s, nbytes); if (size < 0) { - PyErr_Format(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_Format(st->UnpicklingError, "BINSTRING exceeds system's maximum size of %zd bytes", PY_SSIZE_T_MAX); return -1; @@ -4994,6 +5196,7 @@ PyObject *clsraw = NULL; PyTypeObject *cls; /* clsraw cast to its true type */ PyObject *obj; + PickleState *st = _Pickle_GetGlobalState(); /* Stack is ... cls argtuple, and we want to call * cls.__new__(cls, *argtuple). 
@@ -5002,7 +5205,8 @@ if (args == NULL) goto error; if (!PyTuple_Check(args)) { - PyErr_SetString(UnpicklingError, "NEWOBJ expected an arg " "tuple."); + PyErr_SetString(st->UnpicklingError, + "NEWOBJ expected an arg " "tuple."); goto error; } @@ -5011,12 +5215,12 @@ if (cls == NULL) goto error; if (!PyType_Check(cls)) { - PyErr_SetString(UnpicklingError, "NEWOBJ class argument " + PyErr_SetString(st->UnpicklingError, "NEWOBJ class argument " "isn't a type object"); goto error; } if (cls->tp_new == NULL) { - PyErr_SetString(UnpicklingError, "NEWOBJ class argument " + PyErr_SetString(st->UnpicklingError, "NEWOBJ class argument " "has NULL tp_new"); goto error; } @@ -5042,6 +5246,7 @@ { PyObject *cls, *args, *kwargs; PyObject *obj; + PickleState *st = _Pickle_GetGlobalState(); PDATA_POP(self->stack, kwargs); if (kwargs == NULL) { @@ -5063,7 +5268,7 @@ Py_DECREF(kwargs); Py_DECREF(args); Py_DECREF(cls); - PyErr_Format(UnpicklingError, + PyErr_Format(st->UnpicklingError, "NEWOBJ_EX class argument must be a type, not %.200s", Py_TYPE(cls)->tp_name); return -1; @@ -5073,7 +5278,7 @@ Py_DECREF(kwargs); Py_DECREF(args); Py_DECREF(cls); - PyErr_SetString(UnpicklingError, + PyErr_SetString(st->UnpicklingError, "NEWOBJ_EX class argument doesn't have __new__"); return -1; } @@ -5135,7 +5340,8 @@ PDATA_POP(self->stack, module_name); if (module_name == NULL || !PyUnicode_CheckExact(module_name) || global_name == NULL || !PyUnicode_CheckExact(global_name)) { - PyErr_SetString(UnpicklingError, "STACK_GLOBAL requires str"); + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "STACK_GLOBAL requires str"); Py_XDECREF(global_name); Py_XDECREF(module_name); return -1; @@ -5176,7 +5382,8 @@ return 0; } else { - PyErr_SetString(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "A load persistent id instruction was encountered,\n" "but no persistent_load function was specified."); return -1; @@ -5203,7 +5410,8 @@ return 0; } else { - PyErr_SetString(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "A load persistent id instruction was encountered,\n" "but no persistent_load function was specified."); return -1; @@ -5359,6 +5567,7 @@ PyObject *obj; /* the object to push */ PyObject *pair; /* (module_name, class_name) */ PyObject *module_name, *class_name; + PickleState *st = _Pickle_GetGlobalState(); assert(nbytes == 1 || nbytes == 2 || nbytes == 4); if (_Unpickler_Read(self, &codebytes, nbytes) < 0) @@ -5366,7 +5575,7 @@ code = calc_binint(codebytes, nbytes); if (code <= 0) { /* note that 0 is forbidden */ /* Corrupt or hostile pickle. */ - PyErr_SetString(UnpicklingError, "EXT specifies code <= 0"); + PyErr_SetString(st->UnpicklingError, "EXT specifies code <= 0"); return -1; } @@ -5374,7 +5583,7 @@ py_code = PyLong_FromLong(code); if (py_code == NULL) return -1; - obj = PyDict_GetItem(extension_cache, py_code); + obj = PyDict_GetItem(st->extension_cache, py_code); if (obj != NULL) { /* Bingo. */ Py_DECREF(py_code); @@ -5383,7 +5592,7 @@ } /* Look up the (module_name, class_name) pair. */ - pair = PyDict_GetItem(inverted_registry, py_code); + pair = PyDict_GetItem(st->inverted_registry, py_code); if (pair == NULL) { Py_DECREF(py_code); PyErr_Format(PyExc_ValueError, "unregistered extension " @@ -5408,7 +5617,7 @@ return -1; } /* Cache code -> obj. 
*/ - code = PyDict_SetItem(extension_cache, py_code, obj); + code = PyDict_SetItem(st->extension_cache, py_code, obj); Py_DECREF(py_code); if (code < 0) { Py_DECREF(obj); @@ -5585,8 +5794,10 @@ if (len == x) /* nothing to do */ return 0; if ((len - x) % 2 != 0) { + PickleState *st = _Pickle_GetGlobalState(); /* Currupt or hostile pickle -- we never write one like this. */ - PyErr_SetString(UnpicklingError, "odd number of items for SETITEMS"); + PyErr_SetString(st->UnpicklingError, + "odd number of items for SETITEMS"); return -1; } @@ -5736,7 +5947,8 @@ _Py_IDENTIFIER(__dict__); if (!PyDict_Check(state)) { - PyErr_SetString(UnpicklingError, "state is not a dictionary"); + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "state is not a dictionary"); goto error; } dict = _PyObject_GetAttrId(inst, &PyId___dict__); @@ -5765,7 +5977,8 @@ Py_ssize_t i; if (!PyDict_Check(slotstate)) { - PyErr_SetString(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_SetString(st->UnpicklingError, "slot state is not a dictionary"); goto error; } @@ -5988,11 +6201,14 @@ break; default: - if (s[0] == '\0') + if (s[0] == '\0') { PyErr_SetNone(PyExc_EOFError); - else - PyErr_Format(UnpicklingError, + } + else { + PickleState *st = _Pickle_GetGlobalState(); + PyErr_Format(st->UnpicklingError, "invalid load key, '%c'.", s[0]); + } return NULL; } @@ -6037,12 +6253,14 @@ /*[clinic checksum: 9a30ba4e4d9221d4dcd705e1471ab11b2c9e3ac6]*/ { UnpicklerObject *unpickler = (UnpicklerObject*)self; + /* Check whether the Unpickler was initialized correctly. This prevents segfaulting if a subclass overridden __init__ with a function that does not call Unpickler.__init__(). Here, we simply ensure that self->read is not NULL. */ if (unpickler->read == NULL) { - PyErr_Format(UnpicklingError, + PickleState *st = _Pickle_GetGlobalState(); + PyErr_Format(st->UnpicklingError, "Unpickler.__init__() was not called by %s.__init__()", Py_TYPE(unpickler)->tp_name); return NULL; @@ -6121,13 +6339,14 @@ if (self->proto < 3 && self->fix_imports) { PyObject *key; PyObject *item; + PickleState *st = _Pickle_GetGlobalState(); /* Check if the global (i.e., a function or a class) was renamed or moved to another module. */ key = PyTuple_Pack(2, module_name, global_name); if (key == NULL) return NULL; - item = PyDict_GetItemWithError(name_mapping_2to3, key); + item = PyDict_GetItemWithError(st->name_mapping_2to3, key); Py_DECREF(key); if (item) { if (!PyTuple_Check(item) || PyTuple_GET_SIZE(item) != 2) { @@ -6153,7 +6372,7 @@ } /* Check if the module was renamed. */ - item = PyDict_GetItemWithError(import_mapping_2to3, module_name); + item = PyDict_GetItemWithError(st->import_mapping_2to3, module_name); if (item) { if (!PyUnicode_Check(item)) { PyErr_Format(PyExc_RuntimeError, @@ -6378,16 +6597,18 @@ self->proto = 0; - return Py_None; -} - -/* XXX Slight hack to slot a Clinic generated signature in tp_init. */ + Py_RETURN_NONE; +} + +/* Wrap the Clinic generated signature to slot it in tp_init. 
*/ static int Unpickler_init(PyObject *self, PyObject *args, PyObject *kwargs) { - if (_pickle_Unpickler___init__(self, args, kwargs) == NULL) { - return -1; - } + PyObject *result = _pickle_Unpickler___init__(self, args, kwargs); + if (result == NULL) { + return -1; + } + Py_DECREF(result); return 0; } @@ -7174,7 +7395,6 @@ return NULL; } - static struct PyMethodDef pickle_methods[] = { _PICKLE_DUMP_METHODDEF _PICKLE_DUMPS_METHODDEF @@ -7184,125 +7404,56 @@ }; static int -initmodule(void) -{ - PyObject *copyreg = NULL; - PyObject *compat_pickle = NULL; - - /* XXX: We should ensure that the types of the dictionaries imported are - exactly PyDict objects. Otherwise, it is possible to crash the pickle - since we use the PyDict API directly to access these dictionaries. */ - - copyreg = PyImport_ImportModule("copyreg"); - if (!copyreg) - goto error; - dispatch_table = PyObject_GetAttrString(copyreg, "dispatch_table"); - if (!dispatch_table) - goto error; - extension_registry = \ - PyObject_GetAttrString(copyreg, "_extension_registry"); - if (!extension_registry) - goto error; - inverted_registry = PyObject_GetAttrString(copyreg, "_inverted_registry"); - if (!inverted_registry) - goto error; - extension_cache = PyObject_GetAttrString(copyreg, "_extension_cache"); - if (!extension_cache) - goto error; - Py_CLEAR(copyreg); - - /* Load the 2.x -> 3.x stdlib module mapping tables */ - compat_pickle = PyImport_ImportModule("_compat_pickle"); - if (!compat_pickle) - goto error; - name_mapping_2to3 = PyObject_GetAttrString(compat_pickle, "NAME_MAPPING"); - if (!name_mapping_2to3) - goto error; - if (!PyDict_CheckExact(name_mapping_2to3)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.NAME_MAPPING should be a dict, not %.200s", - Py_TYPE(name_mapping_2to3)->tp_name); - goto error; - } - import_mapping_2to3 = PyObject_GetAttrString(compat_pickle, - "IMPORT_MAPPING"); - if (!import_mapping_2to3) - goto error; - if (!PyDict_CheckExact(import_mapping_2to3)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.IMPORT_MAPPING should be a dict, " - "not %.200s", Py_TYPE(import_mapping_2to3)->tp_name); - goto error; - } - /* ... and the 3.x -> 2.x mapping tables */ - name_mapping_3to2 = PyObject_GetAttrString(compat_pickle, - "REVERSE_NAME_MAPPING"); - if (!name_mapping_3to2) - goto error; - if (!PyDict_CheckExact(name_mapping_3to2)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.REVERSE_NAME_MAPPING should be a dict, " - "not %.200s", Py_TYPE(name_mapping_3to2)->tp_name); - goto error; - } - import_mapping_3to2 = PyObject_GetAttrString(compat_pickle, - "REVERSE_IMPORT_MAPPING"); - if (!import_mapping_3to2) - goto error; - if (!PyDict_CheckExact(import_mapping_3to2)) { - PyErr_Format(PyExc_RuntimeError, - "_compat_pickle.REVERSE_IMPORT_MAPPING should be a dict, " - "not %.200s", Py_TYPE(import_mapping_3to2)->tp_name); - goto error; - } - Py_CLEAR(compat_pickle); - - empty_tuple = PyTuple_New(0); - if (empty_tuple == NULL) - goto error; - two_tuple = PyTuple_New(2); - if (two_tuple == NULL) - goto error; - /* We use this temp container with no regard to refcounts, or to - * keeping containees alive. Exempt from GC, because we don't - * want anything looking at two_tuple() by magic. 
- */ - PyObject_GC_UnTrack(two_tuple); - +pickle_clear(PyObject *m) +{ + _Pickle_ClearState(_Pickle_GetState(m)); return 0; - - error: - Py_CLEAR(copyreg); - Py_CLEAR(dispatch_table); - Py_CLEAR(extension_registry); - Py_CLEAR(inverted_registry); - Py_CLEAR(extension_cache); - Py_CLEAR(compat_pickle); - Py_CLEAR(name_mapping_2to3); - Py_CLEAR(import_mapping_2to3); - Py_CLEAR(name_mapping_3to2); - Py_CLEAR(import_mapping_3to2); - Py_CLEAR(empty_tuple); - Py_CLEAR(two_tuple); - return -1; +} + +static int +pickle_traverse(PyObject *m, visitproc visit, void *arg) +{ + PickleState *st = _Pickle_GetState(m); + Py_VISIT(st->PickleError); + Py_VISIT(st->PicklingError); + Py_VISIT(st->UnpicklingError); + Py_VISIT(st->dispatch_table); + Py_VISIT(st->extension_registry); + Py_VISIT(st->extension_cache); + Py_VISIT(st->inverted_registry); + Py_VISIT(st->name_mapping_2to3); + Py_VISIT(st->import_mapping_2to3); + Py_VISIT(st->name_mapping_3to2); + Py_VISIT(st->import_mapping_3to2); + Py_VISIT(st->codecs_encode); + Py_VISIT(st->empty_tuple); + Py_VISIT(st->arg_tuple); + return 0; } static struct PyModuleDef _picklemodule = { PyModuleDef_HEAD_INIT, - "_pickle", - pickle_module_doc, - -1, - pickle_methods, - NULL, - NULL, - NULL, - NULL + "_pickle", /* m_name */ + pickle_module_doc, /* m_doc */ + sizeof(PickleState), /* m_size */ + pickle_methods, /* m_methods */ + NULL, /* m_reload */ + pickle_traverse, /* m_traverse */ + pickle_clear, /* m_clear */ + NULL /* m_free */ }; PyMODINIT_FUNC PyInit__pickle(void) { PyObject *m; + PickleState *st; + + m = PyState_FindModule(&_picklemodule); + if (m) { + Py_INCREF(m); + return m; + } if (PyType_Ready(&Unpickler_Type) < 0) return NULL; @@ -7327,27 +7478,32 @@ if (PyModule_AddObject(m, "Unpickler", (PyObject *)&Unpickler_Type) < 0) return NULL; + st = _Pickle_GetState(m); + /* Initialize the exceptions. 
*/ - PickleError = PyErr_NewException("_pickle.PickleError", NULL, NULL); - if (PickleError == NULL) + st->PickleError = PyErr_NewException("_pickle.PickleError", NULL, NULL); + if (st->PickleError == NULL) return NULL; - PicklingError = \ - PyErr_NewException("_pickle.PicklingError", PickleError, NULL); - if (PicklingError == NULL) + st->PicklingError = \ + PyErr_NewException("_pickle.PicklingError", st->PickleError, NULL); + if (st->PicklingError == NULL) return NULL; - UnpicklingError = \ - PyErr_NewException("_pickle.UnpicklingError", PickleError, NULL); - if (UnpicklingError == NULL) + st->UnpicklingError = \ + PyErr_NewException("_pickle.UnpicklingError", st->PickleError, NULL); + if (st->UnpicklingError == NULL) return NULL; - if (PyModule_AddObject(m, "PickleError", PickleError) < 0) + Py_INCREF(st->PickleError); + if (PyModule_AddObject(m, "PickleError", st->PickleError) < 0) return NULL; - if (PyModule_AddObject(m, "PicklingError", PicklingError) < 0) + Py_INCREF(st->PicklingError); + if (PyModule_AddObject(m, "PicklingError", st->PicklingError) < 0) return NULL; - if (PyModule_AddObject(m, "UnpicklingError", UnpicklingError) < 0) + Py_INCREF(st->UnpicklingError); + if (PyModule_AddObject(m, "UnpicklingError", st->UnpicklingError) < 0) return NULL; - if (initmodule() < 0) + if (_Pickle_InitState(st) < 0) return NULL; return m; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 06:58:45 2013 From: python-checkins at python.org (zach.ware) Date: Thu, 28 Nov 2013 06:58:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319595=3A_Re-enabl?= =?utf-8?q?e_a_long-disabled_test_in_test=5Fwinsound?= Message-ID: <3dVSpF07Fcz7LpR@mail.python.org> http://hg.python.org/cpython/rev/6f1c5d0b44ed changeset: 87621:6f1c5d0b44ed user: Zachary Ware date: Wed Nov 27 23:56:04 2013 -0600 summary: Issue #19595: Re-enable a long-disabled test in test_winsound files: Lib/test/test_winsound.py | 20 ++++++++------------ Misc/NEWS | 2 ++ 2 files changed, 10 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_winsound.py b/Lib/test/test_winsound.py --- a/Lib/test/test_winsound.py +++ b/Lib/test/test_winsound.py @@ -158,18 +158,14 @@ ) def test_alias_fallback(self): - # This test can't be expected to work on all systems. The MS - # PlaySound() docs say: - # - # If it cannot find the specified sound, PlaySound uses the - # default system event sound entry instead. If the function - # can find neither the system default entry nor the default - # sound, it makes no sound and returns FALSE. - # - # It's known to return FALSE on some real systems. - - # winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) - return + if _have_soundcard(): + winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) + else: + self.assertRaises( + RuntimeError, + winsound.PlaySound, + '!"$%&/(#+*', winsound.SND_ALIAS + ) def test_alias_nofallback(self): if _have_soundcard(): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -31,6 +31,8 @@ Tests ----- +- Issue #19595: Re-enabled a long-disabled test in test_winsound. + - Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. 
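For readers skimming the test_winsound change above: the re-enabled test_alias_fallback relies on the module-level _have_soundcard() helper shown in the diff to decide which outcome to expect. A minimal standalone sketch of that pattern follows (Windows-only; it assumes _have_soundcard can be imported from the test module, which the diff suggests but this digest does not show):

    import unittest
    import winsound
    # Helper referenced in the diff above; assumed importable from the test module.
    from test.test_winsound import _have_soundcard

    class AliasFallbackSketch(unittest.TestCase):
        def test_alias_fallback(self):
            bogus_alias = '!"$%&/(#+*'
            if _have_soundcard():
                # With a sound device, PlaySound() falls back to the default
                # system sound for an unknown alias and returns normally.
                winsound.PlaySound(bogus_alias, winsound.SND_ALIAS)
            else:
                # Without one, PlaySound() raises RuntimeError.
                self.assertRaises(RuntimeError, winsound.PlaySound,
                                  bogus_alias, winsound.SND_ALIAS)

    if __name__ == '__main__':
        unittest.main()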
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 08:07:03 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 28 Nov 2013 08:07:03 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_ssl=2Ecreate=5Fdefault=5Fc?= =?utf-8?q?ontext=28=29_sets_OP=5FNO=5FCOMPRESSION_to_prevent_CRIME?= Message-ID: <3dVVK32gz7z7Lpf@mail.python.org> http://hg.python.org/cpython/rev/98eb88d3d94e changeset: 87622:98eb88d3d94e user: Christian Heimes date: Thu Nov 28 08:06:54 2013 +0100 summary: ssl.create_default_context() sets OP_NO_COMPRESSION to prevent CRIME files: Lib/ssl.py | 2 ++ Misc/NEWS | 2 ++ 2 files changed, 4 insertions(+), 0 deletions(-) diff --git a/Lib/ssl.py b/Lib/ssl.py --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -383,6 +383,8 @@ context = SSLContext(PROTOCOL_TLSv1) # SSLv2 considered harmful. context.options |= OP_NO_SSLv2 + # disable compression to prevent CRIME attacks (OpenSSL 1.0+) + context.options |= getattr(_ssl, "OP_NO_COMPRESSION", 0) # disallow ciphers with known vulnerabilities context.set_ciphers(_RESTRICTED_CIPHERS) # verify certs in client mode diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,8 @@ Library ------- +- ssl.create_default_context() sets OP_NO_COMPRESSION to prevent CRIME. + - Issue #19802: Add socket.SO_PRIORITY. - Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Thu Nov 28 09:39:14 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Thu, 28 Nov 2013 09:39:14 +0100 Subject: [Python-checkins] Daily reference leaks (acabd3f035fe): sum=0 Message-ID: results for acabd3f035fe on branch "default" -------------------------------------------- test_site leaked [-2, 0, 2] references, sum=0 test_site leaked [-2, 0, 2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogXqtArS', '-x'] From python-checkins at python.org Thu Nov 28 15:12:21 2013 From: python-checkins at python.org (christian.heimes) Date: Thu, 28 Nov 2013 15:12:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_SNI_was_added_in_OpenSSL_0?= =?utf-8?q?=2E9=2E8f__=5B11_Oct_2007=5D=2C_too?= Message-ID: <3dVgln3fbxzSY5@mail.python.org> http://hg.python.org/cpython/rev/87858e0b757d changeset: 87623:87858e0b757d user: Christian Heimes date: Thu Nov 28 15:12:15 2013 +0100 summary: SNI was added in OpenSSL 0.9.8f [11 Oct 2007], too files: Modules/_ssl.c | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Modules/_ssl.c b/Modules/_ssl.c --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -97,7 +97,7 @@ # define HAVE_TLSv1_2 0 #endif -/* SNI support (client- and server-side) appeared in OpenSSL 1.0.0. +/* SNI support (client- and server-side) appeared in OpenSSL 1.0.0 and 0.9.8f * This includes the SSL_set_SSL_CTX() function. 
*/ #ifdef SSL_CTRL_SET_TLSEXT_HOSTNAME -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 15:25:49 2013 From: python-checkins at python.org (eli.bendersky) Date: Thu, 28 Nov 2013 15:25:49 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5ODE1?= =?utf-8?q?=3A_Fix_segfault_when_parsing_empty_namespace_declaration=2E?= Message-ID: <3dVh3K2BtfzSW4@mail.python.org> http://hg.python.org/cpython/rev/395a266bcb5a changeset: 87624:395a266bcb5a branch: 2.7 parent: 87609:e5b7f140ae30 user: Eli Bendersky date: Thu Nov 28 06:25:45 2013 -0800 summary: Issue #19815: Fix segfault when parsing empty namespace declaration. Based on patches by Christian Heimes and Vajrasky Kok files: Lib/test/test_xml_etree.py | 11 +++++++++-- Modules/_elementtree.c | 5 ++++- 2 files changed, 13 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -713,14 +713,21 @@ end {namespace}root end-ns None + >>> import StringIO + + >>> events = ('start-ns', 'end-ns') + >>> context = ET.iterparse(StringIO.StringIO(r""), events) + >>> for action, elem in context: + ... print action, elem + start-ns ('', '') + end-ns None + >>> events = ("start", "end", "bogus") >>> with open(SIMPLE_XMLFILE, "rb") as f: ... iterparse(f, events) Traceback (most recent call last): ValueError: unknown event 'bogus' - >>> import StringIO - >>> source = StringIO.StringIO( ... "\\n" ... " http://hg.python.org/cpython/rev/68f1e5262a7a changeset: 87625:68f1e5262a7a branch: 3.3 parent: 87610:dfadce7635ce user: Eli Bendersky date: Thu Nov 28 06:31:58 2013 -0800 summary: Issue #19815: Fix segfault when parsing empty namespace declaration. 
Based on patches by Christian Heimes and Vajrasky Kok files: Lib/test/test_xml_etree.py | 5 +++++ Modules/_elementtree.c | 5 ++++- 2 files changed, 9 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -535,6 +535,11 @@ ('end-ns', None), ]) + events = ('start-ns', 'end-ns') + context = iterparse(io.StringIO(r""), events) + res = [action for action, elem in context] + self.assertEqual(res, ['start-ns', 'end-ns']) + events = ("start", "end", "bogus") with self.assertRaises(ValueError) as cm: with open(SIMPLE_XMLFILE, "rb") as f: diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -2997,7 +2997,10 @@ PyObject* sprefix = NULL; PyObject* suri = NULL; - suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); + if (uri) + suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); + else + suri = PyUnicode_FromString(""); if (!suri) return; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 15:35:46 2013 From: python-checkins at python.org (eli.bendersky) Date: Thu, 28 Nov 2013 15:35:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Fix_indentatio?= =?utf-8?q?n_from_previous_commit?= Message-ID: <3dVhGp0n3Qz7LjY@mail.python.org> http://hg.python.org/cpython/rev/712ebde527c2 changeset: 87626:712ebde527c2 branch: 3.3 user: Eli Bendersky date: Thu Nov 28 06:33:21 2013 -0800 summary: Fix indentation from previous commit files: Modules/_elementtree.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -2998,9 +2998,9 @@ PyObject* suri = NULL; if (uri) - suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); + suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); else - suri = PyUnicode_FromString(""); + suri = PyUnicode_FromString(""); if (!suri) return; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 15:35:47 2013 From: python-checkins at python.org (eli.bendersky) Date: Thu, 28 Nov 2013 15:35:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319815=3A_Fix_segfault_when_parsing_empty_namesp?= =?utf-8?q?ace_declaration=2E?= Message-ID: <3dVhGq3rw8z7Lk8@mail.python.org> http://hg.python.org/cpython/rev/2b2925c08a6c changeset: 87627:2b2925c08a6c parent: 87623:87858e0b757d parent: 87626:712ebde527c2 user: Eli Bendersky date: Thu Nov 28 06:35:40 2013 -0800 summary: Issue #19815: Fix segfault when parsing empty namespace declaration. 
Based on patches by Christian Heimes and Vajrasky Kok files: Lib/test/test_xml_etree.py | 5 +++++ Modules/_elementtree.c | 5 ++++- 2 files changed, 9 insertions(+), 1 deletions(-) diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -526,6 +526,11 @@ ('end-ns', None), ]) + events = ('start-ns', 'end-ns') + context = iterparse(io.StringIO(r""), events) + res = [action for action, elem in context] + self.assertEqual(res, ['start-ns', 'end-ns']) + events = ("start", "end", "bogus") with self.assertRaises(ValueError) as cm: with open(SIMPLE_XMLFILE, "rb") as f: diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -3038,7 +3038,10 @@ if (PyErr_Occurred()) return; - suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); + if (uri) + suri = PyUnicode_DecodeUTF8(uri, strlen(uri), "strict"); + else + suri = PyUnicode_FromString(""); if (!suri) return; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 15:37:27 2013 From: python-checkins at python.org (eli.bendersky) Date: Thu, 28 Nov 2013 15:37:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_indentatio?= =?utf-8?q?n_from_previous_commit?= Message-ID: <3dVhJl051Sz7LjQ@mail.python.org> http://hg.python.org/cpython/rev/3a7a3def6503 changeset: 87628:3a7a3def6503 branch: 2.7 parent: 87624:395a266bcb5a user: Eli Bendersky date: Thu Nov 28 06:37:25 2013 -0800 summary: Fix indentation from previous commit files: Modules/_elementtree.c | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -2339,9 +2339,9 @@ PyObject* suri = NULL; if (uri) - suri = makestring(uri, strlen(uri)); + suri = makestring(uri, strlen(uri)); else - suri = PyString_FromStringAndSize("", 0); + suri = PyString_FromStringAndSize("", 0); if (!suri) return; -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 15:53:08 2013 From: python-checkins at python.org (eli.bendersky) Date: Thu, 28 Nov 2013 15:53:08 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Some_minor_clarifications_?= =?utf-8?q?in_the_documentation_of_pathlib_+_inheritance_diagram?= Message-ID: <3dVhfr6wdQzSyw@mail.python.org> http://hg.python.org/cpython/rev/90b56ec318b6 changeset: 87629:90b56ec318b6 parent: 87627:2b2925c08a6c user: Eli Bendersky date: Thu Nov 28 06:53:05 2013 -0800 summary: Some minor clarifications in the documentation of pathlib + inheritance diagram files: Doc/library/pathlib-inheritance.png | Bin Doc/library/pathlib.rst | 170 ++++++++------- 2 files changed, 91 insertions(+), 79 deletions(-) diff --git a/Doc/library/pathlib-inheritance.png b/Doc/library/pathlib-inheritance.png new file mode 100644 index 0000000000000000000000000000000000000000..b81c3deb6049d83051337776fe5c4f3de751e1cb GIT binary patch [stripped] diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -9,15 +9,27 @@ .. versionadded:: 3.4 - This module offers classes representing filesystem paths with semantics appropriate for different operating systems. Path classes are divided between :ref:`pure paths `, which provide purely computational operations without I/O, and :ref:`concrete paths `, which inherit from pure paths but also provide I/O operations. 
-The main point of entry is the :class:`Path` class, which will instantiate -a :ref:`concrete path ` for the current platform. +.. image:: pathlib-inheritance.png + :align: center + +If you've never used this module before or just aren't sure which class is +right for your task, :class:`Path` is most likely what you need. It instantiates +a :ref:`concrete path ` for the platform the code is running on. + +Pure paths are useful in some special cases; for example: + +#. If you want to manipulate Windows paths on a Unix machine (or vice versa). + You cannot instantiate a :class:`WindowsPath` when running on Unix, but you + can instantiate :class:`PureWindowsPath`. +#. You want to make sure that your code only manipulates paths without actually + accessing the OS. In this case, instantiating one of the pure classes may be + useful since those simply don't have any OS-accessing operations. .. note:: This module has been included in the standard library on a @@ -86,8 +98,57 @@ access a filesystem. There are three ways to access these classes, which we also call *flavours*: +.. class:: PurePath(*pathsegments) -.. class:: PurePosixPath + A generic class that represents the system's path flavour (instantiating + it creates either a :class:`PurePosixPath` or a :class:`PureWindowsPath`):: + + >>> PurePath('setup.py') # Running on a Unix machine + PurePosixPath('setup.py') + + Each element of *pathsegments* can be either a string or bytes object + representing a path segment; it can also be another path object:: + + >>> PurePath('foo', 'some/path', 'bar') + PurePosixPath('foo/some/path/bar') + >>> PurePath(Path('foo'), Path('bar')) + PurePosixPath('foo/bar') + + When *pathsegments* is empty, the current directory is assumed:: + + >>> PurePath() + PurePosixPath('.') + + When several absolute paths are given, the last is taken as an anchor + (mimicking :func:`os.path.join`'s behaviour):: + + >>> PurePath('/etc', '/usr', 'lib64') + PurePosixPath('/usr/lib64') + >>> PureWindowsPath('c:/Windows', 'd:bar') + PureWindowsPath('d:bar') + + However, in a Windows path, changing the local root doesn't discard the + previous drive setting:: + + >>> PureWindowsPath('c:/Windows', '/Program Files') + PureWindowsPath('c:/Program Files') + + Spurious slashes and single dots are collapsed, but double dots (``'..'``) + are not, since this would change the meaning of a path in the face of + symbolic links:: + + >>> PurePath('foo//bar') + PurePosixPath('foo/bar') + >>> PurePath('foo/./bar') + PurePosixPath('foo/bar') + >>> PurePath('foo/../bar') + PurePosixPath('foo/../bar') + + (a na?ve approach would make ``PurePosixPath('foo/../bar')`` equivalent + to ``PurePosixPath('bar')``, which is wrong if ``foo`` is a symbolic link + to another directory) + +.. class:: PurePosixPath(*pathsegments) A subclass of :class:`PurePath`, this path flavour represents non-Windows filesystem paths:: @@ -95,7 +156,9 @@ >>> PurePosixPath('/etc') PurePosixPath('/etc') -.. class:: PureWindowsPath + *pathsegments* is specified similarly to :class:`PurePath`. + +.. class:: PureWindowsPath(*pathsegments) A subclass of :class:`PurePath`, this path flavour represents Windows filesystem paths:: @@ -103,67 +166,12 @@ >>> PureWindowsPath('c:/Program Files/') PureWindowsPath('c:/Program Files') -.. 
class:: PurePath - - A generic class that represents the system's path flavour (instantiating - it creates either a :class:`PurePosixPath` or a :class:`PureWindowsPath`):: - - >>> PurePath('setup.py') - PurePosixPath('setup.py') - + *pathsegments* is specified similarly to :class:`PurePath`. Regardless of the system you're running on, you can instantiate all of these classes, since they don't provide any operation that does system calls. -Constructing paths -^^^^^^^^^^^^^^^^^^ - -Path constructors accept an arbitrary number of positional arguments. -When called without any argument, a path object points to the current -directory:: - - >>> PurePath() - PurePosixPath('.') - -Any argument can be a string or bytes object representing an arbitrary number -of path segments, but it can also be another path object:: - - >>> PurePath('foo', 'some/path', 'bar') - PurePosixPath('foo/some/path/bar') - >>> PurePath(Path('foo'), Path('bar')) - PurePosixPath('foo/bar') - -When several absolute paths are given, the last is taken as an anchor -(mimicking :func:`os.path.join`'s behaviour):: - - >>> PurePath('/etc', '/usr', 'lib64') - PurePosixPath('/usr/lib64') - >>> PureWindowsPath('c:/Windows', 'd:bar') - PureWindowsPath('d:bar') - -However, in a Windows path, changing the local root doesn't discard the -previous drive setting:: - - >>> PureWindowsPath('c:/Windows', '/Program Files') - PureWindowsPath('c:/Program Files') - -Spurious slashes and single dots are collapsed, but double dots (``'..'``) -are not, since this would change the meaning of a path in the face of -symbolic links:: - - >>> PurePath('foo//bar') - PurePosixPath('foo/bar') - >>> PurePath('foo/./bar') - PurePosixPath('foo/bar') - >>> PurePath('foo/../bar') - PurePosixPath('foo/../bar') - -(a na?ve approach would make ``PurePosixPath('foo/../bar')`` equivalent -to ``PurePosixPath('bar')``, which is wrong if ``foo`` is a symbolic link -to another directory) - - General properties ^^^^^^^^^^^^^^^^^^ @@ -524,24 +532,7 @@ operations provided by the latter, they also provide methods to do system calls on path objects. There are three ways to instantiate concrete paths: - -.. class:: PosixPath - - A subclass of :class:`Path` and :class:`PurePosixPath`, this class - represents concrete non-Windows filesystem paths:: - - >>> PosixPath('/etc') - PosixPath('/etc') - -.. class:: WindowsPath - - A subclass of :class:`Path` and :class:`PureWindowsPath`, this class - represents concrete Windows filesystem paths:: - - >>> WindowsPath('c:/Program Files/') - WindowsPath('c:/Program Files') - -.. class:: Path +.. class:: Path(*pathsegments) A subclass of :class:`PurePath`, this class represents concrete paths of the system's path flavour (instantiating it creates either a @@ -550,6 +541,27 @@ >>> Path('setup.py') PosixPath('setup.py') + *pathsegments* is specified similarly to :class:`PurePath`. + +.. class:: PosixPath(*pathsegments) + + A subclass of :class:`Path` and :class:`PurePosixPath`, this class + represents concrete non-Windows filesystem paths:: + + >>> PosixPath('/etc') + PosixPath('/etc') + + *pathsegments* is specified similarly to :class:`PurePath`. + +.. class:: WindowsPath(*pathsegments) + + A subclass of :class:`Path` and :class:`PureWindowsPath`, this class + represents concrete Windows filesystem paths:: + + >>> WindowsPath('c:/Program Files/') + WindowsPath('c:/Program Files') + + *pathsegments* is specified similarly to :class:`PurePath`. 
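As a quick aside before the hunk continues: the construction rules spelled out in the rewritten docs above can be checked with a few lines of pure-path code, which runs the same on any OS because no filesystem access is involved. A small sketch, not part of the patch:

    from pathlib import PurePosixPath, PureWindowsPath

    # The last absolute segment acts as the anchor, as with os.path.join().
    assert PurePosixPath('/etc', '/usr', 'lib64') == PurePosixPath('/usr/lib64')

    # On the Windows flavour, a new root does not discard the drive.
    assert PureWindowsPath('c:/Windows', '/Program Files') == PureWindowsPath('c:/Program Files')

    # Single dots collapse; '..' is kept because it may cross a symlink.
    assert PurePosixPath('foo/./bar') == PurePosixPath('foo/bar')
    assert PurePosixPath('foo/../bar') != PurePosixPath('bar')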
You can only instantiate the class flavour that corresponds to your system (allowing system calls on non-compatible path flavours could lead to -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Thu Nov 28 23:56:23 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Thu, 28 Nov 2013 23:56:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_the_tuple_reuse_opt?= =?utf-8?q?imization_in_=5FPickle=5FFastCall=2E?= Message-ID: <3dVvNR0vCfz7LjM@mail.python.org> http://hg.python.org/cpython/rev/dd51b72cfb52 changeset: 87630:dd51b72cfb52 user: Alexandre Vassalotti date: Thu Nov 28 14:56:09 2013 -0800 summary: Remove the tuple reuse optimization in _Pickle_FastCall. I have noticed a race-condition occurring on one of the buildbots because of this optimization. The function called may release the GIL which means multiple threads may end up accessing the shared tuple. I could fix it up by storing the tuple to the Pickler and Unipickler object again, but honestly it really not worth the trouble. I ran many benchmarks and the only time the optimization helps is when using a fin-memory file, like io.BytesIO on which reads are super cheap, combined with pickle protocol less than 4. Even in this contrived case, the speedup is a about 5%. For everything else, this optimization does not provide any noticable improvements. files: Modules/_pickle.c | 49 ++++++++++++---------------------- 1 files changed, 17 insertions(+), 32 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -169,8 +169,6 @@ /* As the name says, an empty tuple. */ PyObject *empty_tuple; - /* Single argument tuple used by _Pickle_FastCall */ - PyObject *arg_tuple; } PickleState; /* Forward declaration of the _pickle module definition. */ @@ -208,7 +206,6 @@ Py_CLEAR(st->import_mapping_3to2); Py_CLEAR(st->codecs_encode); Py_CLEAR(st->empty_tuple); - Py_CLEAR(st->arg_tuple); } /* Initialize the given pickle module state. */ @@ -328,8 +325,6 @@ if (st->empty_tuple == NULL) goto error; - st->arg_tuple = NULL; - return 0; error: @@ -342,40 +337,31 @@ /* Helper for calling a function with a single argument quickly. - This has the performance advantage of reusing the argument tuple. This - provides a nice performance boost with older pickle protocols where many - unbuffered reads occurs. - This function steals the reference of the given argument. */ static PyObject * _Pickle_FastCall(PyObject *func, PyObject *obj) { - PickleState *st = _Pickle_GetGlobalState(); - PyObject *arg_tuple; PyObject *result; - - arg_tuple = st->arg_tuple; + PyObject *arg_tuple = PyTuple_New(1); + + /* Note: this function used to reuse the argument tuple. This used to give + a slight performance boost with older pickle implementations where many + unbuffered reads occurred (thus needing many function calls). + + However, this optimization was removed because it was too complicated + to get right. It abused the C API for tuples to mutate them which led + to subtle reference counting and concurrency bugs. Furthermore, the + introduction of protocol 4 and the prefetching optimization via peek() + significantly reduced the number of function calls we do. Thus, the + benefits became marginal at best. 
*/ + if (arg_tuple == NULL) { - arg_tuple = PyTuple_New(1); - if (arg_tuple == NULL) { - Py_DECREF(obj); - return NULL; - } - } - assert(arg_tuple->ob_refcnt == 1); - assert(PyTuple_GET_ITEM(arg_tuple, 0) == NULL); - + Py_DECREF(obj); + return NULL; + } PyTuple_SET_ITEM(arg_tuple, 0, obj); result = PyObject_Call(func, arg_tuple, NULL); - - Py_CLEAR(PyTuple_GET_ITEM(arg_tuple, 0)); - if (arg_tuple->ob_refcnt > 1) { - /* The function called saved a reference to our argument tuple. - This means we cannot reuse it anymore. */ - Py_CLEAR(arg_tuple); - } - st->arg_tuple = arg_tuple; - + Py_CLEAR(arg_tuple); return result; } @@ -7427,7 +7413,6 @@ Py_VISIT(st->import_mapping_3to2); Py_VISIT(st->codecs_encode); Py_VISIT(st->empty_tuple); - Py_VISIT(st->arg_tuple); return 0; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 00:17:43 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Fri, 29 Nov 2013 00:17:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_explicit_empty_tupl?= =?utf-8?q?e_reuse_in_cpickle=2E?= Message-ID: <3dVvs34TSMzSSJ@mail.python.org> http://hg.python.org/cpython/rev/b44dbf07732e changeset: 87631:b44dbf07732e user: Alexandre Vassalotti date: Thu Nov 28 15:17:29 2013 -0800 summary: Remove explicit empty tuple reuse in cpickle. PyTuple_New(0) always returns the same empty tuple from its free list anyway, so we are not saving much here. Plus, the code where this was used is on uncommon run paths. files: Modules/_pickle.c | 18 ++++++------------ 1 files changed, 6 insertions(+), 12 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -166,9 +166,6 @@ /* codecs.encode, used for saving bytes in older protocols */ PyObject *codecs_encode; - - /* As the name says, an empty tuple. */ - PyObject *empty_tuple; } PickleState; /* Forward declaration of the _pickle module definition. */ @@ -205,7 +202,6 @@ Py_CLEAR(st->name_mapping_3to2); Py_CLEAR(st->import_mapping_3to2); Py_CLEAR(st->codecs_encode); - Py_CLEAR(st->empty_tuple); } /* Initialize the given pickle module state. */ @@ -321,10 +317,6 @@ } Py_CLEAR(codecs); - st->empty_tuple = PyTuple_New(0); - if (st->empty_tuple == NULL) - goto error; - return 0; error: @@ -1137,8 +1129,9 @@ return -1; if (n == READ_WHOLE_LINE) { - PickleState *st = _Pickle_GetGlobalState(); - data = PyObject_Call(self->readline, st->empty_tuple, NULL); + PyObject *empty_tuple = PyTuple_New(0); + data = PyObject_Call(self->readline, empty_tuple, NULL); + Py_DECREF(empty_tuple); } else { PyObject *len = PyLong_FromSsize_t(n); @@ -3774,8 +3767,10 @@ /* Check for a __reduce__ method. 
*/ reduce_func = _PyObject_GetAttrId(obj, &PyId___reduce__); if (reduce_func != NULL) { - reduce_value = PyObject_Call(reduce_func, st->empty_tuple, + PyObject *empty_tuple = PyTuple_New(0); + reduce_value = PyObject_Call(reduce_func, empty_tuple, NULL); + Py_DECREF(empty_tuple); } else { PyErr_Format(st->PicklingError, @@ -7412,7 +7407,6 @@ Py_VISIT(st->name_mapping_3to2); Py_VISIT(st->import_mapping_3to2); Py_VISIT(st->codecs_encode); - Py_VISIT(st->empty_tuple); return 0; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 02:09:33 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Fri, 29 Nov 2013 02:09:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Use_PyDict=5FGetItemWithEr?= =?utf-8?q?ror_instead_of_PyDict=5FGetItem_in_cpickle=2E?= Message-ID: <3dVyL54fLKz7LjZ@mail.python.org> http://hg.python.org/cpython/rev/d829111a27e5 changeset: 87632:d829111a27e5 user: Alexandre Vassalotti date: Thu Nov 28 17:09:16 2013 -0800 summary: Use PyDict_GetItemWithError instead of PyDict_GetItem in cpickle. files: Modules/_pickle.c | 47 ++++++++++++++++++++++++++-------- 1 files changed, 35 insertions(+), 12 deletions(-) diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1691,7 +1691,7 @@ key = PyLong_FromVoidPtr(obj); if (key == NULL) return 0; - if (PyDict_GetItem(self->fast_memo, key)) { + if (PyDict_GetItemWithError(self->fast_memo, key)) { Py_DECREF(key); PyErr_Format(PyExc_ValueError, "fast mode: can't pickle cyclic objects " @@ -1700,6 +1700,9 @@ self->fast_nesting = -1; return 0; } + if (PyErr_Occurred()) { + return 0; + } if (PyDict_SetItem(self->fast_memo, key, Py_None) < 0) { Py_DECREF(key); self->fast_nesting = -1; @@ -3142,12 +3145,17 @@ if (extension_key == NULL) { goto error; } - code_obj = PyDict_GetItem(st->extension_registry, extension_key); + code_obj = PyDict_GetItemWithError(st->extension_registry, + extension_key); Py_DECREF(extension_key); /* The object is not registered in the extension registry. This is the most likely code path. */ - if (code_obj == NULL) + if (code_obj == NULL) { + if (PyErr_Occurred()) { + goto error; + } goto gen_global; + } /* XXX: pickle.py doesn't check neither the type, nor the range of the value returned by the extension_registry. It should for @@ -3712,12 +3720,21 @@ */ if (self->dispatch_table == NULL) { PickleState *st = _Pickle_GetGlobalState(); - reduce_func = PyDict_GetItem(st->dispatch_table, (PyObject *)type); - /* PyDict_GetItem() unlike PyObject_GetItem() and - PyObject_GetAttr() returns a borrowed ref */ - Py_XINCREF(reduce_func); + reduce_func = PyDict_GetItemWithError(st->dispatch_table, + (PyObject *)type); + if (reduce_func == NULL) { + if (PyErr_Occurred()) { + goto error; + } + } else { + /* PyDict_GetItemWithError() returns a borrowed reference. + Increase the reference count to be consistent with + PyObject_GetItem and _PyObject_GetAttrId used below. */ + Py_INCREF(reduce_func); + } } else { - reduce_func = PyObject_GetItem(self->dispatch_table, (PyObject *)type); + reduce_func = PyObject_GetItem(self->dispatch_table, + (PyObject *)type); if (reduce_func == NULL) { if (PyErr_ExceptionMatches(PyExc_KeyError)) PyErr_Clear(); @@ -5564,20 +5581,26 @@ py_code = PyLong_FromLong(code); if (py_code == NULL) return -1; - obj = PyDict_GetItem(st->extension_cache, py_code); + obj = PyDict_GetItemWithError(st->extension_cache, py_code); if (obj != NULL) { /* Bingo. 
*/ Py_DECREF(py_code); PDATA_APPEND(self->stack, obj, -1); return 0; } + if (PyErr_Occurred()) { + Py_DECREF(py_code); + return -1; + } /* Look up the (module_name, class_name) pair. */ - pair = PyDict_GetItem(st->inverted_registry, py_code); + pair = PyDict_GetItemWithError(st->inverted_registry, py_code); if (pair == NULL) { Py_DECREF(py_code); - PyErr_Format(PyExc_ValueError, "unregistered extension " - "code %ld", code); + if (!PyErr_Occurred()) { + PyErr_Format(PyExc_ValueError, "unregistered extension " + "code %ld", code); + } return -1; } /* Since the extension registry is manipulable via Python code, -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Fri Nov 29 09:40:18 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Fri, 29 Nov 2013 09:40:18 +0100 Subject: [Python-checkins] Daily reference leaks (d829111a27e5): sum=0 Message-ID: results for d829111a27e5 on branch "default" -------------------------------------------- test_asyncio leaked [0, 4, 0] memory blocks, sum=4 test_site leaked [-2, 2, -2] references, sum=-2 test_site leaked [-2, 2, -2] memory blocks, sum=-2 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogwzGOWs', '-x'] From python-checkins at python.org Fri Nov 29 11:22:58 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Fri, 29 Nov 2013 11:22:58 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Nzk1?= =?utf-8?q?=3A_Improved_markup_of_True/False_constants=2E?= Message-ID: <3dWBcf4QXVz7LjP@mail.python.org> http://hg.python.org/cpython/rev/f51ca196d77a changeset: 87633:f51ca196d77a branch: 2.7 parent: 87628:3a7a3def6503 user: Serhiy Storchaka date: Fri Nov 29 12:16:53 2013 +0200 summary: Issue #19795: Improved markup of True/False constants. files: Doc/howto/functional.rst | 4 ++-- Doc/library/ast.rst | 2 +- Doc/library/bdb.rst | 8 ++++---- Doc/library/cookielib.rst | 16 ++++++++-------- Doc/library/ctypes.rst | 2 +- Doc/library/email.message.rst | 2 +- Doc/library/functions.rst | 6 +++--- Doc/library/gc.rst | 4 ++-- Doc/library/io.rst | 2 +- Doc/library/itertools.rst | 4 ++-- Doc/library/os.path.rst | 6 +++--- Doc/library/sqlite3.rst | 8 ++++---- Doc/library/ssl.rst | 2 +- Doc/library/stdtypes.rst | 2 +- Doc/library/struct.rst | 2 +- Doc/library/ttk.rst | 8 ++++---- Doc/library/turtle.rst | 6 +++--- Doc/library/urllib2.rst | 2 +- Doc/library/zipimport.rst | 2 +- Doc/reference/datamodel.rst | 2 +- Misc/NEWS | 5 +++++ 21 files changed, 50 insertions(+), 45 deletions(-) diff --git a/Doc/howto/functional.rst b/Doc/howto/functional.rst --- a/Doc/howto/functional.rst +++ b/Doc/howto/functional.rst @@ -743,8 +743,8 @@ Python wiki at http://wiki.python.org/moin/HowTo/Sorting.) The ``any(iter)`` and ``all(iter)`` built-ins look at the truth values of an -iterable's contents. :func:`any` returns True if any element in the iterable is -a true value, and :func:`all` returns True if all of the elements are true +iterable's contents. :func:`any` returns ``True`` if any element in the iterable is +a true value, and :func:`all` returns ``True`` if all of the elements are true values: >>> any([0,1,0]) diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst --- a/Doc/library/ast.rst +++ b/Doc/library/ast.rst @@ -257,6 +257,6 @@ Return a formatted dump of the tree in *node*. This is mainly useful for debugging purposes. The returned string will show the names and the values for fields. 
This makes the code impossible to evaluate, so if evaluation is - wanted *annotate_fields* must be set to False. Attributes such as line + wanted *annotate_fields* must be set to ``False``. Attributes such as line numbers and column offsets are not dumped by default. If this is wanted, *include_attributes* can be set to ``True``. diff --git a/Doc/library/bdb.rst b/Doc/library/bdb.rst --- a/Doc/library/bdb.rst +++ b/Doc/library/bdb.rst @@ -186,17 +186,17 @@ .. method:: user_line(frame) This method is called from :meth:`dispatch_line` when either - :meth:`stop_here` or :meth:`break_here` yields True. + :meth:`stop_here` or :meth:`break_here` yields ``True``. .. method:: user_return(frame, return_value) This method is called from :meth:`dispatch_return` when :meth:`stop_here` - yields True. + yields ``True``. .. method:: user_exception(frame, exc_info) This method is called from :meth:`dispatch_exception` when - :meth:`stop_here` yields True. + :meth:`stop_here` yields ``True``. .. method:: do_clear(arg) @@ -237,7 +237,7 @@ .. method:: set_quit() - Set the :attr:`quitting` attribute to True. This raises :exc:`BdbQuit` in + Set the :attr:`quitting` attribute to ``True``. This raises :exc:`BdbQuit` in the next call to one of the :meth:`dispatch_\*` methods. diff --git a/Doc/library/cookielib.rst b/Doc/library/cookielib.rst --- a/Doc/library/cookielib.rst +++ b/Doc/library/cookielib.rst @@ -98,7 +98,7 @@ Netscape and RFC 2965 cookies. By default, RFC 2109 cookies (ie. cookies received in a :mailheader:`Set-Cookie` header with a version cookie-attribute of 1) are treated according to the RFC 2965 rules. However, if RFC 2965 handling - is turned off or :attr:`rfc2109_as_netscape` is True, RFC 2109 cookies are + is turned off or :attr:`rfc2109_as_netscape` is ``True``, RFC 2109 cookies are 'downgraded' by the :class:`CookieJar` instance to Netscape cookies, by setting the :attr:`version` attribute of the :class:`Cookie` instance to 0. :class:`DefaultCookiePolicy` also provides some parameters to allow some @@ -652,7 +652,7 @@ .. attribute:: Cookie.secure - True if cookie should only be returned over a secure connection. + ``True`` if cookie should only be returned over a secure connection. .. attribute:: Cookie.expires @@ -663,7 +663,7 @@ .. attribute:: Cookie.discard - True if this is a session cookie. + ``True`` if this is a session cookie. .. attribute:: Cookie.comment @@ -680,7 +680,7 @@ .. attribute:: Cookie.rfc2109 - True if this cookie was received as an RFC 2109 cookie (ie. the cookie + ``True`` if this cookie was received as an RFC 2109 cookie (ie. the cookie arrived in a :mailheader:`Set-Cookie` header, and the value of the Version cookie-attribute in that header was 1). This attribute is provided because :mod:`cookielib` may 'downgrade' RFC 2109 cookies to Netscape cookies, in @@ -691,18 +691,18 @@ .. attribute:: Cookie.port_specified - True if a port or set of ports was explicitly specified by the server (in the + ``True`` if a port or set of ports was explicitly specified by the server (in the :mailheader:`Set-Cookie` / :mailheader:`Set-Cookie2` header). .. attribute:: Cookie.domain_specified - True if a domain was explicitly specified by the server. + ``True`` if a domain was explicitly specified by the server. .. attribute:: Cookie.domain_initial_dot - True if the domain explicitly specified by the server began with a dot + ``True`` if the domain explicitly specified by the server began with a dot (``'.'``). Cookies may have additional non-standard cookie-attributes. 
These may be @@ -729,7 +729,7 @@ .. method:: Cookie.is_expired([now=None]) - True if cookie has passed the time at which the server requested it should + ``True`` if cookie has passed the time at which the server requested it should expire. If *now* is given (in seconds since the epoch), return whether the cookie has expired at the specified time. diff --git a/Doc/library/ctypes.rst b/Doc/library/ctypes.rst --- a/Doc/library/ctypes.rst +++ b/Doc/library/ctypes.rst @@ -2351,7 +2351,7 @@ .. class:: c_bool Represent the C :c:type:`bool` datatype (more accurately, :c:type:`_Bool` from - C99). Its value can be True or False, and the constructor accepts any object + C99). Its value can be ``True`` or ``False``, and the constructor accepts any object that has a truth value. .. versionadded:: 2.6 diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -68,7 +68,7 @@ Return ``True`` if the message's payload is a list of sub-\ :class:`Message` objects, otherwise return ``False``. When - :meth:`is_multipart` returns False, the payload should be a string object. + :meth:`is_multipart` returns ``False``, the payload should be a string object. .. method:: set_unixfrom(unixfrom) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -47,7 +47,7 @@ .. function:: all(iterable) - Return True if all elements of the *iterable* are true (or if the iterable + Return ``True`` if all elements of the *iterable* are true (or if the iterable is empty). Equivalent to:: def all(iterable): @@ -61,8 +61,8 @@ .. function:: any(iterable) - Return True if any element of the *iterable* is true. If the iterable - is empty, return False. Equivalent to:: + Return ``True`` if any element of the *iterable* is true. If the iterable + is empty, return ``False``. Equivalent to:: def any(iterable): for element in iterable: diff --git a/Doc/library/gc.rst b/Doc/library/gc.rst --- a/Doc/library/gc.rst +++ b/Doc/library/gc.rst @@ -142,8 +142,8 @@ .. function:: is_tracked(obj) - Returns True if the object is currently tracked by the garbage collector, - False otherwise. As a general rule, instances of atomic types aren't + Returns ``True`` if the object is currently tracked by the garbage collector, + ``False`` otherwise. As a general rule, instances of atomic types aren't tracked and instances of non-atomic types (containers, user-defined objects...) are. However, some type-specific optimizations can be present in order to suppress the garbage collector footprint of simple instances diff --git a/Doc/library/io.rst b/Doc/library/io.rst --- a/Doc/library/io.rst +++ b/Doc/library/io.rst @@ -278,7 +278,7 @@ .. method:: readable() - Return ``True`` if the stream can be read from. If False, :meth:`read` + Return ``True`` if the stream can be read from. If ``False``, :meth:`read` will raise :exc:`IOError`. .. method:: readline(limit=-1) diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -52,8 +52,8 @@ :func:`compress` data, selectors (d[0] if s[0]), (d[1] if s[1]), ... 
``compress('ABCDEF', [1,0,1,0,1,1]) --> A C E F`` :func:`dropwhile` pred, seq seq[n], seq[n+1], starting when pred fails ``dropwhile(lambda x: x<5, [1,4,6,4,1]) --> 6 4 1`` :func:`groupby` iterable[, keyfunc] sub-iterators grouped by value of keyfunc(v) -:func:`ifilter` pred, seq elements of seq where pred(elem) is True ``ifilter(lambda x: x%2, range(10)) --> 1 3 5 7 9`` -:func:`ifilterfalse` pred, seq elements of seq where pred(elem) is False ``ifilterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` +:func:`ifilter` pred, seq elements of seq where pred(elem) is true ``ifilter(lambda x: x%2, range(10)) --> 1 3 5 7 9`` +:func:`ifilterfalse` pred, seq elements of seq where pred(elem) is false ``ifilterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` :func:`islice` seq, [start,] stop [, step] elements from seq[start:stop:step] ``islice('ABCDEFG', 2, None) --> C D E F G`` :func:`imap` func, p, q, ... func(p0, q0), func(p1, q1), ... ``imap(pow, (2,3,10), (5,2,3)) --> 32 9 1000`` :func:`starmap` func, seq func(\*seq[0]), func(\*seq[1]), ... ``starmap(pow, [(2,5), (3,2), (10,3)]) --> 32 9 1000`` diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -127,7 +127,7 @@ .. versionadded:: 1.5.2 .. versionchanged:: 2.3 - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -140,7 +140,7 @@ .. versionadded:: 1.5.2 .. versionchanged:: 2.3 - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -345,7 +345,7 @@ .. data:: supports_unicode_filenames - True if arbitrary Unicode strings can be used as file names (within limitations + ``True`` if arbitrary Unicode strings can be used as file names (within limitations imposed by the file system). .. versionadded:: 2.3 diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -234,10 +234,10 @@ .. function:: enable_callback_tracebacks(flag) By default you will not get any tracebacks in user-defined functions, - aggregates, converters, authorizer callbacks etc. If you want to debug them, you - can call this function with *flag* as True. Afterwards, you will get tracebacks - from callbacks on ``sys.stderr``. Use :const:`False` to disable the feature - again. + aggregates, converters, authorizer callbacks etc. If you want to debug them, + you can call this function with *flag* set to ``True``. Afterwards, you will + get tracebacks from callbacks on ``sys.stderr``. Use :const:`False` to + disable the feature again. .. _sqlite3-connection-objects: diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -166,7 +166,7 @@ .. function:: RAND_status() - Returns True if the SSL pseudo-random number generator has been seeded with + Returns ``True`` if the SSL pseudo-random number generator has been seeded with 'enough' randomness, and False otherwise. You can use :func:`ssl.RAND_egd` and :func:`ssl.RAND_add` to increase the randomness of the pseudo-random number generator. diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1783,7 +1783,7 @@ .. method:: isdisjoint(other) - Return True if the set has no elements in common with *other*. 
Sets are + Return ``True`` if the set has no elements in common with *other*. Sets are disjoint if and only if their intersection is the empty set. .. versionadded:: 2.6 diff --git a/Doc/library/struct.rst b/Doc/library/struct.rst --- a/Doc/library/struct.rst +++ b/Doc/library/struct.rst @@ -284,7 +284,7 @@ For the ``'?'`` format character, the return value is either :const:`True` or :const:`False`. When packing, the truth value of the argument object is used. Either 0 or 1 in the native or standard bool representation will be packed, and -any non-zero value will be True when unpacking. +any non-zero value will be ``True`` when unpacking. diff --git a/Doc/library/ttk.rst b/Doc/library/ttk.rst --- a/Doc/library/ttk.rst +++ b/Doc/library/ttk.rst @@ -267,8 +267,8 @@ .. method:: instate(statespec, callback=None, *args, **kw) - Test the widget's state. If a callback is not specified, returns True - if the widget state matches *statespec* and False otherwise. If callback + Test the widget's state. If a callback is not specified, returns ``True`` + if the widget state matches *statespec* and ``False`` otherwise. If callback is specified then it is called with *args* if widget state matches *statespec*. @@ -919,7 +919,7 @@ .. method:: exists(item) - Returns True if the specified *item* is present in the tree. + Returns ``True`` if the specified *item* is present in the tree. .. method:: focus([item=None]) @@ -1065,7 +1065,7 @@ Ensure that *item* is visible. - Sets all of *item*'s ancestors open option to True, and scrolls the + Sets all of *item*'s ancestors open option to ``True``, and scrolls the widget if necessary so that *item* is within the visible portion of the tree. diff --git a/Doc/library/turtle.rst b/Doc/library/turtle.rst --- a/Doc/library/turtle.rst +++ b/Doc/library/turtle.rst @@ -1049,8 +1049,8 @@ Write text - the string representation of *arg* - at the current turtle position according to *align* ("left", "center" or right") and with the given - font. If *move* is True, the pen is moved to the bottom-right corner of the - text. By default, *move* is False. + font. If *move* is true, the pen is moved to the bottom-right corner of the + text. By default, *move* is ``False``. >>> turtle.write("Home = ", True, align="center") >>> turtle.write((0,0), True) @@ -1086,7 +1086,7 @@ .. function:: isvisible() - Return True if the Turtle is shown, False if it's hidden. + Return ``True`` if the Turtle is shown, ``False`` if it's hidden. >>> turtle.hideturtle() >>> turtle.isvisible() diff --git a/Doc/library/urllib2.rst b/Doc/library/urllib2.rst --- a/Doc/library/urllib2.rst +++ b/Doc/library/urllib2.rst @@ -166,7 +166,7 @@ should be the request-host of the request for the page containing the image. *unverifiable* should indicate whether the request is unverifiable, as defined - by RFC 2965. It defaults to False. An unverifiable request is one whose URL + by RFC 2965. It defaults to ``False``. An unverifiable request is one whose URL the user did not have the option to approve. For example, if the request is for an image in an HTML document, and the user had no option to approve the automatic fetching of the image, this should be true. diff --git a/Doc/library/zipimport.rst b/Doc/library/zipimport.rst --- a/Doc/library/zipimport.rst +++ b/Doc/library/zipimport.rst @@ -115,7 +115,7 @@ .. method:: is_package(fullname) - Return True if the module specified by *fullname* is a package. Raise + Return ``True`` if the module specified by *fullname* is a package. 
Raise :exc:`ZipImportError` if the module couldn't be found. diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -207,7 +207,7 @@ single: True These represent the truth values False and True. The two objects - representing the values False and True are the only Boolean objects. + representing the values ``False`` and ``True`` are the only Boolean objects. The Boolean type is a subtype of plain integers, and Boolean values behave like the values 0 and 1, respectively, in almost all contexts, the exception being that when converted to a string, the strings diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -67,6 +67,11 @@ - Issue #19085: Added basic tests for all tkinter widget options. +Documentation +------------- + +- Issue #19795: Improved markup of True/False constants. + Whats' New in Python 2.7.6? =========================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 11:23:00 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Fri, 29 Nov 2013 11:23:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Nzk1?= =?utf-8?q?=3A_Improved_markup_of_True/False_constants=2E?= Message-ID: <3dWBch3QN0z7LjZ@mail.python.org> http://hg.python.org/cpython/rev/b2066bc8cab9 changeset: 87634:b2066bc8cab9 branch: 3.3 parent: 87626:712ebde527c2 user: Serhiy Storchaka date: Fri Nov 29 12:17:13 2013 +0200 summary: Issue #19795: Improved markup of True/False constants. files: Doc/howto/curses.rst | 2 +- Doc/howto/functional.rst | 4 ++-- Doc/library/ast.rst | 2 +- Doc/library/bdb.rst | 8 ++++---- Doc/library/bz2.rst | 2 +- Doc/library/ctypes.rst | 2 +- Doc/library/difflib.rst | 2 +- Doc/library/email.message.rst | 2 +- Doc/library/functions.rst | 8 ++++---- Doc/library/gc.rst | 4 ++-- Doc/library/http.client.rst | 2 +- Doc/library/http.cookiejar.rst | 16 ++++++++-------- Doc/library/imp.rst | 4 ++-- Doc/library/importlib.rst | 10 ++++++++++ Doc/library/io.rst | 2 +- Doc/library/itertools.rst | 2 +- Doc/library/logging.handlers.rst | 2 +- Doc/library/logging.rst | 6 +++--- Doc/library/lzma.rst | 2 +- Doc/library/mmap.rst | 2 +- Doc/library/nntplib.rst | 4 ++-- Doc/library/os.path.rst | 6 +++--- Doc/library/os.rst | 2 +- Doc/library/pickle.rst | 12 ++++++------ Doc/library/sched.rst | 2 +- Doc/library/sqlite3.rst | 8 ++++---- Doc/library/ssl.rst | 6 +++--- Doc/library/stdtypes.rst | 2 +- Doc/library/struct.rst | 2 +- Doc/library/subprocess.rst | 4 ++-- Doc/library/sys.rst | 2 +- Doc/library/tarfile.rst | 2 +- Doc/library/tkinter.ttk.rst | 8 ++++---- Doc/library/turtle.rst | 6 +++--- Doc/library/unittest.mock.rst | 6 +++--- Doc/library/urllib.request.rst | 4 ++-- Doc/library/venv.rst | 4 ++-- Doc/library/zipimport.rst | 2 +- Doc/reference/datamodel.rst | 2 +- Doc/whatsnew/3.3.rst | 2 +- Misc/NEWS | 2 ++ 41 files changed, 92 insertions(+), 80 deletions(-) diff --git a/Doc/howto/curses.rst b/Doc/howto/curses.rst --- a/Doc/howto/curses.rst +++ b/Doc/howto/curses.rst @@ -422,7 +422,7 @@ blue or any other color you like. Unfortunately, the Linux console doesn't support this, so I'm unable to try it out, and can't provide any examples. You can check if your terminal can do this by calling -:func:`~curses.can_change_color`, which returns True if the capability is +:func:`~curses.can_change_color`, which returns ``True`` if the capability is there. 
If you're lucky enough to have such a talented terminal, consult your system's man pages for more information. diff --git a/Doc/howto/functional.rst b/Doc/howto/functional.rst --- a/Doc/howto/functional.rst +++ b/Doc/howto/functional.rst @@ -687,8 +687,8 @@ The :func:`any(iter) ` and :func:`all(iter) ` built-ins look at the -truth values of an iterable's contents. :func:`any` returns True if any element -in the iterable is a true value, and :func:`all` returns True if all of the +truth values of an iterable's contents. :func:`any` returns ``True`` if any element +in the iterable is a true value, and :func:`all` returns ``True`` if all of the elements are true values: >>> any([0,1,0]) diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst --- a/Doc/library/ast.rst +++ b/Doc/library/ast.rst @@ -244,6 +244,6 @@ Return a formatted dump of the tree in *node*. This is mainly useful for debugging purposes. The returned string will show the names and the values for fields. This makes the code impossible to evaluate, so if evaluation is - wanted *annotate_fields* must be set to False. Attributes such as line + wanted *annotate_fields* must be set to ``False``. Attributes such as line numbers and column offsets are not dumped by default. If this is wanted, *include_attributes* can be set to ``True``. diff --git a/Doc/library/bdb.rst b/Doc/library/bdb.rst --- a/Doc/library/bdb.rst +++ b/Doc/library/bdb.rst @@ -194,17 +194,17 @@ .. method:: user_line(frame) This method is called from :meth:`dispatch_line` when either - :meth:`stop_here` or :meth:`break_here` yields True. + :meth:`stop_here` or :meth:`break_here` yields ``True``. .. method:: user_return(frame, return_value) This method is called from :meth:`dispatch_return` when :meth:`stop_here` - yields True. + yields ``True``. .. method:: user_exception(frame, exc_info) This method is called from :meth:`dispatch_exception` when - :meth:`stop_here` yields True. + :meth:`stop_here` yields ``True``. .. method:: do_clear(arg) @@ -245,7 +245,7 @@ .. method:: set_quit() - Set the :attr:`quitting` attribute to True. This raises :exc:`BdbQuit` in + Set the :attr:`quitting` attribute to ``True``. This raises :exc:`BdbQuit` in the next call to one of the :meth:`dispatch_\*` methods. diff --git a/Doc/library/bz2.rst b/Doc/library/bz2.rst --- a/Doc/library/bz2.rst +++ b/Doc/library/bz2.rst @@ -162,7 +162,7 @@ .. attribute:: eof - True if the end-of-stream marker has been reached. + ``True`` if the end-of-stream marker has been reached. .. versionadded:: 3.3 diff --git a/Doc/library/ctypes.rst b/Doc/library/ctypes.rst --- a/Doc/library/ctypes.rst +++ b/Doc/library/ctypes.rst @@ -2242,7 +2242,7 @@ .. class:: c_bool Represent the C :c:type:`bool` datatype (more accurately, :c:type:`_Bool` from - C99). Its value can be True or False, and the constructor accepts any object + C99). Its value can be ``True`` or ``False``, and the constructor accepts any object that has a truth value. diff --git a/Doc/library/difflib.rst b/Doc/library/difflib.rst --- a/Doc/library/difflib.rst +++ b/Doc/library/difflib.rst @@ -359,7 +359,7 @@ The *autojunk* parameter. SequenceMatcher objects get three data attributes: *bjunk* is the - set of elements of *b* for which *isjunk* is True; *bpopular* is the set of + set of elements of *b* for which *isjunk* is ``True``; *bpopular* is the set of non-junk elements considered popular by the heuristic (if it is not disabled); *b2j* is a dict mapping the remaining elements of *b* to a list of positions where they occur. 
All three are reset whenever *b* is reset diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -75,7 +75,7 @@ Return ``True`` if the message's payload is a list of sub-\ :class:`Message` objects, otherwise return ``False``. When - :meth:`is_multipart` returns False, the payload should be a string object. + :meth:`is_multipart` returns ``False``, the payload should be a string object. .. method:: set_unixfrom(unixfrom) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -48,7 +48,7 @@ .. function:: all(iterable) - Return True if all elements of the *iterable* are true (or if the iterable + Return ``True`` if all elements of the *iterable* are true (or if the iterable is empty). Equivalent to:: def all(iterable): @@ -60,8 +60,8 @@ .. function:: any(iterable) - Return True if any element of the *iterable* is true. If the iterable - is empty, return False. Equivalent to:: + Return ``True`` if any element of the *iterable* is true. If the iterable + is empty, return ``False``. Equivalent to:: def any(iterable): for element in iterable: @@ -890,7 +890,7 @@ the buffer will typically be 4096 or 8192 bytes long. * "Interactive" text files (files for which :meth:`~io.IOBase.isatty` - returns True) use line buffering. Other text files use the policy + returns ``True``) use line buffering. Other text files use the policy described above for binary files. *encoding* is the name of the encoding used to decode or encode the file. diff --git a/Doc/library/gc.rst b/Doc/library/gc.rst --- a/Doc/library/gc.rst +++ b/Doc/library/gc.rst @@ -130,8 +130,8 @@ .. function:: is_tracked(obj) - Returns True if the object is currently tracked by the garbage collector, - False otherwise. As a general rule, instances of atomic types aren't + Returns ``True`` if the object is currently tracked by the garbage collector, + ``False`` otherwise. As a general rule, instances of atomic types aren't tracked and instances of non-atomic types (containers, user-defined objects...) are. However, some type-specific optimizations can be present in order to suppress the garbage collector footprint of simple instances diff --git a/Doc/library/http.client.rst b/Doc/library/http.client.rst --- a/Doc/library/http.client.rst +++ b/Doc/library/http.client.rst @@ -573,7 +573,7 @@ .. attribute:: HTTPResponse.closed - Is True if the stream is closed. + Is ``True`` if the stream is closed. Examples -------- diff --git a/Doc/library/http.cookiejar.rst b/Doc/library/http.cookiejar.rst --- a/Doc/library/http.cookiejar.rst +++ b/Doc/library/http.cookiejar.rst @@ -90,7 +90,7 @@ Netscape and RFC 2965 cookies. By default, RFC 2109 cookies (ie. cookies received in a :mailheader:`Set-Cookie` header with a version cookie-attribute of 1) are treated according to the RFC 2965 rules. However, if RFC 2965 handling - is turned off or :attr:`rfc2109_as_netscape` is True, RFC 2109 cookies are + is turned off or :attr:`rfc2109_as_netscape` is ``True``, RFC 2109 cookies are 'downgraded' by the :class:`CookieJar` instance to Netscape cookies, by setting the :attr:`version` attribute of the :class:`Cookie` instance to 0. :class:`DefaultCookiePolicy` also provides some parameters to allow some @@ -644,7 +644,7 @@ .. attribute:: Cookie.secure - True if cookie should only be returned over a secure connection. + ``True`` if cookie should only be returned over a secure connection. .. 
attribute:: Cookie.expires @@ -655,7 +655,7 @@ .. attribute:: Cookie.discard - True if this is a session cookie. + ``True`` if this is a session cookie. .. attribute:: Cookie.comment @@ -672,7 +672,7 @@ .. attribute:: Cookie.rfc2109 - True if this cookie was received as an RFC 2109 cookie (ie. the cookie + ``True`` if this cookie was received as an RFC 2109 cookie (ie. the cookie arrived in a :mailheader:`Set-Cookie` header, and the value of the Version cookie-attribute in that header was 1). This attribute is provided because :mod:`http.cookiejar` may 'downgrade' RFC 2109 cookies to Netscape cookies, in @@ -681,18 +681,18 @@ .. attribute:: Cookie.port_specified - True if a port or set of ports was explicitly specified by the server (in the + ``True`` if a port or set of ports was explicitly specified by the server (in the :mailheader:`Set-Cookie` / :mailheader:`Set-Cookie2` header). .. attribute:: Cookie.domain_specified - True if a domain was explicitly specified by the server. + ``True`` if a domain was explicitly specified by the server. .. attribute:: Cookie.domain_initial_dot - True if the domain explicitly specified by the server began with a dot + ``True`` if the domain explicitly specified by the server began with a dot (``'.'``). Cookies may have additional non-standard cookie-attributes. These may be @@ -719,7 +719,7 @@ .. method:: Cookie.is_expired(now=None) - True if cookie has passed the time at which the server requested it should + ``True`` if cookie has passed the time at which the server requested it should expire. If *now* is given (in seconds since the epoch), return whether the cookie has expired at the specified time. diff --git a/Doc/library/imp.rst b/Doc/library/imp.rst --- a/Doc/library/imp.rst +++ b/Doc/library/imp.rst @@ -190,8 +190,8 @@ The ``cpython-32`` string comes from the current magic tag (see :func:`get_tag`; if :attr:`sys.implementation.cache_tag` is not defined then :exc:`NotImplementedError` will be raised). The returned path will end in - ``.pyc`` when ``__debug__`` is True or ``.pyo`` for an optimized Python - (i.e. ``__debug__`` is False). By passing in True or False for + ``.pyc`` when ``__debug__`` is ``True`` or ``.pyo`` for an optimized Python + (i.e. ``__debug__`` is ``False``). By passing in ``True`` or ``False`` for *debug_override* you can override the system's value for ``__debug__`` for extension selection. diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -918,8 +918,18 @@ Reliance on this decorator is discouraged when it is possible to set :attr:`__package__` before importing. By setting it beforehand the code for the module is executed with the +<<<<<<< attribute set and thus can be used by global level code during initialization. +======= + The ``cpython-32`` string comes from the current magic tag (see + :func:`get_tag`; if :attr:`sys.implementation.cache_tag` is not defined then + :exc:`NotImplementedError` will be raised). The returned path will end in + ``.pyc`` when ``__debug__`` is ``True`` or ``.pyo`` for an optimized Python + (i.e. ``__debug__`` is ``False``). By passing in ``True`` or ``False`` for + *debug_override* you can override the system's value for ``__debug__`` for + extension selection. +>>>>>>> .. note:: diff --git a/Doc/library/io.rst b/Doc/library/io.rst --- a/Doc/library/io.rst +++ b/Doc/library/io.rst @@ -280,7 +280,7 @@ .. method:: readable() - Return ``True`` if the stream can be read from. 
If False, :meth:`read` + Return ``True`` if the stream can be read from. If ``False``, :meth:`read` will raise :exc:`OSError`. .. method:: readline(limit=-1) diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -51,7 +51,7 @@ chain.from_iterable iterable p0, p1, ... plast, q0, q1, ... ``chain.from_iterable(['ABC', 'DEF']) --> A B C D E F`` :func:`compress` data, selectors (d[0] if s[0]), (d[1] if s[1]), ... ``compress('ABCDEF', [1,0,1,0,1,1]) --> A C E F`` :func:`dropwhile` pred, seq seq[n], seq[n+1], starting when pred fails ``dropwhile(lambda x: x<5, [1,4,6,4,1]) --> 6 4 1`` -:func:`filterfalse` pred, seq elements of seq where pred(elem) is False ``filterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` +:func:`filterfalse` pred, seq elements of seq where pred(elem) is false ``filterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` :func:`groupby` iterable[, keyfunc] sub-iterators grouped by value of keyfunc(v) :func:`islice` seq, [start,] stop [, step] elements from seq[start:stop:step] ``islice('ABCDEFG', 2, None) --> C D E F G`` :func:`starmap` func, seq func(\*seq[0]), func(\*seq[1]), ... ``starmap(pow, [(2,5), (3,2), (10,3)]) --> 32 9 1000`` diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -831,7 +831,7 @@ Returns a new instance of the :class:`HTTPHandler` class. The *host* can be of the form ``host:port``, should you need to use a specific port number. - If no *method* is specified, ``GET`` is used. If *secure* is True, an HTTPS + If no *method* is specified, ``GET`` is used. If *secure* is true, an HTTPS connection will be used. If *credentials* is specified, it should be a 2-tuple consisting of userid and password, which will be placed in an HTTP 'Authorization' header using Basic authentication. If you specify diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -160,7 +160,7 @@ is called to get the exception information. The second optional keyword argument is *stack_info*, which defaults to - False. If specified as True, stack information is added to the logging + ``False``. If true, stack information is added to the logging message, including the actual logging call. Note that this is not the same stack information as that displayed through specifying *exc_info*: The former is stack frames from the bottom of the stack up to the logging call @@ -308,7 +308,7 @@ Checks to see if this logger has any handlers configured. This is done by looking for handlers in this logger and its parents in the logger hierarchy. - Returns True if a handler was found, else False. The method stops searching + Returns ``True`` if a handler was found, else ``False``. The method stops searching up the hierarchy whenever a logger with the 'propagate' attribute set to False is found - that will be the last logger which is checked for the existence of handlers. @@ -878,7 +878,7 @@ is called to get the exception information. The second optional keyword argument is *stack_info*, which defaults to - False. If specified as True, stack information is added to the logging + ``False``. If true, stack information is added to the logging message, including the actual logging call. 
Note that this is not the same stack information as that displayed through specifying *exc_info*: The former is stack frames from the bottom of the stack up to the logging call diff --git a/Doc/library/lzma.rst b/Doc/library/lzma.rst --- a/Doc/library/lzma.rst +++ b/Doc/library/lzma.rst @@ -225,7 +225,7 @@ .. attribute:: eof - True if the end-of-stream marker has been reached. + ``True`` if the end-of-stream marker has been reached. .. attribute:: unused_data diff --git a/Doc/library/mmap.rst b/Doc/library/mmap.rst --- a/Doc/library/mmap.rst +++ b/Doc/library/mmap.rst @@ -162,7 +162,7 @@ .. attribute:: closed - True if the file is closed. + ``True`` if the file is closed. .. versionadded:: 3.2 diff --git a/Doc/library/nntplib.rst b/Doc/library/nntplib.rst --- a/Doc/library/nntplib.rst +++ b/Doc/library/nntplib.rst @@ -83,7 +83,7 @@ .. versionchanged:: 3.2 - *usenetrc* is now False by default. + *usenetrc* is now ``False`` by default. .. versionchanged:: 3.3 Support for the :keyword:`with` statement was added. @@ -217,7 +217,7 @@ .. method:: NNTP.login(user=None, password=None, usenetrc=True) Send ``AUTHINFO`` commands with the user name and password. If *user* - and *password* are None and *usenetrc* is True, credentials from + and *password* are None and *usenetrc* is true, credentials from ``~/.netrc`` will be used if possible. Unless intentionally delayed, login is normally performed during the diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -133,7 +133,7 @@ the number of seconds since the epoch (see the :mod:`time` module). Raise :exc:`OSError` if the file does not exist or is inaccessible. - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -143,7 +143,7 @@ giving the number of seconds since the epoch (see the :mod:`time` module). Raise :exc:`OSError` if the file does not exist or is inaccessible. - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -334,5 +334,5 @@ .. data:: supports_unicode_filenames - True if arbitrary Unicode strings can be used as file names (within limitations + ``True`` if arbitrary Unicode strings can be used as file names (within limitations imposed by the file system). diff --git a/Doc/library/os.rst b/Doc/library/os.rst --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -551,7 +551,7 @@ .. data:: supports_bytes_environ - True if the native OS type of the environment is bytes (eg. False on + ``True`` if the native OS type of the environment is bytes (eg. ``False`` on Windows). .. versionadded:: 3.2 diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -161,7 +161,7 @@ :class:`io.BytesIO` instance, or any other custom object that meets this interface. - If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -178,7 +178,7 @@ supported. The higher the protocol used, the more recent the version of Python needed to read the pickle produced. 
- If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -200,7 +200,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. @@ -216,7 +216,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. @@ -266,7 +266,7 @@ argument. It can thus be an on-disk file opened for binary writing, a :class:`io.BytesIO` instance, or any other custom object that meets this interface. - If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -336,7 +336,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. diff --git a/Doc/library/sched.rst b/Doc/library/sched.rst --- a/Doc/library/sched.rst +++ b/Doc/library/sched.rst @@ -113,7 +113,7 @@ function passed to the constructor) for the next event, then execute it and so on until there are no more scheduled events. - If *blocking* is False executes the scheduled events due to expire soonest + If *blocking* is false executes the scheduled events due to expire soonest (if any) and then return the deadline of the next scheduled call in the scheduler (if any). diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -229,10 +229,10 @@ .. function:: enable_callback_tracebacks(flag) By default you will not get any tracebacks in user-defined functions, - aggregates, converters, authorizer callbacks etc. If you want to debug them, you - can call this function with *flag* as True. Afterwards, you will get tracebacks - from callbacks on ``sys.stderr``. Use :const:`False` to disable the feature - again. + aggregates, converters, authorizer callbacks etc. If you want to debug them, + you can call this function with *flag* set to ``True``. Afterwards, you will + get tracebacks from callbacks on ``sys.stderr``. 
Use :const:`False` to + disable the feature again. .. _sqlite3-connection-objects: diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -245,7 +245,7 @@ .. function:: RAND_pseudo_bytes(num) Returns (bytes, is_cryptographic): bytes are *num* pseudo-random bytes, - is_cryptographic is True if the bytes generated are cryptographically + is_cryptographic is ``True`` if the bytes generated are cryptographically strong. Raises an :class:`SSLError` if the operation is not supported by the current RAND method. @@ -258,8 +258,8 @@ .. function:: RAND_status() - Returns True if the SSL pseudo-random number generator has been seeded with - 'enough' randomness, and False otherwise. You can use :func:`ssl.RAND_egd` + Returns ``True`` if the SSL pseudo-random number generator has been seeded with + 'enough' randomness, and ``False`` otherwise. You can use :func:`ssl.RAND_egd` and :func:`ssl.RAND_add` to increase the randomness of the pseudo-random number generator. diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -2847,7 +2847,7 @@ .. method:: isdisjoint(other) - Return True if the set has no elements in common with *other*. Sets are + Return ``True`` if the set has no elements in common with *other*. Sets are disjoint if and only if their intersection is the empty set. .. method:: issubset(other) diff --git a/Doc/library/struct.rst b/Doc/library/struct.rst --- a/Doc/library/struct.rst +++ b/Doc/library/struct.rst @@ -282,7 +282,7 @@ For the ``'?'`` format character, the return value is either :const:`True` or :const:`False`. When packing, the truth value of the argument object is used. Either 0 or 1 in the native or standard bool representation will be packed, and -any non-zero value will be True when unpacking. +any non-zero value will be ``True`` when unpacking. diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -512,7 +512,7 @@ *executable* (or for the first item in *args*) relative to *cwd* if the executable path is a relative path. - If *restore_signals* is True (the default) all signals that Python has set to + If *restore_signals* is true (the default) all signals that Python has set to SIG_IGN are restored to SIG_DFL in the child process before the exec. Currently this includes the SIGPIPE, SIGXFZ and SIGXFSZ signals. (Unix only) @@ -520,7 +520,7 @@ .. versionchanged:: 3.2 *restore_signals* was added. - If *start_new_session* is True the setsid() system call will be made in the + If *start_new_session* is true the setsid() system call will be made in the child process prior to the execution of the subprocess. (Unix only) .. versionchanged:: 3.2 diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -1032,7 +1032,7 @@ :func:`open` function. Their parameters are chosen as follows: * The character encoding is platform-dependent. Under Windows, if the stream - is interactive (that is, if its :meth:`isatty` method returns True), the + is interactive (that is, if its :meth:`isatty` method returns ``True``), the console codepage is used, otherwise the ANSI code page. Under other platforms, the locale encoding is used (see :meth:`locale.getpreferredencoding`). diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -359,7 +359,7 @@ full name. 
Its file information is extracted as accurately as possible. *member* may be a filename or a :class:`TarInfo` object. You can specify a different directory using *path*. File attributes (owner, mtime, mode) are set unless - *set_attrs* is False. + *set_attrs* is false. .. note:: diff --git a/Doc/library/tkinter.ttk.rst b/Doc/library/tkinter.ttk.rst --- a/Doc/library/tkinter.ttk.rst +++ b/Doc/library/tkinter.ttk.rst @@ -272,8 +272,8 @@ .. method:: instate(statespec, callback=None, *args, **kw) - Test the widget's state. If a callback is not specified, returns True - if the widget state matches *statespec* and False otherwise. If callback + Test the widget's state. If a callback is not specified, returns ``True`` + if the widget state matches *statespec* and ``False`` otherwise. If callback is specified then it is called with args if widget state matches *statespec*. @@ -938,7 +938,7 @@ .. method:: exists(item) - Returns True if the specified *item* is present in the tree. + Returns ``True`` if the specified *item* is present in the tree. .. method:: focus(item=None) @@ -1084,7 +1084,7 @@ Ensure that *item* is visible. - Sets all of *item*'s ancestors open option to True, and scrolls the + Sets all of *item*'s ancestors open option to ``True``, and scrolls the widget if necessary so that *item* is within the visible portion of the tree. diff --git a/Doc/library/turtle.rst b/Doc/library/turtle.rst --- a/Doc/library/turtle.rst +++ b/Doc/library/turtle.rst @@ -1055,8 +1055,8 @@ Write text - the string representation of *arg* - at the current turtle position according to *align* ("left", "center" or right") and with the given - font. If *move* is True, the pen is moved to the bottom-right corner of the - text. By default, *move* is False. + font. If *move* is true, the pen is moved to the bottom-right corner of the + text. By default, *move* is ``False``. >>> turtle.write("Home = ", True, align="center") >>> turtle.write((0,0), True) @@ -1092,7 +1092,7 @@ .. function:: isvisible() - Return True if the Turtle is shown, False if it's hidden. + Return ``True`` if the Turtle is shown, ``False`` if it's hidden. >>> turtle.hideturtle() >>> turtle.isvisible() diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -299,11 +299,11 @@ assert the mock has been called with the specified calls. The `mock_calls` list is checked for the calls. - If `any_order` is False (the default) then the calls must be + If `any_order` is false (the default) then the calls must be sequential. There can be extra calls before or after the specified calls. - If `any_order` is True then the calls can be in any order, but + If `any_order` is true then the calls can be in any order, but they must all appear in :attr:`mock_calls`. >>> mock = Mock(return_value=None) @@ -1180,7 +1180,7 @@ `values` can be a dictionary of values to set in the dictionary. `values` can also be an iterable of `(key, value)` pairs. - If `clear` is True then the dictionary will be cleared before the new + If `clear` is true then the dictionary will be cleared before the new values are set. `patch.dict` can also be called with arbitrary keyword arguments to set diff --git a/Doc/library/urllib.request.rst b/Doc/library/urllib.request.rst --- a/Doc/library/urllib.request.rst +++ b/Doc/library/urllib.request.rst @@ -59,7 +59,7 @@ some non-Windows platforms. .. 
warning:: - If neither *cafile* nor *capath* is specified, and *cadefault* is False, + If neither *cafile* nor *capath* is specified, and *cadefault* is ``False``, an HTTPS request will not do any verification of the server's certificate. @@ -211,7 +211,7 @@ containing the image. *unverifiable* should indicate whether the request is unverifiable, - as defined by RFC 2965. It defaults to False. An unverifiable + as defined by RFC 2965. It defaults to ``False``. An unverifiable request is one whose URL the user did not have the option to approve. For example, if the request is for an image in an HTML document, and the user had no option to approve the automatic diff --git a/Doc/library/venv.rst b/Doc/library/venv.rst --- a/Doc/library/venv.rst +++ b/Doc/library/venv.rst @@ -93,7 +93,7 @@ * ``system_site_packages`` -- a Boolean value indicating that the system Python site-packages should be available to the environment (defaults to ``False``). - * ``clear`` -- a Boolean value which, if True, will delete any existing target + * ``clear`` -- a Boolean value which, if true, will delete any existing target directory instead of raising an exception (defaults to ``False``). * ``symlinks`` -- a Boolean value indicating whether to attempt to symlink the @@ -101,7 +101,7 @@ e.g. ``pythonw.exe``), rather than copying. Defaults to ``True`` on Linux and Unix systems, but ``False`` on Windows. - * ``upgrade`` -- a Boolean value which, if True, will upgrade an existing + * ``upgrade`` -- a Boolean value which, if true, will upgrade an existing environment with the running Python - for use when that Python has been upgraded in-place (defaults to ``False``). diff --git a/Doc/library/zipimport.rst b/Doc/library/zipimport.rst --- a/Doc/library/zipimport.rst +++ b/Doc/library/zipimport.rst @@ -111,7 +111,7 @@ .. method:: is_package(fullname) - Return True if the module specified by *fullname* is a package. Raise + Return ``True`` if the module specified by *fullname* is a package. Raise :exc:`ZipImportError` if the module couldn't be found. diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -201,7 +201,7 @@ single: True These represent the truth values False and True. The two objects representing - the values False and True are the only Boolean objects. The Boolean type is a + the values ``False`` and ``True`` are the only Boolean objects. The Boolean type is a subtype of the integer type, and Boolean values behave like the values 0 and 1, respectively, in almost all contexts, the exception being that when converted to a string, the strings ``"False"`` or ``"True"`` are returned, respectively. diff --git a/Doc/whatsnew/3.3.rst b/Doc/whatsnew/3.3.rst --- a/Doc/whatsnew/3.3.rst +++ b/Doc/whatsnew/3.3.rst @@ -1125,7 +1125,7 @@ * If Python is compiled without threads, the C version automatically disables the expensive thread local context machinery. In this case, - the variable :data:`~decimal.HAVE_THREADS` is set to False. + the variable :data:`~decimal.HAVE_THREADS` is set to ``False``. API changes ~~~~~~~~~~~ diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -87,6 +87,8 @@ Documentation ------------- +- Issue #19795: Improved markup of True/False constants. + - Issue #18326: Clarify that list.sort's arguments are keyword-only. Also, attempt to reduce confusion in the glossary by not saying there are different "types" of arguments and parameters. 
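The pattern in the hunks above is a consistent convention rather than a blanket substitution: inline-literal ``True``/``False`` is used where the docs mean the actual boolean constants (a return value or attribute that is exactly one of the two objects), while lowercase "true"/"false" remains where any object with the right truth value is accepted (e.g. "If *secure* is true"). A minimal sketch of that distinction, using hypothetical functions rather than code taken from the patch:

    def stream_closed(stream):
        """Return ``True`` if *stream* is closed."""
        return stream.closed                      # always one of the two bool constants

    def fetch(host, secure=False):
        """If *secure* is true, use HTTPS."""
        scheme = 'https' if secure else 'http'    # any truthy value enables HTTPS
        return '{0}://{1}'.format(scheme, host)

Documenting the first as returning ``True``/``False`` and the second as accepting a true value mirrors the markup choices made throughout the diff.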
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 11:23:02 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Fri, 29 Nov 2013 11:23:02 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319795=3A_Improved_markup_of_True/False_constant?= =?utf-8?q?s=2E?= Message-ID: <3dWBck2pK0z7Ljx@mail.python.org> http://hg.python.org/cpython/rev/6a3e09cd96f3 changeset: 87635:6a3e09cd96f3 parent: 87632:d829111a27e5 parent: 87634:b2066bc8cab9 user: Serhiy Storchaka date: Fri Nov 29 12:19:53 2013 +0200 summary: Issue #19795: Improved markup of True/False constants. files: Doc/howto/curses.rst | 2 +- Doc/howto/functional.rst | 4 ++-- Doc/library/ast.rst | 2 +- Doc/library/bdb.rst | 8 ++++---- Doc/library/bz2.rst | 2 +- Doc/library/ctypes.rst | 2 +- Doc/library/difflib.rst | 2 +- Doc/library/dis.rst | 2 +- Doc/library/email.message.rst | 2 +- Doc/library/functions.rst | 8 ++++---- Doc/library/gc.rst | 4 ++-- Doc/library/http.client.rst | 2 +- Doc/library/http.cookiejar.rst | 16 ++++++++-------- Doc/library/imp.rst | 4 ++-- Doc/library/importlib.rst | 4 ++-- Doc/library/io.rst | 2 +- Doc/library/itertools.rst | 2 +- Doc/library/logging.handlers.rst | 2 +- Doc/library/logging.rst | 6 +++--- Doc/library/lzma.rst | 2 +- Doc/library/mmap.rst | 2 +- Doc/library/nntplib.rst | 4 ++-- Doc/library/os.path.rst | 6 +++--- Doc/library/os.rst | 2 +- Doc/library/pickle.rst | 12 ++++++------ Doc/library/plistlib.rst | 2 +- Doc/library/sched.rst | 2 +- Doc/library/sqlite3.rst | 8 ++++---- Doc/library/ssl.rst | 8 ++++---- Doc/library/stdtypes.rst | 2 +- Doc/library/struct.rst | 2 +- Doc/library/subprocess.rst | 4 ++-- Doc/library/sys.rst | 2 +- Doc/library/tarfile.rst | 2 +- Doc/library/tkinter.ttk.rst | 8 ++++---- Doc/library/turtle.rst | 6 +++--- Doc/library/unittest.mock.rst | 6 +++--- Doc/library/urllib.request.rst | 4 ++-- Doc/library/venv.rst | 6 +++--- Doc/library/zipfile.rst | 2 +- Doc/library/zipimport.rst | 2 +- Doc/reference/datamodel.rst | 2 +- Doc/whatsnew/3.3.rst | 2 +- Misc/NEWS | 6 ++++++ 44 files changed, 93 insertions(+), 87 deletions(-) diff --git a/Doc/howto/curses.rst b/Doc/howto/curses.rst --- a/Doc/howto/curses.rst +++ b/Doc/howto/curses.rst @@ -422,7 +422,7 @@ blue or any other color you like. Unfortunately, the Linux console doesn't support this, so I'm unable to try it out, and can't provide any examples. You can check if your terminal can do this by calling -:func:`~curses.can_change_color`, which returns True if the capability is +:func:`~curses.can_change_color`, which returns ``True`` if the capability is there. If you're lucky enough to have such a talented terminal, consult your system's man pages for more information. diff --git a/Doc/howto/functional.rst b/Doc/howto/functional.rst --- a/Doc/howto/functional.rst +++ b/Doc/howto/functional.rst @@ -689,8 +689,8 @@ The :func:`any(iter) ` and :func:`all(iter) ` built-ins look at the -truth values of an iterable's contents. :func:`any` returns True if any element -in the iterable is a true value, and :func:`all` returns True if all of the +truth values of an iterable's contents. :func:`any` returns ``True`` if any element +in the iterable is a true value, and :func:`all` returns ``True`` if all of the elements are true values: >>> any([0,1,0]) diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst --- a/Doc/library/ast.rst +++ b/Doc/library/ast.rst @@ -244,6 +244,6 @@ Return a formatted dump of the tree in *node*. 
This is mainly useful for debugging purposes. The returned string will show the names and the values for fields. This makes the code impossible to evaluate, so if evaluation is - wanted *annotate_fields* must be set to False. Attributes such as line + wanted *annotate_fields* must be set to ``False``. Attributes such as line numbers and column offsets are not dumped by default. If this is wanted, *include_attributes* can be set to ``True``. diff --git a/Doc/library/bdb.rst b/Doc/library/bdb.rst --- a/Doc/library/bdb.rst +++ b/Doc/library/bdb.rst @@ -194,17 +194,17 @@ .. method:: user_line(frame) This method is called from :meth:`dispatch_line` when either - :meth:`stop_here` or :meth:`break_here` yields True. + :meth:`stop_here` or :meth:`break_here` yields ``True``. .. method:: user_return(frame, return_value) This method is called from :meth:`dispatch_return` when :meth:`stop_here` - yields True. + yields ``True``. .. method:: user_exception(frame, exc_info) This method is called from :meth:`dispatch_exception` when - :meth:`stop_here` yields True. + :meth:`stop_here` yields ``True``. .. method:: do_clear(arg) @@ -245,7 +245,7 @@ .. method:: set_quit() - Set the :attr:`quitting` attribute to True. This raises :exc:`BdbQuit` in + Set the :attr:`quitting` attribute to ``True``. This raises :exc:`BdbQuit` in the next call to one of the :meth:`dispatch_\*` methods. diff --git a/Doc/library/bz2.rst b/Doc/library/bz2.rst --- a/Doc/library/bz2.rst +++ b/Doc/library/bz2.rst @@ -169,7 +169,7 @@ .. attribute:: eof - True if the end-of-stream marker has been reached. + ``True`` if the end-of-stream marker has been reached. .. versionadded:: 3.3 diff --git a/Doc/library/ctypes.rst b/Doc/library/ctypes.rst --- a/Doc/library/ctypes.rst +++ b/Doc/library/ctypes.rst @@ -2242,7 +2242,7 @@ .. class:: c_bool Represent the C :c:type:`bool` datatype (more accurately, :c:type:`_Bool` from - C99). Its value can be True or False, and the constructor accepts any object + C99). Its value can be ``True`` or ``False``, and the constructor accepts any object that has a truth value. diff --git a/Doc/library/difflib.rst b/Doc/library/difflib.rst --- a/Doc/library/difflib.rst +++ b/Doc/library/difflib.rst @@ -359,7 +359,7 @@ The *autojunk* parameter. SequenceMatcher objects get three data attributes: *bjunk* is the - set of elements of *b* for which *isjunk* is True; *bpopular* is the set of + set of elements of *b* for which *isjunk* is ``True``; *bpopular* is the set of non-junk elements considered popular by the heuristic (if it is not disabled); *b2j* is a dict mapping the remaining elements of *b* to a list of positions where they occur. All three are reset whenever *b* is reset diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -274,7 +274,7 @@ .. data:: is_jump_target - True if other code jumps to here, otherwise False + ``True`` if other code jumps to here, otherwise ``False`` .. versionadded:: 3.4 diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -131,7 +131,7 @@ Return ``True`` if the message's payload is a list of sub-\ :class:`Message` objects, otherwise return ``False``. When - :meth:`is_multipart` returns False, the payload should be a string object. + :meth:`is_multipart` returns ``False``, the payload should be a string object. .. 
method:: set_unixfrom(unixfrom) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -48,7 +48,7 @@ .. function:: all(iterable) - Return True if all elements of the *iterable* are true (or if the iterable + Return ``True`` if all elements of the *iterable* are true (or if the iterable is empty). Equivalent to:: def all(iterable): @@ -60,8 +60,8 @@ .. function:: any(iterable) - Return True if any element of the *iterable* is true. If the iterable - is empty, return False. Equivalent to:: + Return ``True`` if any element of the *iterable* is true. If the iterable + is empty, return ``False``. Equivalent to:: def any(iterable): for element in iterable: @@ -905,7 +905,7 @@ the buffer will typically be 4096 or 8192 bytes long. * "Interactive" text files (files for which :meth:`~io.IOBase.isatty` - returns True) use line buffering. Other text files use the policy + returns ``True``) use line buffering. Other text files use the policy described above for binary files. *encoding* is the name of the encoding used to decode or encode the file. diff --git a/Doc/library/gc.rst b/Doc/library/gc.rst --- a/Doc/library/gc.rst +++ b/Doc/library/gc.rst @@ -148,8 +148,8 @@ .. function:: is_tracked(obj) - Returns True if the object is currently tracked by the garbage collector, - False otherwise. As a general rule, instances of atomic types aren't + Returns ``True`` if the object is currently tracked by the garbage collector, + ``False`` otherwise. As a general rule, instances of atomic types aren't tracked and instances of non-atomic types (containers, user-defined objects...) are. However, some type-specific optimizations can be present in order to suppress the garbage collector footprint of simple instances diff --git a/Doc/library/http.client.rst b/Doc/library/http.client.rst --- a/Doc/library/http.client.rst +++ b/Doc/library/http.client.rst @@ -573,7 +573,7 @@ .. attribute:: HTTPResponse.closed - Is True if the stream is closed. + Is ``True`` if the stream is closed. Examples -------- diff --git a/Doc/library/http.cookiejar.rst b/Doc/library/http.cookiejar.rst --- a/Doc/library/http.cookiejar.rst +++ b/Doc/library/http.cookiejar.rst @@ -90,7 +90,7 @@ Netscape and RFC 2965 cookies. By default, RFC 2109 cookies (ie. cookies received in a :mailheader:`Set-Cookie` header with a version cookie-attribute of 1) are treated according to the RFC 2965 rules. However, if RFC 2965 handling - is turned off or :attr:`rfc2109_as_netscape` is True, RFC 2109 cookies are + is turned off or :attr:`rfc2109_as_netscape` is ``True``, RFC 2109 cookies are 'downgraded' by the :class:`CookieJar` instance to Netscape cookies, by setting the :attr:`version` attribute of the :class:`Cookie` instance to 0. :class:`DefaultCookiePolicy` also provides some parameters to allow some @@ -644,7 +644,7 @@ .. attribute:: Cookie.secure - True if cookie should only be returned over a secure connection. + ``True`` if cookie should only be returned over a secure connection. .. attribute:: Cookie.expires @@ -655,7 +655,7 @@ .. attribute:: Cookie.discard - True if this is a session cookie. + ``True`` if this is a session cookie. .. attribute:: Cookie.comment @@ -672,7 +672,7 @@ .. attribute:: Cookie.rfc2109 - True if this cookie was received as an RFC 2109 cookie (ie. the cookie + ``True`` if this cookie was received as an RFC 2109 cookie (ie. the cookie arrived in a :mailheader:`Set-Cookie` header, and the value of the Version cookie-attribute in that header was 1). 
This attribute is provided because :mod:`http.cookiejar` may 'downgrade' RFC 2109 cookies to Netscape cookies, in @@ -681,18 +681,18 @@ .. attribute:: Cookie.port_specified - True if a port or set of ports was explicitly specified by the server (in the + ``True`` if a port or set of ports was explicitly specified by the server (in the :mailheader:`Set-Cookie` / :mailheader:`Set-Cookie2` header). .. attribute:: Cookie.domain_specified - True if a domain was explicitly specified by the server. + ``True`` if a domain was explicitly specified by the server. .. attribute:: Cookie.domain_initial_dot - True if the domain explicitly specified by the server began with a dot + ``True`` if the domain explicitly specified by the server began with a dot (``'.'``). Cookies may have additional non-standard cookie-attributes. These may be @@ -719,7 +719,7 @@ .. method:: Cookie.is_expired(now=None) - True if cookie has passed the time at which the server requested it should + ``True`` if cookie has passed the time at which the server requested it should expire. If *now* is given (in seconds since the epoch), return whether the cookie has expired at the specified time. diff --git a/Doc/library/imp.rst b/Doc/library/imp.rst --- a/Doc/library/imp.rst +++ b/Doc/library/imp.rst @@ -200,8 +200,8 @@ The ``cpython-32`` string comes from the current magic tag (see :func:`get_tag`; if :attr:`sys.implementation.cache_tag` is not defined then :exc:`NotImplementedError` will be raised). The returned path will end in - ``.pyc`` when ``__debug__`` is True or ``.pyo`` for an optimized Python - (i.e. ``__debug__`` is False). By passing in True or False for + ``.pyc`` when ``__debug__`` is ``True`` or ``.pyo`` for an optimized Python + (i.e. ``__debug__`` is ``False``). By passing in ``True`` or ``False`` for *debug_override* you can override the system's value for ``__debug__`` for extension selection. diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -928,8 +928,8 @@ The ``cpython-32`` string comes from the current magic tag (see :func:`get_tag`; if :attr:`sys.implementation.cache_tag` is not defined then :exc:`NotImplementedError` will be raised). The returned path will end in - ``.pyc`` when ``__debug__`` is True or ``.pyo`` for an optimized Python - (i.e. ``__debug__`` is False). By passing in True or False for + ``.pyc`` when ``__debug__`` is ``True`` or ``.pyo`` for an optimized Python + (i.e. ``__debug__`` is ``False``). By passing in ``True`` or ``False`` for *debug_override* you can override the system's value for ``__debug__`` for extension selection. diff --git a/Doc/library/io.rst b/Doc/library/io.rst --- a/Doc/library/io.rst +++ b/Doc/library/io.rst @@ -280,7 +280,7 @@ .. method:: readable() - Return ``True`` if the stream can be read from. If False, :meth:`read` + Return ``True`` if the stream can be read from. If ``False``, :meth:`read` will raise :exc:`OSError`. .. method:: readline(size=-1) diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -51,7 +51,7 @@ chain.from_iterable iterable p0, p1, ... plast, q0, q1, ... ``chain.from_iterable(['ABC', 'DEF']) --> A B C D E F`` :func:`compress` data, selectors (d[0] if s[0]), (d[1] if s[1]), ... 
``compress('ABCDEF', [1,0,1,0,1,1]) --> A C E F`` :func:`dropwhile` pred, seq seq[n], seq[n+1], starting when pred fails ``dropwhile(lambda x: x<5, [1,4,6,4,1]) --> 6 4 1`` -:func:`filterfalse` pred, seq elements of seq where pred(elem) is False ``filterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` +:func:`filterfalse` pred, seq elements of seq where pred(elem) is false ``filterfalse(lambda x: x%2, range(10)) --> 0 2 4 6 8`` :func:`groupby` iterable[, keyfunc] sub-iterators grouped by value of keyfunc(v) :func:`islice` seq, [start,] stop [, step] elements from seq[start:stop:step] ``islice('ABCDEFG', 2, None) --> C D E F G`` :func:`starmap` func, seq func(\*seq[0]), func(\*seq[1]), ... ``starmap(pow, [(2,5), (3,2), (10,3)]) --> 32 9 1000`` diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -843,7 +843,7 @@ Returns a new instance of the :class:`HTTPHandler` class. The *host* can be of the form ``host:port``, should you need to use a specific port number. - If no *method* is specified, ``GET`` is used. If *secure* is True, an HTTPS + If no *method* is specified, ``GET`` is used. If *secure* is true, an HTTPS connection will be used. If *credentials* is specified, it should be a 2-tuple consisting of userid and password, which will be placed in an HTTP 'Authorization' header using Basic authentication. If you specify diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -160,7 +160,7 @@ is called to get the exception information. The second optional keyword argument is *stack_info*, which defaults to - False. If specified as True, stack information is added to the logging + ``False``. If true, stack information is added to the logging message, including the actual logging call. Note that this is not the same stack information as that displayed through specifying *exc_info*: The former is stack frames from the bottom of the stack up to the logging call @@ -308,7 +308,7 @@ Checks to see if this logger has any handlers configured. This is done by looking for handlers in this logger and its parents in the logger hierarchy. - Returns True if a handler was found, else False. The method stops searching + Returns ``True`` if a handler was found, else ``False``. The method stops searching up the hierarchy whenever a logger with the 'propagate' attribute set to False is found - that will be the last logger which is checked for the existence of handlers. @@ -878,7 +878,7 @@ is called to get the exception information. The second optional keyword argument is *stack_info*, which defaults to - False. If specified as True, stack information is added to the logging + ``False``. If true, stack information is added to the logging message, including the actual logging call. Note that this is not the same stack information as that displayed through specifying *exc_info*: The former is stack frames from the bottom of the stack up to the logging call diff --git a/Doc/library/lzma.rst b/Doc/library/lzma.rst --- a/Doc/library/lzma.rst +++ b/Doc/library/lzma.rst @@ -232,7 +232,7 @@ .. attribute:: eof - True if the end-of-stream marker has been reached. + ``True`` if the end-of-stream marker has been reached. .. attribute:: unused_data diff --git a/Doc/library/mmap.rst b/Doc/library/mmap.rst --- a/Doc/library/mmap.rst +++ b/Doc/library/mmap.rst @@ -162,7 +162,7 @@ .. attribute:: closed - True if the file is closed. 
+ ``True`` if the file is closed. .. versionadded:: 3.2 diff --git a/Doc/library/nntplib.rst b/Doc/library/nntplib.rst --- a/Doc/library/nntplib.rst +++ b/Doc/library/nntplib.rst @@ -82,7 +82,7 @@ .. versionchanged:: 3.2 - *usenetrc* is now False by default. + *usenetrc* is now ``False`` by default. .. versionchanged:: 3.3 Support for the :keyword:`with` statement was added. @@ -216,7 +216,7 @@ .. method:: NNTP.login(user=None, password=None, usenetrc=True) Send ``AUTHINFO`` commands with the user name and password. If *user* - and *password* are None and *usenetrc* is True, credentials from + and *password* are None and *usenetrc* is true, credentials from ``~/.netrc`` will be used if possible. Unless intentionally delayed, login is normally performed during the diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -137,7 +137,7 @@ the number of seconds since the epoch (see the :mod:`time` module). Raise :exc:`OSError` if the file does not exist or is inaccessible. - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -147,7 +147,7 @@ giving the number of seconds since the epoch (see the :mod:`time` module). Raise :exc:`OSError` if the file does not exist or is inaccessible. - If :func:`os.stat_float_times` returns True, the result is a floating point + If :func:`os.stat_float_times` returns ``True``, the result is a floating point number. @@ -340,5 +340,5 @@ .. data:: supports_unicode_filenames - True if arbitrary Unicode strings can be used as file names (within limitations + ``True`` if arbitrary Unicode strings can be used as file names (within limitations imposed by the file system). diff --git a/Doc/library/os.rst b/Doc/library/os.rst --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -551,7 +551,7 @@ .. data:: supports_bytes_environ - True if the native OS type of the environment is bytes (eg. False on + ``True`` if the native OS type of the environment is bytes (eg. ``False`` on Windows). .. versionadded:: 3.2 diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -161,7 +161,7 @@ :class:`io.BytesIO` instance, or any other custom object that meets this interface. - If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -178,7 +178,7 @@ supported. The higher the protocol used, the more recent the version of Python needed to read the pickle produced. - If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -200,7 +200,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. 
The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. @@ -216,7 +216,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. @@ -266,7 +266,7 @@ argument. It can thus be an on-disk file opened for binary writing, a :class:`io.BytesIO` instance, or any other custom object that meets this interface. - If *fix_imports* is True and *protocol* is less than 3, pickle will try to + If *fix_imports* is true and *protocol* is less than 3, pickle will try to map the new Python 3.x names to the old module names used in Python 2.x, so that the pickle data stream is readable with Python 2.x. @@ -336,7 +336,7 @@ Optional keyword arguments are *fix_imports*, *encoding* and *errors*, which are used to control compatibility support for pickle stream generated - by Python 2.x. If *fix_imports* is True, pickle will try to map the old + by Python 2.x. If *fix_imports* is true, pickle will try to map the old Python 2.x names to the new names used in Python 3.x. The *encoding* and *errors* tell pickle how to decode 8-bit string instances pickled by Python 2.x; these default to 'ASCII' and 'strict', respectively. diff --git a/Doc/library/plistlib.rst b/Doc/library/plistlib.rst --- a/Doc/library/plistlib.rst +++ b/Doc/library/plistlib.rst @@ -54,7 +54,7 @@ * :data:`FMT_BINARY`: Binary plist format - If *use_builtin_types* is True (the default) binary data will be returned + If *use_builtin_types* is true (the default) binary data will be returned as instances of :class:`bytes`, otherwise it is returned as instances of :class:`Data`. diff --git a/Doc/library/sched.rst b/Doc/library/sched.rst --- a/Doc/library/sched.rst +++ b/Doc/library/sched.rst @@ -113,7 +113,7 @@ function passed to the constructor) for the next event, then execute it and so on until there are no more scheduled events. - If *blocking* is False executes the scheduled events due to expire soonest + If *blocking* is false executes the scheduled events due to expire soonest (if any) and then return the deadline of the next scheduled call in the scheduler (if any). diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -241,10 +241,10 @@ .. function:: enable_callback_tracebacks(flag) By default you will not get any tracebacks in user-defined functions, - aggregates, converters, authorizer callbacks etc. If you want to debug them, you - can call this function with *flag* as True. Afterwards, you will get tracebacks - from callbacks on ``sys.stderr``. Use :const:`False` to disable the feature - again. + aggregates, converters, authorizer callbacks etc. If you want to debug them, + you can call this function with *flag* set to ``True``. Afterwards, you will + get tracebacks from callbacks on ``sys.stderr``. Use :const:`False` to + disable the feature again. .. _sqlite3-connection-objects: diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -287,7 +287,7 @@ .. 
function:: RAND_pseudo_bytes(num) Returns (bytes, is_cryptographic): bytes are *num* pseudo-random bytes, - is_cryptographic is True if the bytes generated are cryptographically + is_cryptographic is ``True`` if the bytes generated are cryptographically strong. Raises an :class:`SSLError` if the operation is not supported by the current RAND method. @@ -300,8 +300,8 @@ .. function:: RAND_status() - Returns True if the SSL pseudo-random number generator has been seeded with - 'enough' randomness, and False otherwise. You can use :func:`ssl.RAND_egd` + Returns ``True`` if the SSL pseudo-random number generator has been seeded with + 'enough' randomness, and ``False`` otherwise. You can use :func:`ssl.RAND_egd` and :func:`ssl.RAND_add` to increase the randomness of the pseudo-random number generator. @@ -1173,7 +1173,7 @@ .. method:: SSLContext.get_ca_certs(binary_form=False) Returns a list of dicts with information of loaded CA certs. If the - optional argument is True, returns a DER-encoded copy of the CA + optional argument is true, returns a DER-encoded copy of the CA certificate. .. note:: diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -2851,7 +2851,7 @@ .. method:: isdisjoint(other) - Return True if the set has no elements in common with *other*. Sets are + Return ``True`` if the set has no elements in common with *other*. Sets are disjoint if and only if their intersection is the empty set. .. method:: issubset(other) diff --git a/Doc/library/struct.rst b/Doc/library/struct.rst --- a/Doc/library/struct.rst +++ b/Doc/library/struct.rst @@ -295,7 +295,7 @@ For the ``'?'`` format character, the return value is either :const:`True` or :const:`False`. When packing, the truth value of the argument object is used. Either 0 or 1 in the native or standard bool representation will be packed, and -any non-zero value will be True when unpacking. +any non-zero value will be ``True`` when unpacking. diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -522,7 +522,7 @@ *executable* (or for the first item in *args*) relative to *cwd* if the executable path is a relative path. - If *restore_signals* is True (the default) all signals that Python has set to + If *restore_signals* is true (the default) all signals that Python has set to SIG_IGN are restored to SIG_DFL in the child process before the exec. Currently this includes the SIGPIPE, SIGXFZ and SIGXFSZ signals. (Unix only) @@ -530,7 +530,7 @@ .. versionchanged:: 3.2 *restore_signals* was added. - If *start_new_session* is True the setsid() system call will be made in the + If *start_new_session* is true the setsid() system call will be made in the child process prior to the execution of the subprocess. (Unix only) .. versionchanged:: 3.2 diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -1067,7 +1067,7 @@ :func:`open` function. Their parameters are chosen as follows: * The character encoding is platform-dependent. Under Windows, if the stream - is interactive (that is, if its :meth:`isatty` method returns True), the + is interactive (that is, if its :meth:`isatty` method returns ``True``), the console codepage is used, otherwise the ANSI code page. Under other platforms, the locale encoding is used (see :meth:`locale.getpreferredencoding`). 
diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -359,7 +359,7 @@ full name. Its file information is extracted as accurately as possible. *member* may be a filename or a :class:`TarInfo` object. You can specify a different directory using *path*. File attributes (owner, mtime, mode) are set unless - *set_attrs* is False. + *set_attrs* is false. .. note:: diff --git a/Doc/library/tkinter.ttk.rst b/Doc/library/tkinter.ttk.rst --- a/Doc/library/tkinter.ttk.rst +++ b/Doc/library/tkinter.ttk.rst @@ -272,8 +272,8 @@ .. method:: instate(statespec, callback=None, *args, **kw) - Test the widget's state. If a callback is not specified, returns True - if the widget state matches *statespec* and False otherwise. If callback + Test the widget's state. If a callback is not specified, returns ``True`` + if the widget state matches *statespec* and ``False`` otherwise. If callback is specified then it is called with args if widget state matches *statespec*. @@ -938,7 +938,7 @@ .. method:: exists(item) - Returns True if the specified *item* is present in the tree. + Returns ``True`` if the specified *item* is present in the tree. .. method:: focus(item=None) @@ -1084,7 +1084,7 @@ Ensure that *item* is visible. - Sets all of *item*'s ancestors open option to True, and scrolls the + Sets all of *item*'s ancestors open option to ``True``, and scrolls the widget if necessary so that *item* is within the visible portion of the tree. diff --git a/Doc/library/turtle.rst b/Doc/library/turtle.rst --- a/Doc/library/turtle.rst +++ b/Doc/library/turtle.rst @@ -1055,8 +1055,8 @@ Write text - the string representation of *arg* - at the current turtle position according to *align* ("left", "center" or right") and with the given - font. If *move* is True, the pen is moved to the bottom-right corner of the - text. By default, *move* is False. + font. If *move* is true, the pen is moved to the bottom-right corner of the + text. By default, *move* is ``False``. >>> turtle.write("Home = ", True, align="center") >>> turtle.write((0,0), True) @@ -1092,7 +1092,7 @@ .. function:: isvisible() - Return True if the Turtle is shown, False if it's hidden. + Return ``True`` if the Turtle is shown, ``False`` if it's hidden. >>> turtle.hideturtle() >>> turtle.isvisible() diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -298,11 +298,11 @@ assert the mock has been called with the specified calls. The `mock_calls` list is checked for the calls. - If `any_order` is False (the default) then the calls must be + If `any_order` is false (the default) then the calls must be sequential. There can be extra calls before or after the specified calls. - If `any_order` is True then the calls can be in any order, but + If `any_order` is true then the calls can be in any order, but they must all appear in :attr:`mock_calls`. >>> mock = Mock(return_value=None) @@ -1200,7 +1200,7 @@ `values` can be a dictionary of values to set in the dictionary. `values` can also be an iterable of `(key, value)` pairs. - If `clear` is True then the dictionary will be cleared before the new + If `clear` is true then the dictionary will be cleared before the new values are set. 
`patch.dict` can also be called with arbitrary keyword arguments to set diff --git a/Doc/library/urllib.request.rst b/Doc/library/urllib.request.rst --- a/Doc/library/urllib.request.rst +++ b/Doc/library/urllib.request.rst @@ -59,7 +59,7 @@ some non-Windows platforms. .. warning:: - If neither *cafile* nor *capath* is specified, and *cadefault* is False, + If neither *cafile* nor *capath* is specified, and *cadefault* is ``False``, an HTTPS request will not do any verification of the server's certificate. @@ -211,7 +211,7 @@ containing the image. *unverifiable* should indicate whether the request is unverifiable, - as defined by RFC 2965. It defaults to False. An unverifiable + as defined by RFC 2965. It defaults to ``False``. An unverifiable request is one whose URL the user did not have the option to approve. For example, if the request is for an image in an HTML document, and the user had no option to approve the automatic diff --git a/Doc/library/venv.rst b/Doc/library/venv.rst --- a/Doc/library/venv.rst +++ b/Doc/library/venv.rst @@ -94,7 +94,7 @@ * ``system_site_packages`` -- a Boolean value indicating that the system Python site-packages should be available to the environment (defaults to ``False``). - * ``clear`` -- a Boolean value which, if True, will delete the contents of + * ``clear`` -- a Boolean value which, if true, will delete the contents of any existing target directory, before creating the environment. * ``symlinks`` -- a Boolean value indicating whether to attempt to symlink the @@ -102,11 +102,11 @@ e.g. ``pythonw.exe``), rather than copying. Defaults to ``True`` on Linux and Unix systems, but ``False`` on Windows. - * ``upgrade`` -- a Boolean value which, if True, will upgrade an existing + * ``upgrade`` -- a Boolean value which, if true, will upgrade an existing environment with the running Python - for use when that Python has been upgraded in-place (defaults to ``False``). - * ``with_pip`` -- a Boolean value which, if True, ensures pip is + * ``with_pip`` -- a Boolean value which, if true, ensures pip is installed in the virtual environment .. versionchanged:: 3.4 diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -411,7 +411,7 @@ name as a file path, and if any subdirectories are package directories, all of these are added recursively. *basename* is intended for internal use only. When *filterfunc(pathname)* is given, it will be called for every - invocation. When it returns a False value, that path and its subpaths will + invocation. When it returns a false value, that path and its subpaths will be ignored. The :meth:`writepy` method makes archives with file names like this:: diff --git a/Doc/library/zipimport.rst b/Doc/library/zipimport.rst --- a/Doc/library/zipimport.rst +++ b/Doc/library/zipimport.rst @@ -111,7 +111,7 @@ .. method:: is_package(fullname) - Return True if the module specified by *fullname* is a package. Raise + Return ``True`` if the module specified by *fullname* is a package. Raise :exc:`ZipImportError` if the module couldn't be found. diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -201,7 +201,7 @@ single: True These represent the truth values False and True. The two objects representing - the values False and True are the only Boolean objects. The Boolean type is a + the values ``False`` and ``True`` are the only Boolean objects. 
The Boolean type is a subtype of the integer type, and Boolean values behave like the values 0 and 1, respectively, in almost all contexts, the exception being that when converted to a string, the strings ``"False"`` or ``"True"`` are returned, respectively. diff --git a/Doc/whatsnew/3.3.rst b/Doc/whatsnew/3.3.rst --- a/Doc/whatsnew/3.3.rst +++ b/Doc/whatsnew/3.3.rst @@ -1125,7 +1125,7 @@ * If Python is compiled without threads, the C version automatically disables the expensive thread local context machinery. In this case, - the variable :data:`~decimal.HAVE_THREADS` is set to False. + the variable :data:`~decimal.HAVE_THREADS` is set to ``False``. API changes ~~~~~~~~~~~ diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -45,6 +45,12 @@ pythoncore sub-project. This should prevent build errors due a previous build's python(_d).exe still running. +Documentation +------------- + +- Issue #19795: Improved markup of True/False constants. + + What's New in Python 3.4.0 Beta 1? ================================== -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 17:00:21 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 29 Nov 2013 17:00:21 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319698=3A_Remove_e?= =?utf-8?q?xec=5Fmodule=28=29_from_the_built-in_and_extension?= Message-ID: <3dWL5x45T5z7Ljv@mail.python.org> http://hg.python.org/cpython/rev/b5bbd47a9bc4 changeset: 87636:b5bbd47a9bc4 user: Brett Cannon date: Fri Nov 29 11:00:11 2013 -0500 summary: Issue #19698: Remove exec_module() from the built-in and extension module loaders. Due to the fact that the call signatures for extension modules and built-in modules does not allow for the specifying of what module to initialize and that on Windows all extension modules are built-in modules, work to clean up built-in and extension module initialization will have to wait until Python 3.5. Because of this the semantics of exec_module() would be incorrect, so removing the methods for now is the best option; load_module() is still used as a fallback by importlib and so this won't affect semantics. 
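A rough sketch of the fallback path described above (an editor's illustration, not part of this changeset): with the import machinery of this era, a loader that only provides the legacy load_module() API remains usable, because importlib falls back to load_module() whenever exec_module() is absent. The finder, loader and module names below are invented for the example.

    import sys
    import types
    import importlib.abc

    class LegacyOnlyLoader(importlib.abc.Loader):
        # Deliberately defines only the legacy load_module() API.
        def load_module(self, fullname):
            # Reuse any module already in sys.modules, as load_module() must.
            module = sys.modules.setdefault(fullname, types.ModuleType(fullname))
            module.__loader__ = self
            module.__package__ = ''
            module.answer = 42
            return module

    class LegacyFinder(importlib.abc.MetaPathFinder):
        def find_module(self, fullname, path=None):
            return LegacyOnlyLoader() if fullname == 'legacy_demo' else None

    sys.meta_path.append(LegacyFinder())
    import legacy_demo
    print(legacy_demo.answer)  # 42, imported through the load_module() fallback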
files: Lib/importlib/_bootstrap.py | 63 +- Lib/test/test_importlib/builtin/test_loader.py | 72 - Lib/test/test_importlib/extension/test_loader.py | 65 - Misc/NEWS | 3 + Python/importlib.h | 8369 +++++---- 5 files changed, 4241 insertions(+), 4331 deletions(-) diff --git a/Lib/importlib/_bootstrap.py b/Lib/importlib/_bootstrap.py --- a/Lib/importlib/_bootstrap.py +++ b/Lib/importlib/_bootstrap.py @@ -133,6 +133,24 @@ _code_type = type(_wrap.__code__) + +class _ManageReload: + + """Manages the possible clean-up of sys.modules for load_module().""" + + def __init__(self, name): + self._name = name + + def __enter__(self): + self._is_reload = self._name in sys.modules + + def __exit__(self, *args): + if any(arg is not None for arg in args) and not self._is_reload: + try: + del sys.modules[self._name] + except KeyError: + pass + # Module-level locking ######################################################## # A dict mapping module names to weakrefs of _ModuleLock instances @@ -1242,23 +1260,15 @@ spec = cls.find_spec(fullname, path) return spec.loader if spec is not None else None - @staticmethod - def exec_module(module): - spec = module.__spec__ - name = spec.name - if not _imp.is_builtin(name): - raise ImportError('{!r} is not a built-in module'.format(name), - name=name) - _call_with_frames_removed(_imp.init_builtin, name) - # Have to manually initialize attributes since init_builtin() is not - # going to do it for us. - # XXX: Create _imp.exec_builtin(module) - _SpecMethods(spec).init_module_attrs(sys.modules[name]) - @classmethod + @_requires_builtin def load_module(cls, fullname): """Load a built-in module.""" - return _load_module_shim(cls, fullname) + with _ManageReload(fullname): + module = _call_with_frames_removed(_imp.init_builtin, fullname) + module.__loader__ = cls + module.__package__ = '' + return module @classmethod @_requires_builtin @@ -1639,22 +1649,21 @@ self.name = name self.path = path - def exec_module(self, module): - # XXX create _imp.exec_dynamic() - spec = module.__spec__ - fullname = spec.name - module = _call_with_frames_removed(_imp.load_dynamic, - fullname, self.path) - _verbose_message('extension module loaded from {!r}', self.path) - if self.is_package(fullname) and not hasattr(module, '__path__'): - module.__path__ = [_path_split(self.path)[0]] - _SpecMethods(spec).init_module_attrs(module) - - # XXX deprecate @_check_name def load_module(self, fullname): """Load an extension module.""" - return _load_module_shim(self, fullname) + with _ManageReload(fullname): + module = _call_with_frames_removed(_imp.load_dynamic, + fullname, self.path) + _verbose_message('extension module loaded from {!r}', self.path) + is_package = self.is_package(fullname) + if is_package and not hasattr(module, '__path__'): + module.__path__ = [_path_split(self.path)[0]] + module.__loader__ = self + module.__package__ = module.__name__ + if not is_package: + module.__package__ = module.__package__.rpartition('.')[0] + return module def is_package(self, fullname): """Return True if the extension module is a package.""" diff --git a/Lib/test/test_importlib/builtin/test_loader.py b/Lib/test/test_importlib/builtin/test_loader.py --- a/Lib/test/test_importlib/builtin/test_loader.py +++ b/Lib/test/test_importlib/builtin/test_loader.py @@ -9,78 +9,6 @@ import unittest -class ExecModTests(abc.LoaderTests): - - """Test exec_module() for built-in modules.""" - - @classmethod - def setUpClass(cls): - cls.verification = {'__name__': 'errno', '__package__': '', - '__loader__': 
cls.machinery.BuiltinImporter} - - def verify(self, module): - """Verify that the module matches against what it should have.""" - self.assertIsInstance(module, types.ModuleType) - for attr, value in self.verification.items(): - self.assertEqual(getattr(module, attr), value) - self.assertIn(module.__name__, sys.modules) - self.assertTrue(hasattr(module, '__spec__')) - self.assertEqual(module.__spec__.origin, 'built-in') - - def load_spec(self, name): - spec = self.machinery.ModuleSpec(name, self.machinery.BuiltinImporter, - origin='built-in') - module = types.ModuleType(name) - module.__spec__ = spec - self.machinery.BuiltinImporter.exec_module(module) - # Strictly not what exec_module() is supposed to do, but since - # _imp.init_builtin() does this we can't get around it. - return sys.modules[name] - - def test_module(self): - # Common case. - with util.uncache(builtin_util.NAME): - module = self.load_spec(builtin_util.NAME) - self.verify(module) - self.assertIn('built-in', str(module)) - - # Built-in modules cannot be a package. - test_package = None - - # Built-in modules cannot be a package. - test_lacking_parent = None - - # Not way to force an import failure. - test_state_after_failure = None - - def test_unloadable(self): - name = 'dssdsdfff' - assert name not in sys.builtin_module_names - with self.assertRaises(ImportError) as cm: - self.load_spec(name) - self.assertEqual(cm.exception.name, name) - - def test_already_imported(self): - # Using the name of a module already imported but not a built-in should - # still fail. - module_name = 'builtin_reload_test' - assert module_name not in sys.builtin_module_names - with util.uncache(module_name): - module = types.ModuleType(module_name) - spec = self.machinery.ModuleSpec(module_name, - self.machinery.BuiltinImporter, - origin='built-in') - module.__spec__ = spec - sys.modules[module_name] = module - with self.assertRaises(ImportError) as cm: - self.machinery.BuiltinImporter.exec_module(module) - self.assertEqual(cm.exception.name, module_name) - - -Frozen_ExecModTests, Source_ExecModTests = util.test_both(ExecModTests, - machinery=[frozen_machinery, source_machinery]) - - class LoaderTests(abc.LoaderTests): """Test load_module() for built-in modules.""" diff --git a/Lib/test/test_importlib/extension/test_loader.py b/Lib/test/test_importlib/extension/test_loader.py --- a/Lib/test/test_importlib/extension/test_loader.py +++ b/Lib/test/test_importlib/extension/test_loader.py @@ -10,71 +10,6 @@ import unittest -class ExecModuleTests(abc.LoaderTests): - - """Test load_module() for extension modules.""" - - def setUp(self): - self.loader = self.machinery.ExtensionFileLoader(ext_util.NAME, - ext_util.FILEPATH) - - def exec_module(self, fullname): - module = types.ModuleType(fullname) - module.__spec__ = self.machinery.ModuleSpec(fullname, self.loader) - self.loader.exec_module(module) - return sys.modules[fullname] - - def test_exec_module_API(self): - with self.assertRaises(ImportError): - self.exec_module('XXX') - - - def test_module(self): - with util.uncache(ext_util.NAME): - module = self.exec_module(ext_util.NAME) - for attr, value in [('__name__', ext_util.NAME), - ('__file__', ext_util.FILEPATH), - ('__package__', '')]: - given = getattr(module, attr) - self.assertEqual(given, value, - '{}: {!r} != {!r}'.format(attr, given, value)) - self.assertIn(ext_util.NAME, sys.modules) - self.assertIsInstance(module.__loader__, - self.machinery.ExtensionFileLoader) - - # No extension module as __init__ available for testing. 
- test_package = None - - # No extension module in a package available for testing. - test_lacking_parent = None - - def test_module_reuse(self): - with util.uncache(ext_util.NAME): - module1 = self.exec_module(ext_util.NAME) - module2 = self.exec_module(ext_util.NAME) - self.assertIs(module1, module2) - - def test_state_after_failure(self): - # No easy way to trigger a failure after a successful import. - pass - - def test_unloadable(self): - name = 'asdfjkl;' - with self.assertRaises(ImportError) as cm: - self.exec_module(name) - self.assertEqual(cm.exception.name, name) - - def test_is_package(self): - self.assertFalse(self.loader.is_package(ext_util.NAME)) - for suffix in self.machinery.EXTENSION_SUFFIXES: - path = os.path.join('some', 'path', 'pkg', '__init__' + suffix) - loader = self.machinery.ExtensionFileLoader('pkg', path) - self.assertTrue(loader.is_package('pkg')) - -Frozen_ExecModuleTests, Source_ExecModuleTests = util.test_both( - ExecModuleTests, machinery=machinery) - - class LoaderTests(abc.LoaderTests): """Test load_module() for extension modules.""" diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,9 @@ Library ------- +- Issue #19698: Removed exec_module() methods from + importlib.machinery.BuiltinImporter and ExtensionFileLoader. + - ssl.create_default_context() sets OP_NO_COMPRESSION to prevent CRIME. - Issue #19802: Add socket.SO_PRIORITY. diff --git a/Python/importlib.h b/Python/importlib.h --- a/Python/importlib.h +++ b/Python/importlib.h [stripped] -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 18:29:06 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 29 Nov 2013 18:29:06 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_asyncio=3A_Add_=27shield?= =?utf-8?b?JyB0byBfX2FsbF9fLg==?= Message-ID: <3dWN4L1R8WzNRr@mail.python.org> http://hg.python.org/cpython/rev/dfe327390cc2 changeset: 87637:dfe327390cc2 user: Guido van Rossum date: Fri Nov 29 09:29:00 2013 -0800 summary: asyncio: Add 'shield' to __all__. files: Lib/asyncio/tasks.py | 2 +- 1 files changed, 1 insertions(+), 1 deletions(-) diff --git a/Lib/asyncio/tasks.py b/Lib/asyncio/tasks.py --- a/Lib/asyncio/tasks.py +++ b/Lib/asyncio/tasks.py @@ -3,7 +3,7 @@ __all__ = ['coroutine', 'Task', 'FIRST_COMPLETED', 'FIRST_EXCEPTION', 'ALL_COMPLETED', 'wait', 'wait_for', 'as_completed', 'sleep', 'async', - 'gather', + 'gather', 'shield', ] import collections -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 19:27:00 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 29 Nov 2013 19:27:00 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_brief_explanation_and_web?= =?utf-8?q?_pointers_to_README=2Etxt=2E_Fixes_issue_19822=2E?= Message-ID: <3dWPM81dGpz7Lk1@mail.python.org> http://hg.python.org/peps/rev/34cb64cdbf7b changeset: 5325:34cb64cdbf7b user: Guido van Rossum date: Fri Nov 29 10:26:55 2013 -0800 summary: Add brief explanation and web pointers to README.txt. Fixes issue 19822. files: README.txt | 17 ++++++++++++++--- 1 files changed, 14 insertions(+), 3 deletions(-) diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -1,12 +1,23 @@ +Python Enhancement Proposals +============================ + +The PEPs in this repo are published automatically on the web at +http://www.python.org/dev/peps/. 
To learn more about the purpose of +PEPs and how to go about writing a PEP, please start reading at PEP 1 +(pep-0001.txt in this repo). Note that PEP 0, the index PEP, is now +automatically generated, and not committed to the repo. + + reStructuredText for PEPs ========================= Original PEP source may be written using two standard formats, a mildly idiomatic plaintext format and the reStructuredText format -(also, technically plaintext). These two formats are described in PEP -9 and PEP 12 respectively. The pep2html.py processing and +(also, technically plaintext). These two formats are described in +PEP 9 and PEP 12 respectively. The pep2html.py processing and installation script knows how to produce the HTML for either PEP format. For processing reStructuredText format PEPs, you need the docutils -package, which is available from PyPI. +package, which is available from PyPI (http://pypi.python.org). +If you have pip, "pip install docutils" should install it. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 29 19:34:42 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 29 Nov 2013 19:34:42 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?peps=3A_Add_instructions_for_HTM_gene?= =?utf-8?q?ration=2E?= Message-ID: <3dWPX22fFCz7LjQ@mail.python.org> http://hg.python.org/peps/rev/ee8a6ec1de74 changeset: 5326:ee8a6ec1de74 user: Guido van Rossum date: Fri Nov 29 10:34:39 2013 -0800 summary: Add instructions for HTM generation. files: README.txt | 11 +++++++++++ 1 files changed, 11 insertions(+), 0 deletions(-) diff --git a/README.txt b/README.txt --- a/README.txt +++ b/README.txt @@ -21,3 +21,14 @@ For processing reStructuredText format PEPs, you need the docutils package, which is available from PyPI (http://pypi.python.org). If you have pip, "pip install docutils" should install it. + + +Generating HTML +=============== + +Do not commit changes with bad formatting. To check the formatting of +a PEP, use the Makefile. In particular, to generate HTML for PEP 999, +your source code should be in pep-0999.txt and the HTML will be +generated to pep-0999.html by the command "make pep-0999.html". The +default Make target generates HTML for all PEPs. If you don't have +Make, use the pep2html.py script. -- Repository URL: http://hg.python.org/peps From python-checkins at python.org Fri Nov 29 21:06:33 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 29 Nov 2013 21:06:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?devinabox=3A_Make_running_build=5Fcpy?= =?utf-8?q?thon=2Epy_even_easier_by_using_more_heuristics_to_guess?= Message-ID: <3dWRZ14lhpz7LjR@mail.python.org> http://hg.python.org/devinabox/rev/32c1302a9242 changeset: 59:32c1302a9242 user: Brett Cannon date: Fri Nov 29 15:06:28 2013 -0500 summary: Make running build_cpython.py even easier by using more heuristics to guess where the CPython checkout is. files: build_cpython.py | 8 +++++++- 1 files changed, 7 insertions(+), 1 deletions(-) diff --git a/build_cpython.py b/build_cpython.py --- a/build_cpython.py +++ b/build_cpython.py @@ -58,7 +58,13 @@ if arg_count > 1: raise ValueError( '0 or 1 arguments expected, not {}'.format(arg_count)) - executable = main(sys.argv[1] if arg_count else 'cpython') + elif arg_count == 1: + directory = sys.argv[1] + elif os.path.exists('cpython'): + directory = 'cpython' + else: + directory = '.' 
+ executable = main(directory) if not executable: print('CPython executable NOT found') else: -- Repository URL: http://hg.python.org/devinabox From python-checkins at python.org Fri Nov 29 21:37:47 2013 From: python-checkins at python.org (charles-francois.natali) Date: Fri, 29 Nov 2013 21:37:47 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Set_the_FDs_non-blocking_i?= =?utf-8?q?n_the_selectors_example=2E?= Message-ID: <3dWSG30Nwwz7LjR@mail.python.org> http://hg.python.org/cpython/rev/07e4b1a4c36d changeset: 87638:07e4b1a4c36d user: Charles-François Natali date: Fri Nov 29 18:52:51 2013 +0100 summary: Set the FDs non-blocking in the selectors example. files: Doc/library/selectors.rst | 21 ++++++++++++--------- 1 files changed, 12 insertions(+), 9 deletions(-) diff --git a/Doc/library/selectors.rst b/Doc/library/selectors.rst --- a/Doc/library/selectors.rst +++ b/Doc/library/selectors.rst @@ -214,26 +214,29 @@ >>> import selectors >>> import socket - >>> + >>> >>> s = selectors.DefaultSelector() >>> r, w = socket.socketpair() - >>> + >>> + >>> r.setblocking(False) + >>> w.setblocking(False) + >>> >>> s.register(r, selectors.EVENT_READ) - SelectorKey(fileobj=, fd=4, events=1, data=None) + SelectorKey(fileobj=, fd=4, events=1, data=None) >>> s.register(w, selectors.EVENT_WRITE) - SelectorKey(fileobj=, fd=5, events=2, data=None) + SelectorKey(fileobj=, fd=5, events=2, data=None) - >>> + >>> >>> print(s.select()) - [(SelectorKey(fileobj=, fd=5, events=2, data=None), 2)] + [(SelectorKey(fileobj=, fd=5, events=2, data=None), 2)] - >>> + >>> >>> for key, events in s.select(): ... if events & selectors.EVENT_WRITE: ... key.fileobj.send(b'spam') - ... + ... 4 >>> for key, events in s.select(): ... if events & selectors.EVENT_READ: ... print(key.fileobj.recv(1024)) - ... + ... b'spam' >>> s.close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 21:37:48 2013 From: python-checkins at python.org (charles-francois.natali) Date: Fri, 29 Nov 2013 21:37:48 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Remove_trailing_blanks=2E?= Message-ID: <3dWSG423Htz7LjR@mail.python.org> http://hg.python.org/cpython/rev/a1a936a3b2f6 changeset: 87639:a1a936a3b2f6 user: Charles-François Natali date: Fri Nov 29 18:57:47 2013 +0100 summary: Remove trailing blanks. files: Doc/library/selectors.rst | 14 +++++++------- 1 files changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/library/selectors.rst b/Doc/library/selectors.rst --- a/Doc/library/selectors.rst +++ b/Doc/library/selectors.rst @@ -214,29 +214,29 @@ >>> import selectors >>> import socket - >>> + >>> >>> s = selectors.DefaultSelector() >>> r, w = socket.socketpair() - >>> + >>> >>> r.setblocking(False) >>> w.setblocking(False) - >>> + >>> >>> s.register(r, selectors.EVENT_READ) SelectorKey(fileobj=, fd=4, events=1, data=None) >>> s.register(w, selectors.EVENT_WRITE) SelectorKey(fileobj=, fd=5, events=2, data=None) - >>> + >>> >>> print(s.select()) [(SelectorKey(fileobj=, fd=5, events=2, data=None), 2)] - >>> + >>> >>> for key, events in s.select(): ... if events & selectors.EVENT_WRITE: ... key.fileobj.send(b'spam') - ... + ... 4 >>> for key, events in s.select(): ... if events & selectors.EVENT_READ: ... print(key.fileobj.recv(1024)) - ... + ... 
b'spam' >>> s.close() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 22:17:13 2013 From: python-checkins at python.org (brett.cannon) Date: Fri, 29 Nov 2013 22:17:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319712=3A_Port_tes?= =?utf-8?q?t=2Etest=5Fimportlib=2Eimport=5F_tests_to_use_PEP_451?= Message-ID: <3dWT7Y56R0z7LjM@mail.python.org> http://hg.python.org/cpython/rev/dad74ff75a6b changeset: 87640:dad74ff75a6b user: Brett Cannon date: Fri Nov 29 16:17:05 2013 -0500 summary: Issue #19712: Port test.test_importlib.import_ tests to use PEP 451 that don't require changing test.test_importlib.util.mock_modules(). files: Lib/test/test_importlib/import_/test___loader__.py | 22 +++ Lib/test/test_importlib/import_/test_api.py | 60 +++++++-- Lib/test/test_importlib/import_/test_path.py | 2 +- 3 files changed, 68 insertions(+), 16 deletions(-) diff --git a/Lib/test/test_importlib/import_/test___loader__.py b/Lib/test/test_importlib/import_/test___loader__.py --- a/Lib/test/test_importlib/import_/test___loader__.py +++ b/Lib/test/test_importlib/import_/test___loader__.py @@ -1,3 +1,4 @@ +from importlib import machinery import sys import types import unittest @@ -6,6 +7,27 @@ from . import util as import_util +class SpecLoaderMock: + + def find_spec(self, fullname, path=None, target=None): + return machinery.ModuleSpec(fullname, self) + + def exec_module(self, module): + pass + + +class SpecLoaderAttributeTests: + + def test___loader__(self): + loader = SpecLoaderMock() + with util.uncache('blah'), util.import_state(meta_path=[loader]): + module = self.__import__('blah') + self.assertEqual(loader, module.__loader__) + +Frozen_SpecTests, Source_SpecTests = util.test_both( + SpecLoaderAttributeTests, __import__=import_util.__import__) + + class LoaderMock: def find_module(self, fullname, path=None): diff --git a/Lib/test/test_importlib/import_/test_api.py b/Lib/test/test_importlib/import_/test_api.py --- a/Lib/test/test_importlib/import_/test_api.py +++ b/Lib/test/test_importlib/import_/test_api.py @@ -1,19 +1,37 @@ from .. import util from . import util as import_util + +from importlib import machinery import sys import types import unittest +PKG_NAME = 'fine' +SUBMOD_NAME = 'fine.bogus' + + +class BadSpecFinderLoader: + @classmethod + def find_spec(cls, fullname, path=None, target=None): + if fullname == SUBMOD_NAME: + spec = machinery.ModuleSpec(fullname, cls) + return spec + + @staticmethod + def exec_module(module): + if module.__name__ == SUBMOD_NAME: + raise ImportError('I cannot be loaded!') + class BadLoaderFinder: - bad = 'fine.bogus' @classmethod def find_module(cls, fullname, path): - if fullname == cls.bad: + if fullname == SUBMOD_NAME: return cls + @classmethod def load_module(cls, fullname): - if fullname == cls.bad: + if fullname == SUBMOD_NAME: raise ImportError('I cannot be loaded!') @@ -37,27 +55,39 @@ def test_nonexistent_fromlist_entry(self): # If something in fromlist doesn't exist, that's okay. 
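The last two changesets both concern persistent IDs that are false values. As a small editor's sketch of what they are about (not taken from either commit; the class names and the stand-in object are invented): a persistent ID that is falsy but not None, such as the empty string, must still be routed through persistent_id()/persistent_load() rather than being pickled normally.

    import io
    import pickle

    EXTERNAL_OBJECT = "pretend this lives outside the pickle stream"

    class PidPickler(pickle.Pickler):
        def persistent_id(self, obj):
            # An empty string is falsy, yet a perfectly valid persistent ID.
            return "" if obj is EXTERNAL_OBJECT else None

    class PidUnpickler(pickle.Unpickler):
        def persistent_load(self, pid):
            assert pid == ""
            return EXTERNAL_OBJECT

    buf = io.BytesIO()
    PidPickler(buf).dump(["payload", EXTERNAL_OBJECT])
    buf.seek(0)
    print(PidUnpickler(buf).load())
    # ['payload', 'pretend this lives outside the pickle stream']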
# issue15715 - mod = types.ModuleType('fine') + mod = types.ModuleType(PKG_NAME) mod.__path__ = ['XXX'] - with util.import_state(meta_path=[BadLoaderFinder]): - with util.uncache('fine'): - sys.modules['fine'] = mod - self.__import__('fine', fromlist=['not here']) + with util.import_state(meta_path=[self.bad_finder_loader]): + with util.uncache(PKG_NAME): + sys.modules[PKG_NAME] = mod + self.__import__(PKG_NAME, fromlist=['not here']) def test_fromlist_load_error_propagates(self): # If something in fromlist triggers an exception not related to not # existing, let that exception propagate. # issue15316 - mod = types.ModuleType('fine') + mod = types.ModuleType(PKG_NAME) mod.__path__ = ['XXX'] - with util.import_state(meta_path=[BadLoaderFinder]): - with util.uncache('fine'): - sys.modules['fine'] = mod + with util.import_state(meta_path=[self.bad_finder_loader]): + with util.uncache(PKG_NAME): + sys.modules[PKG_NAME] = mod with self.assertRaises(ImportError): - self.__import__('fine', fromlist=['bogus']) + self.__import__(PKG_NAME, + fromlist=[SUBMOD_NAME.rpartition('.')[-1]]) -Frozen_APITests, Source_APITests = util.test_both( - APITest, __import__=import_util.__import__) + +class OldAPITests(APITest): + bad_finder_loader = BadLoaderFinder + +Frozen_OldAPITests, Source_OldAPITests = util.test_both( + OldAPITests, __import__=import_util.__import__) + + +class SpecAPITests(APITest): + bad_finder_loader = BadSpecFinderLoader + +Frozen_SpecAPITests, Source_SpecAPITests = util.test_both( + SpecAPITests, __import__=import_util.__import__) if __name__ == '__main__': diff --git a/Lib/test/test_importlib/import_/test_path.py b/Lib/test/test_importlib/import_/test_path.py --- a/Lib/test/test_importlib/import_/test_path.py +++ b/Lib/test/test_importlib/import_/test_path.py @@ -17,7 +17,7 @@ """Tests for PathFinder.""" def test_failure(self): - # Test None returned upon not finding a suitable finder. + # Test None returned upon not finding a suitable loader. module = '' with util.import_state(): self.assertIsNone(self.machinery.PathFinder.find_module(module)) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 22:43:23 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Fri, 29 Nov 2013 22:43:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=283=2E3=29=3A_Revert_unrelat?= =?utf-8?q?ed_changes_introduced_by_changeset_b2066bc8cab9__=28issue_=2319?= =?utf-8?b?Nzk1KS4=?= Message-ID: <3dWTjl3yK6z7LjY@mail.python.org> http://hg.python.org/cpython/rev/9d0dc3d7bf01 changeset: 87641:9d0dc3d7bf01 branch: 3.3 parent: 87634:b2066bc8cab9 user: Serhiy Storchaka date: Fri Nov 29 23:40:35 2013 +0200 summary: Revert unrelated changes introduced by changeset b2066bc8cab9 (issue #19795). files: Doc/library/importlib.rst | 10 ---------- 1 files changed, 0 insertions(+), 10 deletions(-) diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -918,18 +918,8 @@ Reliance on this decorator is discouraged when it is possible to set :attr:`__package__` before importing. By setting it beforehand the code for the module is executed with the -<<<<<<< attribute set and thus can be used by global level code during initialization. -======= - The ``cpython-32`` string comes from the current magic tag (see - :func:`get_tag`; if :attr:`sys.implementation.cache_tag` is not defined then - :exc:`NotImplementedError` will be raised). 
The returned path will end in - ``.pyc`` when ``__debug__`` is ``True`` or ``.pyo`` for an optimized Python - (i.e. ``__debug__`` is ``False``). By passing in ``True`` or ``False`` for - *debug_override* you can override the system's value for ``__debug__`` for - extension selection. ->>>>>>> .. note:: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 22:43:24 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Fri, 29 Nov 2013 22:43:24 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dWTjm5lczz7Ljl@mail.python.org> http://hg.python.org/cpython/rev/312137dd7ece changeset: 87642:312137dd7ece parent: 87640:dad74ff75a6b parent: 87641:9d0dc3d7bf01 user: Serhiy Storchaka date: Fri Nov 29 23:42:55 2013 +0200 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Fri Nov 29 23:53:11 2013 From: python-checkins at python.org (guido.van.rossum) Date: Fri, 29 Nov 2013 23:53:11 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_More_realistic_example_for?= =?utf-8?q?_selectors=2Epy=2E?= Message-ID: <3dWWGH323Bz7LjM@mail.python.org> http://hg.python.org/cpython/rev/2a679870d7d2 changeset: 87643:2a679870d7d2 user: Guido van Rossum date: Fri Nov 29 14:51:18 2013 -0800 summary: More realistic example for selectors.py. files: Doc/library/selectors.rst | 65 ++++++++++++++------------ 1 files changed, 36 insertions(+), 29 deletions(-) diff --git a/Doc/library/selectors.rst b/Doc/library/selectors.rst --- a/Doc/library/selectors.rst +++ b/Doc/library/selectors.rst @@ -210,33 +210,40 @@ :func:`select.kqueue` object. -Examples of selector usage:: +Examples +-------- - >>> import selectors - >>> import socket - >>> - >>> s = selectors.DefaultSelector() - >>> r, w = socket.socketpair() - >>> - >>> r.setblocking(False) - >>> w.setblocking(False) - >>> - >>> s.register(r, selectors.EVENT_READ) - SelectorKey(fileobj=, fd=4, events=1, data=None) - >>> s.register(w, selectors.EVENT_WRITE) - SelectorKey(fileobj=, fd=5, events=2, data=None) - >>> - >>> print(s.select()) - [(SelectorKey(fileobj=, fd=5, events=2, data=None), 2)] - >>> - >>> for key, events in s.select(): - ... if events & selectors.EVENT_WRITE: - ... key.fileobj.send(b'spam') - ... - 4 - >>> for key, events in s.select(): - ... if events & selectors.EVENT_READ: - ... print(key.fileobj.recv(1024)) - ... 
- b'spam' - >>> s.close() +Here is a simple echo server implementation:: + + import selectors + import socket + + sel = selectors.DefaultSelector() + + def accept(sock, mask): + conn, addr = sock.accept() # Should be ready + print('accepted', conn, 'from', addr) + conn.setblocking(False) + sel.register(conn, selectors.EVENT_READ, read) + + def read(conn, mask): + data = conn.recv(1000) # Should be ready + if data: + print('echoing', repr(data), 'to', conn) + conn.send(data) # Hope it won't block + else: + print('closing', conn) + sel.unregister(conn) + conn.close() + + sock = socket.socket() + sock.bind(('localhost', 1234)) + sock.listen(100) + sock.setblocking(False) + sel.register(sock, selectors.EVENT_READ, accept) + + while True: + events = sel.select() + for key, mask in events: + callback = key.data + callback(key.fileobj, mask) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 04:15:22 2013 From: python-checkins at python.org (zach.ware) Date: Sat, 30 Nov 2013 04:15:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5NTk1?= =?utf-8?q?=3A_Re-enable_a_long-disabled_test_in_test=5Fwinsound?= Message-ID: <3dWd4p4LCDz7Lkb@mail.python.org> http://hg.python.org/cpython/rev/30b3798782f1 changeset: 87644:30b3798782f1 branch: 2.7 parent: 87633:f51ca196d77a user: Zachary Ware date: Wed Nov 27 23:56:04 2013 -0600 summary: Issue #19595: Re-enable a long-disabled test in test_winsound files: Lib/test/test_winsound.py | 20 ++++++++------------ Misc/NEWS | 2 ++ 2 files changed, 10 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_winsound.py b/Lib/test/test_winsound.py --- a/Lib/test/test_winsound.py +++ b/Lib/test/test_winsound.py @@ -159,18 +159,14 @@ ) def test_alias_fallback(self): - # This test can't be expected to work on all systems. The MS - # PlaySound() docs say: - # - # If it cannot find the specified sound, PlaySound uses the - # default system event sound entry instead. If the function - # can find neither the system default entry nor the default - # sound, it makes no sound and returns FALSE. - # - # It's known to return FALSE on some real systems. - - # winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) - return + if _have_soundcard(): + winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) + else: + self.assertRaises( + RuntimeError, + winsound.PlaySound, + '!"$%&/(#+*', winsound.SND_ALIAS + ) def test_alias_nofallback(self): if _have_soundcard(): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -53,6 +53,8 @@ Tests ----- +- Issue #19595: Re-enabled a long-disabled test in test_winsound. + - Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. 
-- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 04:15:23 2013 From: python-checkins at python.org (zach.ware) Date: Sat, 30 Nov 2013 04:15:23 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5NTk1?= =?utf-8?q?=3A_Re-enable_a_long-disabled_test_in_test=5Fwinsound?= Message-ID: <3dWd4q6Gb0z7Lkn@mail.python.org> http://hg.python.org/cpython/rev/63f3e8670fa6 changeset: 87645:63f3e8670fa6 branch: 3.3 parent: 87641:9d0dc3d7bf01 user: Zachary Ware date: Wed Nov 27 23:56:04 2013 -0600 summary: Issue #19595: Re-enable a long-disabled test in test_winsound files: Lib/test/test_winsound.py | 20 ++++++++------------ Misc/NEWS | 2 ++ 2 files changed, 10 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_winsound.py b/Lib/test/test_winsound.py --- a/Lib/test/test_winsound.py +++ b/Lib/test/test_winsound.py @@ -158,18 +158,14 @@ ) def test_alias_fallback(self): - # This test can't be expected to work on all systems. The MS - # PlaySound() docs say: - # - # If it cannot find the specified sound, PlaySound uses the - # default system event sound entry instead. If the function - # can find neither the system default entry nor the default - # sound, it makes no sound and returns FALSE. - # - # It's known to return FALSE on some real systems. - - # winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) - return + if _have_soundcard(): + winsound.PlaySound('!"$%&/(#+*', winsound.SND_ALIAS) + else: + self.assertRaises( + RuntimeError, + winsound.PlaySound, + '!"$%&/(#+*', winsound.SND_ALIAS + ) def test_alias_nofallback(self): if _have_soundcard(): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -70,6 +70,8 @@ Tests ----- +- Issue #19595: Re-enabled a long-disabled test in test_winsound. + - Issue #19588: Fixed tests in test_random that were silently skipped most of the time. Patch by Julian Gindi. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 04:15:25 2013 From: python-checkins at python.org (zach.ware) Date: Sat, 30 Nov 2013 04:15:25 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Null_merge?= Message-ID: <3dWd4s2Fkpz7Ll4@mail.python.org> http://hg.python.org/cpython/rev/b0e180f8e588 changeset: 87646:b0e180f8e588 parent: 87643:2a679870d7d2 parent: 87645:63f3e8670fa6 user: Zachary Ware date: Fri Nov 29 21:14:22 2013 -0600 summary: Null merge files: -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 05:47:27 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 05:47:27 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=233693=3A_Fix_array?= =?utf-8?q?_obscure_error_message_when_given_a_str=2E?= Message-ID: <3dWg732mP5z7LkM@mail.python.org> http://hg.python.org/cpython/rev/2c87d3944c7a changeset: 87647:2c87d3944c7a user: Alexandre Vassalotti date: Fri Nov 29 20:47:15 2013 -0800 summary: Issue #3693: Fix array obscure error message when given a str. 
files: Lib/test/test_array.py | 12 ++++++++ Misc/NEWS | 4 ++ Modules/arraymodule.c | 45 ++++++++++++++++++------------ 3 files changed, 43 insertions(+), 18 deletions(-) diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -1027,6 +1027,18 @@ basesize = support.calcvobjsize('Pn2Pi') support.check_sizeof(self, a, basesize) + def test_initialize_with_unicode(self): + if self.typecode != 'u': + with self.assertRaises(TypeError) as cm: + a = array.array(self.typecode, 'foo') + self.assertIn("cannot use a str", str(cm.exception)) + with self.assertRaises(TypeError) as cm: + a = array.array(self.typecode, array.array('u', 'foo')) + self.assertIn("cannot use a unicode array", str(cm.exception)) + else: + a = array.array(self.typecode, "foo") + a = array.array(self.typecode, array.array('u', 'foo')) + class StringTest(BaseTest): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,10 @@ Library ------- +- Issue #3693: Make the error message more helpful when the array.array() + constructor is given a str. Move the array module typecode documentation to + the docstring of the constructor. + - Issue #19698: Removed exec_module() methods from importlib.machinery.BuiltinImporter and ExtensionFileLoader. diff --git a/Modules/arraymodule.c b/Modules/arraymodule.c --- a/Modules/arraymodule.c +++ b/Modules/arraymodule.c @@ -2518,6 +2518,20 @@ if (!PyArg_ParseTuple(args, "C|O:array", &c, &initial)) return NULL; + if (initial && c != 'u') { + if (PyUnicode_Check(initial)) { + PyErr_Format(PyExc_TypeError, "cannot use a str to initialize " + "an array with typecode '%c'", c); + return NULL; + } + else if (array_Check(initial) && + ((arrayobject*)initial)->ob_descr->typecode == 'u') { + PyErr_Format(PyExc_TypeError, "cannot use a unicode array to " + "initialize an array with typecode '%c'", c); + return NULL; + } + } + if (!(initial == NULL || PyList_Check(initial) || PyByteArray_Check(initial) || PyBytes_Check(initial) @@ -2644,9 +2658,19 @@ "This module defines an object type which can efficiently represent\n\ an array of basic values: characters, integers, floating point\n\ numbers. Arrays are sequence types and behave very much like lists,\n\ -except that the type of objects stored in them is constrained. The\n\ -type is specified at object creation time by using a type code, which\n\ -is a single character. The following type codes are defined:\n\ +except that the type of objects stored in them is constrained.\n"); + +PyDoc_STRVAR(arraytype_doc, +"array(typecode [, initializer]) -> array\n\ +\n\ +Return a new array whose items are restricted by typecode, and\n\ +initialized from the optional initializer value, which must be a list,\n\ +string or iterable over elements of the appropriate type.\n\ +\n\ +Arrays represent basic values and behave very much like lists, except\n\ +the type of objects stored in them is constrained. 
The type is specified\n\ +at object creation time by using a type code, which is a single character.\n\ +The following type codes are defined:\n\ \n\ Type code C Type Minimum size in bytes \n\ 'b' signed integer 1 \n\ @@ -2670,21 +2694,6 @@ C compiler used to build Python supports 'long long', or, on Windows, \n\ '__int64'.\n\ \n\ -The constructor is:\n\ -\n\ -array(typecode [, initializer]) -- create a new array\n\ -"); - -PyDoc_STRVAR(arraytype_doc, -"array(typecode [, initializer]) -> array\n\ -\n\ -Return a new array whose items are restricted by typecode, and\n\ -initialized from the optional initializer value, which must be a list,\n\ -string or iterable over elements of the appropriate type.\n\ -\n\ -Arrays represent basic values and behave very much like lists, except\n\ -the type of objects stored in them is constrained.\n\ -\n\ Methods:\n\ \n\ append() -- append a new item to the end of the array\n\ -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 06:57:26 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 30 Nov 2013 06:57:26 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Fix_and_test_pip_version_c?= =?utf-8?q?onsistency?= Message-ID: <3dWhgp2qjbz7Ljt@mail.python.org> http://hg.python.org/cpython/rev/73a84d8dc544 changeset: 87648:73a84d8dc544 user: Nick Coghlan date: Sat Nov 30 15:56:58 2013 +1000 summary: Fix and test pip version consistency files: Lib/ensurepip/__init__.py | 2 +- Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl | 0 Lib/test/test_venv.py | 4 +++- 3 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -12,7 +12,7 @@ _SETUPTOOLS_VERSION = "1.3.2" -_PIP_VERSION = "1.5.rc1" +_PIP_VERSION = "1.5rc1" _PROJECTS = [ ("setuptools", _SETUPTOOLS_VERSION), diff --git a/Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl b/Lib/ensurepip/_bundled/pip-1.5rc1-py2.py3-none-any.whl rename from Lib/ensurepip/_bundled/pip-1.5.rc1-py2.py3-none-any.whl rename to Lib/ensurepip/_bundled/pip-1.5rc1-py2.py3-none-any.whl diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -5,6 +5,7 @@ Licensed to the PSF under a contributor agreement. 
""" +import ensurepip import os import os.path import shutil @@ -313,8 +314,9 @@ err = err.decode("latin-1") # Force to text, prevent decoding errors self.assertEqual(err, "") out = out.decode("latin-1") # Force to text, prevent decoding errors + expected_version = "pip {}".format(ensurepip.version()) + self.assertEqual(out[:len(expected_version)], expected_version) env_dir = os.fsencode(self.env_dir).decode("latin-1") - self.assertTrue(out.startswith("pip")) self.assertIn(env_dir, out) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 08:15:33 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 30 Nov 2013 08:15:33 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319728=3A_add_priv?= =?utf-8?q?ate_ensurepip=2E=5Funinstall_CLI?= Message-ID: <3dWkPx3th3z7Ll5@mail.python.org> http://hg.python.org/cpython/rev/7dc4bf283857 changeset: 87649:7dc4bf283857 user: Nick Coghlan date: Sat Nov 30 17:15:09 2013 +1000 summary: Issue #19728: add private ensurepip._uninstall CLI MvL would like to be able to preserve CPython's existing clean uninstall behaviour on Windows before enabling the pip installation option by default. This private CLI means running "python -m ensurepip._uninstall" will remove pip and setuptools before proceeding with the rest of the uninstallation process. If the version of pip differs from the one bootstrapped by CPython, then the uninstallation helper will leave it alone (just like any other pip installed packages) files: Lib/ensurepip/__init__.py | 26 ++++++++- Lib/test/test_ensurepip.py | 75 ++++++++++++++++++++++++++ Lib/test/test_venv.py | 66 +++++++++++++++++----- 3 files changed, 149 insertions(+), 18 deletions(-) diff --git a/Lib/ensurepip/__init__.py b/Lib/ensurepip/__init__.py --- a/Lib/ensurepip/__init__.py +++ b/Lib/ensurepip/__init__.py @@ -20,9 +20,10 @@ ] -def _run_pip(args, additional_paths): +def _run_pip(args, additional_paths=None): # Add our bundled software to the sys.path so we can import it - sys.path = additional_paths + sys.path + if additional_paths is not None: + sys.path = additional_paths + sys.path # Install the bundled software import pip @@ -90,3 +91,24 @@ args += ["-" + "v" * verbosity] _run_pip(args + [p[0] for p in _PROJECTS], additional_paths) + +def _uninstall(*, verbosity=0): + """Helper to support a clean default uninstall process on Windows""" + # Nothing to do if pip was never installed, or has been removed + try: + import pip + except ImportError: + return + + # If the pip version doesn't match the bundled one, leave it alone + if pip.__version__ != _PIP_VERSION: + msg = ("ensurepip will only uninstall a matching pip " + "({!r} installed, {!r} bundled)") + raise RuntimeError(msg.format(pip.__version__, _PIP_VERSION)) + + # Construct the arguments to be passed to the pip command + args = ["uninstall", "-y"] + if verbosity: + args += ["-" + "v" * verbosity] + + _run_pip(args + [p[0] for p in reversed(_PROJECTS)]) diff --git a/Lib/test/test_ensurepip.py b/Lib/test/test_ensurepip.py --- a/Lib/test/test_ensurepip.py +++ b/Lib/test/test_ensurepip.py @@ -4,6 +4,8 @@ import test.support import os import os.path +import contextlib +import sys class TestEnsurePipVersion(unittest.TestCase): @@ -122,6 +124,79 @@ def test_altinstall_default_pip_conflict(self): with self.assertRaises(ValueError): ensurepip.bootstrap(altinstall=True, default_pip=True) + self.run_pip.assert_not_called() + + at contextlib.contextmanager +def fake_pip(version=ensurepip._PIP_VERSION): + if version 
is None: + pip = None + else: + class FakePip(): + __version__ = version + pip = FakePip() + sentinel = object() + orig_pip = sys.modules.get("pip", sentinel) + sys.modules["pip"] = pip + try: + yield pip + finally: + if orig_pip is sentinel: + del sys.modules["pip"] + else: + sys.modules["pip"] = orig_pip + +class TestUninstall(unittest.TestCase): + + def setUp(self): + run_pip_patch = unittest.mock.patch("ensurepip._run_pip") + self.run_pip = run_pip_patch.start() + self.addCleanup(run_pip_patch.stop) + + def test_uninstall_skipped_when_not_installed(self): + with fake_pip(None): + ensurepip._uninstall() + self.run_pip.assert_not_called() + + def test_uninstall_fails_with_wrong_version(self): + with fake_pip("not a valid version"): + with self.assertRaises(RuntimeError): + ensurepip._uninstall() + self.run_pip.assert_not_called() + + + def test_uninstall(self): + with fake_pip(): + ensurepip._uninstall() + + self.run_pip.assert_called_once_with( + ["uninstall", "-y", "pip", "setuptools"] + ) + + def test_uninstall_with_verbosity_1(self): + with fake_pip(): + ensurepip._uninstall(verbosity=1) + + self.run_pip.assert_called_once_with( + ["uninstall", "-y", "-v", "pip", "setuptools"] + ) + + def test_uninstall_with_verbosity_2(self): + with fake_pip(): + ensurepip._uninstall(verbosity=2) + + self.run_pip.assert_called_once_with( + ["uninstall", "-y", "-vv", "pip", "setuptools"] + ) + + def test_uninstall_with_verbosity_3(self): + with fake_pip(): + ensurepip._uninstall(verbosity=3) + + self.run_pip.assert_called_once_with( + ["uninstall", "-y", "-vvv", "pip", "setuptools"] + ) + + if __name__ == "__main__": diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -14,6 +14,7 @@ import tempfile from test.support import (captured_stdout, captured_stderr, run_unittest, can_symlink, EnvironmentVarGuard) +import textwrap import unittest import venv try: @@ -258,30 +259,31 @@ @skipInVenv class EnsurePipTest(BaseTest): """Test venv module installation of pip.""" - - def test_no_pip_by_default(self): - shutil.rmtree(self.env_dir) - self.run_with_capture(venv.create, self.env_dir) - envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) + def assert_pip_not_installed(self): + envpy = os.path.join(os.path.realpath(self.env_dir), + self.bindir, self.exe) try_import = 'try:\n import pip\nexcept ImportError:\n print("OK")' cmd = [envpy, '-c', try_import] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) out, err = p.communicate() - self.assertEqual(err, b"") - self.assertEqual(out.strip(), b"OK") + # We force everything to text, so unittest gives the detailed diff + # if we get unexpected results + err = err.decode("latin-1") # Force to text, prevent decoding errors + self.assertEqual(err, "") + out = out.decode("latin-1") # Force to text, prevent decoding errors + self.assertEqual(out.strip(), "OK") + + + def test_no_pip_by_default(self): + shutil.rmtree(self.env_dir) + self.run_with_capture(venv.create, self.env_dir) + self.assert_pip_not_installed() def test_explicit_no_pip(self): shutil.rmtree(self.env_dir) self.run_with_capture(venv.create, self.env_dir, with_pip=False) - envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) - try_import = 'try:\n import pip\nexcept ImportError:\n print("OK")' - cmd = [envpy, '-c', try_import] - p = subprocess.Popen(cmd, stdout=subprocess.PIPE, - stderr=subprocess.PIPE) - out, err = p.communicate() - self.assertEqual(err, b"") - 
self.assertEqual(out.strip(), b"OK") + self.assert_pip_not_installed() # Temporary skip for http://bugs.python.org/issue19744 @unittest.skipIf(ssl is None, 'pip needs SSL support') @@ -293,7 +295,8 @@ # environment settings don't cause venv to fail. envvars["PYTHONWARNINGS"] = "e" # pip doesn't ignore environment variables when running in - # isolated mode, and we don't have an active virtualenv here + # isolated mode, and we don't have an active virtualenv here, + # we're relying on the native venv support in 3.3+ # See http://bugs.python.org/issue19734 for details del envvars["PIP_REQUIRE_VIRTUALENV"] try: @@ -304,6 +307,7 @@ details = exc.output.decode(errors="replace") msg = "{}\n\n**Subprocess Output**\n{}".format(exc, details) self.fail(msg) + # Ensure pip is available in the virtual environment envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) cmd = [envpy, '-Im', 'pip', '--version'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, @@ -319,6 +323,36 @@ env_dir = os.fsencode(self.env_dir).decode("latin-1") self.assertIn(env_dir, out) + # http://bugs.python.org/issue19728 + # Check the private uninstall command provided for the Windows + # installers works (at least in a virtual environment) + cmd = [envpy, '-Im', 'ensurepip._uninstall'] + with EnvironmentVarGuard() as envvars: + # pip doesn't ignore environment variables when running in + # isolated mode, and we don't have an active virtualenv here, + # we're relying on the native venv support in 3.3+ + # See http://bugs.python.org/issue19734 for details + del envvars["PIP_REQUIRE_VIRTUALENV"] + p = subprocess.Popen(cmd, stdout=subprocess.PIPE, + stderr=subprocess.PIPE) + out, err = p.communicate() + # We force everything to text, so unittest gives the detailed diff + # if we get unexpected results + err = err.decode("latin-1") # Force to text, prevent decoding errors + self.assertEqual(err, "") + # Being really specific regarding the expected behaviour for the + # initial bundling phase in Python 3.4. If the output changes in + # future pip versions, this test can be relaxed a bit. + out = out.decode("latin-1") # Force to text, prevent decoding errors + expected_output = textwrap.dedent("""\ + Uninstalling pip: + Successfully uninstalled pip + Uninstalling setuptools: + Successfully uninstalled setuptools + """) + self.assertEqual(out, expected_output) + self.assert_pip_not_installed() + def test_main(): run_unittest(BasicTest, EnsurePipTest) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 08:45:22 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 30 Nov 2013 08:45:22 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319726=3A_actually?= =?utf-8?q?_running_=27hg_add=27_helps=2E=2E=2E?= Message-ID: <3dWl4L21nYz7Ll9@mail.python.org> http://hg.python.org/cpython/rev/04088790c077 changeset: 87650:04088790c077 user: Nick Coghlan date: Sat Nov 30 17:45:09 2013 +1000 summary: Issue #19726: actually running 'hg add' helps... 
files: Lib/ensurepip/_uninstall.py | 30 +++++++++++++++++++++++++ 1 files changed, 30 insertions(+), 0 deletions(-) diff --git a/Lib/ensurepip/_uninstall.py b/Lib/ensurepip/_uninstall.py new file mode 100644 --- /dev/null +++ b/Lib/ensurepip/_uninstall.py @@ -0,0 +1,30 @@ +"""Basic pip uninstallation support, helper for the Windows uninstaller""" + +import argparse +import ensurepip + + +def main(): + parser = argparse.ArgumentParser(prog="python -m ensurepip._uninstall") + parser.add_argument( + "--version", + action="version", + version="pip {}".format(ensurepip.version()), + help="Show the version of pip this will attempt to uninstall.", + ) + parser.add_argument( + "-v", "--verbose", + action="count", + default=0, + dest="verbosity", + help=("Give more output. Option is additive, and can be used up to 3 " + "times."), + ) + + args = parser.parse_args() + + ensurepip._uninstall(verbosity=args.verbosity) + + +if __name__ == "__main__": + main() -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 09:35:46 2013 From: python-checkins at python.org (nick.coghlan) Date: Sat, 30 Nov 2013 09:35:46 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2319728=3A_don=27t_?= =?utf-8?q?be_sensitive_to_line_endings?= Message-ID: <3dWmBV1gG8z7LkR@mail.python.org> http://hg.python.org/cpython/rev/a0ec33efa743 changeset: 87651:a0ec33efa743 user: Nick Coghlan date: Sat Nov 30 18:35:32 2013 +1000 summary: Issue #19728: don't be sensitive to line endings files: Lib/test/test_venv.py | 14 +++++--------- 1 files changed, 5 insertions(+), 9 deletions(-) diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -340,17 +340,13 @@ # if we get unexpected results err = err.decode("latin-1") # Force to text, prevent decoding errors self.assertEqual(err, "") - # Being really specific regarding the expected behaviour for the + # Being fairly specific regarding the expected behaviour for the # initial bundling phase in Python 3.4. If the output changes in - # future pip versions, this test can be relaxed a bit. + # future pip versions, this test can likely be relaxed further. 
out = out.decode("latin-1") # Force to text, prevent decoding errors - expected_output = textwrap.dedent("""\ - Uninstalling pip: - Successfully uninstalled pip - Uninstalling setuptools: - Successfully uninstalled setuptools - """) - self.assertEqual(out, expected_output) + self.assertIn("Successfully uninstalled pip", out) + self.assertIn("Successfully uninstalled setuptools", out) + # Check pip is now gone from the virtual environment self.assert_pip_not_installed() -- Repository URL: http://hg.python.org/cpython From solipsis at pitrou.net Sat Nov 30 09:43:01 2013 From: solipsis at pitrou.net (solipsis at pitrou.net) Date: Sat, 30 Nov 2013 09:43:01 +0100 Subject: [Python-checkins] Daily reference leaks (2a679870d7d2): sum=0 Message-ID: results for 2a679870d7d2 on branch "default" -------------------------------------------- test_site leaked [-2, 0, 2] references, sum=0 test_site leaked [-2, 0, 2] memory blocks, sum=0 Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/antoine/cpython/refleaks/reflogpU80bI', '-x'] From python-checkins at python.org Sat Nov 30 10:06:16 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 10:06:16 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5MDg4?= =?utf-8?q?=3A_Fix_incorrect_caching_of_the_copyreg_module=2E?= Message-ID: <3dWmsh0lNfz7Lky@mail.python.org> http://hg.python.org/cpython/rev/96d1207d33d0 changeset: 87652:96d1207d33d0 branch: 3.3 parent: 87645:63f3e8670fa6 user: Alexandre Vassalotti date: Sat Nov 30 00:53:09 2013 -0800 summary: Issue #19088: Fix incorrect caching of the copyreg module. This fix does not cause any degradation in performance. files: Misc/NEWS | 3 ++ Objects/typeobject.c | 43 +++++++++++++++++-------------- 2 files changed, 26 insertions(+), 20 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,9 @@ Library ------- +- Issue #19088: Fixed incorrect caching of the copyreg module in + object.__reduce__() and object.__reduce_ex__(). + - Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -7,10 +7,6 @@ #include -/* Cached lookup of the copyreg module, for faster __reduce__ calls */ - -static PyObject *cached_copyreg_module = NULL; - /* Support type attribute cache */ /* The cache can keep references to the names alive for longer than @@ -73,9 +69,6 @@ _PyType_Fini(void) { PyType_ClearCache(); - /* Need to forget our obsolete instance of the copyreg module at - * interpreter shutdown (issue #17408). */ - Py_CLEAR(cached_copyreg_module); } void @@ -3348,19 +3341,29 @@ static PyObject * import_copyreg(void) { - static PyObject *copyreg_str; - - if (!copyreg_str) { - copyreg_str = PyUnicode_InternFromString("copyreg"); - if (copyreg_str == NULL) - return NULL; - } - if (!cached_copyreg_module) { - cached_copyreg_module = PyImport_Import(copyreg_str); - } - - Py_XINCREF(cached_copyreg_module); - return cached_copyreg_module; + PyObject *copyreg_str; + PyObject *copyreg_module; + PyInterpreterState *interp = PyThreadState_GET()->interp; + _Py_IDENTIFIER(copyreg); + + copyreg_str = _PyUnicode_FromId(&PyId_copyreg); + if (copyreg_str == NULL) { + return NULL; + } + /* Try to fetch cached copy of copyreg from sys.modules first in an + attempt to avoid the import overhead. 
Previously this was implemented + by storing a reference to the cached module in a static variable, but + this broke when multiple embeded interpreters were in use (see issue + #17408 and #19088). */ + copyreg_module = PyDict_GetItemWithError(interp->modules, copyreg_str); + if (copyreg_module != NULL) { + Py_INCREF(copyreg_module); + return copyreg_module; + } + if (PyErr_Occurred()) { + return NULL; + } + return PyImport_Import(copyreg_str); } static PyObject * -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 10:06:17 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 10:06:17 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Issue_=2319088=3A_Merge_with_3=2E3=2E?= Message-ID: <3dWmsj2jfxz7Lky@mail.python.org> http://hg.python.org/cpython/rev/1ceb6f84b617 changeset: 87653:1ceb6f84b617 parent: 87651:a0ec33efa743 parent: 87652:96d1207d33d0 user: Alexandre Vassalotti date: Sat Nov 30 01:05:51 2013 -0800 summary: Issue #19088: Merge with 3.3. files: Misc/NEWS | 3 ++ Objects/typeobject.c | 43 +++++++++++++++++-------------- 2 files changed, 26 insertions(+), 20 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -22,6 +22,9 @@ constructor is given a str. Move the array module typecode documentation to the docstring of the constructor. +- Issue #19088: Fixed incorrect caching of the copyreg module in + object.__reduce__() and object.__reduce_ex__(). + - Issue #19698: Removed exec_module() methods from importlib.machinery.BuiltinImporter and ExtensionFileLoader. diff --git a/Objects/typeobject.c b/Objects/typeobject.c --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -7,10 +7,6 @@ #include -/* Cached lookup of the copyreg module, for faster __reduce__ calls */ - -static PyObject *cached_copyreg_module = NULL; - /* Support type attribute cache */ /* The cache can keep references to the names alive for longer than @@ -79,9 +75,6 @@ _PyType_Fini(void) { PyType_ClearCache(); - /* Need to forget our obsolete instance of the copyreg module at - * interpreter shutdown (issue #17408). */ - Py_CLEAR(cached_copyreg_module); } void @@ -3390,19 +3383,29 @@ static PyObject * import_copyreg(void) { - static PyObject *copyreg_str; - - if (!copyreg_str) { - copyreg_str = PyUnicode_InternFromString("copyreg"); - if (copyreg_str == NULL) - return NULL; - } - if (!cached_copyreg_module) { - cached_copyreg_module = PyImport_Import(copyreg_str); - } - - Py_XINCREF(cached_copyreg_module); - return cached_copyreg_module; + PyObject *copyreg_str; + PyObject *copyreg_module; + PyInterpreterState *interp = PyThreadState_GET()->interp; + _Py_IDENTIFIER(copyreg); + + copyreg_str = _PyUnicode_FromId(&PyId_copyreg); + if (copyreg_str == NULL) { + return NULL; + } + /* Try to fetch cached copy of copyreg from sys.modules first in an + attempt to avoid the import overhead. Previously this was implemented + by storing a reference to the cached module in a static variable, but + this broke when multiple embeded interpreters were in use (see issue + #17408 and #19088). 
*/ + copyreg_module = PyDict_GetItemWithError(interp->modules, copyreg_str); + if (copyreg_module != NULL) { + Py_INCREF(copyreg_module); + return copyreg_module; + } + if (PyErr_Occurred()) { + return NULL; + } + return PyImport_Import(copyreg_str); } Py_LOCAL(PyObject *) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 22:16:13 2013 From: python-checkins at python.org (serhiy.storchaka) Date: Sat, 30 Nov 2013 22:16:13 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython=3A_Issue_=2317897=3A_Optimize?= =?utf-8?q?d_unpickle_prefetching=2E?= Message-ID: <3dX53x2P87z7LjR@mail.python.org> http://hg.python.org/cpython/rev/d565310e7ae3 changeset: 87654:d565310e7ae3 user: Serhiy Storchaka date: Sat Nov 30 23:15:38 2013 +0200 summary: Issue #17897: Optimized unpickle prefetching. files: Misc/NEWS | 2 + Modules/_pickle.c | 56 +++++++++++++++-------------------- 2 files changed, 26 insertions(+), 32 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -18,6 +18,8 @@ Library ------- +- Issue #17897: Optimized unpickle prefetching. + - Issue #3693: Make the error message more helpful when the array.array() constructor is given a str. Move the array module typecode documentation to the docstring of the constructor. diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -1121,7 +1121,7 @@ _Unpickler_ReadFromFile(UnpicklerObject *self, Py_ssize_t n) { PyObject *data; - Py_ssize_t read_size, prefetched_size = 0; + Py_ssize_t read_size; assert(self->read != NULL); @@ -1134,46 +1134,38 @@ Py_DECREF(empty_tuple); } else { - PyObject *len = PyLong_FromSsize_t(n); - if (len == NULL) - return -1; - data = _Pickle_FastCall(self->read, len); - } - if (data == NULL) - return -1; - - /* Prefetch some data without advancing the file pointer, if possible */ - if (self->peek) { - PyObject *len, *prefetched; - len = PyLong_FromSsize_t(PREFETCH); - if (len == NULL) { - Py_DECREF(data); - return -1; - } - prefetched = _Pickle_FastCall(self->peek, len); - if (prefetched == NULL) { - if (PyErr_ExceptionMatches(PyExc_NotImplementedError)) { + PyObject *len; + /* Prefetch some data without advancing the file pointer, if possible */ + if (self->peek && n < PREFETCH) { + len = PyLong_FromSsize_t(PREFETCH); + if (len == NULL) + return -1; + data = _Pickle_FastCall(self->peek, len); + if (data == NULL) { + if (!PyErr_ExceptionMatches(PyExc_NotImplementedError)) + return -1; /* peek() is probably not supported by the given file object */ PyErr_Clear(); Py_CLEAR(self->peek); } else { + read_size = _Unpickler_SetStringInput(self, data); Py_DECREF(data); - return -1; + self->prefetched_idx = 0; + if (n <= read_size) + return n; } } - else { - assert(PyBytes_Check(prefetched)); - prefetched_size = PyBytes_GET_SIZE(prefetched); - PyBytes_ConcatAndDel(&data, prefetched); - if (data == NULL) - return -1; - } - } - - read_size = _Unpickler_SetStringInput(self, data) - prefetched_size; + len = PyLong_FromSsize_t(n); + if (len == NULL) + return -1; + data = _Pickle_FastCall(self->read, len); + } + if (data == NULL) + return -1; + + read_size = _Unpickler_SetStringInput(self, data); Py_DECREF(data); - self->prefetched_idx = read_size; return read_size; } -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 22:24:51 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 22:24:51 +0100 (CET) Subject: [Python-checkins] 
=?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE2MjMx?= =?utf-8?q?=3A_Allow_false_values_other_than_None_to_be_used_as_persistent?= =?utf-8?q?_IDs=2E?= Message-ID: <3dX5Fv5ph3zSyF@mail.python.org> http://hg.python.org/cpython/rev/ee627983ba28 changeset: 87655:ee627983ba28 branch: 2.7 parent: 87644:30b3798782f1 user: Alexandre Vassalotti date: Sat Nov 30 13:24:13 2013 -0800 summary: Issue #16231: Allow false values other than None to be used as persistent IDs. files: Lib/pickle.py | 2 +- Lib/test/pickletester.py | 40 +++++++++++++++------------ Misc/NEWS | 5 +++ 3 files changed, 28 insertions(+), 19 deletions(-) diff --git a/Lib/pickle.py b/Lib/pickle.py --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -269,7 +269,7 @@ def save(self, obj): # Check for persistent id (defined by a subclass) pid = self.persistent_id(obj) - if pid: + if pid is not None: self.save_pers(pid) return diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1152,30 +1152,34 @@ if isinstance(object, int) and object % 2 == 0: self.id_count += 1 return str(object) + elif object == "test_false_value": + self.false_count += 1 + return "" else: return None def persistent_load(self, oid): - self.load_count += 1 - object = int(oid) - assert object % 2 == 0 - return object + if not oid: + self.load_false_count += 1 + return "test_false_value" + else: + self.load_count += 1 + object = int(oid) + assert object % 2 == 0 + return object def test_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = range(10) - self.assertEqual(self.loads(self.dumps(L)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) - - def test_bin_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = range(10) - self.assertEqual(self.loads(self.dumps(L, 1)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) + L = range(10) + ["test_false_value"] + for proto in protocols: + self.id_count = 0 + self.false_count = 0 + self.load_false_count = 0 + self.load_count = 0 + self.assertEqual(self.loads(self.dumps(L, proto)), L) + self.assertEqual(self.id_count, 5) + self.assertEqual(self.false_count, 1) + self.assertEqual(self.load_count, 5) + self.assertEqual(self.load_false_count, 1) class AbstractPicklerUnpicklerObjectTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -15,6 +15,11 @@ Library ------- +- Issue #16231: Fixed pickle.Pickler to only fallback to its default pickling + behaviour when Pickler.persistent_id returns None, but not for any other + false values. This allows false values other values other than None to be + used as persistent IDs. This behaviour is consistent with cPickle. + - Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 22:56:31 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 22:56:31 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogRml4ZWQgX3BpY2ts?= =?utf-8?q?e=2EUnpickler_to_handle_empty_persistent_IDs_correctly=2E?= Message-ID: <3dX5yR3WhWz7Ljl@mail.python.org> http://hg.python.org/cpython/rev/b92f9eaedb76 changeset: 87656:b92f9eaedb76 branch: 3.3 parent: 87652:96d1207d33d0 user: Alexandre Vassalotti date: Sat Nov 30 13:52:35 2013 -0800 summary: Fixed _pickle.Unpickler to handle empty persistent IDs correctly. 
files: Lib/test/pickletester.py | 40 +++++++++++++++------------ Misc/NEWS | 3 ++ Modules/_pickle.c | 2 +- 3 files changed, 26 insertions(+), 19 deletions(-) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1499,30 +1499,34 @@ if isinstance(object, int) and object % 2 == 0: self.id_count += 1 return str(object) + elif object == "test_false_value": + self.false_count += 1 + return "" else: return None def persistent_load(self, oid): - self.load_count += 1 - object = int(oid) - assert object % 2 == 0 - return object + if not oid: + self.load_false_count += 1 + return "test_false_value" + else: + self.load_count += 1 + object = int(oid) + assert object % 2 == 0 + return object def test_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = list(range(10)) - self.assertEqual(self.loads(self.dumps(L)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) - - def test_bin_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = list(range(10)) - self.assertEqual(self.loads(self.dumps(L, 1)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) + L = list(range(10)) + ["test_false_value"] + for proto in protocols: + self.id_count = 0 + self.false_count = 0 + self.load_false_count = 0 + self.load_count = 0 + self.assertEqual(self.loads(self.dumps(L, proto)), L) + self.assertEqual(self.id_count, 5) + self.assertEqual(self.false_count, 1) + self.assertEqual(self.load_count, 5) + self.assertEqual(self.load_false_count, 1) class AbstractPicklerUnpicklerObjectTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -21,6 +21,9 @@ - Issue #19088: Fixed incorrect caching of the copyreg module in object.__reduce__() and object.__reduce_ex__(). +- Fixed _pickle.Unpickler to not fail when loading empty strings as + persistent IDs. + - Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -4665,7 +4665,7 @@ if (self->pers_func) { if ((len = _Unpickler_Readline(self, &s)) < 0) return -1; - if (len < 2) + if (len < 1) return bad_readline(); pid = PyBytes_FromStringAndSize(s, len - 1); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 22:56:32 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 22:56:32 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?b?KTogTWVyZ2Ugd2l0aCAzLjMu?= Message-ID: <3dX5yS5WrLz7Lk1@mail.python.org> http://hg.python.org/cpython/rev/f20d966dc499 changeset: 87657:f20d966dc499 parent: 87654:d565310e7ae3 parent: 87656:b92f9eaedb76 user: Alexandre Vassalotti date: Sat Nov 30 13:55:39 2013 -0800 summary: Merge with 3.3. 
files: Lib/test/pickletester.py | 40 +++++++++++++++------------ Misc/NEWS | 3 ++ Modules/_pickle.c | 2 +- 3 files changed, 26 insertions(+), 19 deletions(-) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -1785,30 +1785,34 @@ if isinstance(object, int) and object % 2 == 0: self.id_count += 1 return str(object) + elif object == "test_false_value": + self.false_count += 1 + return "" else: return None def persistent_load(self, oid): - self.load_count += 1 - object = int(oid) - assert object % 2 == 0 - return object + if not oid: + self.load_false_count += 1 + return "test_false_value" + else: + self.load_count += 1 + object = int(oid) + assert object % 2 == 0 + return object def test_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = list(range(10)) - self.assertEqual(self.loads(self.dumps(L)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) - - def test_bin_persistence(self): - self.id_count = 0 - self.load_count = 0 - L = list(range(10)) - self.assertEqual(self.loads(self.dumps(L, 1)), L) - self.assertEqual(self.id_count, 5) - self.assertEqual(self.load_count, 5) + L = list(range(10)) + ["test_false_value"] + for proto in protocols: + self.id_count = 0 + self.false_count = 0 + self.load_false_count = 0 + self.load_count = 0 + self.assertEqual(self.loads(self.dumps(L, proto)), L) + self.assertEqual(self.id_count, 5) + self.assertEqual(self.false_count, 1) + self.assertEqual(self.load_count, 5) + self.assertEqual(self.load_false_count, 1) class AbstractPicklerUnpicklerObjectTests(unittest.TestCase): diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -30,6 +30,9 @@ - Issue #19698: Removed exec_module() methods from importlib.machinery.BuiltinImporter and ExtensionFileLoader. +- Fixed _pickle.Unpickler to not fail when loading empty strings as + persistent IDs. + - ssl.create_default_context() sets OP_NO_COMPRESSION to prevent CRIME. - Issue #19802: Add socket.SO_PRIORITY. diff --git a/Modules/_pickle.c b/Modules/_pickle.c --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -5355,7 +5355,7 @@ if (self->pers_func) { if ((len = _Unpickler_Readline(self, &s)) < 0) return -1; - if (len < 2) + if (len < 1) return bad_readline(); pid = PyBytes_FromStringAndSize(s, len - 1); -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 23:03:10 2013 From: python-checkins at python.org (alexandre.vassalotti) Date: Sat, 30 Nov 2013 23:03:10 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=282=2E7=29=3A_Fix_typo_in_Mi?= =?utf-8?q?sc/NEWS=2E?= Message-ID: <3dX6660pZTz7LjY@mail.python.org> http://hg.python.org/cpython/rev/ce55df8ee815 changeset: 87658:ce55df8ee815 branch: 2.7 parent: 87655:ee627983ba28 user: Alexandre Vassalotti date: Sat Nov 30 14:02:47 2013 -0800 summary: Fix typo in Misc/NEWS. files: Misc/NEWS | 4 ++-- 1 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Misc/NEWS b/Misc/NEWS --- a/Misc/NEWS +++ b/Misc/NEWS @@ -17,8 +17,8 @@ - Issue #16231: Fixed pickle.Pickler to only fallback to its default pickling behaviour when Pickler.persistent_id returns None, but not for any other - false values. This allows false values other values other than None to be - used as persistent IDs. This behaviour is consistent with cPickle. + false values. This allows false values other than None to be used as + persistent IDs. This behaviour is consistent with cPickle. 
- Issue #11508: Fixed uuid.getnode() and uuid.uuid1() on environment with virtual interface. Original patch by Kent Frazier. -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 23:46:43 2013 From: python-checkins at python.org (vinay.sajip) Date: Sat, 30 Nov 2013 23:46:43 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMi43KTogSXNzdWUgIzE5Nzg5?= =?utf-8?q?=3A_Clarified_documentation_for_logging=2Edisable=2E?= Message-ID: <3dX74M0VWMz7LlY@mail.python.org> http://hg.python.org/cpython/rev/b6377ca8087a changeset: 87659:b6377ca8087a branch: 2.7 user: Vinay Sajip date: Sat Nov 30 22:43:13 2013 +0000 summary: Issue #19789: Clarified documentation for logging.disable. files: Doc/library/logging.rst | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -837,8 +837,10 @@ effect is to disable all logging calls of severity *lvl* and below, so that if you call it with a value of INFO, then all INFO and DEBUG events would be discarded, whereas those of severity WARNING and above would be processed - according to the logger's effective level. To undo the effect of a call to - ``logging.disable(lvl)``, call ``logging.disable(logging.NOTSET)``. + according to the logger's effective level. If + ``logging.disable(logging.NOTSET)`` is called, it effectively removes this + overriding level, so that logging output again depends on the effective + levels of individual loggers. .. function:: addLevelName(lvl, levelName) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 23:46:44 2013 From: python-checkins at python.org (vinay.sajip) Date: Sat, 30 Nov 2013 23:46:44 +0100 (CET) Subject: [Python-checkins] =?utf-8?b?Y3B5dGhvbiAoMy4zKTogSXNzdWUgIzE5Nzg5?= =?utf-8?q?=3A_Clarified_documentation_for_logging=2Edisable=2E?= Message-ID: <3dX74N2DQjz7Lls@mail.python.org> http://hg.python.org/cpython/rev/5c8130b85c17 changeset: 87660:5c8130b85c17 branch: 3.3 parent: 87656:b92f9eaedb76 user: Vinay Sajip date: Sat Nov 30 22:45:29 2013 +0000 summary: Issue #19789: Clarified documentation for logging.disable. files: Doc/library/logging.rst | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -987,8 +987,10 @@ effect is to disable all logging calls of severity *lvl* and below, so that if you call it with a value of INFO, then all INFO and DEBUG events would be discarded, whereas those of severity WARNING and above would be processed - according to the logger's effective level. To undo the effect of a call to - ``logging.disable(lvl)``, call ``logging.disable(logging.NOTSET)``. + according to the logger's effective level. If + ``logging.disable(logging.NOTSET)`` is called, it effectively removes this + overriding level, so that logging output again depends on the effective + levels of individual loggers. .. 
function:: addLevelName(lvl, levelName) -- Repository URL: http://hg.python.org/cpython From python-checkins at python.org Sat Nov 30 23:46:45 2013 From: python-checkins at python.org (vinay.sajip) Date: Sat, 30 Nov 2013 23:46:45 +0100 (CET) Subject: [Python-checkins] =?utf-8?q?cpython_=28merge_3=2E3_-=3E_default?= =?utf-8?q?=29=3A_Closes_=2319789=3A_Merged_update_from_3=2E3=2E?= Message-ID: <3dX74P40Csz7Llk@mail.python.org> http://hg.python.org/cpython/rev/eae4b83108fb changeset: 87661:eae4b83108fb parent: 87657:f20d966dc499 parent: 87660:5c8130b85c17 user: Vinay Sajip date: Sat Nov 30 22:46:29 2013 +0000 summary: Closes #19789: Merged update from 3.3. files: Doc/library/logging.rst | 6 ++++-- 1 files changed, 4 insertions(+), 2 deletions(-) diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -987,8 +987,10 @@ effect is to disable all logging calls of severity *lvl* and below, so that if you call it with a value of INFO, then all INFO and DEBUG events would be discarded, whereas those of severity WARNING and above would be processed - according to the logger's effective level. To undo the effect of a call to - ``logging.disable(lvl)``, call ``logging.disable(logging.NOTSET)``. + according to the logger's effective level. If + ``logging.disable(logging.NOTSET)`` is called, it effectively removes this + overriding level, so that logging output again depends on the effective + levels of individual loggers. .. function:: addLevelName(lvl, levelName) -- Repository URL: http://hg.python.org/cpython From victor.stinner at gmail.com Wed Nov 27 23:57:25 2013 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 27 Nov 2013 23:57:25 +0100 Subject: [Python-checkins] cpython: asyncio: Change write buffer use to avoid O(N**2). Make write()/sendto() accept In-Reply-To: <3dVGSl4Tl3zRNM@mail.python.org> References: <3dVGSl4Tl3zRNM@mail.python.org> Message-ID: 2013/11/27 guido.van.rossum : > http://hg.python.org/cpython/rev/80e0040d910c > changeset: 87617:80e0040d910c > user: Guido van Rossum > date: Wed Nov 27 14:12:48 2013 -0800 > summary: > asyncio: Change write buffer use to avoid O(N**2). Make write()/sendto() accept bytearray/memoryview too. Change some asserts with proper exceptions. Was this change discussed somewhere? Antoine told me on IRC that it was discussed on the tulip mailing list. It would be nice to mention it in the commit message (for the next change ;-). > diff --git a/Lib/asyncio/selector_events.py b/Lib/asyncio/selector_events.py > --- a/Lib/asyncio/selector_events.py > +++ b/Lib/asyncio/selector_events.py > @@ -340,6 +340,8 @@ > > max_size = 256 * 1024 # Buffer size passed to recv(). > > + _buffer_factory = bytearray # Constructs initial value for self._buffer. > + Because the buffer type is now configurable, it would be nice to explain in a short comment why bytearray fits this use case better. (I like the bytearray object, so I like your whole changeset!) > @@ -762,6 +776,8 @@ > > class _SelectorDatagramTransport(_SelectorTransport): > > + _buffer_factory = collections.deque + > def __init__(self, loop, sock, protocol, address=None, extra=None): > super().__init__(loop, sock, protocol, extra) > self._address = address Why is collections.deque preferred here? Could you also please add a comment? Victor
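
A short illustration of the two questions raised above. This is a simplified sketch, not the actual asyncio implementation: the class names StreamWriteBuffer and DatagramWriteBuffer and the drain()/drain_one() helpers are invented for the example. The point it tries to make is that stream data has no message boundaries, so outgoing chunks can be coalesced into one bytearray, whereas a datagram transport must keep each (data, addr) pair separate, which is what a collections.deque provides.

    # Illustrative sketch only -- not the asyncio code; names are invented.
    import collections


    class StreamWriteBuffer:
        """Stream writes can be coalesced into a single bytearray.

        bytearray.extend() appends in amortized O(1), and deleting the sent
        prefix mutates the buffer in place instead of building a new object
        on every write.  Repeatedly concatenating bytes objects
        (buf = buf + chunk) copies all accumulated data each time, which is
        O(N**2) in total -- presumably the behaviour the commit summary
        ("avoid O(N**2)") refers to.
        """

        def __init__(self):
            self._buffer = bytearray()      # same idea as _buffer_factory = bytearray

        def write(self, data):
            # bytes, bytearray and memoryview are all accepted here.
            self._buffer.extend(data)

        def drain(self, n):
            """Pretend the socket accepted n bytes; drop them from the buffer."""
            sent = bytes(self._buffer[:n])
            del self._buffer[:n]            # in-place removal of the sent prefix
            return sent


    class DatagramWriteBuffer:
        """Datagram writes cannot be coalesced.

        Each queued sendto() payload must stay a separate packet and may
        target a different address, so the buffer holds (data, addr) pairs.
        deque.append() and deque.popleft() are both O(1), giving a cheap
        FIFO of pending datagrams.
        """

        def __init__(self):
            self._buffer = collections.deque()  # same idea as _buffer_factory = collections.deque

        def sendto(self, data, addr):
            self._buffer.append((bytes(data), addr))

        def drain_one(self):
            """Pretend the socket sent exactly one pending datagram."""
            return self._buffer.popleft()


    if __name__ == "__main__":
        stream = StreamWriteBuffer()
        for chunk in (b"HTTP/1.1 200 OK\r\n", b"Content-Length: 2\r\n\r\n", b"ok"):
            stream.write(chunk)
        print(stream.drain(17))             # b'HTTP/1.1 200 OK\r\n'

        dgram = DatagramWriteBuffer()
        dgram.sendto(b"ping", ("127.0.0.1", 9999))
        dgram.sendto(bytearray(b"pong"), ("10.0.0.1", 9999))
        print(dgram.drain_one())            # (b'ping', ('127.0.0.1', 9999))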