From report at bugs.python.org Tue Dec 1 00:31:34 2015
From: report at bugs.python.org (Martin Panter)
Date: Tue, 01 Dec 2015 05:31:34 +0000
Subject: [New-bugs-announce] [issue25771] importlib: '.submodule' is not a relative name (no leading dot)
Message-ID: <1448947894.1.0.971575571341.issue25771@psf.upfronthosting.co.za>

New submission from Martin Panter:

>>> import importlib.util
>>> importlib.util.resolve_name(".submodule", None)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/proj/python/cpython/Lib/importlib/util.py", line 26, in resolve_name
    '(no leading dot)'.format(name))
ValueError: '.submodule' is not a relative name (no leading dot)

This message sounds like nonsense. Perhaps it should say something like:

'.submodule' should be an absolute name (no leading dot)

or:

relative import of '.submodule' not allowed outside of a package

----------
keywords: easy
messages: 255642
nosy: martin.panter
priority: normal
severity: normal
status: open
title: importlib: '.submodule' is not a relative name (no leading dot)
type: behavior
versions: Python 3.4, Python 3.5, Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Tue Dec 1 03:02:50 2015
From: report at bugs.python.org (Juchen Zeng)
Date: Tue, 01 Dec 2015 08:02:50 +0000
Subject: [New-bugs-announce] [issue25772] Misleading descriptions about built-in type `super.`
Message-ID: <1448956970.84.0.677978683679.issue25772@psf.upfronthosting.co.za>

New submission from Juchen Zeng:

A few days ago, I was learning about the built-in type `super` through the [python official doc](https://docs.python.org/2/library/functions.html#super), and I was misled by its documentation in the part about how `__mro__` resolution works. Below is the part which confused me:

"""
super(type[, object-or-type])

# ISSUE IRRELEVANT DOC OMITTED

The __mro__ attribute of the type lists the method resolution search order used by both getattr() and super(). The attribute is dynamic and can change whenever the inheritance hierarchy is updated.
"""

From the description in the doc we can see that `super()` takes two arguments; the first is `type` and the second is an optional `object-or-type`. So, when I saw the doc statement `The __mro__ attribute of the type lists the method resolution search order used by both getattr() and super().`, I naturally thought that `type` here refers to the compulsory first `type` argument. But after doing a series of [experiments][EXP_REF], it turns out that `super()` was using the second `type`'s `__mro__` attribute! And if the second argument is an object, it will use `object.__class__.__mro__` as its resolution order.

Unless a learner experiments with it thoroughly, as I did, they will have a lot of difficulty figuring out that the `type` mentioned in the doc refers to the second, optional argument. I think the doc should clearly specify that the second argument's `__mro__` is what super relies on. I suggest adding a distinction in the argument names or adding more clarifying information to the doc here.

By the way, the Python 3 documentation has the same issue. If you decide to fix this, maybe you want to fix its Python 3 version, too.
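A minimal sketch of the behaviour described above (the classes A, B, C, D and the attribute x are made up for illustration); it shows that super() walks the __mro__ of the second argument's class, not the __mro__ of the first argument alone:

class A(object):
    x = 'A'

class B(A):
    x = 'B'

class C(A):
    x = 'C'

class D(B, C):
    pass

obj = D()
print(D.__mro__)        # D, B, C, A, object
print(super(B, obj).x)  # prints 'C': the search walks type(obj).__mro__ past B,
                        # so it finds C.x; using only B's own __mro__ would give 'A'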
[EXP_REF]: http://stackoverflow.com/questions/33890918/how-does-super-interacts-with-a-classs-mro-attribute-in-multiple-inheri/33891281#33891281

----------
assignee: docs at python
components: Documentation
messages: 255643
nosy: Juchen Zeng, docs at python, eric.araujo, ezio.melotti, georg.brandl
priority: normal
severity: normal
status: open
title: Misleading descriptions about built-in type `super.`
versions: Python 2.7

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Tue Dec 1 10:44:28 2015
From: report at bugs.python.org (Serhiy Storchaka)
Date: Tue, 01 Dec 2015 15:44:28 +0000
Subject: [New-bugs-announce] [issue25773] Deprecate deleting with PyObject_SetAttr, PyObject_SetAttrString and PySequence_SetItem
Message-ID: <1448984668.36.0.0444302662987.issue25773@psf.upfronthosting.co.za>

New submission from Serhiy Storchaka:

If the third argument of PyObject_SetAttr(), PyObject_SetAttrString() or PySequence_SetItem() is NULL, these functions delete an attribute or an item. This is a rather undocumented implementation detail. There are special counterparts for deleting: PyObject_DelAttr(), PyObject_DelAttrString() and PySequence_DelItem(). It may be worth deprecating the deleting behaviour of the Set* functions.

The proposed patch deprecates using these Set* functions for deleting and replaces their uses with the appropriate Del* functions where needed.

Discussion on Python-Dev: http://comments.gmane.org/gmane.comp.python.devel/155474

Issue that documents deleting with the Set* APIs: issue25701.

----------
components: Interpreter Core
files: PyObject_SetAttr_deleting.patch
keywords: patch
messages: 255655
nosy: gvanrossum, martin.panter, ncoghlan, serhiy.storchaka
priority: normal
severity: normal
stage: patch review
status: open
title: Deprecate deleting with PyObject_SetAttr, PyObject_SetAttrString and PySequence_SetItem
type: enhancement
versions: Python 3.6
Added file: http://bugs.python.org/file41200/PyObject_SetAttr_deleting.patch

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Tue Dec 1 13:31:55 2015
From: report at bugs.python.org (Zachary Ware)
Date: Tue, 01 Dec 2015 18:31:55 +0000
Subject: [New-bugs-announce] [issue25774] [benchmarks] Adjust to allow uploading benchmark data to codespeed
Message-ID: <1448994715.99.0.967922145164.issue25774@psf.upfronthosting.co.za>

New submission from Zachary Ware:

Here's a patch to the benchmarks repo that allows running a benchmark on a single interpreter and returning raw data, and provides a script to upload the raw data to a codespeed instance. It's a bit of a quick hack, but is effective. This is a prerequisite to getting speed.python.org running.

----------
components: Benchmarks
files: benchmarks.diff
hgrepos: 323
keywords: patch
messages: 255669
nosy: brett.cannon, pitrou, zach.ware
priority: normal
severity: normal
stage: patch review
status: open
title: [benchmarks] Adjust to allow uploading benchmark data to codespeed
type: enhancement
Added file: http://bugs.python.org/file41202/benchmarks.diff

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Tue Dec 1 14:38:33 2015
From: report at bugs.python.org (Nicholas Chammas)
Date: Tue, 01 Dec 2015 19:38:33 +0000
Subject: [New-bugs-announce] [issue25775] Bug tracker emails go to spam
Message-ID: <1448998713.15.0.646298994549.issue25775@psf.upfronthosting.co.za>

New submission from Nicholas Chammas:

Not sure where to report this.
Is there a component for the bug tracker itself?

Anyway, Gmail sends emails from this bug tracker to spam and flags each one with the following message:

> Why is this message in Spam? It is in violation of Google's recommended email sender guidelines. Learn more
> https://support.google.com/mail/answer/81126?hl=en#authentication

Is this actionable? Is this a known issue?

----------
messages: 255676
nosy: Nicholas Chammas
priority: normal
severity: normal
status: open
title: Bug tracker emails go to spam
type: behavior

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 05:29:15 2015
From: report at bugs.python.org (Serhiy Storchaka)
Date: Wed, 02 Dec 2015 10:29:15 +0000
Subject: [New-bugs-announce] [issue25776] More compact pickle of iterators etc
Message-ID: <1449052155.68.0.447852149746.issue25776@psf.upfronthosting.co.za>

New submission from Serhiy Storchaka:

The proposed patch makes a number of classes produce more compact pickle data in the common case. This includes iterators of list, tuple, str, bytes, bytearray, enumerate, array, deque, the iterator object for classes with __getitem__, some itertools iterators, and the non-iterator objects slice, bytearray and deque. This is achieved by omitting default constructor arguments or state.

Exhausted iterators are pickled as iter(()). This is not new: exhausted bytes and bytearray iterators and the reversed list iterator are already pickled as iter('') or iter([]) respectively. iter(()) is just the simplest way to create an empty iterator and it has the most compact pickle representation.

An example.

Unpatched:

>>> import pickle, pickletools, itertools
>>> len(pickle.dumps(itertools.islice('abcdefgh', 4), 3))
80
>>> len(pickletools.optimize(pickle.dumps(itertools.islice('abcdefgh', 4), 3)))
66
>>> pickletools.dis(pickletools.optimize(pickle.dumps(itertools.islice('abcdefgh', 4), 3)))
 0: \x80 PROTO 3
 2: c GLOBAL 'itertools islice'
20: ( MARK
21: c GLOBAL 'builtins iter'
36: X BINUNICODE 'abcdefgh'
49: \x85 TUPLE1
50: R REDUCE
51: K BININT1 0
53: b BUILD
54: K BININT1 0
56: K BININT1 4
58: K BININT1 1
60: t TUPLE (MARK at 20)
61: R REDUCE
62: K BININT1 0
64: b BUILD
65: . STOP
highest protocol among opcodes = 2

Patched:

>>> len(pickle.dumps(itertools.islice('abcdefgh', 4), 3))
69
>>> len(pickletools.optimize(pickle.dumps(itertools.islice('abcdefgh', 4), 3)))
55
>>> pickletools.dis(pickletools.optimize(pickle.dumps(itertools.islice('abcdefgh', 4), 3)))
 0: \x80 PROTO 3
 2: c GLOBAL 'itertools islice'
20: c GLOBAL 'builtins iter'
35: X BINUNICODE 'abcdefgh'
48: \x85 TUPLE1
49: R REDUCE
50: K BININT1 4
52: \x86 TUPLE2
53: R REDUCE
54: . STOP
highest protocol among opcodes = 2

----------
components: Extension Modules, Interpreter Core
files: iterators_pickle.diff
keywords: patch
messages: 255699
nosy: alexandre.vassalotti, pitrou, rhettinger, serhiy.storchaka
priority: normal
severity: normal
stage: patch review
status: open
title: More compact pickle of iterators etc
type: enhancement
versions: Python 3.6
Added file: http://bugs.python.org/file41207/iterators_pickle.diff

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 07:22:21 2015
From: report at bugs.python.org (Juchen Zeng)
Date: Wed, 02 Dec 2015 12:22:21 +0000
Subject: [New-bugs-announce] [issue25777] Misleading descriptions in docs about invoking descriptors.
Message-ID: <1449058941.53.0.26351823319.issue25777@psf.upfronthosting.co.za>

New submission from Juchen Zeng:

[Doc Link](https://docs.python.org/2/howto/descriptor.html#invoking-descriptors)

In the descriptions about how to invoke descriptors with super(), it says:

    The call super(B, obj).m() searches obj.__class__.__mro__ for the base class A immediately following B and then returns A.__dict__['m'].__get__(obj, B). If not a descriptor, m is returned unchanged. If not in the dictionary, m reverts to a search using object.__getattribute__().

But the call `super(B, obj).m()` will not return `A.__dict__['m'].__get__(obj, B)`; it will trigger the `__call__` method of `A.__dict__['m'].__get__(obj, B)` if it has that attribute, and return whatever this `__call__` method returns. That could be anything. It is actually `super(B, obj).m` that returns `A.__dict__['m'].__get__(obj, B)`, if m is a descriptor.

In short, the original description in the doc can be abbreviated to: `The call super(B, obj).m() [did something] and returns A.__dict__['m'].__get__(obj, B).` This is misleading, as the method/function call isn't the core part of the sentence. I suggest the doc be fixed like this:

    The action super(B, obj).m searches obj.__class__.__mro__ for the base class A immediately following B and then returns A.__dict__['m'].__get__(obj, B).

----------
assignee: docs at python
components: Documentation
messages: 255712
nosy: Juchen Zeng, docs at python, martin.panter
priority: normal
severity: normal
status: open
title: Misleading descriptions in docs about invoking descriptors.
versions: Python 2.7

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 08:55:25 2015
From: report at bugs.python.org (Anshul Agrawal)
Date: Wed, 02 Dec 2015 13:55:25 +0000
Subject: [New-bugs-announce] [issue25778] Error on import matplotlib.pyplot and seaborn (Python3 - Windows 10 64-bit issue)
Message-ID: <1449064525.57.0.047427517368.issue25778@psf.upfronthosting.co.za>

New submission from Anshul Agrawal:

I have described the error message I got when I recently installed Python 3.5.0 (via Anaconda3) and then subsequently tried to import the matplotlib.pyplot and seaborn packages in this StackOverflow post (http://stackoverflow.com/questions/34004063/error-on-import-matplotlib-pyplot-on-anaconda3-for-windows-10-home-64-bit-pc).

Another person responded to the above post with a simple 1-line patch to the file "fontList.py3k.cache" (please see the above post for details). This does appear to be a purely Python 3 issue on some Windows platforms. It would be nice if the Python 3 developers could check the patch out for any possible side effects and incorporate a permanent fix in future versions. Thanks.
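Going back to issue25777 above, a small sketch (with made-up classes A and B) of the distinction that report draws between the attribute lookup and the call itself:

class A(object):
    def m(self):
        return 'A.m'

class B(A):
    def m(self):
        return 'B.m'

obj = B()
bound = super(B, obj).m   # the lookup: this is A.__dict__['m'].__get__(obj, B)
print(bound)              # a bound method object
print(bound())            # 'A.m' -- the call returns whatever that bound method returns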
----------
components: Extension Modules
messages: 255715
nosy: anshul6
priority: normal
severity: normal
status: open
title: Error on import matplotlib.pyplot and seaborn (Python3 - Windows 10 64-bit issue)
type: behavior
versions: Python 3.5

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 10:37:25 2015
From: report at bugs.python.org (Jack O'Connor)
Date: Wed, 02 Dec 2015 15:37:25 +0000
Subject: [New-bugs-announce] [issue25779] deadlock with asyncio+contextmanager+ExitStack
Message-ID: <1449070645.76.0.674051861514.issue25779@psf.upfronthosting.co.za>

New submission from Jack O'Connor:

The following hangs at 100% CPU on Python 3.5, though not on Python 3.4:

1) Start an asyncio coroutine with run_until_complete().
2) Inside the coroutine, enter an ExitStack using a with-statement.
3) Inside the with-statement, call ExitStack.enter_context() with a generator context manager. It doesn't matter what the generator yields.
4) After the enter_context() call, raise an exception.

Here's an example script that does all of this and repros the hang: https://gist.github.com/oconnor663/483db2820bb5f877c9ed

----------
components: asyncio
messages: 255719
nosy: gvanrossum, haypo, oconnor663, yselivanov
priority: normal
severity: normal
status: open
title: deadlock with asyncio+contextmanager+ExitStack
type: behavior
versions: Python 3.5

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 11:40:47 2015
From: report at bugs.python.org (Stefan Tatschner)
Date: Wed, 02 Dec 2015 16:40:47 +0000
Subject: [New-bugs-announce] [issue25780] Add support for CAN_RAW_JOIN_FILTERS
Message-ID: <1449074447.29.0.213464499345.issue25780@psf.upfronthosting.co.za>

New submission from Stefan Tatschner:

Here is a patch which adds support for CAN_RAW_JOIN_FILTERS, which has been available since Linux 4.1 [1]. My patch fixes trailing whitespace issues as well. Since I have a newer version of autotools, running "autoreconf" generates a lot of changes, so I left that out for the time being.

[1]: https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/commit/?id=a5581ef4c2eac6449188862e903eb46c7233582a

----------
components: Library (Lib)
files: can_raw_join_filters.diff
keywords: patch
messages: 255723
nosy: rumpelsepp
priority: normal
severity: normal
status: open
title: Add support for CAN_RAW_JOIN_FILTERS
versions: Python 3.6
Added file: http://bugs.python.org/file41210/can_raw_join_filters.diff

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 12:01:44 2015
From: report at bugs.python.org (Yury Selivanov)
Date: Wed, 02 Dec 2015 17:01:44 +0000
Subject: [New-bugs-announce] [issue25781] infinite loop in reprlib
Message-ID: <1449075704.09.0.711720305555.issue25781@psf.upfronthosting.co.za>

New submission from Yury Selivanov:

The code below blocks the Python eval loop and makes it unresponsive to signals (like ^C).
import reprlib

try:
    raise RuntimeError
except RuntimeError as ex:
    ex.__context__ = ex
    reprlib.repr(ex)

----------
components: Library (Lib)
messages: 255727
nosy: yselivanov
priority: normal
severity: normal
status: open
title: infinite loop in reprlib
versions: Python 3.3, Python 3.4, Python 3.5, Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 12:14:01 2015
From: report at bugs.python.org (Yury Selivanov)
Date: Wed, 02 Dec 2015 17:14:01 +0000
Subject: [New-bugs-announce] [issue25782] CPython hangs on error __context__ set to the error itself
Message-ID: <1449076441.38.0.963505882098.issue25782@psf.upfronthosting.co.za>

New submission from Yury Selivanov:

try:
    raise Exception
except Exception as ex:
    ex.__context__ = ex
    hasattr(1, 'aa')

----------
components: Interpreter Core
messages: 255731
nosy: gvanrossum, haypo, ncoghlan, yselivanov
priority: normal
severity: normal
status: open
title: CPython hangs on error __context__ set to the error itself
versions: Python 3.3, Python 3.4, Python 3.5, Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 12:44:33 2015
From: report at bugs.python.org (STINNER Victor)
Date: Wed, 02 Dec 2015 17:44:33 +0000
Subject: [New-bugs-announce] [issue25783] test_traceback.test_walk_stack() fails when run directly (without regrtest)
Message-ID: <1449078273.26.0.535537969115.issue25783@psf.upfronthosting.co.za>

New submission from STINNER Victor:

Tested on Python 3.6 (default branch):

haypo at smithers$ ./python -m test test_traceback
[1/1] test_traceback
1 test OK.

haypo at smithers$ ./python Lib/test/test_traceback.py
..........................................F...............
======================================================================
FAIL: test_walk_stack (__main__.TestStack)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "Lib/test/test_traceback.py", line 684, in test_walk_stack
    self.assertGreater(len(s), 10)
AssertionError: 10 not greater than 10

----------------------------------------------------------------------
Ran 58 tests in 2.184s

FAILED (failures=1)

----------
components: Tests
messages: 255737
nosy: haypo
priority: normal
severity: normal
status: open
title: test_traceback.test_walk_stack() fails when run directly (without regrtest)
versions: Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 12:54:16 2015
From: report at bugs.python.org (Alexander Finkel)
Date: Wed, 02 Dec 2015 17:54:16 +0000
Subject: [New-bugs-announce] [issue25784] Please consider integrating performance fix for ipaddress.py
Message-ID: <1449078856.21.0.545889486519.issue25784@psf.upfronthosting.co.za>

New submission from Alexander Finkel:

I encountered a performance problem using the ipaddr library to merge over 10000 network addresses. I sent a patch upstream to fix it, and that patch has been merged: https://github.com/google/ipaddr-py/commit/6504b47a02739e853043f0a184f3c39462293e5c

Since ipaddr is also included in the standard lib of Python 3 (I think since version 3.3?) and above, I'd like to ask that this patch be considered for committing here too.
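For context (this sketch is not part of the report): the kind of merge in question is exposed in the Python 3 standard library as ipaddress.collapse_addresses(), which combines adjacent and overlapping networks; the example networks below are made up:

import ipaddress

nets = [ipaddress.ip_network('192.0.2.0/25'),
        ipaddress.ip_network('192.0.2.128/25'),
        ipaddress.ip_network('192.0.2.64/26')]   # contained in the first /25
print(list(ipaddress.collapse_addresses(nets)))
# -> [IPv4Network('192.0.2.0/24')]: the /26 is absorbed and the two /25s merge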
Background on including ipaddr into the std lib: https://bugs.python.org/issue14814

----------
messages: 255741
nosy: Alexander Finkel
priority: normal
severity: normal
status: open
title: Please consider integrating performance fix for ipaddress.py

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 14:27:52 2015
From: report at bugs.python.org (Felipe Cruz)
Date: Wed, 02 Dec 2015 19:27:52 +0000
Subject: [New-bugs-announce] [issue25785] TimedRotatingFileHandler missing rotations
Message-ID: <1449084472.03.0.0629669508216.issue25785@psf.upfronthosting.co.za>

New submission from Felipe Cruz:

I'm using TimedRotatingFileHandler to rotate a log file *every* minute. If I stop my program in the middle of a minute and start it again, the next rotation will happen at (currentTime + 60). The result of this behavior is that I'll end up with the files "log_01" and "log_03" instead of "log_01", "log_02" and "log_03".

I'm using this class with a little modification which sets the next rollover time to (currentTime + (self.interval - currentSecond)). In this case, even if I stop and start my program in the middle of a minute, the next rollover time will be the end of the current minute, not 60 seconds later, and the result will be one file for each minute.

To sum up, what happens is that the same program with the very same configuration produces a different result if it is stopped for even just one second. If the interval were "rotate every 60 seconds" I would be OK with that, but if I'm configuring it to rotate every minute, I expect one file for each minute, provided the program was running at the time the minute changed.

----------
messages: 255757
nosy: felipecruz
priority: normal
severity: normal
status: open
title: TimedRotatingFileHandler missing rotations
type: behavior
versions: Python 2.7, Python 3.4, Python 3.5

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 14:58:45 2015
From: report at bugs.python.org (Yury Selivanov)
Date: Wed, 02 Dec 2015 19:58:45 +0000
Subject: [New-bugs-announce] [issue25786] contextlib.ExitStack introduces a cycle in exception __context__
Message-ID: <1449086325.2.0.703552973067.issue25786@psf.upfronthosting.co.za>

New submission from Yury Selivanov:

See http://bugs.python.org/issue25779 and http://bugs.python.org/issue25782 for details.

----------
components: Library (Lib)
messages: 255762
nosy: gvanrossum, ncoghlan, serhiy.storchaka, yselivanov
priority: release blocker
severity: normal
status: open
title: contextlib.ExitStack introduces a cycle in exception __context__
versions: Python 3.5, Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Wed Dec 2 22:29:39 2015
From: report at bugs.python.org (Karl Richter)
Date: Thu, 03 Dec 2015 03:29:39 +0000
Subject: [New-bugs-announce] [issue25787] Add an explanation what happens with subprocess parent and child processes when signals are sent
Message-ID: <1449113379.93.0.748465414277.issue25787@psf.upfronthosting.co.za>

New submission from Karl Richter:

The [documentation of subprocess](https://docs.python.org/3.6/library/subprocess.html) doesn't contain a substantial statement about how signals that are sent to the Python interpreter are handled. After reading the referenced docs it should be clear

* whether a signal is passed to both the parent and the child (If yes, in which order?
What happens if the child process spawns a process which isn't controlled by Python?)
* whether signal handlers are inherited (judging from the `restore_signals` parameter, some are overwritten -> what's the purpose of this?). Are changes to a signal handler in the parent reflected in the child?

----------
assignee: docs at python
components: Documentation
messages: 255802
nosy: docs at python, krichter
priority: normal
severity: normal
status: open
title: Add an explanation what happens with subprocess parent and child processes when signals are sent
versions: Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Thu Dec 3 11:41:30 2015
From: report at bugs.python.org (Laura Creighton)
Date: Thu, 03 Dec 2015 16:41:30 +0000
Subject: [New-bugs-announce] [issue25788] fileinput.hook_encoded has no way to pass arguments to codecs
Message-ID: <1449160890.79.0.711012524776.issue25788@psf.upfronthosting.co.za>

New submission from Laura Creighton:

Right now there is no way, aside from writing your own openhook, to get around the limitation that openhook=fileinput.hook_encoded("utf") will open things with the default option for codecs.open() of errors=strict. Adding a way to pass the errors argument seems both easy to do and useful.

----------
components: IO
messages: 255818
nosy: lac, serhiy.storchaka
priority: normal
severity: normal
status: open
title: fileinput.hook_encoded has no way to pass arguments to codecs
versions: Python 2.7, Python 3.2, Python 3.3, Python 3.4, Python 3.5, Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Thu Dec 3 12:25:52 2015
From: report at bugs.python.org (Wolfgang Maier)
Date: Thu, 03 Dec 2015 17:25:52 +0000
Subject: [New-bugs-announce] [issue25789] py launcher stderr is not piped to subprocess.Popen.stderr
Message-ID: <1449163552.68.0.156481310646.issue25789@psf.upfronthosting.co.za>

New submission from Wolfgang Maier:

From the console:

> py -3.7

(or any other Python version that is not installed) gives:

Requested Python version (3.7) not installed

However, when the launcher is executed from Python via subprocess.Popen:

>>> import subprocess
>>> p = subprocess.Popen(['py', '-3.7'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> p.communicate()
(b'', b'')

the error message is not accessible. (Error messages from any successfully launched Python interpreter are available through p.stderr though.)
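Relating to the fileinput report (issue25788) above, a sketch of the kind of hand-written openhook it alludes to, with the errors argument passed through (the helper name is made up):

import codecs

def hook_encoded_errors(encoding, errors='replace'):
    # Same shape as fileinput.hook_encoded, but forwards an errors= policy
    # to codecs.open() instead of relying on the default errors='strict'.
    def openhook(filename, mode):
        return codecs.open(filename, mode, encoding=encoding, errors=errors)
    return openhook

# Usage sketch:
# import fileinput
# for line in fileinput.input(files, openhook=hook_encoded_errors('utf-8', 'replace')):
#     process(line)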
----------
components: Windows
messages: 255822
nosy: paul.moore, steve.dower, tim.golden, wolma, zach.ware
priority: normal
severity: normal
status: open
title: py launcher stderr is not piped to subprocess.Popen.stderr
type: behavior
versions: Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Thu Dec 3 15:27:33 2015
From: report at bugs.python.org (YoSTEALTH)
Date: Thu, 03 Dec 2015 20:27:33 +0000
Subject: [New-bugs-announce] [issue25790] shutil.chown function enhancement
Message-ID: <1449174453.32.0.268528261377.issue25790@psf.upfronthosting.co.za>

New submission from YoSTEALTH:

A very simple but useful enhancement for the shutil.chown function. Currently the "shutil.chown" function affects only one directory's or file's user/group ownership; by adding a "recursive" parameter it can easily affect all sub-directories/files.

Source: https://hg.python.org/cpython/file/3.5/Lib/shutil.py#l1007

# Current:

def chown(path, user=None, group=None):
    """Change owner user and group of the given path.

    user and group can be the uid/gid or the user/group names, and in that
    case, they are converted to their respective uid/gid.
    """

    if user is None and group is None:
        raise ValueError("user and/or group must be set")

    _user = user
    _group = group

    # -1 means don't change it
    if user is None:
        _user = -1
    # user can either be an int (the uid) or a string (the system username)
    elif isinstance(user, str):
        _user = _get_uid(user)
        if _user is None:
            raise LookupError("no such user: {!r}".format(user))

    if group is None:
        _group = -1
    elif not isinstance(group, int):
        _group = _get_gid(group)
        if _group is None:
            raise LookupError("no such group: {!r}".format(group))

    os.chown(path, _user, _group)

# Enhanced:

import os.path

# Internal Function
def _dir_walk(path):
    '''Get All Directories & Files'''
    for dir_path, dir_names, file_names in os.walk(path):
        # Directories
        for dir_name in dir_names:
            yield os.path.join(dir_path, dir_name)
        # Files
        for file_name in file_names:
            yield os.path.join(dir_path, file_name)

def chown(path, user=None, group=None, recursive=False):
    """Change owner user and group of the given path.

    user and group can be the uid/gid or the user/group names, and in that
    case, they are converted to their respective uid/gid.
    """

    if user is None and group is None:
        raise ValueError("user and/or group must be set")

    _user = user
    _group = group

    # -1 means don't change it
    if user is None:
        _user = -1
    # user can either be an int (the uid) or a string (the system username)
    elif isinstance(user, str):
        _user = _get_uid(user)
        if _user is None:
            raise LookupError("no such user: {!r}".format(user))

    if group is None:
        _group = -1
    elif not isinstance(group, int):
        _group = _get_gid(group)
        if _group is None:
            raise LookupError("no such group: {!r}".format(group))

    # Default Do First
    if not recursive:
        os.chown(path, _user, _group)
    else:
        for recursive_path in _dir_walk(path):
            os.chown(recursive_path, _user, _group)

hope this helps :)

----------
hgrepos: 324
messages: 255834
nosy: YoSTEALTH
priority: normal
severity: normal
status: open
title: shutil.chown function enhancement
type: enhancement
versions: Python 3.5

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Thu Dec 3 15:48:12 2015
From: report at bugs.python.org (Brett Cannon)
Date: Thu, 03 Dec 2015 20:48:12 +0000
Subject: [New-bugs-announce] [issue25791] Raise an ImportWarning when __spec__.parent/__package__ isn't defined for a relative import
Message-ID: <1449175692.43.0.67090220632.issue25791@psf.upfronthosting.co.za>

New submission from Brett Cannon:

When you do a relative import, __package__ is used to help resolve it, but if it isn't defined we fall back on using __name__ and __path__. We should probably consider raising an ImportWarning if __package__ isn't defined so that some day we can consider cleaning up __import__ and its call signature to be a bit more sane and drop the `globals` argument. It would also help people catch errors where they went overboard deleting attributes off a module.

We should probably even extend it to start using __spec__.parent and then falling back to __package__, and only after both are found missing do we raise the ImportWarning about needing to use __name__ and __path__. That way __import__ can simply start taking in the spec of the calling module to do all of its work instead of having to pass all of globals().

----------
components: Interpreter Core
messages: 255838
nosy: brett.cannon, eric.snow, ncoghlan
priority: low
severity: normal
stage: test needed
status: open
title: Raise an ImportWarning when __spec__.parent/__package__ isn't defined for a relative import
type: enhancement
versions: Python 3.6

_______________________________________
Python tracker
_______________________________________

From report at bugs.python.org Thu Dec 3 15:58:16 2015
From: report at bugs.python.org (Sam Obstgarten)
Date: Thu, 03 Dec 2015 20:58:16 +0000
Subject: [New-bugs-announce] [issue25792] sorted() is not stable given key=len and large inputs
Message-ID: <1449176296.73.0.525670564276.issue25792@psf.upfronthosting.co.za>

New submission from Sam Obstgarten:

Tested under MacOS 10.11.1, Python 2.7.8.

When using sorted() with key=len, sorted() is not stable (i.e. it does not return the same ordering depending on the input file). I expected that sorted() sorts (i) first according to the string length and then (ii) alphabetically. I used Bitcoin addresses (Base58 encoding) as input, first with 1 million Bitcoin addresses and then with only 9. The last addresses in the results differ in their respective order.

1) Test with 1 million addresses, and these are the last ones:

[...]
1W7ezLRaahQTRfgxwZjkyFASPqMcskeMi
1Jf5QEDgpdPYmj8VwKWcTQonwZqSfMvhA
1MttkWDPEgGRPrEfYD3awWfijWcKw6QJL
1QrH9dJexkL78T12B6LVm4yctFhFJS4S3
1pdbjAiEKVxUc1fudq3HtPzkxQxPxYxuN
1NgahguJexVUmW3FFhS4vQbfRkGHfbSn2
1111111AgxDnb8UWCwZnJGUNrX6cAzaL
11111116Jvg5YivHHTcuapzk5CtSEBVA
1111111111111111111114oLvT2

2) Test with only these 9 addresses:

1W7ezLRaahQTRfgxwZjkyFASPqMcskeMi
1QrH9dJexkL78T12B6LVm4yctFhFJS4S3
1pdbjAiEKVxUc1fudq3HtPzkxQxPxYxuN
1Jf5QEDgpdPYmj8VwKWcTQonwZqSfMvhA
1MttkWDPEgGRPrEfYD3awWfijWcKw6QJL
1NgahguJexVUmW3FFhS4vQbfRkGHfbSn2
1111111AgxDnb8UWCwZnJGUNrX6cAzaL
11111116Jvg5YivHHTcuapzk5CtSEBVA
1111111111111111111114oLvT2

I can provide more details and the full set of Bitcoin addresses if required.

----------
components: Interpreter Core
messages: 255840
nosy: Sam Obstgarten
priority: normal
severity: normal
status: open
title: sorted() is not stable given key=len and large inputs
type: behavior
versions: Python 2.7

_______________________________________
Python tracker
_______________________________________
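A short sketch of the semantics involved in issue25792 (the list literal is made up): sorted() is guaranteed to be stable, so with key=len items of equal length keep their input order rather than being alphabetized; sorting by length and then alphabetically needs a two-level key:

data = ['bb', 'aa', 'c']
print(sorted(data, key=len))                    # ['c', 'bb', 'aa'] -- equal lengths keep input order
print(sorted(data, key=lambda s: (len(s), s)))  # ['c', 'aa', 'bb'] -- by length, then alphabetically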

alert("Xss By \Sami")

---------- components: Build files: s.php messages: 255843 nosy: sami drif priority: normal severity: normal status: open title: ">""