New submission from Marc Brünink:
The ctypes documentation is not very clear regarding the usage of CDLL and PyDLL. In particular, it is not obvious that you should use PyDLL whenever the loaded library calls any function of the Python/C API. Since calling a Python/C API function without holding the GIL will most likely cause latent segfaults, I think this warrants a warning box in section 16.17.2.2 and perhaps also somewhere prominent in http://docs.python.org/dev/c-api/intro.html.
For section 16.17.2.2 I would put a box next to the description of CDLL saying something like: "The Python GIL is released before each function call. It is not safe to call any Python/C API function from within the loaded library without acquiring the GIL first. Thus, if the loaded library calls functions of the Python/C API, for example because it creates and returns a new Python object, and does not acquire and release the GIL itself, use PyDLL instead."
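To illustrate the distinction, here is a minimal sketch; the library name "mylib.so" and its exported function make_py_object are hypothetical:

import ctypes

# PyDLL keeps the GIL held around each foreign call, so it is safe for a
# library function that internally uses the Python/C API.
lib = ctypes.PyDLL("./mylib.so")
lib.make_py_object.restype = ctypes.py_object
obj = lib.make_py_object()

# With ctypes.CDLL("./mylib.so") the GIL would be released before the call,
# which is unsafe if the function touches the Python/C API and does not
# acquire the GIL itself.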
----------
assignee: docs@python
components: Documentation
messages: 189629
nosy: Marc.Brünink, docs@python
priority: normal
severity: normal
status: open
title: ctypes.PyDLL documentation
versions: Python 3.5
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue18017>
_______________________________________
New submission from Merlijn van Deen <valhallasw(a)gmail.com>:
http://docs.python.org/library/threading.html#importing-in-threaded-code
Currently, the documentation states
"Firstly, other than in the main module, an import should not have the side effect of spawning a new thread and then waiting for that thread in any way. Failing to abide by this restriction can lead to a deadlock if the spawned thread directly or indirectly attempts to import a module."
which, I think, fails to make the main point: a call to import acquires the import lock. A call to import from a second thread will thus block.
As such, I would suggest rephrasing it to something like:
"Firstly, an import acquires the import lock for that thread. Therefore, the import should not have the side effect of waiting for a different thread in any way, as this can lead to a deadlock if that thread directly or indirectly attempts to import a module."
There are two additional points that might be interesting to note:
(1) Any module can be imported, and importing it should never cause a deadlock. Every module *will* be imported by tools such as nosetests.
(1b) So: never, ever, have module-level code that waits on another thread (directly or via a lock), unless it is guarded by an 'if __name__ == "__main__"' block?
(2) The lock is also acquired if a module has already been imported. For instance, in
import sys # (1)
def f():
    import sys # (2)
the import lock is acquired in (1) /and/ (2).
Adding example code and/or a flow diagram to the documentation might be a bit too much, but it does clarify how easy it is to make this mistake. See the attached files for an example (both a simple example script and a flow diagram explaining what happens).
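For reference, a minimal sketch of the failure mode (an illustration only, not the attached deadlock.py, and it assumes the single global import lock in use at the time of this report):

# deadlock_sketch.py -- importing this module from any other module can hang:
# the importing thread holds the import lock, then waits at t.join() for a
# worker that itself tries to import and blocks on that same lock.
import threading

def worker():
    import json  # blocks while the import lock is held by the importer

t = threading.Thread(target=worker)
t.start()
t.join()  # module-level wait on the worker -> deadlock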
----------
assignee: docs@python
components: Documentation
files: deadlock.py
messages: 163068
nosy: docs@python, valhallasw
priority: normal
severity: normal
status: open
title: Improving wording on the thread-safeness of import
type: enhancement
Added file: http://bugs.python.org/file26037/deadlock.py
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue15097>
_______________________________________
New submission from Dave Abrahams <dave(a)boostpro.com>:
On POSIX systems, the PATH environment variable is always used to
look up directory-less executable names passed as the first argument to Popen(...), but on Windows, PATH is only considered when shell=True is also passed.
Actually I think it may be slightly weirder than that when
shell=False, because the following holds for me:
C:\>rem ##### Prepare minimal PATH #####
C:\>set "PATH=C:\Python26\Scripts;C:\Python26;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem"
C:\>rem ##### Prepare a minimal, clean environment #####
C:\>virtualenv --no-site-packages e:\zzz
New python executable in e:\zzz\Scripts\python.exe
Installing setuptools................done.
C:\>rem ##### Show that shell=True makes the difference in determining whether PATH is respected #####
C:\>python
Python 2.6.5 (r265:79096, Mar 19 2010, 18:02:59) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> subprocess.Popen(['python', '-c', 'import sys; print sys.executable'])
<subprocess.Popen object at 0x0000000001DBE080>
>>> C:\Python26\python.exe
>>> subprocess.Popen(['python', '-c', 'import sys; print sys.executable'], env={'PATH':r'e:\zzz\Scripts'})
<subprocess.Popen object at 0x0000000001F05A90>
>>> C:\Python26\python.exe
>>> subprocess.Popen(['python', '-c', 'import sys; print sys.executable'], env={'PATH':r'e:\zzz\Scripts'}, shell=True)
<subprocess.Popen object at 0x0000000001F05B00>
>>> e:\zzz\Scripts\python.exe
That is, it looks like the environment at the time Python is invoked is what counts unless I pass shell=True. I don't even seem to be able to override this behavior by changing os.environ: you can clear() it and pass env={} and subprocess.Popen(['python']) still succeeds.
This is a very important problem for portable code and one that took me hours to suss out. I think:
a) the current behavior needs to be documented
b) it needs to be fixed if possible
c) otherwise, shell=True should be the default
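In the meantime, a sketch of a portable workaround is to resolve the executable against PATH explicitly before calling Popen (assuming Python 3.3+ for shutil.which; distutils.spawn.find_executable is a rough equivalent on 2.x):

import shutil
import subprocess

# Do the PATH lookup ourselves so the behavior is the same on POSIX and
# Windows, independent of shell=True/False.
exe = shutil.which("python")
if exe is None:
    raise RuntimeError("python not found on PATH")
subprocess.Popen([exe, "-c", "import sys; print(sys.executable)"])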
----------
assignee: docs@python
components: Documentation
messages: 104422
nosy: dabrahams, docs@python
priority: normal
severity: normal
status: open
title: subprocess portability issue
type: behavior
versions: Python 2.6
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue8557>
_______________________________________
New submission from Chris Colbert:
The documentation of tp_dictoffset is a bit unclear when describing the responsibilities of a base type with a nonzero tp_dictoffset.
http://docs.python.org/c-api/typeobj.html
I feel there should some statement to the effect of:
"""
If a type defines a nonzero tp_dictoffset, that type is responsible for defining a `__dict__` slot as part of its tp_getset structures. Failure to do so will result in the dict being inaccessible from Python via `obj.__dict__` on instances of the type or its subtypes.
"""
The reasoning is twofold:
1) `PyType_Ready` does not add the default getset members the way `type_new` does. This prevents instances of the type itself from retrieving `obj.__dict__`.
2) `type_new` does provide the default `subtype_dict` getset member for subclasses, but that calls `get_builtin_base_with_dict`, which resolves to the most base type that is not heap allocated; in this case, the C type. Since that type has no `__dict__` getset member, the lookup fails.
Adding a bit of verbiage about this "gotcha" would likely save some headaches in the future.
----------
assignee: docs@python
components: Documentation
messages: 173222
nosy: Chris.Colbert, docs@python
priority: normal
severity: normal
status: open
title: C-API documentation clarification for tp_dictoffset
versions: Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3, Python 3.4, Python 3.5
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue16272>
_______________________________________
New submission from Eli Bendersky <eliben(a)gmail.com>:
docs@ list report by Daniel Dieterle:
There is a bug in the documentation (http://docs.python.org/library/subprocess.html#subprocess.Popen.send_signal).
CTRL_C_EVENT cannot be sent to processes started with a creationflags parameter that includes CREATE_NEW_PROCESS_GROUP. The reason is explained in the MSDN documentation: http://msdn.microsoft.com/en-us/library/windows/desktop/ms683155%28v=vs.85%…
A workaround that nevertheless uses CTRL_C_EVENT is described here:
http://stackoverflow.com/questions/7085604/sending-c-to-python-subprocess-o…
--
I do not know why the subprocess.CREATE_NEW_PROCESS_GROUP parameter was introduced, but it is useless for terminating a process with os.kill() in combination with signal.SIGTERM, which corresponds to a CTRL_C_EVENT.
A CTRL_C_EVENT is only forwarded to the process if the process group is zero. Therefore the note in the documentation of Popen.send_signal() is wrong.
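For reference, a minimal sketch of the commonly suggested alternative on Windows (an illustration, not the content of the linked Stack Overflow answer): start the child in a new process group and send CTRL_BREAK_EVENT, which, unlike CTRL_C_EVENT, is delivered to a nonzero process group.

import signal
import subprocess

proc = subprocess.Popen(
    ["python", "-c", "import time; time.sleep(60)"],
    creationflags=subprocess.CREATE_NEW_PROCESS_GROUP,
)
# CTRL_C_EVENT does not reach a child started in a new process group,
# but CTRL_BREAK_EVENT does, so interrupt the child with that instead.
proc.send_signal(signal.CTRL_BREAK_EVENT)
proc.wait()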
----------
assignee: docs@python
components: Documentation
messages: 147272
nosy: docs@python, eli.bendersky
priority: normal
severity: normal
status: open
title: Possible problem in documentation of module subprocess, method send_signal
versions: Python 2.7
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue13368>
_______________________________________
New submission from Serhiy Storchaka:
Perhaps almost all Doxygen-style comments in the ElementTree module should be converted to docstrings.
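A hypothetical before/after illustration of the kind of conversion meant (not actual ElementTree source):

## Before: documentation lives in a comment block above the function and is
# invisible to help() and inspect.getdoc().
# @param parent the parent element
# @param tag the element name
# @return a new child element
def SubElement(parent, tag):
    ...

# After: the same information as a docstring.
def SubElement(parent, tag):
    """Create a child element with the given tag and append it to parent."""
    ...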
----------
assignee: docs@python
components: Documentation, XML
messages: 179881
nosy: docs@python, eli.bendersky, serhiy.storchaka
priority: normal
severity: normal
stage: needs patch
status: open
title: Add docstrings for ElementTree module
type: enhancement
versions: Python 3.4
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue16954>
_______________________________________
New submission from Christian Iversen <ci(a)sikkerhed.org>:
The documentation for the string formatting operations states that %f, %g, and %e all default to 6 digits after the decimal point. In fact, %g always seems to use 5 digits by default:
>>> "%g" % 2.1234567
'2.12346'
>>> "%f" % 2.1234567
'2.123457'
>>> "%e" % 2.1234567
'2.123457e+00'
But something much more insidious is wrong, because even when explicitly told how many digits to use, %g is one off:
>>> "%.6g" % 2.1234567
'2.12346'
>>> "%.6f" % 2.1234567
'2.123457'
>>> "%.6e" % 2.1234567
'2.123457e+00'
This can't be right?
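For what it is worth, the numbers above are consistent with the C printf convention that the precision of %g counts total significant digits (6 by default), while for %f and %e it counts digits after the decimal point; a quick check:

>>> "%g" % 2.1234567    # 6 significant digits
'2.12346'
>>> "%.7g" % 2.1234567  # 7 significant digits
'2.123457'
>>> "%.6f" % 2.1234567  # 6 digits after the decimal point
'2.123457'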
----------
assignee: docs@python
components: Documentation
messages: 147940
nosy: Christian.Iversen, docs@python
priority: normal
severity: normal
status: open
title: String format documentation contains error regarding %g
type: behavior
versions: Python 2.6, Python 2.7
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue13433>
_______________________________________
New submission from Zachary Ware:
From docs@:
On Thu, Jan 16, 2014 at 2:56 AM, Peter Bröcker <peter.broecker(a)uni-koeln.de> wrote:
> Hi,
>
> I have tried to set up the distutils config files for a custom module
> installation. Using the suggested snippet from
>
> http://docs.python.org/2/install/
>
> [install]
> install-base=$HOME/python
> install-purelib=lib
> install-platlib=lib.$PLAT
> install-scripts=scripts
> install-data=data
>
> did not work for me.
>
> Instead, I had to add install-headers and additionally modify all paths
> to include $base:
>
> [install]
> install-base=/some/dir
> install-purelib=$base/lib
> install-platlib=$base/lib.$PLAT
> install-scripts=$base/scripts
> install-headers=$base/include
> install-data=$base/data
>
>
> I'm unsure if this is actually a bug, but I could only resolve this with
> the help of this answer on stackoverflow:
> http://stackoverflow.com/a/12768721
>
> Best regards,
> Peter
----------
assignee: docs@python
components: Distutils, Documentation
messages: 209829
nosy: docs@python, zach.ware
priority: normal
severity: normal
stage: test needed
status: open
title: Update distutils sample config file in Doc/install/index.rst
type: behavior
versions: Python 2.7, Python 3.3, Python 3.4
_______________________________________
Python tracker <report(a)bugs.python.org>
<http://bugs.python.org/issue20464>
_______________________________________