(reposted from c.l.py)
the following FAQ item talks about using sleep to make sure that threads
run.
I suspect it was originally written for the "thread" module, since as
far as I know, the "threading" module takes care of the issues described
here all by itself.
so, should this item be removed?
or can anyone suggest a rewrite that's more relevant for "threading"?
Patch #1067760 deals with passing of float values to file.seek;
the original version tries to fix the current implementation
by converting floats to long long, rather than plain C long
(thus supporting files larger than 2GiB).
I propose a different approach: passing floats to seek should
be an error. My version of the patch uses the index API, which
will automatically raise an error.
a) should floats be supported as parameters to file.seek?
b) if not, should Python 2.6 just deprecate such usage,
or outright reject it?
so, if you remember, i suggested adding __dir__ to objects, so as to make
dir() customizable, remove the deprecated __methods__ and __members__,
and make it symmetrical to other built-in functions.
you can see the original post here:
which was generally accepted by the forum:
so i went on, now that i have some spare time, to research the issue.
the current dir() works as follows:
(*) builtin_dir calls PyObject_Dir to do the trick
(*) if the object is NULL (dir with no argument), return the frame's locals
(*) if the object is a *module*, we're just using its __dict__
(*) if the object is a *type*, we're using its __dict__ and __bases__,
but not __class__ (so as not to show the metaclass)
(*) otherwise, it's a "normal object", so we take its __dict__, along with
__methods__, __members__, and dir(__class__)
(*) create a list of keys from the dict, sort, return
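as a rough python model of the "normal object" branch above (a sketch, not
the actual C code; current_dir is an illustrative name):

```python
def current_dir(obj):
    # merge the instance __dict__ with the deprecated __methods__ /
    # __members__ lists and dir() of the class, then sort -- this is
    # roughly what PyObject_Dir does today for a plain object
    names = set(getattr(obj, '__dict__', {}))
    names.update(getattr(obj, '__methods__', []))
    names.update(getattr(obj, '__members__', []))
    names.update(dir(obj.__class__))
    return sorted(names)
```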
we'll have to change that if we introduce __dir__. my design is:
(*) builtin_dir, if called without an argument, returns the frame's locals
(*) otherwise, it calls PyObject_Dir(self), which would dispatch self.__dir__()
(*) if `self` doesn't have __dir__, default to object.__dir__(self)
(*) the default object.__dir__ implementation would do the same as
today: collect __dict__, __members__, __methods__, and dir(__class__).
by py3k, we'll remove looking into __methods__ and __members__.
(*) type objects and module objects would implement __dir__ to their
liking (as PyObject_Dir does today)
(*) builtin_dir would take care of sorting the list returned by PyObject_Dir
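a python sketch of that dispatch (function names here are hypothetical;
the default mirrors today's behaviour for plain objects):

```python
def py_object_dir(obj):
    # sketch of the proposed PyObject_Dir: dispatch to a type-level
    # __dir__ if one exists, otherwise fall back to the default
    dir_method = getattr(type(obj), '__dir__', None)
    if dir_method is not None:
        return dir_method(obj)
    return object_default_dir(obj)

def object_default_dir(obj):
    # hypothetical object.__dir__ default: __dict__ plus dir(__class__)
    names = set(getattr(obj, '__dict__', {}))
    names.update(dir(obj.__class__))
    return list(names)

def builtin_dir(obj):
    # sorting stays in builtin_dir, so __dir__ implementations
    # don't have to bother with it
    return sorted(py_object_dir(obj))
```

note that looking up __dir__ on the type (not the instance) matches how
other special methods are dispatched.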
so first i'd like you people to react to my design; maybe you'll find
flaws or whatever. also, should this become a PEP?
and last, how do i add a new method slot? does it mean i need to
change all type-object definitions throughout the codebase?
do i add it to some protocol? or directly to the "object protocol"?
On Thu, Nov 09, 2006 at 02:51:15PM +0100, andrew.kuchling wrote:
> Author: andrew.kuchling
> Date: Thu Nov 9 14:51:14 2006
> New Revision: 52692
> [Patch #1514544 by David Watson] use fsync() to ensure data is really on disk
Should I backport this change to 2.5.1? Con: The patch adds two new
internal functions, _sync_flush() and _sync_close(), so it's an
internal API change. Pro: it's a patch that should reduce chances of
data loss, which is important to people processing mailboxes.
Because it fixes a small chance of potential data loss and the new
functions are prefixed with _, my personal inclination would be to
backport this change.
Comments? Anthony, do you want to pronounce on this issue?
Patch #841454 takes a stab at cross-compilation
(for MingW32 on a Linux system, in this case),
and proposes to use SCons instead of setup.py
to compile extension modules. Usage of SCons
would be restricted to cross-compilation (for
the moment).
What do you think?
On Thu Nov 9 07:45:30 CET 2006, Anthony Baxter wrote:
> On Thursday 09 November 2006 16:30, Martin v. Löwis wrote:
> > Patch #841454 takes a stab at cross-compilation
> > (for MingW32 on a Linux system, in this case),
> > and proposes to use SCons instead of setup.py
> > to compile extension modules. Usage of SCons
> > would be restricted to cross-compilation (for
> > the moment).
> > What do you think?
> So we'd now have 3 places to update when things change (setup.py, PCbuild
> area, SCons)? How does this deal with the problems that autoconf has with
> cross-compilation? It would seem to me that just fixing the extension module
> building is a tiny part of the problem... or am I missing something?
I've been working on adding cross-compiling support to Python's build system,
too, though I've had the luxury of building on Linux for a target platform
that also runs Linux. Since the build system originally came from the GCC
project, it shouldn't surprise anyone that there's already a certain level
of support for cross-compilation built in. Simply setting the --build and
--host options is a good start, for example.
It seems that Martin's patch solves some problems I encountered more cleanly
(in certain respects) than the solutions I came up with. Here are some
issues I encountered (from memory):
* The current system assumes that Parser/pgen will be built using the
compiler being used for the rest of the build, and then run during the
build. This obviously isn't going to work when that compiler produces
executables for the target platform, since pgen has to run on the build
machine. At the same time, the object files that pgen shares with the
interpreter need to be compiled for the target platform.
* The newly-compiled interpreter is used to compile the standard library,
run tests and execute the setup.py file. Some of these things should
be done by that interpreter, but it won't run on the host platform.
On the other hand, the setup.py script should be run by the host's
Python interpreter, but using information about the target interpreter's
configuration.
* There are various extensions defined in the setup.py file that are
found and erroneously included if you execute it using the host's
interpreter. Ideally, it would be possible to use the target's
configuration to disable extensions, but a more configurable build
process would also be welcome.
I'll try to look at Martin's patch at some point. I hope these observations
and suggestions help explain the current issues with the build system when
cross-compiling.
Patch #1346572 proposes to also search for .pyc when OptimizeFlag
is set, and for .pyo when it is not set. The author argues this is
for consistency, as the zipimporter already does that.
This reasoning is somewhat flawed, of course: to achieve consistency,
one could also change the zipimporter instead.
However, I find the proposed behaviour reasonable: Python already
automatically imports the .pyc file if the .py file is not present, and
vice versa. So why not look for .pyo if the .pyc file is not present?
What do you think?