addressing distutils' inability to track file dependencies
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently. Skip
"SM" == Skip Montanaro <skip@pobox.com> writes:
SM> I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently.

That sounds really complicated. Jeremy
[Skip]
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently.
But that doesn't work for platforms that don't have a Make. And while Windows has one, its file format is completely different, so you'd have to teach distutils how to write each platform's Makefile format. -1 --Guido van Rossum (home page: http://www.python.org/~guido/)
>> I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead...

Guido> But that doesn't work for platforms that don't have a Make. And while Windows has one, its file format is completely different, so you'd have to teach distutils how to write each platform's Makefile format.

I don't see that writing different makefile formats is any harder than writing different shell commands. On those systems where you don't have a make-like tool, either distutils already writes compile and link commands or it doesn't work at all. On those systems where you do have a make-like facility, I see no reason to not use it. You will get more reliable dependency checking for one thing. Skip
Skip Montanaro <skip@pobox.com> writes:
You will get more reliable dependency checking for one thing.
I doubt that. To get that checking, you need to tell make what the dependencies are - it won't automatically assume anything except that object files depend on their input sources. In particular, you will *not* get dependencies to header files - in my experience, those are the biggest source of build problems. If you add a scanning procedure for finding header files, you can integrate this into distutils' dependency mechanisms just as well as you can generate five different makefile formats from those data. Regards, Martin
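For illustration, a naive version of the first approach (scanning the sources for include statements) is only a few lines of Python. This is a sketch, not distutils code; it deliberately skips #include <...> and knows nothing about conditional compilation, which is exactly the imprecision being described:

    import os
    import re

    # Matches '#include "foo.h"'; angle-bracket includes are deliberately ignored.
    INCLUDE_RE = re.compile(r'^\s*#\s*include\s+"([^"]+)"', re.MULTILINE)

    def scan_includes(source, include_dirs=(".",)):
        """Return the header files that source includes, as found on disk."""
        text = open(source).read()
        found = []
        for name in INCLUDE_RE.findall(text):
            for d in include_dirs:
                path = os.path.join(d, name)
                if os.path.exists(path):
                    found.append(path)
                    break
        return found

    # e.g. scan_includes("Modules/socketmodule.c", ["Include", "Modules"])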
"SM" == Skip Montanaro <skip@pobox.com> writes:
SM> I don't see that writing different makefile formats is any harder than writing different shell commands. On those systems where you don't have a make-like tool, either distutils already writes compile and link commands or it doesn't work at all. On those systems where you do have a make-like facility, I see no reason to not use it. You will get more reliable dependency checking for one thing.

Only if distutils grows a way to specify all those dependencies. Once you've specified them, I'm not sure why it is difficult to check them in Python code instead of relying on make. i'm-probably-naive-ly y'rs, Jeremy
Jeremy Hylton <jeremy@zope.com> writes:
Only if distutils grows a way to specify all those dependencies. Once you've specified them, I'm not sure why it is difficult to check them in Python code instead of relying on make.
I believe people normally want their build process to know dependencies without any specification of dependencies. Instead, the build process should know what the dependencies are by looking at the source files. For C, there are two ways to do that: you can either scan the sources yourself for include statements, or you can let the compiler dump dependency lists into files. The latter is only supported for some compilers, but it would help enormously: when compiling the first time, you know for sure that you will need to compile. When compiling the second time, you read the dependency information generated the first time, to determine whether any of the included headers has changed. If that is not the case, you can skip rebuilding. If you do rebuild, the dependency information will be updated automatically (since the change might have been to add an include). Regards, Martin
Skip Montanaro wrote:
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently.
That's what Perl does ("MakeMaker") but I think it just adds another level of complexity, especially with different "makes" out there doing different things. Paul Prescod
Paul> That's what Perl does ("MakeMaker") but I think it just adds another level of complexity, especially with different "makes" out there doing different things.

If the extra complexity came with no added benefits I'd agree with you. However, most makes actually do support a fairly basic common syntax. Who cares about %-rules and suffix rules? Those are only there to make it easier for humans to maintain Makefiles. Just generate a brute-force low-level makefile. Distutils will then do the right thing in the face of file edits. Skip
On Thu, 13 Jun 2002, Skip Montanaro wrote:
If the extra complexity came with no added benefits I'd agree with you. However, most makes actually do support a fairly basic common syntax. Who cares about %-rules and suffix rules? Those are only there to make it easier for humans to maintain Makefiles. Just generate a brute-force low-level makefile. Distutils will then do the right thing in the face of file edits.
If you're not using sophisticated rules, the job make does is probably no more complicated than the job of generating a makefile. You just construct a dependency graph, then walk over it stat-ing the files, and running rules where needed. It's a SMOP ;-) -- Greg Ball
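The stat-and-walk loop Greg describes really is small. A toy version, with a hypothetical 'dependencies' mapping and 'rebuild' callback (nothing distutils-specific):

    import os

    def out_of_date(target, sources):
        """True if target is missing or older than any of its sources."""
        if not os.path.exists(target):
            return True
        target_mtime = os.path.getmtime(target)
        return any(os.path.getmtime(s) > target_mtime for s in sources)

    def walk(dependencies, rebuild):
        """dependencies maps each target to its list of sources; rebuild(target,
        sources) is called for every target that needs regenerating."""
        for target, sources in dependencies.items():
            if out_of_date(target, sources):
                rebuild(target, sources)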
Skip Montanaro wrote:
...
If the extra complexity came with no added benefits I'd agree with you.
I guess most of us don't understand the benefits because we don't see dependency tracking as necessarily that difficult. It's no harder than the new method resolution order. ;) Jeremy says he has already started implementing dependency tracking. Would switching strategies to using make actually get us anywhere faster or easier?
However, most makes actually do support a fairly basic common syntax. Who cares about %-rules and suffix rules? Those are only there to make it easier for humans to maintain Makefiles. Just generate a brute-force low-level makefile. Distutils will then do the right thing in the face of file edits.
Okay, so let's say that we want distutils to handle ".i" files for SWIG (it does) and .pyx files for Pyrex (it should), then we have to generate rules for those too. Paul Prescod
Paul> I guess most of us don't understand the benefits because we don't see dependency tracking as necessarily that difficult. It's no harder than the new method resolution order. ;)

If it's not that difficult why isn't it being done? <no wink> Skip
If it's not that difficult why isn't it being done? <no wink>
It's done. Jeremy added it today. --Guido van Rossum (home page: http://www.python.org/~guido/)
Skip Montanaro wrote:
If it's not that difficult why isn't it being done? <no wink>
I think the hard part is getting the dependency information. Using it is trivial. 'make' does not help solve the former problem. Speaking as someone who spent time hacking on the Python Makefile, avoid 'make'. The portable subset is limited and sucky. Neil
Skip Montanaro <skip@pobox.com> writes:
Paul> I guess most of us don't understand the benefits because we don't see dependency tracking as necessarily that difficult. It's no harder than the new method resolution order. ;)
If it's not that difficult why isn't it being done? <no wink>
You are wrong in assuming it is not done. distutils has done dependency analysis since day 1. Regards, Martin
On 14 June 2002, Martin v. Loewis said:
Skip Montanaro <skip@pobox.com> writes:
Paul> I guess most of us don't understand the benefits because we don't see dependency tracking as necessarily that difficult. It's no harder than the new method resolution order. ;)
If it's not that difficult why isn't it being done? <no wink>
You are wrong in assuming it is not done. distutils has done dependency analysis since day 1.
Only insofar as foo.o depends on foo.c. The header file stuff Jeremy has been adding sounds like a very useful addition (haven't actually inspected his patches yet). Greg -- Greg Ward - Unix bigot gward@python.net http://starship.python.net/~gward/ Monday is an awful way to spend one seventh of your life.
Greg Ward <gward@python.net> writes:
You are wrong in assuming it is not done. distutils has done dependency analysis since day 1.
Only insofar as foo.o depends on foo.c. The header file stuff Jeremy has been adding sounds like a very useful addition (haven't actually inspected his patches yet).
Certainly true. However, the makefiles that Skip wanted to generate would not have offered anything beyond "foo.o depends on foo.c". He then recognized that dependencies are essential, here, and suggested makedepend... Regards, Martin
Martin> Certainly true. However, the makefiles that Skip wanted to generate would not have offered anything beyond "foo.o depends on foo.c". He then recognized that dependencies are essential, here, and suggested makedepend...

Please don't put words into my mouth (or thoughts into my brain)? I have used make+makedepend for a long time and tend to think of them as inseparable. I was certainly thinking in terms of .o:.h dependencies. That is, after all, what my example demonstrated was missing. Skip
Skip Montanaro <skip@pobox.com> writes:
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently.
It was one of the design principles of distutils to not rely on any other tools but Python and the C compiler. Regards, Martin
>> I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead...

Martin> It was one of the design principles of distutils to not rely on any other tools but Python and the C compiler.

Perhaps it's a design principle that needs to be rethought. If you can assume the presence of a C compiler I think you can generally assume the presence of a make tool of some sort. Skip
On Thu, 13 Jun 2002, Skip Montanaro wrote:
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead...
Martin> It was one of the design principles of distutils to not rely on any other tools but Python and the C compiler.
Perhaps it's a design principle that needs to be rethought. If you can assume the presence of a C compiler I think you can generally assume the presence of a make tool of some sort. ^^^^^^^^^^^^
That's the rub. The MAKE.EXE mostly found on WinDOS boxen doesn't have much more than the name in common with the well known Unix tool. /Paul
Skip Montanaro wrote:
>> I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead...
Martin> It was one of the design principles of distutils to not rely on any other tools but Python and the C compiler.
Perhaps it's a design principle that needs to be rethought. If you can assume the presence of a C compiler I think you can generally assume the presence of a make tool of some sort.
isn't it funny that 'scons' as a *build system* doesn't rely on anything but python? I've heard rumors they even check dependencies<wink>... holger
holger> isn't it funny that 'scons' as a *build system* doesn't rely on anything but python? I've heard rumors they even check dependencies<wink>...

Scons would be fine by me. It doesn't rely on a C compiler, but if you want to build something that needs to be compiled I suspect you'd need one. Skip
Skip Montanaro wrote:
holger> isn't it funny that 'scons' as a *build system* doesn't rely on anything but python? I've heard rumors they even check dependencies<wink>...
Scons would be fine by me. It doesn't rely on a C compiler, but if you want to build something that needs to be compiled I suspect you'd need one.
i didn't mean to include or require scons for distutils. I was just making the obvious point that for the dependency task at hand python should be powerful enough. Reusing some code from scons might be worthwhile, though. why-use-a-car-when-you-can-beam-ly y'rs, holger
Skip Montanaro <skip@pobox.com> writes:
Perhaps it's a design principle that needs to be rethought. If you can assume the presence of a C compiler I think you can generally assume the presence of a make tool of some sort.
Maybe - although it removes reliability from the build process if you need to rely on locating another tool. For example, on Solaris, you could run into either the vendor's make, or GNU make. Also, it appears that nothing is gained by using make. Regards, Martin
Martin> Also, it appears that nothing is gained by using make.

That's incorrect. Distutils is not a make substitute and I doubt it ever will be. What dependency checking it does do is incomplete and this gives rise to problems that are reported fairly frequently. Skip
Skip Montanaro <skip@pobox.com> writes:
That's incorrect. Distutils is not a make substitute and I doubt it ever will be. What dependency checking it does do is incomplete and this gives rise to problems that are reported fairly frequently.
Can you provide a specific example to support this criticism? Could you also explain how generating makefiles would help? Regards, Martin
>> That's incorrect. Distutils is not a make substitute and I doubt it ever will be. What dependency checking it does do is incomplete and this gives rise to problems that are reported fairly frequently.

Martin> Can you provide a specific example to support this criticism? Could you also explain how generating makefiles would help?
Here's a simple test you can perform at home. Build Python. Touch Include/Python.h. Run make again. Notice how the core files are all rebuilt but no modules are. Touch Modules/dbmmodule.c (or something else that builds). Run make again.
Most of the time most of the rebuilds of the core are unnecessary.
I'm simply going to stop worrying about it and just keep deleting all the stuff distutils builds to make sure I get correct shared libraries.
Because we are religious about binary backwards compatibility, it is very rare that a change to a header file requires that extensions be recompiled. But when this happens, it is often the last thing we think of when debugging. :-( I think the conclusion from this thread is that it's not the checking of dependencies which is the problem. (Jeremy just added this to distutils.) It is the specification of which files are dependent on which others that is a pain. I think that with Jeremy's changes it would not be hard to add a rule to our setup.py that makes all extensions dependent on all .h files in the Include directory -- a reasonable approximation of the rule that the main Makefile uses. --Guido van Rossum (home page: http://www.python.org/~guido/)
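A sketch of what such a rule could look like in a setup.py, using the depends argument that Jeremy's change adds to Extension (the exact spelling in Python's real setup.py may differ; the module below is just an example):

    import glob
    from distutils.core import setup, Extension

    # Coarse approximation of the main Makefile's rule: every extension is
    # considered out of date whenever any core header changes.
    core_headers = glob.glob("Include/*.h")

    ext = Extension("_socket", ["Modules/socketmodule.c"],
                    depends=core_headers + ["Modules/socketmodule.h"])

    setup(name="example", version="0.0", ext_modules=[ext])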
On Thu, Jun 13, 2002 at 10:05:00PM -0400, Guido van Rossum wrote:
I think the conclusion from this thread is that it's not the checking of dependencies which is the problem. (Jeremy just added this to distutils.) It is the specification of which files are dependent on which others that is a pain. I think that with Jeremy's changes it would not be hard to add a rule to our setup.py that makes all extensions dependent on all .h files in the Include directory -- a reasonable approximation of the rule that the main Makefile uses.
I for one would love to have dependencies in my extension modules, I usually end up deleting the build directory whenever I've changed a header file :( How about something like this:

    Extension('foo', ['foo1.c', 'foo2.c'],
              dependencies={'foo1.c': ['bar.h'],
                            'foo2.c': ['bar.h', 'bar2.h']})

though there is the problem of backwards compatibility :/ Just my two cents, Martin -- Martin Sjögren martin@strakt.com ICQ : 41245059 Phone: +46 (0)31 7710870 Cell: +46 (0)739 169191 GPG key: http://www.strakt.com/~martin/gpg.html
Martin Sjögren wrote:
On Thu, Jun 13, 2002 at 10:05:00PM -0400, Guido van Rossum wrote:
I think the conclusion from this thread is that it's not the checking of dependencies which is the problem. (Jeremy just added this to distutils.) It is the specification of which files are dependent on which others that is a pain. I think that with Jeremy's changes it would not be hard to add a rule to our setup.py that makes all extensions dependent on all .h files in the Include directory -- a reasonable approximation of the rule that the main Makefile uses.
I for one would love to have dependencies in my extension modules, I usually end up deleting the build directory whenever I've changed a header file :(
How about something like this:
Extension('foo', ['foo1.c', 'foo2.c'], dependencies={'foo1.c': ['bar.h'], 'foo2.c': ['bar.h', 'bar2.h']})
though there is the problem of backwards compatibility :/
Just curious: distutils, as the name says, is a tool for distributing source code; that doesn't have much to do with developing code where dependency analysis is nice to have since it saves compile time. When distributing code, the standard setup is that a user unzips the distutils created archive, types "python setup.py install" and that's it. Dependency analysis doesn't gain him anything. Now if you want to use distutils in the development process then you have a different mindset and therefore need different tools like e.g. scons or make (+ makedep, etc.). The question is whether we want distutils to be a development tool as well, or rather stick to its main purpose: that of simplifying distribution and installation of software (and thanks to Greg, it's great at that !). -- Marc-Andre Lemburg CEO eGenix.com Software GmbH ______________________________________________________________________ Company & Consulting: http://www.egenix.com/ Python Software: http://www.egenix.com/files/python/ Meet us at EuroPython 2002: http://www.europython.org/
On Friday 14 June 2002 10:04 am, M.-A. Lemburg wrote: ...
distutils, as the name says, is a tool for distributing source code; that doesn't have much to do with developing code where dependency analysis is nice to have since it saves compile time.
However, distutils is already today the handiest building environment, particularly if your extension needs to support several platforms and/or several versions of Python.
The question is whether we want distutils to be a development tool as well, or rather stick to its main purpose: that of simplifying distribution and installation of software (and thanks to Greg, it's great at that !).
The "problem" (:-) is that it's great at just building extensions, too. python2.1 setup.py install, python2.2 setup.py install, python2.3 setup.py install, and hey pronto, I have my extension built and installed on all Python versions I want to support, ready for testing. Hard to beat!-) Alex
alex wrote:
The "problem" (:-) is that it's great at just building extensions, too.
python2.1 setup.py install, python2.2 setup.py install, python2.3 setup.py install, and hey pronto, I have my extension built and installed on all Python versions I want to support, ready for testing. Hard to beat!-)
does your code always work right away? I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts. does anyone here know how to do that, without having to resort to ugly wrapper batch files/shell scripts? (distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...) </F>
does your code always work right away?
I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
does anyone here know how to do that, without having to resort to ugly wrapper batch files/shell scripts?
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
</F>
setup.py install --install-lib=. Thomas
thomas wrote:
does anyone here know how to do that, without having to resort to ugly wrapper batch files/shell scripts?
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
setup.py install --install-lib=.
doesn't work: distutils ends up trying to overwrite (readonly) original source files. consider PIL, for example: in my source directory, I have the following files, checked out from a repository:

    setup.py  _imaging.c  *.c  PIL/*.py

I want to be able to run setup.py and end up with an _imaging.pyd in the same directory. I don't want distutils to attempt to copy stuff from PIL/*.py to PIL/*.py, mess up other parts of my source tree, install any scripts (broken or not) in the Python directory, or just generally make an ass of itself when failing to copy readonly files on top of other readonly files.

the following is a bit more reliable (windows version):

    rd /s /q build
    python setup.py build
    rd /s /q install
    python setup.py install --prefix install
    copy install\*.pyd .

if distutils didn't mess up when deleting readonly files it created all by itself, the following command could perhaps work:

    setup.py install_ext --install-lib=.

but there is no install_ext command in the version I have... </F>
From: "Fredrik Lundh" <fredrik@pythonware.com>
consider PIL, for example: in my source directory, I have the following files, checked out from a repository:
setup.py _imaging.c *.c PIL/*.py
I want to be able to run setup.py and end up with an _imaging.pyd in the same directory. I don't want distutils to attempt to copy stuff from PIL/*.py to PIL/*.py, mess up other parts of my source tree, install any scripts (broken or not) in the Python directory, or just generally make an ass of itself when failing to copy readonly files on top of other readonly files.
Then there's Berthold Höllmann's test-command he posted to the distutils sig, which internally runs the 'build' command, then extends sys.path by build_purelib, build_platlib, and the test-directory, and finally runs the tests in the test-directory files. For the readonly file issue, I have a force_remove_tree() function in one of my setup-scripts (well, actually it is part of the pyexe-distutils extension), cloned from distutils' remove_tree() function. Thomas
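Thomas's force_remove_tree() isn't shown, but the idea (clear the read-only bit, then retry the removal) is simple enough to sketch independently of distutils. This is not his code, just an illustration built on shutil:

    import os
    import stat
    import shutil

    def force_remove_tree(path):
        """Like shutil.rmtree(), but makes read-only files writable before retrying."""
        def retry_writable(func, failed_path, exc_info):
            os.chmod(failed_path, stat.S_IWRITE)
            func(failed_path)
        shutil.rmtree(path, onerror=retry_writable)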
consider PIL, for example: in my source directory, I have the following files, checked out from a repository:
setup.py _imaging.c *.c PIL/*.py
I want to be able to run setup.py and end up with an _imaging.pyd in the same directory. I don't want distutils to attempt to copy stuff from PIL/*.py to PIL/*.py, mess up other parts of my source tree, install any scripts (broken or not) in the Python directory, or just generally make an ass of itself when failing to copy readonly files on top of other readonly files.
I thought that the thing to do this was python setup.py build_ext -i --Guido van Rossum (home page: http://www.python.org/~guido/)
guido wrote:
I thought that the thing to do this was
python setup.py build_ext -i
oh, that's definitely close enough. that's what you get for reading the docs instead of trying every combination of the available options ;-) (maybe someone who knows a little more about distutils could take an hour and add brief overviews of all standard commands to the reference section(s)? just having a list of all commands and command options would have helped me, for sure...) thanks /F
I thought that the thing to do this was
python setup.py build_ext -i
oh, that's definitely close enough.
that's what you get for reading the docs instead of trying every combination of the available options ;-)
(maybe someone who knows a little more about distutils could take an hour and add brief overviews of all standard commands to the reference section(s)? just having a list of all commands and command options would have helped me, for sure...)
Instead of bothering with the (mostly) harmless but also mostly unhelpful manuals, try the --help feature. E.g. this has the info you want:

$ python setup.py build_ext --help
Global options:
  --verbose (-v)  run verbosely (default)
  --quiet (-q)    run quietly (turns verbosity off)
  --dry-run (-n)  don't actually do anything
  --help (-h)     show detailed help message

Options for 'PyBuildExt' command:
  --build-lib (-b)     directory for compiled extension modules
  --build-temp (-t)    directory for temporary files (build by-products)
  --inplace (-i)       ignore build-lib and put compiled extensions into the
                       source directory alongside your pure Python modules
  --include-dirs (-I)  list of directories to search for header files
                       (separated by ':')
  --define (-D)        C preprocessor macros to define
  --undef (-U)         C preprocessor macros to undefine
  --libraries (-l)     external C libraries to link with
  --library-dirs (-L)  directories to search for external C libraries
                       (separated by ':')
  --rpath (-R)         directories to search for shared C libraries at runtime
  --link-objects (-O)  extra explicit link objects to include in the link
  --debug (-g)         compile/link with debugging information
  --force (-f)         forcibly build everything (ignore file timestamps)
  --compiler (-c)      specify the compiler type
  --swig-cpp           make SWIG create C++ files (default is C)
  --help-compiler      list available compilers

usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
   or: setup.py --help [cmd1 cmd2 ...]
   or: setup.py --help-commands
   or: setup.py cmd --help

$

--Guido van Rossum (home page: http://www.python.org/~guido/)
[Fredrik]
(maybe someone who knows a little more about distutils could take an hour and add brief overviews of all standard commands to the reference section(s)? just having a list of all commands and command options would have helped me, for sure...)
[Guido]
Instead of bothering with the (mostly) harmless but also mostly unhelpful manuals, try the --help feature. E.g. this has the info you want:
$ python setup.py build_ext --help
[ ... ] It seems like a shame that effort was wasted producing "unhelpful" documentation (and I have to say my experience was similar, but I thought it was just me). The better the docs, the more module and extension authors will use distutils. Is the problem simply too generic for it to be logged as a documentation bug? (A bit like the famous DEC SIR: "VMS 2.0 does not work". DEC's response? "Fixed in next release"). Couldn't find anything in SF. regards ----------------------------------------------------------------------- Steve Holden http://www.holdenweb.com/ Python Web Programming http://pydish.holdenweb.com/pwp/ -----------------------------------------------------------------------
On Fri, Jun 14, 2002 at 03:43:04PM -0400, Steve Holden wrote:
It seems like a shame that effort was wasted producing "unhelpful" documentation (and I have to say my experience was similar, but I thought it was just me). The better the docs, the more module and extension authors will use distutils.
Part of it is not having an idea of what tasks people commonly need to do with Distutils. --amk
Guido wrote:
Instead of bothering with the (mostly) harmless but also mostly unhelpful manuals, try the --help feature. E.g. this has the info you want:
I think I got sidetracked by the --help-commands summary, which sort of implies that build_ext is just a subvariant of build... (maybe we could add a --help-commands-long option that lists both the command names and their descriptions? my brain clearly couldn't execute [--help x for x in commands] without adding an arbitrary if-clause, but I'm sure distutils can do that...) </F>
$ python setup.py build_ext --help
Global options:
...
--dry-run (-n) don't actually do anything
Last time I tried that with a package, it went ahead and installed itself anyway. -- Gordon http://www.mcmillan-inc.com/
[/F]
... (maybe someone who knows a little more about distutils could take an hour and add brief overviews of all standard commands to the reference section(s)? just having a list of all commands and command options would have helped me, for sure...)
Me too, except that it still would <wink>. The docs do a fine job of explaining the framework, but it turns out every option I actually have to use gets extracted from one of my coworkers at the point of tears <wink>.
On Friday 14 June 2002 11:45 am, Fredrik Lundh wrote:
alex wrote:
The "problem" (:-) is that it's great at just building extensions, too.
python2.1 setup.py install, python2.2 setup.py install, python2.3 setup.py install, and hey pronto, I have my extension built and installed on all Python versions I want to support, ready for testing. Hard to beat!-)
does your code always work right away?
Never! As the tests fail and problems are identified, I edit the sources, and redo the setup.py install on one or more of the Python versions.
I tend to use an incremental approach, with lots of edit-compile-run
Me too. Iterative and incremental is highly productive.
cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
I haven't even looked for such a way, since going to site-packages is no problem for me. If I was developing on a Python installation shared by several users I'd no doubt feel differently about it.
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function
Many things are. Fortunately, cvs, for all of its problems, doesn't do the readonly thing :-). Alex
I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
Funny, I use an edit-compile-run cycle too, but I don't have the need to copy anything to the current directory.
does anyone here know how to do that, without having to resort to ugly wrapper batch files/shell scripts?
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
This smells like a bug. Maybe it can be fixed rather than used as a stick to hit the dog. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum <guido@python.org> writes:
I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
Funny, I use an edit-compile-run cycle too, but I don't have the need to copy anything to the current directory.
That's because Python treats its own build directory special: it adds build/something to sys.path when it finds that it is started from the build directory. If you are developing a separate package, all your code ends up in ./build/lib.platform, which is not on sys.path. Regards, Martin
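One low-tech workaround is to have the test script put the build directory on sys.path itself. A sketch, assuming the standard distutils layout of build/lib.<platform>-<version> (the extension name is only an example):

    import sys
    from distutils.util import get_platform

    # Prepend the directory that "setup.py build" writes into, so tests import
    # the freshly built extension without an install step.
    build_lib = "build/lib.%s-%d.%d" % (get_platform(),
                                        sys.version_info[0], sys.version_info[1])
    sys.path.insert(0, build_lib)

    import _imaging  # whatever extension is under test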
"Fredrik Lundh" <fredrik@pythonware.com> writes:
I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
does anyone here know how to do that, without having to resort to ugly wrapper batch files/shell scripts?
I usually make a symlink into the build directory. Then, whenever it is rebuilt, the symlink will still be there.
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
That's a bug, IMO. Regards, Martin
martin@v.loewis.de (Martin v. Loewis) writes:
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
That's a bug, IMO.
And hang on, wasn't it fixed by revision 1.12 of Lib/distutils/file_util.py? If not, more details would be appreciated. Cheers, M. -- Now this is what I don't get. Nobody said absolutely anything bad about anything. Yet it is always possible to just pull random flames out of ones ass. -- http://www.advogato.org/person/vicious/diary.html?start=60
On 14 June 2002, Fredrik Lundh said:
alex wrote:
The "problem" (:-) is that it's great at just building extensions, too.
python2.1 setup.py install, python2.2 setup.py install, python2.3 setup.py install, and hey pronto, I have my extension built and installed on all Python versions I want to support, ready for testing. Hard to beat!-)
does your code always work right away?
If we're talking about a downloaded third party extension -- the main use case for the Distutils -- one certainly hopes so! It's only a happy accident that the Distutils are moderately useful for building/development.
I tend to use an incremental approach, with lots of edit-compile-run cycles. I still haven't found a way to get the damn thing to just build my extension and copy it to the current directory, so I can run the test scripts.
Last time I checked: python setup.py build_ext --inplace
(distutils is also a pain to use with a version management system that marks files in the repository as read-only; distutils copy function happily copies all the status bits. but the remove function refuses to remove files that are read-only, even if the files have been created by distutils itself...)
Yeah, that's a stupid situation. I'm sure there are "XXX" comments in the code where I ponder the wisdom of preserving mtime and mode. Greg -- Greg Ward - just another Python hacker gward@python.net http://starship.python.net/~gward/
The question is whether we want distutils to be a development tool as well, or rather stick to its main purpose: that of simplifying distribution and installation of software (and thanks to Greg, it's great at that !).
Yes. Much of distutils is concerned with compiling, and that part is also needed by a development tool. So I'd say it's a pretty good match. You have to specify the extension build rules as some kind of script. We found Modules/Setup + makesetup inadequate, and moved to setup.py + distutils. Distutils is the best we got; it knows about many compilers and platforms; it was pretty easy to add .h file dependency handling (though not discovery). --Guido van Rossum (home page: http://www.python.org/~guido/)
"M.-A. Lemburg" <mal@lemburg.com> writes:
The question is whether we want distutils to be a development tool as well, or rather stick to its main purpose: that of simplifying distribution and installation of software (and thanks to Greg, it's great at that !).
IMO, that's not a question anymore: distutils already *is* a tool used in build and development environments. Regards, Martin
mal> The question is whether we want distutils to be a development tool as well, or rather stick to its main purpose: that of simplifying distribution and installation of software (and thanks to Greg, it's great at that !).

Thanks for elaborating the distinction. That is exactly what I missed. I really want make+makedepend. I think that's what others have missed as well. Skip
Thanks for elaborating the distinction. That is exactly what I missed. I really want make+makedepend. I think that's what others have missed as well.
Sorry Skip, but many others pointed out early on in this discussion that dependency discovery is the important issue. --Guido van Rossum (home page: http://www.python.org/~guido/)
>> Thanks for elaborating the distinction. That is exactly what I missed. I really want make+makedepend. I think that's what others have missed as well.

Guido> Sorry Skip, but many others pointed out early on in this discussion that dependency discovery is the important issue.

Which distutils doesn't do, but for which make and/or compilers have done for years. Skip
Guido> Sorry Skip, but many others pointed out early on in this discussion that dependency discovery is the important issue.
Which distutils doesn't do, but for which make and/or compilers have done for years.
That same imprecise language again that got you in trouble before! :-) Make doesn't do dependency discovery (beyond the trivial .c -> .o). There may be a few compilers that do this but I don't think it's the norm. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido> That same imprecise language again that got you in trouble before! :-)

Yes, I realize that.

Guido> Make doesn't do dependency discovery (beyond the trivial .c -> .o). There may be a few compilers that do this but I don't think it's the norm.

I also realize that. Gcc has had good dependency checking for probably ten years. Sun's C compiler for a similar length of time. Larry Wall did a pretty good job of dependency checking for patch in the mid-80's. Scons does it as well. Skip
Gcc has had good dependency checking for probably ten years.
How do you invoke this? Maybe we can use this to our advantage. --Guido van Rossum (home page: http://www.python.org/~guido/)
>> Gcc has had good dependency checking for probably ten years.

Guido> How do you invoke this? Maybe we can use this to our advantage.

"gcc -M" gives you all dependencies. "gcc -MM" gives you just the stuff included via '#include "file"' and omits the headers included via '#include <file>'. Programmers use <file> and "file" inconsistently enough that it's probably better to just use -M and eliminate the files you don't care about (or leave them in and have Python rebuild automatically after OS upgrades). There are several other variants as well. Search the GCC man page for "-M".

It seems to me that distutils' base compiler class could provide a generic makedepend-like method which could be overridden in subclasses where specific compilers have better builtin schemes for dependency generation. Skip
"gcc -M" gives you all dependencies. "gcc -MM" gives you just the stuff included via '#include "file"' and omits the headers included via '#include <file>'. Programmers use <file> and "file" inconsistently enough that it's probably better to just use -M and eliminate the files you don't care about (or leave them in and have Python rebuild automatically after OS upgrades). There are several other variants as well. Search the GCC man page for "-M".
Cool.
It seems to me that distutils' base compiler class could provide a generic makedepend-like method which could be overridden in subclasses where specific compilers have better builtin schemes for dependency generation.
Care to whip up a patch? --Guido van Rossum (home page: http://www.python.org/~guido/)
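Not the patch, but a sketch of the shape such a makedepend-like method could take, wrapping "gcc -MM" and parsing the make-style rule it prints (a compiler subclass with a better builtin scheme would override it):

    import subprocess

    def gcc_dependencies(source, include_dirs=()):
        """Return the files source depends on, according to 'gcc -MM'.

        -MM omits system headers pulled in via #include <...>; use -M to get
        those as well."""
        cmd = ["gcc", "-MM", source] + ["-I" + d for d in include_dirs]
        output = subprocess.check_output(cmd, text=True)
        # The output is a make rule such as "foo.o: foo.c foo.h bar.h",
        # possibly wrapped over several backslash-continued lines.
        joined = output.replace("\\\n", " ")
        return joined.partition(":")[2].split()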
From: "Skip Montanaro" <skip@pobox.com>
>> Gcc has had good dependency checking for probably ten years.
Guido> How do you invoke this? Maybe we can use this to our advantage.
"gcc -M" gives you all dependencies. "gcc -MM" gives you just the stuff included via '#include "file"' and omits the headers included via '#include <file>'. Programmers use <file> and "file" inconsistently enough that it's probably better to just use -M and eliminate the files you don't care about (or leave them in and have Python rebuild automatically after OS upgrades). There are several other variants as well. Search the GCC man page for "-M".
It seems to me that distutils' base compiler class could provide a generic makedepend-like method which could be overridden in subclasses where specific compilers have better builtin schemes for dependency generation.
MSVC could do something similar with the /E or /P flag (preprocess to standard out or to file). A simple python filter looking for #line directives could then collect the dependencies. Isn't -E and -P also available in any unixish compiler? Thomas
Thomas> MSVC could do something similar with the /E or /P flag (preprocess to standard out or to file). A simple python filter looking for #line directives could then collect the dependencies. Isn't -E and -P also available in any unixish compiler?

Yes. I believe this is how some makedepend scripts work. Skip
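A sketch of the kind of filter Thomas has in mind, assuming the preprocessor output has already been captured to a file (cl /P, or gcc -E redirected); invoking the compiler is left out:

    import re

    # MSVC emits lines like '#line 12 "c:\\include\\bar.h"'; gcc's -E output
    # uses the shorter '# 12 "bar.h"' form.
    LINE_DIRECTIVE = re.compile(r'^#\s*(?:line\s+)?\d+\s+"([^"]+)"')

    def dependencies_from_preprocessed(path):
        """Collect every file named in a #line directive of preprocessed output."""
        deps = set()
        for line in open(path):
            m = LINE_DIRECTIVE.match(line)
            if m:
                deps.add(m.group(1))
        return sorted(deps)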
Skip Montanaro <skip@pobox.com> writes:
"gcc -M" gives you all dependencies. "gcc -MM" gives you just the stuff included via '#include "file"' and omits the headers included via '#include <file>'.
Both options are somewhat obsolete. They require a separate invocation of the compiler to output the dependencies, since the dependencies go to stdout; the compiler can't do compilation at the same time. It is much better if compilation of a file updates the dependency information as a side effect. For that, gcc has supported -MD/-MMD since 1989; this generates dependencies in a file obtained by replacing the .o extension of the target with .d. SunPRO supports generation of dependency files also as a separate compiler invocation. It also supports the undocumented environment variable SUNPRO_DEPENDENCIES, which allows specification of the dependency file, along with specification of directories. GCC also supports SUNPRO_DEPENDENCIES, so this is the most effective and portable way to get dependency file generation. Regards, Martin
"MvL" == Martin v Loewis <martin@v.loewis.de> writes:
MvL> GCC also supports SUNPRO_DEPENDENCIES, so this is the most effective and portable way to get dependency file generation.

Here's a rough strategy for exploiting this feature in distutils. Does it make sense? Happily, I can't see any possible use of make.

There is an option to enable dependency tracking. Not sure how the option is passed: command line (tedious), setup (not easily customized by user), does distutils have a user options file of some sort? Each time distutils compiles a file it passes the -MD flag to generate a .d file. On subsequent compilations, it checks for the .d file. If the .d file does not exist or is older than the .c file, it recompiles. Otherwise, it parses the .d file and compares the times for each of the dependencies.

This doesn't involve make because the only thing make would do for us is check the dependencies and invoke the compiler. distutils already knows how to do both those things. Jeremy
Here's a rough strategy for exploiting this feature in distutils. Does it make sense? Happily, I can't see any possible use of make.
There is an option to enable dependency tracking. Not sure how the option is passed: command line (tedious), setup (not easily customized by user), does distutils have a user options file of some sort?
We could make the configure script check for GCC, and if detected, add -MD to it.
Each time distutils compiles a file it passes the -MD flag to generate a .d file.
On subsequent compilations, it checks for the .d file. If the .d file does not exist or is older than the .c file, it recompiles. Otherwise, it parses the .d file and compares the times for each of the dependencies.
Sounds good. It could skip parsing the .d file if the .o file doesn't exist or is older than the .c file. If there is no .d file, I would suggest only recompiling if the .c file is newer than the .o file (otherwise systems without GCC will see recompilation of everything all the time -- not a good idea IMO.) Go ahead and implement this! --Guido van Rossum (home page: http://www.python.org/~guido/)
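Putting Jeremy's strategy and Guido's refinements together, the recompile decision might look roughly like this. A sketch only; in distutils proper it would live in the compiler machinery, and the .d file is the make-style rule that -MD writes as a side effect of compilation:

    import os

    def needs_recompile(c_file, o_file, d_file):
        # No object file, or object older than the source: recompile, and
        # don't bother parsing the .d file.
        if not os.path.exists(o_file) or \
               os.path.getmtime(o_file) < os.path.getmtime(c_file):
            return True
        # No dependency info (e.g. a compiler without -MD): fall back to the
        # plain .c vs .o comparison, which we already passed above.
        if not os.path.exists(d_file):
            return False
        # Parse the rule "foo.o: foo.c foo.h bar.h" and compare timestamps.
        text = open(d_file).read().replace("\\\n", " ")
        deps = text.partition(":")[2].split()
        o_mtime = os.path.getmtime(o_file)
        # A vanished header also forces a rebuild; -MD then refreshes the list.
        return any(not os.path.exists(d) or os.path.getmtime(d) > o_mtime
                   for d in deps)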
Jeremy> Here's a rough strategy for exploiting this feature in distutils. Does it make sense? Happily, I can't see any possible use of make.

I still don't quite understand what everyone's aversion to make is (yes, I realize it's not platform-independent, but then neither are C compilers or linkers and we manage to live with that), but I will let that slide. Instead, I see a potentially different approach. Write an scons build file (typically named SConstruct) and deliver that in the Modules directory. Most people can safely ignore it. The relatively few people (mostly on this list) who care about such things can simply install SCons (it's quite small) and run it to build the stuff in the Modules directory. The benefits as I see them are:

* SCons implements portable automatic dependency analysis already
* Dependencies are based upon file checksums instead of timestamps (worthwhile in highly networked development environments)
* Clearer separation between build/install and edit/compile/test types of tasks.

I was able to create a simple SConstruct file over the weekend that builds many of the extension modules. I stalled a bit on library/include file discovery, but hopefully that barrier will be passed soon. I realize in the short-term there are also several disadvantages to this idea:

* There will initially be a lot of overlap between setup.py and SCons.
* SCons doesn't yet implement a VPATH-like capability so the source and build directories can't easily be separated. One is in the works though, planned for initial release in 0.09. The current version is 0.07.

Skip
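For readers who haven't seen SCons: a minimal SConstruct for a single extension module might look roughly like the following. This is an illustration against present-day SCons, not the file Skip wrote, and the construction variables are worth checking against the SCons documentation:

    # SConstruct (sketch) -- build one Python extension module with SCons.
    # Environment() and its builders are provided by SCons when it runs this file.
    import distutils.sysconfig

    env = Environment(CPPPATH=[distutils.sysconfig.get_python_inc()])

    # Build spammodule.c into spam.so; drop the 'lib' prefix so Python can
    # import the module under its own name.
    env.SharedLibrary('spam', ['spammodule.c'], SHLIBPREFIX='')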
[Proposal to use SCons] Let's not tie ourselves to SCons before it's a lot more mature. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido> [Proposal to use SCons]
Guido> Let's not tie ourselves to SCons before it's a lot more mature.

I wasn't proposing that, at least not for the short-term. I was suggesting that distutils be left as is, and a SConstruct file be delivered in .../Modules, to be used manually by developers to update module .so files. Skip
Guido> Let's not tie ourselves to SCons before it's a lot more mature.
I wasn't proposing that, at least not for the short-term. I was suggesting that distutils be left as is, and a SConstruct file be delivered in .../Modules, to be used manually by developers to update module .so files.
I don't object to that, but it wouldn't do me any good. --Guido van Rossum (home page: http://www.python.org/~guido/)
"SM" == Skip Montanaro <skip@pobox.com> writes:
SM> I still don't quite understand what everyone's aversion to make is (yes, I realize it's not platform-independent, but then neither are C compilers or linkers and we manage to live with that), but I will let that slide.

You didn't let it slide. You brought it up again. Many people have offered many reasons not to use make. You haven't offered any rebuttal to their arguments, which comes across as rather cavalier.

SM> Instead, I see a potentially different approach. Write an scons build file (typically named SConstruct) and deliver that in the Modules directory.

I don't care much about the Modules directory actually. I want this for third-party extensions that use distutils for distribution, particularly for my own third-party extensions :-). It sounds like you're proposing to drop distutils in favor of SCons, but not saying so explicitly. Is that right? If so, we'd need a stronger case for dumping distutils than automatic dependency tracking. If that isn't right, I don't understand how SCons and distutils meet in the middle. Would extension writers need to learn distutils and SCons? It seems like the primary benefit of SCons is that it does the dependency analysis for us, while only gcc and MSVC seem to offer something similar as a compiler builtin. Since those two compilers cover all the platforms I ever use, it isn't something that interests me a lot.

SM> The benefits as I see them are:
SM> * SCons implements portable automatic dependency analysis already

That's good.

SM> * Dependencies are based upon file checksums instead of timestamps (worthwhile in highly networked development environments)

That's good, too, although we could do the same for distutils. Not too much work, but not my first priority.

SM> * Clearer separation between build/install and edit/compile/test types of tasks.

I don't know what you mean. I use the current Python make file for both tasks, and haven't had much problem.

SM> I was able to create a simple SConstruct file over the weekend that builds many of the extension modules. I stalled a bit on library/include file discovery, but hopefully that barrier will be passed soon.

That's cool.

SM> I realize in the short-term there are also several disadvantages to this idea:
SM> * There will initially be a lot of overlap between setup.py and SCons.

Won't there be a lot of overlap for all time unless Python adopts SCons as the one true way to build extension modules? It's not like setup.py is going to be replaced.

SM> * SCons doesn't yet implement a VPATH-like capability so the source and build directories can't easily be separated. One is in the works though, planned for initial release in 0.09. The current version is 0.07.

Absolute requirement for me :-(. I've got three CVS checkouts of Python and probably 10 total build directories that I use on a regular basis -- normal builds, debug builds, profiled builds, etc. Jeremy
SM> Instead, I see a potentially different approach. Write an scons build file (typically named SConstruct) and deliver that in the Modules directory.

Jeremy> I don't care much about the Modules directory actually. I want this for third-party extensions that use distutils for distribution, particularly for my own third-party extensions :-).

As I think has been hashed out here recently, there are two functions that need to be addressed. Distribution/installation of modules is fine with distutils as it currently sits.

Jeremy> It sounds like you're proposing to drop distutils in favor of SCons, but not saying so explicitly. Is that right?

No. Here, I'll put it in writing: I am explicitly not suggesting that distutils be dropped. I suggested that a SConstruct file be added to the modules directory to be used by people who need to do more than install modules. That's it.

Jeremy> If so, we'd need a stronger case for dumping distutils than automatic dependency tracking. If that isn't right, I don't understand how SCons and distutils meet in the middle. Would extension writers need to learn distutils and SCons?

No. I'm only suggesting that a SConstruct file be added to the Modules directory. I don't want it tied into the build process, at least for the time being. As Guido indicated, scons is still in its infancy. Skip
"SM" == Skip Montanaro <skip@pobox.com> writes:
SM> No. I'm only suggesting that a SConstruct file be added to the Modules directory. I don't want it tied into the build process, at least for the time being. As Guido indicated, scons is still in its infancy.

Oh! That sounds fine with me. Jeremy
Jeremy Hylton <jeremy@zope.com> writes:
Here's a rough strategy for exploiting this feature in distutils. Does it make sense?
Sounds good. Unlike make, it should not choke if it cannot locate one of the inputs of the dependency file - it may be that the header file has gone away, and subsequent recompilation would update the dependency file to show that. If that is done, I'd still encourage use of the SUNPRO_DEPENDENCIES feature for use with SunPRO (aka Forte, aka Sun ONE). Not that I'm asking you to implement it, but it would be good if another such mechanism would be easy to hook into whatever you implement. Regards, Martin
If that is done, I'd still encourage use of the SUNPRO_DEPENDENCIES feature for use with SunPRO (aka Forte, aka Sun ONE). Not that I'm asking you to implement it, but it would be good if another such mechanism would be easy to hook into whatever you implement.
I don't recall that you explained the meaning of the SUNPRO_DEPENDENCIES variable, only that it was undocumented and did something similar to GCC's -M. That's hardly enough. :-) --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum <guido@python.org> writes:
I don't recall that you explained the meaning of the SUNPRO_DEPENDENCIES variable, only that it was undocumented and did something similar to GCC's -M. That's hardly enough. :-)
I see :-) Suppose you have a file x.c, and you invoke

    env SUNPRO_DEPENDENCIES="x.deps build/x.o" gcc -c -o x.o x.c

then a file x.deps is generated, and has, on the left-hand side of the dependency rule, build/x.o. It works the same way for compilers identifying themselves as

    cc: Sun WorkShop 6 update 1 C 5.2 2000/09/11

when invoked with -V. I can't give a complete list of compilers that support that feature, but setting the variable can't hurt - the worst case is that it is ignored. Regards, Martin
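So from a build driver's point of view, hooking this in is mostly a matter of setting the variable before each compile. A sketch (a hypothetical helper, not distutils code; per Martin, compilers that don't know the variable simply ignore it):

    import os
    import subprocess

    def compile_with_deps(cc, c_file, o_file, deps_file):
        """Compile c_file to o_file, asking the compiler to record dependencies."""
        env = dict(os.environ)
        env["SUNPRO_DEPENDENCIES"] = "%s %s" % (deps_file, o_file)
        subprocess.check_call([cc, "-c", "-o", o_file, c_file], env=env)

    # e.g. compile_with_deps("gcc", "x.c", "build/x.o", "x.deps")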
Guido van Rossum wrote:
Gcc has had good dependency checking for probably ten years.
How do you invoke this? Maybe we can use this to our advantage.
Here's a bunch of useful options.

    gcc --help -v | grep -e -M

    -M                  Generate make dependencies
    -MM                 As -M, but ignore system header files
    -MF <file>          Write dependency output to the given file
    -MG                 Treat missing header file as generated files
    -MP                 Generate phony targets for all headers
    -MQ <target>        Add a MAKE-quoted target
    -MT <target>        Add an unquoted target
    -MD                 Print dependencies to FILE.d
    -MMD                Print dependencies to FILE.d
    -M                  Print dependencies to stdout
    -MM                 Print dependencies to stdout
Guido:
Make doesn't do dependency discovery (beyond the trivial .c -> .o). There may be a few compilers that do this but I don't think it's the norm.
Borland make does, in conjunction with the compiler, which includes header dependencies in the object file. Thus there is no need for dependency generation options like gcc's and no such options are provided. It's differences in functionality like this that will cause problems with moving towards greater use of make. Neil
Skip> Which distutils doesn't do, but for which make and/or compilers have done for years.

Bad English, sorry. Should have been "which has been available for make for years". Skip
The question is whether we want distutils to be a development tool as well
I'd say yes, we do -- otherwise we have to maintain two parallel systems for building stuff, which sucks for what should be obvious reasons. What's more -- on Windows, distutils is the only way I know *how* to build extension modules! I once tried doing it on my own and gave up in disgust. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
How about something like this:
Extension('foo', ['foo1.c', 'foo2.c'], dependencies={'foo1.c': ['bar.h'], 'foo2.c': ['bar.h', 'bar2.h']})
though there is the problem of backwards compatibility :/
But this is wrong: it's not foo1.c that depends on bar.h, it's foo1.o. With the latest CVS, on Unix or Linux, try this:

- Run Make to be sure you are up to date
- Touch Modules/socketmodule.h
- Run Make again

The latest setup.py has directives that tell it that the _socket and _ssl modules depend on socketmodule.h, and this makes it rebuild the necessary .o and .so files (through the changes to distutils that Jeremy made). All we need is for someone to add all the other dependencies to setup.py. --Guido van Rossum (home page: http://www.python.org/~guido/)
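For a third-party setup script, the same coarse per-extension dependencies are spelled with the new depends argument. A sketch, reusing Martin's file names from above:

    from distutils.core import setup, Extension

    # Both object files are rebuilt whenever either header changes: coarse,
    # but usually good enough, per Jeremy's global-dependencies approach.
    foo = Extension('foo',
                    sources=['foo1.c', 'foo2.c'],
                    depends=['bar.h', 'bar2.h'])

    setup(name='foo', version='1.0', ext_modules=[foo])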
On Fri, Jun 14, 2002 at 08:05:32AM -0400, Guido van Rossum wrote:
How about something like this:
Extension('foo', ['foo1.c', 'foo2.c'], dependencies={'foo1.c': ['bar.h'], 'foo2.c': ['bar.h', 'bar2.h']})
though there is the problem of backwards compatibility :/
But this is wrong: it's not foo1.c that depends on bar.h, it's foo1.o.
You're right.
With the latest CVS, on Unix or Linux, try this:
- Run Make to be sure you are up to date
- Touch Modules/socketmodule.h
- Run Make again
The latest setup.py has directives that tell it that the _socket and _ssl modules depend on socketmodule.h, and this makes it rebuild the necessary .o and .so files (through the changes to distutils that Jeremy made).
Cool. But my module consists of several .c files, how do I specify which .o files depend on which .h files?

Now, it's a shame I have to maintain compatibility with the Python 2.1 and Python 2.2 distributions in my setup.py ;) I suppose I could try/except... Regards, Martin

-- Martin Sjögren <martin@strakt.com>, GPG key: http://www.strakt.com/~martin/gpg.html
"MS" == Martin Sjögren <martin@strakt.com> writes:
But this is wrong: it's not foo1.c that depends on bar.h, it's foo1.o.
MS> You're right.

On the other hand, distutils setup scripts don't talk about .o files directly. They talk about the .c file and assume there is a one-to-one correspondence between .c files and .o files.
With the latest CVS, on Unix or Linux, try this:
- Run Make to be sure you are up to date
- Touch Modules/socketmodule.h
- Run Make again
The latest setup.py has directives that tell it that the _socket and _ssl modules depend on socketmodule.h, and this makes it rebuild the necessary .o and .so files (through the changes to distutils that Jeremy made).
MS> Cool. But my module consists of several .c files, how do I
MS> specify which .o files depend on which .h files?

I did something simpler, as Guido mentioned. I added global dependencies for an extension. This has been fine for all the extensions that I commonly build, because they have only one or a few source files. Recompiling a few .c files costs little.

I agree that it would be nice to have fine-grained dependency tracking, but that costs more both to implement and to use. Thomas Heller has a patch on SF (don't recall the number) that handles per-file dependencies. I didn't care for the way the dependencies are spelled in the setup script, but something like the dict that Martin (the other Martin, right?) suggested seems workable.

MS> Now, it's a shame I have to maintain compatibility with the
MS> Python 2.1 and Python 2.2 distributions in my setup.py ;)
MS> I suppose I could try/except...

We should come up with a good hack to use in setup scripts. This is my first try. It's got too many lines, but it works.

    # A hack to determine if Extension objects support the depends keyword arg.
    if not "depends" in Extension.__init__.func_code.co_varnames:
        # If it doesn't, create a local replacement that removes depends
        # from the kwargs before calling the regular constructor.
        _Extension = Extension
        class Extension(_Extension):
            def __init__(self, name, sources, **kwargs):
                if "depends" in kwargs:
                    del kwargs["depends"]
                _Extension.__init__(self, name, sources, **kwargs)

Jeremy
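A slightly more compact variant of the same idea, purely as an illustration (this is not Jeremy's code): replace Extension with a small factory that drops the keyword when the installed distutils predates it. The package name and file names here are hypothetical.

    from distutils.core import setup
    from distutils.core import Extension as _Extension

    if "depends" in _Extension.__init__.func_code.co_varnames:
        Extension = _Extension
    else:
        def Extension(name, sources, **kwargs):
            # Older distutils: silently drop the unsupported keyword.
            if "depends" in kwargs:
                del kwargs["depends"]
            return _Extension(name, sources, **kwargs)

    setup(name="foo",                # hypothetical extension
          ext_modules=[Extension("foo", ["foo1.c", "foo2.c"],
                                 depends=["bar.h", "bar2.h"])])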
On Fri, Jun 14, 2002 at 04:25:15AM -0400, Jeremy Hylton wrote:
MS> Cool. But my module consists of several .c files, how do I
MS> specify which .o files depend on which .h files?
I did something simpler, as Guido mentioned. I added global dependencies for an extension. This has been fine for all the extensions that I commonly build because they have only one or several source files. Recompiling a few .c files costs little.
I agree that it would be nice to have fine-grained dependency tracking, but that costs more in the implementation and to use. Thomas Heller has a patch on SF (don't recall the number) that handles per-file dependencies. I didn't care for the way the dependencies are spelled in the setup script, but something like the dict that Martin (the other Martin, right?) suggested seems workable.
    Extension('foo', ['foo1.c', 'foo2.c'],
              dependencies={'foo1.c': ['bar.h'],
                            'foo2.c': ['bar.h', 'bar2.h']})

That's what I suggested, is that what you meant?
MS> Now, it's a shame I have to maintain compatibility with the
MS> Python 2.1 and Python 2.2 distributions in my setup.py ;)
MS> I suppose I could try/except...
We should come up with a good hack to use in setup scripts. This is my first try. It's got too many lines, but it works.
    # A hack to determine if Extension objects support the depends keyword arg.
    if not "depends" in Extension.__init__.func_code.co_varnames:
        # If it doesn't, create a local replacement that removes depends
        # from the kwargs before calling the regular constructor.
        _Extension = Extension
        class Extension(_Extension):
            def __init__(self, name, sources, **kwargs):
                if "depends" in kwargs:
                    del kwargs["depends"]
                _Extension.__init__(self, name, sources, **kwargs)
Eep :) Looks like it could work, yes, but I think I'll skip that one while I'm still running Python 2.2. :) Cheers, Martin

-- Martin Sjögren <martin@strakt.com>, GPG key: http://www.strakt.com/~martin/gpg.html
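As an aside, here is roughly what a build step would have to do with a per-file mapping like the one quoted above -- an illustrative sketch, not actual distutils behaviour:

    import os

    def stale(obj, source, extra_deps):
        # Is obj missing, or older than its source or any listed header?
        if not os.path.exists(obj):
            return True
        obj_mtime = os.path.getmtime(obj)
        return any(os.path.getmtime(dep) > obj_mtime
                   for dep in [source] + list(extra_deps))

    dependencies = {'foo1.c': ['bar.h'], 'foo2.c': ['bar.h', 'bar2.h']}
    for src, headers in dependencies.items():
        obj = src[:-2] + '.o'
        if stale(obj, src, headers):
            print("would recompile", src)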
Cool. But my module consists of several .c files, how do I specify which .o files depend on which .h files?
You can't. Compared to throwing away the entire build directory containing all Python extensions, it's still a huge win. For your extension, it may not make much of a difference. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido> All we need is for someone to add all the other dependencies to
Guido> setup.py.

May I humbly propose that this task should be automated? Tools like makedepend have been invented and reinvented many times over precisely because it's too error-prone for humans to maintain that information manually. Switching from Make's syntax to Python's syntax won't make that task substantially easier. (Yes, I realize that backward compatibility is a strong goal, so the layout of objects tends to change rarely. I still prefer having correct dependencies.) Skip
Skip Montanaro <skip@pobox.com> writes:
May I humbly propose that this task should be automated? Tools like makedepend have been invented and reinvented many times over precisely because it's too error-prone for humans to maintain that information manually.
They also have been invented and reinvented because the previous tool would not work, just like the next one wouldn't. makedepend is particularly bad: you need to re-invoke makedepend manually whenever you change a file, which is as easy to forget as updating dependency lists whenever you change a file. In addition, makedepend has problems finding out the names of the header files used. That said, feel free to contribute patches that automate this task. Regards, Martin
May I humbly propose that this task should be automated? Tools like makedepend have been invented and reinvented many times over precisely because it's too error-prone for humans to maintain that information manually. Switching from Make's syntax to Python's syntax won't make that task substantially easier.
Unfortunately it's also darn tooting hard to do a good job of discovering dependencies, which is why there is still no standard tool that does this. Makedepend tries, but is still hard to use. --Guido van Rossum (home page: http://www.python.org/~guido/)
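To illustrate why it is hard: a naive scanner of the kind a setup script could use is only a few lines, but it ignores include paths, conditional compilation, and <>-style includes entirely. A sketch, for illustration only:

    import os
    import re

    INCLUDE_RE = re.compile(r'^\s*#\s*include\s+"([^"]+)"', re.MULTILINE)

    def local_headers(source, seen=None):
        # Recursively collect "quoted" includes reachable from one source file.
        if seen is None:
            seen = set()
        try:
            text = open(source).read()
        except IOError:
            return seen                    # header not found -- silently ignored
        here = os.path.dirname(source)
        for name in INCLUDE_RE.findall(text):
            path = os.path.join(here, name)
            if path not in seen:
                seen.add(path)
                local_headers(path, seen)  # headers may include other headers
        return seen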
    Extension('foo', ['foo1.c', 'foo2.c'],
              dependencies={'foo1.c': ['bar.h'],
                            'foo2.c': ['bar.h', 'bar2.h']})
But this is wrong: it's not foo1.c that depends on bar.h, it's foo1.o.
It's not wrong if you read the dependency statement as "anything which depends on foo1.c also depends on bar.h" etc.

Greg Ewing, Computer Science Dept, University of Canterbury, Christchurch, New Zealand, greg@cosc.canterbury.ac.nz
Skip Montanaro <skip@pobox.com> writes:
Martin> Can you provide a specific example to support this criticism? Martin> Could you also explain how generating makefiles would help?
From python-list on June 10 (this is what made me wish yet again for better dependency checking):
http://mail.python.org/pipermail/python-list/2002-June/108153.html
That does not answer my question: How would generating a makefile have helped? Notice that setup.py *will* regenerate nis.so if nis.c changes. The OP is right that it refused to do so because, meanwhile, he had changed Setup to build nismodule.so instead. This is where the real problem lies: building modules via makesetup generates <module>module.so, whereas building modules via setup.py builds <module>.so. This needs to be fixed, and I feel that setup.py is right and makesetup is wrong.
It's clear nobody but me wants this
Hard to tell, since I still don't quite get what "this" is. Generating makefiles: certainly I don't want this. The reason is not that I think there are no problems - I think that generating makefiles will not solve these problems.
though I find it hard to believe most of you haven't been burned in the past the same way the above poster was.
The specific problem comes from building shared modules through Setup. I never do this, so I have not been burned by that.
Frequently, after executing "cvs up" I see almost all of the Python core rebuild because some commonly used header file was modified, yet find that distutils rebuilds nothing.
That I noticed. It has nothing to do with the article you quote, though, and I question that generating makefiles would help. I routinely rm -rf build when I see that some common header has changed.
Here's a simple test you can perform at home. Build Python. Touch Include/Python.h. Run make again. Notice how the core files are all rebuilt but no modules are. Touch Modules/dbmmodule.c (or something else that builds). Run make again.
I can reproduce your observations. I still don't see how generating makefiles will help to solve this problem. Regards, Martin
This is where the real problem lies: building modules via makesetup generates <module>module.so, whereas building modules via setup.py builds <module>.so. This needs to be fixed, and I feel that setup.py is right and makesetup is wrong.
IMO that's entirely accidental. You can use Setup to build either form. I would assume you can use setup.py to build either form too, but I'm not sure. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum <guido@python.org> writes:
IMO that's entirely accidental. You can use Setup to build either form. I would assume you can use setup.py to build either form too, but I'm not sure.
Can you please elaborate? I believe that, because of the fragment

    case $objs in
    *$mod.o*)   base=$mod;;
    *)          base=${mod}module;;
    esac

the line

    nis nismodule.c -lnsl        # Sun yellow pages -- not everywhere

will always cause makesetup to build nismodule.so - if you want to build nis.so, you have to rename the source file. I don't think you can tell setup.py to build nismodule.so.

So what do you propose to do to make the resulting shared library name consistent regardless of whether it is built through setup.py or makesetup? Regards, Martin
IMO that's entirely accidental. You can use Setup to build either form. I would assume you can use setup.py to build either form too, but I'm not sure.
Can you please elaborate? I believe that, because of the fragment
    case $objs in
    *$mod.o*)   base=$mod;;
    *)          base=${mod}module;;
    esac
the line
nis nismodule.c -lnsl # Sun yellow pages -- not everywhere
will always cause makesetup to build nismodule.so - if you want to build nis.so, you have to rename the source file.
Oops, I was mistaken.
I don't think you can tell setup.py to build nismodule.so.
Actually, you can. Just specify "nismodule" as the extension name. Whether you should, I don't know.
So what do you propose to do to make the resulting shared library name consistent regardless of whether it is built through setup.py or makesetup?
I don't know if we need consistency, but if we do, I propose that we deprecate the "module" part. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum <guido@python.org> writes:
I don't think you can tell setup.py to build nismodule.so.
Actually, you can. Just specify "nismodule" as the extension name.
That won't work. setup.py tries to import "md5module", which fails since md5module.so has no function initmd5module.
I don't know if we need consistency, but if we do, I propose that we deprecate the "module" part.
Ok, I'll try to remove the feature that makesetup adds "module". Regards, Martin
On Thursday, June 13, 2002, at 07:49 , Skip Montanaro wrote:
I wonder if it would be better to have distutils generate the appropriate type of makefile and execute that instead of directly building objects and shared libraries. This would finesse some of the dependency tracking problems that pop up frequently.
+1

Distutils is very unix-centric in that it expects there to be separate compile and link steps. While this can be made to work on Windows (at least for MSVC), where there are such separate compilers if you look hard enough, it can't be made to work for MetroWerks on the Mac, and also for MSVC it's a rather funny way to do things.

I would much prefer it if distutils would (optionally) gather all its knowledge and generate a Makefile, an MW project file, or an MSVC project file. For MW, distutils already does this (every step simply remembers information, and at the "link" step it writes out a project file and builds that), but it would be nice if this way of operation were codified.

Note that for people having an IDE this would also make debugging a lot easier: if you have an IDE project you can easily do nifty things like turn on debugging, use its class browser, etc.

-- Jack Jansen <Jack.Jansen@oratrix.com> http://www.cwi.nl/~jack - If I can't dance I don't want to be part of your revolution -- Emma Goldman
Distutils is very unix-centric in that it expects there to be separate compile and link steps. While this can be made to work on Windows (at least for MSVC) where there are such separate compilers if you look hard enough it can't be made to work for MetroWerks on the Mac, and also for MSVC it's a rather funny way to do things.
Actually, the setup dialogs and general structure of MSVC make you very aware of the Unixoid structure of the underlying compiler suite. :-) But I believe what you say about MW.
I would much prefer it if distutils would (optionally) gather all its knowledge and generate a Makefile, an MW project file, or an MSVC project file.
For MW distutils already does this (every step simply remembers information, and at the "link" step it writes out a project file and builds that) but it would be nice if this way of operation was codified.
I'm not sure what's to codify -- this is different for each compiler suite. When using setup.py with a 3rd party extension on Windows, I like the fact that I don't have to fire up the GUI to build it. (I just wish it were easier to make distutils do the right thing for debug builds of Python. This has improved on Unix but I hear it's still broken on Windows.)
Note that for people having an IDE this would also make debugging a lot easier: if you have an IDE project you can easily do nifty things like turn on debugging, use its class browser, etc.
That's for developers though, not for people installing extensions that come with a setup.py script. --Guido van Rossum (home page: http://www.python.org/~guido/)
[Guido]
... (I just wish it were easier to make distutils do the right thing for debug builds of Python. This has improved on Unix but I hear it's still broken on Windows.)
Hard to say. "stupid_build.py --debug" works great on Windows in the Zope3 tree. "setup.py --debug" on Windows in the Zope tree builds the debug stuff but leaves the results in unusable places. Since I don't understand distutils or the Zope build process, I'm not complaining. There's nothing that can't be fixed by hand via a mouse, Windows Explorer, and a spare hour each time around <wink>.
From: "Guido van Rossum" <guido@python.org>
Guido> I'm not sure what's to codify -- this is different for each compiler
Guido> suite. When using setup.py with a 3rd party extension on Windows, I
Guido> like the fact that I don't have to fire up the GUI to build it.

Same for me.

Guido> (I just wish it were easier to make distutils do the right thing for
Guido> debug builds of Python. This has improved on Unix but I hear it's
Guido> still broken on Windows.)
What do you think is broken with the debug builds? I use it routinely and have no problems at all... [Jack]
Note that for people having an IDE this would also make debugging a lot easier: if you have an IDE project you can easily do nifty things like turn on debugging, use its class browser, etc.
I prefer to insert

    #ifdef _DEBUG
        _asm int 3;   /* breakpoint */
    #endif

into the problematic sections of my code, and whoops, the MSVC GUI debugger opens just when this code is executed, even if it was started from the command line. Thomas
What do you think is broken with the debug builds? I use it routinely and have no problems at all...
I was repeating hearsay. Here's what used to be broken on Unix: if you built a debug Python but did not install it (assuming a non-debug Python was already installed), and then used that debug Python to build a 3rd party extension, the debug Python's configuration would be ignored, and the extension would be built with the configuration of the installed Python instead. Such extensions can't be linked with the debug Python, which was the whole point of using the debug Python to build in the first place. Jeremy recently fixed this for Unix, and I'm very happy. But I believe that on Windows you still have to add "--debug" to your setup.py build command to get the same effect. I think that using the debug executable should be sufficient to turn on the debug flags. More generally, I think that when you use a Python executable that lives in a build directory, the configuration of that build directory should be used for all extensions you build. This is what Jeremy did in his fix. (As a side effect, building the Python extensions no longer needs to be special-cased.) --Guido van Rossum (home page: http://www.python.org/~guido/)
From: "Guido van Rossum" <guido@python.org>
What do you think is broken with the debug builds? I use it routinely and have no problems at all...
I was repeating hearsay.
The complaints I remember (mostly from c.l.p) are from people who want to build debug versions of extensions while at the same time refusing to build a debug version of Python from the sources.
Here's what used to be broken on Unix: if you built a debug Python but did not install it (assuming a non-debug Python was already installed), and then used that debug Python to build a 3rd party extension, the debug Python's configuration would be ignored, and the extension would be built with the configuration of the installed Python instead. Such extensions can't be linked with the debug Python, which was the whole point of using the debug Python to build in the first place.
Jeremy recently fixed this for Unix, and I'm very happy.
But I believe that on Windows you still have to add "--debug" to your setup.py build command to get the same effect. I think that using the debug executable should be sufficient to turn on the debug flags.
More generally, I think that when you use a Python executable that lives in a build directory, the configuration of that build directory should be used for all extensions you build. This is what Jeremy did in his fix. (As a side effect, building the Python extensions no longer needs to be special-cased.)
I don't know anything about building Python (and extensions) on Unix, but here's how it works on Windows: You can use the release as well as the debug version of Python to build debug or release extensions with distutils. You have to use the --debug switch to specify which one to use. The debug version needs other libraries than the release version; they all have an _d inserted into the filename just before the filename extension (but you probably know this already ;-). I don't know if it even is possible (in Python code) to determine whether the debug or the release exe is currently running.

With changes I recently made to distutils, you can even do all this in a 'not installed' version, straight from CVS, for example. Thomas
[Thomas Heller]
... I don't know if it even is possible (in Python code) to determine whether the debug or the release exe is currently running.
FYI, the sys module exposes some debugging tools only in the debug build. So, e.g.,

    def is_debug_build():
        import sys
        return hasattr(sys, "getobjects")

returns the right answer (and, I believe, under all versions of Python).
From: "Tim Peters" <tim.one@comcast.net>
[Thomas Heller]
... I don't know if it even is possible (in Python code) to determine whether the debug or the release exe is currently running.
FYI, the sys module exposes some debugging tools only in the debug build. So, e.g.,
    def is_debug_build():
        import sys
        return hasattr(sys, "getobjects")
returns the right answer (and, I believe, under all versions of Python).
I can (in 2.2) see sys.getobjects() and sys.gettotalrefcount(). I can also guess what gettotalrefcount does, but what does getobjects() do? Is it documented somewhere? Thomas
[Thomas Heller]
I can (in 2.2) see sys.getobjects() and sys.gettotalrefcount(). I can also guess what gettotalrefcount does, but what does getobjects() do? Is it documented somewhere?
Sorry, I don't think any debug-mode-only gimmicks are documented outside of comments in the source files. In a debug build, the PyObject layout changes (btw, that's why you can't mix debug-build modules w/ release-build modules), adding new _ob_next and _ob_prev pointers at the start of every PyObject. The pointers form a doubly-linked list, which contains every live object in existence, except for those statically allocated (the builtin type objects). The head of the list is in object.c's static refchain vrbl. sys.getobjects(n) returns that C list of (almost) all live objects, as a Python list. Excluded from the list returned are the list itself, and the objects created to *call* getobjects(). The list of objects is in allocation order, most-recently allocated at the start (getobjects()[0]). n is the maximum number of objects it will return, where n==0 means (of course <wink>) infinity. You can also pass it a type after the int, and, if you do, only objects of that type get returned. getobjects() is the tool of last resort when trying to track down an excess of increfs over decrefs. Python code that's exceedingly careful to account for its own effects can figure out anything using it. I once determined that the compiler was leaking references to the integer 2 this way <wink>.
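A small sketch of the leak-hunting pattern described above; it only works in a debug build (where sys.getobjects exists), and suspect() is a stand-in for whatever code is under suspicion:

    import sys

    def live_count(of_type):
        # 0 means "no limit"; the optional second argument filters by type.
        return len(sys.getobjects(0, of_type))

    def check_for_leaked_ints(suspect):
        before = live_count(int)
        suspect()                 # hypothetical code under test
        after = live_count(int)
        print("ints still alive after the call:", after - before)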
[Tim, on the debug-build sys.getobjects()]
... You can also pass it a type after the int, and, if you do, only objects of that type get returned.
Speaking of which, that became a lot more pleasant in 2.2, as new-style classes create new types, and most builtin types have builtin names. You can pee away delighted weeks pondering the mysteries <wink>. For example:

    Python 2.3a0 (#29, Jun 13 2002, 17:06:59) [MSC 32 bit (Intel)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import sys
    [8285 refs]
    >>> sys.getobjects(0, int)
    [17, 19, 20, 18, 14, 512, 56, 448, 128, 256, 512, 1024, 2048, 49152,
     40960, 4096, 32768, 24576, 8192, 16384, 61440, 4095, 9, 7, 6, 5, 10,
     32, 16, 64, 4096, 128, 16384, 32768, 512, 1024, 256, 32767, 511, -4,
     -1, 15, 11, 8, 22, 4, 21, 23, 503316480, 65535, 2147483647, 1, 0, 3,
     2, 33751201]
    [8348 refs]
Why would the first int Python allocates be 33751201? The answer is clear with a little hexification:
    >>> hex(_[-1])
    '0x20300a1'
    [8292 refs]
Or, if that answer isn't clear, you should unsubscribe from Python-Dev immediately <wink>.
On Friday, June 14, 2002, at 04:38 , Thomas Heller wrote:
I prefer to insert

    #ifdef _DEBUG
        _asm int 3;   /* breakpoint */
    #endif

into the problematic sections of my code, and whoops, the MSVC GUI debugger opens just when this code is executed, even if it was started from the command line.
Ok, MSVC finally scored a point with me, this is nifty :-)

-- Jack Jansen <Jack.Jansen@oratrix.com> http://www.cwi.nl/~jack - If I can't dance I don't want to be part of your revolution -- Emma Goldman
On Fri, Jun 14, 2002 at 05:30:47PM +0200, Jack Jansen wrote:
On Friday, June 14, 2002, at 04:38 , Thomas Heller wrote:
I prefer to insert

    #ifdef _DEBUG
        _asm int 3;   /* breakpoint */
    #endif

into the problematic sections of my code, and whoops, the MSVC GUI debugger opens just when this code is executed, even if it was started from the command line.
Ok, MSVC finally scored a point with me, this is nifty:-)
You can "set" a breakpoint this way in x86 Linux too. Unfortunately, when this is not run under the debugger, it simply sends a SIGTRAP to the process. In theory the standard library could handle SIGTRAP by invoking the debugger, but 5 minutes fiddling around didn't produce a very dependable way of doing so. (gdb) run Starting program: ./a.out a Program received signal SIGTRAP, Trace/breakpoint trap. main () at bp.c:21 21 printf("b\n"); (gdb) cont Continuing. b #include <stdio.h> #define _DEBUG #ifdef _DEBUG #if defined(WIN32) #define BREAKPOINT _asm int 3 #elif defined(__GNUC__) && defined(__i386__) #define BREAKPOINT __asm__ __volatile__ ("int3") #else #warning "BREAKPOINT not defined for this OS / Compiler" #define BREAKPOINT (void)0 #endif #else #define _DEBUG (void)0 #endif main() { printf("a\n"); BREAKPOINT; printf("b\n"); return 0; }
From: "Guido van Rossum" <guido@python.org>
Distutils is very unix-centric in that it expects there to be separate compile and link steps. While this can be made to work on Windows (at least for MSVC) where there are such separate compilers if you look hard enough it can't be made to work for MetroWerks on the Mac, and also for MSVC it's a rather funny way to do things.
Actually, the setup dialogs and general structure of MSVC make you very aware of the Unixoid structure of the underlying compiler suite. :-)
But I believe what you say about MW.
Well, that really depends on whether you think supporting MacOS 9 development is important. MW supplies regular command-line tools for MacOS X. -Dave
participants (28)
- Alex Martelli
- Andrew Kuchling
- barry@zope.com
- David Abrahams
- Fredrik Lundh
- Gordon McMillan
- Greg Ball
- Greg Ewing
- Greg Ward
- Guido van Rossum
- holger krekel
- Jack Jansen
- Jeff Epler
- Jeremy Hylton
- jeremy@zope.com
- M.-A. Lemburg
- Martin Sjögren
- martin@v.loewis.de
- Michael Hudson
- Neal Norwitz
- Neil Hodgson
- Neil Schemenauer
- Paul Prescod
- Paul Svensson
- Skip Montanaro
- Steve Holden
- Thomas Heller
- Tim Peters