Hi there. I am preparing a patch for #655 (write dependency makefiles) at [1]. I think it's best to use the -M, -F and -D short options for this functionality. Please comment, improve and/or merge. regards felix [1] http://trac.sagemath.org/sage_trac/ticket/14728
We prefer changes as pull requests to https://github.com/cython/cython. One first comment: I think it's a lot to reserve three flags for this. Perhaps "-M [filename]" would be sufficient, using the convention that - is stdout. If necessary, we could have a special syntax for the -D option. I'm still, however, trying to figure out exactly what the use case for this is. Generally extensions are created with distutils, and cythonize handles the dependencies in that framework for you, so I'm not sure how you'd use the resulting makefiles (one per .pyx file?) anyway. An example/test would be useful as well (see tests/build). On Wed, Jun 19, 2013 at 12:16 AM, Felix Salfelder <felix@salfelder.org> wrote:
Hi there.
I am preparing a patch for #655 (write dependency makefiles) at [1]. I think it's best to use the -M, -F and -D short options for this functionality. Please comment, improve and/or merge.
regards felix
[1] http://trac.sagemath.org/sage_trac/ticket/14728

_______________________________________________
cython-devel mailing list
cython-devel@python.org
http://mail.python.org/mailman/listinfo/cython-devel
On Tue, Jun 25, 2013 at 01:06:35AM -0700, Robert Bradshaw wrote:
One first comment: I think it's a lot to reserve three flags for this. Perhaps "-M [filename]" would be sufficient, using the convention that - is stdout. If necessary, we could have a special syntax for the -D option.
It's a tradeoff between simplicity and force of habit. Anybody using gcc and make knows what -M and -M[A-Z] mean; anybody who doesn't use any of these will never need this functionality. But yes, if you *do* need -P and -D otherwise, we might find other characters...
I'm still, however, trying to figure out exactly what the usecase for this is.
It's about keeping track of build dependencies.
Generally extensions are created with distutils, and cythonize handles the dependencies in that framework for you, so I'm not sure how you'd use the resulting makefiles (one per .pyx file?) anyway.
cythonize doesn't know which headers gcc will use when compiling the cython output. now what any other compiler will do. I have no idea how to fix that (design flaw?), and it's currently easier to just use makefiles from the beginning. With makefiles, dependencies are easy and fast, if all involved compilers support it. regards felix
Felix Salfelder, 25.06.2013 10:34:
On Tue, Jun 25, 2013 at 01:06:35AM -0700, Robert Bradshaw wrote:
I'm still, however, trying to figure out exactly what the usecase for this is.
It's about keeping track of build dependencies.
Generally extensions are created with distutils, and cythonize handles the dependencies in that framework for you, so I'm not sure how you'd use the resulting makefiles (one per .pyx file?) anyway.
I fail to see the use case, too. It's fairly limited in any case.
cythonize doesn't know which headers gcc will use when compiling the cython output.
Make doesn't know that either. Cython at least knows which ones are used directly. Handling transitive dependencies would require parsing header files. If you need to keep track of changes in transitively included header files, why not cimport from them in the Cython source to make them an explicit dependency?
now what any other compiler will do.
This sentence barely passes through my English parser and then fails to link with the rest.
I have no idea how to fix that (design flaw?), and it's currently easier to just use makefiles from the beginning. With makefiles, dependencies are easy and fast, if all involved compilers support it.
Maybe you should start by describing more clearly what exactly is missing in the current setup. Building with make instead of distutils seems like a major complication to me all by itself. Stefan
Hi Stefan. On Thu, Jun 27, 2013 at 08:58:28AM +0200, Stefan Behnel wrote:
Make doesn't know that either. Cython at least knows which ones are used directly. Handling transitive dependencies would require parsing header files. If you need to keep track of changes in transitively included header files, why not cimport from them in the Cython source to make them an explicit dependency?
Explicit dependency tracking would imply "manual", which is painful and error-prone. Without running gcc -M (with all flags) you cannot even guess the headers used transitively. I haven't found a gcc -M call within the cython source code.
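For readers unfamiliar with the format under discussion: gcc -M emits makefile-syntax rules such as `file.o: file.c file.h`, with backslash-newline continuations for long lists. A minimal Python sketch of parsing that format back into a dependency mapping (a hypothetical illustration of the format, not code from the patch):

```python
def parse_depfile(text):
    """Parse makefile-style dependency output (the gcc -M format).

    Joins backslash-newline continuations, then splits each
    "target: prereq prereq ..." rule into (target, [prereqs]).
    Phony rules without prerequisites (as emitted by -MP) map to [].
    Note: this naive split would mishandle paths containing ':'.
    """
    text = text.replace("\\\n", " ")
    deps = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        target, _, rest = line.partition(":")
        deps[target.strip()] = rest.split()
    return deps

sample = "file.o: file.c file.h \\\n bar.h\nbar.h:\n"
print(parse_depfile(sample))
# {'file.o': ['file.c', 'file.h', 'bar.h'], 'bar.h': []}
```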
now what any other compiler will do.
This sentence barely passes through my English parser and then fails to link with the rest.
I'm sorry -- read that as "noR", i.e. [nor will cythonize know] what any other compiler will do.
I have no idea how to fix that (design flaw?), and it's currently easier to just use makefiles from the beginning. With makefiles, dependencies are easy and fast, if all involved compilers support it.
Maybe you should start by describing more clearly what exactly is missing in the current setup. Building with make instead of distutils seems like a major complication to me all by itself.
It's still just that "cython does not track (all) build dependencies". But let's make a short story long:

Look at /src/module_list.py within the Sage project. It contains lots of references to headers at hardwired paths. These paths are wrong in most cases, and they require manual messing with build system internals *because* cythonize does not (cannot?) keep track of them.

Building with make (read: autotools) just works the way it always did (plus some obvious quirks that are not currently included in upstream autotools) -- after patching cython.

(I know that many people hate autotools, and I don't want to start a rant about it, but it would be better for everybody if a) make/autotools was taken seriously, and b) the missing functionality were implemented into cython(ize) some day -- start with dependencies, then port/reimplement the AC_* macros.)

regards felix
On Thu, Jun 27, 2013 at 1:26 AM, Felix Salfelder <felix@salfelder.org> wrote:
Hi Stefan.
On Thu, Jun 27, 2013 at 08:58:28AM +0200, Stefan Behnel wrote:
Make doesn't know that either. Cython at least knows which ones are used directly. Handling transitive dependencies would require parsing header files. If you need to keep track of changes in transitively included header files, why not cimport from them in the Cython source to make them an explicit dependency?
Explicit dependency tracking would imply "manual", which is painful and error-prone. Without running gcc -M (with all flags) you cannot even guess the headers used transitively. I haven't found a gcc -M call within the cython source code.
Why would it be needed?
now what any other compiler will do.
This sentence barely passes through my English parser and then fails to link with the rest.
I'm sorry -- read that as "noR", i.e. [nor will cythonize know] what any other compiler will do.
I have no idea how to fix that (design flaw?), and it's currently easier to just use makefiles from the beginning. With makefiles, dependencies are easy and fast, if all involved compilers support it.
Maybe you should start by describing more clearly what exactly is missing in the current setup. Building with make instead of distutils seems like a major complication to me all by itself.
It's still just that "cython does not track (all) build dependencies". But let's make a short story long:
Look at /src/module_list.py within the Sage project. It contains lots of references to headers at hardwired paths. These paths are wrong in most cases, and they require manual messing with build system internals *because* cythonize does not (cannot?) keep track of them.
Ah, I know a bit more here. module_list.py is structured the way it is because it grew up organically, written by people with a wide range of programming backgrounds, and one of the explicit goals of cythonize was (among other things) to remove the need for such explicit and error-prone declarations. module_list.py has not been "simplified" yet because it was a moving target (I think it was rebased something like a dozen times over a period of about a year before we decided to just get cythonize() in and do the module_list cleanup later). It should be entirely sufficient, even for Sage.
Building with make (read: autotools) just works the way it always did (plus some obvious quirks that are not currently included in upstream autotools) -- after patching cython.
Can you explain? Are you saying you can type

  cython -M *.pyx
  make
(I know that many people hate autotools, and I don't want to start a rant about it, but it would be better for everybody if a) make/autotools was taken seriously, and b) the missing functionality were implemented into cython(ize) some day -- start with dependencies, then port/reimplement the AC_* macros.)
One of the goals of Cythonize is to *only* handle the pyx -> c[pp] step, and let other existing tools handle the rest. - Robert
On Thu, Jun 27, 2013 at 09:23:21AM -0700, Robert Bradshaw wrote:
Explicit dependency tracking would imply "manual", which is painful and error-prone. Without running gcc -M (with all flags) you cannot even guess the headers used transitively. I haven't found a gcc -M call within the cython source code.
Why would it be needed?
Well, it is not. If I can use something else to track dependencies (like autotools), that something else takes care of gcc -M. But it then also needs to call cython with -M, to know when cython needs to be called again.
It's still just that "cython does not track (all) build dependencies". But let's make a short story long:
I'm probably wrong here, and it's that other tool, "distutils", that would be responsible. Anyhow, it's cython that I want to write out the dependencies.
Look at /src/module_list.py within the Sage project. It contains lots of references to headers at hardwired paths. These paths are wrong in most cases, and they require manual messing with build system internals *because* cythonize does not (cannot?) keep track of them.
Ah, I know a bit more here. module_list.py is structured the way it is because it grew up organically, written by people with a wide range of programming backgrounds, and one of the explicit goals of cythonize was (among other things) to remove the need for such explicit and error-prone declarations. module_list.py has not been "simplified" yet because it was a moving target (I think it was rebased something like a dozen times over a period of about a year before we decided to just get cythonize() in and do the module_list cleanup later).
It should be entirely sufficient, even for sage.
Sage currently uses hardwired paths for anything and everything, in particular for header locations. It works right now, but the plan is to support packages installed to the host system. And: Sage is just *my* example; it wasn't the original reason for opening #655.
Building with make (read: autotools) just works the way it always did (plus some obvious quirks that are not currently included in upstream autotools) -- after patching cython.
Can you explain? Are you saying you can type
  cython -M *.pyx
  make
No. The input for autotools contains a list of things that you want, for example foo.so. It then creates makefiles that implement the rules that achieve this: for example, foo.so will be built from foo.c, and foo.c from foo.pyx (if foo.pyx exists, of course). Deep down in the rules, the cython -M (and gcc -M) call just does the right thing without you even noticing.
(I know that many people hate autotools, and I don't want to start a rant about it, but it would be better for everybody if a) make/autotools was taken seriously, and b) the missing functionality were implemented into cython(ize) some day -- start with dependencies, then port/reimplement the AC_* macros.)
One of the goals of Cythonize is to *only* handle the pyx -> c[pp] step, and let other existing tools handle the rest.
That's exactly what I want to do: use cython to translate .pyx -> .c[pp] and nothing else. The existing tool (make) needs to know when cython needs to be called, so it has to know the dependency chain. regards felix
On Thu, Jun 27, 2013 at 10:25 AM, Felix Salfelder <felix@salfelder.org> wrote:
On Thu, Jun 27, 2013 at 09:23:21AM -0700, Robert Bradshaw wrote:
Explicit dependency tracking would imply "manual", which is painful and error-prone. Without running gcc -M (with all flags) you cannot even guess the headers used transitively. I haven't found a gcc -M call within the cython source code.
Why would it be needed?
Well, it is not. If I can use something else to track dependencies (like autotools), that something else takes care of gcc -M. But it then also needs to call cython with -M, to know when cython needs to be called again.
And you're planning on calling cython manually, cutting distutils out of the loop completely?
It's still just that "cython does not track (all) build dependencies". But let's make a short story long:
I'm probably wrong here, and it's that other tool, "distutils", that would be responsible. Anyhow, it's cython that I want to write out the dependencies.
Look at /src/module_list.py within the Sage project. It contains lots of references to headers at hardwired paths. These paths are wrong in most cases, and they require manual messing with build system internals *because* cythonize does not (cannot?) keep track of them.
Ah, I know a bit more here. module_list.py is structured the way it is because it grew up organically, written by people with a wide range of programming backgrounds, and one of the explicit goals of cythonize was (among other things) to remove the need for such explicit and error-prone declarations. module_list.py has not been "simplified" yet because it was a moving target (I think it was rebased something like a dozen times over a period of about a year before we decided to just get cythonize() in and do the module_list cleanup later).
It should be entirely sufficient, even for sage.
Sage currently uses hardwired paths for anything and everything, in particular for header locations. It works right now, but the plan is to support packages installed to the host system.
I don't see how that would change anything.
And: Sage is just *my* example; it wasn't the original reason for opening #655.
Building with make (read: autotools) just works the way it always did (plus some obvious quirks that are not currently included in upstream autotools) -- after patching cython.
Can you explain? Are you saying you can type
  cython -M *.pyx
  make
No. The input for autotools contains a list of things that you want, for example foo.so. It then creates makefiles that implement the rules that achieve this: for example, foo.so will be built from foo.c, and foo.c from foo.pyx (if foo.pyx exists, of course).
Deep down in the rules, the cython -M (and gcc -M) call just does the right thing without you even noticing.
(I know that many people hate autotools, and I don't want to start a rant about it, but it would be better for everybody if a) make/autotools was taken seriously, and b) the missing functionality were implemented into cython(ize) some day -- start with dependencies, then port/reimplement the AC_* macros.)
One of the goals of Cythonize is to *only* handle the pyx -> c[pp] step, and let other existing tools handle the rest.
That's exactly what I want to do: use cython to translate .pyx -> .c[pp] and nothing else. The existing tool (make) needs to know when cython needs to be called, so it has to know the dependency chain.
It also needs to know how cython needs to be called, and then how gcc needs to be called (or would you invoke setup.py when any .pyx file changes, in which case you don't need more granular rules?).

In general, I'm +1 on providing a mechanism for exporting dependencies for tools to do with whatever they like. I have a couple of issues with the current approach:

(1) Doing this on a file-by-file basis is quadratic time (which for something like Sage takes unbearably long, as you have to actually read and parse the entire file to understand its dependencies, and then recursively merge them up to the leaves). This could be mitigated (the parsing at least) by writing dep files and re-using them, but it's still going to be sub-optimal. The exact dependencies may also depend on the options passed into cythonize (e.g. the specific include directories, some dynamically computed like numpy_get_includes()).

(2) I don't think we need to co-opt gcc's flags for this. A single flag that writes its output to a named file should be sufficient. No one expects to be able to pass gcc options to Cython, and Cython can be used with more C compilers than just gcc.

(3) The implementation is a bit hackish, with global dictionaries and random printing.

- Robert
Hi Robert. On Thu, Jun 27, 2013 at 11:05:48AM -0700, Robert Bradshaw wrote:
And you're planning on calling cython manually, cutting distutils out of the loop completely?
If someone tells me how to fix distutils (better: does it), I might change my mind. Also, I need VPATH... just something that works.
Sage currently uses hardwired paths for anything and everything, in particular for header locations. It works right now, but the plan is to support packages installed to the host system.
I don't see how that would change anything.
Well, what would $SAGE_LOCAL/include/something.h be then? And how would you tell, without reimplementing the gcc -M functionality?
That's exactly what i want to do. use cython to translate .pyx->.c[pp] and nothing else. the existing tool (make) needs to know when cython needs to be called, so it has to know the dependency chain.
It also needs to know how cython needs to be called, and then how gcc needs to be called (or would you invoke setup.py when any .pyx file changes, in which case you don't need more granular rules?).
autotools takes care of that. For C/C++ this has been working for ages -- automatically. Cython rules need to be added manually (currently, until somebody tweaks autotools a bit). setup.py is not needed.
In general, I'm +1 on providing a mechanism for exporting dependencies for tools to do with whatever they like. I have a couple of issues with the current approach:
(1) Doing this on a file-by-file basis is quadratic time (which for something like Sage takes unbearably long as you have to actually read and parse the entire file to understand its dependencies, and then recursively merge them up to the leaves). This could be mitigated (the parsing at least) by writing dep files and re-using them, but it's still going to be sub-optimal.
I do not understand. The -MF approach writes out dependencies *during* compilation and does no extra parsing. It can't be more efficient (can it? how?). The makefiles maybe are the dep files you are referring to. A second make run will just read the dependency output and compare timestamps.
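The timestamp comparison that make performs against such dependency output can be sketched as follows (a hypothetical illustration of the principle, not code from the patch):

```python
import os

def needs_rebuild(target, prereqs):
    """Mimic make's decision for one rule: rebuild if the target is
    missing, or if any existing prerequisite is newer than the target."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(p) > target_mtime
               for p in prereqs if os.path.exists(p))
```

E.g. after `touch file.pyx`, `needs_rebuild("file.c", ["file.pyx", "file.pxi"])` becomes true, and make re-runs cython; otherwise the second run does nothing.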
The exact dependencies may also depend on the options passed into cythonize (e.g. the specific include directories, some dynamically computed like numpy_get_includes()).
Do you want to change include paths (reconfigure the whole thing) between two runs? In general this won't work without "make clean"... If it's options within a config.h file, it may trigger recompilation of course (also, you can add any sort of dependency, if you want that).
(2) I don't think we need to co-opt gcc's flags for this.
See e.g. /usr/share/automake-1.11/depcomp to get an idea of how many compilers support the -M family. There is at least "hp", "aix", "icc", "tru64", "gcc"; the "sgi" case looks similar. I don't know who started it.
A single flag that writes its output to a named file should be sufficient. No one expects to be able to pass gcc options to Cython, and Cython can be used with more C compilers than just gcc.
Okay, if you feel like it, let's translate -M, -MF, -MD, -MP to something more pythonic. It would be great to use single-letter options, as otherwise the commands are unnecessarily lengthy. My current rules just set -M -MD -MP (== -MDP).
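For reference, in gcc terms -MD writes the dependency rule to a file as a side effect of compilation, and -MP additionally emits an empty phony rule per prerequisite, so make does not abort with "No rule to make target" when a header is deleted. A minimal sketch of such a writer (hypothetical illustration, not the patch's implementation):

```python
def write_depfile(target, prereqs, phony=True):
    """Emit makefile-syntax dependency text: one rule for the target,
    plus (-MP style) an empty rule per prerequisite, so that a deleted
    prerequisite does not break the next make run."""
    lines = ["%s: %s" % (target, " ".join(prereqs))]
    if phony:
        lines.extend("%s:" % p for p in prereqs)
    return "\n".join(lines) + "\n"

print(write_depfile("file.c", ["file.pyx", "file.pxi"]))
# file.c: file.pyx file.pxi
# file.pyx:
# file.pxi:
```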
(3) The implementation is a bit hackish, with global dictionaries and random printing.
I need a global dictionary, as some files are accessed multiple times. How can I avoid this? What is "random printing"? I'm not a cython expert, but with some hints I might be able to improve the patch. thanks felix
On Thu, Jun 27, 2013 at 12:18 PM, Felix Salfelder <felix@salfelder.org> wrote:
Hi Robert.
On Thu, Jun 27, 2013 at 11:05:48AM -0700, Robert Bradshaw wrote:
And you're planning on calling cython manually, cutting distutils out of the loop completely?
If someone tells me how to fix distutils (better: does it), I might change my mind. Also, I need VPATH... just something that works.
Sage currently uses hardwired paths for anything and everything, in particular for header locations. It works right now, but the plan is to support packages installed to the host system.
I don't see how that would change anything.
Well, what would $SAGE_LOCAL/include/something.h be then? And how would you tell, without reimplementing the gcc -M functionality?
OK, currently it resolves this to an absolute path (only because it's ephemeral information that's due to be tossed anyways).
That's exactly what I want to do: use cython to translate .pyx -> .c[pp] and nothing else. The existing tool (make) needs to know when cython needs to be called, so it has to know the dependency chain.
It also needs to know how cython needs to be called, and then how gcc needs to be called (or would you invoke setup.py when any .pyx file changes, in which case you don't need more granular rules?).
autotools takes care of that. For C/C++ this has been working for ages -- automatically. Cython rules need to be added manually (currently, until somebody tweaks autotools a bit). setup.py is not needed.
Building Python extensions with makefiles/autotools rather than distutils is less supported, but I suppose you could do that manually.
In general, I'm +1 on providing a mechanism for exporting dependencies for tools to do with whatever they like. I have a couple of issues with the current approach:
(1) Doing this on a file-by-file basis is quadratic time (which for something like Sage takes unbearably long as you have to actually read and parse the entire file to understand its dependencies, and then recursively merge them up to the leaves). This could be mitigated (the parsing at least) by writing dep files and re-using them, but it's still going to be sub-optimal.
I do not understand. The -MF approach writes out dependencies *during* compilation and does no extra parsing. It can't be more efficient (can it? how?).
Currently cythonize() allows you to determine quickly upfront what needs to be compiled *without* actually compiling anything. I suppose the idea is that by default you compile everything, and the next time around you have some kind of artifact that lets you understand the dependencies better? But you still need a rule for the initial run, right? It would still help if you posted exactly how you're using it. E.g. here's a set of .pyx files, I run this to generate some make files, which invokes "cython -M ..."
The makefiles maybe are the dep files you are referring to. A second make run will just read the dependency output and compare timestamps.
The exact dependencies may also depend on the options passed into cythonize (e.g. the specific include directories, some dynamically computed like numpy_get_includes()).
Do you want to change include paths (reconfigure the whole thing) between two runs? In general this won't work without "make clean"... If it's options within a config.h file, it may trigger recompilation of course
(also, you can add any sort of dependency, if you want that).
(2) I don't think we need to co-opt gcc's flags for this.
See e.g. /usr/share/automake-1.11/depcomp to get an idea of how many compilers support the -M family. There is at least "hp", "aix", "icc", "tru64", "gcc"; the "sgi" case looks similar. I don't know who started it.
A single flag that writes its output to a named file should be sufficient. No one expects to be able to pass gcc options to Cython, and Cython can be used with more C compilers than just gcc.
Okay, if you feel like it, let's translate -M, -MF, -MD, -MP to something more pythonic. It would be great to use single-letter options, as otherwise the commands are unnecessarily lengthy. My current rules just set -M -MD -MP (== -MDP).
OK, fair point. I suppose we can go with -M[x].
(3) The implementation is a bit hackish, with global dictionaries and random printing.
I need a global dictionary, as some files are accessed multiple times. How can I avoid this? What is "random printing"?
I'm not a cython expert, but with some hints I might be able to improve the patch.
File a pull request and I'll take another look. - Robert
On Thu, Jun 27, 2013 at 12:39:48PM -0700, Robert Bradshaw wrote:
Building Python extensions with makefiles/autotools rather than distutils is less supported, but I suppose you could do that manually.
I've done that. I ran into a few peculiarities with "-I", "-w", and __init__.py, but nothing serious. The only thing that will affect the user (and which I should mention) is: make uses file extensions to determine file types. In particular, I have not found a portable hack that allows the use of .pyx for both .c and .cpp within the scope of one makefile yet...
Currently cythonize() allows you to determine quickly upfront what needs to be compiled *without* actually compiling anything. I suppose the idea is that by default you compile everything, and the next time around you have some kind of artifact that lets you understand the dependencies better?
Yes, the dependencies for the first run are empty and are filled/refreshed during the cython run. No upfront determination required.
But you still need a rule for the initial run, right?
The rules are the same each time; if the target doesn't exist, it will be made regardless of how empty the dependency file is.
It would still help if you posted exactly how you're using it. E.g. here's a set of .pyx files, I run this to generate some make files, which invokes "cython -M ..."
The basic principle to build file.so out of file.pyx is this:

  $ cat Makefile
  # handwritten example demo
  all: file.so

  %.so: %.c
          gcc -shared -fpic $(CFLAGS) $(CPPFLAGS) $< -o $@ -M -MD -MP

  %.c: %.pyx
          cython $< -o $@ -MD -MP

  include file.pyx.d file.cP

  $ : > file.pyx.d
  $ : > file.cP                   # these files are initially empty
  $ vim file.pyx                  # write down some code using "file.pxi" somehow
  [..]
  $ vim file.pxi                  # write down some code using <file.h> somehow
  [..]
  $ make file.so                  # will find the .pyx -> .c -> .so chain
  cython file.pyx -o file.c -MD -MP               # creates file.c and file.pyx.d
  [..]
  gcc -shared -fpic file.c -o file.so -M -MD -MP  # creates file.{so,cP}
  [..]
  $ cat file.pyx.d                # the dependency makefile cython has created
  file.c: file.pxi
  file.pxi:
  $ cat file.cP                   # the dependency makefile gcc has written
  file.so: /path/to/file.h
  /path/to/file.h:
  $ make                          # doesn't do anything after checking timestamps.
  $ touch file.pyx; make          # calls cython, gcc
  [..]
  $ touch /path/to/file.h; make   # just calls gcc
  [..]
  $ touch file.pxi; make          # calls both
  [..]

(let's hope that this is syntactically correct and halfway comprehensible)

Eventually, it (i.e. what I've implemented for Sage) could look more like this:

  $ ./configure   # creates Makefiles from templates
  $ make
    CYTH file.c
    CC file.lo
    LD file.so
  $ touch file.h; make
    CC file.lo
    LD file.so
  ...

(automake will call the linker separately, to increase portability or something)
File a pull request and I'll take another look.
Within the next few days... thanks felix
Felix Salfelder, 27.06.2013 23:06:
On Thu, Jun 27, 2013 at 12:39:48PM -0700, Robert Bradshaw wrote:
Building Python extensions with makefiles/autotools rather than distutils is less supported, but I suppose you could do that manually.
I've done that. I ran into a few peculiarities with "-I", "-w", and __init__.py, but nothing serious.
The only thing that will affect the user (and which I should mention) is: make uses file extensions to determine file types. In particular, I have not found a portable hack that allows the use of .pyx for both .c and .cpp within the scope of one makefile yet...
That's unfortunate, but not too serious either. Larger projects may end up using both, but since it should often work to compile everything in C++ mode, and the right way to do it is distutils anyway, people who really want to go through the hassle of using make will just have to live with it. In the worst case, you could spell out the build targets explicitly (i.e. not as patterns) in the makefile, as part of the dependencies.
Currently cythonize() allows you to determine quickly upfront what needs to be compiled *without* actually compiling anything. I suppose the idea is that by default you compile everything, and the next time around you have some kind of artifact that lets you understand the dependencies better?
Yes, the dependencies for the first run are empty and are filled/refreshed during the cython run. No upfront determination required.
But you still need a rule for the initial run, right?
The rules are the same each time; if the target doesn't exist, it will be made regardless of how empty the dependency file is.
It would still help if you posted exactly how you're using it. E.g. here's a set of .pyx files, I run this to generate some make files, which invokes "cython -M ..."
The basic principle to build file.so out of file.pyx is this:

  $ cat Makefile
  # handwritten example demo
  all: file.so

  %.so: %.c
          gcc -shared -fpic $(CFLAGS) $(CPPFLAGS) $< -o $@ -M -MD -MP

  %.c: %.pyx
          cython $< -o $@ -MD -MP

  include file.pyx.d file.cP

  $ : > file.pyx.d
  $ : > file.cP   # these files are initially empty
Hmm, does that mean you either have to create them manually before the first run, and/or you have to manually collect all dependency files for the "include" list? And then keep track of them yourself when you add new files? I suppose you could also use a wildcard file search to build the list of include files?
  $ vim file.pyx                  # write down some code using "file.pxi" somehow
  [..]
  $ vim file.pxi                  # write down some code using <file.h> somehow
  [..]
  $ make file.so                  # will find the .pyx -> .c -> .so chain
  cython file.pyx -o file.c -MD -MP               # creates file.c and file.pyx.d
  [..]
  gcc -shared -fpic file.c -o file.so -M -MD -MP  # creates file.{so,cP}
  [..]
  $ cat file.pyx.d                # the dependency makefile cython has created
  file.c: file.pxi
  file.pxi:
  $ cat file.cP                   # the dependency makefile gcc has written
  file.so: /path/to/file.h
  /path/to/file.h:
  $ make                          # doesn't do anything after checking timestamps.
  $ touch file.pyx; make          # calls cython, gcc
  [..]
  $ touch /path/to/file.h; make   # just calls gcc
  [..]
  $ touch file.pxi; make          # calls both
  [..]
(let's hope that this is syntactically correct and halfway comprehensible)
Eventually, it (i.e. what I've implemented for Sage) could look more like this:

  $ ./configure   # creates Makefiles from templates
  $ make
    CYTH file.c
    CC file.lo
    LD file.so
  $ touch file.h; make
    CC file.lo
    LD file.so
  ...

(automake will call the linker separately, to increase portability or something)
If you want this feature to go in, I think you should write up documentation for it, so that other people can actually use it as well. Even writing a correct makefile requires a huge amount of digging into distutils.

Here are examples for portable makefiles: https://github.com/cython/cython/tree/master/Demos/embed

It would be nice to have a similar demo setup for a complete make build. The textual documentation should go here: http://docs.cython.org/src/reference/compilation.html (see the .rst files in docs/src/)

Stefan
On Fri, Jun 28, 2013 at 06:39:45AM +0200, Stefan Behnel wrote:
the only thing that will affect the user (and which i should mention) is: make uses file extensions to determine file types. particularly, i have not found a portable hack that allows the use of .pyx for both .c and .cpp within the scope of one makefile yet...
That's unfortunate, but not too serious either.
It's bearable, and it's just automake; manually created makefiles are not affected. Why does Cython use .pyx for two different things? What would be the canonical extension for C++ Cython files? I've chosen to use .pyxx; any better options?
Hmm, does that mean you either have to create them manually before the first run, and/or you have to manually collect all dependency files for the "include" list? And then keep track of them yourself when you add new files?
My goal is to use autotools for that sort of stuff. But yes, you can always do everything manually.
I suppose you could also use a wildcard file search to build the list of include files?
Wildcards are seen as bad [1]. That might be a matter of taste, of course. With manually created makefiles, you can do whatever you like best.
(automake will call the linker separately, to increase portability or something)
If you want this feature to go in, I think you should write up documentation for it, so that other people can actually use it as well.
I have; type cython --help. This should be copied into the manpage, but we need to agree on the choice of option names first (and some other details; see the pull request on GitHub).
Even writing a correct makefile requires a huge amount of digging into distutils.
Makefiles are documented in the make manual of your favourite make implementation. I don't know why you would want to dig into distutils.

regards
felix

[1] http://www.gnu.org/software/automake/manual/html_node/Wildcards.html
Felix Salfelder, 23.07.2013 11:25:
On Fri, Jun 28, 2013 at 06:39:45AM +0200, Stefan Behnel wrote:
Even writing a correct makefile requires a huge amount of digging into distutils.
Makefiles are documented in the make manual of your favourite make implementation. I don't know why you would want to dig into distutils.
In order to figure out how to build C extensions in a portable way, e.g.

https://github.com/cython/cython/blob/master/Demos/embed/Makefile

The sysconfig module seems to be amongst the even more underdocumented parts of distutils.

Stefan
On Tue, Jul 23, 2013 at 2:25 AM, Felix Salfelder <felix@salfelder.org> wrote:
On Fri, Jun 28, 2013 at 06:39:45AM +0200, Stefan Behnel wrote:
The only thing that will affect the user (and which I should mention) is: make uses file extensions to determine file types. In particular, I have not found a portable hack that allows the use of .pyx for both .c and .cpp within the scope of one makefile yet...
That's unfortunate, but not too serious either.
It's bearable, and it's just automake; manually created makefiles are not affected. Why does Cython use .pyx for two different things? What would be the canonical extension for C++ Cython files? I've chosen to use .pyxx; any better options?
Though there are some features only available with C++, I don't consider this "two different things" as the similarities far, far outweigh the differences.
Hmm, does that mean you either have to create them manually before the first run, and/or you have to manually collect all dependency files for the "include" list? And then keep track of them yourself when you add new files?
My goal is to use autotools for that sort of stuff. But yes, you can always do everything manually.
I suppose you could also use a wildcard file search to build the list of include files?
Wildcards are seen as bad [1]. That might be a matter of taste, of course. With manually created makefiles, you can do whatever you like best.
(automake will call the linker separately, to increase portability or something)
If you want this feature to go in, I think you should write up documentation for it, so that other people can actually use it as well.
I have; type cython --help. This should be copied into the manpage, but we need to agree on the choice of option names first (and some other details; see the pull request on GitHub).
Even writing a correct makefile requires a huge amount of digging into distutils.
Makefiles are documented in the make manual of your favourite make implementation. I don't know why you would want to dig into distutils.
To expand on what Stefan wrote, distutils automatically figures out the flags, linker options, headers, etc. needed to make an extension compatible with the specific Python on your system, which one would otherwise either have to do manually or "extract" from distutils.

- Robert
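The stdlib sysconfig module exposes most of the values Robert mentions; a hedged sketch of what a hand-written makefile would have to extract (config variable availability varies across platforms and Python versions):

```python
import sysconfig

# Header directory needed to compile any extension module (the -I flag)
include_dir = sysconfig.get_paths()["include"]

# Platform-specific suffix for extension modules,
# e.g. ".cpython-311-x86_64-linux-gnu.so" on Linux
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")

# Compiler and flags distutils would itself use (may be None on some builds)
cc = sysconfig.get_config_var("CC")
cflags = sysconfig.get_config_var("CFLAGS")

print("CFLAGS += -I" + include_dir)
```

These are exactly the pieces the Demos/embed Makefile pulls out of Python to stay portable.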
On Fri, Aug 02, 2013 at 10:36:48PM -0700, Robert Bradshaw wrote:
It's bearable, and it's just automake; manually created makefiles are not affected. Why does Cython use .pyx for two different things? What would be the canonical extension for C++ Cython files? I've chosen to use .pyxx; any better options?
Though there are some features only available with C++, I don't consider this "two different things" as the similarities far, far outweigh the differences.
Well, there are two build chains, .pyx -> .c -> .so and .pyx -> .cc -> .so. Without additional information, you cannot tell how to resolve .pyx -> .so. Usually (outside distutils) there's a tendency towards using different suffixes for different purposes.
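The suffix-based dispatch that make relies on can be sketched as a lookup table; note that the .pyxx suffix here is Felix's proposal from this thread, not an established Cython convention:

```python
import os

# Hypothetical suffix convention disambiguating the two build chains
SUFFIX_TO_CHAIN = {
    ".pyx":  ("c",  "gcc"),   # .pyx  -> .c  -> .so
    ".pyxx": ("cc", "g++"),   # .pyxx -> .cc -> .so
}

def build_chain(source):
    """Return the intermediate/output files and the compiler for a source."""
    base, ext = os.path.splitext(source)
    inter_ext, compiler = SUFFIX_TO_CHAIN[ext]
    return [source, base + "." + inter_ext, base + ".so"], compiler

print(build_chain("file.pyx"))
# (['file.pyx', 'file.c', 'file.so'], 'gcc')
```

With one suffix per chain, a makefile can express each step as an ordinary pattern rule, which is exactly what a single .pyx suffix for both C and C++ output prevents.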
To expand on what Stefan wrote, distutils automatically figures out the flags, linker options, headers, etc. to make an extension compatible with that specific Python on your system, which one would either have to do manually or "extract" from distutils otherwise.
This has already been implemented (outside of, without, and before distutils); see [1] for how it may work for Python extensions. It's a bit beyond the scope of Cython, and really beyond the scope of my proposed patch.

regards
felix

[1] http://www.gnu.org/software/automake/manual/html_node/Python.html
participants (3)
- Felix Salfelder
- Robert Bradshaw
- Stefan Behnel