From: "M.-A. Lemburg"
My C extensions (are supposed to) share code and data. But distutils insists that each extension be self-contained and include everything that it depends on. [....] Hmm... come to think of it, I guess I could fake it with ext_modules = [Extension('foo', sources=['foo.cc', 'bar.cc'])] and then call initbar() from initfoo(). I'd like to find a cleaner solution, though.
That won't work: Python uses the DLL name to determine the name of the init function, so in your example it will only ever look for and call initfoo() -- initbar() is never invoked, and "import bar" finds no bar DLL to load.
I tried it out and I see what you mean. I figure the point is that the .pyd isn't (necessarily) loaded until "import foo" forces it, so I can't expect "import bar" to work on its own. But if I require that "import bar" always be preceded by "import foo", it seems to work. Yes, I know, "seems" -- but it doesn't need to be completely polished, and it doesn't need to be portable; this is just for an internal distribution whose current user count is 2.
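For what it's worth, the reason the "import foo first" trick works is presumably that initbar(), when called from initfoo(), registers the new module object under the name "bar" in sys.modules; any later "import bar" is then satisfied straight from sys.modules and never goes looking for a bar.pyd. A pure-Python sketch of that mechanism (the module name and attribute here are made up for illustration):

```python
import sys
import types

# Simulate what initbar() does when called from initfoo():
# Py_InitModule() registers the freshly created module object
# in sys.modules under its name.
bar = types.ModuleType('bar')
bar.answer = 42
sys.modules['bar'] = bar

# A later "import bar" is satisfied directly from sys.modules,
# so no bar.pyd is ever searched for on disk.
import bar
print(bar.answer)  # -> 42
```

This also explains why the ordering requirement exists: until initfoo() has run, nothing has put "bar" into sys.modules.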
If you want to share data between extensions, the right thing to do is to wrap the data in a PyCObject which you then access using the standard Python import mechanisms.
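Concretely, the convention (used historically by modules such as cStringIO) is for the exporting extension to wrap a pointer -- often a struct of C function pointers -- in a PyCObject and publish it as a module attribute, which the importing extension retrieves via a normal import. A Python-level sketch of the lookup side; the real work would of course happen in C with PyCObject_FromVoidPtr()/PyCObject_AsVoidPtr(), and the names "foo" and "_C_API" are illustrative:

```python
import sys
import types

# Stand-in for the exporting extension: in real C code, initfoo() would
# wrap a void* (e.g. a table of C functions) in a PyCObject and store it
# as a module attribute; "_C_API" is a convention, not a requirement.
foo = types.ModuleType('foo')
foo._C_API = ('pretend', 'this', 'wraps', 'a', 'void*')
sys.modules['foo'] = foo

# Stand-in for the importing extension: in real C code, initbar() would
# call PyImport_ImportModule("foo"), fetch the attribute, and unwrap the
# pointer with PyCObject_AsVoidPtr().
import foo as provider
api = provider._C_API
```

The point of going through the import machinery is that it handles load order and lookup for you: whichever extension imports the other forces it to be initialized first.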
Ok, I took a look at PyCObjects and I can see how this is the way to go for a serious Python extension with wide distribution. It would, at the cost of some added complexity and void*-magic, solve part of the problem. But not all of it -- it's no good for using Python to wrap source trees with internal dependencies. Say I have three C files, foo.c, bar.c and shared.c, that I want to access from Python as separate modules, and to do that I write some wrapper code in pyFoo.c, pyBar.c and pyShared.c. foo.c and bar.c both call functions in shared.c. I can't use PyCObjects here without modifying the original source code. But the original source code is used in other places too and shouldn't depend on Python. - Anders
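One way to handle exactly this layout without touching the original C at all is to compile shared.c into every extension that calls into it. The object code is duplicated in each .pyd, but foo.c, bar.c and shared.c stay Python-free. A sketch of such a setup.py, using the file names from the example above (shown with the setuptools Extension class, which takes the same arguments as the distutils one):

```python
# setup.py sketch: shared.c is listed in the sources of every extension
# that needs it, so none of the original C files require changes.
from setuptools import Extension

ext_modules = [
    Extension('foo',    sources=['pyFoo.c', 'foo.c', 'shared.c']),
    Extension('bar',    sources=['pyBar.c', 'bar.c', 'shared.c']),
    Extension('shared', sources=['pyShared.c', 'shared.c']),
]

# setup(name='mypkg', ext_modules=ext_modules) would follow here.
```

The trade-off is that any static data in shared.c now exists once per extension rather than once per process -- which is precisely the case where the PyCObject route from the previous message is still needed.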