Hi everybody,

I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope. Here are the results...

Demo package structure (see the attached demopkg.zip):

    [a]
        [b]
            bc.py
        ab.py

With the attached patch you can do the following:

    # Pretty useless...
    import a.b.__.ab

    # From inside bc.py:
    from __ import ab

    # At top-level (also useless, but shows how this situation is handled):
    import __.sys
    # __ is bound to None since we are at top-level; sys is still
    # being loaded though.

Of course, common usage will be of the form:

    from __.__ import submodule_at_higher_level

Please tell me what you think and give it a try. It's a first try and may have some design errors. Especially the way head and tail are treated in Python/import.c:import_module_ex may cause trouble -- I need help here.

Note: The patch is against the CVS version. If you run Python in verbose mode, the patch will produce some verbose output of what it's doing.
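[Editor's note: for concreteness, the demo layout can be reproduced with a short sketch in today's Python. Only the names a, b, ab.py and bc.py come from the demo; the module contents (an `answer` variable) are invented for illustration. The sketch shows the absolute intra-package import that `from __ import ab` in bc.py would replace:]

```python
import os
import sys
import tempfile

# Recreate the demo layout in a scratch directory:
#   a/__init__.py, a/ab.py, a/b/__init__.py, a/b/bc.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "a", "b"))

open(os.path.join(root, "a", "__init__.py"), "w").close()
open(os.path.join(root, "a", "b", "__init__.py"), "w").close()

with open(os.path.join(root, "a", "ab.py"), "w") as f:
    f.write("answer = 42\n")

with open(os.path.join(root, "a", "b", "bc.py"), "w") as f:
    # Under the patch, bc.py could say "from __ import ab" instead of
    # hard-wiring the absolute location of its own parent package:
    f.write("from a import ab\nanswer = ab.answer\n")

sys.path.insert(0, root)
import a.b.bc
print(a.b.bc.answer)  # -> 42
```

The point of the relative form is that bc.py would no longer need to know that its parent package is called `a`, so the whole tree could be moved elsewhere without editing bc.py.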
Enjoy,
--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 113 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/

--- /home/lemburg/orig/Python/Python/import.c	Fri Apr  9 19:00:51 1999
+++ Python/import.c	Fri Sep 10 20:51:02 1999
@@ -1572,10 +1572,14 @@ load_next(mod, altmod, p_name, buf, p_bu
 	char *dot = strchr(name, '.');
 	int len;
 	char *p;
 	PyObject *result;
 
+	if (Py_VerboseFlag)
+		printf("# load_next: (1) name='%s', buf='%.*s'\n",
+		       name,*p_buflen,buf);
+
 	if (dot == NULL) {
 		*p_name = NULL;
 		len = strlen(name);
 	}
 	else {
@@ -1586,10 +1590,39 @@ load_next(mod, altmod, p_name, buf, p_bu
 		PyErr_SetString(PyExc_ValueError,
 				"Empty module name");
 		return NULL;
 	}
 
+	/* Handle "__" indicator telling the import mechanism to
+	   continue the search one level higher in the package
+	   hierarchy */
+	if (strncmp(name,"__",len) == 0) {
+		PyObject *modules = PyImport_GetModuleDict();
+
+		/* Strip the final dotted name from buf */
+		dot = strrchr(buf, '.');
+		if (dot == NULL)
+			*p_buflen = 0;
+		else
+			*p_buflen = dot - buf;
+		buf[*p_buflen] = '\0';
+
+		/* Fetch the parent module or revert to a top-level search */
+		if (*p_buflen > 0) {
+			mod = PyDict_GetItemString(modules,buf);
+			if (mod == NULL) {
+				PyErr_SetString(PyExc_SystemError,
+						"Parent module missing");
+				return NULL;
+			}
+		}
+		else
+			mod = Py_None;
+		Py_INCREF(mod);
+		return mod;
+	}
+
 	p = buf + *p_buflen;
 	if (p != buf)
 		*p++ = '.';
 	if (p+len-buf >= MAXPATHLEN) {
 		PyErr_SetString(PyExc_ValueError,
@@ -1597,10 +1630,14 @@ load_next(mod, altmod, p_name, buf, p_bu
 		return NULL;
 	}
 	strncpy(p, name, len);
 	p[len] = '\0';
 	*p_buflen = p+len-buf;
+
+	if (Py_VerboseFlag)
+		printf("# load_next: (2) modname='%s', fullname=buf='%s'\n",
+			p,buf);
 
 	result = import_submodule(mod, p, buf);
 	if (result == Py_None && altmod != mod) {
 		Py_DECREF(result);
 		/* Here, altmod must be None and mod must not be None */
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope.
I'd much rather use absolute package names for anything that's not in the same directory as the current module. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope.
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path, which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope, or Queue in mxQueue (to be released) and Mark's win32 stuff. As more and more packages appear, we run into this problem more and more often.

Relative imports would greatly reduce the risk of these collisions, because packages could be written self-contained, meaning that they can reach their internal modules via relative paths only and don't have to know about the absolute paths by which they are reachable from other parts of the Python universe.

I could then make all my packages self-contained and distribute them in two forms without having to change a single line for the added support:

1. under the new 'mx' package, e.g. mx.DateTime
2. for backward compatibility under 'DateTime'

Another major advantage is that I could take any other self-contained package and install it under absolute paths of my choice, e.g. put Zope under org.zope.core, Python under org.python.core etc., without harming their functionality or having to dive deep into their import structures to fix them manually.

To further enhance this mechanism I would like to have an alias mechanism in import, pickle et al. so that changes in the package structures become manageable without user intervention: pickles are a major problem whenever import structures change because they store absolute module names.

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 111 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
[Guido]
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
[M.-A. Lemburg]
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope or Queue in mxQueue (to be released) and Mark's win32 stuff.
As more and more packages appear, we run into this problem more and more often.
I never understand package complaints. Maybe it's the imprecision of the language people use, or maybe it's because people don't give fully fleshed-out examples. Whatever, in the end, I never have any idea what the complaint is, or in what way the solution is "solving" anything.

In the above, "absolute" doesn't appear to mean "absolute" in any OS-sensible sense. So what does it really mean? Does it mean the same things to Guido and MAL?

In MAL's hint of examples, I don't see any problem. If mxDateTime unpacks itself into a directory named DateTime, then *of course* it's going to collide with other packages that want to do likewise. Install it into mxDateTime instead, and take "absolute" to mean "any module that wants an mxDateTime service and does not itself live directly in mxDateTime/ must import the desired module via a path beginning 'mxDateTime.'", and everything looks straightforward to me (and that outcome makes me infer that this is thus probably what Guido has in mind too).

Similarly for win32.Queue vs mxQueue.Queue vs the Queue in the std library (it *would* be good to have an explicit way to say "std library" -- "Lib." comes to mind).
I could then make all my packages self-contained and distribute them in two forms without having to change a single line for the added support:
1. under the new 'mx' package, e.g. mx.DateTime 2. for backward compatibility under 'DateTime'
Ah, so that's what this is about. I vote screw #2. Releasing it that way was a mistake. Better to live with the brief & finite pain of repairing it than complicating Python to cover up for it.
Another major advantage is that I could take any other self-contained package and install it under absolute paths of my choice, e.g. put Zope under org.zope.core, Python under org.python.core etc., without harming their functionality or having to dive deep into their import structures to fix them manually.
I view that not as an advantage but as harmful complication. Zope etc add great value to a Python installation, and when I write a killer app full of "import zope.this" and "import zope.that", I don't want the possibility that it's not going to work on my client's machine just because their sysadmin installed Zope into some silly site-specific path named after his soon-to-be ex-girlfriend <wink>. I don't want a way to work around him doing that, either: I don't want him to be able to screw me to begin with.
To further enhance this mechanism I would like to have an alias mechanism in import, pickle et al. so that changes in the package structures become manageable without user intervention: pickles are a major problem whenever import structures change because they store absolute module names.
This is a different issue, and may have merits of its own. WRT the relative import scheme, its advantage seems to lie in providing a way to partially recover from the damage the new scheme causes <0.5 wink>. As is, the package name used by a release is part of its published interface. You can't change it without causing pain, any more than you can make incompatible changes to public class methods or input-output behavior. In return, package clients are uniform, simple and portable, making life easiest for the people who know least. The burden is on package authors to choose names wisely, and that's where the burden should be. if-100-pkgs-all-want-their-own-queue-they-already-can-ly y'rs - tim
Tim Peters wrote:
[Guido]
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
[M.-A. Lemburg]
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope or Queue in mxQueue (to be released) and Mark's win32 stuff.
As more and more packages appear, we run into this problem more and more often.
I never understand package complaints. Maybe it's the imprecision of the language people use, or maybe it's because people don't give fully fleshed-out examples. Whatever, in the end, I never have any idea what the complaint is, or in what way the solution is "solving" anything.
My original post contained an example package using relative imports. The example uses intra-package imports across subpackage levels, which currently is only possible using absolute module names (see below for a definition).

Note that I did not want to start a discussion about absolute vs. relative names (I believe everybody agrees that relative file names are a Good Thing). The 'import __.module' thing is not new: ni.py had support for this and my patch simply adds it back to the implementation.

Think of the '__' as an additional feature that authors can use at their own will. They don't *have* to, just as you don't have to use relative file paths in your config files. It's just a convenience that some authors may want to use in their code.
In the above, "absolute" doesn't appear to mean "absolute" in any OS-sensible sense. So what does it really mean? Does it mean the same things to Guido and MAL?
"Absolute" means fully qualified name, i.e. the complete path to the modules from the top-level root via all subpackage down to the module name itself, e.g. TextTools.Constants.TagTable.
In MAL's hint of examples, I don't see any problem. If mxDateTime unpacks itself into a directory named DateTime, then *of course* it's going to collide with other packages that want to do likewise. Install it into mxDateTime instead, and take "absolute" to mean "any module that wants an mxDateTime service and does not itself live directly in mxDateTime/ must import the desired module via a path beginning 'mxDateTime.'", and everything looks straightforward to me (and that outcome makes me infer that this is thus probably what Guido has in mind too).
This is what I intend to do: move all my stuff under a 'mx' package and then continue reinventing (faster, different, doesn't-slip-when-wet) wheels ;-). BTW, the queue thingie is just a proof of concept fun project.
Similarly for win32.Queue vs mxQueue.Queue vs the Queue in the std library (it *would* be good to have an explicit way to say "std library" -- "Lib." comes to mind).
I'd prefer 'org.python.core.' or 'stdlib.' or just simply 'python.' but that's a different issue.
I could then make all my packages self-contained and distribute them in two forms without having to change a single line for the added support:
1. under the new 'mx' package, e.g. mx.DateTime 2. for backward compatibility under 'DateTime'
Ah, so that's what this is about. I vote screw #2. Releasing it that way was a mistake.
Not until Zope went Open Source ;-) ...
Better to live with the brief & finite pain of repairing it than complicating Python to cover up for it.
Screwing #2 is not possible unless I want all those already stored DateTime pickles to fail loading... ok, I could probably provide some kind of compatibility package which then redirects the import to mx.DateTime.
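[Editor's note: such a compatibility redirect can be sketched with nothing but sys.modules: install the package under its new name and register the old top-level name as an alias, so that old pickles (which store the absolute dotted name) still resolve. All module and class objects below are hypothetical in-memory stand-ins for the real packages:]

```python
import pickle
import sys
import types

# Stand-ins for the real packages (hypothetical, built in memory):
mx = types.ModuleType("mx")
mx_dt = types.ModuleType("mx.DateTime")
mx.DateTime = mx_dt

class DateTime:
    pass

DateTime.__module__ = "mx.DateTime"   # the class's new absolute home
mx_dt.DateTime = DateTime

sys.modules["mx"] = mx
sys.modules["mx.DateTime"] = mx_dt

# The compatibility alias: old pickles say "DateTime", not "mx.DateTime".
# __import__ consults sys.modules first, so the alias satisfies them.
sys.modules["DateTime"] = mx_dt

# A protocol-0 pickle referencing the class by its *old* absolute name:
old_pickle = b"cDateTime\nDateTime\n."
assert pickle.loads(old_pickle) is DateTime
```

A real compatibility package could achieve the same effect from a DateTime/__init__.py that imports mx.DateTime and registers itself accordingly.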
Another major advantage is that I could take any other self-contained package and install it under absolute paths of my choice, e.g. put Zope under org.zope.core, Python under org.python.core etc., without harming their functionality or having to dive deep into their import structures to fix them manually.
I view that not as an advantage but as harmful complication. Zope etc add great value to a Python installation, and when I write a killer app full of "import zope.this" and "import zope.that", I don't want the possibility that it's not going to work on my client's machine just because their sysadmin installed Zope into some silly site-specific path named after his soon-to-be ex-girlfriend <wink>. I don't want a way to work around him doing that, either: I don't want him to be able to screw me to begin with.
Zope is not a package AFAIK (and probably never will be due to the pickle complications), so 'import zope.this' won't work anyways unless you add a Zope wrapping package of your own -- and this will only work for Zope modules not relying on other Zope modules unless they use relative imports.
To further enhance this mechanism I would like to have an alias mechanism in import, pickle et al. so that changes in the package structures become manageable without user intervention: pickles are a major problem whenever import structures change because they store absolute module names.
This is a different issue, and may have merits of its own. WRT the relative import scheme, its advantage seems to lie in providing a way to partially recover from the damage the new scheme causes <0.5 wink>.
I'm not proposing a new scheme... only a convenience for package authors.
As is, the package name used by a release is part of its published interface. You can't change it without causing pain, any more than you can make incompatible changes to public class methods or input-output behavior. In return, package clients are uniform, simple and portable, making life easiest for the people who know least. The burden is on package authors to choose names wisely, and that's where the burden should be.
Sure, but we are heading into these kinds of problems just now and have no proper solution at hand. Simply copping out by blaming the package authors is not the right way to handle the situation; referring them to some global name registry (the one at NIST) isn't either.

As the Zope example shows, it's not easy to move from half-packaged to fully packaged (relative imports will ease this move, though) due to external references using absolute object type/class names. Changes in the package structure produce the same problems. It's not so much the burden of the programmer having to adjust his code to the new layout I'm talking about here: it's the user with all his data stored in object databases that worries me.
if-100-pkgs-all-want-their-own-queue-they-already-can-ly y'rs - tim
Not if just one of them decides to make it top-level... grabbing the name until eternity ;-)

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 109 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
"M.-A. Lemburg" wrote:
Tim Peters wrote:
[Guido]
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
[M.-A. Lemburg]
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope or Queue in mxQueue (to be released) and Mark's win32 stuff.
As more and more packages appear, we run into this problem more and more often.
I never understand package complaints. Maybe it's the imprecision of the language people use, or maybe it's because people don't give fully fleshed-out examples. Whatever, in the end, I never have any idea what the complaint is, or in what way the solution is "solving" anything.
My original post contained an example package using relative imports. The example uses intra-package imports across subpackage levels which currently is only possible using absolute module names (see below for a definition).
I agree that this is a valid complaint about the current scheme.
Note that I did not want to start a discussion about absolute vs. relative names (I believe everybody agrees that relative file names are a Good Thing). The 'import __.module' thing is not new: ni.py had support for this and my patch simply adds it back to the implementation.
And, in fact, the current scheme does let you use relative paths to go down. (snip)
I could then make all my packages self-contained and distribute them in two forms without having to change a single line for the added support:
1. under the new 'mx' package, e.g. mx.DateTime 2. for backward compatibility under 'DateTime'
Ah, so that's what this is about. I vote screw #2. Releasing it that way was a mistake.
Not until Zope went Open Source ;-) ...
Better to live with the brief & finite pain of repairing it than complicating Python to cover up for it.
Screwing #2 is not possible unless I want all those already stored DateTime pickles to fail loading... ok, I could probably provide some kind of compatibility package which then redirects the import to mx.DateTime.
Regardless of how you spell the import, the pickles *must* reflect the absolute path. Otherwise, pickleability depends on where you unpickle. If you rearrange packages, or rename modules, there is a pickling issue. This is clearly a problem in need of a better solution. (snip)
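[Editor's note: Jim's point is easy to verify: an object pickled "by reference" records the absolute dotted module path of its class and nothing else about it. A minimal demonstration using a stdlib class (protocol 0, where the name is visible as plain text):]

```python
import pickle
from decimal import Decimal

# The pickle stream stores the class's absolute dotted location
# ("decimal" / "Decimal"), not the class definition itself.
data = pickle.dumps(Decimal("1.5"), protocol=0)
print(data)

# So exactly this absolute name must be importable wherever the pickle
# is loaded -- rename or repackage the module and old pickles break.
assert b"decimal" in data and b"Decimal" in data
```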
Zope is not a package AFAIK
Nope. If I had it to do over it would be. It will be eventually. (While Zope is not a package, it makes extensive use of packages. Zope has a total of about 350 modules and packages, of which about 44 are in the top-level namespace.)
(and probably never will be due to the pickle complications),
The pickling issues are solvable in a number of ways, although it is a bit painful. I considered fixing the package layout in Zope 2, but I ran out of time. :)
so 'import zope.this' won't work anyways unless you add a Zope wrapping package of your own -- and this will only work for Zope modules not relying on other Zope modules unless they use relative imports.
This is a good example. It should be possible to make Zope package-portable but it isn't, at least not without writing import hooks, which make my head hurt a lot more than meta-classes. ;)
To further enhance this mechanism I would like to have an alias mechanism in import, pickle et al. so that changes in the package structures become manageable without user intervention: pickles are a major problem whenever import structures change because they store absolute module names.
This is a different issue, and may have merits of its own. WRT the relative import scheme, its advantage seems to lie in providing a way to partially recover from the damage the new scheme causes <0.5 wink>.
I'm not proposing a new scheme... only a convenience for package authors.
I think that this is an important convenience that is probably not appreciated until you write a complex package structure.

Jim

--
Jim Fulton           mailto:jim@digicool.com     Python Powered!
Technical Director   (888) 344-4332              http://www.python.org
Digital Creations    http://www.digicool.com     http://www.zope.org

Under US Code Title 47, Sec.227(b)(1)(C), Sec.227(a)(2)(B) This email
address may not be added to any commercial mail list with out my
permission. Violation of my privacy with advertising or SPAM will
result in a suit for a MINIMUM of $500 damages/incident, $1500 for
repeats.
Note that I did not want to start a discussion about absolute vs. relative names (I believe everybody agrees that relative file names are a Good Thing). The 'import __.module' thing is not new: ni.py had support for this and my patch simply adds it back to the implementation.
It was left out of the 1.5 package implementation on purpose, and I will fight proposals to get it back. It is an ugly hack that deserves to die. So there!
"Absolute" means fully qualified name, i.e. the complete path to the modules from the top-level root via all subpackage down to the module name itself, e.g. TextTools.Constants.TagTable.
No disagreement there. BTW, the compatibility issues with class references in object databases need to be resolved in a different way -- package renamings are but one of the problems here. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
Note that I did not want to start a discussion about absolute vs. relative names (I believe everybody agrees that relative file names are a Good Thing). The 'import __.module' thing is not new: ni.py had support for this and my patch simply adds it back to the implementation.
It was left out of the 1.5 package implementation on purpose, and I will fight proposals to get it back. It is an ugly hack that deserves to die. So there!
I agree that it's ugly to include the __ attribute in the module namespace due to the possible circular reference (parent->module, module->parent), but the patch I sent doesn't do this... or was "ugly" referring to the two underscores looking strange ?

Could you elaborate a bit on the reasons for dropping __ support ?

Not-really-looking-for-a-fight-here-<wink>-ly,

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 109 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
[MAL]
I agree that it's ugly to include the __ attribute in the module namespace due to the possible circular reference (parent->module, module->parent), but the patch I sent doesn't do this... or was "ugly" referring to the two underscores looking strange ?
Could you elaborate a bit on the reasons for dropping __ support ?
There are two sections on why __ was dropped in

    http://www.python.org/doc/essays/packages.html

They don't refer to circularity, but to "limited use", "poor readability" and "awkwardness".

A deeper reason may be hiding in the essay's "most packages will have a relative shallow substructure": this is Guido <wink>, the man who invented two-level scoping, and class inheritance without a "super" hook back to the (anonymous) parent. For all Python's dynamicism, it very much favors shallow, static name hierarchies. I don't think it's coincidence that Python's own source code is in a two-level directory structure either! The only #include with a ".." is in grammar.h, and there it's in a comment <wink>:

    #include "bitset.h" /* Sigh... */

So if we cut to the core here, I'd bet Guido doesn't object so much to relative imports as to the idea that anyone would go off and create a package structure so fractally convoluted that relative imports are strongly more attractive than naming the target package in full.

Or maybe Guido doesn't care about that at all. I do regardless. I know Python's restrictions can grate, but in all, and in my repeated experience, they force you to rethink complicated designs and refactor them into simpler schemes that fit what Python is best at spelling. Nesting packages 8 deep is clumsy now? Damn straight, and I'm thankful for that: the clumsier it is, the less gratuitous inherited complexity I'll have to deal with in my future lives <0.5 wink>.

Things that came up in this thread that are worth fixing include:

+ Problems with persistent class references (incl. pickles).

+ Dealing with incompatible versions of packages. If someone wants to embed a copy of (say) mxDateTime in their own package, the only excuses are that they're afraid of overwriting the user's existing mxDateTime installation (if any), and/or of having some future installation of something else overwrite mxDateTime with an incompatible version.
Those are bad problems, but package embedding is no solution. You have a much better approach to that already via the DateTime.__version__ string! "Something like that" needs to be formalized and imposed on all public packages.

at-which-point-the-distutils-sig-jumps-in-and-saves-the-day-ly y'rs - tim

PS:
Just think of what happens to Win9x if you constantly update the DLLs...
As a matter of personal experience, it gets much stabler! The older DLLs get replaced by less-buggy newer ones, thanks to version numbers, rules, and installers that finally play by the rules. The mean time between crashes when I installed Win95 a few years ago was about an hour; now it's at least days and possibly weeks (don't know -- never leave the puter on that long). When a version upgrade fails, it's not a mystery, it's a bug <0.9 wink>.
Tim Peters wrote:
[MAL]
I agree that it's ugly to include the __ attribute in the module namespace due to the possible circular reference (parent->module, module->parent), but the patch I sent doesn't do this... or was "ugly" referring to the two underscores looking strange ?
Could you elaborate a bit on the reasons for dropping __ support ?
There are two sections on why __ was dropped in
http://www.python.org/doc/essays/packages.html
They don't refer to circularity, but to "limited use", "poor readability" and "awkwardness".
Ok, so the two underscores look strange... that's arguable. "Limited use" is not: there clearly is a use for relative imports. Please remember that we are talking about package internals here -- not user interfaces of that package.
A deeper reason may be hiding in the essay's "most packages will have a relative shallow substructure": this is Guido <wink>, the man who invented two-level scoping, and class inheritance without a "super" hook back to the (anonymous) parent. For all Python's dynamicism, it very much favors shallow, static name hierarchies. I don't think it's coincidence that Python's own source code is in a two-level directory structure either! The only #include with a ".." is in grammar.h, and there it's in a comment <wink>:
#include "bitset.h" /* Sigh... */
So if we cut to the core here, I'd bet Guido doesn't object so much to relative imports as to the idea that anyone would go off and create a package structure so fractally convoluted that relative imports are strongly more attractive than naming the target package in full.
This is not the intention of relative imports. It's all about making packages relocatable. And that's a Good Thing IMHO.
Or maybe Guido doesn't care about that at all. I do regardless. I know Python's restrictions can grate, but in all, and in my repeated experience, they force you to rethink complicated designs and refactor them into simpler schemes that fit what Python is best at spelling. Nesting packages 8 deep is clumsy now? Damn straight, and I'm thankful for that: the clumsier it is, the less gratuitous inherited complexity I'll have to deal with in my future lives <0.5 wink>.
Tim, I promise not to use 8 levels deep ;-) I usually create my packages using at most two levels (moving to at most three levels in future mx package releases) and even better: the common package interfaces are all available at the top-most level while internal stuff hides in deeper levels. You don't have to think much about the internal structure of the package while you use it. That's something I have to worry about, and the advantages of relative imports show up in such a context, which is why I posted the patch.
Things that came up in this thread that are worth fixing include:
+ Problems with persistent class references (incl. pickles).
+ Dealing with incompatible versions of packages. If someone wants to embed a copy of (say) mxDateTime in their own package, the only excuses are that they're afraid of overwriting the user's existing mxDateTime installation (if any), and/or of having some future installation of something else overwrite mxDateTime with an incompatible version. Those are bad problems, but package embedding is no solution. You have a much better approach to that already via the DateTime.__version__ string! "Something like that" needs to be formalized and imposed on all public packages.
So the distutils will have to check for already installed package versions and warn about conflicts. Right, that's something which definitely needs to become a standard (too ;).
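[Editor's note: until such a convention is formalized, a consumer can at least guard against incompatible versions by hand. A small sketch; the `require_version` helper is hypothetical, and the package object is an in-memory stand-in carrying the kind of dotted `__version__` string discussed above:]

```python
import types

def require_version(module, minimum):
    """Refuse to run against too old a release of `module`.

    Compares dotted-numeric __version__ strings part by part,
    e.g. "1.10" > "1.9".
    """
    have = tuple(int(p) for p in module.__version__.split(".") if p.isdigit())
    want = tuple(int(p) for p in minimum.split(".") if p.isdigit())
    if have < want:
        raise ImportError("%s %s is too old, need at least %s"
                          % (module.__name__, module.__version__, minimum))

# Hypothetical package carrying a __version__ string:
pkg = types.ModuleType("DateTime")
pkg.__version__ = "1.3.0"

require_version(pkg, "1.2")        # fine, passes silently
try:
    require_version(pkg, "2.0")    # too old -> ImportError
except ImportError as exc:
    print(exc)
```

Comparing tuples of integers rather than raw strings avoids the classic trap where "1.10" sorts before "1.9" lexicographically.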
at-which-point-the-distutils-sig-jumps-in-and-saves-the-day-ly y'rs - tim
PS:
Just think of what happens to Win9x if you constantly update the DLLs...
As a matter of personal experience, it gets much stabler! The older DLLs get replaced by less-buggy newer ones, thanks to version numbers, rules, and installers that finally play by the rules. The mean time between crashes when I installed Win95 a few years ago was about an hour; now it's at least days and possibly weeks (don't know -- never leave the puter on that long). When a version upgrade fails, it's not a mystery, it's a bug <0.9 wink>.
My experience is that intra-DLL references simply don't match anymore and cause the system to become unstable. Also, some weird installers don't care about the version numbers and install older versions at will. The outcome is a complete version mess.

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 108 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
"M.-A. Lemburg" wrote:
Tim Peters wrote:
(snip)
As a matter of personal experience, it gets much stabler! The older DLLs get replaced by less-buggy newer ones, thanks to version numbers, rules, and installers that finally play by the rules. The mean time between crashes when I installed Win95 a few years ago was about an hour; now it's at least days and possibly weeks (don't know -- never leave the puter on that long). When a version upgrade fails, it's not a mystery, it's a bug <0.9 wink>.
My experience is that intra-DLL references simply don't match anymore and cause the system to become unstable. Also, some weird installers don't care about the version numbers and install older versions at will. The outcome is a complete version mess.
This has been my experience too. I cringe anytime I see some installer stuff DLLs in my system areas.

Jim

--
Jim Fulton           mailto:jim@digicool.com     Python Powered!
Technical Director   (888) 344-4332              http://www.python.org
Digital Creations    http://www.digicool.com     http://www.zope.org
Jim Fulton wrote:
"M.-A. Lemburg" wrote:
Tim Peters wrote:
(snip)
As a matter of personal experience, it gets much stabler! The older DLLs get replaced by less-buggy newer ones, thanks to version numbers, rules, and
My experience is that intra-DLL references simply don't match anymore and cause the system to become unstable. Also, some weird installers don't care about the version numbers and install older versions at will. The outcome is a complete version mess.
This has been my experience too. I cringe anytime I see some installer stuff DLLs in my system areas.
My experience has been the same as Tim's. You should generally upgrade DLLs as a matter of course, because your system will be more stable. Installers are required to check DLL versions before replacing them. Not doing this is a major sin, and reason enough to stop dealing with a vendor. Jim Ahlstrom
Tim Peters wrote:
[MAL]
I agree that it's ugly to include the __ attribute in the module namespace due to the possible circular reference (parent->module, module->parent), but the patch I sent doesn't do this... or was "ugly" referring to the two underscores looking strange ?
Could you elaborate a bit on the reasons for dropping __ support ?
There are two sections on why __ was dropped in
http://www.python.org/doc/essays/packages.html
They don't refer to circularity, but to "limited use", "poor readability" and "awkwardness".
Which are all quite subjective.
A deeper reason may be hiding in the essay's "most packages will have a relative shallow substructure": this is Guido <wink>, the man who invented two-level scoping, and class inheritance without a "super" hook back to the (anonymous) parent. For all Python's dynamicism, it very much favors shallow, static name hierarchies. I don't think it's coincidence that Python's own source code is in a two-level directory structure either! The only #include with a ".." is in grammar.h, and there it's in a comment <wink>:
#include "bitset.h" /* Sigh... */
So if we cut to the core here, I'd bet Guido doesn't object so much to relative imports as to the idea that anyone would go off and create a package structure so fractally convoluted that relative imports are strongly more attractive than naming the target package in full.
Or maybe Guido doesn't care about that at all. I do regardless. I know Python's restrictions can grate, but in all, and in my repeated experience, they force you to rethink complicated designs and refactor them into simpler schemes that fit what Python is best at spelling. Nesting packages 8 deep is clumsy now? Damn straight, and I'm thankful for that: the clumsier it is, the less gratuitous inherited complexity I'll have to deal with in my future lives <0.5 wink>.
Nobody has suggested building 8-level package hierarchies. In fact, the example I gave occurred when a two-level package hierarchy was used in a two-level hierarchy. I don't think the package structure of Zope is flawed *except* for the fact that it is one level too *shallow*. The ability to do relative imports would be very helpful for the work we're doing. Jim
On 14 September 1999, Tim Peters said:
+ Dealing with incompatible versions of packages. If someone wants to embed a copy of (say) mxDateTime in their own package, the only excuses are that they're afraid of overwriting the user's existing mxDateTime installation (if any), and/or of having some future installation of something else overwrite mxDateTime with an incompatible version. Those are bad problems, but package embedding is no solution. You have a much better approach to that already via the DateTime.__version__ string! "Something like that" needs to be formalized and imposed on all public packages.
at-which-point-the-distutils-sig-jumps-in-and-saves-the-day-ly y'rs - tim
Been there, tried that, bought the flame war. I made the mistake of kicking off the Distutils SIG back in December with a proposal for a standard version numbering scheme for Python module distributions. See http://www.python.org/pipermail/distutils-sig/1998-December/000016.html for the kick-off of that "heated discussion". ;-) FWIW, if I was posting that message today, I would s/must/should/ and that's about it. And I would stress that these are "Proposed Recommended Guidelines" rather than "Things You Must Do or Distutils Will Slam the Door in Your Face", which is what it must have sounded like based on some of the responses. Folks: please don't reheat this flamewar on python-dev -- if you're interested in it, it will undoubtedly come up again on distutils-sig in due course! Greg -- Greg Ward - software developer gward@cnri.reston.va.us Corporation for National Research Initiatives 1895 Preston White Drive voice: +1-703-620-8990 Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913
[Tim, speaks of the devil ...]
"Something like that" [MAL's __version__ string] needs to be formalized and imposed on all public packages.
at-which-point-the-distutils-sig-jumps-in-and-saves-the-day-ly y'rs - tim
[... and Greg Ward of his legions appears!]
Been there, tried that, bought the flame war. I made the mistake of kicking off the Distutils SIG back in December with a proposal for a standard version numbering scheme for Python module distributions. See
http://www.python.org/pipermail/distutils-sig/1998-December/000016.html
for the kick-off of that "heated discussion". ;-)
Greg, if you call that a flame war, your credentials as an ex-Perl'er are in serious doubt <wink>. Except for the cowboy contingent, most participants were moving swiftly to consensus!
FWIW, if I was posting that message today, I would s/must/should/ and that's about it.
No, it's "must" or it's useless. What wasn't brought up in that thread is that the Distutil "version number" is an artificial construct created for the primary benefit of Distutil tools -- it needn't have anything whatsoever to do with whatever silly string the developer wants to *display* as being their "version number". It's instead a coordinate in an abstract but rigidly defined Distutil space, specifically designed to make programmatic navigation of that space reliable in a shared and uniform way. If a developer chooses, users need never be exposed to it. I'd use the x.y.z Distutil version number directly to keep my own life simpler, but if someone else wants to display a GUID followed by a 3-letter country code and the number of nanoseconds since the birth of Mohammed, fine -- they still have to map that to Distutil VN space internally or write their own stinkin' disttools. You may have gone overboard on the *semantics* of the Distutil VN, though: its only real meaning is in what Distutil tools *do* with it. Fight this battle again. Without a uniform way for an installer to *know* when it's thought safe to replace a package with another version of that package, Python installations will never move beyond the similar hell of Windows 3.1. even-herds-of-cats-wear-collars-ly y'rs - tim
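Tim's "coordinate in a rigidly defined version space" can be sketched in a few lines: parse x.y.z strings into tuples so an installer can compare releases componentwise. This is just an illustration of the idea, not any actual Distutils API:

```python
def vn(s):
    """Parse an x.y.z version string into a componentwise-comparable
    tuple -- a toy sketch of the 'rigidly defined VN space' idea;
    the function name and format are invented for illustration."""
    return tuple(int(part) for part in s.split("."))

# Tuples give an installer a reliable ordering; raw strings do not:
newer = vn("1.10.0") > vn("1.9.2")    # True: componentwise comparison
string_compare = "1.10.0" > "1.9.2"   # False: lexicographic comparison
```

Whatever string the developer *displays*, mapping it into such a space is what lets a tool decide programmatically whether one release supersedes another.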
Tim Peters wrote:
(snip)
As is, the package name used by a release is part of its published interface. You can't change it without causing pain, any more than you can make incompatible changes to public class methods or input-output behavior. In return, package clients are uniform, simple and portable, making life easiest for the people who know least. The burden is on package authors to choose names wisely, and that's where the burden should be.
Not all packages are part of the external interface. In fact, all Zope names are essentially internal, since Zope is an application. The issue is not so much access from outside as it is access between packages within Zope. Further, the current support for relative imports allows a package to be moved into another package without breaking the public interface wrt the containing package. Here's an example that I hope will be motivating: Suppose Marc-Andre has a package mx with subpackages DateTime and stringtools. If mx was installed in the Python path then a module in the mx.DateTime package could get at stringtools like: import mx.stringtools So far, so good. Zope has a notion of products which are *self contained* packages that are sub-packages of the Products package. So, suppose someone wants to write a NiftyDB product, which is a Zope product that provides access to an external database. Now the author of the NiftyDB product wants to use the mx package. The mx package is not a standard part of Zope, or of Python, so they simply include it in the NiftyDB product directory. Because relative imports are allowed in the current import scheme, they can use mx as usual. A NiftyDB module can import DateTime as follows: import mx.DateTime So even though mx is installed as a sub-package, the public interface is unchanged, at least wrt the containing package. Unfortunately, the internal import of stringtools in the DateTime package: import mx.stringtools will fail, because mx is no longer a top-level module. Jim
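Jim's failure mode is easy to reproduce with throwaway packages. The sketch below (all package and module names are stand-ins, mirroring his example) builds an 'mx' package whose DateTime module hard-wires the top-level name in an intra-package import, then embeds it inside a 'NiftyDB' product directory and shows the same import breaking:

```python
import os
import sys
import tempfile

def write_mx(parent):
    # A throwaway stand-in for the 'mx' package; its DateTime module
    # hard-wires the top-level package name in an intra-package import.
    pkg = os.path.join(parent, "mx")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "stringtools.py"), "w") as f:
        f.write("MARKER = 'stringtools'\n")
    with open(os.path.join(pkg, "DateTime.py"), "w") as f:
        f.write("import mx.stringtools\n")

# Installed at top level, the absolute import inside DateTime resolves.
top = tempfile.mkdtemp()
write_mx(top)
sys.path.insert(0, top)
import mx.DateTime
top_level_ok = True

# Embedded one level down inside a 'NiftyDB' product directory,
# the very same import breaks: 'mx' is no longer a top-level name.
sys.path.remove(top)
for name in [n for n in sys.modules if n == "mx" or n.startswith("mx.")]:
    del sys.modules[name]
nest = tempfile.mkdtemp()
prod = os.path.join(nest, "NiftyDB")
os.makedirs(prod)
open(os.path.join(prod, "__init__.py"), "w").close()
write_mx(prod)
sys.path.insert(0, nest)
try:
    import NiftyDB.mx.DateTime
    embedded_ok = True
except ImportError:   # 'import mx.stringtools' no longer resolves
    embedded_ok = False
```

The details of how the lookup fails differ between import schemes, but the outcome is the one Jim describes: the package's public interface survives the move, its internal absolute imports do not.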
[Jim Fulton]
Here's an example that I hope will be motivating:
Suppose Marc-Andre has a package mx with subpackages DateTime and stringtools. ... Zope has a notion of products which are *self contained* packages that are sub-packages of the Products package. So, suppose someone wants to write a NiftyDB product, which is a Zope product that provides access to an external database. Now the author of the NiftyDB product wants to use the mx package. The mx package is not a standard part of Zope, or of Python, so they simply include it in the NiftyDB product directory.
This would motivate me to strangle the author of NiftyDB. He should say "This requires Marc-Andre's mx package", and put error handling on the import. Otherwise I might end up with a gazillion separate copies of the mx package installed on my system and no way to clean it up. Your notion of "self contained packages" is flawed. Reductio ad absurdum, it leads to downloading the OS with every app you install. (Crap, I hope I haven't given Sun any new ideas...). - Gordon
Gordon McMillan wrote:
[Jim Fulton]
Here's an example that I hope will be motivating:
Suppose Marc-Andre has a package mx with subpackages DateTime and stringtools. ... Zope has a notion of products which are *self contained* packages that are sub-packages of the Products package. So, suppose someone wants to write a NiftyDB product, which is a Zope product that provides access to an external database. Now the author of the NiftyDB product wants to use the mx package. The mx package is not a standard part of Zope, or of Python, so they simply include it in the NiftyDB product directory.
This would motivate me to strangle the author of NiftyDB. He should say "This requires Marc-Andre's mx package", and put error handling on the import.
This is unacceptable. People should not have to hack a global namespace to get a local feature.
Otherwise I might end up with a gazillion separate copies of the mx package installed on my system and no way to clean it up.
Or you might not. In any case, the ability to have localized references should be there. Actually, Zope does provide a place to share packages that multiple products use. This involves putting shared packages in a sub-package of the 'Shared' package. For example, Digital Creations shared packages go in Shared.DC. Putting mx in Shared.ACME runs into the same problem. Although the product author can: import Shared.ACME.mx.DateTime the import of mx.stringtools in DateTime will still fail. Now, maybe ACME will hack mx to change the internal imports: import Shared.ACME.mx.stringutils but their hacks will get broken when we make Zope a package and Shared becomes a sub-package of Zope.
Your notion of "self contained packages" is flawed. Reductio ad absurdum, it leads to downloading the OS with every app you install. (Crap, I hope I haven't given Sun any new ideas...).
I'm sorry, but relying on existing Python installations for commercial applications is just not practical. In fact, Zope binary distributions contain their own Python distributions because relying on preinstalled Python distributions is just way too much of a support burden and a hassle for our customers. Jim
Jim Fulton wrote:
Gordon McMillan wrote:
[Jim Fulton]
Here's an example that I hope will be motivating:
Suppose Marc-Andre has a package mx with subpackages DateTime and stringtools. ... Zope has a notion of products which are *self contained* packages that are sub-packages of the Products package. So, suppose someone wants to write a NiftyDB product, which is a Zope product that provides access to an external database. Now the author of the NiftyDB product wants to use the mx package. The mx package is not a standard part of Zope, or of Python, so they simply include it in the NiftyDB product directory.
First, I am not all that opposed to having a notion of ".." available in the import statement. If we can write zope.dir1.mod1 which is a relative import going down, maybe we can write ../dir2/mod2 or something spelled differently. But I think there would still be problems. We would be relying on all package authors to use ".." or "__" within their package. But it is more natural to write zope.this.that or mx.this.that everywhere, and that is what people have in fact done. Probably there would be bugs and at least annoyance. And it seems unfortunate to need this feature. A proper package scheme should really isolate package internals from the outside world, and if this isn't happening we need a new scheme. I also don't think there is any escape from needing standard package names, at least the head name "zope" for example. Of course all this currently works if (in Jim's example) both "zope" and "mx" are top-level names. Then the ".." is not necessary. But, as Jim points out:
But why shouldn't it be visible?
Because visibility has a cost. Making mx a top-level name means that someone else can't make it a top-level name. This is why packages are a good idea.
The global shared nature of PYTHONPATH and its name space makes it difficult to guarantee that all required packages are going to be present in a complicated installation like zope. And if anyone else installs another Python package, it can easily break the first installation. One solution is for an application to establish its own PYTHONPATH which can not be altered. If this points to "zopedir" then the installer can freely install mx to the directory zopedir/mx and be confident that another mx installation is not damaged, nor used. The logical extension is to place each package into its own file using a scheme like Gordon McMillan is using. If the package contents are obtained by seeking from the END of the file, then multiple package files can be concatenated with cat package1 package2 zope mx >> bigpackage and a large installation like zope can be shipped with its own "bigpackage" library which is essentially a normal PYTHONPATH archive with everything above the head directory names thrown away. The Python library is included under "Lib" just as it is now. The bigpackage library implies its own PYTHONPATH of "Lib;." If the Python library files are placed in the root, the implied PYTHONPATH is ".". Jim Ahlstrom
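The seek-from-the-end layout Jim describes is the same trick ZIP files use: a fixed-size trailer at EOF lets a reader walk entries backwards, so plain concatenation of two archives still yields a readable archive. A minimal sketch, with the on-disk format invented purely for illustration:

```python
import os
import struct
import tempfile

MAGIC = b"PKG1"

def append_package(path, name, payload):
    """Append one entry to an archive.  Each entry ends with a fixed-size
    trailer (payload size, name size, magic), so concatenating archives
    is just repeated appending -- format details are hypothetical."""
    name_b = name.encode()
    with open(path, "ab") as f:
        f.write(payload)
        f.write(name_b)
        f.write(struct.pack("<II4s", len(payload), len(name_b), MAGIC))

def read_packages(path):
    """Recover all entries by seeking from the END of the file and
    walking trailer by trailer back toward the start."""
    entries = {}
    with open(path, "rb") as f:
        pos = f.seek(0, os.SEEK_END)
        while pos > 0:
            f.seek(pos - 12)                       # trailer is 12 bytes
            psize, nsize, magic = struct.unpack("<II4s", f.read(12))
            assert magic == MAGIC, "unrecognized trailer"
            f.seek(pos - 12 - nsize - psize)       # start of this entry
            payload = f.read(psize)
            name = f.read(nsize).decode()
            entries[name] = payload
            pos -= 12 + nsize + psize
    return entries

# 'cat package1 package2 >> bigpackage' is just repeated appending:
tmp = tempfile.NamedTemporaryFile(delete=False).name
append_package(tmp, "zope", b"zope-bytes")
append_package(tmp, "mx", b"mx-bytes")
```

Because nothing at the front of the file matters, a shipped "bigpackage" can be extended by any installer that simply appends to it.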
James C. Ahlstrom wrote:
First, I am not all that opposed to having a notion of ".." available in the import statement. If we can write zope.dir1.mod1 which is a relative import going down, maybe we can write ../dir2/mod2 or something spelled differently. But I think there would still be problems.
We would be relying on all package authors to use ".." or "__" within their package. But it is more natural to write zope.this.that or mx.this.that everywhere, and that is what people have in fact done. Probably there would be bugs and at least annoyance.
Think of it as an opportunity: it makes intra-package imports very simple without hard-wiring paths to submodules into the package. In the end, having written a self-contained package is a "feature" of that package. Other authors can then say: ok, I can plug in your package if it's self-contained. -- I have a feeling that this discussion is losing its grip on reality a bit. Jim and I are not arguing to change Python's face, internal structure or look&feel. The '__' thing doesn't harm anyone, doesn't break any functionality or code. It is even now already possible using import hooks and has been implemented before in ni.py using just that technology. All that we are talking about is reviving relative imports so that their usage is possible *without* adding yet another hook. This is not about the pros and cons of deeply nested packages and neither about the advantages of shallow structures. It's only about adding semantics to the string '__' used in import statements. Nothing more. People are not forced to use it. Besides, if used it will be a package-internal technique and not be visible to users of that package. Now why would anyone want to make life harder for package authors ? Give the people something to play with so that they have more fun at what they're doing for the benefit of the community ! --
And it seems unfortunate to need this feature. A proper package scheme should really isolate package internals from the outside world, and if this isn't happening we need a new scheme.
I also don't think there is any escape from needing standard package names, at least the head name "zope" for example.
Of course all this currently works if (in Jim's example) both "zope" and "mx" are top-level names. Then the ".." is not necessary. But, as Jim points out:
But why shouldn't it be visible?
Because visibility has a cost. Making mx a top-level name means that someone else can't make it a top-level name. This is why packages are a good idea.
The global shared nature of PYTHONPATH and its name space makes it difficult to guarantee that all required packages are going to be present in a complicated installation like zope. And if anyone else installs another Python package, it can easily break the first installation.
One solution is for an application to establish its own PYTHONPATH which can not be altered. If this points to "zopedir" then the installer can freely install mx to the directory zopedir/mx and be confident that another mx installation is not damaged, nor used.
The logical extension is to place each package into its own file using a scheme like Gordon McMillan is using. If the package contents are obtained by seeking from the END of the file, then multiple package files can be concatenated with cat package1 package2 zope mx >> bigpackage and a large installation like zope can be shipped with its own "bigpackage" library which is essentially a normal PYTHONPATH archive with everything above the head directory names thrown away. The Python library is included under "Lib" just as it is now. The bigpackage library implies its own PYTHONPATH of "Lib;." If the Python library files are placed in the root, the implied PYTHONPATH is ".".
This is a possibility and in fact I'm using such a setup in my current application. Still it doesn't avoid possible conflicts due to being top-level, e.g. the user could install an extension which relies on a specific top-level name already taken by the hosting package, e.g. the host defines a DateTime package and the extension comes with its own DateTime package. This is the current situation with Zope and mxODBC (which needs mxDateTime). -- Marc-Andre Lemburg
"M.-A. Lemburg" wrote:
James C. Ahlstrom wrote:
First, I am not all that opposed to having a notion of ".." available in the import statement. If we can write zope.dir1.mod1 which is a relative import going down, maybe we can write ../dir2/mod2 or something spelled differently. But I think there would still be problems.
We would be relying on all package authors to use ".." or "__" within their package. But it is more natural to write zope.this.that or mx.this.that everywhere, and that is what people have in fact done. Probably there would be bugs and at least annoyance.
Think of it as an opportunity: it makes intra-package imports very simple without hard-wiring paths to submodules into the package.
In the end, having written a self-contained package is a "feature" of that package. Other authors can then say: ok, I can plug in your package if it's self-contained.
Well said. It really should be possible to provide *self-contained* package hierarchies.
I have a feeling that this discussion is losing its grip on reality a bit. Jim and I are not arguing to change Python's face, internal structure or look&feel. The '__' thing doesn't harm anyone, doesn't break any functionality or code. It is even now already possible using import hooks and has been implemented before in ni.py using just that technology.
All that we are talking about is reviving relative imports so that their usage is possible *without* adding yet another hook. This is not about the pros and cons of deeply nested packages and neither about the advantages of shallow structures. It's only about adding semantics to the string '__' used in import statements. Nothing more. People are not forced to use it. Besides, if used it will be a package-internal technique and not be visible to users of that package.
Now why would anyone want to make life harder for package authors ?
I don't think nearly enough Python code has been packagized. Packages are critical for large projects and for reuse of code within large projects. There are at least two people in this discussion that are trying hard to make use of packages and have observed a significant problem. If we really want to encourage developers of reusable Python software to use packages, then I think it would be wise to pay attention to people who are actively using packages and have encountered problems. Jim
Jim Fulton wrote:
"M.-A. Lemburg" wrote:
In the end, having written a self-contained package is a "feature" of that package. Other authors can then say: ok, I can plug in your package if it's self-contained.
Well said. It really should be possible to provide *self-contained* package hierarchies.
There are at least two people in this discussion that are trying hard to make use of packages and have observed a significant problem.
OK, just to make sure I understand this. You are saying that a ".." operation is needed in imports so that a package author with a package which looks like this: mypackage mypackage/sub1/mod1.py mypackage/sub2/mod2.py can write "import __.sub2.mod2" instead of "import mypackage.sub2.mod2" within the file mod1.py. Right? Jim Ahlstrom
"James C. Ahlstrom" wrote:
Jim Fulton wrote:
"M.-A. Lemburg" wrote:
In the end, having written a self-contained package is a "feature" of that package. Other authors can then say: ok, I can plug in your package if it's self-contained.
Well said. It really should be possible to provide *self-contained* package hierarchies.
There are at least two people in this discussion that are trying hard to make use of packages and have observed a significant problem.
OK, just to make sure I understand this. You are saying that a ".." operation is needed in imports so that a package author with a package which looks like this: mypackage mypackage/sub1/mod1.py mypackage/sub2/mod2.py
can write "import __.sub2.mod2" instead of "import mypackage.sub2.mod2" within the file mod1.py. Right?
Yup. Note that sys.modules would still have the absolute paths for keys. Jim
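The proposed '__' semantics amount to a simple name-resolution rule: walk the dotted name left to right, and let each '__' component strip one trailing component from the current package context (a no-op at top level, matching the `import __.sys` example from the patch). A pure-Python sketch of just the resolution step, no loading:

```python
def resolve(current_package, name):
    """Resolve a dotted import name in which each '__' component means
    'one package level up', per the proposed patch semantics.  This is
    only a sketch of the name resolution; no modules are imported."""
    parts = current_package.split(".") if current_package else []
    for comp in name.split("."):
        if comp == "__":
            if parts:
                parts.pop()   # step one level up; at top level, a no-op
        else:
            parts.append(comp)
    return ".".join(parts)
```

So from mypackage/sub1/mod1.py, resolve("mypackage.sub1", "__.sub2.mod2") yields "mypackage.sub2.mod2", and, as Jim notes, sys.modules would be keyed by exactly that absolute name.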
"M.-A. Lemburg" wrote:
Still it doesn't avoid possible conflicts due to being top-level, e.g. the user could install an extension which relies on a specific top-level name already taken by the hosting package, e.g. the host defines a DateTime package and the extension comes with its own DateTime package. This is the current situation with Zope and mxODBC (which needs mxDateTime).
Yes, the name conflict at the global level is exactly the problem. And to repeat, I don't really mind if "__" is added to imports, although I don't like the spelling. But... Perhaps a better solution is a multiple global name space. Suppose there is a command line option (or other mechanism such as a special file in sys.executable) which specifies sys.path for the application. To be specific, suppose zope is installed in the "zopehome" directory and the zope package lives in zopehome/zope. And suppose zope needs mx, so it installs it in zopehome/mx. This really is an obvious structure, and anyone could see that replacing mx/* would upgrade the mx package. The sys.path would be zopehome plus perhaps some subdirectories of zopehome. Doesn't this solve the problem? Why not? And don't complain about wasted disk space. The smallest disk you can buy today is 4 gigs and costs $150. I find the PYTHONPATH mechanism totally unreliable for commercial programs anyway. It is a global object, and an installation of a second Python program can break the first one. I don't think there is any solution to this other than specify sys.path on a per-application basis. If this is false, what is the other solution? Jim Ahlstrom
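Jim's fixed per-application sys.path could be pinned by a tiny launcher. A sketch under his zopehome layout (all directory names hypothetical):

```python
import os

def application_path(zopehome):
    """A private, fixed module search path for one application install,
    following the zopehome layout described above (names hypothetical)."""
    return [
        zopehome,                       # finds zopehome/zope and zopehome/mx
        os.path.join(zopehome, "Lib"),  # the application's bundled library
    ]

# A launcher would pin this before any application imports run, e.g.:
#   sys.path[:] = application_path("/opt/zopehome")
```

Since the path never includes anything outside zopehome, installing or upgrading other Python software on the machine cannot change what this application imports.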
James C. Ahlstrom wrote:
"M.-A. Lemburg" wrote:
Still it doesn't avoid possible conflicts due to being top-level, e.g. the user could install an extension which relies on a specific top-level name already taken by the hosting package, e.g. the host defines a DateTime package and the extension comes with its own DateTime package. This is the current situation with Zope and mxODBC (which needs mxDateTime).
Yes, the name conflict at the global level is exactly the problem. And to repeat, I don't really mind if "__" is added to imports, although I don't like the spelling. But...
Doesn't look nice, but what other syntax would look better ? It will have to use identifiers and thus is restricted to [a-zA-Z0-9_]+.
Perhaps a better solution is a multiple global name space. Suppose there is a command line option (or other mechanism such as a special file in sys.executable) which specifies sys.path for the application. To be specific, suppose zope is installed in the "zopehome" directory and the zope package lives in zopehome/zope. And suppose zope needs mx, so it installs it in zopehome/mx. This really is an obvious structure, and anyone could see that replacing mx/* would upgrade the mx package. The sys.path would be zopehome plus perhaps some subdirectories of zopehome. Doesn't this solve the problem? Why not? And don't complain about wasted disk space. The smallest disk you can buy today is 4 gigs and costs $150.
This would solve the problem for Zope (but only after I have restructured my packages to all go under the mx package -- which is what started this thread; it wouldn't be needed using relative imports). But what about other installations or tools like mxCrypto which plug into existing packages (Andrew's crypto package) ? Today such plugins only work side-by-side, ie. you have to install one part which sits in the plugin slot and one part which gets installed somewhere in a top-level PYTHONPATH dir. The plugin slot part then imports the Real Thing from outside the host package (this is how mxCrypto integrates itself with AMK's lib).
I find the PYTHONPATH mechanism totally unreliable for commercial programs anyway. It is a global object, and an installation of a second Python program can break the first one. I don't think there is any solution to this other than specify sys.path on a per-application basis. If this is false, what is the other solution?
The solution I'm using (and Zope is too) is to wrap a complete Python installation into the product, all set up to do the right thing. It makes the archive a little heavier... which is actually not so bad from a marketing POV. Bizarre, but people seem to want all those megs. -- Marc-Andre Lemburg
[MA Lemburg]
The solution I'm using (and Zope is too) is to wrap a complete Python installation into the product, all set up to do the right thing. It makes the archive a little heavier... which is actually not so bad from a marketing POV. Bizarre, but people seem to want all those megs.
This is fine if you're distributing an app - it's a black box, it's standalone, no dependencies outside the app. Cool. I do that too. I use imputil to serve modules out of archives - no need to tweak the modules at all. In the upcoming version (of my installer) sys.path has exactly one entry, and the only python that needs to be outside an archive is exceptions.pyc. (I repeat here: while the pieces of my installer that handle binary dependencies are Win32 only, the parts that handle pure Python are cross-platform.) But both you and Jim are aiming for this non-existent in-between space - in some respects you want a "normal" Python installation, but you want to black-box your turf. I'm sorry, but if you're going to fit into a Python installation, you should make public your dependencies. Yes, installation becomes more complicated, but you're dealing with *Python users*. They're at least as likely to get screwed by having multiple different copies of the same thing around as you are in having dependencies outside of your personal control. If you also want to distribute as a standalone, freeze/squeeze/whatever it. Just make sure that what you distribute this way won't interfere with (or be confounded by) any existing Python installations. - Gordon
Gordon McMillan wrote:
But both you and Jim are aiming for this non-existent in-between space - in some respects you want a "normal" Python installation, but you want to black-box your turf.
I don't really have much choice here. For some purposes, PYTHONPATH is better than a black-box. It is better for development because it avoids creating the black-box. And I can look at and modify Python app and library files directly. But for a customer install I need a bullet-proof dumb-simple set of required state which I can explain to customer support staff. Thus the need for both.
I'm sorry, but if you're going to fit into a Python installation, you should make public your dependencies. Yes, installation becomes more complicated, but you're dealing with *Python users*. They're at least as likely to get screwed by having multiple different copies of the same thing around as you are in having dependencies outside of your personal control.
I can't fit into a Python installation because Python installations do not support commercial software concerns. I am not dealing with Python users, my customers are office workers. They don't know what Python is, have no other Python apps, and if they did, my install wouldn't hurt them anyway.
If you also want to distribute as a standalone, freeze/squeeze/ whatever it. Just make sure that what you distribute this way won't interfere (or be confounded by) any existing Python installations.
Yes, exactly my goal. Jim Ahlstrom
Gordon McMillan wrote:
[MA Lemburg]
The solution I'm using (and Zope is too) is to wrap a complete Python installation into the product, all setup to do the right thing. It makes the archive a little heavier... which is actually not so bad from a marketing POV. Bizarre, but people seem to want all those megs.
This is fine if you're distributing an app - it's a black box, it's standalone, no dependencies outside the app. Cool. I do that too. I use imputil to serve modules out of archives - no need to tweak the modules at all. In the upcoming version (of my installer) sys.path has exactly one entry, and the only python that needs to be outside an archive is exceptions.pyc.
(I repeat here: while the pieces of my installer that handle binary dependencies are Win32 only, the parts that handle pure Python are cross-platform.)
But both you and Jim are aiming for this non-existent in-between space - in some respects you want a "normal" Python installation, but you want to black-box your turf.
Actually, I want my stuff to be as flexible as possible. Of course I'm using it in my product too, but that's a different story.

I have the unsatisfying impression that this discussion doesn't lead anywhere. I can't really understand all the evil sides Tim and Guido seem to see in their views of relative imports. For people like Jim and me, who have quite some experience in doing Python packages, this small (hey, it's only about 20 lines of code!) additional feature could be of great use.

I'm still waiting for some ultimate argument that blows relative imports away. The arguments put forward so far have all been in the category "evil", "ugly", "I don't like it". Hey, we can do better than that...

Note that if this doesn't work out, Jim and I could always agree on some standard import hook that we'd both use, but that's exactly what we would very much like to avoid in favour of some established standard hard-wired into the Python distribution.
I'm sorry, but if you're going to fit into a Python installation, you should make public your dependencies. Yes, installation becomes more complicated, but you're dealing with *Python users*. They're at least as likely to get screwed by having multiple different copies of the same thing around as you are in having dependencies outside of your personal control.
I am probably going to ship all the mx* stuff in one big package -- not as separate packages anymore. That way I can write code which depends on other parts of the mx universe more easily.
If you also want to distribute as a standalone, freeze/squeeze/ whatever it. Just make sure that what you distribute this way won't interfere (or be confounded by) any existing Python installations.
It won't interfere with anything since the Python interpreter I use is configured *not* to look in any standard places for extensions and the like. Plus it's configured to run faster, but that's a different story ;-) -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 108 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
On Tue, 14 Sep 1999, M.-A. Lemburg wrote:
... I have the unsatisfying impression that this discussion doesn't lead anywhere. I can't really understand all the evil sides Tim and Guido seem to see in their views of relative imports. For people like Jim and me, who have quite some experience in doing Python packages, this small (! hey, it's only about 20 lines of code !) additional feature could be of great use.
$| was probably a small addition to Perl, but I don't think you could then argue that it was a good thing to do. Small doesn't mean good. Any change "could be of great use" to *somebody*, but does it make it good for Python as a whole? Changes imply doc, maintenance, future compatibility, etc.
I'm still waiting for some ultimate argument that blows relative imports away. The arguments put forward so far have all been in the category "evil", "ugly", "I don't like it". Hey, we can do better than that...
People seem to be disagreeing with your stated requirement. In other words, they're saying that you shouldn't be attempting to make your package "portable across the module-import-space." I tend to agree. Fix it at one location. If an app doesn't like that, then they can individually compensate. I don't believe that Python's standard machinery is required to handle this particular notion of packaging.
Note that if this doesn't work out, Jim and I could always agree on some standard import hook that we'd both use, but that's exactly what we would very much like to avoid in favour of some established standard hard-wired into the Python distribution.
No reason the standard hook could not go into imputil.py. I believe people mostly care that it doesn't become *standard* or *default* behavior -- that it only is available when explicitly requested by an app. Cheers, -g -- Greg Stein, http://www.lyra.org/
"M.-A. Lemburg" wrote:
Doesn't look nice, but what other syntax would look better ? It will have to use identifiers and thus is restricted to [a-zA-Z0-9_]+.
Maybe "__up__" or "__up" following the convention that double-under names are special? Just a thought...
But what about other installations or tools like mxCrypto which plug into existing packages (Andrew's crypto package) ? Today such plugins only work side-by-side, ie. you have to install one part which sits in the plugin slot and one part which gets installed somewhere in a top-level PYTHONPATH dir. The plugin slot part then imports the Real Thing from outside the host package (this is how mxCrypto integrates itself with AMK's lib).
I am not familiar with this. Doesn't this work?:

    zopehome/part_in_top_level_slot.py
    zopehome/plugin_slot/part_in_plugin_slot.py
The solution I'm using (and Zope is too) is to wrap a complete Python installation into the product, all setup to do the right thing. It makes the archive a little heavier... which is actually not so bad from a marketing POV. Bizarre, but people seem to want all those megs.
For the record, I distribute a hacked main program which sets its own sys.path. I think your and Zope's solution may break if another Python program is installed, and such a program changes PYTHONPATH and/or the registry. That is why I never change PYTHONPATH nor the registry. Jim Ahlstrom
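Jim's hacked main program is C-level, but the idea can be sketched at the Python level as a tiny launcher stub that pins sys.path before the application is imported. Everything here (file names, directory layout, the `app.main` entry point) is hypothetical, just to illustrate the technique:

```python
# launcher.py -- hypothetical entry-point stub that pins sys.path so
# that a second Python installation (or a changed PYTHONPATH/registry)
# cannot break the application.
import os
import sys

def pinned_path(app_root):
    """Return the only sys.path entries the app should ever see."""
    return [
        app_root,                       # the app's own packages
        os.path.join(app_root, "lib"),  # bundled pure-Python libraries
    ]

def main():
    root = os.path.dirname(os.path.abspath(sys.argv[0] or "."))
    sys.path[:] = pinned_path(root)     # replace the path, don't append
    # import app.main                   # hypothetical application entry point
```

Replacing sys.path wholesale (rather than inserting into it) is what makes the install immune to whatever other Python software is on the machine.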
Finally, something I can relate to. Although I have a goal of packagizing everything I write these days, I haven't experienced any of the problems that lead others to suggest relative imports. The most complicated app that I hack on (continuously) is Mailman, which has a main package and several subpackages off the main one. I always use absolute paths in my import statements, so I don't see what the fuss is about. But I'm perfectly willing to admit that I don't have enough experience. However...
"JCA" == James C Ahlstrom <jim@interet.com> writes:
JCA> I find the PYTHONPATH mechanism totally unreliable for JCA> commercial programs anyway. It is a global object, and an JCA> installation of a second Python program can break the first JCA> one. I don't think there is any solution to this other than JCA> specify sys.path on a per-application basis. If this is JCA> false, what is the other solution? I completely agree with JimA here. It's been a pain with the Knowbot stuff, a pain with Mailman, and a pain with other packages that I've installed for shared use within CNRI. The .pth files solve part of the problem nicely. They let me install, say PIL or PCT in a shared location, for access by all the Python users at my site, without the users having to individually hack their dot-files, etc. But this doesn't work so well for apps like Mailman or the Knowbot stuff because we can't expect that the person installing those applications will be able to install a .pth file in the right place. Also, .pth files don't let you tightly control sys.path, e.g. you can only add paths, not delete or reorder them. Plus you have a global naming problem. Mailman's top level package is called "Mailman", so I can be fairly confident that I'm not going to collide, but it means that I have an extra directory level within my install that contains all the core importable modules. I don't think that's a big deal, but it's a convention that other packaged app writers should follow. The problem is getting Mailman's (or the Knowbots') top level directory on sys.path, and in exactly controlling the contents of sys.path. Our first approach with Knowbots was to do direct sys.path.insert()s, which is quite ugly and error prone. Plus if you're adding many paths, or adding and deleting, that's a lot of gibberish at the top of your entry level executables. And now let's say that you have a dozen or two dozen entry level executables that all have to perform the same sys.path magic. 
That's a lot of cutting-and-pasting (and /highly/ error prone patching when directory structures change). It's a lose.

So for Knowbots we wrote a small module called pathhack that all entry-level executables imported. pathhack was good because it put all that sys.path munging nonsense in one place so it was manageable from a s/w engineering standpoint. But it sucked because those executables had to /find/ pathhack.py! Bootstrap lossage (we've actually gone back to sys.path.insert).

With Mailman, I could solve that problem because I added a configure/make phase. This let me write a module template called paths.py.in which configure flippered into paths.py containing path hackage based on --prefix. The next trick was that "make install" copied that paths.py file into all the subdirectories that had top-level entry points into the Mailman system (e.g. the bin directory, the cron directory, the cgi directory). So now, an executable need only do

    import paths
    import Mailman.Utils
    import Mailman.Logging.Utils

and absolute paths work like a charm. I can even provide a `pythonlib' directory that contains newer versions of standard modules that have fixes for folks running older Pythons. Thus I do

    from Mailman.pythonlib import rfc822

and the rest of my code uses my special rfc822 module with no changes.

I'm very happy with how this works for Mailman, however we can't use the same approach (or let's say Guido doesn't want to use this approach) for the Knowbots stuff because there /is/ no "make install" step. You just unpack it and go. But it still has to play lots of games searching the file system for various things.

What I've been thinking is that Python needs a registry <shudder>. JPython's already got such a beast, and it integrates with Java's system properties, so that things like the PYTHONPATH equivalent are set in the registry and immediately available. But it's not very flexible, and you still need an install step in order to bootstrap the locating of the registry.
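A minimal sketch of the paths.py trick Barry describes -- a generated module, copied into each entry-point directory, that prepends the install prefix to sys.path. The prefix value and the helper name below are hypothetical stand-ins for what configure would actually substitute:

```python
# paths.py -- hypothetical result of configure substituting @prefix@
# in paths.py.in; "make install" copies it into bin/, cron/, cgi/ etc.
import sys

PREFIX = "/usr/local/mailman"   # @prefix@ in paths.py.in (example value)

def fix_path(path_list, prefix=PREFIX):
    """Prepend prefix and its pythonlib override dir, unless present.

    pythonlib ends up in front so that patched copies of standard
    modules (e.g. Mailman.pythonlib.rfc822) shadow nothing by accident
    but are importable explicitly.
    """
    for d in (prefix + "/pythonlib", prefix):
        if d not in path_list:
            path_list.insert(0, d)
    return path_list

fix_path(sys.path)
```

An entry-point script then starts with `import paths` and afterwards uses plain absolute imports such as `import Mailman.Utils`. The function is idempotent, so importing paths from several scripts in one process is harmless.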
I think we can do a little bit better. Python already knows how to find its sys module. We can add an object into sys, call it sys.registry, which would contain things like sys.path definitions, and all sorts of other application-specific keys. This object would be tied to a file (or files) which might be human readable, a marshal/pickle (or both). Bootstrap location of this file(s) is an issue, but see below. This would let you do things like the following at the beginning of every top-level executable:

    import sys
    sys.application = 'zope'
    sys.registry.setpath(sys.application + '.pythonpath')

I'm sure all kinds of lengthy discussion will now ensue about the exact interface of the registry object, but I'll make just a few observations:

- There should be a system-wide registry and a user-specific registry. This lets an admin install shared applications easily, but also lets individual users have their own overrides.

- The system-wide registry can be located in, say, sys.prefix/lib/python<version>/site-packages. The user registry would reside somewhere in $HOME. This could all be platform specific so that on Windows, maybe the Python registry is integrated with the Windows registry, while in JPython it would be integrated with the standard JPython registry mechanism.

- You should be able to specify registry entries on the command line.

- There need to be defined rules for resolving registry keys between system, user, and command line specifications. JPython has some experience here (although there have been requests to change JPython's lookup order), and at the very least, JPython and CPython should be as consistent as possible (CPython won't have to merge in Java's system properties).

- The sys.registry object should be read/writable.
This would let an install script do something like:

    import sys
    sys.registry.lock()
    sys.registry.put('zope.pythonpath',
                     '@prefix@:@prefix@/matools:@prefix@/pythonlib')
    sys.registry.write()
    sys.registry.unlock()

which would write either the global system registry or the local user registry, depending on permissions (or maybe that's spelled explicitly in the API).

- In a sense you're pushing the namespace issue up a level into the registry, but at least this is a domain we can completely control from Python; it abstracts away the file system, and I don't think there's any way to avoid requiring conventions and cooperation for registry key naming. I also don't think it'll be a big problem in practice. When I packagize and re-release my Zarathustra's Ocular Python Experience virtual reality system, I'll try to think of a non-colliding top-level package name.

- (oh darn, I know I had more points, but Guido just popped in and I lost my train of thought).

Well, this has gone on long enough so I might as well let you guys shoot this idea all to hell. Let me close by saying that while I think the Windows registry is a mess, I also think that it might be useful for Python. Does it solve the same problem that relative imports are trying to solve? I dunno, but that's why I changed the Subject: line above. :)

-Barry
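The registry object Barry outlines never existed; a rough sketch of what he seems to have in mind might look like the following. The class, method names, and layering are all hypothetical -- just enough to show the resolution order (command line over user over system) that the observations above call for:

```python
# Hypothetical sketch of a sys.registry-style object: three layers of
# key/value settings with a defined lookup order, per Barry's proposal.
class Registry:
    def __init__(self, system=None, user=None, cmdline=None):
        self.system = dict(system or {})    # site-wide settings
        self.user = dict(user or {})        # per-user overrides
        self.cmdline = dict(cmdline or {})  # -D style overrides

    def get(self, key, default=None):
        """Command line overrides user, which overrides system."""
        for layer in (self.cmdline, self.user, self.system):
            if key in layer:
                return layer[key]
        return default

    def put(self, key, value):
        """Writes go to the user layer (the writable one in this sketch)."""
        self.user[key] = value

    def setpath(self, key):
        """Install the registered colon-separated path list as sys.path."""
        import sys
        entries = self.get(key)
        if entries is not None:
            sys.path[:] = entries.split(":")
```

With such an object, `sys.registry.setpath('zope.pythonpath')` would do exactly what the entry-point snippet above shows, with the key naming conventions left to applications.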
"Barry A. Warsaw" wrote:
So for Knowbots we wrote a small module called pathhack that all entry level executables imported. pathhack was good because it put all that sys.path munging nonsense in one place so it was manageable from a s/w engineering standpoint. But it sucked because those executables had to /find/ pathhack.py! Bootstrap lossage (we've actually gone back to sys.path.insert).
Yes, exactly the problem I had, bootstrapping the import of pathhack. Actually it gets worse because Python imports exceptions.py, site.py and sitecustomize.py during Py_Initialize(), so if you are having a really bad day, you might pick up the wrong versions of those. AFAIK, the only way to solve that currently is to use freeze to build pathhack into the binary executable. That is what I do anyway. But it is not an ideal solution.
What I've been thinking is that Python needs a registry <shudder>.
Yikes! As you say, the Windows registry is a mess.
[Lots of good ideas omitted...]
- The system-wide registry can be located in say sys.prefix/lib/python<version>/site-packages. The user registry would reside somewhere in $HOME. This could all be platform specific so that on Windows, maybe the Python registry is integrated with the Windows registry, while in JPython it would be integrated with the standard JPython registry mechanism.
Python already has three directories it knows about: sys.executable is the directory of the interpreter binary, sys.dllfullpath could be the directory of the interpreter as a shared library (I have a patch for this), and there is the directory of the main Python program as given on the command line. Perhaps we can put the registry in one of these directories. That would be consistent on all platforms.
- You should be able to specify registry entries on the command line.
This is vital because I am worried about a bad registry.
- There needs to be defined rules for resolving registry keys b/w system, user, and command line specifications. JPython has some experience here (although there have been requests to change
I am not sure a full registry is required. Once you can control sys.path and can get an accurate import of sitecustomize.py, you can do everything else there. Maybe just a command line option is enough. But I will think about it... Jim Ahlstrom
I just had a long discussion with Barry and Fred, in response to his registry proposal. We quickly decided that a Python registry is overkill for the given problem. We also quickly came up with a nice variant of Mailman's approach which will work well in a variety of cases.

--> The context: You have a large complicated application that contains many modules spread over many packages, and which has many "top-level" scripts that are invoked by the user (or via CGI, for example). All the code is properly packagized, with sufficiently globally unique package names being used all over the place.

--> The problem: How to get the root directory of your application (where all your packages live) on sys.path.

--> The rules: Using $PYTHONPATH is right out. You can't install new files in the core Python installation directory (nor modify existing ones), so using .pth files is also out. You don't want to have to edit each of the top-level scripts of your application. You want a cross-platform solution, in particular it should be amenable to Windows.

--> The assumptions: You can use a reasonably intelligent installer. All your top-level scripts are installed in a single directory (or perhaps in a small number of separate bin directories, e.g. bin and cgi-bin).

--> The solution: Suppose your application (as a whole, not the individual top-level script) is called Spam -- this may well also be the name of your top-level package. Then start each top-level script with the single line

    import Spam_path

before importing anything else. Your installer, once it knows the absolute pathname of your application's root directory, crafts a file Spam_path.py which contains code that inserts the right absolute pathname into sys.path. Your installer then installs a copy of this file (or a symbolic link to it) *in each bin directory where it installs top-level Python scripts*.
Because the script's directory is first on the default path, the Spam scripts will pick up Spam_path without any help from $PYTHONPATH.

--> Notes: If you are Spam's developer, you probably want to be able to use its top-level scripts without having to install them. All you need to do is create a file Spam_path.py pointing to the top of your development tree, and set $PYTHONPATH to point to the directory that contains it. (Perhaps you already have $PYTHONPATH pointing to a personal directory of Python modules you like to have accessible -- then you can just drop Spam_path.py there, or link to it from there.) Note that adding a personal directory of Python goodies is about the only use of $PYTHONPATH that I approve of -- this way, you can set $PYTHONPATH in your .profile and never have to change it.

I know this doesn't resolve the relative import thread (how's that going, by the way? :-) but Barry & Fred & I agree that this is the best solution to the problem stated in Barry's message to which I am following up here.

--Guido van Rossum (home page: http://www.python.org/~guido/)
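The generated Spam_path.py that Guido describes might look like the sketch below. The root directory, constant name, and helper function are hypothetical examples of what an installer would substitute:

```python
# Spam_path.py -- hypothetical installer-generated bootstrap module.
# A copy (or symlink) sits next to every top-level script, so a plain
# "import Spam_path" works without touching $PYTHONPATH.
import sys

SPAM_ROOT = "/opt/spam"   # filled in by the installer at install time

def install(root=SPAM_ROOT, path=None):
    """Put the application root at the front of sys.path (only once)."""
    if path is None:
        path = sys.path
    if root not in path:
        path.insert(0, root)
    return path

install()
```

Each top-level script then begins with `import Spam_path` before importing anything from the Spam package; repeated imports are harmless because the insert only happens once.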
Don't tell, but this is getting repetitive. One more batch of repeats from me, and I'm done:

+ The notion that Python packages aren't self-contained now is wrong: a package module can import other modules in the package without qualification. This is good design.

+ Where self-containment breaks down is in going across *sub*-packages. I was surprised Guido pretended to cater to them, since Python (wisely, IMO) stayed away from sub-modules from the start. Hierarchy is a great tool for managing complexity, but it also introduces complexity of its own. And when a flat package space suffices, hierarchy introduces artificial complications.

+ I agree with Gordon that, if anything, the default "import" behavior is already too complicated. By the time you wade thru PYTHONPATH, and PYTHONSTARTUP, and .pth files, and people mucking with sys.path, and site config files, and symlinks under Unix, and packages mucking with their own __path__ attributes, running Python with -v is the only hope of figuring out why nothing ever works <0.6 wink>.

+ All this dynamicism is in support of a concept that's actually static: the structure of an installation. As Barry laments, you can't always know the structure in advance of installation, but the solution is (I believe) along the lines he suggests: invent a way to register/query the structure once & for all after it is known.

+ JimF and MAL's shared notion that they're the only ones making significant use of packages is myopic. Dragon also uses packages, and heavily, although products using them have not been announced and I can't talk about them yet. Anything I'm within 10 miles of doesn't use *any* import hooks, or play any other tricks on the system. Instead I urge people to write their imports as explicitly, long-windedly and straightforwardly as possible (with binding to a convenient local alias name on the following line). Do we embed sub-packages? Sure. Do modules move around in the hierarchy over time? Of course.
So how do we deal with this supposed nightmare? Because all imports are relentlessly straightforward (even intra-package imports are fully qualified), and none are hiding in dynamic functions, we get into Emacs and do a global search-and-replace. Takes 30 seconds, and the new structure is plain as day to everyone reading the code. low-tech-it's-not-just-for-breakfast-anymore<wink>-ly y'rs - tim
Tim Peters wrote:
+ The notion that Python packages aren't self-contained now is wrong: a package module can import other modules in the package without qualification. This is good design.
Agreed. And we would like to take that notion one step further -- without breaking the two-scope resolution Python uses now. The '__' tags are just placeholders which could just as well be handled by a simple Python function (like the one Gordon posted) using perfectly valid technology. Sticking this technology into the import statement is really only a more elegant approach. Nothing more. The whole thing already works using the current Python implementation.
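MAL's point -- that '__' is expressible today as a plain Python function, no interpreter patch needed -- can be illustrated with a sketch like the following. This is not Gordon's actual posting; the helper names are invented, and it resolves '__' against the importing package's dotted name exactly as in MAL's mx example:

```python
# Resolve a name like "__.__.DateTime" against the importing package's
# dotted name, then hand the absolute result to the normal machinery.
# Hypothetical helper, shown only to demonstrate the '__' semantics.
import importlib

def resolve(name, package):
    """Turn a '__'-prefixed dotted name into an absolute one.

    resolve('__.__.DateTime', 'mx.ODBC.Windows') -> 'mx.DateTime'
    Names without a leading '__' are returned unchanged.
    """
    if not name.startswith("__"):
        return name                 # already absolute
    parts = name.split(".")
    pkg = package.split(".")
    while parts and parts[0] == "__":
        if not pkg:
            raise ImportError("relative import past top level")
        pkg.pop()                   # one level up per leading '__'
        parts.pop(0)
    return ".".join(pkg + parts)

def import_relative(name, package):
    """Import `name` relative to `package` via the standard mechanism."""
    return importlib.import_module(resolve(name, package))
```

From inside the demo package's bc.py (module a.b.bc, package a.b), `resolve('__.ab', 'a.b')` yields 'a.ab', matching the behavior of MAL's patch.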
+ JimF and MAL's shared notion that they're the only ones making significant use of packages is myopic.
Oh, c'mon Tim, we never did say that. We only mentioned having used packages for quite a while. This includes having seen all the different pitfalls they have to offer.
Dragon also uses packages, and heavily, although products using them have not been announced and I can't talk about them yet. Anything I'm within 10 miles of doesn't use *any* import hooks, or play any other tricks on the system. Instead I urge people to write their imports as explicitly, long-windedly and straightforwardly as possible (with binding to a convenient local alias name on the following line). Do we embed sub-packages? Sure. Do modules move around in the hierarchy over time? Of course. So how do we deal with this supposed nightmare? Because all imports are relentlessly straightforward (even intra-package imports are fully qualified), and none are hiding in dynamic functions, we get into Emacs and do a global search-and-replace. Takes 30 seconds, and the new structure is plain as day to everyone reading the code.
This is perfectly OK if you're just using your own code, but it gets messy when third-party packages are involved, simply because you can't be sure they don't use import hooks, local imports or hacks to __path__ etc. If you still want them to be usable, you'll have to go down and dive into their structure. This takes time, is error prone and not necessarily fun.

Also, I don't know how "explicitly, long-windedly and straightforwardly" writing module imports is any different from using relative imports. An example:

    mx
        .DateTime
        .ODBC.Windows

In ODBC.Windows I would write:

    import __.__.DateTime

which is just as explicit as writing

    import mx.DateTime

except that the information about the top-level hierarchy is *not* included in the import information.

Anyway, I'm not too inclined to continue this discussion anymore. I would never have thought that such a nifty little patch would have stirred up so much negative response. I'll just hack up my very own import mechanism using imputil and be done. So there! -- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 107 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
Sure is a lot of traffic on this -- can't we go back to hating the new license <wink>? [MAL]
... Sticking this technology into the import statement is really only a more elegant approach. Nothing more. The whole thing already works using the current Python implementation.
Your point there has been made many times -- I'm not missing it. My point has also been made many times, so one more can't hurt <wink>: it still looks to me like bad practice, so I don't *want* (more, actively oppose) it to be made easier to spell.
+ JimF and MAL's shared notion that they're the only ones making significant use of packages is myopic.
Oh, c'mon Tim, we never did say that. We only mentioned having used packages for quite a while. This includes having seen all the different pitfalls they have to offer.
Well, JimF was laying it on a bit thick in two msgs, and sucked you in as his implicit conspirator. My apologies if you feel tarred unfairly. Yet you still can't resist implying that only you two are aware of the pitfalls <wink>.
[on Dragon's "relentlessly straightforward" import policies]
This is perfectly ok if you're just using your own code, but it gets a mess when third-party packages are involved simply because you can't be sure they don't use import hooks, local imports or hacks to __path__ etc. If you still want them to be useable, you'll have to go down and dive into their structure. This takes time, is error prone and not necessarily fun.
While true, it's much safer to install a "cheating" package at top level, since the cheats presumably *assume* that's where the package is to be run from. If someone distributes a package that does not run correctly from top level, that's one hopeless package. Else it does run correctly from top level, and then the only person with a problem here is one who insists on running it from some embedded location. You then buy the need to dive into their code very carefully indeed -- relative imports don't save you from the huge variety of "clever tricks" they *may* be playing with hooks and path hacks. But note that, unlike Jim (& perhaps you too), I have almost no fear of top-level name collisions. If we do nothing about that, it will be years before it becomes a real problem; and several months before that <wink>, we can flock to Java's hokey but effective reverse-domain scheme (which I suspect you're already in favor of -- fine by me!).
Also, I don't know how "explicitly, long-windedly and straightforwardly" writing module imports is any different from using relative imports.
Hugely and profoundly different, as your example shows:
An example:
    mx
        .DateTime
        .ODBC.Windows
In ODBC.Windows I would write:
import __.__.DateTime
which is just as explicit as writing
import mx.DateTime
except that the information about the top-level hierarchy is *not* included in the import information.
Exactly: now instead of a bulletproof absolute import in a self-contained importer, whose precise meaning is clear from casual static inspection, you've got a relative one whose meaning depends vitally on where the importer happens to live. As code moves around, the import may fail to work when you're lucky, or pick up an unintended DateTime package when you're not. The semantics of the import are indeed relative now: you see the opportunity to type three additional characters <wink> as a strength, and I see nothing but headaches masquerading as convenience.

As further consequence, simple "search-and-replace" strategies are rendered at best useless. That is, a simple grep can no longer find all & only references to mx's flavor of DateTime: it picks up all references to all DateTime guys, with no way short of a structure-aware tree crawler to sort them out again. Complexity spreads.

Of course, in the end, I lose no matter what. If *everyone* used dirt-dumb fully-qualified imports, all hassles with shifting package structure could be solved with a text editor in seconds. But that's not whiz-bang enough, so people *will* play endlessly dubious tricks with hooks, paths, relative imports, and metapackages too once Gordon gets around to them <wink>.

Having an explicit way to spell relative imports would make my life easier because at least those are statically recognizable for what they are. I don't want them and won't use them, but it could ease the pain others will cause by using them. That's not enough to make me in favor of them, though; just enough so I won't pout if they do go in.
Anyway, I'm not too inclined continuing this discussion anymore. I would never have thought that such a nifty little patch would have stirred up so much negative response. I'll just hack up my very own import mechanism using imputil and be done. So there!
Don't feel picked on, M-A! Opposition is normal and healthy; and the last idea I got anywhere with was list.pop(). and-even-then-jimf-tried-to-stick-a-default-on-it<wink>-ly y'rs - tim
Don't feel picked on, M-A! Opposition is normal and healthy;
As a (hopefully welcome) tangent to this discussion, I am currently reading "Dynamics of Software Development" by Jim McCarthy. He has an interesting section on exactly this. I will quote it verbatim (typing it in, so forgive typos). I present it without comment other than: it appears we are a healthy team!

Mark.

The Problem of Creativity:

Creativity in a group is always limited by the group's defensiveness, and some amount of defensiveness is healthy. In the healthy team, then, no matter how highly valued creativity is ostensibly, change beyond a certain normative quantity or degree will be resisted. And even change within bounds will be accepted only insofar as it can seem to fall under the rubric of "improvement". Change must seem to build on the previously known and accepted reality. Even if there are only two steps involved in a change, with the second building on the first, in a single reality cycle - single product development cycle for us - the change is likely to be rejected by the healthy team.

What you need for truly creative change, then, is an environment that transcends good health, an environment that not only accepts a continuum of change, which is normal, but one that positively engenders, nurtures and propels forward wholly new dynamics. The transcendent organization values radical or revolutionary change and esteems utterly new modes of thought.

It's possible for a team to be healthy and not particularly creative, but this state of affairs is not especially desirable. What is desirable is team fecundity [MH - had to look this up - "intellectually productive or inventive to a marked degree" - a synonym of fertile], the radiating of the new and the original from the normal and the healthy. This kind of creativity requires a flexibility and a courage beyond the reach of most of us most of the time.

It's ironic and worth remembering that the healthier the team, the more effective the resistance to revolutionary thought will be.
This is a natural consequence of the team's overall effectiveness in every relevant domain. -- end of quote --
Tim Peters wrote:
Sure is a lot of traffic on this -- can't we go back to hating the new license <wink>?
The license was withdrawn... I would prefer starting to talk about imputil.py.
[MAL]
Also, I don't know how "explicitly, long-windedly and straightforwardly" writing module imports is any different from using relative imports.
Hugely and profoundly different, as your example shows:
An example:
mx
  .DateTime
  .ODBC.Windows
In ODBC.Windows I would write:
import __.__.DateTime
which is just as explicit as writing
import mx.DateTime
except that the information about the top-level hierarchy is *not* included in the import information.
Exactly: now instead of a bulletproof absolute import in a self-contained importer, whose precise meaning is clear from casual static inspection,
Ah, that's a new idea ! Package local absolute names... wow, that sounds weird.
you've got a relative one whose meaning depends vitally on where the importer happens to live. As code moves around, the import may fail to work when you're lucky, or pick up an unintended DateTime package when you're not. The semantics of the import are indeed relative now: you see the opportunity to type three additional characters <wink> as a strength, and I see nothing but headaches masquerading as convenience.
Tim, the intention was being able to move the mx level around, e.g. renaming it, eliminating it by sticking DateTime and ODBC under some other higher level package (yes, you'll have to watch out here because the higher level package might define DateTime and ODBC too -- but it's not all that likely), etc. The buzz word here is 'relocatable'.
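[Editorial aside: the resolution rule being argued about can be condensed into a small sketch. `resolve_relative` below is a hypothetical helper, not the actual import.c patch; it shows how each `__` component strips one level from the importing package's dotted context before the remaining components are appended.]

```python
def resolve_relative(context, name):
    """Map a dotted name containing '__' components onto an absolute
    dotted name, given the importing module's package context.
    Illustrative only -- the real patch works inside load_next()."""
    parts = context.split('.') if context else []
    for comp in name.split('.'):
        if comp == '__':
            if not parts:
                raise ImportError('already at top level')
            parts.pop()              # continue one level higher
        else:
            parts.append(comp)
    return '.'.join(parts)

# From code inside the mx.ODBC.Windows package:
resolve_relative('mx.ODBC.Windows', '__.__.DateTime')   # -> 'mx.DateTime'
```

The relocation argument falls out directly: renaming `mx` changes only the context string, while the `__.__.DateTime` spelled in the source stays valid.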
As further consequence, simple "search-and-replace" strategies are rendered at best useless. That is, a simple grep can no longer find all & only references to mx's flavor of DateTime: it picks up all references to all DateTime guys, with no way short of a structure-aware tree crawler to sort them out again. Complexity spreads.
Of course, in the end, I lose no matter what. If *everyone* used dirt-dumb fully-qualified imports, all hassles with shifting package structure could be solved with a text editor in seconds. But that's not whiz-bang enough, so people *will* play endlessly dubious tricks with hooks, paths, relative imports, and metapackages too once Gordon gets around to them <wink>. Having an explicit way to spell relative imports would make my life easier because at least those are statically recognizable for what they are. I don't want them and won't use them, but it could ease the pain others will cause by using them. That's not enough to make me in favor of them, though; just enough so I won't pout if they do go in.
Anyway, I'm not too inclined to continue this discussion any further. I would never have thought that such a nifty little patch would stir up so much negative response. I'll just hack up my very own import mechanism using imputil and be done with it. So there!
Don't feel picked on, M-A! Opposition is normal and healthy; and the last idea I got anywhere with was list.pop().
Yah well. The last thing I succeeded with was making Python methods wrap any callable type. Seems that we're a bit too healthy (see MarkH's post) sometimes, i.e. there isn't all that much room for experiments. Just think of cool developments like Chris' stackless Python. Experience shows that these kinds of things will never make it into the distribution. Unfortunately, maintaining patches to the dist across releases is a real pain and much work, so these ideas will just sit there unused and untested. Much the same happened to gcc... in the end, corporate strength made egcs possible. Perhaps we need such a branch too ?

-- Marc-Andre Lemburg
______________________________________________________________________
Y2000: 106 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
M.-A. Lemburg wrote:
... Seems that we're a bit too healthy (see MarkH's post) sometimes, i.e. there isn't all that much room for experiments. Just think of cool developments like Chris' stackless python. Experience shows that these kind of things will never make it into the distribution.
A portion of my threading patches went into 1.5. "Big" things can happen and they do... it's just a matter of passing muster with The Head Honcho. That, and the timing (e.g. the thread state changes wouldn't go into a 1.4.n release, but did go into 1.5).
Unfortunately, maintaining patches to the dist across releases is a real pain and much work, so these ideas will just sit there unused and untested. Much the same happened to gcc ... in the end corporate strength made egcs possible. Perhaps we need such a branch too ?
NO -g -- Greg Stein, http://www.lyra.org/
[MAL]
... Seems that we're a bit too healthy (see MarkH's post) sometimes, i.e. there isn't all that much room for experiments.
The odds of a change making it into Python radically decreased when 1.0 hit the net, and have continued to decline (although slowly) since then. In recent years, Guido appears to me to have gotten ever more reluctant to entertain even 100% compatible changes to the internals, if they affect a delicate area of the implementation (ceval.c is the most obvious one there). But that's "normal & healthy" <wink> too. Languages & implementations get brittle with age, and it's eventually better to start over -- if Guido didn't have Python2 plans in mind, he'd be the first language designer ever to stop where he started!
Just think of cool developments like Chris' stackless python. Experience shows that these kind of things will never make it into the distribution.
Unfortunately, circumstances piled up and Chris got distracted from that, while nobody else made time to push it in his absence. Large changes have gone in, and even more may make it into the Python1 line, but it generally takes a large or "strategic" user base, and much persistence. GregS mentioned his massive work on threads (still not all in), and I'll add the NumPy extensions (which I wouldn't be surprised to see "mainstreamed"), BarryW's string methods, and DavidA's rich comparisons.
Unfortunately, maintaining patches to the dist across releases is a real pain and much work, so these ideas will just sit there unused and untested. Much the same happened to gcc ... in the end corporate strength made egcs possible. Perhaps we need such a branch too ?
Don't tell, but I've always been surprised at how few people have tried to release a variant Python! The Alice version (case-insensitive names, and 1/2==0.5) is the only one that comes to mind, and the primary effect that had on today's Python is that raw expressions no longer print their value in non-interactive mode (before Alice, 1 + 2 on a line by itself caused "3" to get printed even in batch scripts; this interfered with the Alice team's favored object.method1().method2().method3() coding style, and Guido endured much pain to change "the real" Python to avoid a code split at that early stage of Python's life; ultimately futile, but then Alice Python didn't catch on anyway). So there's very little Python-related history to go on here. I don't mind seeing variants, but have to predict they won't get very far. Just picture what Python 1.6V would look like if its feature set were drawn from a consensus among you, me, Christian, Greg Ewing, John Skaller and Tom Christiansen <wink>. the-value-of-a-benevolent-dictator-is-easy-to-underestimate-ly y'rs - tim
Tim Peters wrote:
[MAL]
... Seems that we're a bit too healthy (see MarkH's post) sometimes, i.e. there isn't all that much room for experiments.
The odds of a change making it into Python radically decreased when 1.0 hit the net, and have continued to decline (although slowly) since then. In recent years, Guido appears to me to have gotten ever more reluctant to entertain even 100% compatible changes to the internals, if they affect a delicate area of the implementation (ceval.c is the most obvious one there).
But that's "normal & healthy" <wink> too. Languages & implementations get brittle with age, and it's eventually better to start over -- if Guido didn't have Python2 plans in mind, he'd be the first language designer ever to stop where he started!
Just think of cool developments like Chris' stackless python. Experience shows that these kind of things will never make it into the distribution.
Unfortunately, circumstances piled up and Chris got distracted from that, while nobody else made time to push it in his absence. Large changes have gone in, and even more may make it into the Python1 line, but it generally takes a large or "strategic" user base, and much persistence. GregS mentioned his massive work on threads (still not all in), and I'll add the NumPy extensions (which I wouldn't be surprised to see "mainstreamed"), BarryW's string methods, and DavidA's rich comparisons.
Plus the coercion stuff that's still sleeping in one of my project subdirs (I'll have to get this done *before* 1.6 hits the shelves).
Unfortunately, maintaining patches to the dist across releases is a real pain and much work, so these ideas will just sit there unused and untested. Much the same happened to gcc ... in the end corporate strength made egcs possible. Perhaps we need such a branch too ?
Don't tell, but I've always been surprised at how few people have tried to release a variant Python! The Alice version (case-insensitive names, and 1/2==0.5) is the only one that comes to mind, and the primary effect that had on today's Python is that raw expressions no longer print their value in non-interactive mode (before Alice,
1 + 2
on a line by itself caused "3" to get printed even in batch scripts; this interfered with the Alice team's favored
object.method1().method2().method3()
coding style, and Guido endured much pain to change "the real" Python to avoid a code split at that early stage of Python's life; ultimately futile, but then Alice Python didn't catch on anyway).
So there's very little Python-related history to go on here. I don't mind seeing variants, but have to predict they won't get very far. Just picture what Python 1.6V would look like if its feature set were drawn from a consensus among you, me, Christian, Greg Ewing, John Skaller and Tom Christiansen <wink>.
Actually, what I was thinking about here was a Python 2.0 branch starting now rather than in a year or so, thus leaving much room for experiments etc. The intention is the same as with egcs and gcc: to fold the enhancements back into the main branch in a few years. E.g. if Guido points us in the right direction, we could start hacking on that piece of revolutionary work now. BTW, I'd suggest using C++ with namespaces but without templates as the target language. By the time Python 2.0 hits the shelves, this setup should have reached the same portability as C has now. Perhaps we could even use RTTI (run time type information) to implement optional type safety... ok, just dreaming a little ;-)

-- Marc-Andre Lemburg
Gordon McMillan wrote:
[Jim Fulton]
Here's an example that I hope will be motivating:
Suppose Marc-Andre has a package mx with subpackages DateTime and stringtools. ... Zope has a notion of products which are *self contained* packages that are sub-packages of the Products package. So, suppose someone wants to write a NiftyDB product, which is a Zope product that provides access to an external database. Now the author of the NiftyDB product wants to use the mx package. The mx package is not a standard part of Zope, or of Python, so they simply include it in the NiftyDB product directory.
This would motivate me to strangle the author of NiftyDB. He should say "This requires Marc-Andre's mx package", and put error handling on the import. Otherwise I might end up with a gazillion separate copies of the mx package installed on my system and no way to clean it up.
The mx package installed in product dir won't be visible to outside scripts: it is self-contained and only works in the context of Zope. There's nothing bad about it if you take the POV of a typical Zope user who doesn't know about the internals too much. OTOH, if the user does have the skill to install packages himself, he could install the mx package top-level and then leave it out of the NiftyDB product dir. Both setups will work without any source code change... that's the real advantage. Currently, only the top-level installation of the mx package will work and this of course introduces other problems like version control, e.g. the NiftyDB product (relying on mx 1.9) may not support version 2.0 of mx which the sysadmin installed at top-level. There would be no way (apart from patching NiftyDB) to get back to a working setup.
Your notion of "self contained packages" is flawed. Reductio ad absurdum, it leads to downloading the OS with every app you install. (Crap, I hope I haven't given Sun any new ideas...).
While you do of course have a point in that packages should not be loaded twice (a smarter import mechanism written in Python could handle these situations along with the relative import), you should consider that Zope is an application. It could go the WinXX way and simply hack around in the OS, or wrap everything it needs up in the package itself. Both have their pros and cons, both are possible using relative imports (I'm starting to feel like one of those marketing guys ;). BTW, I really don't care much about this feature being coded in Python or C. All I would like is to have it enabled and available by default, so that writing self-contained packages becomes a standard in the Python world.

-- Marc-Andre Lemburg
"M.-A. Lemburg" wrote:
Gordon McMillan wrote:
Jim Fulton wrote: [deeply nested snippety snip]
This would motivate me to strangle the author of NiftyDB. He should say "This requires Marc-Andre's mx package", [...]
Yes.
The mx package installed in product dir won't be visible to outside scripts: it is self-contained and only works in the context of Zope.
But why shouldn't it be visible? The only approach which will not self-destruct IMO, is to segment on source-of-origin. Your package in your namespace, Gordon's in his, and Jim's in yet another. Given that there is no ordering relationship, that means three areas next to each other. If Jim takes M-A's package and modifies it, then that would be a reason to put M-A's-modified-by-Jim-package within Jim's area.
Currently, only the top-level installation of the mx package will work and this of course introduces other problems like version control, e.g. the NiftyDB product (relying on mx 1.9) may not support version 2.0 of mx which the sysadmin installed at top-level. There would be no way (apart from patching NiftyDB) to get back to a working setup.
Which points to a solution in a completely different direction: version numbers in package directories, like GNU and Tcl do, to name two.
[...] have it enabled and available by default, so that writing self-contained packages becomes a standard in the Python world.
A noble goal. Now this may be a silly question, but what's the issue here? I'm used to writing things which sort of look like this:

    import myGoobledygook
    utils = myGoobledygook
    print utils.fun(123)

Couldn't this be applied here as well:

    mxDT = aCoolImporter("Marc-Andre's latest date/time utilities")

In other words: figure out a way to get at the proper modules, then use an alias in your own code to stay away from naming/access dependencies? And if a parent wants to tell a submodule how to find it, can't it just set a variable at module level in that submodule, to reach it? Aliases, bookmarks, symlinks, shortcuts, env vars, globals, registries, DNS or LDAP servers, FS mount points - it's all the same issue.

-- Jean-Claude
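[Editorial aside: Jean-Claude's `aCoolImporter` is deliberately vague; a minimal sketch of the alias idea in today's Python might look like this. The registry contents and names are made up for the demo -- the stdlib `datetime` stands in for mx.DateTime.]

```python
import importlib

# Hypothetical alias registry: client code names what it needs in its
# own terms; only the registry knows the real (absolute) module path.
REGISTRY = {"date-time-utils": "datetime"}   # stand-in for mx.DateTime

def a_cool_importer(logical_name):
    """Resolve a logical name to a real module via the registry."""
    return importlib.import_module(REGISTRY[logical_name])

mxDT = a_cool_importer("date-time-utils")
```

Relocating the real package then means editing one registry entry, not every importing module.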
Jean-Claude Wippler wrote:
(snip)
The mx package installed in product dir won't be visible to outside scripts: it is self-contained and only works in the context of Zope.
But why shouldn't it be visible?
Because visibility has a cost. Making mx a top-level name means that someone else can't make it a top-level name. This is why packages are a good idea. If the need is local, the definition shouldn't be made global *just* to get around a limitation in spelling.
The only approach which will not self-destruct IMO, is to segment on source-of-origin. Your package in your namespace, Gordon's in his, and Jim's in yet another. Given that there is no ordering relationship, that means three areas next to each other.
It must be possible to package things together regardless of point of origin. Otherwise, you have a big obstacle to reuse. BTW, I think that there is ample evidence that relative imports will not cause anything to self destruct.
If Jim takes M-A's package and modifies it, then that would be a reason to put M-A's-modified-by-Jim-package within Jim's area.
But Jim wants to use M-A's package as a black box. I don't want to hack all of his imports due to a packaging detail. (snip)
Now this may be a silly question, but what's the issue here? I'm used to writing things which sort of look like this:

    import myGoobledygook
    utils = myGoobledygook
    print utils.fun(123)

Couldn't this be applied here as well:

    mxDT = aCoolImporter("Marc-Andre's latest date/time utilities")

In other words: figure out a way to get at the proper modules, then use an alias in your own code to stay away from naming/access dependencies?
And if a parent wants to tell a submodule how to find it, can't it just set a variable at module level in that submodule, to reach it?
Aliases, bookmarks, symlinks, shortcuts, env vars, globals, registries, DNS or LDAP servers, FS mount points - it's all the same issue.
Yes, these are options, as are import hooks. While these are possible alternatives, they are far less attractive than simply making it possible to spell relative imports. Why? Well, for one, they require a lot of cooperation among packages that might be written by totally different authors. I might come up with a handy relative import function, but how do I get M-A to use it? Well, I'm sure I could get him to use it, but how about all of the other people whose packages I want to reuse. No problem, we'll come up with a standard mechanism, maybe even a standard library module. Of course, someone is bound to realize sooner or later that it's silly to have a standard relative import library module *and* a standard import statement, and add the feature to standard import. :)

Jim

-- Jim Fulton mailto:jim@digicool.com Python Powered!
Technical Director (888) 344-4332 http://www.python.org
Digital Creations http://www.digicool.com http://www.zope.org

Under US Code Title 47, Sec.227(b)(1)(C), Sec.227(a)(2)(B) This email address may not be added to any commercial mail list without my permission. Violation of my privacy with advertising or SPAM will result in a suit for a MINIMUM of $500 damages/incident, $1500 for repeats.
Jean-Claude Wippler wrote:
"M.-A. Lemburg" wrote:
Gordon McMillan wrote:
Jim Fulton wrote: [deeply nested snippety snip]
This would motivate me to strangle the author of NiftyDB. He should say "This requires Marc-Andre's mx package", [...]
Yes.
Of course ;-) ... but that's not the point. Commercial products have to come with all batteries included, not just some of them. But you can't include everything known in the Python universe just to avoid conflicts when someone writes a cool plugin for your package.
The mx package installed in product dir won't be visible to outside scripts: it is self-contained and only works in the context of Zope.
But why shouldn't it be visible?
Because it would probably affect other apps using the same package. Just think of what happens to Win9x if you constantly update the DLLs...
The only approach which will not self-destruct IMO, is to segment on source-of-origin. Your package in your namespace, Gordon's in his, and Jim's in yet another. Given that there is no ordering relationship, that means three areas next to each other. If Jim takes M-A's package and modifies it, then that would be a reason to put M-A's-modified-by-Jim-package within Jim's area.
Currently, only the top-level installation of the mx package will work and this of course introduces other problems like version control, e.g. the NiftyDB product (relying on mx 1.9) may not support version 2.0 of mx which the sysadmin installed at top-level. There would be no way (apart from patching NiftyDB) to get back to a working setup.
Which points to a solution in a completely different direction: version numbers in package directories, like GNU and Tcl do, to name two.
A different topic -- an interesting one too :-) How does Java handle this problem ?
[...] have it enabled and available by default, so that writing self-contained packages becomes a standard in the Python world.
A noble goal.
And one which is only reachable by getting whatever mechanism is needed to make it work into the standard distribution -- not only a hook for relative imports but the real thing !
Now this may be a silly question, but what's the issue here? I'm used to writing things which sort of look like this:

    import myGoobledygook
    utils = myGoobledygook
    print utils.fun(123)

Couldn't this be applied here as well:

    mxDT = aCoolImporter("Marc-Andre's latest date/time utilities")

In other words: figure out a way to get at the proper modules, then use an alias in your own code to stay away from naming/access dependencies?
And if a parent wants to tell a submodule how to find it, can't it just set a variable at module level in that submodule, to reach it?
Aliases, bookmarks, symlinks, shortcuts, env vars, globals, registries, DNS or LDAP servers, FS mount points - it's all the same issue.
True, but please remember that we are not talking about adding some esoteric module storage to the standard mechanism (the one reachable via the "import" statement). Relative import only adds a logical notion to imports that opens up new possibilities which are not easily available using other techniques. BTW, would you want to miss relative file name paths ?

-- Marc-Andre Lemburg
Marc-Andre,
BTW, would you want to miss relative file name paths ?
Well, that thought did cross my mind while posting that reply... I'm not sure. Yes, they are convenient, but "current working directory" is not always a pleasant concept (yet more global state - think of tree walks and the inconvenience of having to get, alter, work, restore it - when using chdir to implement this). I don't use .. as often as might be expected. I use "cd ~/something" a lot more. While that may look relative at another level, it really is not (as "echo ~" shows). It does illustrate how nearly equivalent the two approaches are.

DOS/Windows has always had a broken relative path: the current drive. And the "hash/bang" headers of Unix seem to always use absolute paths. Some of this may seem to point to the need for relative paths. But I think it hides a more fundamental issue: you need to find out context. Once you do, relativeness no longer matters (obviously). A system which has a current directory and ".." is equivalent to one which has no such thing and passes a "starting directory" in the environment, say. I think a system with less global state is more modular -> preferable.

Another angle: I have built many types of tree structures, still do. Less and less of those contain a "parent" link. Instead of storing a parent you can just as easily keep state while descending into children.

- Unix file info does not maintain a parent directory, it's redundant.
- Directories do, and file system mount points are messy because of it.
- Afaik, it's considered bad style to use "../header.h" in C includes.
- Upward pointers can introduce cycles (no, not here, as has been said).
- In C++, member objects rarely need pointers to the enclosing object.

It is not for a module to know where it "is" in a hierarchy; it is for a parent to (sometimes) provide that reference when it uses the module. If you want access to a module called mx, and there may be many of them - then you ask another module to locate it for you. One which is able to choose / decide for you.
Proximity may be a good discriminator, but the decision of proximity was not taken by you - and you won't know what rule is best. So you ask an "importer" (which could well always be the standard Python mechanism). Given the task to write such a beast, I'd probably want to implement the following module search as default:

- look for the module first in the parent (i.e. as sibling)
- move one level up, look again, etc
- all the way to the top

That's just one way to do it - proximity is not *always* what you want. So if Zope chooses a different packaging style, let it override import. But please don't build ".." into your modules, it doesn't belong there. Sorry for all the handwaving generalities. The issues are the same IMO, whether relativeness is provided or context, and therefore the outcome of this discussion will never be conclusive - they both work.

-- Jean-Claude
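[Editorial aside: the proximity search Jean-Claude describes can be sketched in a few lines. `known` is a hypothetical stand-in for the set of importable dotted names; the real search would of course walk directories.]

```python
def proximity_search(importer, known, name):
    """Look for `name` first as a sibling of `importer`, then one
    package level up, and so on to the top level."""
    parts = importer.split('.')[:-1]        # start in the parent package
    while parts:
        candidate = '.'.join(parts + [name])
        if candidate in known:
            return candidate
        parts.pop()                         # move one level up
    return name if name in known else None  # finally, the top level

known = {'mx.DateTime', 'mx.ODBC.Windows', 'DateTime'}
proximity_search('mx.ODBC.Windows', known, 'DateTime')   # -> 'mx.DateTime'
```

Note how the nearer `mx.DateTime` shadows the top-level `DateTime` here, which is exactly the acquisition-like behaviour discussed below.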
Jean-Claude Wippler wrote:
Marc-Andre,
BTW, would you want to miss relative file name paths ?
Well, that thought did cross my mind while posting that reply...
I'm not sure. Yes, they are convenient, but "current working directory" is not always a pleasant concept (yet more global state - think of tree walks and the inconvenience of having to get, alter, work, restore it - when using chdir to implement this). I don't use .. as often as might be expected. I use "cd ~/something" a lot more. While that may look relative at another level, it really is not (as "echo ~" shows). It does illustrate how nearly equivalent the two approaches are.
DOS/Windows has always had a broken relative path: the current drive.
And the "hash/bang" headers of Unix seem to always use absolute paths.
...and people have turned to /usr/bin/env to get around this necessity :-)
Some of this may seem to point to the need for relative paths. But I think it hides a more fundamental issue: you need to find out context. Once you do, relativeness no longer matters (obviously). A system which has a current directory and ".." is equivalent to one which has no such thing and passes a "starting directory" in the environment, say.
I think a system with less global state is more modular -> preferable.
Agreed. I would much prefer imputil.py to become a standard -- then I could add my hook into its chain and wouldn't have to argue for using relative imports ;-) But then, how would I tell if someone else already has integrated such a hook in the chain ? I sure wouldn't want to add another one just to make sure... If the standard mechanism already knows about '__' then I wouldn't have to worry.
Another angle: I have built many types of tree structures, still do. Less and less of those contain a "parent" link. Instead of storing a parent you can just as easily keep state while descending into children.
- Unix file info does not maintain a parent directory, it's redundant.
- Directories do, and file system mount points are messy because of it.
- Afaik, it's considered bad style to use "../header.h" in C includes.
- Upward pointers can introduce cycles (no, not here, as has been said).
- In C++, member objects rarely need pointers to the enclosing object.
Note that my patch does exactly this: it resolves the __ while descending into the package structure -- without any explicit back reference stored in the modules.
It is not for a module to know where it "is" in a hierarchy, it is for a parent to (sometimes) provide that reference when it uses the module.
Right.
If you want access to a module called mx, and there may be many of them - then you ask another module to locate it for you. One which is able to choose / decide for you. Proximity may be a good discriminator, but the decision of proximity was not taken by you - and you won't know what rule is best. So you ask an "importer" (which could well always be the standard Python mechanism). Given the task to write such a beast, I'd probably want to implement the following module search as default:

- look for the module first in the parent (i.e. as sibling)
- move one level up, look again, etc
- all the way to the top

That's just one way to do it - proximity is not *always* what you want.
Now this is an interesting idea... looks like acquisition at the module import level. Right now the scheme is as follows:

- look for the module in the importing module's dir
- look for the module on the PYTHONPATH

You would just add a third lookup step in between the two... it would cause many more module lookups though.
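[Editorial aside: the extended lookup order can be written down as a function producing the list of names tried. Purely illustrative -- the real lookup works on directories and sys.path, not dotted strings.]

```python
def lookup_order(importing_pkg, name):
    """List the dotted names tried, in order, under the extended
    scheme: the importing module's own package first, then each
    enclosing package (the proposed extra step), then the plain
    top-level name found via the PYTHONPATH search."""
    parts = importing_pkg.split('.')
    order = []
    while parts:
        order.append('.'.join(parts + [name]))
        parts.pop()
    order.append(name)          # top level / PYTHONPATH last
    return order

lookup_order('mx.ODBC', 'DateTime')
# -> ['mx.ODBC.DateTime', 'mx.DateTime', 'DateTime']
```

The "many more module lookups" cost is visible directly: a package nested n levels deep tries up to n+1 names per import instead of 2.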
So if Zope chooses a different packaging style, let it override import. But please don't build ".." into your modules, it doesn't belong there.
Sorry for all the handwaving generalities. The issues are the same IMO, whether relativeness is provided or context, and therefore the outcome of this discussion will never be conclusive - they both work.
-- Marc-Andre Lemburg
M.-A. Lemburg wrote:
Agreed. I would much prefer imputil.py to become a standard -- then I could add my hook into its chain and wouldn't have to argue for using relative imports ;-)
Hear hear!
But then, how would I tell if someone else already has integrated such a hook in the chain ? I sure wouldn't want to add another one just to make sure...
Actually, imputil was designed to be used either generally or specifically. In other words, a package can install a link in the chain just for its contents. The hook can pass on the request as soon as it realizes that the request is not "his". The overhead involved is probably insignificantly different from having complex tests in a single generalized hook.
- Gordon
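[Editorial aside: the chain Gordon describes is a chain of responsibility. The sketch below shows the idea only -- the class and its `find` method are hypothetical, not the real imputil.py API.]

```python
class HookLink:
    """A package-specific import hook in a chain: it answers only for
    names under its own prefix and passes everything else on."""
    def __init__(self, prefix, modules, nxt=None):
        self.prefix = prefix
        self.modules = modules      # dotted name -> "module" stand-in
        self.nxt = nxt
    def find(self, name):
        if name == self.prefix or name.startswith(self.prefix + '.'):
            return self.modules.get(name)
        # Not "his" -- pass the request down the chain.
        return self.nxt.find(name) if self.nxt else None

fallback = HookLink('std', {'std.os': '<std.os>'})
chain = HookLink('mx', {'mx.DateTime': '<mx.DateTime>'}, nxt=fallback)
chain.find('std.os')    # the mx hook passes this down to the fallback
```

Each package's hook only ever tests its own prefix, which is why the per-link overhead stays about the same as one generalized hook with many tests.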
M.-A. Lemburg wrote:
Guido van Rossum wrote:
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope or Queue in mxQueue (to be released) and Mark's win32 stuff.
I can see your point (although I also believe that authors - Guido excepted - should come up with collision-free names, probably in a Java-ish scheme). But I strongly believe that import.c should be left alone, maybe even to die. There are too many people doing import hooks to make fiddling with its behavior safe. I'm also a strong proponent of Greg's imputil.py scheme, which makes it a breeze to do import hooks. And my experience disproves the notion that the import mechanism needs to be in C. If you don't believe me, try the ZlibArchive stuff (which is cross platform) from my Win32 installer stuff. You can pack the standard library into one 475K file, and get a perceptible performance boost. OTOH, I could see doing a framework of packages, in which case relative imports might be handy. This seems to work fine:

    def relimp(nm):
        rpth = string.split(nm, '/')
        tpth = string.split(__name__, '.')[:-1]
        for node in rpth:
            if node == '..':
                del tpth[-1]
            else:
                tpth.append(node)
        #print `tpth`
        return __import__(string.join(tpth, '.'))

    b = relimp('../packageA2.b')
As more and more packages appear, we run into this problem more and more often. Relative imports would greatly reduce the risk of these collisions, because packages could be written to be self-contained, meaning that they can reach their internal modules via relative paths only and don't have to know the absolute paths by which they are reachable from other parts of the Python universe.
I could then make all my packages self-contained and distribute them in two forms without having to change a single line for the added support:
1. under the new 'mx' package, e.g. mx.DateTime 2. for backward compatibility under 'DateTime'
Another major advantage is that I could take any other self-contained package and install it under absolute paths of my choice, e.g. put Zope under org.zope.core, Python under org.python.core etc., without harming their functionality or having to dive deep into their import structures to fix them manually.
To further enhance this mechanism I would like to have an alias mechanism in import, pickle et al. so that changes in the package structures become manageable without user intervention: pickles are a major problem whenever import structures change because they store absolute module names.
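(An editorial sketch of the alias idea in present-day Python, not part of MAL's patch: pickle lets you rewrite stored module names by overriding Unpickler.find_class. The DateTime -> mx.DateTime entry is hypothetical.)

```python
import io
import pickle

# Hypothetical renames: module name stored in old pickles -> new location.
ALIASES = {
    "DateTime": "mx.DateTime",
}

class AliasingUnpickler(pickle.Unpickler):
    """Unpickler that rewrites stored module names before lookup."""
    def find_class(self, module, name):
        module = ALIASES.get(module, module)
        return super().find_class(module, name)

def loads(data):
    """Like pickle.loads(), but honoring the alias table."""
    return AliasingUnpickler(io.BytesIO(data)).load()
```

Old pickles written under the original module name then load transparently from the new package location, which is the "user intervention" MAL wants to avoid.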
-- Marc-Andre Lemburg ______________________________________________________________________ Y2000: 111 days left Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
_______________________________________________ Python-Dev maillist - Python-Dev@python.org http://www.python.org/mailman/listinfo/python-dev
- Gordon
Gordon McMillan wrote:
M.-A. Lemburg wrote:
Guido van Rossum wrote:
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
Of course, you could do everything with absolute names, but then the package author would dictate the complete absolute path which is not always desirable since it can cause name collisions such as DateTime in mxDateTime and Zope or Queue in mxQueue (to be released) and Mark's win32 stuff.
I can see your point (although I also believe that authors - Guido excepted - should come up with collision-free names, probably in a Java-ish scheme).
Agreed.
But I strongly believe that import.c should be left alone, maybe even to die. There are too many people doing import hooks to make fiddling with its behavior safe.
I'm also a strong proponent of Greg's imputil.py scheme, which makes it a breeze to do import hooks. And my experience disproves the notion that the import mechanism needs to be in C. If you don't believe me, try the ZlibArchive stuff (which is cross platform) from my Win32 installer stuff. You can pack the standard library into one 475K file, and get a perceptible performance boost.
You're probably right in saying that we don't need the code in C. I just wanted to avoid yet another import hook being incompatible with all the other existing hooks. Perhaps we should restart the import discussion all over and come up with a more flexible, 100% compatible framework based on Greg's imputil scheme. Then I could add my hook for the relative imports and be happy ;-)
OTOH, I could see doing a framework of packages, in which case relative imports might be handy. This seems to work fine:
    def relimp(nm):
        rpth = string.split(nm, '/')
        tpth = string.split(__name__, '.')[:-1]
        for node in rpth:
            if node == '..':
                del tpth[-1]
            else:
                tpth.append(node)
        #print `tpth`
        return __import__(string.join(tpth, '.'))
b = relimp('../packageA2.b')
This is pretty much how my patch works... except that I use the ni.py style '__' pseudo package instead of '../'.

-- Marc-Andre Lemburg
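(Editorial note from a later vantage point: Python eventually standardized explicit relative imports in PEP 328, spelled with leading dots rather than '__'. A runnable sketch of the demo package's bc.py case, built in a temporary directory; file contents here are illustrative, not from the attached zip:)

```python
import os
import sys
import tempfile

# Recreate the demo layout:  a/ab.py  and  a/b/bc.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "a", "b"))
for pkg in ("a", os.path.join("a", "b")):
    open(os.path.join(root, pkg, "__init__.py"), "w").close()
with open(os.path.join(root, "a", "ab.py"), "w") as f:
    f.write("GREETING = 'hello from a.ab'\n")
with open(os.path.join(root, "a", "b", "bc.py"), "w") as f:
    # MAL's proposed "from __ import ab", in today's spelling:
    f.write("from .. import ab\n")

sys.path.insert(0, root)
import a.b.bc  # bc.py reaches its uncle module one level up
print(a.b.bc.ab.GREETING)
```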
On Mon, 13 Sep 1999, M.-A. Lemburg wrote:
Gordon McMillan wrote:
But I strongly believe that import.c should be left alone, maybe even to die. There are too many people doing import hooks to make fiddling with its behavior safe.
I'm also a strong proponent of Greg's imputil.py scheme, which makes it a breeze to do import hooks. And my experience disproves the notion that the import mechanism needs to be in C. If you don't believe me, try the ZlibArchive stuff (which is cross platform) from my Win32 installer stuff. You can pack the standard library into one 475K file, and get a perceptible performance boost.
You're probably right in saying that we don't need the code in C. I just wanted to avoid yet another import hook being incompatible with all the other existing hooks.
Perhaps we should restart the import discussion all over and come up with a more flexible, 100% compatible framework based on Greg's imputil scheme. Then I could add my hook for the relative imports and be happy ;-)
I'll code up some patches to strip down import.c and replace it with imputil.py and some bootstrap/support C code -- as soon as 1.6 development opens up. Guido?

Cheers,
-g

-- Greg Stein, http://www.lyra.org/
Guido van Rossum wrote:
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope.
I'd much rather use absolute package names for anything that's not in the same directory as the current module.
I'll second Marc-Andre here. A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed. Jim -- Jim Fulton mailto:jim@digicool.com Python Powered! Technical Director (888) 344-4332 http://www.python.org Digital Creations http://www.digicool.com http://www.zope.org Under US Code Title 47, Sec.227(b)(1)(C), Sec.227(a)(2)(B) This email address may not be added to any commercial mail list with out my permission. Violation of my privacy with advertising or SPAM will result in a suit for a MINIMUM of $500 damages/incident, $1500 for repeats.
[MAL]
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope. [GvR] I'd much rather use absolute package names for anything that's not in the same directory as the current module. [JimF] I'll second Marc-Andre here.
A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed.
You never need to know where it is installed. When I said absolute package name I meant package name (e.g. zope.foo.bar.subpack) not filename. As Tim has argued, the ability to change the name of the toplevel here is a liability, not a feature. (On reduced bandwidth here, hence not so subtle.) --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
[MAL]
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope. [GvR] I'd much rather use absolute package names for anything that's not in the same directory as the current module. [JimF] I'll second Marc-Andre here.
A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed.
You never need to know where it is installed. When I said absolute package name I meant package name (e.g. zope.foo.bar.subpack) not filename. As Tim has argued, the ability to change the name of the toplevel here is a liability, not a feature.
I wasn't referring to file path location either, but to package namespace location. See the example I just posted in reply to Tim.

Jim
Guido van Rossum wrote:
[MAL]
I've spent the last two hours trying to get relative package imports to work because I need them for my extension packages which will soon all move under a new top-level package name to overcome the conflicts with PIL and Zope. [GvR] I'd much rather use absolute package names for anything that's not in the same directory as the current module. [JimF] I'll second Marc-Andre here.
A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed.
You never need to know where it is installed. When I said absolute package name I meant package name (e.g. zope.foo.bar.subpack) not filename. As Tim has argued, the ability to change the name of the toplevel here is a liability, not a feature.
I think Jim meant the absolute package name just like you and I did. I don't really understand why you and Tim are opposed to relative imports... Tim's argument about sysadmins installing packages under fantasy names isn't really all that realistic, IMHO, even though it would work. The ability to host a subpackage inside another package is a very attractive feature for large package based applications: you don't have to edit a single line of the subpackage's source code and can install/update the subpackage without harming its functionality within the system.

BTW: The patch I posted does not add a circular reference to the modules as the old ni.py implementation (sometimes) did. Since the __ handling is done dynamically and on import only, there is no need to add a module attribute __ pointing to the parent module.

-- Marc-Andre Lemburg
[relative package imports]
[JimF]
I'll second Marc-Andre here.
A significant headache occurs when you have a package
that has sub-packages. Sub-packages need to be able to
reference other sub-packages within the same package without
knowing where the containing package is installed.
[GvR]
You never need to know where it is installed. When I said absolute
package name I meant package name (e.g. zope.foo.bar.subpack) not
filename. As Tim has argued, the ability to change the name of the
toplevel here is a liability, not a feature.
In between. I can see relative packages as *one* way of handling certain problems. Consider:

    zope.Win32
    zope.Unix

with both of these having alternate implementations of subpackages foo and bar. Then for (the current) foo.a to get to (the current) bar.b, using a relative import seems a natural.

This can, of course, be done in pure Python. So can doing things in zope.__init__.py that make the appropriate implementations of foo and bar appear to be zope.foo and zope.bar. On any criteria I can think of, this would be a superior solution. (*)

What I am against is further complicating the already over complicated built in import mechanism.

(*) such as a zope.__init__.py that looks like this:

    import sys
    if sys.platform[:3] == 'win':
        nm = __name__ + '.Win32'
    else:
        nm = __name__ + '.Unix'
    new = __import__(nm)
    sys.modules[__name__] = sys.modules[nm]

- Gordon
Gordon McMillan wrote:
[relative package imports] [JimF]
I'll second Marc-Andre here.
A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed. [GvR] You never need to know where it is installed. When I said absolute package name I meant package name (e.g. zope.foo.bar.subpack) not filename. As Tim has argued, the ability to change the name of the toplevel here is a liability, not a feature.
In between. I can see relative packages as *one* way of handling certain problems. Consider:
zope.Win32 zope.Unix
with both of these having alternate implementations of subpackages foo and bar. Then for (the current) foo.a to get to (the current) bar.b, using a relative import seems a natural.
I wouldn't use relative imports for this.
This can, of course, be done in pure Python. So can doing things in zope.__init__.py that make the appropriate implementations of foo and bar appear to be zope.foo and zope.bar. On any criteria I can think of, this would be a superior solution. (*)
I agree. This is not a good motivating example for relative imports.
What I am against is further complicating the already over complicated built in import mechanism.
The current scheme doesn't seem all that complicated to me. I wouldn't be in favor of making it more complicated if there wasn't a good reason. I think that, in working on the Zope framework, I've found some pretty good reasons for relative imports.
(*) such as a zope.__init__.py that looks like this:
    import sys
    if sys.platform[:3] == 'win':
        nm = __name__ + '.Win32'
    else:
        nm = __name__ + '.Unix'
    new = __import__(nm)
    sys.modules[__name__] = sys.modules[nm]
Right, this would be silly. This is not a good example.

Jim
Jim Fulton wrote:
I wouldn't be in favor of making it more complicated if there wasn't a good reason. I think that, in working on the Zope framework, I've found some pretty good reasons for relative imports.
And in a later message:
I don't think the Package structure of Zope is flawed *except* for the fact that it is one level too *shallow*. The ability to do relative imports would be very helpful for the work we're doing.
But I haven't seen explained what it is that Zope is doing where relative packages would be helpful. --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
Jim Fulton wrote:
I wouldn't be in favor of making it more complicated if there wasn't a good reason. I think that, in working on the Zope framework, I've found some pretty good reasons for relative imports.
And in a later message:
I don't think the Package structure of Zope is flawed *except* for the fact that it is one level too *shallow*. The ability to do relative imports would be very helpful for the work we're doing.
But I haven't seen explained what it is that Zope is doing where relative packages would be helpful.
I posted an example in an earlier message. I'll recast it here, hopefully more eloquently. :) I'll also offer an alternate proposal that also solves my (and I suspect, Marc-Andre's) problem.

Zope is an application platform. It provides a mechanism for developers to plug their own products into Zope. The idea is that someone gets Zope from zope.org and installs it. Then they get third-party products from other places. Zope products are python packages installed as sub-packages of the Zope 'Products' package. Products are generally self-contained. If they need anything that's not part of standard Python or standard Zope, they need to include it or install what they need in a sub-package of another Zope package, 'Shared'. Because products come from "third parties", it is important that they be self contained. Making assumptions about the Zope or Python environments or, worse, modifying the Zope or Python environments is a bad idea.

In this context, consider the following concrete, though fictional example. Aaron has written a collection of modules that implement an RDBMS system for Python, gadfly. He also has a set of modules for parsing, kjParsing, which is needed by gadfly. Currently, these are just a bunch of top-level modules distributed as a combined collection. It would make sense to turn these into two packages, gadfly and kjParsing. Now, if this was done, then the gadfly package would need to use package imports for kjParsing modules, as in:

    import kjParsing.kjParser

So far, so good. Now, suppose that someone wants to create a Zope product, ZGadfly, that used gadfly. The simplest approach would be to include the gadfly and kjParsing packages in their Zope product. Of course, this won't work, because imports, like the one above, will fail, because kjParsing is no longer a top-level package. It wouldn't do any good to move gadfly and kjParsing to the shared package, although that might be desirable to share these packages with other products. They could try to stuff the packages into the Zope or Python paths, but that would break the rules and lead to problems in the long term.

Hopefully, this illustrates the problem. I think that this will be a common problem in the future, as people build bigger and bigger systems with Python that reuse other people's packages. I'd be curious to hear how folks would solve this problem.

Personally, I'd like the problem to go away. :) I'd like the Python import rules to change to make this solvable without import hooks or path hacking. I can think of two ways to approach this:

- Relative parent imports:

    import __.kjParsing.kjParser

  Note that Python already supports relative imports for going down. For example, we use gadfly and kjParsing together as a single sub-package of our ZGadflyDA product.

- Gradually less local searches. Currently, when doing an import in a package, two paths are searched, the package path and then the Python path. If there are intermediate packages, then perhaps their paths should be searched as well. For example, suppose we have the directory structure:

    Products/
        ZGadfly/
            gadfly/
            kjParsing/

  where Products is a top-level package, and we did an import in a module in gadfly:

    import kjParsing.kjParser

  Python would search the path of the package Products.ZGadfly.gadfly first, as it does now. This search would fail. Then it would search the path of Products.ZGadfly, where it would find kjParsing, and the import would succeed.

This approach has the benefits:

  o It solves the problem. :)
  o It has no impact on un-nested packages.
  o It requires no code changes; it doesn't use the ugly __.

Thoughts?

Jim
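(An editorial sketch of the lookup order Jim's second proposal implies: plain name computation only, not wired into the import machinery. The module names are taken from his example; the function name is hypothetical.)

```python
def candidate_names(importer, target):
    """Candidate fully-qualified names for `target`, most local first.

    `importer` is the __name__ of the module doing the import, e.g.
    'Products.ZGadfly.gadfly.database'; `target` is what it asked for,
    e.g. 'kjParsing.kjParser'.  The proposal: try the importing package,
    then each intermediate package, then finally the top level.
    """
    parts = importer.split('.')[:-1]          # enclosing package path
    names = []
    while parts:
        names.append('.'.join(parts) + '.' + target)
        parts.pop()                           # step up one package level
    names.append(target)                      # last resort: global name
    return names
```

For a module named Products.ZGadfly.gadfly.database importing kjParsing.kjParser, this yields Products.ZGadfly.gadfly.kjParsing.kjParser, then Products.ZGadfly.kjParsing.kjParser, then Products.kjParsing.kjParser, then kjParsing.kjParser, which is exactly the "gradually less local" order described above.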
JF> Zope products are python packages installed as sub-packages of
JF> the Zope 'Products' packages. Products are generally
JF> self-contained. If they need anything that's not part of
JF> standard Python or standard Zope, they need to include it or
JF> install what they need in a sub-package of another Zope
JF> package, 'Shared'.

Jim, from your description, it sounds like Zope should be putting both the `Shared' and possibly `Products' directories on sys.path. Maybe it does and I'm still missing your point. However if they are not, then wouldn't this solve your problem? A user would drop the kjParsing directory into Shared, and then all the import kjParsing.kjParser statements would Just Work.

-Barry
"Barry A. Warsaw" wrote:
JF> Zope products are python packages installed as sub-packages of
JF> the Zope 'Products' packages. Products are generally
JF> self-contained. If they need anything that's not part of
JF> standard Python or standard Zope, they need to include it or
JF> install what they need in a sub-package of another Zope
JF> package, 'Shared'.
Jim, from your description, it sounds like Zope should be putting both the `Shared' and possibly `Products' directories on sys.path.
It actually does now, although that is bad. What if some future python distribution has a Products package? Or a Products.py? A product is actually a sub-package of Products. Even if Products was not a package, the Product would be, and gadfly and kjParsing would be sub-packages. There is a similar issue with Shared, since Shared software is organized by organization, as in 'Shared.DC.ZRDB'.
Maybe it does and I'm still missing your point.
The point is that stuffing arbitrary things into a single global name space is bad. That's why packages were invented. Unless packages can be effectively nested, you haven't gained anything but time.
However if they are not, then wouldn't this solve your problem?
No, see above.
A user would drop the kjParsing directory into Shared, and then all the import kjParsing.kjParser statements would Just Work.
But then they'd be dumping kjParsing into the global package namespace. Eventually, there will be collisions.

Jim
Jim Fulton wrote:
It actually does now, although that is bad. What if some future python distribution has a Products package? Or a Products.py?
Name it zopeProducts. Or Zope/Products and Zope/Shared.
A product is actually a sub-package of Products. Even if Products was not a package, the Product would be, and gadfly and kjParsing would be sub-packages. There is a similar issue with Shared, since Shared software is organized by organization, as in 'Shared.DC.ZRDB'.
See below.
The point is that stuffing arbitrary things into a single global name space is bad. That's why packages were invented. Unless packages can be effectively nested, you haven't gained anything but time.
Packages nest effectively. Subpackages just don't masquerade as top level packages. [Barry]
A user would drop the kjParsing directory into Shared, and then all the import kjParsing.kjParser statements would Just Work.
But then they'd be dumping kjParsing into the global package namespace. Eventually, there will be collisions.
You are trying to do two incompatible things at once. You are trying to be both a standalone application, and a Python installation. You can be both, just not at the same time.

To be a standalone application, you can use tricks that Jim Ahlstrom pointed out, or tricks from my Installer. The major point being that sys.path has nothing to do with the PYTHONPATH environment variable. You now have an installation where you can be assured of no collisions, because you control it completely. Of course, that will make liars out of the people at your company who claim you can mix and match pieces of Zope and your own stuff as you please.

People who want to mix and match need Zope to play by normal Python rules - and that means packages are installed at the top level. Collisions happen, but your audience is people who *want* to do things differently and *want* control of their installations.

By creating "self contained" packages and putting them in a normal Python installation, you're just pushing your headache off on the user. You're co-opting his installation. Yeah, now you have two versions, but they're 99% the same. Build the normal Python one first and distribute it to the hackers. Then install some import and startup hooks, and distribute that to your "users".

I, for one, still use Bobo, because Zope doesn't play fair and I have no need or desire for yet another Python installation.

- Gordon
Gordon McMillan wrote:
Jim Fulton wrote:
(snip)
Packages nest effectively. Subpackages just don't masquerade as top level packages.
They do currently when imported from the containing package.
[Barry]
A user would drop the kjParsing directory into Shared, and then all the import kjParsing.kjParser statements would Just Work.
But then they'd be dumping kjParsing into the global package namespace. Eventually, there will be collisions.
You are trying to do two incompatible things at once. You are trying to be both a standalone application, and a Python installation. You can be both, just not at the same time.
We don't try to be a Python installation. We try to be a Python application that is a platform for Zope applications.
To be a standalone application, you can use tricks that Jim Ahlstrom pointed out, or tricks from my Installer. The major point being that sys.path has nothing to do with the PYTHONPATH environment variable. You now have an installation where you can be assured of no collisions, because you control it completely.
Of course, that will make liars out of the people at your company who claim you can mix and match pieces of Zope and your own stuff as you please.
Zope reuses *a lot* of code. I think Zope has a pretty good track record of working with existing code.
People who want to mix and match need Zope to play by normal Python rules - and that means packages are installed at the top level.
Sorry, I just don't buy that.
Collisions happen, but your audience is people who *want* to do things differently and *want* control of their installations.
Actually, many Zope users come to Python by way of Zope. They want to be able to install Zope products without worrying about what their Zope or Python installation looks like. Zope developers want to be able to have their products drop into a Zope installation easily.
By creating "self contained" packages and putting them in a normal Python installation, you're just pushing your headache off on the user. You're co-opting his installation.
No, putting packages in the Python path co-opts their installation. Putting Zope-specific products that happen to include reusable Python packages in a Zope-specific package leaves their installation alone.
Yeah, now you have two versions, but they're 99% the same. Build the normal Python one first and distribute it to the hackers. Then install some import and startup hooks, and distribute that to your "users".
This is exactly what I want to prevent. I'd like package developers to develop their packages once, without making them use esoteric import hooks. OTOH I'd like to be able to *use* their packages wherever I damn well please. I don't want to be forced to put their packages in the Python path.
I, for one, still use Bobo, because Zope doesn't play fair
In what way does Zope not play fair?
and I have no need or desire for yet another Python installation.
Cool. You don't need to use another Python installation to use Zope. In fact, for development, we use a totally standard Python distribution. Developers here are not *allowed* to add anything special to the Python installations on the development machines here because we want to *make sure* that we can run with stock Python installations.

Jim
Jim Fulton wrote:
Gordon McMillan wrote:
Jim Fulton wrote:
[Barry]
A user would drop the kjParsing directory into Shared, and then all the import kjParsing.kjParser statements would Just Work.
To be a standalone application, you can use tricks that Jim Ahlstrom pointed out, or tricks from my Installer. The major point being that sys.path has nothing to do with the PYTHONPATH environment variable. You now have an installation where you can be assured of no collisions, because you control it completely.
JimF, I am still trying to understand this. Gordon, Barry and I think a good solution is to put "Shared" on a private sys.path, but you object to this and I don't understand. Specifically:

    zope/
    zope/Shared
    zope/Shared/mx/...
    zope/Shared/AnotherPackage/...
    zope/Shared/some_file.py
    zope/ZopePackage1/...
    zope/ZopePackage2/...
    zope/bin/python.exe

Suppose zope users must use the special binary zope/bin/python.exe. Suppose further that this binary reliably sets its sys.path to the directories it needs, namely zope, zope/Package*, zope/Shared. Suppose further that all user's packages must go into zope/Shared either as files or package directories and so are at the top level as package imports have always assumed. Then it seems that your zope app would absolutely work, and no other Python installation could damage it by changing PYTHONPATH, and that your zope installation would not damage any other Python installation either.

Is the reason this doesn't work the possibility of name collisions in zope/Shared? I thought before that you were worried about name collisions in another co-existing Python installation which may have unknown packages installed, conflicting packages, or packages with incompatible versions. If so, you are trying to solve a problem more severe than I thought.

I have always assumed packages would have unique names and so a flat name space suffices. In particular, trying to design for conflicting names can not be supported by the current python *.pyl library file design, and in general would require a PYTHONPATH mechanism. Such a PYTHONPATH mechanism would have to be "local" to packages. That is, PYTHONPATH would vary depending on the package doing the importing. Are you sure this problem needs a solution right now?

Jim Ahlstrom
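(A rough editorial sketch of the launcher behavior described above: the special binary builds its own sys.path and ignores PYTHONPATH. The directory names and install root are assumptions, and the function name is hypothetical.)

```python
import os
import sys

def set_private_path(home):
    """Replace sys.path with only the application's own directories,
    regardless of what PYTHONPATH put there (hypothetical layout)."""
    sys.path[:] = [
        home,                            # zope/ itself
        os.path.join(home, "Shared"),    # shared third-party packages
        os.path.join(home, "lib"),       # assumed stdlib/app-lib location
    ]

# A zope/bin/python.exe wrapper would call, for example:
#     set_private_path("/opt/zope")
```

Because sys.path now contains only directories the application controls, no external installation can inject conflicting packages, which is the isolation property the message argues for.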
"JF" == Jim Fulton <jim@digicool.com> writes:
JF> It actually does now, although that is bad. What if some
JF> future python distribution has a Products package? Or a
JF> Products.py?

Wouldn't collide, because in my scheme, Shared isn't a package, it's a directory on sys.path. Everything inside Shared is a top-level package. You'd never "import Shared.DC.ZRDB" but instead you would "import DC.ZRDB".

>> A user would drop the kjParsing directory into Shared, and then
>> all the import kjParsing.kjParser statements would Just Work.

JF> But then they'd be dumping kjParsing into the global package
JF> namespace. Eventually, there will be collisions.

Do you really think so? Someone else is going to write a kjParsing module? I think we're safe from this for the foreseeable future through convention and peer pressure. Or is the concern that there will be two Products which use different versions of kjParsing? I'm also not concerned here because if my product needs a very specific version of kjParsing I can't see any way around shipping my own version, subpackaging it, and using an absolute package name to ensure that I'm getting at exactly the one I need. This is the approach Mailman takes.

-Barry
"Barry A. Warsaw" wrote:
"JF" == Jim Fulton <jim@digicool.com> writes:
JF> It actually does now, although that is bad. What if some
JF> future python distribution has a Products package? Or a
JF> Products.py?
Wouldn't collide, because in my scheme, Shared isn't a package, it's a directory on sys.path. Everything inside Shared is a top-level package. You'd never "import Shared.DC.ZRDB" but instead you would "import DC.ZRDB".
I wasn't talking about your scheme. I was talking about how I thought it should be.
>> A user would drop the kjParsing directory into Shared, and then >> all the import kjParsing.kjParser statements would Just Work.
JF> But then they'd be dumping kjParsing into the global package JF> namespace. Eventually, there will be collisions.
Do you really think so?
Yes.
Someone else is going to write a kjParsing module?
No, but sooner or later, two people are going to write two "Parsing" packages, or two "Oracle" packages, or two ....

Are you suggesting that package names should begin with some hash value to assure that they are unique? (What does kj stand for anyway? :) I guess we could use some form of GUID to be really safe .....
I think we're safe from this for the foreseeable future through convention and peer pressure.
8^o (I feel like I should respond to this in some way but .... I just can't understand this kind of argument and don't know how to frame a response.)
Or is the concern that there will be two Products which use different versions of kjParsing?
No, at least not primarily.
I'm also not concerned here because if my product needs a very specific version of kjParsing I can't see any way around shipping my own version, subpackaging it, and using an absolute package name to ensure that I'm getting at exactly the one I need.
But even in this case, if you used a relative import, you *would* get the specific version you want.

I give up. I'll just have to write some import hook to get the behavior I want in Zope. I will probably implement the second alternative I wrote about in my earlier document. I'll get Python to look at intermediate package paths before it looks at the main path. That way, any package that works as a top-level package will also work as a sub-package without modification.

Jim

--
Jim Fulton           mailto:jim@digicool.com   Python Powered!
Technical Director   (888) 344-4332            http://www.python.org
Digital Creations    http://www.digicool.com   http://www.zope.org

Under US Code Title 47, Sec.227(b)(1)(C), Sec.227(a)(2)(B) This email address may not be added to any commercial mail list without my permission. Violation of my privacy with advertising or SPAM will result in a suit for a MINIMUM of $500 damages/incident, $1500 for repeats.
"JF" == Jim Fulton <jim@digicool.com> writes:
>> I think we're safe from this for the foreseeable future through
>> convention and peer pressure.

JF> 8^o
JF> (I feel like I should respond to this in some way but ....
JF> I just can't understand this kind of argument and don't
JF> know how to frame a response.)

Maybe I should have included half a smiley.

Python may eventually need package naming rules as strict as Java's. I think it's a good idea now for those who are writing packages to consider trying to pick unique top-level package names. We might decide to release our stuff under the `cnri' top level package. That's probably unique enough. Or `koe', or if we're really paranoid us.va.reston.cnri.koe :-!

Personally, I think `Mailman' as a top-level package name is pretty unique, so I don't plan to change what I'm doing there.

-Barry
BAW> Or `koe', or if we're really paranoid us.va.reston.cnri.koe :-!

Umm, that would be "Knowbots Over Everything" or "Knowledge Of Everyone"? ;-)

Skip
BAW> Or `koe', or if we're really paranoid us.va.reston.cnri.koe :-!
Skip> Umm, that would be "Knowbots Over Everything" or "Knowledge
Skip> Of Everyone"?

Actually, "koe" is Dutch for cow. I like the idea of cows over everything -- reminds me of the flying pig on an old Pink Floyd cover :-)

--Guido van Rossum (home page: http://www.python.org/~guido/)
Jim Fulton wrote:
In this context, consider the following concrete, though fictional example. ............
The classical solution would be to add "Shared" to sys.path. But how to do this? And what if another installation changes it?
I'd be curious to hear how folks would solve this problem.
1) Ship a custom binary interpreter with critical modules frozen in, and which sets a custom sys.path before anything else happens. So you are not dependent at all on PYTHONPATH. Currently this works.

2) Add a new command line option which sets sys.path and always use it to start your app. It lists only critical zope directories so zope has its own private name space.

3) Add your own import hooks using imputil.py and boot it as in (1). Better if it were part of Python.

4) Use a new Python.pyl file which supports seeking from the end and thus allows concatenation of multiple *.pyl files. Each package is a foo.pyl, bar.pyl, and to distribute an app, you concatenate packages. The implied PYTHONPATH in each *.pyl file is "." except perhaps for the Python lib which is "Lib". Boot it as in (3).
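Option (2) boils down to computing a private sys.path at startup. A minimal sketch of the idea, assuming a made-up directory layout (`private_path` and the subdirectory names are hypothetical, not from Zope):

```python
import os

def private_path(app_root):
    """Return a self-contained sys.path for an application rooted at
    app_root, so it never depends on the user's PYTHONPATH.  The
    subdirectory names here are invented for illustration."""
    return [os.path.join(app_root, d) for d in ("lib", "Shared", "Products")]

# A start script would then install it before importing anything else:
#     import sys
#     sys.path[:] = private_path("/opt/zope")
```

The point is that the application owns its entire module namespace; nothing a user sets in the environment can shadow its packages.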
Personally, I'd like the problem to go away. :)
I have the same problem, so me too.
I'd like the Python import rules to change to make this solvable without import hooks or path hacking. I can think of two ways to approach this:
- Relative parent imports:
import __.kjParsing.kjParser
OK by me, but doesn't solve the Grand Problem.
- Gradually less local searches.
I am currently bewildered by Python's import mechanism, and this may push me into brain damage. Well, *more* brain damage.

Jim Ahlstrom
"James C. Ahlstrom" wrote:
Jim Fulton wrote:
In this context, consider the following concrete, though fictional example. ............
The classical solution would be to add "Shared" to sys.path. But how to do this? And what if another installation changes it?
This doesn't work, because 'Shared' has sub-packages by organization.
I'd be curious to hear how folks would solve this problem.
1) Ship a custom binary interpreter with critical modules frozen in, and which sets a custom sys.path before anything else happens. So you are not dependent at all on PYTHONPATH. Currently this works.
This is what we do for our binary distributions, but it doesn't do anything for third-party products added to Zope.
2) Add a new command line option which sets sys.path and always use it to start your app. It lists only critical zope directories so zope has its own private name space.
We have a start script that takes care of this. This gets Zope's top-level names (of which there are too many already) into the path. This doesn't help with third-party products added to Zope.
3) Add your own import hooks using imputil.py and boot it as in (1).
But imputil has a global effect. I suppose I could implement what I want with imputil or by otherwise creating my own importer using the standard import hooks. I suspect that's what I'll have to do eventually.
Better if it were part of Python.

4) Use a new Python.pyl file which supports seeking from the end and thus allows concatenation of multiple *.pyl files. Each package is a foo.pyl, bar.pyl, and to distribute an app, you concatenate packages. The implied PYTHONPATH in each *.pyl file is "." except perhaps for the Python lib which is "Lib". Boot it as in (3).
I'm not looking for a way to replace the current package scheme. Also, I'm not just looking at an app, but at an app that contains apps (aka components).
Personally, I'd like the problem to go away. :)
I have the same problem, so me too.
I'd like the Python import rules to change to make this solvable without import hooks or path hacking. I can think of two ways to approach this:
- Relative parent imports:
import __.kjParsing.kjParser
OK by me, but doesn't solve the Grand Problem.
- Gradually less local searches.
I am currently bewildered by Python's import mechanism, and this may push me into brain damage. Well, *more* brain damage.
I don't see what's so complicated. You just use a gradually widening search, rather than an immediately widening search. I don't think that this is any more complicated than the current rule.

I actually prefer the second approach because it doesn't require any code changes to take advantage of it. I may implement it for Zope if nothing is done in standard Python.

Jim
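Jim's "gradually widening" rule can be expressed as a name-resolution order. The helper below is a hypothetical sketch of that rule, not code from Zope or the interpreter:

```python
def search_order(importing_package, name):
    """Candidate fully qualified names for `name` as seen from inside
    `importing_package`: each enclosing package is tried innermost
    first, and the plain top-level name is tried only at the end."""
    parts = importing_package.split(".") if importing_package else []
    candidates = []
    while parts:
        candidates.append(".".join(parts) + "." + name)
        parts.pop()
    candidates.append(name)  # finally, the top-level search
    return candidates

# From inside package a.b, `import kjParsing` would try a.b.kjParsing,
# then a.kjParsing, then plain kjParsing -- so a package that works at
# top level also works unchanged as a sub-package.
```

This is why no code changes are needed in the imported package: it keeps using its own absolute-looking names, and the widening search finds them wherever the package was installed.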
I don't like the pressure to allow relative imports and I don't like the solutions that those who insist on having it propose (they are all writing their own import hooks). The reasoning seems to be something like this:

(a) If my package needs non-standard package X, I can't require that X be installed as a top-level package, because there might be another unrelated package also called X; therefore I need to install a copy of package X as a subpackage of my own package.

(b) Because any non-standard package X now may be installed at the top level or as a subpackage of some other package, X must use relative imports exclusively to access subparts of itself.

I find (b) unacceptable for a number of reasons that have all been explained by others: duplicate installs (and hence possible imports) of the same package as a subpackage of various other packages, the requirement to be religious in the use of relative imports, and the ugliness of the __ notation (I do *not* look forward to explaining that to 12-year-olds in the context of CP4E).

I think that the fear expressed in (a) is an overreaction to some recent actual top-level name clashes. This was a real pain, but notice that it happened because several popular module collections weren't packagized at all!

In the Java world, the right solution exists: a better top-level naming scheme. I forget what the exact rules are, but I seem to recall that Java has something of the same relative import rules as Python currently has: you can import a class from the same leaf package by its simple name, or you can import a class by its full name. As Tim explains, the occasional renames are easily handled with a global substitute in a smart editor.

I don't want to do the same thing as Java and require everybody to use their reversed domain name as their package name. Of course you are free to do so: there are currently no toplevel modules or packages that I know of named org, com, edu, gov, mil or net.
Two letter domains are all free except for re=Reunion (and ni=Nicaragua, but ni is expired). Oops, mx=Mexico!

Well, Python package semantics are slightly different from Java semantics in that shared ownership of a subpackage of (e.g. :-) mx is harder in Python than in Java (Python requires that they are installed in the same directory; Java doesn't). In any case the proper thing to do is to pick a good unique toplevel name. Aaron Watters did a great job with kj (not a country code either!); Marc-Andre also has the right spirit with mx. CNRI will probably use cnri as the toplevel package name for CNRI-developed packages that aren't distributed together with Python.

If there are still worries about name choices, I volunteer to run a toplevel name registry. This has worked well e.g. for Apple, where application developers had to pick a unique 4-letter code; certainly better than the Windows world where conflicting 3-letter extension definitions are common. In the Python world, I would recommend longer toplevel names, so it's more likely that you can use your company name (e.g. pythonware).

Note that I still resist the urge to start renaming all the existing standard modules -- too much breakage. This will happen in 2.0.

--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
I don't like the pressure to allow relative imports and I don't like the solutions that those who insist on having it propose (they are all writing their own import hooks).
I wasn't planning to write my own hook until now. I need this feature to be able to organize my package-internal stuff swiftly and easily, plus to be able to possibly put it under new top-level packages. Obviously I seem to be more or less alone with this need, so I'll stop arguing for inclusion of "something like relative imports" in the distribution.
The reasoning seems to be something like this:
(a) If my package needs non-standard package X, I can't require that X be installed as a top-level package, because there might be another unrelated package also called X; therefore I need to install a copy of package X as a subpackage of my own package.
(b) Because any non-standard package X now may be installed at the toplevel or as a subpackage of some other package, X must use relative imports exclusively to access subparts of itself.
I find (b) unacceptable for a number of reasons, that have all been explained by others: duplicate installs (and hence possible imports) of the same package as a subpackage of various other packages, the requirement to be religious in the use of relative imports, the ugliness of the __ notation (I do *not* look forward to explaining that to 12-year-olds in the context of CP4E).
(b) is (if at all) a problem only to be taken into account by the author of package X. He may or may not use rel. imports. A 12-year-old probably won't (but then: you never know ;).

BTW, what is this CP4E thing you're talking about? If it's a syntax-aware editor, I have a friend who is really interested in these things... could you send me a URL that I can send him?
I think that the fear expressed in (a) is an overreaction to some recent actual top-level name clashes. This was a real pain, but notice that it happened because several popular module collections weren't packagized at all!
In the Java world, the right solution exists: a better top-level naming scheme. I forget what the exact rules are, but I seem to recall that Java has something of the same relative import rules as Python currently has: you can import a class from the same leaf package by its simple name, or you can import a class by its full name. As Tim explains, the occasional renames are easily handled with a global substitute in a smart editor.
I don't want to do the same thing as Java and require everybody to use their reversed domain name as their package name. Of course you are free to do so: there are currently no toplevel modules or packages that I know of named org, com, edu, gov, mil or net. Two letter domains are all free except for re=Reunion (and ni=Nicaragua, but ni is expired). Oops, mx=Mexico!
Oh well... I guess they'll have to use 'mex' if they decide to go the reverse domain way ;-)
Well, Python package semantics are slightly different than Java semantics in that shared ownership of a subpackage of (e.g. :-) mx is harder in Python than in Java (Python requires that they are installed in the same directory; Java doesn't).
True, perhaps we should lighten this requirement a little when we recode the import mechanism in Python ? E.g. if a local import fails continue the search with the fully qualified name and only if that fails, restart using the local name. This would need some kind of fastpath cache to make the search reasonably fast though.
In any case the proper thing to do is to pick a good unique toplevel name. Aaron Watters did a great job with kj (not a country code either!); Marc Andre also has the right spirit with mx. CNRI will probably use cnri as the toplevel package name for CNRI-developed packages that aren't distributed together with Python.
If there are still worries about name choices, I volunteer to run a toplevel name registry. This has worked well e.g. for Apple where application developers had to pick a unique 4-letter code; certainly better than the Windows world where conflicting 3-letter extension definitions are common. In the Python world, I would recommend longer toplevel names, so it's more likely that you can use your company name (e.g. pythonware).
Note that I still resist the urge to start renaming all the existing standard modules -- too much breakage. This will happen in 2.0.
Wouldn't it suffice to just put them into one package, e.g. 'python.'?

BTW, as Tim argued: the breakage can easily be leveraged by using a smart editor... ;-)

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 106 days left
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
BTW, what is this CP4E thing you're talking about?
http://www.python.org/doc/essays/cp4e.html also see: http://www.deja.com/=dnc/thread/522068184 (on second thought, don't).
If it's a syntax-aware editor, I have a friend who is really interested in these things...
aggressively python-aware editors are fun. want braces? sure, just change the style sheet ;-) </F>
I wasn't planning to write my own hook until now. I need this feature to be able to organize my package-internal stuff swiftly and easily, plus to be able to possibly put it under new top-level packages. Obviously I seem to be more or less alone with this need, so I'll stop arguing for inclusion of "something like relative imports" in the distribution.
I still hope against hope to convince you not to do this. I think this adds lots of complexity for the reader of your package. I think of *all* software that I write as something that eventually someone is going to read and try to understand. I also think of it as something that eventually someone is going to port to a platform that wasn't even designed when I wrote the software. Tricks like depending on a custom import hook make reading a pain.

For example, the reader who is just starting to browse a large package often needs to follow the thread of execution from one module to the next. If the import statements contain magic like __magic__.foo.bar (actually proposed magic renamed for generality :-), this reader will first need to understand the custom importer -- which is likely one of the hairiest components of the package.

The same thing applies with even more force to tools like package browsers. IDLE for example has a class browser which displays the true module and class name for base classes. E.g. if you have code that says

    from Tkinter import Widget

    class MyWidget(Widget): ...

it correctly deduces that MyWidget's base class is Tkinter.Widget. (And in a future version it will probably allow you to click through to that class, show you its methods, etc.) Custom importers break this feature, and thus make the code harder to analyze for the reader.

(Not all custom importers are bad; there are plenty of reasons to augment the standard importer. But, while custom importers make different interpretations of the namespace possible, I frown upon that particular use.)
(b) is (if at all) a problem only to be taken into account by the author of package X. He may or may not use rel. imports. A 12-year-old probably won't (but then: you never know ;).
Which severely limits your ability to do what you want with packages you didn't write. I know that this sounds politically incorrect in a radical free software world, but often in the end I find it more convenient to conform to the rest of the world and "fit in" than to be different in every little way. Note that my hair isn't blue :-); I've also replaced my "signature" glasses with a more conventional model.
BTW, what is this CP4E thing you're talking about? If it's a syntax-aware editor, I have a friend who is really interested in these things... could you send me a URL that I can send him?
Where have you been? It's on the python.org homepage, has been discussed in c.l.py, c.l.tcl, even c.l.ada, in lwn, and on /.! The syntax aware editor (a proposed super-version of IDLE) is only a small part of it. See python.org/doc/essays/cp4e.html
Oh well... I guess they'll have to use 'mex' if they decide to go the reverse domain way ;-)
Or, in a worst-case scenario, the first Mexican developers using the reverse domain will probably not be aware of the mx toplevel package, and by the time their software hits the street it will be too late.
True, perhaps we should lighten this requirement a little when we recode the import mechanism in Python ? E.g. if a local import fails continue the search with the fully qualified name and only if that fails, restart using the local name. This would need some kind of fastpath cache to make the search reasonably fast though.
Now this is something that I would consider.
Wouldn't it suffice to just put them into one package, e.g. 'python.' ?
And somehow make all user code live implicitly inside that package? I don't see how that solves anything.
BTW, as Tim argued: the breakage can easily be leveraged by using a smart editor... ;-)
What breakage? I think __ is a bit too hard to fix easily with a smart editor.

And believe me, at some point *someone* is going to need to rip out your custom importer because it interferes with *his* custom importer, and he'll choose to replace all your __ imports with absolute package names, rather than trying to add the __ feature to his importer.

--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
... What breakage? I think __ is a bit too hard to fix easily with a smart editor. And believe me, at some point *someone* is going to need to rip out your custom importer because it interferes with *his* custom importer, and he'll choose to replace all your __ imports with absolute package names, rather than trying to add the __ feature to his importer.
This is an aspect of Python's singular import hook. imputil is designed to allow for multiple Importers to be present and to work in harmony. As long as each Importer can identify that a particular import is part of its namespace (e.g. HTTPImporter recognizes _http_... as one of its own), then no conflicts will arise and everything will be hunky-dory.

btw, it might be nice to allow a string to the import statement. I'd much rather say:

    import "http://www.lyra.org/greg/python/qp_xml.py"

Cheers,
-g

--
Greg Stein, http://www.lyra.org/
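Greg's coexistence idea can be illustrated with a toy dispatcher. `PrefixImporter` and `dispatch` below are invented names sketching the concept, not the actual imputil API:

```python
class PrefixImporter:
    """Toy importer that claims only names under its own prefix."""
    def __init__(self, prefix):
        self.prefix = prefix

    def claims(self, fullname):
        # e.g. an HTTP importer could claim everything under "_http_"
        return fullname == self.prefix or fullname.startswith(self.prefix + ".")

def dispatch(importers, fullname):
    """Hand an import to the first importer that recognizes the name;
    None means: fall through to the builtin import mechanism."""
    for imp in importers:
        if imp.claims(fullname):
            return imp
    return None
```

Because each importer only answers for its own namespace, the order in which importers are chained never changes the meaning of a name they don't all claim.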
Guido van Rossum wrote:
I wasn't planning to write my own hook until now. I need this feature to be able to organize my package-internal stuff swiftly and easily, plus to be able to possibly put it under new top-level packages. Obviously I seem to be more or less alone with this need, so I'll stop arguing for inclusion of "something like relative imports" in the distribution.
I still hope against hope to convince you not to do this. I think this adds lots of complexity for the reader of your package. I think of *all* software that I write as something that eventually someone is going to read and try to understand. I also think of it as something that eventually someone is going to port to a platform that wasn't even designed when I wrote the software. Tricks like depending on a custom import hook make reading a pain.
For example, the reader who is just starting to browse a large package often needs to follow the thread of execution from one module to the next. If the import statements contain magic like __magic__.foo.bar (actually proposed magic renamed for generality :-), this reader will first need to understand the custom importer -- which is likely one of the hairiest components of the package.
While it will perhaps look funny to him at first sight, I'm sure that a proper name will make the idea clear to him:

    from __superpackage__.foo import bar

or just

    from __super__.foo import bar

[Super Foo -- sounds like a comic star ... there's even integrated fun in these names ;-)]
The same thing applies with even more force to tools like package browsers. IDLE for example has a class browser which displays the true module and class name for base classes. E.g. if you have code that says
from Tkinter import Widget
class MyWidget(Widget): ...
it correctly deduces that MyWidget's base class is Tkinter.Widget. (And in a future version it will probably allow you to click through to that class, show you its methods, etc.)
Custom importers break this feature, and thus make the code harder to analyze for the reader.
That's why I tried to argue for making it a standard: all tools will then know about the syntax used and take appropriate action.
(Not all custom importers are bad; there are plenty of reasons to augment the standard importer. But, while custom importers make different interpretations of the namespace possible, I frown upon that particular use.)
(b) is (if at all) a problem only to be taken into account by the author of package X. He may or may not use rel. imports. A 12-year-old probably won't (but then: you never know ;).
Which severely limits your ability to do what you want with packages you didn't write.
I know that this sounds politically incorrect in a radical free software world, but often in the end I find it more convenient to conform to the rest of the world and "fit in" than to be different in every little way. Note that my hair isn't blue :-); I've also replaced my "signature" glasses with a more conventional model.
Naa, I don't want to be different just to be plain silly different. The case here is one of maintenance. I have around 12 Python packages which I maintain, some of them public, others proprietary. Due to the large number I try to make them all as easy to handle as possible.

Now I can't just go and rename the imports in case I want to put them under some other common package name for application A and then do the same with another package name for application B. I would have to do the same thing for every new release of the packages since there are also others who use the standard mx-based name. I just want the packages to work wherever I put them, e.g. I could link them into the new location using symlinks and be done with the work.

Of course there are other ways to make this work (and I want to thank everybody who contributed to the sys.path ideas), but they just add complexity to a whole different area.
BTW, what is this CP4E thing you're talking about? If it's a syntax-aware editor, I have a friend who is really interested in these things... could you send me a URL that I can send him?
Where have you been? It's on the python.org homepage, has been discussed in c.l.py, c.l.tcl, even c.l.ada, in lwn, and on /.! The syntax aware editor (a proposed super-version of IDLE) is only a small part of it. See python.org/doc/essays/cp4e.html
Nope, haven't heard anything about it -- perhaps I was on vacation. Sounds interesting though. An editor where you can just type away (and the editor fills in the gaps) would be great fun.
Oh well... I guess they'll have to use 'mex' if they decide to go the reverse domain way ;-)
Or, in a worst-case scenario, the first Mexican developers using the reverse domain will probably not be aware of the mx toplevel package, and by the time their software hits the street it will be too late.
Well, I guess then they'll have a good reason to upgrade their software. Just like all those companies who made extra money by sticking a "Y2000 + Euro-capable" button on their software.

No, seriously, with the proposed extension (see below) this shouldn't be much of a problem because my packages all use capital first letters. The reverse domain name would use lower-case ones, e.g. mx.pythonware.pilios or mx.zopistas...
True, perhaps we should lighten this requirement a little when we recode the import mechanism in Python ? E.g. if a local import fails continue the search with the fully qualified name and only if that fails, restart using the local name. This would need some kind of fastpath cache to make the search reasonably fast though.
Now this is something that I would consider.
Should be easy to add to the imputil based core emulation.
Wouldn't it suffice to just put them into one package, e.g. 'python.' ?
And somehow make all user code live implicitly inside that package? I don't see how that solves anything.
No, to free the top-level namespace and make the origin of the imports clearer, e.g.

    import python.image
    import PIL.image
    import zope.image

etc.
And believe me, at some point *someone* is going to need to rip out your custom importer because it interferes with *his* custom importer, and he'll choose to replace all your __ imports with absolute package names, rather than trying to add the __ feature to his importer.
Note that with imputil doing its thing, he won't have to worry about adding code to his importer. It'll just pass on the import to my importer. That's the fun part about imputil.

--
Marc-Andre Lemburg
[MAL]
... Of course there are other ways to make this work (and I want to thank everybody who contributed to the sys.path ideas), but they just add complexity to a whole different area.
The complexity is already in that area, and for all of us. It can be dealt with. Adding new complexity elsewhere doesn't decrease the complexity that's already there, it just adds more and new kinds of stuff to worry about.

[Guido]
... And believe me, at some point *someone* is going to need to rip out your custom importer because it interferes with *his* custom importer, and he'll choose to replace all your __ imports with absolute package names, rather than trying to add the __ feature to his importer.
Note that with imputil doing its thing, he won't have to worry about adding code to his importer. It'll just pass on the import to my importer. That's the fun part about imputil.
As I recall, the old ni took the initial "__" as referring to the *current* package. So where you intend to write

    import __.sibling.func

it was written under ni as

    import __.__.sibling.func

Since ni is semi-blessed prior art, some other joker is going to slam in an Importer to use those rules. I mix your packages with theirs, and then the meaning of "__" depends on whose Importer sees the damn thing first. Again complexity spreads. If Jim is wary of top-level name collisions now, wait until he thinks about naming-gimmick collisions <wink>. Guido is telling the truth: no package is going into Dragon's products unless it's rewritten to purge dicey name tricks.

I think we're all in favor of Greg's imputil approach, but where it adds real value (primarily in allowing imports to get satisfied from sources other than direct file images).

if-it's-in-a-file-you-just-import-it-by-name-ly y'rs  - tim
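The ambiguity Tim describes is easy to see in a small resolver. `resolve_relative` is a hypothetical illustration of the two rule sets, not code from either importer:

```python
def resolve_relative(current_package, dotted, ni_rules=False):
    """Expand a leading-`__` import relative to `current_package`.
    Under the patch discussed in this thread, the first `__` already
    means the parent package; under the old ni rules (ni_rules=True),
    the first `__` named the current package itself."""
    parts = current_package.split(".")
    names = dotted.split(".")
    climbed = not ni_rules  # ni: the first __ is a no-op
    while names and names[0] == "__":
        names.pop(0)
        if climbed:
            parts.pop()     # go up one package level
        climbed = True
    return ".".join(parts + names)

# The same source line means different things under the two rules:
# resolve_relative("a.b", "__.sibling.func")                -> "a.sibling.func"
# resolve_relative("a.b", "__.sibling.func", ni_rules=True) -> "a.b.sibling.func"
```

So a module that imports `__.sibling.func` reaches a different package depending on which importer interprets it first, which is exactly the collision Tim is warning about.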
[Guido]
I know that this sounds politically incorrect in a radical free software world, but often in the end I find it more convenient to conform to the rest of the world and "fit in" than to be different in every little way. Note that my hair isn't blue :-); I've also replaced my "signature" glasses with a more conventional model.
!!! No wonder you tried to slam a conventional license down our throats. My advice is to get back the old glasses, and walk around the halls at CNRI counting backwards from 1000 by seventeens loudly and incessantly. you'd-be-amazed-at-how-few-lawyers-talk-to-me<wink>-ly y'rs - tim
[Guido]
... In the Java world, the right solution exists: a better top-level naming scheme. I forget what the exact rules are, but I seem to recall that Java has something of the same relative import rules as Python currently has: you can import a class from the same leaf package by its simple name, or you can import a class by its full name.
You can import only by the full name. There are no shortcuts for anyone ever in explicit Java imports. The dodge is that to reference classes in exactly the same package, you don't need an import statement at all (you *can* have one, but then you need to use the fully qualified name).

The other twist is that all Java code is in *some* package, and code that (wisely) doesn't want to live in the ill-defined unnamed package must contain a package statement declaring which package it's a part of, like

    package COM.dragonsys.justanexample;

The periods there actually have no semantic significance, and package COM.dragonsys.notanexample has no more in common with the above than does package ORG.python.guido.hitmeagain. It's purely a conceptual thing that justanexample and notanexample are *called* "subpackages" of package COM.dragonsys in Java; although implementations are free (but not required) to make up mappings between dotted package names and directory trees.

Anyway, the Java rules work fine wrt the issues that have been argued in this thread.

Things that are worse than Python:

+ Magical implied import of same-package names is a PITA for readability and maintenance.

+ Since subpackages are an illusion, and even packages aren't objects in their own right, there's no possibility to create a shorthand local name for the first M chunks of an N-chunk dotted access path (with M < N). For example, you may use java.awt.Font and java.awt.TextArea a lot. You either spell those out in full all the time, or do

    import java.awt.Font;
    import java.awt.TextArea;

and use just "Font" and "TextArea" later. Making up a local name for just the "java.awt" part isn't an option.
Which leads to gross overuse of the following:

+ "import package.*" is Java's way of spelling "from package import *", and because of the preceding is the only way to get convenient local names for classes imported from other packages (note that you can never import a package/module in Java; you can only import a type name). So most Java code will do the above as

      import java.awt.*;

  and end up importing a gazillion names. This sucks for the same reasons "import *" sucks in Python, although Java catches the name conflicts at compile time.

Things that are better than Python:

+ Name conflicts are caught at compile time.

+ At least in theory, the names of packages aren't constrained by platform file name limitations or directory quirks.

+ Relatedly, explicitly naming the package a compilation unit belongs to can be a Good Thing.

+ They have *an* effective scheme for preventing package-name collisions.
... Note that I still resist the urge to start renaming all the existing standard modules -- too much breakage. This will happen in 2.0.
At home, I dropped an empty __init__.py into the std Lib directory, so that I can do e.g.

    from Lib.random import randrange

in my own packages with their own random modules. Works fine.

For 1.6 you should pick some explicit way to *allow* explicit import of at least the std library modules. I suggest taking the name "Lib" <wink>. Java is very cute with its java.util.etc stuff, but longwindedness is not a virtue here -- "util" would have worked just as well and been just as clear. Nobody should write a package with a generic short name like "util" (or "Lib") anyway, so rather than waste this pleasant little corner of the namespace the language may as well reserve it for its own use.

heck-it-already-stole-"and"-and-"or"<wink>-ly y'rs - tim
"TP" == Tim Peters <tim_one@email.msn.com> writes:
    TP> + "import package.*" is Java's way of spelling "from package
    TP> import *", and because of the preceding is the only way to get
    TP> convenient local names for classes imported from other
    TP> packages (note that you can never import a package/module
    TP> in Java; you can only import a type name). So most Java code
    TP> will do the above as
    TP>     import java.awt.*;
    TP> and end up importing a gazillion names. This sucks for the
    TP> same reasons "import *" sucks in Python, although Java catches
    TP> the name conflicts at compile time.

The interesting thing is that, while the Java developers did this at the language level, at the VM level every class is "fully qualified"; you see the absolute path for every class name. import pkg.pkg.* is the reason why you still have to have uniquely named classes like PyThinging or JWiggie, and yup, it sucks.

-Barry
[Jim writes]
I'll second Marc-Andre here.
A significant headache occurs when you have a package that has sub-packages. Sub-packages need to be able to reference other sub-packages within the same package without knowing where the containing package is installed.
I certainly don't want to know _where_ the package is installed, but I do want to know the package name. A package name, IMO, is like a module name - once named, you are largely stuck with it. I for one use packages quite extensively, but have never lamented the loss of __ from the ni days.

If someone, for example, wanted to change the package name of "win32com" or "pywin", then I would expect lots of code to break, in the same way as changing the name of any standard module would break code. I see Marc's issue with pickle, but I don't see the problem with absolute package names any more than I see a problem with explicitly naming modules in an import statement...

Mark.
Participants (13)

- Barry A. Warsaw
- Fredrik Lundh
- Gordon McMillan
- Greg Stein
- Greg Ward
- Guido van Rossum
- James C. Ahlstrom
- Jean-Claude Wippler
- Jim Fulton
- M.-A. Lemburg
- Mark Hammond
- Skip Montanaro
- Tim Peters