Since I implemented[*] PEP 328, Aahz suggested I take over editing the PEP,
too, as there were some minor discussion points still to add. I haven't been
around for the discussions, though, and it's been a while for everyone else,
I think, so I'd like to rehash and ask for any other open points.
The one open point that Aahz forwarded me, and is expressed somewhat in
http://mail.python.org/pipermail/python-dev/2004-September/048695.html , is
the case where you have a package that you want to transparently supply a
particular version of a module for forward/backward compatibility, replacing
a version elsewhere on sys.path (if any.) I see four distinct situations for
this:
1) Replacing a stdlib module (or a set of them) with a newer version, if the
stdlib module is too old, where you want the whole stdlib to use the
newer version.
2) Same as 1), but private to your package; modules not in your package
should get the stdlib version when they import the 'replaced' module.
3) Providing a module (or a set of them) that the stdlib might be missing
(but which will be a new enough version if it's there)
1) and 3) are easy to solve: put the module in a separate directory, insert
that into sys.path; at the front for 1), at the end for 3). Mailman, IIRC,
does this, and I think it works fine.
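The sys.path arrangement described for 1) and 3) can be sketched in a few lines; the directory names here are invented purely for illustration:

```python
import sys

# Illustrative paths only; a real application would ship these
# directories alongside its own code.
OVERRIDE_DIR = "/opt/myapp/override"   # case 1: must shadow the stdlib
FALLBACK_DIR = "/opt/myapp/fallback"   # case 3: used only if the stdlib lacks it

# Front of the path: searched before the stdlib, so the bundled
# module wins everywhere.
sys.path.insert(0, OVERRIDE_DIR)

# End of the path: searched after everything else, so the bundled
# module is only found when nothing else provides it.
sys.path.append(FALLBACK_DIR)
```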
2) is easy if it's a single module; include it in your package and import it
relatively. If it's a package itself, it's again pretty easy; include the
package and import it relatively. The package itself is hopefully already
using relative imports to get sibling packages. If the package is using
absolute imports to get sibling packages, well, crap. I don't think we can
solve that issue whatever we do: that already breaks.
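The single-module flavour of 2) can be demonstrated end to end. This sketch uses invented names (mypkg, compat_json, worker) and builds a throwaway package in a temporary directory so the PEP 328 explicit relative import can actually run:

```python
import os
import sys
import tempfile
import textwrap

# Build a disposable package on disk; all names here are invented
# for the example.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "mypkg")
os.makedirs(pkg)

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(pkg, "compat_json.py"), "w") as f:
    f.write("VERSION = 'bundled'\n")
with open(os.path.join(pkg, "worker.py"), "w") as f:
    # The explicit relative import guarantees worker gets the bundled
    # copy, regardless of what else is on sys.path; code outside mypkg
    # still gets whatever 'import compat_json' would normally find.
    f.write(textwrap.dedent("""\
        from . import compat_json

        def which():
            return compat_json.VERSION
    """))

sys.path.insert(0, root)
from mypkg import worker
print(worker.which())  # -> 'bundled'
```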
The real problem with 2) is when you have tightly coupled modules that are
not together in a package and not using relative imports, or perhaps when
you want to *partially* override a package. I would argue that tightly
coupled modules should always use relative imports, whether they are
together in a package or not (even though they should probably be in a
package anyway.) I'd also argue that having different modules import
different versions of existing modules is a bad idea. It's workable if the
modules are only used internally, but exposing anything is troublesome. For
instance, an instance of a class defined in foo (1.0) imported by bar will
not be an instance of the same class defined in foo (1.1) imported by
feeble.
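The class-identity problem doesn't even need two real releases of foo: executing the same class definition into two separate module objects (simulating foo 1.0 and foo 1.1 being imported as distinct modules; the names are invented for the example) is enough to make isinstance checks fail across them:

```python
import types

# The same class definition, standing in for identical code in
# foo 1.0 and foo 1.1.
SOURCE = "class Widget:\n    pass\n"

foo_10 = types.ModuleType("foo")  # plays the role of foo 1.0
foo_11 = types.ModuleType("foo")  # plays the role of foo 1.1
exec(SOURCE, foo_10.__dict__)
exec(SOURCE, foo_11.__dict__)

# An instance created against one copy is not an instance of the
# class in the other copy, even though the source is identical.
w = foo_10.Widget()
print(isinstance(w, foo_10.Widget))  # True
print(isinstance(w, foo_11.Widget))  # False
```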
Am I missing anything?
([*] incorrectly, to be sure, but I have a 'correct' version ready that I'll
upload in a second; I was trying to confuse Guido into accepting my version,
instead.)
--
Thomas Wouters
On 2/25/06, Thomas Wouters wrote:
One thing you're missing here is that the original assertion about the impossibility of editing the source code of the third-party package that's being incorporated into your distribution is simply wrong. Systematically modifying all modules in a package to change their imports to assume a slightly different hierarchy can easily be done mechanically. I'd also add that eggs promise to provide a different solution for most concerns.
I believe we should go ahead and implement PEP 328 faithfully without revisiting the decisions. If we were wrong (which I doubt) we'll have the opportunity to take a different direction in 2.6.
--
--Guido van Rossum (home page: http://www.python.org/~guido/)
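A hedged sketch of what rewriting imports "mechanically" might look like; the package names (mypkg, _vendor, foo, bar) are invented, and a robust tool would operate on the parse tree (e.g. via lib2to3) rather than a regular expression:

```python
import re

# Names of the modules that have been vendored into the new hierarchy;
# purely illustrative.
VENDORED = {"foo", "bar"}

def rewrite(line):
    """Rewrite a plain 'import X' of a vendored module so it resolves
    inside the hypothetical mypkg._vendor hierarchy; leave everything
    else untouched."""
    m = re.match(r"^import (\w+)$", line)
    if m and m.group(1) in VENDORED:
        return "from mypkg._vendor import %s" % m.group(1)
    return line

print(rewrite("import foo"))   # -> from mypkg._vendor import foo
print(rewrite("import json"))  # not vendored, left as-is
```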
"Thomas Wouters"
> The one open point that Aahz forwarded me, and is expressed somewhat in http://mail.python.org/pipermail/python-dev/2004-September/048695.html , is the case where you have a package that you want to transparently supply a particular version of a module for forward/backward compatibility, replacing a version elsewhere on sys.path (if any.) I see four distinct situations for this:
Did you mean three?
> 1) Replacing a stdlib module (or a set of them) with a newer version, if the stdlib module is too old, where you want the whole stdlib to use the newer version.
> 2) Same as 1), but private to your package; modules not in your package should get the stdlib version when they import the 'replaced' module.
> 3) Providing a module (or a set of them) that the stdlib might be missing (but which will be a new enough version if it's there)
Or did you forget the fourth?
In any case, the easy solution to breaking code is to not do it until 3.0. There might never be a 2.7 to worry about.
Terry Jan Reedy
participants (3)
- Guido van Rossum
- Terry Reedy
- Thomas Wouters