On Mon, 7 Mar 2016 at 15:02 Terry Reedy email@example.com wrote:
On 3/7/2016 12:22 PM, Chris Angelico wrote:
In theory, though, since it doesn't change syntax, it could be something like:

import sys
sys.package_mode_ACTIVATE()
The trouble is that doing this after _any_ imports (including the ones that are done before your script starts) will risk those imports being resolved from local files.
Barring a setting in $PYTHONPATH, Python's startup imports will not be resolved from local files. It appears that '' or scriptdir are only added to sys.path afterwards. I base this on an experiment where I tried to shadow one of the Python-coded /lib modules with a local file.
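The claim above is easy to check from a throwaway subprocess; a minimal sketch (assuming a default configuration with no $PYTHONPATH or safe-path setting) showing that with -c the interpreter puts '' at the front of sys.path:

```python
import subprocess
import sys

# Ask a fresh interpreter (started with -c, like interactive mode) what
# its first sys.path entry is. It should be '', i.e. the current directory.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(repr(sys.path[0]))"],
    stdout=subprocess.PIPE, universal_newlines=True,
)
print(result.stdout.strip())  # -> ''
```

The same check with a script argument instead of -c would show the script's directory there instead, which is the addition "afterwards" described above.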
C-coded builtin modules cannot be shadowed. I discovered this by experiment and then found
"Python includes a number of default finders and importers. The first one knows how to locate built-in modules, and the second knows how to locate frozen modules. A third default finder searches an import path for modules. The import path is a list of locations that may name file system paths or zip files."
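The default finders quoted above are visible directly on sys.meta_path, in the order the import system consults them:

```python
import sys

# The import system tries each meta-path finder in order: builtin modules
# first, then frozen modules, then the path-based finder that searches
# sys.path. This ordering is why C-coded builtins cannot be shadowed by
# local files: they are found before sys.path is ever consulted.
for finder in sys.meta_path:
    print(finder)
```

Depending on installed packages there may be extra entries (some tools prepend their own finders), but BuiltinImporter, FrozenImporter and PathFinder should always be present.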
I consider it somewhat problematic that whether a local file shadows a stdlib import depends on the stdlib module's implementation language. There was discussion of shadowing, I believe here, a few months ago (sometime after 3.5.0 was released).
There was a discussion about shadowing modules that wasn't too long ago, but I don't remember when.
Basically, this idea of dropping '' from sys.path so that the local directory isn't searched comes up on occasion. Usually it's suggested by someone teaching a beginner who named a module the same as something in the stdlib and then got bitten by this. The discussion usually comes down to "help the beginners" vs. "don't break compatibility!"

It's slightly nuanced, though, because '' becomes the location of the file passed on the command line when one is specified, so that things don't have to be in a package to be run (Chris' proposal deals with this by forcing the package concept, although you don't need an imaginary __init__.py since we have the concept of namespace packages). But if you drop the directory that a script is contained in, how do you import any packages that are in that directory? What about __main__.py files in the top directory of code checkouts, for easy testing of making executable zipfiles? It's a slippery slope, which is why the semantics have not changed.
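The __main__.py convenience mentioned above depends on exactly the sys.path entry under discussion; a small sketch demonstrating it with a throwaway directory:

```python
import os
import subprocess
import sys
import tempfile

# Passing a directory to the interpreter runs its __main__.py, and the
# directory itself lands at the front of sys.path -- the behaviour that
# makes "python checkout_dir" convenient, and that would break if the
# containing directory were no longer added.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "__main__.py"), "w") as f:
        f.write("import sys; print(sys.path[0])\n")
    result = subprocess.run([sys.executable, d],
                            stdout=subprocess.PIPE, universal_newlines=True)
    # The child's sys.path[0] should be the directory we passed in.
    print(os.path.samefile(result.stdout.strip(), d))  # -> True
```

The same mechanism is what lets a zipfile with a top-level __main__.py be executed directly.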
Having "import X" after activating package mode is useless if X gets resolved from sys.modules, and it'd be a nightmare to debug.
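The sys.modules caching behind that problem is easy to demonstrate; in this sketch, "examplemod" is a made-up module name used purely for illustration:

```python
import sys
import types

# "import X" consults sys.modules before doing any file search, so a
# module imported before "package mode" was activated would keep winning:
# the cached object is returned and sys.path is never looked at again.
cached = types.ModuleType("examplemod")
cached.origin = "cache"
sys.modules["examplemod"] = cached

import examplemod  # no file search happens; the cached module is returned
print(examplemod.origin)  # -> cache
```

This is why activating such a mode after arbitrary imports have already run would give confusing, order-dependent results.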
If the -p parameter is a problem, maybe there could be an environment variable that changes the default? Then people could opt-in to pseudo-package mode globally, run all their tests, and see what stops working.
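A sketch of what such an opt-in could look like; the variable name PYTHONPACKAGEMODE is invented here purely for illustration, not an existing setting:

```python
import os
import sys

def maybe_strip_cwd(path_entries, env):
    """Return path_entries without the implicit '' (current-directory)
    entry when the hypothetical PYTHONPACKAGEMODE variable is set in
    env; otherwise return the entries unchanged."""
    if env.get("PYTHONPACKAGEMODE"):
        return [p for p in path_entries if p != ""]
    return list(path_entries)

# Opted in: the '' entry is dropped, so local files stop shadowing.
print(maybe_strip_cwd(["", "/usr/lib/python3"], {"PYTHONPACKAGEMODE": "1"}))
# -> ['/usr/lib/python3']

# Not opted in: current behaviour is preserved.
print(maybe_strip_cwd(["", "/usr/lib/python3"], {}))
# -> ['', '/usr/lib/python3']
```

Running a test suite once with the variable set and once without would show exactly which imports relied on the implicit local-directory entry.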
FWIW, I do want this to eventually become the default behaviour. But it wouldn't be any time soon; in the meantime, it would be easy enough to recommend that people just "always do this to be safe" (in the same way that Python 2 didn't make new-style classes the default, but everyone's advised to "always subclass object"). By making it a long-term non-default feature, Python gets to provide safety without breaking anyone's code.
You might want to look at previous discussions of shadowing if you can find them.
-- Terry Jan Reedy
Python-ideas mailing list
Pythonfirstname.lastname@example.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/