On Tue, Mar 8, 2016 at 4:07 AM, Random832 email@example.com wrote:
On Mon, Mar 7, 2016, at 11:54, Chris Angelico wrote:
It'd be a backward-incompatible change, so it would need to be explicitly invoked. Something like:
python3 -p somefile.py
which would pretend to create an __init__.py in the current directory, change to the parent, and "from dirname import somefile".
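For illustration, here's a rough sketch of how such a launcher could be emulated today. The file name runpkg.py and the exact mechanics are my own invention, not a real CPython flag; it just mimics "change to the parent and import dirname.somefile" (and under Python 3.3+ the directory imports as a namespace package, which covers the pretend __init__.py part):

# runpkg.py -- hypothetical emulation of "python3 -p somefile.py"
import importlib
import os
import sys

script = sys.argv[1]                                    # e.g. "somefile.py"
script_dir = os.path.dirname(os.path.abspath(script))   # directory treated as a package
parent_dir = os.path.dirname(script_dir)
pkg_name = os.path.basename(script_dir)
mod_name = os.path.splitext(os.path.basename(script))[0]

# search from the parent directory instead of the script's own directory
sys.path.insert(0, parent_dir)
importlib.import_module("{0}.{1}".format(pkg_name, mod_name))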
"#!/usr/bin/env python3 -p" won't work on many systems. So I think it'd be better as a magic statement within the file (possibly simply a future-import, with the possibility of eventually making it the default behavior), rather than a command-line argument.
It can't be a future import, though, as they apply to the file they're in, and not to anywhere else. This directive would change the way that the regular "import" statement behaves, for the whole process.
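(The per-module scoping is easiest to see with the classic Python 2 example, two separate files; this is just for comparison, not part of the proposal:)

# a.py -- the future import applies only to this file
from __future__ import division
print(1 / 2)        # 0.5: true division is enabled inside a.py

# b.py -- importing a does not change b's own division
import a            # prints 0.5 while a's top level runs
print(1 / 2)        # still prints 0 under Python 2: a's future import doesn't leak into b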
In theory, though, since it doesn't change syntax, it could be something like:
import sys
sys.package_mode_ACTIVATE()
The trouble is that doing this after _any_ imports (including the ones that are done before your script starts) will risk those imports being resolved from local files. Having "import X" after activating package mode is useless if X gets resolved from sys.modules, and it'd be a nightmare to debug.
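To make that concrete, suppose there's a stray json.py sitting next to the script (the activation call is the hypothetical one from above, so it's commented out here):

# main.py, in a directory that also contains a local json.py
import json                       # resolves to the local json.py and lands in sys.modules

# import sys
# sys.package_mode_ACTIVATE()     # hypothetical: activation happens here, too late

import json                       # no error and no re-import: the cached local module is reused
print(json.__file__)              # still points at the local json.py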
If the -p parameter is a problem, maybe there could be an environment variable that changes the default? Then people could opt-in to pseudo-package mode globally, run all their tests, and see what stops working.
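As a sketch of that opt-in (the variable name PYTHONPACKAGEMODE is purely made up here, not an existing setting):

# hypothetical startup hook; opt in globally with e.g.  export PYTHONPACKAGEMODE=1
# and then run the test suite as usual
import os

PACKAGE_MODE = bool(os.environ.get("PYTHONPACKAGEMODE"))

if PACKAGE_MODE:
    pass  # here the interpreter would switch "import" to the -p behaviour described above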
FWIW, I do want this to eventually become the default behaviour. But it wouldn't be any time soon; in the meantime, it would be easy enough to recommend that people just "always do this to be safe" (in the same way that Python 2 didn't make new-style classes the default, but everyone's advised to "always subclass object"). By making it a long-term non-default feature, Python gets to provide safety without breaking anyone's code.
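(The Python 2 analogy in code, for anyone who hasn't hit it: both class forms were accepted, but one was the standing advice:)

# Python 2: both forms work, but "always subclass object" was the recommendation
class Old:            # old-style class (the historical default)
    pass

class New(object):    # new-style class -- opt-in, recommended everywhere
    pass

print(type(Old()))    # <type 'instance'> under Python 2
print(type(New()))    # <class '__main__.New'>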