[Python-ideas] metamodules (was: Re: Idea to support lazy loaded names.)
Petr Viktorin
encukou at gmail.com
Thu Oct 9 14:13:50 CEST 2014
On Thu, Oct 9, 2014 at 5:24 AM, Nathaniel Smith <njs at pobox.com> wrote:
[...]
>> 1) Analogy with __new__: For packages only, if there's a __new__.py, that gets executed first. If it "returns" (not sure how that was defined) an instance of a subclass of ModuleType, that instance is used to run __init__.py instead of a normal module instance.
>
> This is very similar to the current approach of having __init__.py
> reassign sys.modules[__name__]. The advantages are:
> - It gives a way to enforce the rule that you have to do this
> assignment as the very first thing inside your module, before allowing
> arbitrary code to run (e.g. by importing other modules which might
> recursively import your module in turn, and access
> sys.modules[__name__] before you've modified it).
> - Because __new__.py would run *before* __init__.py, it avoids the
> headache of having to juggle two module objects, one of whose
> __dict__s is already being used as the execution environment for the
> code that is trying to do the switcheroo.
>
> But:
> - It's a pretty complicated way to accomplish the stated goals.
As in, a non-obvious way to do a non-obvious thing, from the user's
point of view?
On the implementation side it doesn't strike me as particularly
complicated, am I wrong?
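For reference, the "current approach" being discussed is to replace the
module's own entry in sys.modules with an instance of a ModuleType
subclass, copying the namespace across. A minimal sketch (the module name
"demo" and the attributes are made up; a real package would do this at
the very top of its __init__.py using __name__):

```python
import sys
import types

class MetaModule(types.ModuleType):
    """Module subclass that can use properties, __getattr__, etc."""
    @property
    def answer(self):
        return 42

# Stand-in for the module being imported; in real code this would be
# sys.modules[__name__] inside __init__.py.
plain = types.ModuleType("demo")
plain.__dict__["greeting"] = "hello"

# The switcheroo: build the subclass instance, copy the namespace over,
# and replace the sys.modules entry so later imports see the new object.
meta = MetaModule(plain.__name__)
meta.__dict__.update(plain.__dict__)
sys.modules[plain.__name__] = meta

print(sys.modules["demo"].answer)    # property works: 42
print(sys.modules["demo"].greeting)  # copied attribute: hello
```

The headache Nathaniel mentions is visible here: while __init__.py is
still executing, its code runs against the *old* module's __dict__, so
anything assigned after the swap has to be copied or shared explicitly.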
> - The restriction to packages is unfortunate.
True, but it seems to me that you'd usually want it for the
__init__.py – after all, if you need to do `from mypackage import
submodule` anyway, and submodule isn't a package itself, you can
usually just make `submodule` a class directly. Or have the
top-level package define an import hook.
Seems to me that this would really only hurt single-file top-level
modules, but those are easily converted to a package.
> - The backcompat story is terrible -- faking __new__.py support in old
> versions of python would be really difficult, and the main reason I
> care about this stuff in the first place is because I want to be able
> to e.g. deprecate module attributes that are in the public API of old,
> widely-used packages. It will be many years before such packages can
> require 3.5.
That seems like a terrible reason to me – if it should work nicely on
older Pythons, then a third-party module would be enough. Why bother
adding it to the language?
Valid reasons would be to make it easier for alternative interpreters,
or to catch edge cases (e.g. making it work nicely with custom
importers, zip files, etc.). But at that point, we can just do it the
"right way"* and leave the backcompat story to a third-party shim.
* i.e. "not paying attention to older Pythons" – I'm not saying
__new__.py is necessarily the right way, I'm criticizing this reason
against it
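As an aside, the use case Nathaniel names – deprecating attributes in a
package's public API – is straightforward once the module object is a
subclass instance. A minimal sketch, with a hypothetical module name and
attribute names; the warning logic is an illustration, not code from the
thread:

```python
import types
import warnings

class DeprecatingModule(types.ModuleType):
    # Maps deprecated attribute names to their replacements.
    _deprecated = {"old_name": "new_name"}

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails, so ordinary
        # attributes are served at full speed with no warning.
        if name in self._deprecated:
            replacement = self._deprecated[name]
            warnings.warn(
                "%s is deprecated, use %s instead" % (name, replacement),
                DeprecationWarning, stacklevel=2)
            return getattr(self, replacement)
        raise AttributeError(name)

mod = DeprecatingModule("legacy")
mod.new_name = "value"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    value = mod.old_name  # triggers the DeprecationWarning

print(value)                                      # value
print(caught[0].category is DeprecationWarning)   # True
```

This is exactly the sort of thing that is awkward today: it needs the
sys.modules switcheroo (or, on much later Pythons, a module-level
__getattr__) because plain modules can't intercept attribute access.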