[Import-SIG] One last try: "virtual packages"
barry at python.org
Tue Jul 12 22:03:58 CEST 2011
On Jul 12, 2011, at 12:02 PM, P.J. Eby wrote:
>At 11:03 AM 7/12/2011 -0400, Barry Warsaw wrote:
>>It's a very interesting idea that is worth exploring. A few things come to mind:
>>- Under this scheme it's possible for names in a module to "suddenly" appear.
>Bear in mind that you still have to actually *import* those names, so it's
>not like they really "suddenly" appear. And when you do import them, they'll
>be *modules*, not functions or classes or constants or anything.
Yeah, I was just thinking about something dumb like a typo in an import
statement, but I don't think that's a realistic thing to worry about.
>> E.g. I could install packages that extend existing top level modules like
>> `time` or `string`. This might be a good thing in that it gives 3rd party
>> folks a more natural place to add things, but it could also open up a
>> land-grab type collision if lots of people want to publish their packages as
>> subpackage extensions to existing modules.
>True -- an ironic side-effect, given our intent to make it easier to *avoid*
>such collisions. ;-) However, given that this feature will probably NOT be
>available on versions <3.3 by default (see discussion below), it probably
>won't get *too* far out of hand.
We'll let you eat those words in 15 years when Python 4.7 comes out. :)
>Also, because you can't add new module *contents*, there's little benefit to
>doing this anyway. Your users would have to do "from string.foobar import
>bizbaz" or "import string.foobar as foobar", anyway, so why not just make a
>"foobar.string" module and call it a day?
>I also don't think we should really advertise the ability to extend other
>people's packages, except maybe to say, "don't do it."
Agreed. I did want to bring this up as a side-effect of the feature.
>We could also shut down the capability by requiring virtual packages to be
>declared in the module, if there is a defining module. That would actually
>work well with cross-version compatibility (see below) but would add an extra
>step when turning a module into a package.
I'd rather go the other way. IOW, leave it open by default but perhaps
provide an API that allows a module to declare itself closed to submodules. I
don't actually expect that to be used much, so I'm happy to call YAGNI on it.
But I don't want to require a defining module for virtual packages, because
that makes it less useful for vendor packagers.
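To make the "declare itself closed to submodules" idea concrete, here is a minimal sketch. The `__closed__` attribute and the `is_closed()` helper are invented names for illustration; nothing like this exists in the stdlib today:

```python
# Hypothetical opt-out API: a module sets a marker attribute, and any
# import hook implementing virtual packages would check it before
# grafting a __path__ onto the module.  `__closed__` and `is_closed`
# are assumed names, not an existing Python feature.
import types


def is_closed(module):
    """Return True if `module` declares itself closed to submodules."""
    return bool(getattr(module, "__closed__", False))


open_mod = types.ModuleType("open_example")      # default: open
closed_mod = types.ModuleType("closed_example")
closed_mod.__closed__ = True                     # opts out of extension
```

A virtual-package import hook would call `is_closed()` and decline to treat the module as a package when it returns True, leaving everything else open by default.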
Generally, I think we'd prefer not to have defining modules, but when we do,
we can have the defmod.py owned by exactly one vendor package, and then
submodules would add dependencies on that defining module. This is actually
one way we currently handle colliding __init__.py files, but it kind of sucks
because it makes packaging submodules more complicated.
>>- It's unfortunate that this will be more difficult to back port to Python 2.
>Well, I'm not that bothered by it. Python 2 still has its two existing ways
>to do this, and it's not *that* terribly hard to make an __import__ wrapper.
>But there are some things that can be done to make it easier.
I'm also not entirely sure I'd want to backport this into our Python 2
versions anyway, at least not without fully understanding the performance and
other implications. I'd rather spend the effort to get people switched to
Python 3. :)
>>- It sounds like it will be more difficult to have a single code base that
>> supports Python 2, Python 3 <= 3.2, and Python 3.3. This is because
>> __init__.py is required in the first two, but does the wrong thing (I
>> think ;) in a post-PEP 382 Python 3.3. Adding a .pyp file that's ignored
>> in anything that doesn't support PEP 382 would make it easier to support
>> multiple Pythons.
>There's a straightforward way to solve this. Suppose we have a module called
>'pep382', with a function 'make_virtual(packagename)'. In Python 2.x,
>setuptools will make "distributionname-version-nspkg.pth" files that just say
>'import pep382; pep382.make_virtual("toplevelnamespace")', and the same
>solution would work for Python 3 through 3.2. (In the .egg based install
>case, __init__.py gets used and the older API is called, but in future
>setuptools that'll be a wrapper over the pep382 module.)
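As a rough sketch of what such a backport helper might do: the `pep382` module name and `make_virtual()` signature come from the discussion above, but the body below is my assumption, not actual setuptools or pep382 code. The idea is simply to ensure the named top-level package exists in `sys.modules` with a `__path__` collected from every matching directory on `sys.path`:

```python
# Assumed implementation sketch of pep382.make_virtual(), as it might be
# invoked from a distributionname-version-nspkg.pth file via
# 'import pep382; pep382.make_virtual("toplevelnamespace")'.
import os
import sys
import types


def make_virtual(packagename):
    """Ensure `packagename` is a package whose __path__ spans sys.path.

    Collects every directory named `packagename` under a sys.path entry
    and installs (or updates) a module with that combined __path__.
    """
    path = [
        os.path.join(entry, packagename)
        for entry in sys.path
        if os.path.isdir(os.path.join(entry, packagename))
    ]
    mod = sys.modules.get(packagename)
    if mod is None:
        mod = types.ModuleType(packagename)
        sys.modules[packagename] = mod
    mod.__path__ = path  # submodule imports now search all matching dirs
    return mod
```

Calling it twice is harmless: the second call just recomputes `__path__` on the already-registered module, which is what you want when several distributions share the namespace.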
>For Python 3.3, these APIs don't need to be used, but they'll still work.
>They just won't be doing anything significant. You can drop use of the APIs
>as you drop support for older Pythons, and code targeted to 3.3+ can just do
>For Python < 3.3, you have to get the pep382 module installed and activated
>somehow in order to use the feature. However, once you do, you can use "pure
>virtual" packages without an __import__ hook, because a meta_path importer
>can catch an otherwise-failed import and set up an empty module with a
>IOW, the difficult part of implementing this on 2.x is only the part where
>you allow transitioning from a 'foo' module to a 'foo' package without
>changing the module. If you're using namespaces the way people mostly do now
>on 2.x, it works without an __import__ hook.
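The meta_path approach described above can be sketched as follows. Python 2's PEP 302 protocol used `find_module()`/`load_module()`; this version uses the modern `find_spec()` API so it runs on current Python 3, but the shape is the same. The class name is illustrative, and the finder only claims top-level names that exist as plain directories (no `__init__.py`) on `sys.path`:

```python
# Illustrative meta_path finder for "pure virtual" packages: claim a
# top-level name only if it appears as a plain directory (without an
# __init__.py) somewhere on sys.path, and return a spec for an empty
# package whose __path__ lists those directories.
import importlib.machinery
import os
import sys


class VirtualPackageFinder:
    def find_spec(self, fullname, path=None, target=None):
        if "." in fullname:
            return None  # this sketch handles top-level names only
        dirs = [
            os.path.join(entry, fullname)
            for entry in sys.path
            if os.path.isdir(os.path.join(entry, fullname))
            and not os.path.exists(
                os.path.join(entry, fullname, "__init__.py"))
        ]
        if not dirs:
            return None
        # loader=None plus submodule_search_locations marks this as a
        # package with no code to execute -- an empty module with a __path__.
        spec = importlib.machinery.ModuleSpec(fullname, None, is_package=True)
        spec.submodule_search_locations = dirs
        return spec


# Appending (rather than prepending) keeps the finder out of the way of
# imports the normal machinery can already satisfy.
sys.meta_path.append(VirtualPackageFinder())
```

This is the low-overhead shape the discussion favors: a PEP 302-style hook only pays its cost for the names it claims, unlike a Python-level `__import__` wrapper that intercepts every import statement.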
>For this reason, I suggest that the default for the backwards-compatibility
>module be to only handle pure-virtual and declared-virtual packages, not
>module-extension virtual packages. That way, the overhead remains low.
>(Writing __import__ in Python adds overhead to *every* import statement,
>vs. the relatively small and infrequent overheads added by PEP 302 hooks.)
I'm less concerned about the foo-module-to-foo-package case, so I'm okay with
that being more difficult in Python < 3.3.
>>Let's see the PEP!
>Martin said something about working one up along similar lines himself; I'm
>curious to see what his proposal is.