
Greetings,
Just like the builtins module, would it be a good idea to have a stdlib module so that we can know all the modules in the standard library (just like builtins lets us know the builtins)?
Abdur-Rahmaan Janhangeer
Mauritius

On Jul 15, 2019, at 11:47, Abdur-Rahmaan Janhangeer <arj.python@gmail.com> wrote:
Greetings,
Just like the builtins module, would it be a good idea to have a stdlib module so that we can know all the modules in the standard library (just like builtins lets us know the builtins)?
Is it just the names that are in stdlib, or some kind of lazy imports for the modules themselves, so you can do “from stdlib import pprint”? The latter is a bit more complicated, but it seems like it would make the feature a lot more useful—it’s a way to guarantee that you get the standard pprint even if it’s been shadowed by a local file, a way to make sure you get an early error if your distributor decided to leave pprint out of the distribution, and so on. And that would be similar to the builtins module (builtins.print is the builtin print, even if you’ve shadowed it with a module global).

What exactly goes in stdlib? Would MicroPython include framebuf in stdlib, PyPy include cffi, etc.? Does it include things like test that are inside the lib/pythonX.Y directory but not intended to be imported? Or “accelerators” like _datetime, or other “internal” modules like _compression? What about __future__? Would Apple’s Python distribution include their extra modules like PyObjC, and would Debian’s exclude the ones they’ve taken out of the python package and put in separate debs? Is there a distributor and/or site mechanism for customizing this?

You might want to take a look at https://github.com/abarnert/stdlib. Someone last year suggested something different but similar—a stdlib module that (lazily) includes the _contents_ of every stdlib module, rather than the modules themselves—so I slapped together an implementation to play with the idea. Your suggestion should be a lot simpler—and potentially a lot more useful.

So you really should consider implementing it and sharing live examples that people can play with themselves instead of having to explain everything. You can even put it on PyPI and see if it gets any traction. If people find it useful there—or, even better, if they find it useful but complain that it can’t do certain things (like tracking differences caused by distributors) without being part of Python itself—you’d have a great case.
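For concreteness, a small self-contained demonstration of the shadowing problem described above (the temporary directory and the "shadowed" attribute are just for the demo, and it assumes pprint hasn't already been imported in the process):

    import pathlib
    import sys
    import tempfile

    with tempfile.TemporaryDirectory() as d:
        # Create a local module that shadows the standard pprint.
        pathlib.Path(d, "pprint.py").write_text("shadowed = True\n")
        sys.path.insert(0, d)
        import pprint
        print(getattr(pprint, "shadowed", False))   # True: the local file won
        sys.path.pop(0)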

I think this may be a worthwhile idea — after all, “namespaces are one honking great idea”. This would be an opportunity to clearly define the “standard library” as something other than “all the stuff that ships with CPython”. With that in mind:
Would MicroPython include framebuf in stdlib, PyPy include cffi, etc.?
No, they wouldn’t. The Stalin would be, well, a standard. If something is only going to exist on some platforms or implementations, it should not be in that namespace. A “pypy” or “upy” namespace would be the place to put those things. Then it’s obvious to everyone where code that uses those will and won’t run. Granted, something like MicroPython may never support the entire stdlib, but the goal should be to define what a “complete” stdlib is.
Does it include things like test that are inside the lib/pythonX.Y directory but not intended to be imported?
I don’t think so — the point would be to define a namespace to use to import stuff.
What about __future__?
Nope: that isn’t really an import.
Would Apple’s Python distribution include their extra modules like PyObjC,
Nope -- again, not “standard”
would Debian’s exclude the ones they’ve taken out of the python package and put in separate debs?
That’s a tricky one, but probably yes.
So you really should consider implementing it and sharing live examples that people can play with themselves instead of having to explain everything.
+1
-CHB
-- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

On Tue, Jul 16, 2019 at 12:30 PM Christopher Barker <pythonchb@gmail.com> wrote:
What about __future__?
Nope: that isn’t really an import.
Actually it is, and the __future__ module is the best way to find out what future directives there are and which versions they're needed in.
So IMO it *should* be in this sort of collection. ChrisA

I think a better current definition might be more like “all the stuff that has chapters in docs.python.org/library”. That documentation claims to define “The Python Standard Library” in the same way the reference defines “The Python Language”. (It also says to keep it under your pillow, which was not good for the screen of my tablet, but I’m not going to sue anyone over that.) But the fact that we don’t even agree on the current implicit definition is just more ammunition for your argument that we need an explicit definition. And having it enshrined in code (and tests!) seems even nicer.
No, they wouldn’t. The Stalin would be, well, a standard.
Surely a Lenin level of forced standardization is sufficient; we don’t need to go full Stalin :)
If something is only going to exist on some platforms or implementations, they should not be in that namespace.
I’m not sure about that. Plenty of Unix developers think of modules like pty and termios as “part of the stdlib”. (I’m not sure if that’s the same for Windows devs with winreg and msvcrt, but it wouldn’t surprise me.) And they’re documented and built and delivered the same way as cross-platform libs. Why shouldn’t they be able to import them from stdlib?

And if that is the rule, there’s a lot of room for bikeshedding on the edges. Does the stdlib then include curses because in theory it builds on any platform if the lib is found, even though in practice nobody builds it on Windows? Exclude multiprocessing and mmap because they’re not on all platforms, only Windows and Unix (including Mac and Linux, and even most third-party iOS and Android builds), which is all 99.9% of the Python world cares about? Exclude dis because it’s CPython-specific even though PyPy includes it as well?

One more issue: packages. With a simple lazy importer design, there’s no way to make stdlib include the portable xml.etree but not the optional xml.parsers.expat; it either includes the xml package or it doesn’t. Is that what we want, or do we actually need to lazily build proxy package objects for things like xml that lazy-import some things but reject others? And, if so, is it a problem that xml and stdlib.xml can both be imported at the same time and will have different contents?
A “pypy” or “upy” namespace would be the place to put those things. Then it’s obvious to everyone where code that uses those will and won’t run.
Only if everyone stops writing import spam and starts writing from stdlib/cpython/pypy import spam. Just because people _can_ now (with this change) do that doesn’t mean we want to encourage them to, much less expect them to. It wouldn’t just make code gratuitously incompatible with older Python, it would mark a very visible difference between most 3.9+ code and “legacy” code, given that most modules and scripts start off with stdlib imports right at the top.
Granted, something like MicroPython may never support the entire stdlib, but the goal should be to define what a “complete” stdlib is.
Oh, that raises another point: there are a bunch of libraries that MicroPython does include, but only partially implements. If you from stdlib import cmath, should that fail, or should it do exactly the same thing as import cmath (that is, import fine, and work fine as long as you stick to the subset it supports, but AttributeError if you try to call, e.g., cmath.acosh)? For that matter, I believe every platform has a gc module, and they all share at least the three core functions, but beyond that everything is different. Should gc only be in stdlib on platforms where it includes all the documented functions?
What about __future__?
Nope: that isn’t really an import.
Yes it is. It’s a magic name used by future statements, but it’s _also_ a perfectly normal module defined in __future__.py, included in the stdlib directory with every major implementation, and documented alongside things like sys and traceback.

I think a better definition might be: "A lib to access libraries you can use without installing 3rd party packages". That way, Unix-specific packages included in the stdlib will be accessible on Unix only. As referenced by @Christopher Barker <pythonchb@gmail.com>, it's a way to namespace. Right now, if you want to know the stdlib modules, you have to go through the docs. While teaching programming, I find the builtins module very convenient as students can discover by themselves. Similarly, a stdlib module, when inspected, shows what you can import right out of the box. Inspection goes a long way in making learning fun, going through the docstrings etc. -- Abdur-Rahmaan Janhangeer http://www.pythonmembers.club | https://github.com/Abdur-rahmaanJ Mauritius
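As a quick illustration of that kind of in-interpreter discovery, using the builtins module that already exists (the exact names printed will vary by Python version):

    import builtins

    # Names a student can discover interactively, without reading the docs first.
    public_names = [name for name in dir(builtins) if not name.startswith("_")]
    print(len(public_names), "built-in names, for example:", public_names[:8])
    print(builtins.len.__doc__)   # docstrings are right there too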

On Tue, Jul 16, 2019 at 2:54 PM Andrew Barnert via Python-ideas <python-ideas@python.org> wrote:
And if that is the rule, there’s a lot of room for bikeshedding on the edges. Does the stdlib then include curses because in theory it builds on any platform if the lib is found, even though in practice nobody builds it on Windows? Exclude multiprocessing and mmap because they’re not on all platforms, only Windows and Unix (including Mac and Linux, and even most third-party iOS and Android builds), which is all 99.9% of the Python world cares about? Exclude dis because it’s CPython-specific even though PyPy includes it as well?
I'm not 100% sure of use-cases, but one potential use for this would be to check for accidental name collisions. That wouldn't need to include the *contents* of every module, just as long as it has the *name* (and this also deals with packages, since you need only the top level). In this way, it would be more parallel to the "keyword" module than "builtins"; you could have "import stdlib; print(stdlib.module_names)" or do a quick set intersection check against your file names. To that end, it would be best if it includes EVERY module that would be considered standard, including all of the ones you mentioned. (Possibly internal implementation modules should be separate, but anything public-facing should be included.) It'd be a quick tool for checking for collisions across Python platforms, versions (just load up another version's stdlib.py and check it), etc, etc. Actually figuring out which ones can be imported is a much harder job, albeit a simple one - just call __import__ on every name and see which ones bomb with ImportError. ChrisA
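A rough sketch of that collision check; the proposed stdlib.module_names doesn't exist, so this uses sys.stdlib_module_names (available since Python 3.10) as a stand-in:

    import sys
    from pathlib import Path

    def stdlib_collisions(project_dir="."):
        # Compare local *.py file names against the standard library's
        # top-level module names.
        local = {p.stem for p in Path(project_dir).glob("*.py")}
        return sorted(local & set(sys.stdlib_module_names))

    for name in stdlib_collisions():
        print(f"warning: {name}.py shadows a standard library module")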

Just out of interest there must be something like this in venv - how else can it decide what to include in the virtual environment as copies or links? That would give the platform-specific list. Steve Barnes

On Jul 15, 2019, at 22:45, Steve Barnes <GadgetSteve@live.co.uk> wrote:
Just out of interest there must be something like this in venv - how else can it decide what to include in the virtual environment as copies or links? That would give the platform specific list.
Doesn’t it just do whatever’s in the libpython directory? I think the closest way to reproduce that behavior from inside is to look at sysconfig.get_path('stdlib') and 'platstdlib', map those to sys.path entries, and try to build an import loader for each file in those paths. Probably not something you want to do at every “import stdlib”. Especially for implementations like Brython, where import is a synchronous AJAX network call. Also, venv surely has to include private modules, or the public ones wouldn’t work. That might actually be the right decision, but people have already made suggestions both ways, so presumably that’s something that needs to be discussed and decided on usefulness grounds rather than forced on us by an implementation technique.
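For what it's worth, a minimal sketch of that "reproduce it from inside" approach; the filtering rules here are guesses, not a definitive list of what counts as stdlib:

    import sys
    import sysconfig
    from pathlib import Path

    def guess_stdlib_names():
        # Built-in (compiled-in) modules plus whatever lives in the
        # 'stdlib' and 'platstdlib' directories reported by sysconfig.
        names = set(sys.builtin_module_names)
        for key in ("stdlib", "platstdlib"):
            libdir = Path(sysconfig.get_path(key))
            if not libdir.is_dir():
                continue
            for entry in libdir.iterdir():
                if entry.is_dir() and (entry / "__init__.py").exists():
                    names.add(entry.name)                      # package
                elif entry.suffix in (".py", ".so", ".pyd"):
                    names.add(entry.name.partition(".")[0])    # module
        return sorted(n for n in names if not n.startswith("_"))

    print(guess_stdlib_names()[:15])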

On Mon, Jul 15, 2019 at 07:29:59PM -0700, Christopher Barker wrote:
CPython is the reference implementation. It is expected that anything shipped by CPython ought to be shipped by all other implementations, unless there's a very good reason not to. Compliance is a Quality of Implementation issue, not a deal breaker.
"Standard" doesn't mean "available everywhere, in every version of every implementation".
Please, let's avoid scope-creep. The proposal here is simple and limited: make the std library modules available through a special "stdlib" namespace. It's going to be hard enough to write the PEP and get it approved without blowing the scope out to include defining acceptable levels of functionality for stubs used by alternate implementations. -- Steven

On Tue, Jul 16, 2019 at 4:44 AM Steven D'Aprano <steve@pearwood.info> wrote:
I agree there, *almost*: I would define that as "anything in the CPython standard library should be shipped by all other implementations". We can't enforce that of course, but if not, then you are shipping an incomplete implementation (which I expect everyone using, e.g., MicroPython knows). "A complete implementation" is more aspirational than a requirement.
but the goal should be to define what a “complete” stdlib is.
You call it scope-creep, I call it an opportunity. This would be an opportunity to provide a new definition for what "the standard library" means. And I think that would be very useful -- far more useful than simply providing a new namespace for all the cruft that is already in there. As someone said -- there is room for a lot of bike shedding around the edges -- so be it. If someone takes up the mantle and makes a test implementation and starts a PEP, then we will have a framework for that bike shedding.
If HolyGrailPython wants to package stdlib.black_knight we should neither encourage nor prevent them from doing so, but can "tut tut" loudly whenever anyone complains that from stdlib import black_knight fails in CPython.
Sure -- I'm not suggesting any enforcement mechanism -- just like now, HolyGrailPython can implement "import sys" to simply print "Ni". I don't suspect their users would like that, but that's up to them. All I'm saying is that the stdlib module could be defined as the platform-agnostic standard stuff -- so third-party implementers can make a decision about how compatible they should be.
https://docs.python.org/3/library/index.html Which used to include SGI specific stuff, and still includes, e.g. macpath. I'm just suggesting that "ships with cPython" and "is the stdlib" should mean different things -- because that would be useful. Maybe I need to call it the "common library", rather than the standard library (from comlib import pathlib) -- would that seem useful to folks?
Only if everyone stops writing import spam and starts writing from stdlib/cpython/pypy import spam
I was thinking that it would be:

from comlib import pathlib
from pypy import something_pypy_specific

(or pypylib maybe?) Anyone not using pypy would never use the pypy module, and anyone that runs that code on another Python could get a far more meaningful error. Granted, this would require people to start importing from the comlib (or stdlib, or...) module, but it would go a long way to have a pypy module, even if folks still used the regular imports for non-pypy stuff. I thought the idea was that people would be encouraged to use the new namespace? Or was it just to have it there to inspect? If the latter, then my idea of restricting it might make even more sense :-) Anyway -- I'm not going to run with this, so whoever does gets to drive the discussion. -CHB -- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

On Jul 16, 2019, at 10:20, Abdur-Rahmaan Janhangeer <arj.python@gmail.com> wrote:
Must I make a C or pure Python demo?
You can definitely implement this in pure Python. It doesn’t need C for performance reasons, it’s not going to be bootstrapped into the startup sequence, it doesn’t need access to C-only APIs, etc. And keep it simple. You don’t need to do anything clever to compute the set of modules at install time or runtime, just hardcode them. You can probably just take the lazy import example from PEP 562 and change __all__ to be a list of all of the names you think should be included. That’s more than enough for people to play with it, and for you to show off useful examples that other people can run. If you decide to upload it to PyPI (which I think you probably should at some point), you’ll need to figure out how to handle providing the right stdlib for the user’s Python version (and implementation, platform, etc., if that’s the way the discussion goes). But I’d put that off for an 0.2 version, and get the 0.1 version done and uploaded to github or whatever first.
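For reference, a minimal sketch of that PEP 562-style stdlib.py; the hardcoded name list is a tiny illustrative subset, not a proposal for what actually belongs in it:

    # stdlib.py -- hands out standard library modules lazily, by name.
    import importlib

    __all__ = ["json", "math", "pathlib", "pprint", "statistics"]  # ...extend as needed

    def __getattr__(name):
        # Triggered by `stdlib.pprint` or `from stdlib import pprint` (Python 3.7+).
        if name in __all__:
            return importlib.import_module(name)
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

    def __dir__():
        return sorted(__all__)

With that on the path, import stdlib; stdlib.pprint.pprint(stdlib.__all__) and from stdlib import pprint both resolve to the real standard module.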

On Tue, Jul 16, 2019 at 09:50:19AM -0700, Christopher Barker wrote:
Is there a difference in meaning between "ought" and "should" here? Because I can't tell what the "almost" is here. What is the difference between what I said and what you said? [...]
Yes: an opportunity to bike-shed, an opportunity to derail the proposal, an opportunity to increase the complexity of the proposal by an order of magnitude without a corresponding increase in usefulness.
Why do we need a new definition for std lib that differs from the definition used for 30 years, one which is universally understood by Python developers and referenced in books and blog posts and Stackoverflow answers everywhere? We have a reference implementation (CPython) that defines the std lib. How is it helpful to distinguish between these? 1. The standard library modules which are documented in the reference manual, and maintained by the core developers; 2. and a *different* set of modules (maybe a subset, maybe a superset) which may or may not be "standard" in the sense above but will appear in the stdlib namespace. (These are not rhetorical questions. If you want to make a case for the usefulness of this, go right ahead.)
And I think that would be very useful -- far more useful than simply providing a new namespace for all the cruft that is already in there.
Whether it is cruft or not is an unrelated issue. Modules can still be deprecated, they can still be removed. But so long as they remain in the std lib, then they would remain accessible in the stdlib namespace. (What you call "cruft", someone else may call "critically important library I couldn't do without".) [...]
All I'm saying is that the stdlib module could be defined as the platform-agnostic standard stuff
There are a lot more platform-dependent features and behaviours in the stdlib than most people realise, and if we start excluding anything which is not platform-agnostic, we'd exclude such major modules as sys, os, stat, time, subprocess, shutil, math, datetime and random, all of which include platform-specific behaviour or features.
-- so third party implementers can make a decision about how compatible they should be.
Are you suggesting that implementers can't currently make that decision?
So? It also includes Windows specific stuff, and POSIX specific stuff.
I'm just suggesting that "ships with cPython" and "is the stdlib" should mean different things -- because that would be useful.
Just repeating "it would be useful" doesn't make it so. How would it be useful? Do you think there are many (e.g.) POSIX developers who are confused by the lack of winreg module? If not, what problem are you trying to solve by redefining the standard library?
How is the "common library" different from the standard library? pathlib is another one of the major standard library modules that fails to be platform-agnostic. [...]
And anyone that runs that code on another Pyton could get a far more meaningful error
I don't see that there is much difference between:

py> import cffi  # PyPy specific
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'cffi'

py> from pypy import cffi
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'pypy'

-- Steven

On Tue, Jul 16, 2019 at 3:22 PM Steven D'Aprano <steve@pearwood.info> wrote:
no -- "should" and "ought" are synonyms in this context - the difference is "shipped by CPython" vs "in the CPython standard library". cPYton ships stuff that is cPython specific, doesn't it. Yes: an opportunity to bike-shed, an opportunity to derail the proposal,
an opportunity to increase the complexity of the proposal by an order of magnitude without a corresponding increase in usefulness.
Frankly it is unlcear to me what "the proposal" actually is anyway, so it's an opportunity to to something more expansive that provide a module that does nothing but provide the exact same information that is already in the docs (and implicit in the fact that third party libraries are usually installed in site-packages). The OP did refer to being able to point people to __builtins__ to see what's built in. It that's the only goal here, then we could make a simple "list_stdlib_modules" function.
I've made that point already -- you may not agree, but the idea is that there are things that "ship with cPython" and there are things that are really designed to be generically useful and counted on everywhere. I think that's a useful distinction. There is even a discussion right now on what to remove (or deprecate) from the stdlib -- perhaps there is a middle option -- keep it, but make it no longer part of the "common library".
it would be a subset -- but yes, that's the idea :-)
see above -- not necessarily -- in fact, leaving deprecated modules out of the stdlib namespace would be one good use case for my idea -- it would be far more clear that you were using something that is no longer "standard".
(What you call "cruft", someone else may call "critically important library I couldn't do without".)
Of course -- which is again why it could be useful to still give that user access, while also signaling clearly that they are using non-standard features.
All I'm saying is that the stdlib module could be defined as the platform-agnostic standard stuff
I think it's worth drawing a line between behaviour and features. Platform-specific behaviour is too bad, but probably unavoidable. But I've always (and this goes back to Python 1.5) thought that putting platform-specific features in standard modules was a wart. My prime example is the os module -- it is a wrapper around platform-specific modules, and most of it provides platform-neutral features, but there are a few oddballs in there. Why do I care? Because folks can write code on a posix system and have no idea that that code won't work on, e.g., Windows. It wouldn't seem such a burden to do "import os", and then, when you need posix-only features, "import posix" -- making it very clear that that particular part of the code will only run under posix.
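A tiny sketch of what that labelling would look like (posix is a real module on Unix-like systems; the guard just keeps the example runnable everywhere):

    import os                      # the portable parts

    if os.name == "posix":
        import posix               # explicitly labelled: POSIX-only from here on
        print("uid:", posix.getuid())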
-- so third party implementers can make a decision about how compatible they should be.
Are you suggesting that implementers can't currently make that decision?
of course they can -- nothing would change in that regard.
We have a simple definition of "standard library", namely anything documented here: https://docs.python.org/3/library/index.html
exactly my point -- should all that be in the same namespace? It has been because there was only one namespace anyway.
Do you think there are many (e.g.) POSIX developers who are confused by the lack of winreg module?
Of course not -- but that actually proves my point -- by calling it "winreg" it is clearly a Windows thing. Anyone using it will not expect that code to run on a non-Windows system. But if it were called "systemconfig", that would be a different story.
How is the "common library" different from the standard library?
Because "We already have a definition of the standard library", and that definition is "everything that ships and is documented with cPython" My suggestion is that we make something *new* which has a different defintion, and then people wont be saying that I'm proposing breaking things -- it's something different, maybe it should have different name.
pathlib is another one of the major standard library modules that fails to be platform-agnostic.
Though its intention is to provide a consistent interface for path manipulation -- and certainly most of it is consistent -- I'll take your word for it that not all of it is, but certainly most is. One reason folks should use it rather than doing things like: my_path = dir_name + "\\" + filename. The fact is that Python is almost entirely platform agnostic, which is a really great feature -- I'm suggesting it would be a tad better if the non-standard parts were more clearly labelled, that's all.
There is a substantial difference -- a user of the first version has to think "why the heck don't I have the cffi module? I had it on that other machine, and it's not listed in requirements, nor on PyPI??" A user getting the second version can immediately see that the missing module is related to PyPy. Even worse, under this proposal, they would get:

py> from stdlib import cffi  # PyPy specific
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'cffi'

And think "what the heck, it's in the darn standard library?!?! I must have a broken install"... All this is not all that big a deal, but if the definition of the "stdlib" module doesn't change at all, then I can't see the point of the proposal. But I don't have time to actually do any of this work for my idea, so we'll see what the OP comes up with. -CHB -- Christopher Barker, PhD Python Language Consulting - Teaching - Scientific Software Development - Desktop GUI and Web Development - wxPython, numpy, scipy, Cython

On Thu, Jul 18, 2019 at 8:53 AM Christopher Barker <pythonchb@gmail.com> wrote:
That might work if it's entire functions that have different behaviour. What about these features? https://docs.python.org/3/library/os.html#files-and-directories And what if support changes? https://docs.python.org/3/library/os.html#os.link Would there be posix.link() and windows.link() with identical signatures, with the latter being "only since 3.2"? Or would "posix + windows" be considered universal enough to put it into the os module? And what about os.stat(fn).st_ctime ? ChrisA
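For contrast, the way such differences are usually handled today is feature detection inside os rather than separate platform namespaces -- a small runnable sketch:

    import os
    import tempfile

    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "a.txt")
        open(src, "w").close()
        if hasattr(os, "link"):                  # hard links are not available everywhere
            os.link(src, os.path.join(d, "b.txt"))
        # st_ctime exists everywhere, but means creation time on Windows and
        # metadata-change time on Unix.
        print(os.stat(src).st_ctime)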

On Mon, Jul 15, 2019 at 01:36:06PM -0700, Andrew Barnert via Python-ideas wrote:
What exactly goes in stdlib?
The standard library modules documented at python.org. Note that being part of the std lib doesn't necessarily mean that it will be available, even on CPython. The contents of stdlib will be version specific, of course, and there are platform-specific modules too. Linux distros like Debian might shift them to a different OS package (deb or rpm) so they may be missing if the user didn't install all the packages.
Would MicroPython include framebuf in stdlib, PyPy include cffi, etc.?
They're not (I presume...) maintained by the CPython core devs and documented as standard library modules, so no.
Does it include things like test that are inside the lib/pythonX.Y directory but not intended to be imported?
Are they documented as public modules at python.org? Then yes, otherwise no.
Or “accelerators” like _datetime, or other “internal” modules like _compression?
Since they're private modules, it should make no difference to anyone except the implementer of the public module. If the maintainer of datetime wants _datetime in the stdlib namespace, fine, but that is not part of the datetime public interface.
What about __future__?
Despite the dunder name, it's a standard (and ordinary) public module: https://docs.python.org/3/library/__future__.html so yes, it is a standard library module that ought to appear in the stdlib namespace.
Would Apple’s Python distribution include their extra modules like PyObjC,
Is it documented at python.org as a standard library module? If so, then yes, if not, then no.
There's no need to customize it. If Debian removes (let's say) the math module, because reasons, then from stdlib import math will fail unless the user has installed the python-math deb. Let's KISS and not complicate it to the point it can't happen. We have a simple definition of "standard library", namely anything documented here: https://docs.python.org/3/library/index.html including the optional components. If there's some special case (let's say, the HovercraftFullOfEels module, which we want to document but for some reason we don't want to be accessible in the stdlib namespace), then it's fine for it to be left out (and documented as such). We ought to discourage implementations adding non-std libs to the stdlib namespace, but compliance is a Quality of Implementation issue, not a deal-breaker. If HolyGrailPython wants to package stdlib.black_knight we should neither encourage nor prevent them from doing so, but can "tut tut" loudly whenever anyone complains that from stdlib import black_knight fails in CPython. -- Steven

On Mon, Jul 15, 2019 at 10:47:06PM +0400, Abdur-Rahmaan Janhangeer wrote:
The builtins module doesn't exist only so we can "know builtins". If the only purpose of this proposal is to "know all modules in the std lib", it is better to just read the docs: https://docs.python.org/3/library/index.html

One possible advantage here might be to distinguish between:

import math
from stdlib import math

where the first one will use whatever "math" module appears first on the Python path, while the second will use the actual standard math module. But is this useful? Distinguishing between (for example) an open function defined in your module, and the builtin open, *is* useful and common:

open           # may be shadowed by a function defined in your module
builtins.open  # always the builtin function

but it isn't clear to me that shadowing parts of the std lib is useful or common (except by accident, which is a problem to fix not a feature to encourage). I think if you are interested in this, you ought to start by defining the benefits of the proposal: what do you expect to gain? -- Steven
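A runnable version of that comparison, for concreteness (the wrapper and its encoding choice are purely illustrative):

    import builtins

    def open(path):
        # Module-level open() shadows the builtin, but the real one is still
        # reachable through builtins -- the distinction described above.
        return builtins.open(path, encoding="utf-8")

    print(open is builtins.open)   # False: the module global shadows the builtin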

participants (6):
- Abdur-Rahmaan Janhangeer
- Andrew Barnert
- Chris Angelico
- Christopher Barker
- Steve Barnes
- Steven D'Aprano