On Mon, 6 Apr 2020 at 04:14, Stephen J. Turnbull wrote:
> That's OK for developers on their "own" machines, but I think there are still a lot of folks who need approval from corporate (though I haven't heard from Paul Moore on that theme in quite a while, maybe it's gotten better?)
It's actually got a bit worse, to the point where I don't bother trying to fight the system as much any more, so things have gone quieter because of that :-) But yes, I still believe that there are important use cases for code that only uses the stdlib, and "not being able to get to the internet" is a valid use case that we shouldn't just dismiss.
On the other hand, with regard to the comment you were replying to:
> Up to around a decade ago, installing third-party libraries was a huge mess, but nowadays, PyPI works, Python comes with pip pre-installed, and telling developers they need internet access when they first start a new project isn't considered onerous.
I think all of that is true. What I think is more important, though, is things that aren't so much "projects" as "scripts", and the fact that distributing such scripts is still problematic when they depend on 3rd party libraries. Here, I'm not talking about making full-fledged packages, or py2exe-style fully bundled apps, but "here's this monitoring script I wrote, let's ship it across 10 production systems and why don't you try it out on your dev boxes?" There's a significant step change in complexity between doing that when the script depends on just the stdlib, versus when it depends on a 3rd party module. And we don't have a really good answer to that yet (zipapps, and tools like shiv that wrap zipapps, are pretty good in practice, but they are not well known or commonly used).
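For readers who haven't met zipapps: the stdlib `zipapp` module can bundle a script directory (including any vendored dependencies) into a single `.pyz` file that runs anywhere a suitable Python is installed. A minimal sketch of the idea, with an invented `monitor` script and a stand-in `helper` module playing the role of a 3rd party dependency:

```python
import pathlib
import subprocess
import sys
import tempfile
import zipapp


def build_and_run() -> str:
    """Bundle a script plus a vendored module into a zipapp, then run it."""
    with tempfile.TemporaryDirectory() as tmp:
        app = pathlib.Path(tmp, "monitor")
        app.mkdir()
        # Hypothetical vendored dependency, standing in for a 3rd party module.
        (app / "helper.py").write_text("def status():\n    return 'ok'\n")
        # zipapp executes __main__.py when the archive is run.
        (app / "__main__.py").write_text(
            "import helper\nprint(helper.status())\n"
        )
        target = pathlib.Path(tmp, "monitor.pyz")
        zipapp.create_archive(app, target)
        # The .pyz is a single file: copy it to another box and run it there.
        out = subprocess.run(
            [sys.executable, str(target)],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()


if __name__ == "__main__":
    print(build_and_run())  # -> ok
```

Tools like shiv automate the "vendor the dependencies into the archive" step (roughly `pip install --target` plus `zipapp.create_archive`), but the end result is the same: one file to copy around, no pip run needed on the target machine.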