On Thu, 29 Nov 2018 at 14:56, Benjamin Peterson email@example.com wrote:
While I'm sympathetic to users in such situations, I'm not sure how much we can really help them. These are the sorts of users who are likely to still be stuck using Python 2.6. Any stdlib improvements we discuss and implement today are easily a decade away from benefiting users in restrictive environments. On that kind of timescale, it's very hard to know what to do, especially since, as Paul says, we don't hear much feedback from such users.
As a user in that situation, I can confirm that there *are* situations where I am stuck using older versions of Python (2.6? Ha - luxury! I have to use 2.4 on some of our systems!) But there are also situations where I can do a one-off install of the latest version of Python (typically by copying install files from one machine to another until I get it to a machine with no internet access), whereas installing additional modules, while possible by similar means, is too painful to be worthwhile. The same applies when sharing scripts with other users who can handle "download and run the installer from python.org", but for whom "pip install" (including the proxy configuration to let pip see the internet, which isn't needed for the installer because our browsers auto-configure) is impossibly hard. So "latest Python, no PyPI access" is not an unrealistic model for me to work to.
I can't offer more than one person's feedback, but my experience is a real-life situation. And it's one where I've been able to push Python over other languages (such as Perl) *precisely* because the stdlib provides a big chunk of built-in functionality as part of the base install. If we hadn't been able to use Python with its stdlib, I'd probably have had to argue for Java (for the same "large stdlib" reasons).
The model these users are living in is increasingly at odds with how software is written and used these days.
I'd dispute that. It's increasingly at odds with how *open source* software and modern web applications are written (and my experience is just as limited in the opposite direction, so I'm sure this statement is just as inaccurate as the one I'm disputing ;-)). But from what I can see, there's a huge base of people in "enterprise" companies (the people who 20 years ago were still writing COBOL) who are just starting to use "modern" tools like Python, usually in a "grass roots" fashion, but whose corporate infrastructure hasn't got a clue how to deal with the sort of "everything is downloaded from the web as needed" environment that web-based development is embracing.
Browsers and Rust are updated every month. The node.js world is a frenzy of change. Are these users' environments running 5 year old browsers? I hope not for security's sake.
For people in those environments, I hope not as well. But for people in locked-down environments, where upgrading internal applications is a massive project, where upgrading browsers without sufficient testing could potentially break old but key business applications, and for whom "limit internet access" is a draconian but mostly effective safeguard, slow change and limited connectivity are a practical solution.
It frustrates me enormously, and I hate having to argue that it's the right solution, but it's certainly *not* as utterly wrong-headed as some people try to argue. Priorities differ.
At some point, languorous corporate environments will have to catch up to how modern software development is done to avoid seriously hampering their employees' effectiveness (and security!).
And employees making a grass-roots effort to do so by gradually introducing modern tools like Python are part of that process. Making it harder for them to demonstrate Python's benefits without first needing infrastructure-level changes does not help them. It's not necessarily Python's role to help that process, admittedly - but I personally have that goal, and therefore I'll continue to argue that the benefits of having a comprehensive stdlib are worth it.