I have 2 main concerns about PEP 582 that might just be me misunderstanding the pep.

My first concern is the use of CWD, and prepending ./_pypackages_ for scripts. For example, if you were in a directory with a _pypackages_ subdirectory, and had installed the module "super.important.module". My understanding is that any scripts you run will have "super.important.module" available to it before the system site-packages directory. Say you also run "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module" (and there is no _pypackages_ subdirectory in /usr/bin). You would be shadowing "super.important.module". In this case, this adds no more version isolation than "pip install --user", and adds to the confoundment factor for a new user. If this is a misunderstanding of the pep (which it very well might be!), then ignore that concern. If it's not a misunderstanding, I think that should be emphasized in the docs, and perhaps the pep.

My second concern is a little more... political. This pep does not attempt to cover all the use-cases of virtualenvs - which is understandable. However, this also means that we have to teach new users *both* right away in order to get them up and running, and teach them the complexities of both, and when to use one over the other. Instead of making it easier for the new user, this pep makes it harder. This also couldn't have come at a worse time with the growing use of pipenv which provides a fully third way of thinking about application dependencies (yes, pipenv uses virtualenvs under the hood, but it is a functionally different theory of operation from a user standpoint compared to traditional pip/virtualenv or this pep). Is it really a good idea to do this pep at this time?

In a vacuum, I like this pep. Aside from the (possible) issue of unexpected shadowing, it's clean and straightforward. It's easy to teach. But it doesn't exist in a vacuum, and we have to teach the methods it is intended to simplify anyway, and it exists in competition with other solutions.

I am not a professional teacher; I don't run python training courses. I do, however, volunteer quite a bit of time on the freenode channel. I get that the audience there is self-selecting to those who want to donate their time, and those who are having a problem (sometimes, those are the same people). This is the kind of thing that generates a lot of confusion and frustration to the new users I interact with there.
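The shadowing mechanism the concern describes is just sys.path ordering, and can be demonstrated directly. A minimal sketch (the directory and module names here are made up for illustration; "greeting" stands in for "super.important.module"):

```python
import sys, tempfile, pathlib

# Two directories each providing a module named "greeting";
# whichever appears earlier on sys.path wins, shadowing the other.
local_dir = pathlib.Path(tempfile.mkdtemp())
system_dir = pathlib.Path(tempfile.mkdtemp())
(local_dir / "greeting.py").write_text("WHERE = 'local __pypackages__'\n")
(system_dir / "greeting.py").write_text("WHERE = 'system site-packages'\n")

sys.path.insert(0, str(system_dir))
sys.path.insert(0, str(local_dir))  # mimics prepending ./__pypackages__

import greeting
print(greeting.WHERE)  # prints: local __pypackages__
```

Whether PEP 582 actually prepends the CWD's __pypackages__ for scripts is the question the replies below address.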
On Wed, 20 Feb 2019 at 22:40, Alex Walters <tritium-list@sdamon.com> wrote:
I have 2 main concerns about PEP 582 that might just be me misunderstanding the pep.
My first concern is the use of CWD, and prepending ./_pypackages_ for scripts. For example, if you were in a directory with a _pypackages_ subdirectory, and had installed the module "super.important.module". My understanding is that any scripts you run will have "super.important.module" available to it before the system site-packages directory. Say you also run "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module" (and there is no _pypackages_ subdirectory in /usr/bin). You would be shadowing "super.important.module".
From https://www.python.org/dev/peps/pep-0582/#specification: "In case of Python scripts, Python will try to find __pypackages__ in the same directory as the script." That behaviour then gets covered again in https://www.python.org/dev/peps/pep-0582/#security-considerations The initial paragraphs in the specification section are only talking about the cases where sys.path[0] is already getting set to the current working directory (e.g. the interactive prompt).
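That distinction can be sketched roughly as follows (a hypothetical helper for illustration only; the real lookup would live in interpreter startup, not user-level Python code):

```python
import sys
from pathlib import Path
from typing import Optional

def pypackages_dir(argv0: Optional[str]) -> Optional[Path]:
    """Sketch of the PEP 582 lookup rule described above.

    For a script, Python would look for __pypackages__ next to the
    script itself; for the interactive prompt (where sys.path[0] is
    already the current working directory), it would look in the CWD.
    """
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    base = Path(argv0).resolve().parent if argv0 else Path.cwd()
    candidate = base / "__pypackages__" / version
    return candidate if candidate.is_dir() else None
```

So a script in /usr/bin would only pick up /usr/bin/__pypackages__, never the one in the user's current directory.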
My second concern is a little more... political.
This pep does not attempt to cover all the use-cases of virtualenvs - which is understandable. However, this also means that we have to teach new users *both* right away in order to get them up and running, and teach them the complexities of both, and when to use one over the other. Instead of making it easier for the new user, this pep makes it harder. This also couldn't have come at a worse time with the growing use of pipenv which provides a fully third way of thinking about application dependencies (yes, pipenv uses virtualenvs under the hood, but it is a functionally different theory of operation from a user standpoint compared to traditional pip/virtualenv or this pep).
Is it really a good idea to do this pep at this time?
One of the potential amendments to the PEP is to have it simply be a more visible alternative naming scheme for the existing `.venv` convention (which `pipenv` supports via the PIPENV_VENV_IN_PROJECT environment setting). If that option gets pursued, then the key new behaviours would be:

1. Interpreters auto-activating the __pypackages__ venv on startup (without requiring `pipenv run` or a functional equivalent)
2. Installers auto-targeting the __pypackages__ venv at install time (without requiring `pipenv install` or a functional equivalent)

That way, for folks that already understand virtual environments, the explanation is exactly that: "It's like .venv, but interpreters activate it automatically, and installers target it automatically"

Whereas for folks that *don't* already understand virtual environments, the benefits are those described in the PEP: we can just tell people "When you have a __pypackages__ subdirectory in the current directory, anything you install will be installed there by default, and then be available for import when running Python from that directory".

The other potential new behaviour which the PEP implies in its examples but doesn't currently spell out in the text is that it could potentially be defined in a way that better supports multiple interpreter versions sharing the same subdirectory (i.e. rather than being associated with a specific Python interpreter installation the way .venv is, the __pypackages__ directory may instead have separate subdirectories for each X.Y Python version). That would still be similar in spirit to `.venv`, it would just be natively version-aware.

The benefit of doing that is better handling of situations where the required dependencies vary based on the Python version in use, while the downside is that you potentially end up with multiple copies of the dependencies, even when they could have been shared without any problems.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
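The version-aware layout Nick describes might look like the following (a hypothetical sketch; the PEP's examples imply per-version subdirectories, but the text does not yet pin the layout down):

```python
import sys
from pathlib import Path

# Hypothetical shared layout, with one project directory serving
# several interpreter versions:
#
#   project/
#     __pypackages__/
#       3.7/lib/...   <- used only by a 3.7 interpreter
#       3.8/lib/...   <- used only by a 3.8 interpreter
#
# Each interpreter would select only its own X.Y subdirectory:
version = f"{sys.version_info.major}.{sys.version_info.minor}"
own_subdir = Path("__pypackages__") / version
print(own_subdir)  # e.g. __pypackages__/3.8 under Python 3.8
```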
On 20/2/2019, at 20:38, Alex Walters <tritium-list@sdamon.com> wrote:
I have 2 main concerns about PEP 582 that might just be me misunderstanding the pep.
My first concern is the use of CWD, and prepending ./_pypackages_ for scripts. For example, if you were in a directory with a _pypackages_ subdirectory, and had installed the module "super.important.module". My understanding is that any scripts you run will have "super.important.module" available to it before the system site-packages directory. Say you also run "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module" (and there is no _pypackages_ subdirectory in /usr/bin). You would be shadowing "super.important.module".
In this case, this adds no more version isolation than "pip install --user", and adds to the confoundment factor for a new user. If this is a misunderstanding of the pep (which it very well might be!), then ignore that concern. If it's not a misunderstanding, I think that should be emphasized in the docs, and perhaps the pep.
It is my understanding that the PEP already covers this: https://www.python.org/dev/peps/pep-0582/#security-considerations
While executing a Python script, it will not consider the __pypackages__ in the current directory, instead if there is a __pypackages__ directory in the same path of the script, that will be used.
This is also mentioned in the Specification section:
In case of Python scripts, Python will try to find __pypackages__ in the same directory as the script. If found (along with the current Python version directory inside), then it will be used, otherwise Python will behave as it does currently.
My second concern is a little more... political.
This pep does not attempt to cover all the use-cases of virtualenvs - which is understandable. However, this also means that we have to teach new users *both* right away in order to get them up and running, and teach them the complexities of both, and when to use one over the other. Instead of making it easier for the new user, this pep makes it harder. This also couldn't have come at a worse time with the growing use of pipenv which provides a fully third way of thinking about application dependencies (yes, pipenv uses virtualenvs under the hood, but it is a functionally different theory of operation from a user standpoint compared to traditional pip/virtualenv or this pep).
Is it really a good idea to do this pep at this time?
I am not in the position to comment on this in general, although I fully understand the concern as a volunteer myself. As one of the Pipenv maintainers, however, it is my personal opinion that this PEP would not end up in the “yet another standard” situation, but could even be beneficial to Pipenv, if done correctly. I hope this can provide some confidence :)
In a vacuum, I like this pep. Aside from the (possible) issue of unexpected shadowing, it's clean and straightforward. It's easy to teach. But it doesn't exist in a vacuum, and we have to teach the methods it is intended to simplify anyway, and it exists in competition with other solutions.
I am not a professional teacher; I don't run python training courses. I do, however, volunteer quite a bit of time on the freenode channel. I get that the audience there is self-selecting to those who want to donate their time, and those who are having a problem (sometimes, those are the same people). This is the kind of thing that generates a lot of confusion and frustration to the new users I interact with there.

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-leave@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at https://mail.python.org/archives/list/distutils-sig@python.org/message/SFMFK...
On 20Feb.2019 0533, Tzu-ping Chung wrote:
As one of the Pipenv maintainers, however, it is my personal opinion that this PEP would not end up in the “yet another standard” situation, but could even be beneficial to Pipenv, if done correctly.
I hope this can provide some confidence :)
I'd love to hear more about how Pipenv would make use of it. So far it's only really been designed with pip in mind (and their team in the discussion), but we've explicitly left it _very_ tool independent. So if you can describe how it would work with Pipenv, that would be helpful for finding things that need changing. Also, the `pythonloc` package is an implementation of this that anyone can try out today - https://pypi.org/project/pythonloc/ (the major difference is that when implemented, you won't have to use "pythonloc" and "piploc" to get the new behaviour). Cheers, Steve
I don’t have a ton of concern with regard to pipenv. We already just jump through hoops to modify paths and such at runtime, this honestly sounds like a cleaner approach. Obviously we won’t actually get to clean up the code for a long time but you know...

My basic position is that we are just pointing at python libraries and code at the end of the day. The only real concern is scripts— where will they live, etc.

When TP mentions it will benefit Pipenv, one way he likely means is that it gives us the option of standardizing on where to put environments. This has been a bit contentious and our current default is one that is divisive. It also eliminates a lot of complexity that arises from that choice, including how we handle hashing and case sensitivity to derive virtualenv names across platforms (this has bitten us more than once).

One final thing this enables, as far as I understand, is a sort of npm-like option for ignoring resolution conflicts and simply performing a sort of nested installation of subdependencies inside a top level dependency’s __pypackages__ folder. So if you did install two packages with a conflict, they wouldn’t necessarily have to find a resolution.

My only concerns with this PEP relate to bloat. I’d be interested in ways to reduce duplication across a system with many such folders.

Dan Ryan
// pipenv maintainer
gh: @techalchemy
On Feb 20, 2019, at 10:19 AM, Steve Dower <steve.dower@python.org> wrote:
On 20Feb.2019 0533, Tzu-ping Chung wrote:

As one of the Pipenv maintainers, however, it is my personal opinion that this PEP would not end up in the “yet another standard” situation, but could even be beneficial to Pipenv, if done correctly.
I hope this can provide some confidence :)
I'd love to hear more about how Pipenv would make use of it. So far it's only really been designed with pip in mind (and their team in the discussion), but we've explicitly left it _very_ tool independent. So if you can describe how it would work with Pipenv, that would be helpful for finding things that need changing.
Also, the `pythonloc` package is an implementation of this that anyone can try out today - https://pypi.org/project/pythonloc/ (the major difference is that when implemented, you won't have to use "pythonloc" and "piploc" to get the new behaviour).
Cheers,
Steve
On 20Feb2019 0803, Dan Ryan wrote:
One final thing this enables as far as I understand is a sort of npm-like option for ignoring resolution conflicts and simply performing a sort of nested installation of subdependencies inside a top level dependency’s __pypackages__ folder. So if you did install two packages with a conflict, they wouldn’t necessarily have to find a resolution.
I'm not sure where you got this idea from? It's certainly not part of the proposal. To be totally clear, and maybe this needs to be in the PEP (probably in three more various forms to make sure everyone gets it), you can emulate most of the PEP today with "pip install --target __pypackages__/3.7 ..." and "$env:PYTHONPATH = './__pypackages__/3.7'". Nothing else changes. The advantage is that even this amount of friction goes away.
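The emulation Steve describes can also be approximated in-process; a minimal sketch of what the `PYTHONPATH` half of it amounts to (the PEP would make this automatic):

```python
import sys
from pathlib import Path

# In-process equivalent of `$env:PYTHONPATH = './__pypackages__/3.7'`,
# generalized to the running interpreter's version:
version = f"{sys.version_info.major}.{sys.version_info.minor}"
target = Path("__pypackages__") / version
sys.path.insert(0, str(target))

# Anything previously installed with
#   pip install --target __pypackages__/<X.Y> <package>
# now resolves ahead of site-packages.
```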
My only concerns with this PEP relate to bloat. I’d be interested in ways to reduce duplication across a system with many such folders.
Are you interested in ways to reduce duplication across a system with many virtual environments? Again, there's literally no difference here (except under the PEP there isn't a need to _also_ duplicate the Python binaries).

(And I snipped your concern about scripts/bin folders, but that's very much a point of open discussion. It really relies on some integration with whatever terminal a user happens to be using, and so far, the PEP does not concern itself with modifying PATH, only sys.path.)

Cheers,
Steve
On Wed, 20 Feb 2019 at 16:28, Steve Dower <steve.dower@python.org> wrote:
To be totally clear, and maybe this needs to be in the PEP (probably in three more various forms to make sure everyone gets it), you can emulate most of the PEP today with "pip install --target __pypackages__/3.7 ..." and "$env:PYTHONPATH = './__pypackages__/3.7'". Nothing else changes. The advantage is that even this amount of friction goes away.
Sorry for the drive-by comment (I don't have time to read the PEP right now) but does this mean that "pip install --upgrade" won't be supported against __pypackages__ directories (it currently doesn't work properly with --target, IIRC - certainly *some* things go weird with --target) or will the PEP include mechanisms to allow pip to work more cleanly with __pypackages__ than it currently does with --target? Paul
On 20Feb2019 0839, Paul Moore wrote:
On Wed, 20 Feb 2019 at 16:28, Steve Dower <steve.dower@python.org> wrote:
To be totally clear, and maybe this needs to be in the PEP (probably in three more various forms to make sure everyone gets it), you can emulate most of the PEP today with "pip install --target __pypackages__/3.7 ..." and "$env:PYTHONPATH = './__pypackages__/3.7'". Nothing else changes. The advantage is that even this amount of friction goes away.
Sorry for the drive-by comment (I don't have time to read the PEP right now) but does this mean that "pip install --upgrade" won't be supported against __pypackages__ directories (it currently doesn't work properly with --target, IIRC - certainly *some* things go weird with --target) or will the PEP include mechanisms to allow pip to work more cleanly with __pypackages__ than it currently does with --target?
The PEP explicitly doesn't say anything about what pip can/should do (at Donald's request), but the assumption is that it will support it properly. (Hence "*emulate* *most* of the PEP".) My point is just that the intent is for it to be a "normal" package directory, not a reimagining of packaging. Cheers, Steve
On Wed, 20 Feb 2019 at 16:44, Steve Dower <steve.dower@python.org> wrote:
On 20Feb2019 0839, Paul Moore wrote:
On Wed, 20 Feb 2019 at 16:28, Steve Dower <steve.dower@python.org> wrote:
To be totally clear, and maybe this needs to be in the PEP (probably in three more various forms to make sure everyone gets it), you can emulate most of the PEP today with "pip install --target __pypackages__/3.7 ..." and "$env:PYTHONPATH = './__pypackages__/3.7'". Nothing else changes. The advantage is that even this amount of friction goes away.
Sorry for the drive-by comment (I don't have time to read the PEP right now) but does this mean that "pip install --upgrade" won't be supported against __pypackages__ directories (it currently doesn't work properly with --target, IIRC - certainly *some* things go weird with --target) or will the PEP include mechanisms to allow pip to work more cleanly with __pypackages__ than it currently does with --target?
The PEP explicitly doesn't say anything about what pip can/should do (at Donald's request), but the assumption is that it will support it properly. (Hence "*emulate* *most* of the PEP".)
My point is just that the intent is for it to be a "normal" package directory, not a reimagining of packaging.
OK, cool. Thanks for clarifying. I suspect there's some non-trivial work needed for pip to support __pypackages__ (and potentially some questions at the interop standards level about how "installation frontends" should work in general when faced with __pypackages__), but that's fine. I just wanted to call out that it might not be trivial or transparent. Paul
On the first point, I believe I heard this via word of mouth or some kind of media, not too sure. I didn’t have time to verify so I took it at face value — my fault on that one.

On the second point, yes, I am interested in reducing duplication in all cases; it’s a big problem for us with virtualenv usage too.

Dan Ryan
// pipenv maintainer
gh: @techalchemy
On Feb 20, 2019, at 11:27 AM, Steve Dower <steve.dower@python.org> wrote:
On 20Feb2019 0803, Dan Ryan wrote:

One final thing this enables as far as I understand is a sort of npm-like option for ignoring resolution conflicts and simply performing a sort of nested installation of subdependencies inside a top level dependency’s __pypackages__ folder. So if you did install two packages with a conflict, they wouldn’t necessarily have to find a resolution.
I'm not sure where you got this idea from? It's certainly not part of the proposal.
To be totally clear, and maybe this needs to be in the PEP (probably in three more various forms to make sure everyone gets it), you can emulate most of the PEP today with "pip install --target __pypackages__/3.7 ..." and "$env:PYTHONPATH = './__pypackages__/3.7'". Nothing else changes. The advantage is that even this amount of friction goes away.
My only concerns with this PEP relate to bloat. I’d be interested in ways to reduce duplication across a system with many such folders.
Are you interested in ways to reduce duplication across a system with many virtual environments? Again, there's literally no difference here (except under the PEP there isn't a need to _also_ duplicate the Python binaries).
(And I snipped your concern about scripts/bin folders, but that's very much an point of open discussion. It really relies on some integration with whatever terminal a user happens to be using, and so far, the PEP does not concern itself with modifying PATH, only sys.path.)
Cheers, Steve
Running entrypoints/executables in the bin directory has been solved by the node community with npx (https://www.npmjs.com/package/npx):
Executes <command> either from a local node_modules/.bin, or from a central cache, installing any packages needed in order for <command> to run.
I built similar support into pipx as well, so `pipx run ENTRYPOINT` will search the appropriate `__pypackages__` path for the bin dir and the entrypoint. This is not the only way to solve it, but it seems to be working for node. Indeed, massively popular projects like `create-react-app` (https://github.com/facebook/create-react-app#quick-overview) include npx as their sole installation instructions.
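An npx-style lookup of that kind might be sketched as follows (a hypothetical helper for illustration; not pipx's actual implementation):

```python
import sys
from pathlib import Path
from typing import Optional

def find_local_entrypoint(name: str, start: Path) -> Optional[Path]:
    """Walk up from `start` looking for a __pypackages__/<X.Y>/bin/<name>
    script, the way npx searches for a local node_modules/.bin."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    for directory in [start, *start.parents]:
        candidate = directory / "__pypackages__" / version / "bin" / name
        if candidate.is_file():
            return candidate
    return None  # caller falls back to PATH, a cache, or installation
```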
-----Original Message-----
From: chadsmith27@gmail.com <chadsmith27@gmail.com>
Sent: Thursday, February 28, 2019 12:42 PM
To: distutils-sig@python.org
Subject: [Distutils] Re: PEP-582 concerns
Running entrypoints/executables in the bin directory has been solved by the node community with npx (https://www.npmjs.com/package/npx):

I kind of feel that "third party tool can/will use this feature" is orthogonal to "how the interpreter behaves out of the box" - unless I misunderstand and you are suggesting python grow support for launching entrypoints from the python executable.
Executes <command> either from a local node_modules/.bin, or from a central cache, installing any packages needed in order for <command> to run.
I built similar support into pipx as well, so `pipx run ENTRYPOINT` will search the appropriate `__pypackages__` path for the bin dir and the entrypoint. This is not the only way to solve it, but it seems to be working for node.
Indeed, massively popular projects like `create-react-app` (https://github.com/facebook/create-react-app#quick-overview) include npx as their sole installation instructions.
On Wed, Feb 20, 2019, 08:13 Dan Ryan <dan@danryan.co> wrote:
I don’t have a ton of concern with regard to pipenv. We already just jump through hoops to modify paths and such at runtime, this honestly sounds like a cleaner approach. Obviously we won’t actually get to clean up the code for a long time but you know...
My basic position is that we are just pointing at python libraries and code at the end of the day. The only real concern is scripts— where will they live, etc.
Yeah, __pypackages__ has no way to handle scripts, and also no way to access packages when you're running from a directory. Pipenv already handles both of these cases fine today, so I'm not sure how having __pypackages__ several years from now could help you.
One final thing this enables as far as I understand is a sort of npm-like option for ignoring resolution conflicts and simply performing a sort of nested installation of subdependencies inside a top level dependency’s __pypackages__ folder. So if you did install two packages with a conflict, they wouldn’t necessarily have to find a resolution.
I don't think __pypackages__ would change anything here. The blocker for doing npm-style nested subdependencies in Python isn't that we only have one folder, it's that we only have one sys.modules, and I don't think there are any proposals to change that. -n
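Nathaniel's point can be seen directly: the import system caches modules in `sys.modules`, keyed only by the bare module name, so one process holds at most one module per name:

```python
import sys

# A second import of an already-imported name returns the cached
# object rather than re-reading it from disk, so two conflicting
# versions of one package cannot coexist in a single process.
import json
again = __import__("json")
assert again is json
assert sys.modules["json"] is json
```

Nested npm-style installs would need per-importer namespaces, which no current proposal provides.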
On 20Feb2019 0831, Nathaniel Smith wrote:
Yeah, __pypackages__ has no way to handle scripts, and also no way to access packages when you're running from a directory. Pipenv already handles both of these cases fine today, so I'm not sure how having __pypackages__ several years from now could help you.
Uh, it totally has both. It has no way to handle updating your terminal's environment for you, but it can put scripts *somewhere* ;)

It can also handle accessing packages when running from your project directory. If you meant subdirectory, sure, that would be a major security issue to do that (much as I want to), but if you meant "both scripts and -m don't work" then that's just incorrect.

That said, I prefer the approach of pipx (https://pypi.org/project/pipx/) for scripts anyway. It too has the problem of not updating your PATH for you, but at least it keeps tools separate from dependencies, as they should be.

Cheers,
Steve
On Wed, Feb 20, 2019 at 8:49 AM Steve Dower <steve.dower@python.org> wrote:
On 20Feb2019 0831, Nathaniel Smith wrote:
Yeah, __pypackages__ has no way to handle scripts, and also no way to access packages when you're running from a directory. Pipenv already handles both of these cases fine today, so I'm not sure how having __pypackages__ several years from now could help you.
Uh, it totally has both. It has no way to handle updating your terminal's environment for you, but it can put scripts *somewhere* ;)
It can also handle accessing packages when running from your project directory. If you meant subdirectory, sure, that would be a major security issue to do that (much as I want to), but if you meant "both scripts and -m don't work" then that's just incorrect.
Ugh, yeah, editing fail, I meant "subdirectory". And yeah, of course you can make both of these work, but I was specifically replying to Dan's comment about how he doesn't like that pipenv has to jump through hoops and mess with paths manually.

Maybe a better way to put it would be: the interpreter changes proposed in PEP 582 don't help pipenv, because even if pipenv ends up using __pypackages__ then the way it does it will be by jumping through hoops and messing with paths manually. The part that might benefit pipenv is to have a conventional place to put its environments. But that part doesn't need interpreter changes or even a PEP.
That said, I prefer the approach of pipx (https://pypi.org/project/pipx/) for scripts anyway. It too has the problem of not updating your PATH for you, but at least it keeps tools separate from dependencies, as they should be.
I think this is the third time we've had this conversation in a week :-(. Pipx is great for the cases it targets, and if it's sufficient for all of your use cases then that's great, but it isn't sufficient for mine, and that's not because your use cases are right and mine are wrong. (For anyone reading this and wondering about context, see: https://discuss.python.org/t/structured-exchangeable-lock-file-format-requir...)

-n

--
Nathaniel J. Smith -- https://vorpus.org
On 20Feb2019 0927, Nathaniel Smith wrote:
That said, I prefer the approach of pipx (https://pypi.org/project/pipx/) for scripts anyway. It too has the problem of not updating your PATH for you, but at least it keeps tools separate from dependencies, as they should be.
I think this is the third time we've had this conversation in a week :-(. Pipx is great for the cases it targets, and if it's sufficient for all of your use cases then that's great, but it isn't sufficient for mine, and that's not because your use cases are right and mine are wrong.
This should just be the description of distutils-sig :) Cheers, Steve
On 20/2/2019, at 23:19, Steve Dower <steve.dower@python.org> wrote:
On 20Feb.2019 0533, Tzu-ping Chung wrote:
As one of the Pipenv maintainers, however, it is my personal opinion that this PEP would not end up in the “yet another standard” situation, but could even be beneficial to Pipenv, if done correctly.
I hope this can provide some confidence :)
I'd love to hear more about how Pipenv would make use of it. So far it's only really been designed with pip in mind (and their team in the discussion), but we've explicitly left it _very_ tool independent. So if you can describe how it would work with Pipenv, that would be helpful for finding things that need changing.
When you run `pipenv install` (roughly analogous to `pip install -r`), Pipenv creates a virtual environment somewhere on the machine (depending on various configurations), and installs packages into it. Afterward the user can run commands like `pipenv run python`, and Pipenv would activate the virtual environment for the command.

With PEP 582, __pypackages__ can be used instead of virtual environments, and since “activation” is done automatically by the interpreter and pip, `pipenv run` can do less than it currently needs to. There are still some ergonomics problems, e.g. how does Pipenv know what Python version to install into, but I don’t think there’s anything in the PEP at the moment that would make the adoption impossible. We’ll definitely try to be heard if any blockers appear :)
Also, the `pythonloc` package is an implementation of this that anyone can try out today - https://pypi.org/project/pythonloc/ (the major difference is that when implemented, you won't have to use "pythonloc" and "piploc" to get the new behaviour).
Cheers, Steve
I'd caution against folks getting too worked up about PEP 582. I know it's been getting a lot of attention on social media recently, but it's a draft that hasn't even been submitted for discussion yet. Most PEPs at this stage never end up going anywhere. And in general, when people start digging in on positions for and against something it always leads to worse decisions, and the earlier this happens the worse it gets.

It has some interesting ideas and also some real limitations. I think it's a good signal that there are folks interested in helping make the python dev workflow easier, including changing the interpreter if that turns out to be the right thing to do. That's really all it means so far.

I wonder if we should stick a header on the PEP draft saying something like this? There's a lot of scattershot responses happening and I think a lot of the people reacting are lacking context.

-n

On Wed, Feb 20, 2019, 04:40 Alex Walters <tritium-list@sdamon.com> wrote:
I have 2 main concerns about PEP 582 that might just be me misunderstanding the pep.
My first concern is the use of CWD, and prepending ./_pypackages_ for scripts. For example, if you were in a directory with a _pypackages_ subdirectory, and had installed the module "super.important.module". My understanding is that any scripts you run will have "super.important.module" available to it before the system site-packages directory. Say you also run "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module" (and there is no _pypackages_ subdirectory in /usr/bin). You would be shadowing "super.important.module".
In this case, this adds no more version isolation than "pip install --user", and adds to the confoundment factor for a new user. If this is a misunderstanding of the pep (which it very well might be!), then ignore that concern. If it's not a misunderstanding, I think that should be emphasized in the docs, and perhaps the pep.
My second concern is a little more... political.
This PEP does not attempt to cover all the use cases of virtualenvs, which is understandable. However, this also means that we have to teach new users *both* right away in order to get them up and running, teach them the complexities of both, and teach them when to use one over the other. Instead of making things easier for the new user, this PEP makes them harder. It also couldn't have come at a worse time, given the growing use of pipenv, which provides a fully third way of thinking about application dependencies (yes, pipenv uses virtualenvs under the hood, but from a user standpoint it is a functionally different theory of operation compared to traditional pip/virtualenv or this PEP).

Is it really a good idea to advance this PEP at this time?

In a vacuum, I like this PEP. Aside from the (possible) issue of unexpected shadowing, it's clean and straightforward. It's easy to teach. But it doesn't exist in a vacuum: we have to teach the methods it is intended to simplify anyway, and it exists in competition with other solutions.
I am not a professional teacher; I don't run Python training courses. I do, however, volunteer quite a bit of time on the freenode channel. I get that the audience there is self-selecting: those who want to donate their time, and those who are having a problem (sometimes, those are the same people). This is the kind of thing that generates a lot of confusion and frustration for the new users I interact with there.

--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-leave@python.org
https://mail.python.org/mailman3/lists/distutils-sig.python.org/
Message archived at https://mail.python.org/archives/list/distutils-sig@python.org/message/SFMFK...
I like the PEP at first glance. I have long thought that virtualenv was a weird solution to an artificial problem (notwithstanding that all programming problems are artificial). Virtualenv looks good only because a global, interpreter-centric environment is bad. A program-centric alternative is welcome.

On Wed, Feb 20, 2019, 08:57 Nathaniel Smith <njs@pobox.com> wrote:
On 20 Feb 2019 05:56, Nathaniel Smith wrote:
I wonder if we should stick a header on the PEP draft saying something like this? There's a lot of scattershot responses happening and I think a lot of the people reacting are lacking context.
I agree; I think the amount of attention it's getting justifies making it clearer than usual that it has not even been considered for acceptance. (Maybe we could even sneakily suggest that people could show their support by helping pip/virtualenv/pipenv burn down their bug counts so they're ready to implement it ;) )

At the same time, it's nice to have a PEP be "controversial" for good reasons for a change. Nobody really hates it; they mostly just misread it or have different scopes in mind.

Cheers,
Steve
On Wed, Feb 20, 2019 at 5:59 AM Nathaniel Smith <njs@pobox.com> wrote:
I'd caution against folks getting too worked up about PEP 582. I know it's been getting a lot of attention on social media recently, but, it's a draft that hasn't even been submitted for discussion yet.
To this point, is this the right time and place to be discussing this PEP? I don't mind any discussion happening myself, but I'm wondering more for myself: should I be digging up the questions I had about it and asking them here, or continue holding off?

—Chris
On Wed, Feb 20, 2019 at 12:27 PM Chris Jerdonek <chris.jerdonek@gmail.com> wrote:
To this point, is this the right time and place to be discussing this PEP? I don’t mind any discussion happening myself, but I’m wondering more for me — if I should be digging up the questions I had about it and ask them here, or continue holding off.
Technically, any PEP that isn't rejected, withdrawn, or deferred is open for discussion; otherwise the PEP was submitted prematurely. But in actual practice you need to ask the PEP authors.

-Brett
participants (11)
- Alex Walters
- Brett Cannon
- chadsmith27@gmail.com
- Chris Jerdonek
- Dan Ryan
- Daniel Holth
- Nathaniel Smith
- Nick Coghlan
- Paul Moore
- Steve Dower
- Tzu-ping Chung