This sounds like a great idea, but I've seen things like this tried before, and it never ends well. You end up with a gargantuan rabbit hole: running a basic script could now involve an internet connection and potentially root permissions.

IF one were to go this route, I would prefer bundling a `requirements.txt` into the distribution and then doing something like this in the script:

import pkgcheck

If any of the requirements were missing, then something like this would happen:

Traceback (most recent call last):
MissingPackageError: This script requires package(s) which are not installed: mypackage>=1.0, other_package

That way, you don't get the weird import errors, but you don't have to worry about all the subtleties of automatic downloading.
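To make the idea concrete, here is a minimal sketch of what such a `pkgcheck`-style helper could look like. Note that `pkgcheck`, `MissingPackageError`, and the `check` function are hypothetical names for illustration; the sketch uses `importlib.metadata` (available since Python 3.8) and only tests for package presence, ignoring version specifiers:

```python
# Hypothetical sketch of a pkgcheck-style helper: verify that the
# packages listed in a bundled requirements file are installed, and
# fail with one clear error message instead of scattered ImportErrors.
from importlib.metadata import version, PackageNotFoundError


class MissingPackageError(ImportError):
    """Raised when one or more required packages are not installed."""


def check(requirements):
    missing = []
    for req in requirements:
        # Strip version specifiers to get the bare distribution name;
        # this sketch does not validate the versions themselves.
        name = req.split(">=")[0].split("==")[0].strip()
        try:
            version(name)
        except PackageNotFoundError:
            missing.append(req)
    if missing:
        raise MissingPackageError(
            "This script requires package(s) which are not installed: "
            + ", ".join(missing)
        )
```

A script would then call `check(["mypackage>=1.0", "other_package"])` at startup and get the single friendly error shown above, with no network access involved.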

[ERROR]: Your autotools build scripts are 200 lines longer than your program. Something’s wrong.

On Sep 19, 2016 11:26 AM, "אלעזר" <> wrote:
Many proposals to add something to the stdlib are rejected here with the suggestion to add such a library to PyPI first. As someone noted, PyPI is not as reachable as the stdlib, and one must install that package first, which many people don't know how to do. Additionally, there is no natural distinction between 3rd-party dependencies and in-project imports (at least in tiny projects).

This could be made easier if the first line of the program declared the required library, and executing it tried to download and install that library if it is not installed yet. The 3rd-party dependencies would also be more explicit, and editors could then allow you to search for them as you type.

Of course it is *not* an alternative to real dependency management, but it would ease the burden on small scripts and tiny projects - which today simply break with errors that many users do not understand, instead of simply asking permission to install the dependency.
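For reference, scripts sometimes approximate this proposal today with a try/except around the import that offers to run pip. This is only a hedged sketch of that pattern, not the proposed mechanism itself; `require` is a hypothetical helper name:

```python
# Sketch of the "ask permission, then install" fallback some scripts
# use today: attempt the import, and on failure offer to install the
# package with pip before retrying.
import importlib
import subprocess
import sys


def require(package):
    try:
        return importlib.import_module(package)
    except ImportError:
        answer = input(f"{package} is not installed. Install it now? [y/N] ")
        if answer.strip().lower() != "y":
            raise
        # Use the running interpreter's pip to avoid installing into
        # the wrong Python environment.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", package]
        )
        return importlib.import_module(package)
```

This illustrates both the appeal (one line at the top of the script) and the rabbit hole (network access, user interaction, and possibly elevated permissions, exactly as objected to above).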


Python-ideas mailing list