[Python-Dev] Python startup optimization: script vs. service

Victor Stinner victor.stinner at gmail.com
Mon Oct 2 09:26:09 EDT 2017


2017-10-02 13:10 GMT+02:00 INADA Naoki <songofacandy at gmail.com>:
> https://github.com/python/cpython/pull/3796
> In this PR, lazy loading only happens when uuid1 is used.
> But uuid1 is very uncommon nowadays.

Antoine Pitrou added a new C extension _uuid which is imported as soon
as uuid(.py) is imported. On Linux at least, the main "overhead" is
still paid on "import uuid". But Antoine's change optimized the
"import uuid" time a lot!

> https://github.com/python/cpython/pull/3757
> In this PR, singledispatch lazily imports types and weakref.
> But singledispatch is used as a decorator.
> So if a web application uses singledispatch, these modules are loaded before preforking.

While "import module" is fast, maybe we should use sometimes a global
variable to cache the import.

module = None  # module-level cache, filled on the first call

def func():
    global module
    if module is None:
        import module  # deferred until func() is actually called
    ...

I'm not sure that it's possible to write a helper for such a pattern.
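
That said, importlib.util.LazyLoader can get close. Here is a minimal
sketch based on the recipe from the importlib documentation (the
lazy_import name is just for illustration); the module object is
created immediately but only really executed on first attribute access:

import importlib.util
import sys

def lazy_import(name):
    # Wrap the module's loader so that executing the module is deferred
    # until the first attribute access on the returned module object.
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module

uuid = lazy_import("uuid")  # cheap: nothing executed yet
print(uuid.uuid4())         # the real import work happens here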

In *this case* it's ok, since @singledispatch is mostly designed to be
used on top-level functions, not on nested functions. So in practice
the overhead is only paid at startup, not at runtime.
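
For example, in the typical top-level usage the decorator (and whatever
it imports internally) runs once at module import time, so nothing is
added to the per-call cost:

from functools import singledispatch

@singledispatch          # runs once, when this module is imported
def describe(value):
    return "object: %r" % (value,)

@describe.register(int)  # also runs once, at import time
def _(value):
    return "int: %d" % value

print(describe(3))       # int: 3
print(describe("spam"))  # object: 'spam'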

> Maybe copying `tokenize.open()` into linecache is better than lazily
> loading tokenize.

Please don't copy code; only do that if we have no other choice.
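
The lazy-import pattern works there too. Something like this sketch
(a hypothetical helper, not necessarily what the actual PR does) keeps
a single copy of the code in tokenize and still avoids paying for it on
"import linecache":

def _read_source(fullname):
    # Defer "import tokenize" to the moment a source file really has to
    # be read, instead of duplicating tokenize.open() in linecache.
    import tokenize
    with tokenize.open(fullname) as fp:
        return fp.readlines()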

> Anyway, I completely agree with you; we should be careful enough about
> lazy (importing | compiling).

I think that most core devs are aware of the tradeoffs, and we try to
find a compromise on a case-by-case basis.

Victor
