[pypy-dev] Preheating pypy
Matt Billenstein
matt at vazor.com
Sun Mar 18 18:14:24 EDT 2018
Seems you need to just trigger whatever heuristic causes the JIT to run on the
interesting codepaths during application startup.
Would having a small for loop in a module global namespace that called down
through your stack do the trick?
===== somemodule.py
def foo(...):
    # maybe this function calls through several layers of code in other
    # modules...
    ...

for i in range(20):
    foo(...)  # prewarm foo...
=====
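A runnable sketch of the same idea, with a hypothetical foo() standing in
for the real call chain (the loop count is a guess at the JIT's hot-loop
threshold; tune it for the actual application):

```python
# prewarm.py -- minimal sketch of import-time JIT warm-up.
# foo() is a placeholder for a function that calls down through the
# application's real layers; the warm-up loop runs at module import
# time, i.e. in the parent process, before gunicorn forks workers.

def foo(n):
    # placeholder work exercising the interesting codepaths
    total = 0
    for i in range(n):
        total += i
    return total

# Runs when the module is imported, so the (pre-fork) master process
# pays the warm-up cost once.
for _ in range(20):
    foo(100)
```

Whether the resulting JIT-compiled machine code actually ends up in
pages the OS can share copy-on-write after fork is a separate question
for the PyPy developers; this only shows how to trigger the warm-up
early.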
m
On Sun, Mar 18, 2018 at 11:59:21AM +0100, Nagy, Attila wrote:
> Hi,
>
> I use pypy to run an application in gunicorn.
> Gunicorn (as well) has a "preload" capability in multiprocess workers, which
> means it loads the application in one interpreter/process and instead of
> starting new interpreters/processes for additional workers, it just forks
> from the first one.
> This means the OS can share memory pages between the processes, which makes
> the app use less memory and start faster.
>
> This works nicely with pypy too, and the memory savings are significant
> (all the more so given that pypy uses much more memory than cpython).
>
> The problem is that each pypy process before the fork is "cold" and the JIT
> starts to compile code in the forked processes, independently from the
> original process, which makes a good deal of additional memory (and CPU) go
> wasted.
>
> It would be nice to make this happen in the original process, so the
> operating system's COW semantics could work for the JIT-ed code too.
>
> Is there a way to achieve this?
>
> Thanks,
>
> _______________________________________________
> pypy-dev mailing list
> pypy-dev at python.org
> https://mail.python.org/mailman/listinfo/pypy-dev
--
Matt Billenstein
matt at vazor.com
http://www.vazor.com/