[pypy-dev] Benchmarking PyPy performance on real-world Django app

Igor Katson igor.katson at gmail.com
Fri Oct 7 20:07:53 CEST 2011


On 10/07/2011 08:50 PM, Maciej Fijalkowski wrote:
> On Fri, Oct 7, 2011 at 1:33 PM, Igor Katson<igor.katson at gmail.com>  wrote:
>> When I first started benchmarking one of my Django sites,
>> http://trip-travel.ru/ (using postgres driver pypq),
>> I was disappointed by the results: PyPy was visibly much slower than
>> cPython (judging by how the site loaded in the browser).
>>
>> But today I got some remarkable results: I finally got PyPy to run faster
>> than cPython, and found that it sped up after the page had been loaded
>> several hundred times with "ab" or "siege".
>>
>> Here are the results: mean response times for the home page measured with
>> Apache's "ab" (cPython 2.7.2, Django 1.3, PyPy 1.6.0), served with the
>> CherryPy WSGI server:
>>
>> After 10 requests (excluding the first request):
>> cPython - 163.529 ms
>> PyPy - 460.879 ms
>>
>> 50 requests more:
>> cPython - 168.539 ms
>> PyPy - 249.850 ms
>>
>> 100 requests more:
>> cPython - 166.278 ms
>> PyPy - 131.104 ms
>>
>> 100 requests more:
>> cPython - 165.820 ms
>> PyPy - 115.446 ms
>>
>> 300 requests more:
>> cPython - 165.543 ms
>> PyPy - 107.636 ms
>>
>> 300 requests more:
>> cPython - 166.425 ms
>> PyPy - 103.065 ms
> Thanks for doing the benchmarks :)
>
>> As we can see, the JIT needs a long time to warm up, but once it does, the
>> difference is quite noticeable.
>> By the way, with psycopg2 the site responds in 155.766 ms on average (only
>> 10 ms faster), so using PyPy with Django makes a lot of sense to me.
>>
>> For now, PyPy cannot run under uWSGI, which I use in production, but maybe
>> I'll switch to PyPy for production deployments if a "PyPy + PyPQ + some pure
>> Python WSGI server" stack outperforms uWSGI + cPython + psycopg2.
>> Still, having to load the page 500 times after each server restart is
>> inconvenient.
> I've heard of people using gunicorn. Maybe that's worth a try? Having to
> load the pages is indeed annoying, but you do need a number of loads
> before the results stop being noticeably slower :) We know the JIT
> warmup time is high; it's partly something to fix and partly an
> inherent property of the JIT.
>
>
Tried gunicorn; nothing special, the speed is roughly the same. 
Unfortunately, I noticed that a single instance takes way too much memory 
to put into production, where I pay for the memory actually used. 4 
uWSGI workers use 14 MB each, while PyPy's memory usage keeps growing: 
after a couple thousand requests a single worker was taking 250 MB, 
more than 15 times as much.
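(One way to track a worker's footprint from inside the process is the stdlib `resource` module. A sketch, not anything the thread's tools provide; note that `ru_maxrss` is reported in kilobytes on Linux but in bytes on Mac OS X, so the conversion below assumes Linux.)

```python
import resource

def peak_rss_mb():
    """Return this process's peak resident set size in MB.
    Assumes Linux, where ru_maxrss is measured in kilobytes."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0
```

Logging this value periodically from each worker makes it easy to see whether memory keeps climbing with request count, as observed above.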

