[pypy-dev] Benchmarking PyPy performance on real-world Django app

Maciej Fijalkowski fijall at gmail.com
Fri Oct 7 19:08:45 CEST 2011


On Fri, Oct 7, 2011 at 7:04 PM, Michael Foord <fuzzyman at gmail.com> wrote:
>
>
> On 7 October 2011 17:50, Maciej Fijalkowski <fijall at gmail.com> wrote:
>>
>> On Fri, Oct 7, 2011 at 1:33 PM, Igor Katson <igor.katson at gmail.com> wrote:
>> > When I first started benchmarking one of my Django sites,
>> > http://trip-travel.ru/ (using postgres driver pypq),
>> > I was disappointed by the results. PyPy was visibly much slower than
>> > cPython (I just judged by how the site loaded in the browser)
>> >
>> > But today I got some incredible results: I finally made PyPy work
>> > faster than cPython, and found out that it got faster after loading
>> > the page several hundred times with "ab" or "siege"
>> >
>> > Here are the results for mean response time, querying the home page with
>> > apache's "ab" (cPython 2.7.2, Django 1.3, PyPy 1.6.0), served with the
>> > cherrypy wsgi server:
>> >
>> > After 10 requests (excluding the first request):
>> > cPython - 163.529 ms
>> > PyPy - 460.879 ms
>> >
>> > 50 requests more:
>> > cPython - 168.539 ms
>> > PyPy - 249.850 ms
>> >
>> > 100 requests more:
>> > cPython - 166.278 ms
>> > PyPy - 131.104 ms
>> >
>> > 100 requests more:
>> > cPython - 165.820 ms
>> > PyPy - 115.446 ms
>> >
>> > 300 requests more:
>> > cPython - 165.543 ms
>> > PyPy - 107.636 ms
>> >
>> > 300 requests more:
>> > cPython - 166.425 ms
>> > PyPy - 103.065 ms
>>
>> Thanks for doing the benchmarks :)
>>
>> >
>> > As we can see, the JIT needs a lot of time to warm up, but once it
>> > does, the result is quite noticeable.
>> > By the way, with psycopg2 the site responds in 155.766 ms on average
>> > (only 10 ms faster), so using PyPy with Django makes much sense for me.
>> >
>> > As of now, PyPy cannot run with uWSGI, which I use in production, but
>> > maybe I'll switch to PyPy for production deployments if a "PyPy + PyPQ +
>> > some pure-python WSGI server" stack outperforms uWSGI + cPython +
>> > psycopg2. Though, having to load the page 500 times after each server
>> > reload is inconvenient.
>>
>> I've heard of people using gunicorn. Maybe that's worth a try? Loading
>> the pages is indeed annoying, but you need to load them a couple of
>> times for the results not to be noticeably slower :) We know that the
>> JIT warmup time is high, but it's partly something to fix and partly an
>> inherent property of a JIT.
>>
>
>
> FWIW I shared this internally at Canonical, and whilst people were
> impressed, there was some concern that having substantially worse
> performance for the first few hundred requests would a) be a showstopper
> and b) screw up metrics. The technique of deliberately warming up
> immediately after restart is interesting, but it would be hard to hit
> all the code paths.
>
> I realise this is inherent in the way the JIT works - but I thought it
> was a response worth sharing. I also assured them that pre-warm-up
> performance would continue to improve as PyPy improves. ;-)
>
> All the best,
>
> Michael Foord
>

The same is true for anything based on the JVM, and I've *never* seen
anyone complain about it there. The whole concept of a JIT is just "new"
to the Python world. Seriously, would anyone actually notice if the
first 50 requests after a restart took an extra 200 ms? I'm not sure;
my computer and my internet connection both have hiccups of over 200 ms.
Note that many code paths are shared between pages, which means the
speedup is shared as well.
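The deliberate-warming technique discussed above could be sketched in a few lines. This is a hypothetical helper, not code from this thread; the URL, the request count of 500 (taken from Igor's observation), and the injectable `fetch` parameter are all illustrative assumptions:

```python
# Hypothetical warm-up helper (illustrative sketch, not from this thread):
# after a server restart, issue N requests so the JIT gets a chance to
# compile the hot code paths before real traffic arrives.
try:
    from urllib.request import urlopen  # Python 3
except ImportError:
    from urllib2 import urlopen         # Python 2 (era of this thread)


def warm_up(url, n=500, fetch=None):
    """Request `url` `n` times; return how many requests succeeded.

    `fetch` defaults to a plain HTTP GET via urlopen, but can be
    swapped out (e.g. for testing without a network).
    """
    if fetch is None:
        fetch = lambda u: urlopen(u).read()
    ok = 0
    for _ in range(n):
        try:
            fetch(url)
            ok += 1
        except Exception:
            pass  # a failed warm-up request is not fatal
    return ok
```

Calling something like `warm_up("http://localhost:8000/", 500)` from a deploy script right after a restart would amortize the warm-up cost before users see it, though, as Michael notes, it only warms the code paths that particular page exercises.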
