[pypy-dev] Benchmarking PyPy performance on real-world Django app

Michael Foord fuzzyman at gmail.com
Fri Oct 7 19:04:17 CEST 2011


On 7 October 2011 17:50, Maciej Fijalkowski <fijall at gmail.com> wrote:

> On Fri, Oct 7, 2011 at 1:33 PM, Igor Katson <igor.katson at gmail.com> wrote:
> > When I first started benchmarking one of my Django sites,
> > http://trip-travel.ru/ (using the pypq PostgreSQL driver),
> > I was disappointed by the results. PyPy was visibly much slower than
> > cPython (I just looked at how the site loaded in the browser).
> >
> > But today I got some incredible results: I finally got PyPy to run faster
> > than cPython, and found out that it only got faster after loading the page
> > several hundred times with "ab" or "siege".
> >
> > Here are the results for mean response time, querying the home page with
> > apache's "ab" (cPython 2.7.2, Django 1.3, PyPy 1.6.0), served with the
> > CherryPy WSGI server:
> >
> > After 10 requests (excluding the first request):
> > cPython - 163.529 ms
> > PyPy - 460.879 ms
> >
> > 50 requests more:
> > cPython - 168.539 ms
> > PyPy - 249.850 ms
> >
> > 100 requests more:
> > cPython - 166.278 ms
> > PyPy - 131.104 ms
> >
> > 100 requests more:
> > cPython - 165.820 ms
> > PyPy - 115.446 ms
> >
> > 300 requests more:
> > cPython - 165.543 ms
> > PyPy - 107.636 ms
> >
> > 300 requests more:
> > cPython - 166.425 ms
> > PyPy - 103.065 ms
>
> Thanks for doing the benchmarks :)
>
> >
> > As we can see, the JIT needs a lot of time to warm up, but when it does,
> > the result is pretty noticeable.
> > By the way, with psycopg2 the site responds in 155.766 ms on average (only
> > 10 ms faster), so using PyPy with Django makes a lot of sense for me.
> >
> > For now, PyPy cannot run with uWSGI, which I use in production, but maybe
> > I'll switch to PyPy for production deployments if the "PyPy + PyPQ + some
> > pure-Python WSGI server" stack outperforms uWSGI + cPython + psycopg2.
> > Having to load the page 500 times after each server reload is not
> > convenient, though.
>
> I've heard of people using gunicorn. Maybe that's worth a try? Loading
> the pages is indeed annoying, but you only need to load them a couple of
> times for the results not to be noticeably slower :) We know the JIT
> warmup time is high, but it's partly something to fix and partly an
> inherent property of the JIT.
>
>
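(As an aside, the "pure-Python WSGI server" part that Igor mentions is easy to
try. Here is a minimal, untested sketch using CherryPy's bundled WSGI server
with a Django 1.3-style WSGIHandler - the settings module name below is just a
placeholder:)

    # serve.py - rough sketch, not tested; module and settings names are placeholders
    import os
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")

    from django.core.handlers.wsgi import WSGIHandler
    from cherrypy import wsgiserver

    # Wrap the Django application in CherryPy's pure-Python WSGI server
    application = WSGIHandler()
    server = wsgiserver.CherryPyWSGIServer(("0.0.0.0", 8000), application)
    try:
        server.start()
    except KeyboardInterrupt:
        server.stop()
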

FWIW I shared this internally at Canonical, and whilst people were impressed,
there was some concern that having substantially worse performance for the
first few hundred requests would a) be a showstopper and b) screw up
metrics. The technique of deliberately warming the server immediately after a
restart is interesting, but it would be hard to hit all the code paths.
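
As a rough illustration of what such a warm-up step could look like (the base
URL, paths, and request count below are made up; you would substitute the hot
paths of your own app), something like this could be run after each restart:

    # warm_up.py - hypothetical warm-up script; URL, paths and counts are made up
    import urllib2

    BASE = "http://localhost:8000"        # assumed address of the freshly restarted server
    PATHS = ["/", "/trips/", "/search/"]  # placeholder paths; list your own hot code paths
    ROUNDS = 500                          # roughly the number of requests quoted above

    for _ in range(ROUNDS):
        for path in PATHS:
            try:
                urllib2.urlopen(BASE + path).read()
            except urllib2.URLError:
                pass  # errors during warm-up are not interesting

Even so, a script like this only exercises whatever paths you remember to list,
which is exactly the concern above.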

I realise this is inherent in the way the JIT works - but I thought it was a
response worth sharing. I also assured them that pre-warm-up performance
would continue to improve as PyPy improves. ;-)

All the best,

Michael Foord


> Cheers,
> fijal
> _______________________________________________
> pypy-dev mailing list
> pypy-dev at python.org
> http://mail.python.org/mailman/listinfo/pypy-dev
>



-- 

http://www.voidspace.org.uk/

May you do good and not evil
May you find forgiveness for yourself and forgive others
May you share freely, never taking more than you give.
-- the sqlite blessing http://www.sqlite.org/different.html