<br><br><div class="gmail_quote">On 7 October 2011 18:08, Maciej Fijalkowski <span dir="ltr"><<a href="mailto:fijall@gmail.com">fijall@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div><div></div><div class="h5">On Fri, Oct 7, 2011 at 7:04 PM, Michael Foord <<a href="mailto:fuzzyman@gmail.com">fuzzyman@gmail.com</a>> wrote:<br>
><br>
><br>
> On 7 October 2011 17:50, Maciej Fijalkowski <<a href="mailto:fijall@gmail.com">fijall@gmail.com</a>> wrote:<br>
>><br>
>> On Fri, Oct 7, 2011 at 1:33 PM, Igor Katson <<a href="mailto:igor.katson@gmail.com">igor.katson@gmail.com</a>> wrote:<br>
>> > When I first started benchmarking one of my Django sites,<br>
>> > <a href="http://trip-travel.ru/" target="_blank">http://trip-travel.ru/</a> (using the postgres driver pypq),<br>
>> > I was disappointed by the results. PyPy was visibly much slower than<br>
>> > cPython (I just looked at how the site loads in the browser).<br>
>> ><br>
>> > But today I got some incredible results: I finally got PyPy to work<br>
>> > faster<br>
>> > than cPython, and found out that it got faster after loading the page<br>
>> > several hundred times with "ab" or "siege".<br>
>> ><br>
>> > Here are the results for mean response time, querying the home page with<br>
>> > apache's "ab" (cPython 2.7.2, Django 1.3, PyPy 1.6.0), served with the<br>
>> > cherrypy<br>
>> > wsgi server:<br>
>> ><br>
>> > After 10 requests (excluding the first request):<br>
>> > cPython - 163.529 ms<br>
>> > PyPy - 460.879 ms<br>
>> ><br>
>> > 50 requests more:<br>
>> > cPython - 168.539 ms<br>
>> > PyPy - 249.850 ms<br>
>> ><br>
>> > 100 requests more:<br>
>> > cPython - 166.278 ms<br>
>> > PyPy - 131.104 ms<br>
>> ><br>
>> > 100 requests more:<br>
>> > cPython - 165.820 ms<br>
>> > PyPy - 115.446 ms<br>
>> ><br>
>> > 300 requests more:<br>
>> > cPython - 165.543 ms<br>
>> > PyPy - 107.636 ms<br>
>> ><br>
>> > 300 requests more:<br>
>> > cPython - 166.425 ms<br>
>> > PyPy - 103.065 ms<br>
>><br>
>> Thanks for doing the benchmarks :)<br>
>><br>
>> ><br>
>> > As we can see, the JIT needs a lot of time to warm up, but once it does,<br>
>> > the result is pretty noticeable.<br>
>> > By the way, with psycopg2, the site responds in 155.766 ms on average<br>
>> > (only<br>
>> > 10 ms faster), so using PyPy with Django makes a lot of sense to me.<br>
>> ><br>
>> > For now, PyPy cannot run with uWSGI, which I use in production, but<br>
>> > maybe<br>
>> > I'll switch to PyPy for production deployments if a "PyPy + PyPQ + some<br>
>> > pure<br>
>> > python WSGI server" stack outperforms (uWSGI + cPython + psycopg2).<br>
>> > Though, having to load the page 500 times after each server reload is<br>
>> > not<br>
>> > convenient.<br>
>><br>
>> I've heard of people using gunicorn. Maybe that's worth a try? Loading<br>
>> the pages is indeed annoying, but you need to load them a couple of<br>
>> times for the results not to be noticeably slower :) We kind of know<br>
>> that the JIT warmup time is high, but it's partly a thing to fix and<br>
>> partly an inherent property of the JIT.<br>
>><br>
><br>
><br>
> FWIW I shared this internally at Canonical, and whilst people were impressed,<br>
> there was some concern that having substantially worse performance for the<br>
> first few hundred requests would a) be a showstopper and b) screw up<br>
> metrics. The technique of deliberately warming up immediately after a restart<br>
> is interesting, but it would be hard to hit all the code paths.<br>
><br>
> I realise this is inherent in the way the JIT works - but I thought it was a<br>
> response worth sharing. I also assured them that pre-warm-up performance<br>
> would continue to improve as PyPy improves. ;-)<br>
><br>
> All the best,<br>
><br>
> Michael Foord<br>
><br>
<br>
</div></div>It's also true for anything based on the JVM, and I've *never* seen anyone<br>
complain about it. The whole concept of a JIT is just "new" to the Python<br>
world. Seriously, would anyone actually notice if the first 50 requests<br>
after a restart took an extra 200 ms? I'm not sure; my computer and my<br>
internet connection both have hiccups of over 200 ms. Note that many<br>
code paths are common, which means the speed-up is also shared.<br>
</blockquote></div><br>I think you have a point and I've added your response to the internal discussion.<br><br>Michael<br clear="all"><br>-- <br><pre cols="72"><a href="http://www.voidspace.org.uk/" target="_blank">http://www.voidspace.org.uk/</a><br>
<br>May you do good and not evil<br>May you find forgiveness for yourself and forgive others<br>May you share freely, never taking more than you give.<br>-- the sqlite blessing <a href="http://www.sqlite.org/different.html" target="_blank">http://www.sqlite.org/different.html</a></pre>
<br>
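<br>The deliberate warm-up pass discussed in the thread above could also be scripted rather than done by hand with "ab" or "siege". The sketch below is only an illustration: the <code>warm_up</code> helper, the URL list, and the round count are hypothetical placeholders, not something from the thread.<br>

```python
import time
from urllib.request import urlopen


def warm_up(urls, rounds=500, fetch=lambda url: urlopen(url).read()):
    """Hit each URL `rounds` times so the JIT traces the hot code
    paths before real traffic arrives.  Returns the mean time per
    round in milliseconds, so warm vs. cold timings can be compared.

    `urls` and `rounds` are placeholders; `fetch` is injectable so
    the helper can be exercised without a running server.
    """
    timings = []
    for _ in range(rounds):
        start = time.time()
        for url in urls:
            fetch(url)
        timings.append(time.time() - start)
    return 1000.0 * sum(timings) / len(timings)
```

A worker could run something like <code>warm_up(["http://localhost:8000/"], rounds=500)</code> after each restart, before being put back into the load balancer's rotation; as noted above, shared code paths mean each warmed URL also helps the others.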