[pypy-dev] update (+patch) on embedding pypy

Maciej Fijalkowski fijall at gmail.com
Fri Apr 13 10:23:10 CEST 2012


On Fri, Apr 13, 2012 at 8:15 AM, Roberto De Ioris <roberto at unbit.it> wrote:

>
> > I am afraid it's not interesting.
> > I just wanted to test it with our small project - a geoip server which
> > resolves IP addresses to regions.   Now we use the bjoern WSGI server
> > to run this project.   Basically it does nothing but a bisect search
> > on an array.array.
> > When I run this code without the WSGI wrapper using pypy, it shows
> > about a 400% speed boost.
> > Unfortunately uWSGI with the pypy plugin shows me pretty much the same
> > performance as usual uWSGI.  When I wrote a simple application which
> > loops over a "for" cycle and increments a variable, I got the same
> > result.
> > I'll try running more tests to figure out what I'm doing wrong.  I am
> > a fan of your work and am looking forward to using pypy in our
> > production project.
> >
> >
>
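The setup described above, a region lookup that is essentially a bisect
search over an array.array, might look roughly like the following minimal
sketch. The data, names, and IP encoding here are invented for illustration;
the poster's actual code is not shown in the thread.

```python
import bisect
from array import array

# Hypothetical data: sorted range-start IPs (as unsigned 32-bit ints)
# with a parallel list of region names for each range.
starts = array('L', [0, 167772160, 3232235520])   # 0.0.0.0, 10.0.0.0, 192.168.0.0
regions = ['unknown', 'net-10', 'net-192']

def ip_to_int(ip):
    """Pack a dotted-quad IPv4 address into a single integer."""
    a, b, c, d = (int(x) for x in ip.split('.'))
    return (a << 24) | (b << 16) | (c << 8) | d

def lookup(ip):
    # bisect_right gives the insertion point; the range containing the
    # address starts at the entry just before it.
    i = bisect.bisect_right(starts, ip_to_int(ip)) - 1
    return regions[i]
```

This is exactly the kind of tight, pure-Python CPU-bound loop where pypy's
JIT tends to pay off.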
> Hi,
>
> when you compile libpypy, the translator should generate a pypy-c binary.
>
> Try to run your code without the wsgi wrapper with this new binary, to
> check if you still have a 400 times improvement.
>

I think he said 400%, that's not as good ;-)


>
> If you get 'slower' values over a standard binary release of pypy, it
> means you have compiled it wrongly (maybe without the jit).
>
> In my tests, 99% of webapps (being IO-bound) do not gain much speed on
> pypy, but if you have a CPU-bound webapp (like yours), you will end up
> screaming WOOOW all of the time :)
>

working on it...


>
> And take into account that in uWSGI we suggest using LuaJIT for all of
> the need-to-be-fast parts, so to make us scream you have to be faster
> than it :)
>

FYI we outperform LuaJIT on richards (given *ample* warmup; LuaJIT warms
up *so* fast). I did not measure anything else. Just sayin'


>
> I would like to suggest you another approach, widely promoted in pypy
> talks over the past years. It adds a bit of overhead, but you will end
> up with a more solid platform:
>
> delegate the CPU-bound part to a pypy daemon (listening on unix sockets
> or whatever you want) and run your public webapp with your server of
> choice running on cpython. In your webapp you simply 'enqueue' tasks to
> the pypy daemon and wait for the result. Do not make the mistake of
> using a complex protocol for it. Go line-based. No need to add the
> extra overhead I often see.
>
> If for some reason the pypy daemon crashes (come on, it happens ;), you
> can simply restart it automatically without propagating the downtime to
> your public web services (if it crashes during a request you can simply
> re-enqueue the request in the same transaction)
>
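The daemon side of the architecture Roberto describes, a pypy process
speaking a line-based protocol on a unix socket, could be sketched roughly
like this. The socket path and the idea of a pluggable handler are
assumptions for illustration, not anything from the thread.

```python
import os
import socket

SOCK_PATH = '/tmp/pypy-worker.sock'  # hypothetical socket path

def serve(handler, sock_path=SOCK_PATH):
    """Line-based protocol: one task per line in, one result per line out."""
    if os.path.exists(sock_path):
        os.unlink(sock_path)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(sock_path)
    srv.listen(5)
    while True:
        conn, _ = srv.accept()
        with conn, conn.makefile('rwb') as f:
            for line in f:
                # hand each request line to the CPU-bound handler and
                # reply with a single line
                f.write(handler(line.strip().decode()).encode() + b'\n')
                f.flush()
```

On the cpython side the webapp just connects, writes a line, and reads a
line back; if the daemon dies mid-request, reconnect and re-send the same
line, which is the re-enqueue idea from the quoted paragraph.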

I think crashes are less of an issue (pypy is relatively stable) compared
to the libraries that you have to interface with (like lxml).


>
> With this approach you can continue using bjoern for the public IO-bound
> part, and abuse pypy for the cpu-heavy one.
>
> In uWSGI this kind of approach is a lot easier (from a sysadmin
> point-of-view) as you can use the --attach-daemon option, allowing you to
> 'attach' an external process (your pypy daemon) that will be monitored
> (and respawned) automatically.
>
>
> --
> Roberto De Ioris
> http://unbit.it
>
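For reference, the --attach-daemon setup Roberto mentions would look
roughly like this in a uWSGI ini config. The paths and file names are
invented for illustration:

```ini
[uwsgi]
http = :8080
wsgi-file = app.py
; uWSGI spawns and monitors the external pypy daemon,
; respawning it automatically if it crashes
attach-daemon = pypy /srv/geoip/daemon.py
```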

Thanks for the insights Roberto! I'm really glad the uWSGI community is
interested *and* seems to be taking a reasonable, non-overhyped approach

Cheers,
fijal