[Chicago] Facebook open sources FriendFeed's real-time Python web framework, Tornado
Garrett Smith
g at rre.tt
Sat Sep 19 01:41:15 CEST 2009
On Fri, Sep 18, 2009 at 3:03 PM, Massimo Di Pierro
<mdipierro at cs.depaul.edu> wrote:
> I can contribute one more:
>
> http://web2py.com/examples/static/web2pyserver.py
>
> - api compatible with cherrypy and very much inspired by it.
> - works with cherrypy ssl_handler (to be tested) but will soon have its own.
> - multithreaded
> - can handle requests and responses via chunking (like cherrypy) (but not
> tested yet!)
> - should work with python 3 (but not tried yet!)
> - 30-50% faster than cherrypy in my tests.
>
> I could use some independent tests and benchmarks.
I've confirmed these results. The new web2py WSGI server is a playah!
web2pyserver is on par with CherryPy at lowish levels of concurrency
(< 1,000) but is far better at handling very high levels of concurrent
requests (> 10,000). I was skeptical of Massimo's 50%-faster claim,
but the difference really does show up at these very high levels.
It's *very* comparable, per my benchmarks, to Tornado. And it's threaded.
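In case anyone wants to reproduce the numbers: since Massimo says the
server is API compatible with cherrypy's wsgiserver, a
CherryPyWSGIServer-style call should be all it takes. This is just a
sketch of my setup; the web2pyserver class name below is my guess based
on the compatibility claim, so check the module for the real one.

def hello_app(environ, start_response):
    # Trivial WSGI app so the benchmark measures the server, not the app.
    body = 'hello world\n'
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]

if __name__ == '__main__':
    # Swap this import for cherrypy.wsgiserver.CherryPyWSGIServer to
    # benchmark the two side by side.
    from web2pyserver import HTTPServer  # hypothetical name, check the module
    server = HTTPServer(('0.0.0.0', 8080), hello_app)
    try:
        server.start()
    except KeyboardInterrupt:
        server.stop()

Then point something like ab or httperf at it, e.g.
ab -n 100000 -c 1000 http://127.0.0.1:8080/, and watch the latency
distribution as you crank up the concurrency.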
I hope word gets out about this.
As folks have been saying here for a while now -- it's really not a
good idea to plunge into async/event concurrency models for web apps.
Hell, even crazy highly concurrent apps like instant messaging or
simulators can probably get by perfectly well using threads, provided
the stack size is kept reasonable.
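To put a number on "reasonable": the stdlib lets you cap per-thread
stacks before any workers are spawned, so a few thousand threads don't
reserve gigabytes of address space. A minimal sketch (nothing
web2pyserver-specific here, just plain threading):

import threading

# Cap each new thread's stack at 256 KB.  The default reservation on many
# platforms is several MB per thread.  This must be called before the
# threads are created; the minimum accepted value is 32 KB.
threading.stack_size(256 * 1024)

def handle(n):
    # Stand-in for a per-connection handler.
    pass

threads = [threading.Thread(target=handle, args=(i,)) for i in range(1000)]
for t in threads:
    t.start()
for t in threads:
    t.join()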
If you need a hundred thousand concurrent, long-running processes,
fine. But you'll probably want something like Stackless anyway. Or
Erlang :)
Nice work Massimo!
P.S. The LGPL is kind of a pain for people who want to throw this
module into their source tree and not worry about the particulars of
that license. Massimo, if you plan to keep this as a single module
(hope so), would you consider an alternative or dual license?