Python threading?

Thomas Weholt 2002 at
Thu Sep 26 09:54:22 EDT 2002

This might be just a stupid question, but I'm trying to implement a
filesharing app based on the BaseHTTPServer in the standard Python distro.
There will of course be heavy I/O processing involved. Will threading still
be a bad idea? The project is aimed at small user groups (I don't think one
person will have more than, say, 5-10 concurrent connections at a time, most
of them downloading files 1 MB+ in size); it's not the new Gnutella/Napster,
but I just can't get the hang of this threading vs. asyncore deal. It seems
as if threading is a very bad idea no matter what, yet it's used almost
everywhere (I've got no statistics to back up that statement ;-) ).

Any info or opinions on the best way to implement a filesharing app in
Python (threaded, async, raw sockets, etc.) would be of interest to me.
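For what it's worth, here is a minimal sketch of the threaded approach using the stdlib ThreadingMixIn, so each download runs in its own thread. Module names are the Python 3 ones (http.server, socketserver); the older spellings were BaseHTTPServer and SocketServer. The smoke test at the end is just illustrative.

```python
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler
from socketserver import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    """Handle each request in its own thread."""
    daemon_threads = True  # don't block interpreter exit on open downloads

# Bind to an ephemeral port and serve the current directory in the background.
server = ThreadedHTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
host, port = server.server_address
threading.Thread(target=server.serve_forever, daemon=True).start()

# Quick smoke test: fetch the directory listing.
with urllib.request.urlopen("http://%s:%d/" % (host, port)) as resp:
    status = resp.status
server.shutdown()
server.server_close()
print(status)  # 200
```

Since the handler threads spend nearly all their time on socket and file I/O, they release the GIL while blocked, which is exactly the case where Python threads pay off.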

NB! It would also be nice to be able to limit the number of threads
(actually, the number of concurrent connections) on the server, so if
anybody has comments on a way of doing that while still handling requests
gracefully, that would be great.

Best regards,

"Aahz" <aahz at> wrote in message
news:amogmi$r1l$1 at
> In article <f9Pi9.10275$Lg2.2191294 at>,
> Robert Oschler <Oschler at> wrote:
> >
> >Also, what is a reasonable number of threads to expect to be able to run
> >before context switching overhead becomes a problem (I'm using a PIII 500
> >Mhz with 512MB ram if that helps).
> It depends.  For pure Python-based computation, more than one thread
> will cost you more than you gain.  If it's almost pure I/O (file or
> socket stuff), you might be able to use as many as a couple of hundred
> threads.  Mostly, you're probably best off somewhere between five and
> thirty threads, depending on what you're doing.
> --
> Aahz (aahz at           <*>
> Project Vote Smart:
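The "five to thirty threads" advice in the quoted reply maps naturally onto a fixed worker pool: a bounded number of threads pulling jobs from a queue, so the thread count never grows with load. A minimal sketch of that pattern (the pool size and the doubling "work" are placeholders for real I/O jobs):

```python
import queue
import threading

NUM_WORKERS = 5  # fixed pool size, per the quoted advice
jobs = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        item = jobs.get()
        if item is None:  # sentinel: shut this worker down
            break
        with results_lock:
            results.append(item * 2)  # stand-in for real I/O work

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for n in range(10):
    jobs.put(n)
for _ in threads:
    jobs.put(None)  # one sentinel per worker
for t in threads:
    t.join()
print(sorted(results))  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Because the pool size is fixed up front, context-switching overhead stays bounded no matter how many jobs are queued.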

More information about the Python-list mailing list