urllib2 slow for multiple requests

Tomas Svarovsky svarovsky.tomas at gmail.com
Fri May 15 04:57:58 EDT 2009


On May 14, 6:33 pm, "Richard Brodie" <R.Bro... at rl.ac.uk> wrote:
> "Tomas Svarovsky" <svarovsky.to... at gmail.com> wrote in message
>
> news:747b0d4f-f9fd-4fa6-bb6d-0a4365f328ba at b1g2000vbc.googlegroups.com...
>
> > This is a good point, but then it would manifest regardless of the
> > language used AFAIK. And this is not the case, ruby and php
> > implementations are working quite fine.
>
> What I meant was: not reading the data and leaving the connection
> open is going to force the server to handle all 100 requests concurrently.
> I'm guessing that's not what your other implementations do.
> What happens to the timing if you call response.read(), response.close() ?

Now I get it, but nevertheless, even when I explicitly read from the
socket and then close it properly, the timing still doesn't change.
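For illustration, a minimal sketch of the read-and-close loop being discussed (the URL, request count, and timing code are placeholders I'm assuming here, not the original benchmark):

    import time
    import urllib2

    URL = "http://example.com/"   # hypothetical target
    N = 100                       # number of sequential requests

    start = time.time()
    for _ in range(N):
        response = urllib2.urlopen(URL)
        response.read()    # drain the body so the server can finish the request
        response.close()   # release the connection explicitly
    print "%d requests in %.2f seconds" % (N, time.time() - start)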

Thanks for the advice, though.
