urllib2 slow for multiple requests
svarovsky.tomas at gmail.com
Thu May 14 11:33:29 CEST 2009
One more thing: since I am stuck with 2.4 (and if this really is a 2.4
issue), is there some substitute for urllib2?
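One stdlib option that was already there in 2.4 is httplib, which also lets you reuse a single connection across requests instead of paying connection setup every time. A minimal sketch (Python 3 spells the module http.client; a throwaway local server stands in for the real page, so the names `_Handler`, `bodies`, and the port are just illustration):

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, HTTPServer

class _Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive, so one connection serves all requests

    def do_GET(self):
        body = b"page"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Throwaway local server on a random free port.
server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One persistent connection, several requests over it.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
bodies = []
for _ in range(3):
    conn.request("GET", "/")
    # read() must finish before the next request can go out on this connection
    bodies.append(conn.getresponse().read())
conn.close()
server.shutdown()
print(bodies)
```

Whether this sidesteps the 2.4 slowdown is untested here; it only shows that httplib gives you the lower-level control urllib2 hides.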
On May 14, 11:00 am, Tomas Svarovsky <svarovsky.to... at gmail.com> wrote:
> On May 13, 4:55 pm, cgoldberg <cgoldb... at gmail.com> wrote:
> > > Basically it just grabs a page xy
> > > times and tells me how long it took.
> > you aren't doing a read(), so technically you are just connecting to
> > the web server and sending the request but never reading the content
> > back from the socket. So your timing wouldn't be accurate.
> > try this instead:
> > response = urllib2.urlopen(req).read()
> > But that is not the problem you are describing...
> Thanks for this pointer, didn't come to my mind.
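A complete timing harness with the read() included might look like the sketch below (Python 3 spells urllib2 as urllib.request; a throwaway local server replaces the real page, so the timings and names are illustrative, not a reproduction of the original benchmark):

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def benchmark(url, repetitions):
    """Time `repetitions` full GET requests, reading each response body."""
    start = time.time()
    for _ in range(repetitions):
        # .read() matters: without it only the request is sent and timed,
        # and the response body is never pulled off the socket.
        urllib.request.urlopen(url).read()
    return time.time() - start

# Throwaway local server on a random free port.
server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

elapsed = benchmark(url, 100)
print("100 requests in %.3f s (%.1f ms each)" % (elapsed, elapsed * 10))
server.shutdown()
```

If the per-request time grows with the repetition count on 2.4 but stays flat on 2.6, that points at the runtime rather than the benchmark.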
> > > when I increase the number of repetitions, it is
> > > slowing down considerably (1 is like 3 ms, 100 takes 6 seconds).
> > > Maybe it is a known issue in urllib2
> > I ran your code and can not reproduce that behavior. No matter how
> > many repetitions, I still get a similar response time per transaction.
> > any more details or code samples you can provide?
> I don't know. I have tried the program on my local Mac OS, where I have
> several Python runtimes installed, and there is a huge difference between
> the results under 2.6 and 2.4. So this might be the problem.
> When run on 2.6, the results are comparable to PHP and better than Ruby,
> which is what I expect.
> The problem is that CentOS is running on the server and only 2.4 is
> available. On which version did you run these tests?
> > -Corey Goldberg