[python-uk] urllib latency
matth at netsight.co.uk
Mon Dec 20 14:54:13 CET 2010
OK, here's a non-job-related post...
Anyone know why urllib.urlopen() can be so much slower than using ab to do the same thing? I seem to be getting an extra 100ms of latency on a simple HTTP GET request for a small, static image.
>>> from time import time
>>> from urllib import urlopen
>>> for x in range(10):
...     t1 = time(); data = urlopen('http://example.com/kb-brain.png').read(); t2 = time(); print t2 - t1
versus, on the same machine:
dhcp90:funkload netsight$ ab -n10 -c1 http://example.com/kb-brain.png
Concurrency Level: 1
Time taken for tests: 0.309 seconds
Complete requests: 10
Failed requests: 0
Write errors: 0
Total transferred: 31870 bytes
HTML transferred: 28990 bytes
Requests per second: 32.32 [#/sec] (mean)
Time per request: 30.942 [ms] (mean)
Time per request: 30.942 [ms] (mean, across all concurrent requests)
Transfer rate: 100.59 [Kbytes/sec] received
I've tried it repeatedly and get consistent results. The server under test is a cluster of Plone instances behind haproxy. The client and server are connected via a fairly lightly loaded 100Mbit network.
I've tried taking out the read() call, and it's still the same... I've also tried urllib2, with pretty much the same result.
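One way to narrow down where the extra 100ms goes is to time the phases of the request separately: name resolution, TCP connect, and the HTTP round-trip itself. Below is a minimal, self-contained sketch of that idea using raw sockets; the one-shot local server here is just a stand-in for the real Plone/haproxy setup, so the URL, path, and server are assumptions, not the actual test target.

```python
# Sketch: split a GET request into DNS, connect, and request phases,
# against a throwaway local server standing in for the real one.
import socket
import threading
import time

def serve_once(srv):
    # Accept a single connection and send back a tiny HTTP response.
    conn, _ = srv.accept()
    conn.recv(4096)
    conn.sendall(b"HTTP/1.0 200 OK\r\nContent-Length: 2\r\n\r\nok")
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # OS picks a free port
srv.listen(1)
host, port = srv.getsockname()
t = threading.Thread(target=serve_once, args=(srv,))
t.start()

t0 = time.time()
socket.getaddrinfo(host, port)               # name-resolution phase
t1 = time.time()
s = socket.create_connection((host, port))   # TCP connect phase
t2 = time.time()
s.sendall(b"GET /kb-brain.png HTTP/1.0\r\nHost: localhost\r\n\r\n")
body = b""
while True:                                  # read until server closes
    chunk = s.recv(4096)
    if not chunk:
        break
    body += chunk
t3 = time.time()
s.close()
t.join()

print("dns=%.1fms connect=%.1fms request=%.1fms" % (
    (t1 - t0) * 1000, (t2 - t1) * 1000, (t3 - t2) * 1000))
```

If the raw-socket version against the real server doesn't show the extra 100ms but urlopen() does, the overhead is somewhere in urllib's own handling rather than on the wire; if the DNS phase dominates, it's a resolver issue.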
Matt Hamilton matth at netsight.co.uk
Netsight Internet Solutions, Ltd. Business Vision on the Internet
http://www.netsight.co.uk +44 (0)117 9090901
Web Design | Zope/Plone Development and Consulting | Co-location | Hosting