multithreading app memory consumption
Bryan Olson
fakeaddress at nowhere.org
Tue Oct 24 01:32:08 EDT 2006
Roman Petrichev wrote:
> Hi folks.
> I've just run into a very nasty memory consumption problem.
> I have a multithreaded app with 150 threads
[...]
>
> The test app code:
>
>
> Q = Queue.Queue()
> for i in rez: #rez length - 5000
>     Q.put(i)
>
>
> def checker():
>     while True:
>         try:
>             url = Q.get()
>         except Queue.Empty:
>             break
>         try:
>             opener = urllib2.urlopen(url)
>             data = opener.read()
>             opener.close()
>         except:
>             sys.stderr.write('ERROR: %s\n' % traceback.format_exc())
>             try:
>                 opener.close()
>             except:
>                 pass
>             continue
>         print len(data)
>
>
> for i in xrange(150):
>     new_thread = threading.Thread(target=checker)
>     new_thread.start()
Don't know if this is the heart of your problem, but there's no
limit to how big "data" could be, after
data = opener.read()
Furthermore, you keep it until "data" gets over-written the next
time through the loop. You might try restructuring checker() to
make data local to one iteration, as in:
def checker():
    while onecheck():
        pass

def onecheck():
    try:
        url = Q.get(False)  # non-blocking, so Queue.Empty can actually be raised
    except Queue.Empty:
        return False
    try:
        opener = urllib2.urlopen(url)
        data = opener.read()
        opener.close()
        print len(data)
    except:
        sys.stderr.write('ERROR: %s\n' % traceback.format_exc())
        try:
            opener.close()
        except:
            pass
    return True
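To see why scoping "data" to one call matters: in CPython, a local keeps its
object alive until it is rebound or its frame exits, so the last blob read in
a long-running loop stays pinned in memory. Here is a small self-contained
sketch (modern Python 3 syntax; the `Blob` class and its size are made up for
illustration) that uses weak references to observe when each blob is freed:

```python
import gc
import weakref

class Blob:
    """Stand-in for the large string returned by opener.read()."""
    def __init__(self):
        self.payload = bytearray(10 ** 6)

def loop_local(n):
    # Bind each blob to a local that lives as long as the function,
    # like 'data' in the original checker().
    refs = []
    for _ in range(n):
        data = Blob()
        refs.append(weakref.ref(data))
    gc.collect()
    # The final blob is still pinned by the lingering 'data' local.
    return [r() is not None for r in refs]

def one_step():
    data = Blob()            # dies when this frame returns
    return weakref.ref(data)

def per_call(n):
    refs = [one_step() for _ in range(n)]
    gc.collect()
    return [r() is not None for r in refs]

print(loop_local(3))  # [False, False, True]  -- last blob still held
print(per_call(3))    # [False, False, False] -- freed as each call returned
```

Multiply that lingering blob by 150 threads and the difference adds up.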
--
--Bryan