nntplib, huge xover object
webmaster at quanta1.world--vr.com
Wed Apr 2 06:52:52 CEST 2003
I built a small script that uses the xover function in the nntplib module.
The problem I came across is that xover can return a huge tuple
when there are thousands of articles in a newsgroup (which is frequent):
testxover_resp, testxover_subs = s.xover(start, end)
On my system it's not an issue since I have 768 MB of RAM, but I have to
believe that there is a way to optimise this while keeping it simple.
How can I limit the amount of memory xover takes? The other day I did an
xover of a huge group and the python process was taking about 650 MB resident.
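One way to bound the memory (just a sketch of an idea, not something from the original script) would be to fetch the overview in fixed-size ranges rather than one giant xover call, processing and discarding each small batch. The helper below only computes the ranges; the commented-out loop is hypothetical usage assuming `s` is a connected nntplib.NNTP object and `first_art`/`last_art` are the group's article bounds:

```python
def chunk_ranges(first, last, size=500):
    """Yield (start, end) article-number pairs covering first..last
    in chunks of at most `size` articles each."""
    start = first
    while start <= last:
        end = min(start + size - 1, last)
        yield start, end
        start = end + 1

# Hypothetical usage against a live server:
# for start, end in chunk_ranges(first_art, last_art, 500):
#     resp, overviews = s.xover(str(start), str(end))
#     # process this batch, then let it be garbage-collected
```

With a chunk size of 500, only one small batch of overview data is ever alive at a time, so peak memory stays roughly constant regardless of how big the group is.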
Then I go on and put the article numbers and subjects into a massive
dictionary, which I agree is not a very intelligent way to do this, but
right now it works. I will later get rid of this huge dictionary and replace
it with something more memory efficient.
Maybe somebody can teach me a technique in Python for dealing with objects
that may take lots of memory? Maybe I could just put the tuple returned by
xover into a file (I don't have a clue how)?
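One stdlib option for the "put it in a file" idea (a sketch, using made-up sample data in place of real xover results) is the shelve module, which gives a dict-like object backed by a file on disk, so the article/subject mapping never has to live in memory all at once:

```python
import os
import shelve
import tempfile

# Hypothetical sample of (article_number, subject) pairs,
# standing in for what an xover call might return.
sample = [("101", "Re: hello"), ("102", "huge xover object")]

path = os.path.join(tempfile.mkdtemp(), "subjects")

db = shelve.open(path)
for artnum, subject in sample:
    db[artnum] = subject   # stored on disk, not in an in-memory dict
db.close()

# Reopen later and look up one subject without loading everything:
db = shelve.open(path)
print(db["102"])           # -> huge xover object
db.close()
```

The lookups cost a disk read instead of a dict hit, but the resident size of the process stays small no matter how many articles go in.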