dict is really slow for big truck

Aahz aahz at pythoncraft.com
Tue Apr 28 19:25:05 CEST 2009

In article <7f01f7b7-a561-483a-8e6d-861a8c05f92e at p6g2000pre.googlegroups.com>,
forrest yang  <Gforrest.yang at gmail.com> wrote:
>I am trying to load a big file, about 9,000,000 lines, into a dict,
>something like
>1 2 3 4
>2 2 3 4
>3 4 5 6
>d = {}
>for line in open(file):
>    arr = line.strip().split('\t')
>    d[arr[0]] = arr
>But the dict gets really slow as I load more data into memory. By the
>way, the Mac I use has 16 GB of memory.
>Is this caused by poor dict performance as it grows, or by something
>else?

Try gc.disable() before the loop and gc.enable() afterward.  The cyclic
garbage collector runs more often as the number of container objects
grows, and with ~9,000,000 lists allocated those collection passes can
dominate the load time; disabling the collector during the bulk load
avoids them.
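A minimal sketch of that pattern (`load_table` is a hypothetical helper name, not from the original post); the try/finally guarantees the collector is re-enabled even if the load raises:

```python
import gc

def load_table(path):
    """Load a tab-separated file into a dict keyed by the first column."""
    table = {}
    gc.disable()  # skip cyclic-GC passes while building the large dict
    try:
        with open(path) as f:
            for line in f:
                fields = line.strip().split('\t')
                if fields and fields[0]:
                    table[fields[0]] = fields
    finally:
        gc.enable()  # always turn collection back on afterward
    return table
```

Note that gc.disable() only turns off the cyclic collector; reference counting still frees objects as usual, so memory use is unaffected during the load.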
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair