dict would be very slow for big data

Steve Howell showell30 at yahoo.com
Tue May 12 00:07:44 EDT 2009

On May 11, 8:28 pm, forrest yang <Gforrest.y... at gmail.com> wrote:
> hi
> i am trying to insert a lot of data into a dict, which may reach the
> 10,000,000-entry level.
> after inserting 100,000 units, the insert rate becomes very slow, about
> 50,000/s, and the entire time used for this task would be very long also.
> would anyone know a solution for this case?

Are you running out of memory?  What are your keys?  Are you able to
gather any more specific data about the slowdown--do all operations
slow down equally or are there spurts of slowness?
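One minimal way to gather that data is to time the inserts in batches and print the rate for each batch, so a gradual slowdown or isolated spurts of slowness become visible. The batch size, key type (plain integers), and totals below are illustrative assumptions, not taken from the original post:

```python
import time

def time_dict_inserts(total=1_000_000, batch=100_000):
    """Insert `total` integer keys into a dict, reporting the
    insertion rate for each batch so slow spurts stand out.
    (Integer keys are an assumption for illustration.)"""
    d = {}
    rates = []
    for i in range(0, total, batch):
        t0 = time.perf_counter()
        for k in range(i, i + batch):
            d[k] = None
        dt = time.perf_counter() - t0
        rates.append(batch / dt)
        print(f"{i + batch:>12,} keys: {batch / dt:,.0f} inserts/s")
    return d, rates
```

If every batch is slower than the last, memory pressure (swapping) is a likely culprit; if most batches are fast with occasional slow ones, the pauses may coincide with the dict resizing itself as it grows.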
