DBM scalability

Paul Rubin
Fri Oct 21 18:32:25 CEST 2005

"George Sakkis" <gsakkis at rutgers.edu> writes:
> I'm trying to create a dbm database with around 4.5 million entries
> but the existing dbm modules (dbhash, gdbm) don't seem to cut
> it. What happens is that the more entries are added, the more time
> per new entry is required, so the complexity seems to be much worse
> than linear. Is this to be expected?

No, not expected.  Check whether you're calling something like db.keys(),
which reads every key in the db into memory, or doing anything else that
scans the whole database on each insert.
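To illustrate the point: a plain insert loop should cost roughly constant
time per entry, while calling db.keys() inside the loop scans the whole
database each time and makes the total work quadratic.  A minimal sketch
using the standard library's dbm.dumb backend (the original post used the
Python 2 dbhash/gdbm modules; the entry count here is scaled down, and the
path is a throwaway temp directory):

```python
import dbm.dumb  # pure-Python backend, always available; gdbm behaves similarly
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo_db")
db = dbm.dumb.open(path, "c")

# Good: insert entries one at a time; no full-database scans in the loop.
for i in range(1000):
    db[str(i).encode()] = b"value"

# Anti-pattern (do NOT put this inside the insert loop):
#     if some_key in db.keys(): ...   # materializes every key, O(n) per call
# A membership test on the db object itself is cheap:
assert b"500" in db

db.close()
```

Reopening the file and looking entries up by key stays cheap regardless of
size; it's only whole-database operations like keys() or items() that grow
with the entry count.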
