gsakkis at rutgers.edu
Fri Oct 21 03:54:52 CEST 2005
I'm trying to create a dbm database with around 4.5 million entries, but the existing dbm modules
(dbhash, gdbm) don't seem to cut it. What happens is that the more entries are added, the more time
each new entry takes, so the complexity seems to be much worse than linear. Is this to be
expected, and if so, should I expect better performance (i.e. linear or almost linear) from a real
database, e.g. sqlite?
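For what it's worth, a rough sketch of the sqlite approach (using Python's sqlite3 module; the table name and schema here are just illustrative): wrapping the whole batch in a single transaction and using executemany avoids paying a commit per entry, which tends to keep the per-insert cost roughly flat as the table grows.

```python
import sqlite3

# Bulk-load key/value pairs into SQLite inside one transaction.
conn = sqlite3.connect(":memory:")  # use a file path for a persistent db
conn.execute("CREATE TABLE kv (key TEXT PRIMARY KEY, value TEXT)")

# Generator so the full data set never has to sit in memory at once.
entries = ((f"key{i}", f"value{i}") for i in range(100_000))

with conn:  # one transaction for the entire batch
    conn.executemany("INSERT INTO kv VALUES (?, ?)", entries)

count = conn.execute("SELECT COUNT(*) FROM kv").fetchone()[0]
print(count)  # → 100000
```

Whether this stays near-linear at 4.5 million entries would have to be measured, but the single-transaction pattern is the usual first thing to try.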