speeding up dictionary creation

Bob van der Poel bvdpoel at uniserve.com
Thu Aug 17 21:29:15 EDT 2000


"John W. Baxter" wrote:
> 
> In article <399C4FDA.15F65A8F at uniserve.com>, Bob van der Poel
> <bvdpoel at uniserve.com> wrote:
> 
> > I've certainly NOT done an extensive test on this, but...I'm using
> > python to edit a mini-database. Each record is a dictionary entry:
> >
> >       items['foo']=...
> >       items['spam']=...
> >
> > it appears to me that when reading/parsing the initial datafile the
> > majority of the time is taken in creating new entries, not in parsing
> > the file. Is there a way to preallocate entries or otherwise speed this
> > up. On a short set of data there doesn't seem to be a problem, but as
> > the list increases in size the load time goes up in what appears to be
> > N^2.
> >
> > I assume this stems from the time needed to check that new record
> > names don't conflict with existing ones...
> 
> How often are you going to engage in "reading/parsing the initial data
> file"?
> 
> Probably not often if this is in-house.  In which case you don't care
> how long it takes...you let it run over night.  (If it takes weeks, you
> care...if it takes weeks Python dictionaries are likely the wrong tool.)
> 
> Often, perhaps, if you plan to distribute the thing.
>   --John W. Baxter   Port Ludlow, WA USA  jwbnews at scandaroon.com

Oops, I probably wasn't clear. The data is read every time the program
is started up (i.e. each time you want to use it), so anything more than
a few seconds quickly becomes unacceptable.
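For what it's worth, dictionary inserts in CPython are amortized O(1), so the
apparent N^2 growth is more likely in the parsing or other per-record work than
in the dictionary itself. One way to sidestep the per-startup parse entirely is
to cache the parsed dictionary on disk. The sketch below assumes a simple
"key=value" datafile format and hypothetical names (parse_datafile, load_items);
none of this is from the original post.

```python
import os
import pickle

def parse_datafile(path):
    """Parse 'key=value' lines into a dict (stand-in for the real parser)."""
    items = {}
    with open(path) as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue
            key, _, value = line.partition("=")
            items[key] = value
    return items

def load_items(datafile, cachefile):
    """Return the parsed dict, using a pickle cache when it is up to date."""
    if (os.path.exists(cachefile)
            and os.path.getmtime(cachefile) >= os.path.getmtime(datafile)):
        # Cache is at least as new as the datafile: skip parsing entirely.
        with open(cachefile, "rb") as f:
            return pickle.load(f)
    # Cache missing or stale: parse once, then rewrite the cache.
    items = parse_datafile(datafile)
    with open(cachefile, "wb") as f:
        pickle.dump(items, f, pickle.HIGHEST_PROTOCOL)
    return items
```

After the first run, startup cost is a single pickle.load, which is usually far
cheaper than reparsing the text file; the cache is rebuilt automatically
whenever the datafile's modification time is newer than the cache's.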



-- 
   __
  /  )      /         Bob van der Poel
 /--<  ____/__        bvdpoel at uniserve.com
/___/_(_) /_)         http://users.uniserve.com/~bvdpoel



More information about the Python-list mailing list