very large dictionary
Simon.Strobl at gmail.com
Tue Aug 5 10:20:08 CEST 2008
> Have you considered that the operating system imposes per-process limits
> on memory usage? You say that your server has 128 GB of memory, but that
> doesn't mean the OS will make anything like that available.
According to our system administrator, I can use all of the 128G.
> > I thought it would be practical not to create the
> > dictionary from a text file each time I needed it. I.e. I thought
> > loading the .pyc-file should be faster. Yet, Python failed to create a
> > .pyc-file
> Probably a good example of premature optimization.
Well, as I was using Python, I did not expect to have to care that much
about the language's internal affairs. I thought I could simply do the
same thing no matter how large my files got. In other words, I thought
Python was really scalable.
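For what it's worth, a common workaround for this situation is to serialize the dictionary with pickle instead of importing it as a .py/.pyc module; the module approach forces the compiler to parse one enormous literal, while pickle just streams the data. A minimal sketch (the file name and the tiny sample dictionary are placeholders, not the actual data from this thread):

```python
import pickle

def save_dict(d, path):
    # Serialize the dict in binary form; avoids compiling a huge .py literal.
    with open(path, "wb") as f:
        pickle.dump(d, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_dict(path):
    with open(path, "rb") as f:
        return pickle.load(f)

# Hypothetical tiny example; the real dictionary here is far larger.
freqs = {"the": 1000, "of": 600}
save_dict(freqs, "freqs.pkl")
restored = load_dict("freqs.pkl")
```

Loading a pickle this way is typically much faster than re-parsing a giant source file, though for data this size an on-disk store (e.g. dbm or a database) may scale better still.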
> Out of curiosity, how
> long does it take to create it from a text file?
I do not remember this exactly. But I think it was not much more than
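If one wanted to measure it, building the dictionary from a text file can be timed directly; a hedged sketch, assuming a hypothetical "word<TAB>count" format (the real file's layout is not given in the thread):

```python
import time

# Write a tiny stand-in input file; the actual data set is far larger.
with open("wordlist.txt", "w") as f:
    f.write("the\t1000\nof\t600\n")

start = time.perf_counter()
d = {}
with open("wordlist.txt") as f:
    for line in f:
        word, count = line.split("\t")
        d[word] = int(count)
elapsed = time.perf_counter() - start
print(f"built {len(d)} entries in {elapsed:.3f}s")
```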