zlib and performance

Juan Huertas jhg at galdon.com
Thu Mar 14 16:57:33 EST 2002


I have a file with records generated using cPickle and zlib, for example:

record1=['juan',0,123.456,'Madrid',23,'Granada','Spain']

Usually, in the real case, the length of the list is 30 or 40.

My file has approximately 60,000 records.

For the archive I use a btrieve file:

f[id]=compress(dumps(record1))
f.sync()
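
For completeness, a self-contained sketch of the write path (the open call
and filename are assumed here, since the code above does not show how f was
created; bsddb.btopen() is one way to get such a keyed file on Windows):

import bsddb
from cPickle import dumps
from zlib import compress

# Assumed open call; the original only shows f being used, not created.
f = bsddb.btopen('records.db', 'c')

record1 = ['juan', 0, 123.456, 'Madrid', 23, 'Granada', 'Spain']
f['record1'] = compress(dumps(record1))   # keys and values must be strings
f.sync()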

When I need to read the whole file, I use:

r={}
for k in f.keys():
        r[k]=loads(decompress(f[k]))

and on Windows 2000 (256 MB RAM, P-III 1000 MHz) with Python 2.1.2 this
takes approximately 4 seconds:

    1 sec for retrieving f[k]
    2 sec for decompress
    1 sec for loads
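
(A sketch of how the three phases can be timed separately; this harness is
illustrative, not the original measurement code. time.clock() is the
high-resolution timer on Windows in Python 2, and f is the open file from
above:)

import time
from cPickle import loads
from zlib import decompress

t_get = t_dec = t_load = 0.0
r = {}
for k in f.keys():
    t0 = time.clock(); data = f[k];            t_get += time.clock() - t0
    t0 = time.clock(); raw = decompress(data); t_dec += time.clock() - t0
    t0 = time.clock(); r[k] = loads(raw);      t_load += time.clock() - t0

print 'retrieve %.1f s, decompress %.1f s, loads %.1f s' % (t_get, t_dec, t_load)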

If I then need to read the whole file again, right after the first pass, I proceed:

r={}  # This takes approximately 3 seconds!!! (Why?)

for k in f.keys():
        r[k]=loads(decompress(f[k]))


    1 sec for retrieving f[k]
    28 sec for decompress !!! (Why?)
    1 sec for loads
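
(To see whether decompress alone slows down on the second pass, independent
of rebuilding the dictionary, the compressed blobs can be fetched first and
zlib.decompress timed by itself; this isolation test is a suggested sketch,
not part of the measurements above:)

import time
from zlib import decompress

blobs = [f[k] for k in f.keys()]   # retrieval only, no decompression

t0 = time.clock()
for data in blobs:
    decompress(data)
print 'decompress only: %.1f s' % (time.clock() - t0)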

What is the problem? Does zlib not work correctly?

Please help.

On Windows XP this does not occur!!!
