Possible memory leak?

Tuvas tuvas21 at gmail.com
Wed Jan 25 12:38:04 EST 2006


FYI, to all who asked: I was indeed just monitoring the system
memory. I changed my approach to one that builds up lists and joins
the strings together once at the end, and that seems to have improved
the speed. However, for some reason it still takes a small eternity to
process on one computer and not on the other, and the oddest thing of
all is that the slower one is the better machine. Both are running
Python 2.4 on Linux.

The modified version of my code is as follows. (Note: a few small
changes have been made to simplify things, but they don't apply to a
full-scale picture, so they shouldn't slow anything down in the
slightest.)

import time

def load_pic_data(width, height, inpdat, filt=True):
    ldata = []
    total = 0    # initialised here; used further down in the
    tnum = 0     # full function (not shown)
    size = 100
    for y in range(height):
        row = []
        ts = time.time()
        for x in range(width):
            # each pixel is a 16-bit little-endian value in inpdat
            index = 2 * (x + y * width)
            num = ord(inpdat[index + 1]) * 256 + ord(inpdat[index])
            # optional subtraction filter; vfilter and d_filter come
            # from elsewhere in the program
            if filt and d_filter and vfilter.get():
                num = int(round(num - d_filter[index / 2]))
            # clamp to 14 bits, then scale down to one byte
            if num < 0:
                num = 0
            if num > 255 * 64:
                num = 255 * 64
            row.append(chr(num / 64))
        ldata.append(''.join(row))
        print y, time.time() - ts
    data = ''.join(ldata)

There is considerably more to the function, but I've traced the
slowest part to this loop.
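
In case anyone wants to try this on another machine, here is how the
function can be driven standalone (a sketch, not part of my actual
program; the stub below merely stands in for the real vfilter, and no
filter data is loaded, so the filter branch is skipped):

# Standalone repro sketch: feed the function a synthetic 1024x1024
# buffer of 16-bit little-endian pixels.
class StubVar:            # stands in for the real vfilter object
    def get(self):
        return 0          # filtering switched off

vfilter = StubVar()
d_filter = None           # no filter data loaded

width = height = 1024
inpdat = '\x00\x01' * (width * height)   # every pixel decodes to 256
load_pic_data(width, height, inpdat)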
Note the statement "print y, time.time() - ts". A few of the outputs
for the 1024x1024 image are as follows (row number, then seconds for
that row):

  1    0.0633
  2    0.07005
  3    0.06698
 20    0.07925
 30    0.08410
100    0.16255
200    0.270895
500    0.59182
900    1.06439
Note that at the time I wrote this, row 900 was the highest available.
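
If the per-row time really grows linearly with the row number like
this, the total for the whole image is quadratic in the number of
rows, not linear. A back-of-the-envelope check, fitting just the two
endpoints above:

# Rough linear fit through rows 1 and 900 of the timings above.
a = 0.0633                        # seconds for row 1
b = (1.06439 - 0.0633) / 899.0    # extra seconds per additional row
total = sum(a + b * y for y in range(1024))
print total   # roughly 650 seconds for all 1024 rows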

Note that the value is consistent when a few nearby rows are
compared, but over time it increases until the program is running
very slowly. For some reason it does this on one computer and not the
other, and I believe the two computers have identical Python
configurations, i.e., the same libraries, version, etc. Both run the
same flavor of Linux as well. I am no longer joining the strings one
at a time, only once at the end. What could be the source of such an
odd problem? I understand that the large image will take longer to
process, but I would think the relationship should be more or less
linear in the image size, not superlinear like the row timings above
suggest. Thanks for all of the help so far!
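
P.S. As an aside, if the per-pixel ord() calls themselves turn out to
be part of the cost, the whole buffer can be decoded in one shot with
the struct module. A minimal sketch of the unfiltered path only,
assuming inpdat really is a flat string of little-endian 16-bit
samples:

import struct

def load_pic_data_struct(width, height, inpdat):
    # Unpack every 16-bit little-endian sample in a single call
    # instead of two ord() lookups per pixel.
    pixels = struct.unpack('<%dH' % (width * height), inpdat)
    # Divide by 64 and clamp to 255, mirroring the loop above.
    return ''.join([chr(min(p / 64, 255)) for p in pixels])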



