why does memory consumption keep growing?
breamoreboy at gmail.com
Thu Oct 5 22:31:43 EDT 2017
On Thursday, October 5, 2017 at 10:07:05 PM UTC+1, Fetchinson . wrote:
> Hi folks,
>
> I have a rather simple program which cycles through a bunch of files,
> does some operation on them, and then quits. There are 500 files
> involved and each operation takes about 5-10 MB of memory. As you'll
> see I tried to make every attempt at removing everything at the end of
> each cycle so that memory consumption doesn't grow as the for loop
> progresses, but it still does.
>
> import os
>
> for f in os.listdir( '.' ):
>
>     x = [ ]
>
>     for ( i, line ) in enumerate( open( f ) ):
>
>         import mystuff
>         x.append( mystuff.expensive_stuff( line ) )
>         del mystuff
>
>     import mystuff
>     mystuff.some_more_expensive_stuff( x )
>     del mystuff
>     del x
>
> What can be the reason? I understand that mystuff might be leaky, but
> if I delete it, doesn't that mean that whatever memory was allocated
> is freed? Similarly, x is deleted so that can't possibly make the memory
> consumption go up.
>
> Any hint would be much appreciated,
> Daniel
>
> --
> Psss, psss, put it down! - http://www.cafepress.com/putitdown
Nothing stands out, so I'd start by closing all the file handles. You also don't need the call to `enumerate`, as you never use the `i`, so something like:-
with open(f) as g:
    for line in g:
        ...
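
Putting it together, the whole loop might look something like this (a sketch only, keeping your `mystuff` calls as they are). Note that `del mystuff` doesn't unload the module in any case - the module object stays cached in `sys.modules` - so importing it once at the top costs nothing extra:

import os
import mystuff  # one import is enough; `del mystuff` only unbinds the
                # local name, the module itself stays in sys.modules

for f in os.listdir('.'):
    x = []
    with open(f) as g:  # the with block closes the file handle at the
        for line in g:  # end of each iteration, unlike the bare open(f)
            x.append(mystuff.expensive_stuff(line))
    mystuff.some_more_expensive_stuff(x)
    del x  # optional; the next iteration rebinds x anyway

If memory still grows with the file handles closed, the leak is most likely inside `mystuff` itself.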
--
Kindest regards.
Mark Lawrence.