Is this a bug? Python intermittently stops dead for seconds

charlie strauss cems at earthlink.net
Sun Oct 1 10:43:58 EDT 2006


By the way, if you are on a fast computer, and an OS whose time.time() function can resolve intervals shorter than 0.5 seconds, then you can see this problem on your machine at lower memory utilization by changing the default "allowed_gap" in the gtime class from 0.5 seconds down to, say, 0.1 second.
This is the threshold above which the program flags the time it takes to create a "foo" object.  On a fast computer it should take much less than 0.1 sec.
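To make the threshold concrete, here is a minimal sketch of that timing helper (the names gtime and allowed_gap come from the demo program; the body is my reconstruction, not the exact code):

import time

class gtime:
    allowed_gap = 0.5          # lower to 0.1 on a fast machine

    def __init__(self):
        self.last = time.time()

    def mark(self, label=""):
        # flag any interval since the previous mark that exceeds allowed_gap
        now = time.time()
        if now - self.last > self.allowed_gap:
            print("gap of %.3f sec at %s" % (now - self.last, label))
        self.last = now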



-----Original Message-----
>From: charlie strauss <cems at earthlink.net>
>Sent: Oct 1, 2006 10:33 AM
>To: Steve Holden <steve at holdenweb.com>, python-list at python.org
>Subject: Re: Is this a bug? Python intermittently stops dead for seconds
>
>Steve and other good folks who replied:
>
>I want to clarify that, on my computer, the first instance of the gap occurs well before memory is filled (at about 20% of physical RAM).  Additionally, the process monitor shows no page faults.
>
>  Yes, if you let the demo program as written run to completion (all 20,000 iterations), then on many computers it would not be surprising for the machine eventually to go into forced page swapping at some point.  That would be expected, and it is not the issue I am concerned with.
>
>In my case it starts glitching at around iteration 1000.
>
>1000 (bars) x 100 (foos) x 10 (integers per array)
>
>is nominally 
>100,000 class objects and
>1,000,000 array elements.
>
>(Note that the array is filled as [1]*10, so there is actually only one "integer", with 10 array elements referring to it, per foo instance.)
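>
>Roughly, the demo builds the following (a reconstruction of the idea, not the verbatim program; "me" plays the role of self, as in the line quoted below):
>
>class foo(object):
>    def __init__(me, nfoo=10):
>        me.memory = [1] * nfoo                        # 10 elements, one shared int
>
>class bar(object):
>    def __init__(me, nbar=100):
>        me.contents = [foo() for i in range(nbar)]    # 100 foo objects per bar
>
>store = []
>for i in range(1000):       # roughly where the glitching starts for me
>    store.append(bar())     # after this: ~100,000 foos, ~1,000,000 list elements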
>
>
>However, Steve may have put his finger on the reason why the duration grows with time.  Here is my current hypothesis.  The design of the program has no points where significant amounts of memory are released: all objects have references held until the end.  But perhaps some implicitly created objects of the same size are created along the way???  For example, when I write
>
>me.memory = [1]*nfoo
>
>perhaps internally Python is allocating an array of size nfoo and then __copying__ it into me.memory???  Since there is no reference to the intermediate, it would then be marked for future garbage collection.
>
>If that were true, then memory would be interleaved with entities awaiting GC and entities whose references are held in me.memory.
>
>To remove these, the GC would then have to scan the entire set of existing objects, which is growing.
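>
>The size of that set is easy to watch (a quick check of my own, not part of the demo):
>
>import gc
>
>before = len(gc.get_objects())
>junk = [[1] * 10 for i in range(100000)]    # 100,000 tracked list objects
>print("collector now tracks %d more objects" % (len(gc.get_objects()) - before))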
>
>Turning off GC would prevent this.
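>
>For anyone who wants to test that, something along these lines should do it (the gc calls are the standard module; build_everything is just a placeholder for the allocation loop above):
>
>import gc
>
>gc.disable()            # switch off the cyclic collector while building
>try:
>    build_everything()  # placeholder for the nested foo/bar allocation
>finally:
>    gc.enable()
>    gc.collect()        # one explicit pass once the data is in place
>
># An alternative is to leave the collector on but raise its generation-0
># threshold, e.g. gc.set_threshold(100000), so it runs far less often.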
>
>
>In any case, I don't think what I'm doing is very unusual.  The original program that triggered my investigation of the bug was doing this:
>
>foo was an election ballot holding 10 outcomes, bar was a set of 100 ballots from 100 voting machines, and the total array held the ballot sets from a few thousand voting machines.
>
>Almost any inventory program is likely to have a similar set of nested arrays, so it hardly seems unusual.
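>
>In outline that data is nothing more exotic than this (illustrative names only):
>
>NOUTCOMES = 10                                # outcomes per ballot
>NBALLOTS  = 100                               # ballots per machine set
>
>def read_ballot():
>    return [0] * NOUTCOMES                    # placeholder for real ballot data
>
>election = []
>for machine in range(3000):                   # a few thousand voting machines
>    election.append([read_ballot() for b in range(NBALLOTS)])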
>
>
>
>
>
>-- 
>http://mail.python.org/mailman/listinfo/python-list



