bug in my code?

Duncan Smith buzzard at urubu.freeserve.co.uk
Tue Jul 15 21:57:12 CEST 2003


"Terry Reedy" <tjreedy at udel.edu> wrote in message
news:ua-dnfQZuo4_946iXTWJkw at comcast.com...
>
> "Duncan Smith" <buzzard at urubu.freeserve.co.uk> wrote in message
> news:bevl84$880$1 at news7.svr.pol.co.uk...
> > Greetings.  I am struggling to track down what I presume is a bug in
> > my code.  The trouble is that it only raises its head occasionally.
> > I have managed to prune the code to the following:
> >
> > shutTest.py
> > --------------------------------------------------------
> > import cPickle
> >
> > def test2(iterations):
> >     for i in range(iterations):
> >         try:
> >             f = file('C:\\Python22\\(2, 2, 2, 3, 4).shu', 'r')
> >             try:
> >                 dependencies, nodes = cPickle.load(f)
> >             finally:
> >                 f.close()
> >         except IOError:
> >             pass
> > ----------------------------------------------------------------
> >
> > Calling the function with a suitable argument (say, 100) doesn't
> > usually pose a problem.  The file is opened OK and the data
> > unpickled.  But occasionally (or usually if I call the function with
> > 1000 as argument) I get the all too common, on Windows,
> >
> > 'The instruction at {some memory address} referenced memory at {some
> > other (and on one occasion the same) memory address}.  The memory
> > could not be "read".'
> ...
> This sounds like maybe a hardware problem.
>

Yes.  I can't see any problems with the code (although it wouldn't be the
first time ...).  I can't reproduce the problem on my work machine.

> Is the unpickling necessary to get the error? -- or some discrepancy?
>

It seems so.
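
For what it's worth, corrupt pickle data would normally raise an exception
rather than crash the interpreter, so a more defensive version of the loop
would catch those errors too.  A sketch in modern Python (where cPickle has
been folded into pickle; `load_shu` and the path argument are my own names,
and note the binary mode, which matters on Windows):

```python
import pickle

def load_shu(path, iterations):
    """Repeatedly load the pickled (dependencies, nodes) pair, as in the
    loop above, but catch unpickling errors explicitly as well."""
    result = None
    for _ in range(iterations):
        try:
            with open(path, 'rb') as f:   # pickle files need binary mode
                result = pickle.load(f)
        except OSError:
            pass                          # file missing/unreadable (was IOError)
        except (pickle.UnpicklingError, EOFError):
            pass                          # corrupt or truncated pickle data
    return result
```

If the data on disk were being silently mangled, this version would quietly
skip the bad reads instead of blowing up.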

> If you simply reread the file over and over, do you always get
> *exactly* the same byte string?  Test this by reading once before the
> loop and comparing subsequent reads to first.

Yes, I do.  I haven't been able to reproduce the problem (i.e. the crashes;
I fixed the file) by just opening / reading / closing the file.
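
The comparison loop described above can be sketched like this (Python 3
here; `reread_check` is just an illustrative name):

```python
def reread_check(path, iterations):
    """Read the file once, then compare every subsequent read against
    that first byte string, counting any mismatches."""
    with open(path, 'rb') as f:
        first = f.read()
    mismatches = 0
    for _ in range(iterations):
        with open(path, 'rb') as f:
            if f.read() != first:
                mismatches += 1   # a nonzero count would point at flaky hardware
    return mismatches
```

On a healthy disk this should return 0 every time, however many iterations
you run.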

> What I am asking is whether you might have a flaky or overheated disk
> drive giving intermittent errors more often than once a terabyte.
>

Possibly.  Thanks Terry.

Duncan





