[Tutor] Python execution timer/proficiency testing

Dino Bektešević ljetibo at gmail.com
Mon Aug 26 20:20:30 CEST 2013


Hello,

I'm interested in learning more about testing a program's efficiency and how
to measure execution times in seconds. I have very repetitive functions
and methods that work on images with a large number of measurement points,
each one working with numpy.ndarrays, which is really taxing and has to be
repetitive because I either use .fill() or have to go pixel by pixel.
I don't dare run my program on a batch of ~9.5 million images because I
can't estimate how long it would take, and because of obvious space issues my
program edits the information on the images and then overwrites the
original data. Should something go awry, I'd have to spend a long time
cleaning it up. My plan is to test the average performance on ~100,000 images
to see how it fares and decide what to do next.
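
For the timing itself, is something like the following a reasonable way to do
it? This is just a minimal sketch using timeit.default_timer; process_image
and the file names are hypothetical placeholders for my real per-image routine:

import timeit

def process_image(path):
    # hypothetical stand-in for my real per-image routine
    pass

# placeholder list standing in for the ~100 000 test images
sample_paths = ["img_0001.fits", "img_0002.fits"]

start = timeit.default_timer()
for path in sample_paths:
    process_image(path)
elapsed = timeit.default_timer() - start
print("total: %.2f s, average: %.4f s/image"
      % (elapsed, elapsed / len(sample_paths)))

And if a per-function breakdown would be more useful than a single number, I
suppose I could also run the whole script under the standard profiler with
"python -m cProfile -s cumtime myscript.py".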

So far it takes about 1.5-2 sec per image using the "guess_by_eye" method
(which isn't long, the first version took 25 sec xD, but I will still add a
couple of functions), yet I still get the following warning:

Warning (from warnings module):
  File "/usr/lib/python2.7/dist-packages/scipy/optimize/minpack.py", line 152
    warnings.warn(msg, RuntimeWarning)
RuntimeWarning: The iteration is not making good progress, as measured by the
  improvement from the last ten iterations.

It doesn't seem to produce any errors in my data, but how dangerous is this?
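
In case it matters, here is roughly how I thought I could check the solver
status myself, assuming the fit ultimately goes through scipy.optimize.fsolve
with full_output=True (my actual call might be leastsq or something else, and
the residual function below is just a made-up example):

import numpy as np
from scipy.optimize import fsolve

def residuals(params):
    # made-up system standing in for my real fit
    x, y = params
    return [x + 2.0 * y - 2.0, x ** 2 + y ** 2 - 1.0]

guess = np.array([1.0, 1.0])
sol, infodict, ier, mesg = fsolve(residuals, guess, full_output=True)
if ier != 1:
    # ier == 1 means fsolve considers the solution converged;
    # anything else corresponds to a warning like the one above
    print("fit did not converge cleanly: %s" % mesg)

That way I could at least log which images trigger the warning instead of
relying on it scrolling by in the console.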


thanks!
Dino

