[Tutor] could somebody please explain...
Steven D'Aprano
steve at pearwood.info
Fri Oct 3 15:21:15 CEST 2014
On Wed, Oct 01, 2014 at 09:43:29AM -0700, Clayton Kirkwood wrote:
> # program to test time and count options
>
> import datetime,operator, sys
> from datetime import time, date, datetime
> date = datetime.now()
> dayofweek = date.strftime("%a, %b")
> print("Today is", dayofweek, date.day, "at ", date.time())
>
> start = 0
> count_max=int(input("give me a number"))
> start_time = datetime.now()
>
> print( start_time )
> while start < count_max:
>     start = start + 1
>
> end_time = datetime.now()
> print( "s=s+1 time difference is:", (end_time - start_time) )
The first problem here is that you are not timing how long it
takes to add "start + 1".
You're actually timing eight things:
- lookup the value of start;
- lookup the value of count_max;
- check whether the first is less than the second;
- decide whether to loop, or exit the loop;
- if we're still inside the loop, lookup start again;
- add 1 to it;
- store the result in start;
- jump back to the top of the loop.
So the results you get don't tell you much about the speed of start+1.
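You can actually see those steps if you disassemble the loop with the dis module from the standard library (the exact opcode names vary between Python versions; this is just a sketch to make the point visible):

```python
import dis

def loop(count_max):
    start = 0
    while start < count_max:
        start = start + 1

# The disassembly shows the load, compare, jump, add, and store
# instructions that all get timed along with the addition itself.
dis.dis(loop)
```

Only one of those instructions is the addition; everything else is loop bookkeeping that pollutes the measurement.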
Analogy: you want to know how long it takes you to drive to work in the
morning. So you wake up, eat breakfast, brush your teeth, start the
stopwatch, have a shower, get dressed, get in the car, drive to the gas
station, fill up, buy a newspaper, and drive the rest of the way to
work, and finally stop the stopwatch. The time you get is accurate
neither as "driving time" nor as the total time it takes to get to
work.
Ideally, we want to do as little extra work as possible inside the
timing loop, so that the figure we get is as close as possible to the
time actually taken by + alone.
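That is exactly what the timeit module is for: it runs the statement in a tight internal loop and lets you time an empty statement to estimate the residual overhead. A minimal sketch (the numbers you get will differ on your machine):

```python
from timeit import timeit

# Time one million additions; the setup statement is not counted.
add_time = timeit("x + 1", setup="x = 1", number=1_000_000)

# Timing "pass" the same number of times estimates the loop
# overhead, which can then be subtracted out.
overhead = timeit("pass", number=1_000_000)

print("roughly", (add_time - overhead) / 1_000_000, "seconds per addition")
```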
The second problem is that you are using datetime.now() as your clock.
That's not a high-precision clock. It might only be accurate to a
second, or a millisecond. It certainly isn't accurate enough to measure
a single addition:
py> from datetime import datetime
py> x = 1
py> t = datetime.now(); x + 1; datetime.now() - t
2
datetime.timedelta(0, 0, 85)
This tells me that it supposedly took 85 microseconds to add two
numbers, but as I showed before with timeit, the real figure is closer
to 0.09 microseconds. That's a lot of noise! About 85000% noise!
Unfortunately, it is tricky to know which clock to use. On Windows,
time.clock() used to be the best one; on Linux, time.time() was.
Starting in Python 3.3, the time module gained more accurate clocks,
such as time.perf_counter(). If you use the timeit module, it picks
the best clock for the job automatically; if in doubt, time.time()
will normally be acceptable.
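On Python 3.3 or later you can even ask each clock what resolution it claims, and check which clock timeit uses by default:

```python
import time
import timeit

# get_clock_info reports the advertised resolution of each clock
# (available since Python 3.3).
for name in ("time", "perf_counter", "monotonic"):
    info = time.get_clock_info(name)
    print(name, "resolution:", info.resolution, "monotonic:", info.monotonic)

# timeit's default timer is the high-resolution performance counter:
print(timeit.default_timer is time.perf_counter)  # True on Python 3.3+
```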
https://docs.python.org/3/library/time.html
https://docs.python.org/3/library/timeit.html
--
Steven