How to demonstrate big-O cost of algorithms?

Mitja nun at
Wed Jun 2 23:25:32 CEST 2004

Rusty Shackleford <rs at> wrote:
> Thanks for that explanation -- it really helped.  I forgot that O(n)
> can translate into dn + c where d and c are constants.

You'd NEVER write the complexity as, e.g., 3n+7. You'd simply say it is O(n).
As already pointed out, this means that complexity (and, with that,
execution time) is PROPORTIONAL to n. It doesn't give you the number of
seconds or something.
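To make "proportional to n" concrete, you can count the basic operations directly instead of timing anything. A minimal sketch (the function name is just for illustration, not from the original post):

```python
def count_linear_steps(n):
    # Count one "unit" of work per element of a linear scan.
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

# Doubling n doubles the step count: the work is proportional to n,
# regardless of what the constant cost per step happens to be.
print(count_linear_steps(100))  # 100
print(count_linear_steps(200))  # 200
```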

Consider the following:
def foo(n):
  for i in range(n):
    for j in range(i):
      print(i, j)
The "print" statement is executed n*(n-1)/2 (roughly n^2/2) times. Still, the
complexity is O(n^2): if you increase n seven times, the running time grows
about 49 times.
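You can check that ratio empirically by counting how often the inner body runs, instead of printing. A sketch with a hypothetical helper mirroring foo above:

```python
def count_inner(n):
    # Count how many times the inner loop body of foo(n) would run.
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count

# count_inner(n) == n*(n-1)//2, so multiplying n by 7 multiplies
# the count by 7*(7n-1)/(n-1), which approaches 49 as n grows.
print(count_inner(100))   # 4950
print(count_inner(700))   # 244650
print(count_inner(700) / count_inner(100))  # ~49.4
```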

> I think I'm going to keep trying to figure out ways to demonstrate
> big O stuff, because I just don't understand it until I see it in a
> non-abstract sense.
> Thanks again.
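One non-abstract way to see it is to time the quadratic loop for growing n: doubling n should roughly quadruple the wall-clock time. A rough sketch (actual timings vary by machine, so only the growth ratio matters):

```python
import time

def quadratic(n):
    # Same nested loops as foo above, minus the printing.
    total = 0
    for i in range(n):
        for j in range(i):
            total += 1
    return total

for n in (1000, 2000, 4000):
    start = time.perf_counter()
    quadratic(n)
    elapsed = time.perf_counter() - start
    # Expect each line to take roughly 4x as long as the previous one.
    print(f"n={n}: {elapsed:.4f}s")
```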

More information about the Python-list mailing list