# How to demonstrate big-O cost of algorithms?

Rusty Shackleford rs at overlook.homelinux.net
Wed Jun 2 17:40:12 CEST 2004

Hi -

I'm studying algorithms, and I want to write a Python program that
measures their actual running costs.

I want to increment a global variable called n inside each algorithm so
that I can see the relationship between the size of the problem and the
number of steps taken.

Here's my clumsy attempt to implement this:

```python
n = 0  # step counter; initialize it so the global always exists

def qsort(ll):
    global n
    n += len(ll)  # count len(ll) steps of work for this call
    if len(ll) == 0:
        return []
    p = ll[0]  # pivot must be a single element, not the whole list
    left = [x for x in ll if x < p]
    mid = [x for x in ll if x == p]
    right = [x for x in ll if x > p]
    return qsort(left) + mid + qsort(right)

```

I'm not getting the results I hoped for.  I expected that after the
program finished, the counter would be close to

    len(ll) * math.log(len(ll), 2)

since quicksort runs in O(n log n) time in all but the worst case.

I'd like to write similar functions for mergesort, bubble sort, and the
other basic algorithms covered in data structures classes.

All help is welcome.

--
Give and take free stuff: http://freecycle.org