Big-O notation
David Eppstein
eppstein at ics.uci.edu
Fri Apr 18 04:41:09 CEST 2003
In article <3ckgewva.fsf at morpheus.demon.co.uk>,
Paul Moore <gustav at morpheus.demon.co.uk> wrote:
> But for quicksort you partition, and compare each element against the
> partitioning element. That's n comparisons against log(n) partitioning
> elements (each partitioning step halves the size of the chunks, so you
> do this log2(n) times). So the complexity is O(n*log(n)).
>
> Disclaimer: This is all *very* rough, and from memory as well. But I
> hope it's enough that you get the idea.
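(A minimal sketch, not from the original post: randomized quicksort with a
comparison counter, using the standard model where each partition step
compares every non-pivot element once to the pivot. Averaging over many runs
shows the count growing like n log n rather than n**2.)

```python
import random

def quicksort_comparisons(a):
    """Recursively quicksort a with a random pivot; return the number of
    element-to-pivot comparisons (one per non-pivot element per partition)."""
    if len(a) <= 1:
        return 0
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    greater = [x for x in a if x > pivot]
    # len(a) - 1 comparisons: every other element is compared to the pivot
    return (len(a) - 1) + quicksort_comparisons(less) + quicksort_comparisons(greater)

random.seed(0)
n, trials = 100, 500
avg = sum(quicksort_comparisons(list(range(n))) for _ in range(trials)) / trials
print(avg)  # grows like n log n, far below the n*(n-1)/2 worst case
```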
This is not completely accurate, since each partition does not usually
split exactly in half. My favorite version of the quicksort analysis is in
the new Cormen-Leiserson-Rivest-Stein edition of Introduction to Algorithms:
number the items according to their eventual sorted positions; then item
i is compared to item j exactly when the first pivot chosen from the range
i..j is either i or j, which occurs with probability 2/(j-i+1). Sum these
probabilities over all i<j to get the expected number of comparisons, and
you get roughly 2 n H_n (H_n is the nth harmonic number) = O(n log n).
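(A sketch of that sum done exactly, an illustration I've added rather than
anything from the post: with items numbered by sorted position, add up
2/(j-i+1) over all pairs i<j using exact rational arithmetic, and compare
against 2 n H_n.)

```python
from fractions import Fraction

def expected_comparisons(n):
    # Sum of P[item i is compared to item j] = 2/(j - i + 1) over all i < j,
    # items numbered by their eventual sorted positions.
    return sum(Fraction(2, j - i + 1)
               for i in range(1, n + 1)
               for j in range(i + 1, n + 1))

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n
    return sum(Fraction(1, k) for k in range(1, n + 1))

n = 100
print(float(expected_comparisons(n)))  # about 647.85
print(float(2 * n * harmonic(n)))      # about 1037.5 -- same n log n growth
```

The exact sum works out to 2(n+1)H_n - 4n, which is below 2 n H_n for small n
but has the same O(n log n) growth.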
--
David Eppstein http://www.ics.uci.edu/~eppstein/
Univ. of California, Irvine, School of Information & Computer Science