<br><div class="gmail_quote">On Tue, May 1, 2012 at 12:21 AM, Ian Kelly <span dir="ltr"><<a href="mailto:ian.g.kelly@gmail.com" target="_blank">ian.g.kelly@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div class="im">On Mon, Apr 30, 2012 at 11:25 PM, Dan Stromberg <<a href="mailto:drsalists@gmail.com">drsalists@gmail.com</a>> wrote:<br>
>>
>> A while back I did a sort algorithm runtime comparison for a variety of
>> sorting algorithms, and then mostly sat on it.
>>
>> Recently, I got into a discussion with someone on stackoverflow about the
>> running time of radix sort.
>>
>> I realize it's commonly said that radix sort is O(n*k) rather than
>> O(n*log(n)).  I've been making the case that in real life, frequently
>> k == log(n).
>>
>> Anyway, here's the comparison, with code and graph:
>> http://stromberg.dnsalias.org/~strombrg/sort-comparison/
>
> It can be difficult to distinguish O(n) from O(n log n) on a log-log
> plot, so I don't think that graph makes your point very well.

Thank you for your comment.

Do you have a suggestion for a better type of graph - or other
linearity/nonlinearity test?
I thought that on a log-log graph a straight line was still a straight
line, just compressed toward the large-n end.
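
In the meantime, one test I've been thinking of trying: divide each
measured time by n and plot the result against n with the x axis
log-scaled.  An O(n) sort should come out roughly flat, while an
O(n log n) sort should come out as a roughly straight, rising line.
Here's a rough sketch of the idea - it just times the builtin sorted()
on random integers as a stand-in, so the numbers it prints aren't from
my actual comparison runs:

import random
import time

def best_time(sort_func, n, trials=3):
    # Time sort_func on n random integers; keep the best of several
    # trials to reduce noise.
    best = float('inf')
    for _ in range(trials):
        data = [random.randrange(n) for _ in range(n)]
        start = time.time()
        sort_func(data)
        best = min(best, time.time() - start)
    return best

for n in [2 ** e for e in range(10, 21)]:
    t = best_time(sorted, n)
    # t / n should be roughly constant for an O(n) sort, and should grow
    # roughly like log(n) for an O(n log n) sort.
    print('%8d  %10.6f  %.9f' % (n, t, t / n))

Fitting something like a*n + b*n*log(n) to the measurements and comparing
residuals might be another option, but the divide-by-n plot is simpler.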
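
And just to spell out the k == log(n) argument from my earlier message:
with a fixed radix - say 256, i.e. one byte per pass - the number of
passes k an LSD radix sort makes is the key length in digits, and
representing n distinct keys takes at least log base 256 of n digits.
So when keys are only as wide as the data requires, k grows like log(n).
Back-of-the-envelope version (the radix-256 choice is just an example):

import math

def min_passes(n, radix=256):
    # Fewest radix-`radix` digits needed to represent n distinct keys,
    # which is also the fewest passes an LSD radix sort can get away with.
    if n <= 1:
        return 1
    return int(math.ceil(math.log(n, radix)))

for exponent in range(2, 10):
    n = 10 ** exponent
    print('%12d distinct keys => at least %d radix-256 passes'
          % (n, min_passes(n)))

If the keys are fixed-width machine words no matter how large n gets,
then of course k really is a constant and the usual O(n*k) description
stands.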