efficient idiomatic queue?

Delaney, Timothy tdelaney at avaya.com
Tue Jan 15 06:07:20 CET 2002


> From: David Eppstein [mailto:eppstein at ics.uci.edu]
> 
> In article <mailman.1011063861.31491.python-list at python.org>,
>  "Jason Orendorff" <jason at jorendorff.com> wrote:
> 
> > The given operations are usually "quick enough", for small lists
> > (less than, say, 5000 items) since the constant coefficient is
> > small, compared to the ordinary semi-inefficiency of running
> > Python code in the first place.
> 
> My philosophy to writing general purpose code like a sorting routine is
> to use as efficient algorithms as reasonably possible, because even if
> you think you are only going to see 5000-element lists and that
> 1-second delays are acceptable, someone later may well try it on a
> million-element list and get disgusted when the program takes hours to
> run.  I don't mind using

This very much falls under "knowing the data structures you will be using"
or "knowing which data structures your language uses", as well as knowing
your audience.

Students are notorious for taking an example and trying to apply it in all
cases.

The approach that should therefore be taken (for teaching an algorithms
class) would be along the lines of ...

"I will use the simplest algorithm possible in Python to demonstrate the
principles of the algorithm. The algorithm itself is O(n^2). For the data
sets you will be using in the examples and assignments, you can take that
bound at face value.

However, it is important to also consider the data structure that the
algorithm is being used on - in particular in this case, whether the data
structure is optimised for random access, access at the head of the
structure, the tail of the structure, or some other type of access. These
factors may well increase the running time of a particular algorithm
many-fold.

For example, you would expect that doubling the size of the dataset from
10 to 20 would increase the time required approximately 4-fold for this
algorithm. However, because of the data structure that is used, the increase
is actually approximately 16-fold. So don't use this exact implementation of
the algorithm on millions of elements (laugh)."

... thus making some of the students aware that you cannot simply say "this
is the best algorithm in all cases".
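As a concrete illustration of the data-structure effect being discussed: a
plain Python list used as a queue pays O(n) per dequeue, because pop(0)
shifts every remaining element, while a structure optimised for access at
the head avoids that cost. A minimal sketch (using collections.deque, which
is the usual choice in modern Python, though it postdates this thread):

```python
from collections import deque

def drain_list_queue(items):
    """Dequeue everything from a plain list.

    list.pop(0) shifts all remaining elements left by one,
    so each dequeue is O(n) and draining the queue is O(n^2).
    """
    q = list(items)
    out = []
    while q:
        out.append(q.pop(0))
    return out

def drain_deque_queue(items):
    """Dequeue everything from a collections.deque.

    deque.popleft() is O(1), so draining the queue is O(n).
    """
    q = deque(items)
    out = []
    while q:
        out.append(q.popleft())
    return out

# Both produce the same FIFO order; only the scaling differs.
print(drain_list_queue(range(5)))   # [0, 1, 2, 3, 4]
print(drain_deque_queue(range(5)))  # [0, 1, 2, 3, 4]
```

The algorithm ("repeatedly take the front element") is identical in both
functions; only the container changes, which is exactly the point above.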

Tim Delaney