[Baypiggies] hello and returning unique elements

Hy Carrinski hcarrinski at gmail.com
Fri Apr 8 03:42:06 CEST 2011

Dear BayPIGgies,

Thank you for providing exceptional reading material over the past
year for an avid conversation follower and occasional meeting
attendee.

I would like to join the conversation and contribute to the community
beginning with this post. I will attend our meeting this month, and
look forward to meeting at Noisebridge or elsewhere to watch PyCon
2011 videos together.

While thinking about unique lists in the context of Vikram's recent
question, I came across a related blog post at

The post emphasizes speed and maintaining original list order when
removing duplicates from a list.

Although the post is several years old, I thought of a solution to the
question it poses that is different from the ones proposed by
BayPIGgies either in the blog's comments or at a previous discussion
(before reversed, sorted, and set were introduced in Python 2.4) at

I will be grateful for any comments on my "ask forgiveness rather
than permission" implementation. It requires that all elements of the
list be hashable, and it works on Python 2.x for x >= 4.

def unique(seq):
    """Return unique list of hashable objects while preserving order."""
    seen = {}
    for ix in reversed(xrange(len(seq))):
        seen[seq[ix]] = ix
    return sorted(seen, key=seen.__getitem__)
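As a quick sanity check (with example data of my own), here is the same
function run under Python 3, where xrange is spelled range but the logic
is unchanged:

```python
def unique(seq):
    """Return unique list of hashable objects while preserving order."""
    seen = {}
    # Walk the indices in reverse so that each element ends up mapped
    # to its earliest (first-occurrence) index in seq.
    for ix in reversed(range(len(seq))):
        seen[seq[ix]] = ix
    # Sorting the keys by first-occurrence index restores original order.
    return sorted(seen, key=seen.__getitem__)

print(unique(['b', 'a', 'b', 'c', 'a']))  # ['b', 'a', 'c']
```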

For the case of hashable objects are there disadvantages to this
implementation versus the ones referenced above? Is there a reason to
prefer operator.itemgetter over __getitem__ in this limited context?
Is there an advantage in clarity or performance to using
collections.OrderedDict in Python 2.7 and above? For lists containing
many distinct objects, the O(n log n) sort step will set a lower bound
on the running time of this implementation.
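To make the OrderedDict question concrete, here is a sketch of that
variant (my own code, not from the blog post). OrderedDict.fromkeys
keeps the first occurrence of each key, so the sort step disappears
and the work reduces to a single linear pass:

```python
from collections import OrderedDict  # available in Python 2.7 and above

def unique_ordered(seq):
    """Return unique hashable elements of seq, preserving first-seen order."""
    # fromkeys inserts each key once, at its first occurrence; reassigning
    # an existing key does not move it, so insertion order is exactly
    # first-occurrence order.
    return list(OrderedDict.fromkeys(seq))

print(unique_ordered(['b', 'a', 'b', 'c', 'a']))  # ['b', 'a', 'c']
```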

I look forward to meeting more BayPIGgies in the coming months and years.
