Best way to handle large lists?

Hari Sekhon hpsekhon at googlemail.com
Tue Oct 3 15:47:32 CEST 2006


I don't know much about the Python internals either, so this may be the 
blind leading the blind, but aren't dicts slower to work with than 
lists, and therefore wouldn't your suggestion to use dicts be slower 
too? I think it's something to do with the overhead of hashing keys in 
a dict compared with using positional indexes in lists/arrays...

At least that is what I thought.

Can anyone confirm this?
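(One way to check this for yourself, as a rough sketch rather than a definitive benchmark: time membership tests with the stdlib `timeit` module. For a single *indexed* access a list is very cheap, but for `x in collection` membership tests a dict or set hashes the key and is average O(1), while a list is scanned linearly, so the dict wins as the collection grows. The names `as_list` and `as_dict` here are illustrative, not from the thread.)

```python
# Rough micro-benchmark sketch: membership test in a list vs. a dict.
# "9999 in as_list" scans the whole list (worst case), while
# "9999 in as_dict" is a single hash lookup on average.
import timeit

data = list(range(10000))
as_list = data
as_dict = dict.fromkeys(data)  # dict whose keys are the data items

t_list = timeit.timeit("9999 in as_list", globals=globals(), number=1000)
t_dict = timeit.timeit("9999 in as_dict", globals=globals(), number=1000)
```

On any recent CPython the dict lookup comes out far ahead for membership tests of this size.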

-h

Hari Sekhon



Bill Williams wrote:
> I don't know enough about Python internals, but the suggested solutions 
> all seem to involve scanning bigList. Can this presumably linear 
> operation be avoided by using a dict or similar to find all occurrences of 
> smallList items in bigList and then deleting those occurrences?
>
> Bill Williams
>
>
>
> In article <prrUg.1662$We.477 at trndny08>,
>  Chaz Ginger <cginboston at hotmail.com> wrote:
>
>   
>> I have a system that has a few lists that are very large (thousands or
>> tens of thousands of entries) and some that are rather small. Many times
>> I have to produce the difference between a large list and a small one,
>> without destroying the integrity of either list. I was wondering if
>> anyone has any recommendations on how to do this and keep performance
>> high? Is there a better way than
>>
>> [ i for i in bigList if i not in smallList ]
>>
>> Thanks.
>> Chaz
>>     
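(A sketch of the idea Bill raises above: convert the small list to a set, so each membership test in the comprehension is average O(1) instead of a linear scan of `smallList`. Neither input list is modified, which preserves the integrity requirement Chaz states. The sample data here is illustrative, not from the thread.)

```python
# Set-based list difference: same result as
#   [i for i in bigList if i not in smallList]
# but each "not in" test is a hash lookup instead of a list scan.
bigList = list(range(10000))
smallList = [3, 7, 4242, 9999]

smallSet = set(smallList)  # one-time O(len(smallList)) conversion
diff = [i for i in bigList if i not in smallSet]

# Both original lists are left untouched.
assert len(bigList) == 10000 and len(smallList) == 4
```

This keeps the overall cost at roughly O(len(bigList)) rather than O(len(bigList) * len(smallList)), and it also preserves the order and any duplicates in `bigList`, which a plain `set(bigList) - set(smallList)` would not.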


