Efficient way to break up a list into two pieces
marwie
marwie at gmx.de
Sun Feb 21 14:07:24 EST 2010
On 21 Feb., 04:40, Steven D'Aprano <st... at REMOVE-THIS-
cybersource.com.au> wrote:
>
> Additionally, Python lists are over-allocated so that appends are fast. A
> list of (say) 1000 items might be over-allocated to (say) 1024 items, so
> that you can do 24 appends before the array is full and the array needs
> to be resized. This means that, on average, Python list appending is O(1)
> instead of O(N). You can't just change the length blindly, you need to
> worry about the over-allocation.
Ok, I see your point. However, other data representations might still
be able to optimize such a multi-element pop. I'm thinking of deques,
for example.
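
For what it's worth, a rough sketch of what I mean (not measured, and
the sizes are made up; the deque docs promise O(1) appends and pops at
either end):

    from collections import deque

    expr = deque(range(100000))   # stand-in for a long expression

    # Popping k leading terms costs O(k) total with a deque ...
    head = [expr.popleft() for _ in range(10)]

    # ... whereas the list version shifts every remaining element
    # down a slot on each pop, i.e. O(k * N) overall:
    lst = list(range(100000))
    head_list = [lst.pop(0) for _ in range(10)]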
> I'm sympathetic to your concern: I've often felt offended that doing
> something like this:
>
> x = SomeReallyBigListOrString
> for item in x[1:]:
>     process(item)
>
> has to copy the entire list or string (less the first item). But
> honestly, I've never found a situation where it actually mattered.
Good grief, it copies that, too? I had assumed that the copying would
at least be delayed until the assignment, and that x[1:] would be
represented by some wrapper that just shuffles the indices around
(much like the .indices method that MRAB suggested).
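
In the meantime, itertools.islice looks like it gives a lazy,
non-copying iterator over the tail. If I read the docs right,
something along these lines would avoid the copy (process and
SomeReallyBigListOrString are just your placeholders):

    from itertools import islice

    x = SomeReallyBigListOrString
    for item in islice(x, 1, None):  # lazy iteration, no tail copy
        process(item)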
Maybe copying chunks of data really isn't as much of an issue as it
used to be (and maybe I'm getting old). The application I have in mind
has to do with symbolic computation, and expressions would be
represented by Python lists. It's too early to do any profiling (in
fact, the project is still at the "deciding whether Python is the
right language to implement it" stage), but it will have to handle
expressions with many terms (i.e. long lists), and in the end, taking
these lists apart and putting them back together in a different order
is the only thing the code will do. That explains my interest in
performance issues related to Python lists.
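
(Once profiling does become worthwhile, I imagine a timeit comparison
along these lines would settle the slice-copying question; the list
size is made up:)

    import timeit

    setup = "x = list(range(100000))"

    # copying slice vs. lazy islice over the tail
    t_slice = timeit.timeit("for item in x[1:]: pass",
                            setup=setup, number=100)
    t_islice = timeit.timeit("for item in islice(x, 1, None): pass",
                             setup="from itertools import islice; " + setup,
                             number=100)
    print(t_slice, t_islice)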
Anyway, thanks for your help.
Martin.