
[Terry Reedy]
On enhancing list object: queue vs. deque.
My general impression is that queues, while less common than stacks, are used in enough different algorithms that adding one implementation field to list headers, so that pop(0) is an O(1) pointer move, might be an overall win for Python. The current O(n) move-everything-every-time behavior is certainly a nuisance when queues are needed.
See the other msgs in this thread today -- I'm not convinced that queue-like behavior can be supported that efficiently.
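(To make the idea in the quote concrete anyway, here's a rough pure-Python sketch of what a "front offset" field would buy. The real proposal is a C-level change to the list header; the class and names below are purely illustrative, not anything in listobject.c.)

    class OffsetList:
        """Toy model: remember the index of the first live slot so that
        popping from the front is a pointer bump, not an O(n) shift."""

        def __init__(self, iterable=()):
            self._items = list(iterable)   # backing vector
            self._head = 0                 # index of the logical first element

        def append(self, x):
            self._items.append(x)

        def popleft(self):
            # O(1): advance the head instead of moving everything left.
            if self._head == len(self._items):
                raise IndexError("pop from empty OffsetList")
            x = self._items[self._head]
            self._items[self._head] = None  # drop the reference
            self._head += 1
            return x

        def __len__(self):
            return len(self._items) - self._head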
Deques, on the other hand, seem to be more a theoretical construct. Algorithm books include them for completeness in describing possible list disciplines, but generally ignore them thereafter. I am hard put to remember an algorithm that actually needs them. They are certainly much rarer than queues. So I do not think trying to make prepends O(1) should be considered further.
I expect that if queue-like behavior *can* be supported efficiently using an always-contiguous vector, efficient deque-like behavior will fall out for free.
My possibly naive thought is that before reallocating, look at the ratio of left free to total.
I don't know what that means (neither the "left free" part nor what exactly "total" is counting). The devil is in the details here.
If memmove would result in as much, or perhaps nearly as much, over-allocation as realloc,
memmove() leaves the total number of unused slots unchanged. realloc can do that too, or increase it by any amount, or decrease it by any amount. I just don't know what you're trying to say ... explain how it works, step by step, in this example:

    q = [incoming() for i in range(1000)]
    while True:
        q.append(incoming())
        consume(q.pop(0))
then just do that. Certainly if the ratio is small (like say .1), don't bother with memmove. The exact cutoff is a tuning matter.
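(If I'm reading the proposal right, the append-time decision would look something like the sketch below; the function name, the arguments, and the 0.1 cutoff are all made up for illustration, and the growth formula stands in for whatever over-allocation policy the list actually uses.)

    # Hypothetical decision for an append that finds no free slot at the
    # right end of the allocated block.  left_free = slots vacated at the
    # front by pop(0); capacity = total allocated slots.
    def grow_or_slide(left_free, capacity, cutoff=0.1):
        if left_free >= cutoff * capacity:
            # Slide the live elements back to slot 0 (one memmove); the
            # vacated slots become usable for appends, no realloc needed.
            return "memmove", capacity
        # Too little to gain by sliding: over-allocate as usual.  The exact
        # growth factor, like the cutoff, is a tuning matter.
        return "realloc", capacity + max(capacity // 8, 8)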
Separate issue: for best performance, one would like to be able to trade space for time and preallocate a large enough block so that most append faults result in a memmove back to the left. Would the current builtin shrinkage discipline allow this? Or does "q=[0]*1000; q[:]=[]" shrink the space for q back to nearly nothing?
Python *offers* to give 992 slots back; whether the platform realloc() accepts the offer is up to it. In the "details are everything" department, Python *also* creates a temp vector of length 1000, and copies all of q[:] into it before it deletes any of q: emptying a list is a linear-time operation, not constant-time. That's because it's not safe to decref anything in a list while tearing it down (e.g., the list may be visible to a __del__ method, which must not be allowed to see NULL pointers, or reclaimed objects, in the list).
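(On a current CPython you can watch what actually happens with something like the following; sys.getsizeof reports the bytes presently allocated to the list object, and whether the offered slots really come back depends on the platform realloc and the interpreter version.)

    import sys

    q = [0] * 1000
    print(len(q), sys.getsizeof(q))   # roughly 1000 slots' worth of allocation
    q[:] = []
    print(len(q), sys.getsizeof(q))   # smaller if the offered slots were given back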
(One might also want to do the same for stacks. And yes, I remember: 'user tuning' is a 'bad idea'. But....)