On Wed, Sep 4, 2019, 10:40 PM Dan Sommers <2QdxY4RzWzUUiLuE@potatochowder.com> wrote:
I'm sure I'm missing something, but isn't that the point of a ThreadPoolExecutor? Yes, you can submit more requests than you have resources to execute concurrently, but the executor itself limits the number of requests it executes at once to a given (to the executor's initializer) number. The "blocked" requests are simply entries in a queue, and shouldn't consume lots of memory.
The entries in the queue might take more memory than you realize. For example, if you have 100,000 files but process only 8 at a time, why would you want more than 8 items in the queue? 100k queue entries will be a memory hog, not to mention the memory fragmentation they might cause. If the work items are generated lazily, you can stop generating them once the in-flight limit is reached.
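One common way to do this (a sketch, not the only approach) is to throttle submission with a semaphore: the producer blocks instead of stuffing the executor's internal queue, so the queue never grows much beyond the worker count. The names `bounded_map` and the `max_workers * 2` headroom here are my own choices for illustration:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def bounded_map(func, items, max_workers=8):
    # Allow a small backlog beyond the running tasks, but far fewer
    # than the full input; the semaphore throttles submission.
    sem = threading.BoundedSemaphore(max_workers * 2)
    results = []
    lock = threading.Lock()

    def on_done(fut):
        sem.release()              # a slot freed up; producer may continue
        with lock:
            results.append(fut.result())

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for item in items:
            sem.acquire()          # blocks once ~2x max_workers are in flight
            pool.submit(func, item).add_done_callback(on_done)
    # shutdown(wait=True) on context exit ensures all callbacks have run
    return results

# e.g. process 100,000 items while holding at most ~16 in the queue
nums = bounded_map(lambda x: x * x, range(100_000))
```

Note that results arrive in completion order, not submission order; if order matters you'd need to tag each item with its index.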