<div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">To be clear, I meant to refer to processes *or* threads when discussing</div>
the problem originally. The ProcessPoolExecutor is pretty useful (in my<br>
experience) for easily getting speedup even on pure-Python CPU-bound<br>
> workloads.

FWIW that wasn't the default "use processes" spike. In my experience toying with concurrency in Python, trying to manage the load that threads put on the system always ends badly. The two best-supported concurrency mechanisms, threads and processes, are constantly head to head, and neither is adequate once you start to consider extreme concurrency scenarios. I suggest this because if you're considering composing executors, you're already trying to reduce the overhead (wastage) that processes and threads incur on your system for these purposes.
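
For context, the pure-Python CPU-bound speedup in the quote looks roughly like the toy comparison below; the busy() workload and the worker counts are made up for illustration, not taken from anything in this thread. It also shows the flip side I keep running into: the process pool can actually spread across cores while the thread pool stays serialized by the GIL, but you pay a whole worker process per slot for the privilege.

# Illustrative sketch only: a deliberately dumb CPU-bound task where a
# process pool can use multiple cores and a thread pool cannot (the GIL
# serializes the threads). Workload size and worker counts are made up.
import time
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def busy(n):
    # pure-Python arithmetic loop, no I/O, so it never releases the GIL
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, jobs):
    start = time.perf_counter()
    with executor_cls(max_workers=4) as pool:
        list(pool.map(busy, jobs))
    return time.perf_counter() - start

if __name__ == "__main__":  # guard needed for ProcessPoolExecutor under spawn
    jobs = [2_000_000] * 8
    print("threads:  ", timed(ThreadPoolExecutor, jobs))
    print("processes:", timed(ProcessPoolExecutor, jobs))

Only the process version is expected to scale with cores here; the threaded run is bounded by the GIL and ends up no faster than doing the work serially.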
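
And to be concrete about the kind of composition I mean, here is a rough sketch of one interpretation; I don't know whether this is the shape anyone else has in mind, and the ComposedExecutor name and the cpu_bound flag are my own invention, not anything in concurrent.futures. The idea is a single front end that routes blocking I/O to a small thread pool and CPU-bound work to a small process pool, so neither pool has to be over-provisioned for the other's load.

from concurrent.futures import Executor, ProcessPoolExecutor, ThreadPoolExecutor

class ComposedExecutor(Executor):
    """Route each job to whichever backing pool suits it (sketch only)."""

    def __init__(self, io_workers=8, cpu_workers=4):
        self._threads = ThreadPoolExecutor(max_workers=io_workers)
        self._procs = ProcessPoolExecutor(max_workers=cpu_workers)

    def submit(self, fn, *args, cpu_bound=False, **kwargs):
        # cpu_bound is a made-up hint; callers tag their CPU-heavy jobs themselves
        pool = self._procs if cpu_bound else self._threads
        return pool.submit(fn, *args, **kwargs)

    def shutdown(self, wait=True):
        self._threads.shutdown(wait=wait)
        self._procs.shutdown(wait=wait)

Anything submitted with cpu_bound=True still has to be picklable (module-level functions, no closures), and you still pay serialization per job, which is exactly the overhead/wastage trade-off I was getting at.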