The way I see it, the concurrency model is selected by the developer: multithreading, multiprocessing, asyncio, or even a hybrid. If developers select multithreading, then they carry the burden of ensuring mutual exclusion and avoiding race conditions, deadlocks, livelocks, and so on.
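As a minimal illustration of that burden: under the multithreading model the developer must serialize every shared read-modify-write themselves. This sketch uses only the stdlib `threading` module; the function and variable names are mine, not from the thread.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # counter += 1 is a read-modify-write; without the lock it is a
        # race condition and the final total would be nondeterministic.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000, deterministic only because of the lock
```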

On Mon, 2021-10-18 at 13:17 +0000, Mohamed Koubaa wrote:
I love everything about this - but I expect some hesitancy due to this "Multithreaded programs are prone to concurrency bugs.".

If there is significant pushback, I have one suggestion:

Would it be helpful to think of the python concurrency mode as a property of interpreters?
`interp = interpreters.create(concurrency_mode=interpreters.GIL)`
`interp = interpreters.create(concurrency_mode=interpreters.NOGIL)`
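To make the suggestion concrete, here is a self-contained mock of what such an API could look like. This is purely hypothetical: the `interpreters` module proposed in PEP 554 has no `concurrency_mode` parameter, and the `GIL`/`NOGIL` constants and `Interpreter` class below are stand-ins invented for this sketch.

```python
from dataclasses import dataclass

# Hypothetical mode constants, mirroring interpreters.GIL / interpreters.NOGIL
# from the suggestion above.
GIL = "gil"
NOGIL = "nogil"

@dataclass
class Interpreter:
    concurrency_mode: str

def create(concurrency_mode=GIL):
    """Create an interpreter tagged with its concurrency mode.

    Defaulting to GIL would let existing environments keep today's
    semantics unless they explicitly opt in to NOGIL.
    """
    if concurrency_mode not in (GIL, NOGIL):
        raise ValueError(f"unknown concurrency mode: {concurrency_mode!r}")
    return Interpreter(concurrency_mode)

interp = create(concurrency_mode=NOGIL)
print(interp.concurrency_mode)  # nogil
```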

and subsequently Python _environments_ can make different choices about what to use for the 0th interpreter, via some kind of configuration.
Python modules can declare which concurrency modes they support.  Future concurrency modes that address specific use cases could be added.
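One way a module could declare its supported modes is a dunder attribute checked against the importing interpreter's mode. Everything here is a sketch of that idea: the `__supported_concurrency_modes__` attribute and the helper functions are hypothetical, not part of any existing proposal.

```python
import types

def make_module(name, supported_modes):
    """Build a stand-in module object that advertises its concurrency modes."""
    mod = types.ModuleType(name)
    # Hypothetical declaration; a real extension would set this at module scope.
    mod.__supported_concurrency_modes__ = frozenset(supported_modes)
    return mod

def check_compatible(module, interpreter_mode):
    """Reject a module in an interpreter whose mode it does not support.

    Modules that declare nothing are assumed GIL-only, which gives the
    opt-out / incremental-adoption behavior described above.
    """
    modes = getattr(module, "__supported_concurrency_modes__", frozenset({"gil"}))
    return interpreter_mode in modes

legacy = make_module("legacy_ext", {"gil"})
audited = make_module("audited_ext", {"gil", "nogil"})

print(check_compatible(legacy, "nogil"))   # False
print(check_compatible(audited, "nogil"))  # True
```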

This would allow Python environments that would rather not audit their code for concurrency issues to opt out, and would allow incremental adoption.  I can't intuit whether this indirection would cause a performance problem in the C implementation, or whether there is some clever way to have different variants of the relevant objects at compile time and switch between them based on the interpreter's concurrency mode.
Python-Dev mailing list --