[Python-ideas] Concurrent safety?

Mike Meyer mwm at mired.org
Mon Oct 31 18:59:56 CET 2011


On Sun, Oct 30, 2011 at 8:21 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On Mon, Oct 31, 2011 at 1:11 PM, Mike Meyer <mwm at mired.org> wrote:
>> The one glaring exception is in concurrent programs. While the tools
>> python has for dealing with such are ok, there isn't anything to warn
>> you when you fail to use those tools and should be.
>
> This will basically run into the same problem that
> free-threading-in-CPython concepts do - the fine grained checks you
> need to implement it will kill your single-threaded performance.

This argument seems familiar. Oh, right, it's the "lack of
performance will kill you" argument. That was given as the reason that
all of the following were unacceptable:

- High level languages.
- Byte-compiled languages.
- Structured programming.
- Automatic memory management.
- Dynamic typing.
- Object Oriented languages.

All of those are believed (at least by their proponents) to make
programming easier and/or faster at the cost of performance. The
performance cost was "too high" for all of them when they were
introduced, but they all became broadly accepted as the combination of
increasing computing power (especially CPU support for them) and
increasingly efficient implementation techniques drove that cost down
to the point where it wasn't a problem except in very special cases.

> Since Python is a scripting language that sees heavy single-threaded use,
> that's not an acceptable trade-off.

Right - few languages manage to grow one of those features without a
name change of some sort, much less two (with the obvious exception of
LISP). Getting them usually requires moving to a new language. That's
one reason I said it might never make it into CPython.

But the goal is to get people to think about fixing the problems, not
dismiss the suggestion because of problems that will go away if we
just wait long enough.

For instance, the issue of single-threaded performance can be fixed by
taking threading out of a library and giving control of it to the
interpreter. This possibility is why I said "thread of execution"
instead of just "thread."  If the interpreter knows when an application
has concurrent threads of execution, it also knows when there aren't
any, so it can support an option to skip those checks in the
single-threaded case until the performance issues go away.
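
As a purely illustrative sketch (GuardedCounter is my invention, and
threading.active_count() is just a stand-in for interpreter-level
knowledge of concurrency, not a proposed API), the idea is something
like this:

import threading

class GuardedCounter:
    """Shared state that only pays for locking once concurrency appears."""

    def __init__(self):
        self._lock = threading.Lock()
        self.value = 0

    def increment(self):
        # active_count() > 1 stands in for "the interpreter knows other
        # threads of execution exist"; the real version would get that
        # information from the interpreter, not from a library call.
        if threading.active_count() > 1:
            with self._lock:
                self.value += 1
        else:
            self.value += 1   # single-threaded: no check, no cost

counter = GuardedCounter()

def worker():
    for _ in range(100000):
        counter.increment()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)   # 400000, since every concurrent increment is locked

The point isn't the locking strategy; it's that the single-threaded
path never touches the machinery at all.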

> Software transactional memory does offer some hope for a more
> reasonable alternative, but that has its own problems (mainly I/O
> related). It will be interesting to see how PyPy's experiments in this
> space pan out.

Right - you can't do I/O inside a transaction. For writes, this isn't
a problem. For reads, it is, since they imply binding and/or
rebinding. So an STM solution may require a second mechanism, designed
for single statements, to allow reads to happen.
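
Here's a toy illustration of the underlying problem (atomic() and
read_sensor() are made up for the example; this is not PyPy's API,
just a retry loop standing in for a transaction):

import itertools

_reads = itertools.count(1)

def read_sensor():
    # Stand-in for read I/O (socket, file, user input): it can't be undone.
    n = next(_reads)
    print("performing read #%d" % n)
    return n

def atomic(transaction, retries=3):
    # Toy retry loop: a conflict (simulated with ValueError) rolls the
    # transaction back and runs it again.
    for attempt in range(retries):
        try:
            return transaction(attempt)
        except ValueError:
            continue   # rolled back -- but the read already happened
    raise RuntimeError("transaction never committed")

shared = {"latest": None}

def update(attempt):
    value = read_sensor()          # read I/O inside the transaction...
    if attempt == 0:
        raise ValueError("simulated conflict")
    shared["latest"] = value       # ...then bind the result to shared state
    return shared["latest"]

print(atomic(update))
# Two reads are performed for one committed transaction; read #1 is
# consumed and lost. That's why reads want a separate single-statement
# mechanism instead of living inside a retried transaction.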

    <mike


