[Python-ideas] Fwd: Concurrent safety?
Stephen J. Turnbull
stephen at xemacs.org
Wed Nov 2 09:49:48 CET 2011
Mike Meyer writes:
> > No, "forcing" is. Consenting adults and all that.
>
> But you yourself admit that this isn't forcing you to think:
Nice try, but I didn't say it forces me to think. It forces me to do
something to shut up the language. That's ugly.
> It just makes you add a statement to shut up the warnings. Pretty
> much the same thing as using a bare except clause.
The bare except clause is optional; I can (and often do) simply let
the exception terminate the process *if* it ever happens. My
understanding is that that isn't good enough for you (because
concurrency errors usually lead to silent data corruption rather than
a spectacular and immediate crash).
> And it comes about for much the same reason: I'm getting tired of
> chasing down bugs in concurrent code. There are languages that
> offer that.
Well, if you want help chasing down bugs in concurrent code, I would
think that you would want to focus on concurrent code. First, AFAICS
ordinary function calls don't expose additional objects to concurrency
(they may access exposed objects, of course, but they were passed in
from above by a task, or are globals). So basically every object
exposed to concurrency is in either args or kwargs in a call to
threading.Thread (or thread.start_new_thread), no?
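Concretely, the claim is that the shared object crosses the thread boundary only at the Thread constructor. A minimal sketch (the names `shared` and `worker` are mine, not from the proposal):

```python
import threading

shared = {"count": 0}          # the only object handed to the thread

def worker(d):
    # d is the same dict the main thread holds; this is the
    # exposure point the paragraph above is talking about
    d["count"] += 1

t = threading.Thread(target=worker, args=(shared,))
t.start()
t.join()
# shared["count"] is now 1: the mutation is visible to the caller
```

Anything `worker` touches beyond its arguments would be a global, which is the other exposure path conceded above.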
Wouldn't it be possible to wrap those objects (and only those objects)
such that the wrapper intercepts attempts to access the wrapped
objects, and "does something" (warn, raise, dance on the head of a
pin) if the access is unlocked or whatever? Then only concurrent code
and the objects exposed to it pay the cost. If it's really feasible
to do it via wrapper, you could write a decorator or something that
could easily be turned into a no-op for tested code ready to go into
production.
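Such a wrapper might look like the following sketch. Everything here is hypothetical (the class name, the choice to warn rather than raise), and it leans on `RLock._is_owned()`, an internal helper a real implementation would replace with its own ownership tracking:

```python
import threading
import warnings

class GuardedProxy:
    """Wrap an object exposed to a thread; warn on unlocked access."""

    def __init__(self, obj):
        self._obj = obj
        self._lock = threading.RLock()

    def __enter__(self):
        self._lock.acquire()
        return self

    def __exit__(self, *exc_info):
        self._lock.release()
        return False

    def __getattr__(self, name):
        # Only delegated attributes reach here; _obj and _lock are
        # found by normal lookup.  _is_owned() is CPython-internal
        # (Condition uses it); a production version would track the
        # owning thread itself.
        if not self._lock._is_owned():
            warnings.warn("unlocked access to %r.%s" % (self._obj, name),
                          stacklevel=2)
        return getattr(self._obj, name)
```

Usage would be `p = GuardedProxy(shared_list)`; a bare `p.append(x)` triggers the warning, while `with p as g: g.append(x)` holds the lock and passes silently. Making the proxy a no-op for production is then just swapping in a trivial class whose `__getattr__` delegates unconditionally.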
> People are as likely to miss that data is shared as they are to
> screw up the locking. In other words, if we do it your way, it'll
> deal with less than half of what's bugging me.
[...]
> There's no way to find out except by trying.
Well, no, it's not about doing it my way; I'm perfectly happy with
processes and message-passing in my applications, and aside from wacky
ideas like the above, which I don't really know how to implement
myself, I don't have a lot of suggestions for concurrency by
threading. Rather, it's that my guess is that if you don't make the
costs of safe(r) concurrency look more reasonable you won't be getting
much help here.