Critical sections and mutexes
brueckd at tbye.com
Tue Oct 23 21:33:23 EDT 2001
On Tue, 23 Oct 2001, Cliff Wells wrote:
> From original post:
> "only case I can see this being a problem is when, say,
> Thread A locks something in the main thread, starts to
> interact with it, is preempted, and then the main
> thread gets a timeslice and interacts with the locked
> resource--which it views as local and doesn't need to
> check for a lock."
>
> If you are going to share a resource among threads, then _every_ access to
> that resource _must_ be enclosed in locks (even if you are only reading from
> the resource).
[snip]
No, "normal" operations on Python objects are atomic as far as threads are
concerned. There are some very good reasons for using locking/signaling
(to sequentialize access to a function, to keep a worker thread asleep
until you use a semaphore to signal it to awake, etc), but it's not always
a requirement. Consider a simple producer/consumer situation:
import threading, time, random

foo = []

def producer():
    for i in range(10):
        print 'Producing', i
        foo.append(i)
        time.sleep(random.random() * 1.0)
    print 'Producer done'

def consumer():
    count = 0
    while 1:
        try:
            # list.pop is atomic, so no lock is needed here
            num = foo.pop(0)
            print 'Consuming', num
            count += 1
            if count >= 10:
                break
        except IndexError:
            # nothing available yet - poll again after a short nap
            pass
        time.sleep(random.random() * 1.0)
    print 'Consumer done'

threading.Thread(target=consumer).start()
threading.Thread(target=producer).start()
Output:
Producing 0
Producing 1
Consuming 0
Producing 2
Producing 3
Consuming 1
Producing 4
Consuming 2
Producing 5
Consuming 3
Consuming 4
Producing 6
Consuming 5
Producing 7
Consuming 6
Producing 8
Producing 9
Producer done
Consuming 7
Consuming 8
Consuming 9
Consumer done
No crashes, stomped memory, or any of the other problems you might expect.
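
If you do want the locking/signaling style mentioned above, here's a
minimal, untested sketch of the same producer/consumer using a
threading.Condition, so the consumer sleeps until it's woken instead of
polling (the names foo, producer, and consumer just mirror the example
above):

import threading, time, random

foo = []
cond = threading.Condition()

def producer():
    for i in range(10):
        cond.acquire()
        print 'Producing', i
        foo.append(i)
        cond.notify()      # wake the consumer if it's waiting
        cond.release()
        time.sleep(random.random() * 1.0)
    print 'Producer done'

def consumer():
    for count in range(10):
        cond.acquire()
        while not foo:
            cond.wait()    # sleep until the producer signals
        num = foo.pop(0)
        cond.release()
        print 'Consuming', num
    print 'Consumer done'

threading.Thread(target=consumer).start()
threading.Thread(target=producer).start()

The standard library's Queue.Queue wraps this pattern up for you: its
get() blocks until an item is available, so you rarely need to write
the Condition dance by hand.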
-Dave