The Python Way

Skip Montanaro skip at pobox.com
Wed Mar 27 16:01:56 EST 2002


    Joel> In C++ it's common to have a wrapper class that acquires the lock
    Joel> in the ctor and releases it in the dtor, then allocating an object
    Joel> of that class on the stack local to the function.  Is there a
    Joel> Pythonesque version of the same design pattern?  

Yeah, it's pretty easily doable, but it assumes that the appropriate
granularity for the lock is a whole function, that only a single lock needs
acquiring, and that you don't need to release and reacquire the lock
somewhere in the middle, or else that you're willing to artificially break
up a function that needs the lock into pieces which acquire it on entry and
release it on exit ...

    Joel>     class A:
    Joel>         def fn(self,x):
    Joel>             pass

    Joel>     class B(A):
    Joel>         def fn(self,x):
    Joel>             lock.acquire()
    Joel>             try:
    Joel>                 A.fn(self,x)
    Joel>             finally:
    Joel>                 lock.release()

    Joel> Or is this considered poor form?

That's fine form, subject to the considerations I mentioned above.
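
If whole-function granularity really is what you want, another Pythonesque
spelling is a little helper that runs a callable while holding the lock.
This is just a sketch; the locked_call helper and the module-level lock are
names I made up, not anything from Joel's code:

    import threading

    lock = threading.RLock()

    def locked_call(lock, func, *args, **kwds):
        # run func while holding lock, releasing it however the call exits
        lock.acquire()
        try:
            return func(*args, **kwds)
        finally:
            lock.release()

    class A:
        def fn(self, x):
            pass

    class B(A):
        def fn(self, x):
            # same effect as the subclass version above, without repeating
            # the acquire/try/finally boilerplate in every locked method
            return locked_call(lock, A.fn, self, x)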

I have a decent-sized XML-RPC server (about 4000 lines) that was
single-threaded until about a year ago.  Most of the effort in making it
multi-threaded went into locking various data structures.  I settled on
four threading.RLock objects and two Queue.Queue objects.  Amazingly enough,
I don't recall having had any deadlocks or corrupted data.  (Using
Queue.Queue in the right places probably helped there.)
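
The Queue.Queue pieces are just the usual producer/consumer arrangement,
and I suspect that's why they never gave me trouble: the queue does its own
locking internally.  Roughly like this (the names are invented for
illustration, not lifted from the real server):

    import threading
    import Queue

    work_queue = Queue.Queue()

    def handle_request(request):
        # stand-in for the real request dispatch
        print "handling", request

    def worker():
        while 1:
            request = work_queue.get()    # blocks until something arrives
            if request is None:           # sentinel: time to shut down
                break
            handle_request(request)

    t = threading.Thread(target=worker)
    t.start()
    work_queue.put(("getStatus", ()))     # any thread can hand off work safely
    work_queue.put(None)
    t.join()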

Looking back at the code now, most of the lock acquisition is for a single
lock that protects 12 separate small caches.  Code that looks like

    self.cache_lock.acquire()
    try:
        fiddle_some_cache...
    finally:
        self.cache_lock.release()

would look a lot cleaner if I dumped the threading.RLock objects altogether
and just kept each cache in its own Queue.Queue:

    some_cache = self.some_cache_queue.get()
    fiddle_some_cache...
    self.some_cache_queue.put(some_cache)
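
Spelled out a little more (these names are invented, not from the real
server), each cache would ride around in a one-element Queue.Queue; keeping
a try/finally means the cache always gets put back even if the fiddling
raises:

    import Queue

    class Server:
        def __init__(self):
            self.some_cache_queue = Queue.Queue(1)
            self.some_cache_queue.put({})    # the cache itself lives in the queue

        def cached_lookup(self, key, compute):
            some_cache = self.some_cache_queue.get()   # other threads block here
            try:
                if key not in some_cache:
                    some_cache[key] = compute(key)
                return some_cache[key]
            finally:
                self.some_cache_queue.put(some_cache)  # hand the cache back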

Aahz is always admonishing us to just use Queue.Queue.  Maybe it's time
I paid a bit closer attention to what he's been saying...

someday-i'll-learn-ly, y'rs,

-- 
Skip Montanaro (skip at pobox.com - http://www.mojam.com/)



