network server threading

Jeremy Jones zanesdad at bellsouth.net
Wed Aug 18 13:46:24 CEST 2004


James R. Saker Jr. wrote:

>I've got a: "start server thread > Queue object, start server thread <>
>Queue object, start parsing client < Queue object" application that's
>got me puzzled. Probably an easy threads issue, but after digging thru
>Programming Python and Python Recipes sections on Threading class and
>running thru the examples, I'm still missing something.
>  
>
Are you trying to spawn multiple threads (i.e., in the same process), 
multiple processes, or a combination of both?

>My Server/Server/Client app is a syslog collector app (syslog input in,
>zope control web interface to manage ala start/stop/status, and BEEPy
>out) - two servers and one client app that has got me somewhat puzzled
>per how to handle with threads. The app:
>
>1. read in config file
>2. launch a syslog server to listen on port 514, taking input and
>putting it into a database-persistent Queue (I'm using BDB's DB_QUEUE
>implemented in a class Logfile which I've created to allow me to:
>  
>
If you are using multiple processes rather than multiple threads, I 
would recommend against pointing the multiple processes at a database 
file (or any file, for that matter).  That just gives me the heebie 
jeebies thinking about it.  You run too great a risk that two 
processes could try to update the database file at the same time, which 
would be disastrous.  Or, two of the processes could pull the same thing 
out of the queue at the same time, which could be just as disastrous.  So, 
if you're passing the queue object among multiple threads, you may be 
OK, assuming that the library is thread safe (which is not necessarily 
a safe assumption).
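
One way to hedge against a library that may not be thread safe is to 
serialize every queue operation behind a single lock.  Here is a rough, 
self-contained sketch; LockedLogfile and its plain-list backend are 
stand-ins for your Logfile class and its BSD DB storage, not the real 
thing:

```python
import threading

class LockedLogfile:
    """Illustrative stand-in for a Logfile-like queue: every
    operation is serialized by one lock, so concurrent threads
    cannot interleave a push with a pop.  A plain list stands in
    for the BSD DB backend."""
    def __init__(self):
        self._lock = threading.Lock()
        self._items = []

    def push(self, rec):
        self._lock.acquire()
        try:
            self._items.append(rec)
        finally:
            self._lock.release()

    def pop(self):
        self._lock.acquire()
        try:
            return self._items.pop(0)
        finally:
            self._lock.release()

    def hasrec(self):
        self._lock.acquire()
        try:
            return len(self._items) > 0
        finally:
            self._lock.release()

    def size(self):
        self._lock.acquire()
        try:
            return len(self._items)
        finally:
            self._lock.release()
```

Note that a lock like this only protects threads inside one process; it 
does nothing for multiple processes writing to the same file.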

>	l = Logfile(db_filename, db_recsize) ## BSD DB's Queue has a fixed
>record length limitation which gets set by db.set_re_len
>
>	and then
>
>	l.push("Syslog data.....")
>	l.push("More syslog data...")
>
>	as well as l.pop(), l.hasrec() and l.size() methods to control
>	the Queue.
>
>3. runs a test message to see if I've launched the first server and am
>ready to do more work (this is where the second server and client will
>come in once I'm crawling along).
>
>The problem I run into is that I'm apparently not threading the syslog
>server and returning control to my app:
>
>class Syslogd(ThreadingUDPServer, InterruptibleServer):
>
>is based on:
>http://www.drbeat.li/pycgi/webutil.py/html?py=py/syslogd.py.txt
>with modifications to reference my Berkeley DB Queue, instead of
>displaying to sys.stdout.write. So far, so good - I'm logging and
>writing to the database (though it appears to be committing on the
>db.close() rather than writes, but that's another issue I'll have to
>deal with via the BSD DB C/Java docs since the Python docs on its Queue
>method are limited - I've looked at ZODB for this as well).
>
>Here's where I get into trouble, and as mentioned at the beginning, is
>probably just a lack of me getting threads:
>
>if __name__ == '__main__':
>
>    try:
>        log = Logfile('syslog.db',255) # create BSD DB Queue object
>        syslogd = Syslogd(log, timefmt='%H:%M:%S') #create syslog object
>        syslogd.serve() #start serving syslog input on port
>  
>
OK - what you've just done here with the .serve() method is start the 
syslogd server in the process's main thread of control.  Is that what 
you really want to do?  It's not a bad thing if that's what you 
intended.  But from the above, it sounds like you want to spawn a 
thread to run the syslogd server, another thread to run the second 
server, and maybe a third thread to pull items off the queue, parse 
them, and (I guess) do something with them.  And, actually, since 
you are subclassing InterruptibleServer, you probably do want to spin 
this off into another thread (keeping a reference to it in the current 
thread so you can call its "stop" method when you're ready to).
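
To make that concrete, here is a minimal sketch of the pattern.  
StoppableServer is a made-up stand-in for your Syslogd / 
InterruptibleServer: serve() polls a flag the way an interruptible 
server would check between requests.

```python
import threading
import time

class StoppableServer:
    """Made-up stand-in for a Syslogd/InterruptibleServer.
    serve() loops until another thread calls stop()."""
    def __init__(self):
        self._stop = threading.Event()

    def serve(self):
        while not self._stop.is_set():
            # a real server would handle one request (with a
            # short timeout) per pass through this loop
            time.sleep(0.01)

    def stop(self):
        self._stop.set()

server = StoppableServer()
t = threading.Thread(target=server.serve)
t.daemon = True
t.start()                 # serve() now runs in thread t
print("This is a test")   # control returned, so this line is reached
server.stop()             # the saved reference lets us shut it down
t.join()
```

The important part is keeping the reference to server in the main 
thread; that is your only handle for calling stop() later.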

>	print "This is a test"
>    except KeyboardInterrupt:
>        print "Closing queue database..."
>        log.close()
>        print "Operation canceled by user."
>    
>
>I never get to the test print, as once I'm in syslog.serve(), I'm there
>until I quit. Maybe I'm missing the logic here completely - if I want to
>share access to Logfile between syslogd object and two other objects, am
>I on the wrong track? 
>  
>
I'd have to see more of your code to be sure, but it's looking like the 
answer is, "probably."

>Eventually, after threading syslog.serve(), I need to do the same with
>the Zope HTTP service (which allows control to the application, much
>like HTTP to a Linksys router for status, configuration, start, stop
>functions), and also launching the BEEPy parsing of data from the Queue
>which gets passed onto a database upstream. Talking BEEP is a must due
>to the firewalls involved, so no distributed object approach will help
>here. A "syslogd in, dump to queue backed by database in case device
>gets shut off before it can parse and send, and a parse & send via BEEP
>engine" model.
>  
>
<shameless_plug>If you need a threadsafe, persistent queue library, you 
may want to check out Munkware:
http://munkware.sourceforge.net/
http://sourceforge.net/projects/munkware
If you need a queue to run in a threading server that will field 
multiple requests, you may want to get Munkware from CVS as I've just 
wrapped the base library in an XMLRPCServer.
</shameless_plug>

>Maybe there's a better example out there of:
>1. start server dumping to a shared object
>2. start another server accessing a shared object
>3. start a client processing the shared object
>
>model someone is aware of - ala Queue? 
>  
>
I don't have an example handy of using one of the SocketServers, but 
here is an example that spawns some threads to put items onto a shared 
queue and more threads to take items off the same shared queue:

################################################################
#!/usr/bin/python

'''
This script will create:
        10 consumer threads
        10 producer threads
Each producer will put something on the queue then sleep for 2 seconds.  
The consumers are busy getting stuff all the time.  I've put some dialog 
in so that each thread will tell what it is doing.
'''

import threading
import Queue
import time


class Putter(threading.Thread):
    def __init__(self, q, num):
        self.q = q
        self.num = num
        threading.Thread.__init__(self)
    def run(self):
        counter = 1
        prompt = "p-%s >" % self.num
        while 1:
            print "%s Putting..." % prompt
            self.q.put("Test %s::%s" % (self.num, counter)) # use self.q, not the global
            print "%s Put %s" % (prompt, counter)
            time.sleep(2)
            counter = counter + 1

class Getter(threading.Thread):
    def __init__(self, q, num):
        self.q = q
        self.num = num
        threading.Thread.__init__(self)
    def run(self):
        prompt = "g-%s >" % self.num
        while 1:
            print "%s Getting...." % prompt
            item = self.q.get() # use self.q, not the global
            print "%s Got %s" % (prompt, str(item))

if __name__ == "__main__":
    q = Queue.Queue()
    for foo in range(10):
        pt = Putter(q, foo)
        pt.start()
    for foo in range(10):
        gt = Getter(q, foo)
        gt.start()
################################################################
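
And since the question was really about the SocketServers, here is a 
rough sketch of a threading UDP server that drops each datagram onto a 
shared queue - the same shape your syslogd would take.  It uses the 
current stdlib module names (socketserver and queue; older Pythons 
spelled them SocketServer and Queue) and binds an ephemeral localhost 
port instead of 514 so it can run unprivileged:

```python
import queue
import socket
import socketserver
import threading

log_q = queue.Queue()   # shared queue, standing in for the Logfile

class QueueingHandler(socketserver.BaseRequestHandler):
    """Each incoming UDP datagram goes onto the shared queue."""
    def handle(self):
        data, _sock = self.request            # a UDP request is (bytes, socket)
        log_q.put(data.decode("utf-8", "replace"))

# Port 0 asks the OS for a free ephemeral port, so this runs
# without the root privilege that port 514 would need.
server = socketserver.ThreadingUDPServer(("127.0.0.1", 0), QueueingHandler)
t = threading.Thread(target=server.serve_forever)
t.daemon = True
t.start()

# Pretend to be a syslog sender.
host, port = server.server_address
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"<13>test message", (host, port))

msg = log_q.get(timeout=5)    # the consumer side sees the datagram
sock.close()
server.shutdown()
server.server_close()
t.join()
```

A consumer thread like the Getter above would then sit on log_q.get() 
and parse whatever the server feeds it.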

You may want to check out Aahz's excellent thread tutorial.  It can be 
found at http://starship.python.net/crew/aahz/OSCON2001/index.html.

Hope this helped,

Jeremy


>Thanks much... 
>
>Jamie
>"So much to learn, so little caffeine!"
>
>
>  
>



