[Tutor] Threads

Kent Johnson kent37 at tds.net
Tue Nov 16 01:46:23 CET 2004


OK, here is a simple threaded app. It uses worker threads to fetch data 
from URLs. There are two queues - one to pass jobs (URLs) to the worker 
threads, another to return results to the main program. The main program 
keeps a count of how many requests are still outstanding; when none are 
left it exits. The worker threads are marked as daemon threads, so the 
program can exit without explicitly stopping them.

IMO short and sweet, and it just uses standard Python.

Kent

####################################

import Queue, threading, urllib2

# jobQ passes URLs to the worker threads
jobQ = Queue.Queue()

# resultsQ returns results to the main thread. A result is a triple of
#   - the original URL
#   - a boolean indicating success or failure
#   - the contents of the URL on success, or an error message on failure
resultsQ = Queue.Queue()

urls = [
    'http://www.google.com',
    'http://www.python.org',
    'http://www.apple.com',
    'http://www.kentsjohnson.com',
    'http://www.foobarbaz.com/'
]

class Worker(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.setDaemon(True)

    def run(self):
        while True:
            url = jobQ.get()
            try:
                data = urllib2.urlopen(url).read()
            except urllib2.URLError, msg:
                resultsQ.put((url, False, msg))
            else:
                resultsQ.put((url, True, data))

def main():
    # Start two worker threads
    Worker().start()
    Worker().start()

    # Push jobs on the job queue
    for url in urls:
        jobQ.put(url)

    # Read results from the results queue
    count = len(urls)
    while count > 0:
        url, flag, msg = resultsQ.get()
        if flag:
            print url, 'read', len(msg), 'bytes'
        else:
            print "Couldn't read from", url, msg

        count -= 1

main()
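
For readers on Python 3, the same two-queue pattern translates almost 
directly. Below is a minimal sketch of that adaptation, assuming the 
renamed queue module, urllib.request/urllib.error in place of urllib2, 
and the daemon= keyword argument of threading.Thread. Queue objects do 
their own locking, so the workers and the main thread never share state 
directly.

import queue
import threading
import urllib.request
import urllib.error

jobQ = queue.Queue()      # URLs waiting to be fetched
resultsQ = queue.Queue()  # (url, success_flag, data_or_error) triples

urls = [
    'http://www.google.com',
    'http://www.python.org',
]

class Worker(threading.Thread):
    def __init__(self):
        # daemon=True lets the program exit while workers are still looping
        super().__init__(daemon=True)

    def run(self):
        while True:
            url = jobQ.get()
            try:
                data = urllib.request.urlopen(url).read()
            except urllib.error.URLError as err:
                resultsQ.put((url, False, err))
            else:
                resultsQ.put((url, True, data))

def main():
    # Start two worker threads
    for _ in range(2):
        Worker().start()

    # Push jobs on the job queue
    for url in urls:
        jobQ.put(url)

    # Read one result per job from the results queue
    for _ in range(len(urls)):
        url, ok, payload = resultsQ.get()
        if ok:
            print(url, 'read', len(payload), 'bytes')
        else:
            print("Couldn't read from", url, payload)

main()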

