Building a new app in python and need some architecture advice.

Eddie Corns eddie at holyrood.ed.ac.uk
Wed Dec 10 12:22:19 EST 2003


Sam Watson <Sam watson at yahoo.com> writes:

>>Since web browsers are so familiar to everyone it's still worth considering
>>(assuming it can hack your 'rich' GUI client needs).  I've written a couple of
>>data entry web apps now based on BaseHTTPServer, one with state one without.
>I hadn't given the web app idea a whole lot of thought, only because my
>experience with doing data entry with web pages has been pretty
>dissatisfying.

Depends on the app of course; a really demanding interface is difficult to do
well even with javascript etc., but anything from simple to mildly complex can
be done well.  Only you can judge that.

>>pretty damn secure.  If you want to secure the app itself from snooping
>>etc. then you might be able to use SSL on top of that (not sure, it's on my
>>list of things to find out about).  It's simple enough that you can get useful
>>apps together quite quickly.  For evidence of larger apps doing similar
>>things, I believe roundup, which is a good system, works in the same way (but
>>is obviously a lot bigger).
>>
>So Apache is pretty easy to lock down?  My fear is I don't configure
>Apache correctly and leave something wide open.

I should probably have been more explicit: when I say no external software, I
mean not even Apache (or any other web server).  Applications like Roundup
*can* use Apache, but they can also run stand-alone.  Basically you have a
single program using BaseHTTPServer, which listens for HTTP requests and calls
the appropriate method (do_GET, do_POST etc.) that you have overridden in your
handler subclass.  So when the user enters a URL like http://blah/index, you
take the path (after the host identifier) and go "Aha, he wants the index -
I'll send him the HTML for the index page".  Or take e.g. /form1, an HTML page
that will probably contain a POST form; when that POST comes in, your do_POST
method gets called, pulls the fields out (using standard library functions)
into a dictionary, and you use that data.
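To make the path-taking and field-pulling concrete, here's a sketch outside of
any server.  The helper names are just illustrative, not from a real app; the
parsing itself is all standard library:

```python
try:                                    # Python 3
    from urllib.parse import parse_qs
except ImportError:                     # Python 2, as used in this post
    from urlparse import parse_qs

def route (path):
    """Take the path after the host identifier, e.g. '/form1' -> ['form1']."""
    return path.split('/')[1:]

def post_fields (body):
    """Pull the submitted fields out of a form-encoded body into a dict."""
    # parse_qs gives each field as a list; keep just the first value
    return dict((k, v[0]) for k, v in parse_qs(body).items())

# route('/form1')                  -> ['form1']
# post_fields('name=Sam&app=demo') -> {'name': 'Sam', 'app': 'demo'}
```

In a real do_POST you'd read the body from self.rfile (Content-Length bytes)
and feed it through something like post_fields.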

This is all a bit abstract, so here's some code from a proof of concept I
made.  When a user calls http://my.machine:8000/scan it runs nmap on them
(based on their IP address) and sends the result back.  I don't have a short
example with POST yet.

import BaseHTTPServer, os

# utils for making HTML
def wrap (tag, txt):
    return '<%s>%s</%s>' % (tag, txt, tag)

def mk_row (*cols):
    return wrap ('TR', ''.join ([wrap ('TD',a) for a in cols])) + '\n'

# do the biz
def scan_host (ip):
    x = os.popen ('nmap %s' % ip)
    res = x.read()
    x.close()
    return res

class Handler (BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET (self):
        # Fork so the slow nmap run doesn't block the server: the parent
        # returns straight to the accept loop, the child answers the request.
        pid = os.fork()
        if pid != 0:
            # parent: reap any children that have already finished
            try:
                while os.waitpid (-1, os.WNOHANG)[0] > 0:
                    pass
            except OSError:
                pass
            return
        args = self.path.split('/')[1:]
        if args and args[0] == 'scan':
            self.doscan()
        else:
            self.send_error (403, 'Away and raffle')
        os._exit (0)                    # child is done, don't fall back into the server

    def doscan (self):
        ip = self.client_address[0]
        dump = self.wfile.write

        self.send_response (200, "Script output follows")
        self.send_header ('Content-type', 'text/html')
        self.end_headers()

        dump ('<Table border>\n')
        dump (mk_row ('host', ip))
        dump ('</Table>\n')

        dump ('<pre>\n')                # pro tem
        dump (scan_host (ip))
        dump ('</pre>')

httpd = BaseHTTPServer.HTTPServer ( ('', 8000), Handler)
httpd.serve_forever()

This code does a fork because of the length of time nmap takes to complete,
but that's optional - I have an app that just handles each request serially,
which makes it easier to avoid update conflicts and to retain state, because
it's the same code running in a loop.
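The fork-then-return pattern, stripped of the HTTP machinery, looks like this
(Unix-only, as is the code above; the helper name is mine, purely for
illustration):

```python
import os

def run_in_child (work):
    """Fork; the child runs work() and exits, the parent returns at once."""
    pid = os.fork()
    if pid == 0:                        # child
        try:
            work()
        finally:
            os._exit (0)                # never return into the parent's flow
    return pid                          # parent: carry straight on

# demo: the child writes down a pipe while the parent stays free
r, w = os.pipe()
pid = run_in_child (lambda: os.write (w, b'done'))
os.waitpid (pid, 0)                     # a server would skip this wait
os.close (w)
result = os.read (r, 4)                 # -> b'done'
```

The server version is the same shape: the parent's "carry straight on" is the
return to serve_forever's accept loop.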

Hope that isn't too confusing.

Eddie



