BaseHTTPServer fails POST requests: disconnected by peer; maybe sockets bug? help!?

Molly iajava at yahoo.com
Mon Aug 25 08:08:44 CEST 2003


Peace, Pythonphiles!

Python's std lib BaseHTTPServer: I'm stuck on a nasty snag -- I get an
erratic "Network error: disconnected by peer", but for _POST_ requests
only.

=Background=

I'm developing a Python CGI app. I was using Xitami, which is really
light and nice, with reasonable performance (circa 200-300 msec to
respond, I guess mostly Python process startup; localhost; Win95
(retch!)).

But the deployed site is the usual Apache/Linux monstrosity, and I
wanted to use mod_rewrite to export nice URLs. Xitami is inflexible
about associating URLs with scripts.

So I tried hacking python22/Lib/SimpleHTTPServer.py to "simulate"
Apache's mod_rewrite. It's unbelievably easy!
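(To give a flavor of what I mean by "simulate" -- the whole hack boils
down to a little rewrite table checked before dispatch. A sketch; the
names `REWRITE_RULES` and `rewrite` are mine for illustration, not
from the actual code:

```python
import re

# Hypothetical rewrite table, in the spirit of Apache's mod_rewrite:
# each entry maps a pretty-URL pattern to the real CGI script path.
REWRITE_RULES = [
    (re.compile(r'^/articles/(\d+)$'), r'/cgi-bin/my_cgi.py?id=\1'),
    (re.compile(r'^/$'),               r'/cgi-bin/my_cgi.py'),
]

def rewrite(path):
    """Return the internal path for a pretty URL, or the path unchanged."""
    for pattern, replacement in REWRITE_RULES:
        if pattern.match(path):
            return pattern.sub(replacement, path)
    return path
```

In the request handler subclass you'd then just do
`self.path = rewrite(self.path)` before dispatching.)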

=The problem=

GET requests work fantastically: 40-100 millisecs to render the
response HTML -- I was really surprised!

But POST requests fail: the script runs OK and the response is fully
generated and written to stdout, but somewhere during rendering
Netscape aborts with that "Network error: lost connection,
disconnected by peer".

The failure is erratic: sometimes not a single character is displayed
before the disconnect, and sometimes the entire page renders _except
the last line_.

These tools are of course buggy as hell, and I've learned to "live"
with their limitations over the years, but this one really gets me
down: I want to use Python's *HTTPServer to experiment with FCGI and a
standalone, dedicated Web server for this application, the performance
was so nice, and _I haven't a clue as to why this strange behavior
occurs_!

Help!!!

=Details=

Invoking script:

   sys.stdout = self.wfile
   sys.stdin = self.rfile  
   execfile( "my_cgi.py", {} )

with both in/out set to _no buffering_:

   rbufsize = 0
   wbufsize = 0
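(Those are just class attributes overridden on the handler subclass --
roughly like this, modernized to Python 3's http.server for
illustration; in 2.2 the module is BaseHTTPServer:

```python
import http.server  # BaseHTTPServer in Python 2.x

class CGISimHandler(http.server.BaseHTTPRequestHandler):
    # Buffer sizes for the makefile() wrappers around the socket:
    rbufsize = 0   # unbuffered reads from the socket (self.rfile)
    wbufsize = 0   # unbuffered writes to the socket (self.wfile)
```

The class name `CGISimHandler` is made up for the sketch.)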

(Actually, more like 

   sys.stdout = cgi_output = StringIO.StringIO()   
   #...
   self.send_response( status )
   self.wfile.write( cgi_output.getvalue() )

so I can (keep a log and) scan the script's output for a "Status:"
header, aka "parsed-headers" CGI -- I'm simulating Apache, right?)
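(The "scan" itself is nothing fancy: split the captured output at the
blank line that ends the headers, look for a Status: line, default to
200. A sketch -- `parse_status` is my name for it, and the real code
differs:

```python
def parse_status(cgi_output):
    """Scan CGI output headers for 'Status: NNN ...'; default to 200.

    cgi_output is the whole string captured via StringIO: header
    lines, a blank line, then the response body.
    """
    head, _sep, _body = cgi_output.partition('\n\n')
    for line in head.splitlines():
        if line.lower().startswith('status:'):
            return int(line.split(':', 1)[1].split()[0])
    return 200
```

The status found this way is what gets passed to send_response().)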

I tried pausing just before exiting the request handling call:

   time.sleep( 25 )   # Trying to get smart with bug?

and it _sometimes_ helps! I.e., Netscape renders the page, then gives
up waiting for the connection to close, I guess, so... there.
But it's unreliable.

=Guess?=

I've no idea where to look -- what's different between a GET and
POST?!
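(One concrete difference I can think of: a POST has a request body
sitting unread in rfile. Could leftover unread bytes on the socket be
upsetting the connection teardown? Pure guesswork -- but a sketch of
draining the body, with made-up names, would look something like:

```python
import io

def drain_post_body(headers, rfile):
    """Read exactly Content-Length bytes of the POST body so no
    unread data is left on the socket when the response goes out.
    'headers' is a dict-like view of the request headers; 'rfile'
    is the handler's input stream.  (My guess at a fix, untested.)"""
    length = int(headers.get('Content-Length', 0))
    return rfile.read(length) if length > 0 else b''
```

In the handler that would be called before writing the response; on
Python 2.2 the headers object is a mimetools.Message, so the lookup
spelling differs.)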

It's probably some Micrapsoft buffoonery with sockets.
I'd really like to patch this somehow.

Any suggestions about how to find the exact point of failure?


Thanks!
-- M



