Question on Socket Timeouts
cs at zip.com.au
Mon Nov 19 04:40:54 CET 2012
On 18Nov2012 03:27, Abhijeet Mahagaonkar <abhi.forall at gmail.com> wrote:
| I'm new to network programming.
| I have a question.
| Can we set a timeout to limit how long a particular socket can read or
On the socket itself? Probably not. But...
| I have used a settimeout() function.
| The settimeout() works fine as long as the client doesn't send any data for
| x seconds.
| After accept()ing a connect() from a client, I check whether the data the
| client sends is valid.
| I'm trying to ensure that a client constantly sending invalid data cannot
| tie up the server. So is there a way of saying I want the client to use this
| socket for x seconds before I close it, no matter what data I receive?
Note the time you set up the socket, or when you accept the client's
connection. Thereafter, every time you get some data, look at the clock.
If enough time has elapsed, close the socket yourself.
So, not via an interface to the socket but as logic in your own code.
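A minimal sketch of that idea, assuming a hypothetical serve_client() helper
that the server calls with the connection returned by accept(). The settimeout()
call only bounds each individual recv(); the total lifetime is enforced by
comparing against a deadline on every wakeup:

```python
import socket
import time

def serve_client(conn, max_session=10.0, poll_timeout=1.0):
    """Close conn once max_session seconds have elapsed since accept,
    no matter how much (or how little) data the client sends."""
    deadline = time.monotonic() + max_session
    conn.settimeout(poll_timeout)        # bounds each individual recv()
    try:
        while True:
            if time.monotonic() >= deadline:
                break                    # total budget used up: drop the client
            try:
                data = conn.recv(4096)
            except socket.timeout:
                continue                 # client was silent; re-check the clock
            if not data:
                break                    # client closed its end
            # ... validate / process data here ...
    finally:
        conn.close()
```

The poll_timeout keeps the loop from blocking forever on a silent client, and
time.monotonic() is used instead of time.time() so the deadline is immune to
wall-clock adjustments.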
Cameron Simpson <cs at zip.com.au>
Their are thre mistakes in this sentence.
- Rob Ray DoD#33333 <rray at linden.msvu.ca>