Re: [Twisted-Python] conceptually, why a deferred can't be used more than once?
Hi Jean
Concerning the concept of the deferred, why is it more useful to go with a deferred which gets consumed and can only be fired once?
I can't speak for the Twisted devs, but my #1 reason would be that it's just simpler that way (and Deferreds are already complex enough).
In my small script I realize I need to take special care that the deferred has not already been used, and that I must explicitly recreate a deferred for each network request. In a parallel world someone might have come up with a deferred concept which happily fires the callback as many times as there is data coming back from the server. Is it a dumb idea?
No, I don't think so. I've done some thinking and playing along these lines. At one point I considered a Deferred class that could be called multiple times, but decided the code changes were too ugly.

Recently I did some experiments in which Deferreds could be re-used. The first observation was that the time taken to create Deferreds is negligible, so just re-using an already allocated object doesn't save you much. More interestingly, adding call/errback functions to a Deferred does take significant time. I don't have the details of what I did handy, but from memory I managed to get about a 2x speedup in a simple test when about 5 call/errbacks were being added to each Deferred.

I wrote a class that allowed you to reset a Deferred, leaving the call/errback chain on it (snipping off other things that might have been added to the chain), and there was also a DeferredPool class that could hand you back an already prepared Deferred (or make you a new one if the pool was empty). I.e., if you're in a situation where you're repeatedly using Deferreds and putting exactly the same call/errbacks on them, you might get a win by resetting and reusing already fired Deferreds. On the other hand, you'd also be a social pariah :-)

If anyone wants more details, I'm happy to dig up my experimental code. It was fun and it seemed to work just fine.
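To make the idea concrete, here is a minimal sketch of what a resettable deferred-like object might look like. This is purely hypothetical illustration code, not Twisted's API and not the experimental code mentioned above: the class name `ResettableDeferred` and its methods are invented for this example, and it omits errbacks, chaining, and everything else a real Deferred does. The point is just that the callback chain survives firing, so `reset()` lets the same instance be fired again:

```python
# Hypothetical sketch (NOT twisted.internet.defer.Deferred): a deferred-like
# object whose callback chain is kept after firing, so the instance can be
# reset and re-fired without re-adding the same callbacks each time.

class ResettableDeferred:
    def __init__(self):
        self.callbacks = []   # the chain is kept across firings
        self.fired = False
        self.result = None

    def addCallback(self, fn):
        self.callbacks.append(fn)
        return self

    def callback(self, result):
        if self.fired:
            raise RuntimeError("already fired; call reset() first")
        self.fired = True
        for fn in self.callbacks:
            result = fn(result)
        self.result = result
        return result

    def reset(self):
        # Clear the fired state and result, but leave the chain attached --
        # this is where the saving comes from if you'd otherwise pay to
        # re-add the same ~5 call/errbacks on every fresh Deferred.
        self.fired = False
        self.result = None


d = ResettableDeferred()
d.addCallback(lambda x: x * 2).addCallback(lambda x: x + 1)
print(d.callback(5))   # 11
d.reset()
print(d.callback(10))  # 21 -- same chain, no re-adding of callbacks
```

A pool on top of this would just be a list of already-configured instances that hands one out (or builds a new one) on request, resetting it on return.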
Could the deferred design be part of the solution to the network problem of two requests passing each other, where each end is not yet aware that the other has just sent a request? Buggy network nodes would expect a response but get a request instead and go crazy...
OK, I'll leave that one for someone else... T
participants (1): Terry Jones