
I did some benchmarking with the last patch applied, and there's no longer a significant benefit from the load balancer. Without caching (at 6-7 requests per second) load balancing was a net 100% improvement; now it's down to a 23.5% improvement in requests per second:

    ab2 -n 1000 -c 100 localhost:8080/ | grep 'Requests'
    Requests per second:    224.73 [#/sec] (mean)
    ab2 -n 1000 -c 100 localhost:8081/ | grep 'Requests'
    Requests per second:    225.18 [#/sec] (mean)
    ab2 -n 1000 -c 100 localhost:8079/ | grep 'Requests'
    Requests per second:    278.22 [#/sec] (mean)

The load balancer was on port 8079, balancing the load across 8080 and 8081 on a two-way box through the loopback interface. It would probably make more of a difference on a 4-way, since the load balancer itself takes some CPU.

The nice thing is that the page above is still completely dynamic, but it now changes at most once every 10 seconds, which is exactly what I need.

Overall this is a much superior approach, and much simpler to set up than reverse cache proxying IMHO. It should perform better too, since it avoids the two context switches between proxy -> server -> proxy.

So the next improvement on top of this would be to replace compy with zope.interfaces for the very dynamic SSL part that cannot be cached at all.

Thanks! ;)
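For what it's worth, the "changes at most once every 10 seconds" behavior can be sketched as a simple time-based cache around the page renderer. This is only an illustration of the idea, not the actual patch; `ttl_cache` and `render_page` are hypothetical names I made up for the sketch:

```python
import functools
import time

def ttl_cache(ttl_seconds):
    """Cache a zero-argument function's result for ttl_seconds.

    Hypothetical sketch of the caching idea discussed above,
    not the code from the patch itself.
    """
    def decorator(fn):
        cached = {"value": None, "expires": 0.0}

        @functools.wraps(fn)
        def wrapper():
            now = time.monotonic()
            if now >= cached["expires"]:
                # Regenerate the page at most once per ttl_seconds.
                cached["value"] = fn()
                cached["expires"] = now + ttl_seconds
            return cached["value"]

        return wrapper
    return decorator

@ttl_cache(10)
def render_page():
    # Stand-in for the expensive dynamic page generation.
    return "<html>generated at %f</html>" % time.monotonic()

# Repeated requests within the 10-second window get the cached page,
# so each backend serves mostly-static content between regenerations.
first = render_page()
second = render_page()
assert first == second
```

The page stays fully dynamic from the client's point of view; only the regeneration frequency is capped, which is why the per-backend throughput gets close to the balanced throughput in the numbers above.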