Load testing HTTP server with large number of concurrent connections

Tags: benchmarking, http, load-testing

I'm trying to load test/benchmark an HTTP server with a very large number of simultaneous connections (10k-100k). What is a good procedure for doing this? On Linux I've seen that both the client and host will likely need their limits on the number of permitted threads explicitly raised.
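On Linux, the first ceiling you usually hit is per-process open file descriptors rather than threads, since every open socket consumes a descriptor. A minimal sketch of inspecting and raising the relevant limits on both client and server (the numeric values are illustrative, not recommendations):

```shell
# Per-process limits in the current shell:
ulimit -n   # max open file descriptors (one per socket)
ulimit -u   # max user processes/threads

# Raise the descriptor limit for this shell before starting the
# generator or the server (pick a value above your target connection
# count; raising the hard limit may need root or an entry in
# /etc/security/limits.conf):
#   ulimit -n 131072

# System-wide descriptor ceiling:
#   sysctl -w fs.file-max=200000
```

These limits need raising on every machine involved, generators included, or the client will start failing before the server does.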

Also, does anyone have a good feel for how many client machines are needed to test 10k and up connections? Is one machine enough, or is there typically a cap on the number of sockets a single machine can handle?
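There is a hard per-client cap worth knowing about: each outbound connection to the same server ip:port needs a distinct local ephemeral port, so one source IP tops out below 64k concurrent connections to a single target, and at the Linux default range, well below that. A rough sketch of the arithmetic, assuming a typical default range (check `/proc/sys/net/ipv4/ip_local_port_range` on your own box):

```shell
# Typical Linux default ephemeral port range (an assumption here;
# your kernel may differ):
low=32768; high=60999

# Max concurrent connections from one source IP to one
# destination ip:port pair:
echo $((high - low + 1))
```

So a single client can plausibly handle 10k+ connections, but approaching the per-target limit you either widen the range (`sysctl -w net.ipv4.ip_local_port_range="1024 65535"`) or add source IPs/machines.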

I'm currently using nperf to generate load. I've been successful up to around 1-2k concurrent requests, after which some of the requests come back failed. I'm not sure whether the failures are on the server side or the client side, and I'm reluctant (lazy) to provision more machines as clients if the issue can be resolved with the one I have.
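One way to start separating client-side from server-side failures is to tally the generator's per-request results rather than just its summary line. A hedged sketch, assuming a hypothetical log format of one `status_code latency_ms` pair per line in `results.log` (adapt the field positions to whatever your tool actually emits):

```shell
# Count successes (2xx) vs everything else; connection errors that
# never received a status code would land in the failure bucket:
awk '{ if ($1 >= 200 && $1 < 300) ok++; else fail++ }
    END { printf "ok=%d fail=%d\n", ok, fail }' results.log
```

Failures with 5xx codes point at the server; failures with no status code at all (connection refused, timeouts) point at resource exhaustion on either end and warrant checking the client's limits first.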

Best Answer

Without more details about your set-up, I'm tempted to say that you simply need to try it out.

The best way to benchmark your load generators is to run an easy load against a very light test page on your server (something that causes the least possible load on the HTTP server). This tests your set-up's ability to generate the load, so you can decide how many generators you need. Monitor the generators, particularly CPU, network, and the timeliness of each request (i.e. are they all sent out on schedule, or do they build up and lag behind?). Check the error logs on your generators, and check for dropped requests.
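For the "sent on schedule?" check, a quick sanity calculation against the generator's own totals is often enough. A sketch with illustrative numbers (the rate, duration, and reported sent count below are assumptions, not measurements):

```shell
# If the generator targets $rate req/s for $duration seconds, it
# should have issued rate*duration requests; anything less means it
# fell behind schedule on the client side:
rate=1000; duration=30
sent=29100   # total requests as reported by your tool

expected=$((rate * duration))
lag=$((expected - sent))
echo "lag: $lag requests ($((lag * 100 / expected))%)"
```

A lag of more than a few percent means the generator, not the server, is the bottleneck at that load level.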

Then you can move on to your real test. You may need to adjust your load generators' set-up depending on how complex your test case is, but the benchmark above will give you a good starting point.
