Rate-limit POST requests

apache-2.2, cache, performance-tuning, varnish

I'm running a large WordPress multiuser site with a Varnish cache in front of the WordPress application server. Since it makes no sense to cache POST requests, I am vulnerable to a DDoS that fires lots of POSTs at the Varnish cache server.

I have tried setting up a firewall rule that only accepted 20 simultaneous connections from each client, but that affected users sitting behind shared proxies and schools with lots of users behind the same gateway.
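
For illustration, a per-client connection cap of that kind can be expressed with the iptables connlimit module along these lines (an illustrative rule, not necessarily the exact one I used):

    # Reject new HTTP connections from any single IP that already has
    # more than 20 connections open to port 80.
    iptables -I INPUT -p tcp --dport 80 --syn \
        -m connlimit --connlimit-above 20 -j REJECT --reject-with tcp-reset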

Varnish does not have an option to rate-limit the number of POSTs, and I want to apply the rate limit BEFORE the requests hit the application server. Is there a small transparent proxy that would do the job?

Currently Varnish is receiving about 100-150 hits/second, so the proxy should at least be able to handle that load.

Best Answer

I am going to put this down as an answer only because it's going to be long. As you quite accurately said, blocking more than 20 concurrent requests from one IP won't do the trick; you have to set "smarter" criteria. I will go as far as to say that putting another proxy between a proxy and an application server is neither elegant nor useful.

I don't know why you want to do the rate-limiting before the requests hit Apache, because that way you are missing out on fail2ban (see the sketch below), mod_qos, mod-antiloris (highly specific) and other solutions. Moreover, I don't know whether POST requests are your only problem in terms of a DDoS.
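
As a rough illustration of the fail2ban route (untested, with placeholder names, paths and thresholds to adapt), a jail that bans an IP sending an unreasonable number of POSTs to the login page could look like this:

    # /etc/fail2ban/jail.local -- "wordpress-post-flood" is a made-up jail name
    [wordpress-post-flood]
    enabled  = true
    port     = http,https
    filter   = wordpress-post-flood
    logpath  = /var/log/apache2/access.log
    findtime = 60
    maxretry = 60
    bantime  = 600

    # /etc/fail2ban/filter.d/wordpress-post-flood.conf
    [Definition]
    failregex = ^<HOST> .* "POST /wp-login\.php
    ignoreregex =

The thresholds are the interesting part: tune findtime/maxretry so that real users behind a school NAT never come close to the limit while a single flooding source still trips it quickly.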

Caching POST request replies is possible and can make sense as well, unless you are serving dynamic content every single time. That of course doesn't mean that you can cache authenticated pages.

You can apply request time-outs for POSTs if the request has taken more than 5s, although users with slow connections won't be able to POST anything. You can also apply rules on a per-URL basis, combined with the above. This makes sense, since it is reasonable for a user to POST 1000 KB on a file-upload page but not on the login page (see the Apache sketch below). As I said, create "smarter" criteria. They might be long and it might take some time to formulate them, but they will provide you with a sustainable solution, as I don't know of a one-size-fits-all fix in this kind of situation.
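
A minimal sketch of that kind of per-URL policy in Apache, assuming mod_reqtimeout is loaded (Apache 2.2.15+); the URLs and size limits are placeholders for your own endpoints:

    # Limit how long a client may take to send headers and require a minimum
    # body transfer rate, instead of a hard 5s cut-off that hurts slow users.
    RequestReadTimeout header=20 body=20,MinRate=500

    # Tiny body limit on the login page: nobody needs to POST megabytes there.
    <Location /wp-login.php>
        LimitRequestBody 8192
    </Location>

    # Generous limit only where uploads are actually expected.
    <LocationMatch "^/wp-admin/async-upload\.php$">
        LimitRequestBody 10485760
    </LocationMatch>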

Another solution, which you can combine with the above, is an application firewall. It might be more than you need, but it can also keep you safe from many other things. Here is an OWASP page with recommendations and the relevant wiki.

EDIT: I have to admit, I don't have any experience with that configuration (Varnish and antiloris). Sooner or later you will hit the limit of what Varnish can cache (although it is highly "programmable"). The main thing you can do is know the pattern of use and when it deviates from the norm. If you just want to prevent that very specific type of attack, then writing better rules for Varnish should do it (see the sketch below). However, requests hitting Apache are not a bad thing: as long as you have the proper mods/configs in place, Apache can tell when a request is legitimate and should be processed, and when it is not. Blocking a fixed number of connections from each client is NOT a good thing to do unless you can definitely blacklist that IP. You can do that on Apache either through fail2ban (regex) or mod_qos.
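
For completeness, a minimal sketch of what a POST rate limit could look like inside Varnish itself, assuming Varnish 4+ VCL syntax and the vsthrottle vmod from the varnish-modules collection (the key, limit and window are placeholders, and the exact function signature may vary with the vmod version):

    import vsthrottle;

    sub vcl_recv {
        if (req.method == "POST") {
            # Allow at most 10 POSTs per client identity in any 60-second
            # window; excess requests get a synthetic 429 and never reach
            # the backend.
            if (vsthrottle.is_denied("post:" + client.identity, 10, 60s)) {
                return (synth(429, "Too Many Requests"));
            }
            # Never cache POSTs, hand them straight to the application server.
            return (pass);
        }
    }

Note the key: throttling purely on client.identity (which by default is just the client IP) brings back the shared-proxy/school-NAT problem from the question, so in practice you may want to key on a combination of IP and request URL, or something cookie-based.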