I am configuring Fail2Ban on my Ubuntu web server to prevent it from being a victim of DoS / DDoS. I don't want to use Cloudflare because I would have to route my DNS through them and use their SSL cert.
Currently, I found a script online that checks for more than 1 HTTP HEAD request per second, or more than 1 request to xmlrpc.php per second. I don't think it's sufficient protection, as these aren't the only kinds of requests that people can employ to execute a DDoS attack.
I'm looking at restricting the number of GET / POST requests a given IP can make in a short window, but I'm not sure how I should set the restriction, since big pages that load a lot of JavaScript, CSS or images will make a lot of GET requests in a short amount of time. Should I be looking at limiting GET / POST requests, or should I be looking at something else? Why?
Best Answer
It can be hard to tell the good guys from the bad just by checking the rate of requests per second. You would need to analyze logs from your own environment to see how many requests from a single IP address per 5 minutes (for example) are "normal" for your website, before making the final decision.
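To establish that baseline, a quick pass over the access log with awk is usually enough. A minimal sketch, assuming a combined-format Apache/Nginx log where the client IP is the first field (the sample log lines below are made up for illustration):

```shell
# Hypothetical sample access log; in practice point awk at your real
# /var/log/apache2/access.log or /var/log/nginx/access.log.
printf '%s\n' \
  '10.0.0.1 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512' \
  '10.0.0.1 - - [01/Jan/2024:00:00:02 +0000] "GET /app.js HTTP/1.1" 200 900' \
  '10.0.0.2 - - [01/Jan/2024:00:00:03 +0000] "POST /xmlrpc.php HTTP/1.1" 200 100' \
  > access.log

# Count requests per client IP (field 1), busiest IPs first.
awk '{ count[$1]++ } END { for (ip in count) print count[ip], ip }' access.log | sort -rn
# prints:
# 2 10.0.0.1
# 1 10.0.0.2
```

Run this over a typical busy period and note the counts your heaviest legitimate visitors produce; your Fail2Ban threshold needs to sit comfortably above that.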
Once you have figured out the normal rate, it should be possible to count GET and/or POST requests (depending on your log-file analysis) with your script.
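With Fail2Ban itself, that counting is typically done with a custom filter plus a jail whose `maxretry` / `findtime` pair encodes the rate limit. A minimal sketch, where the filter name `http-get-post` and all threshold values are illustrative placeholders you would tune to your measured baseline:

```ini
# /etc/fail2ban/filter.d/http-get-post.conf  (hypothetical filter name)
[Definition]
# Match any GET or POST line in a combined-format access log.
failregex = ^<HOST> -.*"(GET|POST)[^"]*HTTP[^"]*"
ignoreregex =

# Addition to /etc/fail2ban/jail.local:
# ban an IP that makes more than 300 requests in 300 seconds (example values).
[http-get-post]
enabled  = true
port     = http,https
filter   = http-get-post
logpath  = /var/log/apache2/access.log
maxretry = 300
findtime = 300
bantime  = 600
```

You can verify the regex against your real log before enabling the jail with `fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/http-get-post.conf`.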
It is also possible to filter for other suspicious activity in the log files, such as scanning for scripts or executables, etc. (GET/POST requests that, on a well-configured web server, "hopefully" just produce an error ;-) )
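Such scanner traffic is much easier to ban aggressively than ordinary GET/POST volume, because legitimate visitors rarely trigger it. A sketch of a filter along those lines, matching requests for script-like files that the server answered with a 404 (the filter and extension list are assumptions, not part of the original answer; Fail2Ban also ships a stock `apache-noscript` filter for a similar purpose):

```ini
# Hypothetical filter: requests for script/executable files that returned 404,
# a common signature of vulnerability scanners.
[Definition]
failregex = ^<HOST> -.*"(GET|POST) [^"]*\.(php|asp|aspx|cgi|exe)[^"]* HTTP[^"]*" 404
ignoreregex =
```

A jail using this filter can safely use a much lower `maxretry` (e.g. 2–3) than a general request-rate jail.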
I have used this external fail-2-ban-link on my own systems.