A remote host keeps trying to brute-force a WordPress site of mine via POST requests to xmlrpc.php. I've blocked its IP address in nginx.conf, but I noticed I keep getting these errors in the log file, and since the requests are constant, this amounts to a very, very slow DoS (the log files keep growing):
[error] 30912#0: *4600 access forbidden by rule, client:
I've searched here for ways to change the logging, but it looks like it's all or nothing on 403 errors, and that wouldn't help me (I'd stop seeing any legitimate 403s).
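For reference, nginx can suppress logging per location rather than globally, so a deny rule scoped to the attacked URL need not silence other 403s. A minimal sketch, assuming the attack targets xmlrpc.php (the "access forbidden by rule" lines go to the error log, so `access_log off` alone isn't enough):

```nginx
# Deny xmlrpc.php outright and drop both access- and error-level
# log entries for it. Assumes xmlrpc.php is the attacked URL.
location = /xmlrpc.php {
    deny all;
    access_log off;
    error_log /dev/null crit;  # only log critical errors for this location
}
```

This keeps 403 logging intact everywhere else on the site.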
To combat this, I've tried blocking the IP at the firewall (using UFW, a wrapper around iptables) and added an entry that shows up in the status as:
Anywhere DENY XXX.XXX.X.XXX (redacted)
However, even after enabling the firewall rules and checking that they are active, when I tail the log file I still see the same 403 error entries being written over and over.
Any thoughts on how to make this attacker go away without filling up the log file? It's a virtual Ubuntu 14.04 LTS server.
Edit: Would using limit_req make any difference here at all?
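(For context, a limit_req setup would look roughly like the sketch below, assuming the POSTs target xmlrpc.php. Note that it only throttles the requests; each rejected request is still logged, so it doesn't solve the log-growth problem by itself.)

```nginx
# In the http {} context: a shared zone keyed by client IP,
# allowing one request per second.
limit_req_zone $binary_remote_addr zone=xmlrpc:10m rate=1r/s;

server {
    location = /xmlrpc.php {
        # Excess requests are rejected (503 by default).
        limit_req zone=xmlrpc;
    }
}
```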
Edit two: Here's the UFW status. The attacker is brute-forcing POSTs to the site. The nginx rule blocks him successfully, but shouldn't the firewall prevent him from reaching nginx in the first place?
To Action From
-- ------ ----
22 ALLOW Anywhere
22/tcp ALLOW Anywhere
2222/tcp ALLOW Anywhere
80/tcp ALLOW Anywhere
21/tcp ALLOW Anywhere
Anywhere DENY XXX.XXX.X.XXX
22 (v6) ALLOW Anywhere (v6)
22/tcp (v6) ALLOW Anywhere (v6)
2222/tcp (v6) ALLOW Anywhere (v6)
80/tcp (v6) ALLOW Anywhere (v6)
21/tcp (v6) ALLOW Anywhere (v6)
Best Answer
Move your DENY rule above your ALLOW rule for port 80: UFW rules are evaluated in order, and the first match wins, so right now the port-80 ALLOW matches the attacker's traffic before the DENY is ever reached.
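One way to reorder (the address below is a documentation placeholder, `203.0.113.50`; substitute the real redacted IP):

```shell
# Delete the existing deny rule, then re-insert it at position 1
# so it is evaluated before the port-80 ALLOW rule.
sudo ufw delete deny from 203.0.113.50
sudo ufw insert 1 deny from 203.0.113.50
sudo ufw status numbered   # verify the deny rule is now [1]
```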
SSH probably shouldn't be open to the whole world either, but be careful not to lock yourself out if you have a dynamic IP.
Consider a CDN such as CloudFlare, which offers protection against many of these threats (web application firewall, rate limiting, etc.) with both free and paid plans.