Some search engine spiders, such as Bingbot, crawl too rapidly and do not seem to obey the robots.txt Crawl-delay
directive. This triggers the DoS defence mechanism in mod_evasive,
which responds with HTTP 403 Forbidden errors. But serving 403 errors to bots for perfectly valid pages is not ideal and may affect page rank. Is there a way to configure mod_evasive
to return HTTP status 429 instead of 403?
429 Too Many Requests
The 429 status code indicates that the user has sent too many requests in a given amount of time ("rate limiting").
The response representations SHOULD include details explaining the
condition, and MAY include a Retry-After header indicating how long to wait before making a new request.
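As an illustration, a rate-limited response following that guidance might look like this on the wire (the values are made up; Retry-After gives the suggested wait in seconds):

```
HTTP/1.1 429 Too Many Requests
Retry-After: 120
Content-Type: text/html

<html><body>Too many requests. Please retry after 120 seconds.</body></html>
```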
Best Answer
Basically, you change HTTP_FORBIDDEN to HTTP_TOO_MANY_REQUESTS in the mod_evasive20.c source and recompile and reinstall the module.
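A minimal sketch of that edit-and-rebuild, assuming an Apache 2.4+ build environment (where httpd.h defines HTTP_TOO_MANY_REQUESTS) with apxs on the PATH, run from the directory containing the mod_evasive source:

```shell
# Replace every occurrence of the 403 constant with the 429 one,
# keeping a backup of the original source file.
sed -i.bak 's/HTTP_FORBIDDEN/HTTP_TOO_MANY_REQUESTS/g' mod_evasive20.c

# Recompile, install, and activate the module with Apache's apxs
# tool, then restart the server to load the new build.
apxs -i -a -c mod_evasive20.c
apachectl restart
```

Note that Apache versions before 2.4 do not define HTTP_TOO_MANY_REQUESTS; there you would need to define the constant yourself (and older servers may not know the 429 reason phrase), so this approach is best suited to 2.4 and later.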