Thoughts on bots examining common ports?
Hello.
I am frequently seeing the likes of the following in my CSF alert emails, which also block the probing IP:
The IP address 34.73.xxx.xx (US/United States/South Carolina/North Charleston/xx.xxx.73.34.bc.googleusercontent.com/[AS396982 GOOGLE-CLOUD-PLATFORM]) was found attacking mod_security on xxxxxxxx.com 3 times in the last 3600 seconds.
When I examine the logs, they are only requesting the /robots.txt file and "/".
Is the general consensus that these types of queries are probes to see whether they would be blocked by more aggressive tools? I ask because having mod_security block requests for robots.txt seems like a bad idea, no?
I can block AS ranges, but that's like taking a sledgehammer to an ant, and it could deny legitimate users as well, methinks.
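For reference, the sledgehammer option would be an entry in CSF's deny file. This is a sketch only: the CIDR below is illustrative, not a verified Google Cloud range, and the comment convention keeps CSF's deny-file rotation from purging the line.

```
# /etc/csf/csf.deny — illustrative entry only; 34.73.0.0/16 is an example
# CIDR, NOT a verified Google Cloud range. CSF skips lines whose comment
# contains "do not delete" when rotating out old deny entries.
34.73.0.0/16 # do not delete - example GCP scanner range
```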
-
There's no need to be more aggressive. You will always have bots poking and prodding your servers for open ports, weak passwords, known vulnerabilities/exploits, etc. Being too aggressive will just block legitimate users, and that is obviously something you don't want.
Thanks for that. I guess the question then is how to disable the mod_security rules that fire on /robots.txt and "/", which are the biggest, albeit inconsequential, offenders. Those rules seem to be hard-baked into mod_security...
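If the goal is to stop mod_security from firing on just those two URIs, one approach (a sketch, assuming ModSecurity 2.x under Apache; the rule ID is a placeholder) is to disable the specific offending rule only for those locations once its ID is known:

```apache
# Sketch only: 12345 is a placeholder — substitute the real rule ID(s)
# reported in modsec_audit.log. Assumes ModSecurity 2.x under Apache.
# The regex matches both "/" and "/robots.txt".
<LocationMatch "^/(robots\.txt)?$">
    SecRuleRemoveById 12345
</LocationMatch>
```

This narrows the exemption to those two paths rather than disabling mod_security site-wide.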
Do you see those in /etc/apache2/logs/modsec_audit.log? Which rules are triggered? There should not be any rules "hard-baked" into ModSecurity...
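To answer that question, a quick way to tally which rule IDs are firing — a sketch assuming the default serial audit-log format, where each message line embeds an [id "NNNNNN"] marker — is:

```shell
# Count ModSecurity rule IDs in the audit log, most frequent first.
# Assumes the default serial audit-log format with [id "NNNNNN"] markers.
grep -o 'id "[0-9]*"' /etc/apache2/logs/modsec_audit.log | sort | uniq -c | sort -rn
```

The IDs at the top of the list are the ones to investigate (or selectively disable) first.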
Although getting constant block notifications can be annoying, your system recognizing and blocking bots and bad spiders/crawlers is something you want to see. Additionally, they must be doing something more than just requesting robots.txt to trigger multiple mod_security violations.