====== Blocking SemrushBot ======

SemrushBot is an annoying web crawler that has proven to completely disregard the robots policies, as well as to hammer webservers hard by recursively following all the links on a website without delay, outright ignoring any repeated refusals.

===== IP Layer =====

On the IP layer, drop any HTTP packet whose payload contains the string ''SemrushBot'':
<code bash>
iptables -t mangle -A INPUT -p tcp --dport 80 -m string --string 'SemrushBot' --algo bm -j DROP
</code>

Which is an awful solution, admittedly, but it gets rid of this pest without even hitting the application layer!

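The test that the rule performs can be sketched in Python — a minimal illustration of the substring match that ''-m string'' applies to each packet payload (the request bytes below are a made-up example):

```python
# Sketch of the check behind "-m string --string 'SemrushBot' --algo bm":
# drop any inbound payload containing the byte sequence "SemrushBot"
# (the real rule runs a Boyer-Moore search in-kernel).
def should_drop(payload: bytes, pattern: bytes = b"SemrushBot") -> bool:
    return pattern in payload

request = (b"GET / HTTP/1.1\r\n"
           b"Host: example.com\r\n"
           b"User-Agent: Mozilla/5.0 (compatible; SemrushBot/3~bl)\r\n\r\n")
print(should_drop(request))                    # True
print(should_drop(b"GET / HTTP/1.1\r\n\r\n"))  # False
```

Note that this matches the string anywhere in the payload, not just in the ''User-Agent'' header — which is exactly why the rule is so blunt.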
===== Apache2 =====

If you are okay with your frontend being hammered by this total garbage, then the ''rewrite'' module can be used to block it at the application layer.

Enable the ''rewrite'' module:
<code bash>
a2enmod rewrite
</code>
and include in virtual hosts:
<code apache2>
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} SemrushBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} googlebot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} sosospider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} BaiduSpider [NC]
# Allow access to robots.txt and the forbidden message
# (at least the 403 page, or else it will loop);
# /forbidden.html is an example, use your own error page
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/forbidden\.html$
RewriteRule ^.* - [F,L]
</IfModule>
</code>

which is a bad solution because Apache2 still has to accept and process every single request before rejecting it.

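The decision those rules implement can be sketched as follows — a hypothetical Python rendition, where ''/forbidden.html'' stands in for whatever error page the 403 handler serves:

```python
import re

# User agents to reject, mirroring the RewriteCond lines above.
BAD_AGENTS = re.compile(r"googlebot|sosospider|BaiduSpider|SemrushBot",
                        re.IGNORECASE)
# Paths that must stay reachable, or the forbidden response loops forever.
EXEMPT = re.compile(r"^/robots\.txt$|^/forbidden\.html$")

def status(uri: str, user_agent: str) -> int:
    """Return the HTTP status the rewrite rules would produce."""
    if BAD_AGENTS.search(user_agent) and not EXEMPT.search(uri):
        return 403  # RewriteRule ^.* - [F,L]
    return 200

print(status("/index.html", "Sosospider/2.0"))  # 403
print(status("/robots.txt", "Sosospider/2.0"))  # 200
print(status("/index.html", "Mozilla/5.0"))     # 200
```

The exemption for ''robots.txt'' also keeps well-behaved crawlers able to read the policy they are supposed to obey.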
===== Varnish =====

Perhaps blocking with Varnish may be a good compromise between having your Apache2 hammered and blocking the string ''SemrushBot'' on the IP layer:
<code varnish>
sub vcl_recv {
    # Block user agents.
    if (req.http.User-Agent ~ "SemrushBot") {
        return (synth(403, "Forbidden"));
    }

    # ...

}
</code>

An even better method would be to use fail2ban to block ''SemrushBot'' on the IP layer based on the Varnish logs.

===== Varnish and Fail2Ban =====

For Varnish, copy ''/etc/fail2ban/filter.d/apache-badbots.conf'' to ''/etc/fail2ban/filter.d/varnish-badbots.conf'' and add ''SemrushBot'' to the ''badbotscustom'' list:
<code>
badbotscustom = EmailCollector|WebEMailExtrac|TrackBack/1\.02|SemrushBot
</code>

then correct the ''failregex'' so that the bot names are matched anywhere within the logged user agent:
<code>
failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*".*(?:%(badbots)s|%(badbotscustom)s).*"$
</code>
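
To sanity-check the expression, it can be exercised against a sample ''varnishncsa'' log line — a Python sketch with fail2ban's ''<HOST>'' tag and the ''%(...)s'' interpolations expanded by hand (the log line itself is fabricated for illustration):

```python
import re

# badbotscustom trimmed to a few entries; SemrushBot is the one we care about.
badbots = r"EmailCollector|WebEMailExtrac|SemrushBot"
# <HOST> expanded to a named group, %(badbots)s interpolated manually.
failregex = (r'^(?P<host>\S+) -.*"(GET|POST|HEAD).*HTTP.*".*'
             r'(?:' + badbots + r').*"$')

# A fabricated NCSA-combined log line as varnishncsa would emit it.
line = ('46.229.168.68 - - [29/Apr/2019:22:44:01 +0200] '
        '"GET /some/page HTTP/1.1" 200 512 "-" '
        '"Mozilla/5.0 (compatible; SemrushBot/3~bl; '
        '+http://www.semrush.com/bot.html)"')

match = re.search(failregex, line)
print(match.group("host"))  # 46.229.168.68
```

The trailing ''.*"$'' is what lets a plain ''SemrushBot'' entry match inside the full user-agent string rather than having to equal it exactly.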

and finally add the following to the jail configuration:
<code>
[varnish-badbots]
enabled  = true
port     = http,https
filter   = varnish-badbots
# adjust logpath to wherever varnishncsa writes its log
logpath  = /var/log/varnish/varnishncsa.log
maxretry = 1
</code>

and restart ''fail2ban''.

To check that the bots are being banned, tail ''/var/log/fail2ban.log'' and look for a line similar to:
<code>
fail2ban.jail[18168]: INFO Jail 'varnish-badbots' started
</code>
indicating that the ''varnish-badbots'' jail has been started.

Hopefully followed by lines similar to:
<code>
NOTICE [varnish-badbots] Ban 46.229.168.68
</code>

====== Intercept SSL 2.0 / SSL 3.0 Using SSLsniff ======

Redirect POP3S (TCP port 995) to a custom port for SSLsniff:
<code bash>
iptables -t nat -A PREROUTING -p tcp --destination-port 995 -j REDIRECT --to-ports 4995
# the certificate path below is an example; point -c at your CA certificate
sslsniff -a -c /path/to/ca-cert.pem -s 4995 -w /tmp/sslsniff.log
</code>

Example session:
<code>
1385227016 INFO sslsniff : Added OCSP URL: ocsp.ipsca.com
1385227016 INFO sslsniff : Certificate Ready: *
sslsniff 0.8 by Moxie Marlinspike running...
1385227031 DEBUG sslsniff : Read from Server (mail.net.hu) :
+OK POP3 PROXY server ready <

1385227032 DEBUG sslsniff : Read from Client (mail.net.hu) :
USER harry

1385227032 DEBUG sslsniff : Read from Server (mail.net.hu) :
+OK Password required

1385227032 DEBUG sslsniff : Read from Client (mail.net.hu) :
PASS secretpassword
</code>