Most of the traffic hitting my sites lately hasn’t been human.
I was dealing with bots and scanners across multiple servers, and managing blocks on each one separately was getting messy fast.
So I built something simple for myself to track and block bad traffic from one place, and use that data across all my sites.
It works by building a shared threat list of bad IPs: any visitor that meets certain criteria gets labeled as a bot and added to the list, and every site running the same base JS code can then use that list to block them. It ended up working better than I expected, so I turned it into a small SaaS called BlockABot.
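To make the idea concrete, here's a rough sketch of what consuming a shared threat list could look like on the server side. This is purely illustrative and not BlockABot's actual API: the endpoint URL, the JSON response shape, and the Express middleware approach are all my assumptions (the real product centers on a client-side JS snippet).

```js
// Illustrative sketch only: the endpoint, response shape, and refresh
// interval below are hypothetical, not BlockABot's real API.
// Assumes Node 18+ (global fetch) and express installed.
const express = require("express");

const app = express();
const THREAT_LIST_URL = "https://example.com/threat-list.json"; // hypothetical
let threatList = new Set();

// Periodically pull the shared list so every server blocks the same
// IPs without maintaining rules on each box separately.
async function refreshThreatList() {
  try {
    const res = await fetch(THREAT_LIST_URL);
    const ips = await res.json(); // assumed shape: ["1.2.3.4", ...]
    threatList = new Set(ips);
  } catch (err) {
    console.error("threat list refresh failed:", err.message);
  }
}
refreshThreatList();
setInterval(refreshThreatList, 60_000);

// Middleware: reject requests from IPs already flagged as bots.
app.use((req, res, next) => {
  if (threatList.has(req.ip)) return res.status(403).end("blocked");
  next();
});

app.get("/", (_req, res) => res.send("hello"));
app.listen(3000);
```

The appeal of this shape is that the blocking decision is a cheap in-memory lookup, while the expensive part (deciding which IPs are bots) happens once, centrally, and benefits every site at the same time.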
Still early, but it’s already cutting down a lot of junk traffic.
If you deal with bots, scraping, or odd traffic patterns, I’d be curious what you think.