<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Greg B</title>
    <description>The latest articles on DEV Community by Greg B (@gb26).</description>
    <link>https://dev.to/gb26</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3848148%2F298ece4f-e572-4f42-9524-bcbf779e4d09.png</url>
      <title>DEV Community: Greg B</title>
      <link>https://dev.to/gb26</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/gb26"/>
    <language>en</language>
    <item>
      <title>Logged My Traffic for 24 Hrs: One IP Hit Me 13,000 Times</title>
      <dc:creator>Greg B</dc:creator>
      <pubDate>Fri, 10 Apr 2026 18:43:28 +0000</pubDate>
      <link>https://dev.to/gb26/logged-my-traffic-for-24-hrs-one-ip-hit-me-13000-27ob</link>
      <guid>https://dev.to/gb26/logged-my-traffic-for-24-hrs-one-ip-hit-me-13000-27ob</guid>
      <description>&lt;p&gt;I’ve been working on a small side project to analyze traffic logs and surface bot activity in a way that’s actually useful.&lt;/p&gt;

&lt;p&gt;Not a full security platform. Just something simple that answers:&lt;/p&gt;

&lt;p&gt;who is hitting my site&lt;br&gt;
how often&lt;br&gt;
and what they’re trying to access&lt;/p&gt;

&lt;p&gt;I pointed it at one of my endpoints and let it run.&lt;/p&gt;

&lt;p&gt;Here’s what showed up in a short window:&lt;/p&gt;

&lt;p&gt;20.48.232.178   → 13,777 hits&lt;br&gt;
20.104.201.101  → 7,916 hits&lt;br&gt;
20.151.229.110  → 7,427 hits&lt;br&gt;
20.220.232.240  → 7,096 hits&lt;br&gt;
20.220.213.131  → 5,188 hits&lt;br&gt;
162.158.62.142  → 2,979 hits&lt;br&gt;
104.23.253.15   → 2,926 hits&lt;br&gt;
162.158.154.75  → 2,616 hits&lt;/p&gt;

&lt;p&gt;Most of these were hitting common scan paths like:&lt;/p&gt;

&lt;p&gt;/scanner.php&lt;br&gt;
/wp-admin&lt;br&gt;
random probe endpoints&lt;/p&gt;

&lt;p&gt;Nothing fancy. Just constant noise.&lt;/p&gt;

&lt;p&gt;What stood out&lt;/p&gt;

&lt;p&gt;It wasn’t one attacker. It was volume.&lt;/p&gt;

&lt;p&gt;Thousands of requests from single IPs. Repeated scans. Same patterns over and over.&lt;/p&gt;

&lt;p&gt;If you’re not looking at raw logs, you don’t really see it.&lt;/p&gt;

&lt;p&gt;And most small projects don’t have anything in place to filter or summarize this.&lt;/p&gt;

&lt;p&gt;What I built&lt;/p&gt;

&lt;p&gt;I put together a lightweight tool that:&lt;/p&gt;

&lt;p&gt;logs traffic&lt;br&gt;
aggregates hits per IP&lt;br&gt;
tags common scan patterns&lt;br&gt;
surfaces “top offenders” quickly&lt;/p&gt;

&lt;p&gt;No heavy infra. Just a small service and a database.&lt;/p&gt;
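&lt;p&gt;A minimal sketch of that aggregation step, assuming an nginx-style combined access log. The log format, scan patterns, and function names here are my illustrative assumptions, not the tool's actual rules:&lt;/p&gt;

```python
import re
from collections import Counter

# Hypothetical scan-path patterns to tag; the real tool's rules may differ.
SCAN_PATTERNS = [r"/wp-", r"\.php$", r"/\.env", r"/admin"]

def summarize(log_lines):
    """Aggregate hits per IP and count how many look like scan paths."""
    hits = Counter()
    scans = Counter()
    for line in log_lines:
        # Assumes combined-log style: client IP first, request line in quotes.
        m = re.match(r'(\S+) \S+ \S+ \[[^\]]*\] "(?:GET|POST|HEAD) (\S+)', line)
        if not m:
            continue
        ip, path = m.group(1), m.group(2)
        hits[ip] += 1
        if any(re.search(p, path) for p in SCAN_PATTERNS):
            scans[ip] += 1
    # "Top offenders": highest request counts first.
    return hits.most_common(), scans
```

&lt;p&gt;Running this over a day of logs is enough to surface the kind of per-IP totals shown above.&lt;/p&gt;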

&lt;p&gt;The goal wasn’t to block everything automatically.&lt;/p&gt;

&lt;p&gt;It was to make the data obvious enough that you can decide what to do:&lt;/p&gt;

&lt;p&gt;block at firewall&lt;br&gt;
add nginx rules&lt;br&gt;
ignore known noise&lt;br&gt;
or just understand what’s hitting your app&lt;/p&gt;

&lt;p&gt;Why this matters (especially early)&lt;/p&gt;

&lt;p&gt;If you’re running:&lt;/p&gt;

&lt;p&gt;a side project&lt;br&gt;
a VPS&lt;br&gt;
a self-hosted app&lt;/p&gt;

&lt;p&gt;you’re getting hit like this too.&lt;/p&gt;

&lt;p&gt;You just don’t see it yet.&lt;/p&gt;

&lt;p&gt;Free access (early users)&lt;/p&gt;

&lt;p&gt;I’m opening this up for testing.&lt;/p&gt;

&lt;p&gt;First 100 people can use it free for 3 months.&lt;br&gt;
No credit card.&lt;/p&gt;

&lt;p&gt;Code: DTO2026&lt;br&gt;
&lt;a href="https://blockabot.com" rel="noopener noreferrer"&gt;https://blockabot.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that it’s around $5/month for full access (logs, threat feed, etc.).&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>cybersecurity</category>
      <category>programming</category>
    </item>
    <item>
      <title>This Is What’s Really Hitting Your Website (Hint: Not People)</title>
      <dc:creator>Greg B</dc:creator>
      <pubDate>Thu, 02 Apr 2026 23:38:01 +0000</pubDate>
      <link>https://dev.to/gb26/this-is-whats-really-hitting-your-website-hint-not-people-31p6</link>
      <guid>https://dev.to/gb26/this-is-whats-really-hitting-your-website-hint-not-people-31p6</guid>
      <description>&lt;p&gt;I wanted to understand how much of our traffic was actually human, so I pulled and analyzed 48 hours of raw request logs.&lt;/p&gt;

&lt;p&gt;No filters, no analytics layer, just direct log data.&lt;/p&gt;

&lt;p&gt;Time Frame&lt;/p&gt;

&lt;p&gt;Start: 2026-03-31 10:00 UTC&lt;br&gt;
End: 2026-04-02 10:00 UTC&lt;/p&gt;

&lt;p&gt;All requests within that window were grouped by path patterns and behavior.&lt;/p&gt;

&lt;p&gt;Traffic Breakdown&lt;/p&gt;

&lt;p&gt;Requests were classified into four categories:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;WordPress probing (paths containing wp)&lt;/li&gt;
&lt;li&gt;XMLRPC access attempts&lt;/li&gt;
&lt;li&gt;PHP endpoint probing&lt;/li&gt;
&lt;li&gt;General scanning and enumeration&lt;/li&gt;
&lt;/ul&gt;
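
&lt;p&gt;The four buckets above can be approximated with simple path rules. A sketch, where the matching rules are my assumptions (the post only names the categories, not the exact classifier):&lt;/p&gt;

```python
import re

def classify(path):
    """Bucket a request path into the four categories used above.
    Order matters: /xmlrpc.php would otherwise match the wp and php rules.
    In practice you would first separate known-good application routes."""
    if "xmlrpc" in path:
        return "xmlrpc"
    if "wp" in path:
        return "wordpress"
    if re.search(r"\.php($|\?)", path) or "/.env" in path:
        return "php_probe"
    return "scanning"
```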

&lt;p&gt;Results:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;WordPress probes: 34 percent&lt;/li&gt;
&lt;li&gt;XMLRPC attempts: 18 percent&lt;/li&gt;
&lt;li&gt;PHP probes: 27 percent&lt;/li&gt;
&lt;li&gt;Other scanning: 21 percent&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Roughly 79 percent of requests were not normal user activity.&lt;/p&gt;

&lt;p&gt;Sample of Active IPs&lt;/p&gt;

&lt;p&gt;Below is a subset of IPs with the highest request volume or repeated attack patterns during the window:&lt;/p&gt;

&lt;p&gt;185.220.101.45   WordPress login brute force patterns&lt;br&gt;
45.146.165.12    XMLRPC pingback attempts&lt;br&gt;
103.248.70.33    PHP endpoint scanning&lt;br&gt;
91.134.23.198    Multi-path probing (/admin, /login, /.env)&lt;br&gt;
176.65.148.92    High-frequency requests consistent with botnet behavior&lt;br&gt;
198.54.117.210   Credential stuffing attempts&lt;br&gt;
5.188.62.76      Known scanner signature patterns&lt;br&gt;
194.147.142.88   Repeated wp-login hits&lt;br&gt;
212.83.150.120   PHPMyAdmin probing&lt;br&gt;
139.59.37.12     Generic crawler with attack signatures&lt;/p&gt;

&lt;p&gt;Many of these generated hundreds to thousands of requests over the 48-hour period.&lt;/p&gt;

&lt;p&gt;Observed Attack Patterns&lt;/p&gt;

&lt;p&gt;WordPress Probing&lt;/p&gt;

&lt;p&gt;Even on non-WordPress systems, these paths were repeatedly hit:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;/wp-login.php&lt;/li&gt;
&lt;li&gt;/wp-admin/&lt;/li&gt;
&lt;li&gt;/wp-content/plugins/&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is automated scanning, not targeted behavior.&lt;/p&gt;

&lt;p&gt;XMLRPC Access&lt;/p&gt;

&lt;p&gt;Frequent hits on:&lt;/p&gt;

&lt;p&gt;/xmlrpc.php&lt;/p&gt;

&lt;p&gt;Common uses include pingback abuse and brute force via API endpoints.&lt;/p&gt;

&lt;p&gt;PHP File Probing&lt;/p&gt;

&lt;p&gt;Requests targeting common configuration and entry points:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;/index.php&lt;/li&gt;
&lt;li&gt;/config.php&lt;/li&gt;
&lt;li&gt;/.env&lt;/li&gt;
&lt;li&gt;/db.php&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are looking for exposed configs or weak deployments.&lt;/p&gt;

&lt;p&gt;Credential Stuffing&lt;/p&gt;

&lt;p&gt;Repeated requests to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;/login&lt;/li&gt;
&lt;li&gt;/admin&lt;/li&gt;
&lt;li&gt;/api/auth&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Often with high frequency and rotating IPs.&lt;/p&gt;
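
&lt;p&gt;One simple way to surface this pattern is rate-based flagging on the auth paths. A sketch; the per-minute limit is illustrative, not a recommendation:&lt;/p&gt;

```python
from collections import defaultdict

AUTH_PATHS = {"/login", "/admin", "/api/auth"}

def flag_stuffing(events, per_minute_limit=30):
    """events: (epoch_seconds, ip, path) tuples.
    Flags IPs whose auth-path request rate exceeds the illustrative limit
    within any one-minute bucket."""
    buckets = defaultdict(int)
    for ts, ip, path in events:
        if path in AUTH_PATHS:
            buckets[(ip, int(ts // 60))] += 1
    return sorted({ip for (ip, _), n in buckets.items() if n > per_minute_limit})
```

&lt;p&gt;Rotating IPs defeat a purely per-IP limit, which is part of why shared data across sites (below) helps.&lt;/p&gt;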

&lt;p&gt;What This Means&lt;/p&gt;

&lt;p&gt;If you rely on standard analytics:&lt;/p&gt;

&lt;p&gt;Traffic volume may be inflated&lt;br&gt;
Engagement metrics may be misleading&lt;br&gt;
Infrastructure may be handling unnecessary load&lt;/p&gt;

&lt;p&gt;More importantly, this traffic is constant. It is not tied to visibility or popularity. Any exposed service will receive it.&lt;/p&gt;

&lt;p&gt;Internal Response&lt;/p&gt;

&lt;p&gt;After seeing this across multiple systems, we started aggregating this data instead of treating each site in isolation.&lt;/p&gt;

&lt;p&gt;The approach:&lt;/p&gt;

&lt;p&gt;Track IPs across multiple deployments&lt;br&gt;
Classify behavior based on request patterns&lt;br&gt;
Identify repeat offenders&lt;br&gt;
Apply blocking rules based on shared observations&lt;/p&gt;

&lt;p&gt;This evolved into a simple shared threat dataset.&lt;/p&gt;
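
&lt;p&gt;The merge step can be sketched like this, assuming each deployment reports a per-IP behavior tag (the scoring rule is a simplification of whatever the real system does):&lt;/p&gt;

```python
from collections import defaultdict

def merge_observations(site_reports):
    """site_reports: mapping of site name to {ip: behavior_tag} observations.
    An IP reported by more independent sites earns a higher score."""
    seen = defaultdict(set)
    tags = defaultdict(set)
    for site, report in site_reports.items():
        for ip, tag in report.items():
            seen[ip].add(site)
            tags[ip].add(tag)
    # Score is simply the number of independent sites reporting the IP.
    return {ip: {"score": len(sites), "tags": sorted(tags[ip])}
            for ip, sites in seen.items()}
```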

&lt;p&gt;Threat Network Concept&lt;/p&gt;

&lt;p&gt;Instead of reacting per site:&lt;/p&gt;

&lt;p&gt;An IP flagged on one system is known to others&lt;br&gt;
Patterns such as WordPress probing or XMLRPC abuse are categorized&lt;br&gt;
Repeated behavior increases confidence in classification&lt;br&gt;
Blocking decisions become faster and more consistent&lt;/p&gt;

&lt;p&gt;This reduces duplicate analysis and speeds up mitigation.&lt;/p&gt;

&lt;p&gt;Outcome&lt;/p&gt;

&lt;p&gt;After applying filtering based on this data:&lt;/p&gt;

&lt;p&gt;Cleaner traffic metrics&lt;br&gt;
Reduced unnecessary requests&lt;br&gt;
Lower noise in logs&lt;br&gt;
Better visibility into actual users&lt;/p&gt;

&lt;p&gt;Closing&lt;/p&gt;

&lt;p&gt;The main takeaway from this dataset is straightforward.&lt;/p&gt;

&lt;p&gt;A large portion of inbound traffic to public web services is automated and non-user driven.&lt;/p&gt;

&lt;p&gt;This data is from a limited 48-hour window across a small set of systems. Patterns may vary, but the presence of automated scanning is consistent.&lt;/p&gt;

&lt;p&gt;If you are interested in testing this type of visibility or contributing additional data points, I am running a small beta around this approach.&lt;/p&gt;

&lt;p&gt;&lt;a href="http://www.blockabot.com" rel="noopener noreferrer"&gt;www.blockabot.com&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>security</category>
      <category>cybersecurity</category>
    </item>
    <item>
      <title>I checked my logs this morning… the traffic wasn’t what I expected</title>
      <dc:creator>Greg B</dc:creator>
      <pubDate>Wed, 01 Apr 2026 11:47:03 +0000</pubDate>
      <link>https://dev.to/gb26/i-checked-my-logs-this-morning-the-traffic-wasnt-what-i-expected-4oen</link>
      <guid>https://dev.to/gb26/i-checked-my-logs-this-morning-the-traffic-wasnt-what-i-expected-4oen</guid>
      <description>&lt;p&gt;I knew bots were hitting one of my smaller sites, but I didn’t expect this level of activity.&lt;/p&gt;

&lt;p&gt;I ended up building a small tool called BlockABot (blockabot.com) to help track what was actually going on.&lt;/p&gt;

&lt;p&gt;Checked the logs this morning and over the last 24 hours:&lt;/p&gt;

&lt;p&gt;217 requests&lt;br&gt;
422 unique IPs&lt;br&gt;
286 of them were brand new&lt;/p&gt;

&lt;p&gt;What they were doing&lt;/p&gt;

&lt;p&gt;Most of it wasn’t random.&lt;/p&gt;

&lt;p&gt;WordPress-related paths → 95 hits&lt;br&gt;
General scanning → 100+ hits&lt;/p&gt;

&lt;p&gt;Smaller amounts of:&lt;br&gt;
.env probes&lt;br&gt;
admin panel scans&lt;br&gt;
PHP endpoint checks&lt;/p&gt;

&lt;p&gt;What stood out&lt;/p&gt;

&lt;p&gt;There were clear bursts of activity:&lt;/p&gt;

&lt;p&gt;Around 11PM → ~49 unique IPs in one hour&lt;br&gt;
Around 11AM → over 100 unique IPs in one hour&lt;/p&gt;

&lt;p&gt;That doesn’t look like normal traffic.&lt;br&gt;
It looks more like coordinated scanning.&lt;/p&gt;

&lt;p&gt;Why I built this&lt;/p&gt;

&lt;p&gt;I originally ran into this trying to figure out weird traffic patterns on a small site.&lt;/p&gt;

&lt;p&gt;So I put together something simple to:&lt;/p&gt;

&lt;p&gt;log requests&lt;br&gt;
detect common scan paths&lt;br&gt;
track repeat IPs&lt;br&gt;
build a shared list of known bad actors&lt;/p&gt;

&lt;p&gt;Basically trying to answer:&lt;/p&gt;

&lt;p&gt;is this traffic actually human?&lt;/p&gt;

&lt;p&gt;What I’ve noticed so far&lt;/p&gt;

&lt;p&gt;A lot of traffic is automated, even on small sites&lt;br&gt;
Bots tend to hit in waves, not randomly&lt;br&gt;
The same IPs show up repeatedly&lt;br&gt;
WordPress endpoints get hit constantly (even if you’re not using WordPress)&lt;/p&gt;

&lt;p&gt;One interesting stat&lt;/p&gt;

&lt;p&gt;Out of the 422 IPs:&lt;/p&gt;

&lt;p&gt;286 were new&lt;br&gt;
136 had been seen before&lt;/p&gt;

&lt;p&gt;If you think this might help you, or if you have ideas to add to it, let me know.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>devops</category>
      <category>security</category>
      <category>automation</category>
    </item>
    <item>
      <title>Built This to Stop Bots Across My Sites, Turned It Into a SaaS</title>
      <dc:creator>Greg B</dc:creator>
      <pubDate>Sat, 28 Mar 2026 18:44:17 +0000</pubDate>
      <link>https://dev.to/gb26/built-this-to-stop-bots-across-my-sites-turned-it-into-a-saas-1l1d</link>
      <guid>https://dev.to/gb26/built-this-to-stop-bots-across-my-sites-turned-it-into-a-saas-1l1d</guid>
      <description>&lt;p&gt;Most of the traffic hitting my sites lately hasn’t been human.&lt;/p&gt;

&lt;p&gt;I was dealing with bots and scanners across multiple servers, and managing blocks on each one separately was getting messy fast.&lt;/p&gt;

&lt;p&gt;So I built something simple for myself to track and block bad traffic from one place, and use that data across all my sites.&lt;/p&gt;

&lt;p&gt;It builds a threat list of bad IPs once they meet certain criteria to be labeled as a bot. The list is shared, so any site running the same base JS code can use it. It ended up working better than I expected, so I turned it into a small SaaS called BlockABot.&lt;/p&gt;
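
&lt;p&gt;The "certain criteria" step can be sketched as a simple threshold rule. The limits here are illustrative placeholders, not BlockABot's actual criteria:&lt;/p&gt;

```python
def label_as_bot(stats, hit_limit=1000, scan_limit=10):
    """stats: per-IP dict with total hits and scan-path hits.
    Labels the IP a bot when it trips any illustrative threshold."""
    return stats["hits"] > hit_limit or stats["scan_hits"] > scan_limit
```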

&lt;p&gt;Still early, but it’s already cutting down a lot of junk traffic.&lt;/p&gt;

&lt;p&gt;If you deal with bots, scraping, or odd traffic patterns, I’d be curious what you think.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blockabot.com" rel="noopener noreferrer"&gt;https://blockabot.com&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>cybersecurity</category>
      <category>devops</category>
      <category>saas</category>
    </item>
  </channel>
</rss>
