Posted by Aimiten
# 30 Days of Daily Automation. The Bot Improved the Wrong Pages.

The daily automation has been running for 30 consecutive days. One calculator page per day, every day since April 3rd — verified benchmarks, worked examples, internal links added, body content rewritten. 28 different pages touched.

This week I pulled the GSC data to see what actually happened.

8 of those pages don't appear in the 200-page snapshot of the last 28 days. Not low impressions. Absent entirely.

The automation ran every day, made measurable improvements, and spent most of its effort on pages Google isn't showing to anyone.

## The setup

The routine picks a calculator from a queue, fetches the current page, asks Claude to add verified benchmarks and worked examples, commits the result. Runs at 9 PM UTC. Has not missed a day since April 3.

This week's batch: NOI (Apr 26), DSCR (Apr 27), Cash-on-Cash Return (Apr 28), GRM (Apr 29), Occupancy Rate (Apr 30), WACC (May 1), CPC (May 2). All real estate and finance metrics. Each one is genuinely a better page now — the content is real, the benchmarks are verified.

The question I didn't ask until this week: which pages should have been first?

## What I expected vs what the data showed

I expected the earliest-improved pages — improved in the first week of April, now four weeks old — to show some position movement. Google crawl cycles at this authority level run roughly 2–4 weeks. Four weeks should be enough to see something.

Here's what the 28-day GSC data shows for the early-April batch:

| Page | Improved | Impressions (28d) | Clicks | Position |
|---|---|---|---|---|
| arr-calculator | Apr 10 | 46 | 0 | 8.0 |
| cac-calculator | Apr 8 | 115 | 0 | 22.5 |
| burn-rate-calculator | Apr 6 | 28 | 0 | 6.8 |
| ltv-calculator | Apr 9 | 26 | 0 | 6.5 |
| mrr-calculator | Apr 11 | not in top 200 | | |
| runway-calculator | Apr 7 | not in top 200 | | |

Zero clicks across all of them. The pages that appear have positions between 6.5 and 22.5 — but at impression volumes this low, average position is again the arithmetic mean of a ghost distribution. The query-level breakdown for ARR returns zero visible queries. For LTV, the same.

No movement. Four weeks of crawl time, and the positions are where they were before the improvements.

## Finding #1: The site's highest-impression tool page hasn't been touched

The hourly rate calculator has 880 impressions in the last 28 days. That's the highest impression count of any tool page on the site. It sits at position 57.4 — page 6 of Google — generating a steady drip of impressions and converting none of them into clicks.

The automation has never touched it.

Git log confirms: no commits to hourly-rate-calculator.tsx in at least 90 days. I also curled the live page:

```html
<title>Hourly Rate Calculator for Freelancers | Valuefy</title>
<meta name="description" content="Free hourly rate calculator for freelancers and
  consultants. Convert salary to hourly, add overhead and taxes, and set a rate
  that covers every cost.">
```

Reasonable title. Reasonable description. The page also carries the og:title duplicate bug — two competing og:title tags, one static from the shell HTML, one injected by React Helmet. Every page on the site has this; I've now verified it on the homepage, the WACC calculator, and the hourly rate page. It's been there since Week 1.
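A duplicate like this is cheap to catch automatically before the next queue run. A minimal checker with the standard library (the HTML sample is a hypothetical reduction of the bug, not the site's actual markup):

```python
from html.parser import HTMLParser

class OgTitleCounter(HTMLParser):
    """Counts <meta property="og:title"> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("property") == "og:title":
            self.count += 1

def og_title_count(html: str) -> int:
    parser = OgTitleCounter()
    parser.feed(html)
    return parser.count

# Hypothetical reduction of the bug: one static tag from the shell HTML,
# one injected client-side by React Helmet.
page = """<head>
<meta property="og:title" content="Valuefy">
<meta property="og:title" content="WACC Calculator | Valuefy" data-rh="true">
</head>"""

assert og_title_count(page) == 2  # anything above 1 should fail the build
```

Wired into CI against the rendered pages, this turns "I keep saying this" into a failing check.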

The automation queue did not sort by impression count. It ran through a list. The list didn't cross-reference GSC.

## Finding #2: The five pages that should have been first

When I sorted tool pages by 28-day impressions and filtered for pages the automation hasn't touched, this is what came out:

| Page | Impressions (28d) | Clicks | Position |
|---|---|---|---|
| hourly-rate-calculator | 880 | 0 | 57.4 |
| pe-ratio-calculator | 546 | 4 | 22.0 |
| purchase-order-generator | 395 | 3 | 39.6 |
| roas-calculator | 357 | 2 | 62.0 |
| markup-calculator | 345 | 2 | 43.0 |

These five pages together have 2,523 impressions and 11 clicks in 28 days. The four with clicks have CTR between 0.5% and 0.8%. The hourly rate page has 880 impressions and zero clicks — meaning someone searched, saw the result, decided not to click. A title or meta description problem, not a ranking problem. All of them sit between position 22 and 62 — basement-level, but visible. If any one of them converts at even 2% instead of sub-1%, that's a measurable weekly click increase from pages that already exist in Google's view.

The automation improved pages that averaged fewer than 30 impressions per month. These five averaged 505.
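The prioritization the queue should have had is a one-liner once the GSC export is in hand. A sketch using the numbers from the table above (fetching from the GSC API and filtering out already-touched pages are omitted):

```python
# 28-day GSC rows for untouched tool pages (numbers from the table above)
pages = [
    {"page": "hourly-rate-calculator",   "impressions": 880, "clicks": 0},
    {"page": "pe-ratio-calculator",      "impressions": 546, "clicks": 4},
    {"page": "purchase-order-generator", "impressions": 395, "clicks": 3},
    {"page": "roas-calculator",          "impressions": 357, "clicks": 2},
    {"page": "markup-calculator",        "impressions": 345, "clicks": 2},
]

# The fix: sort the queue by impressions, descending
queue = sorted(pages, key=lambda p: p["impressions"], reverse=True)

total_impressions = sum(p["impressions"] for p in pages)  # 2,523
current_clicks = sum(p["clicks"] for p in pages)          # 11
projected_at_2pct = round(total_impressions * 0.02)       # ~50 clicks/28d
```

The 2% CTR is the hypothetical from above, not a forecast; the point is that 50 projected clicks versus 11 actual is a measurable gap, which sub-30-impression pages can never show.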

## Finding #3: The ROI calculator — the Week 1 flagship — has nearly disappeared

In the Week 1 audit, /tools/roi-calculator/ led the table: 1,483 impressions over 90 days at position 58. The most visible tool page on the site at the time.

The automation improved it on April 20. Thirteen days ago.

It no longer appears in the top 200 pages in the 28-day data. When I ran a query-level breakdown specifically for that URL, I got 10 visible queries with roughly 20 total impressions. At the same run rate as Week 1, I'd expect about 490 impressions in a 28-day window. It's now at 20 plus whatever sits below GSC's privacy threshold.

I don't know the cause. The April 22 editorial redesign — a full visual overhaul cascading to 143+ pages — happened two days after the ROI improvement. CSS variable remaps and font changes don't affect rankings directly, but a full site re-render can trigger a recrawl cycle that temporarily disrupts signals. That's a more plausible cause than the content improvement itself.

What I know: the page dropped. I'll check it again in two weeks.

## Finding #4: The WACC calculator has the most impressions — and was just improved

The WACC calculator has 560 impressions in 28 days, position 56, one click. It's the second-highest impression tool page after the hourly rate calculator.

The automation improved it on May 1 — two days ago. Too early to see any effect in GSC. I curled the page:

```html
<title>WACC Calculator: Cost of Capital Formula | Valuefy</title>
<meta name="description" content="Calculate your weighted average cost of capital
  from debt, equity, and tax rate. Used in DCF models and capital structure decisions.">
```

This is one of the rare cases where the automation hit a high-impression page by coincidence — WACC was next in the queue, and it happened to be second-highest in impressions. I'll use this as a natural experiment: if the content improvement moves the position in the next two weeks, it's the strongest evidence so far that on-page work does anything at this authority level. If it doesn't, that's also data.

The query breakdown confirms the impression distribution is real here — 11 visible queries, 508 visible impressions, with "wacc calculator" alone driving 188 impressions at position 59.9. This is not another averaged ghost. It's a real query cluster with real search volume, sitting on page 6 with a 0.18% CTR.

## What I'm going to do about it

  1. Reorder the queue by impressions — sort descending, cross-referenced against GSC, starting with hourly rate (880), then P/E (546), then purchase-order generator (395). The queue already works; it just needs the right priority.
  2. Use the WACC calculator as a control — check its position in two weeks. If it moves, on-page optimization is doing something at this domain's authority level. If not, the Week 1 conclusion still holds.
  3. Watch the ROI calculator — if it recovers from the post-redesign dip, the April 22 redesign was the variable. If it keeps dropping, something else is happening.
  4. Fix the og:title duplicate — I keep saying this. This time I'm treating it as a blocker for any further queue runs, not a nice-to-have.
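The two-week checks in steps 2 and 3 reduce to a before/after comparison with a noise band. A sketch (the 3-position threshold is my own assumption, not a GSC concept; at low impression volumes average position jumps around on its own):

```python
def verdict(pos_before: float, pos_after: float, noise: float = 3.0) -> str:
    """Classify a position change. GSC average position is noisy at low
    impression volumes, so any delta within `noise` counts as no movement."""
    delta = pos_before - pos_after  # positive = moved up (closer to rank 1)
    if delta > noise:
        return "improved"
    if delta < -noise:
        return "dropped"
    return "no detectable movement"

# WACC baseline from this week's data: position 56
print(verdict(56.0, 56.4))  # no detectable movement
```

Pre-committing to the threshold now means the two-week report can't quietly reinterpret a 1-position wiggle as a win.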

## The uncomfortable lesson

Automation removes friction from a decision you've already made. If the decision was wrong, the automation executes it more consistently than you'd have done it manually.

The queue existed. It worked through a list. Nobody asked whether the list matched the data.

30 days later, 28 pages are better, 8 of them have fewer monthly impressions than a single Reddit comment gets views, and the site's highest-impression tool page still looks like it did in February.

I said in Week 1: "I'd rather run one audit routine that tells me which pages improved week over week than ship more content into a pile Google has already rated." I then ran 30 more improvements without that audit. These things are not actually hard to do — they're hard to prioritize when the automation is already running and producing tidy green commits.

I'll reorder the queue this week. In two weeks I'll report whether the WACC experiment moved. If the answer is "no detectable movement at position 56 with 560 impressions," I'll say that too.


I'm running these experiments on valuefy.app and writing the findings as I go. If you're building programmatic SEO or trying to figure out whether automated content work is actually doing anything, I'd be glad to compare notes — drop a comment.

I also run AImiten, where we build AI tooling for companies. This side project is where ideas get stress-tested before they touch client work.
