We have 160 calculator pages on valuefy.app. I spent most of February rewriting every single page title and meta description by hand. I spent March wiring up a daily Claude Code routine that now improves one calculator page every night. There's a full blog-writing automation for new posts every Monday and Tuesday, a weekday blog audit routine, an AI-powered internal linking queue on Supabase, and a Puppeteer prerender step that runs on every build.
Since January I've been in Google Search Console constantly — submitting 10 URLs a day for indexing, glancing at the weekly totals.
Last week I stopped glancing. I pulled the 90-day data into a spreadsheet for the first time and looked at it properly.
45 clicks. Not per day. Total. Over 90 days.
This is what I found when I actually dug in.
The setup
Here's what was running at audit time:
- 160 calculator pages (ROI, CAC, LTV, burn rate, runway, EBITDA — the usual SaaS/finance suspects)
- ~70 of them submitted to Google for indexing via the URL Inspection tool (batched 10/day because of the quota)
- A daily Claude Code routine that picks one calculator and improves its body content — verified benchmarks, worked examples, internal links. Started about three weeks ago, 11 commits deep so far.
- Two blog-writing routines, one on Mondays and one on Tuesdays; each picks a keyword, researches it against competitors, and publishes a new React page
- A blog audit routine every weekday that verifies facts, checks links, fixes structure
- A Supabase edge function applying AI-powered internal links to a queue
- React Helmet for meta tags, full Puppeteer-based prerendering of every route at build time
- February's mass rewrites: every single tool page title and meta description rewritten by hand for specificity
On paper this is a lot of SEO work. More than most agencies ship for paying clients. I'd assumed that with this much surface area being polished, clicks had to be climbing steadily.
They weren't climbing. They'd never been climbing.
What I expected vs what the data showed
I expected positions slowly improving on target keywords. Maybe a handful of calculators already in the top 10. Some impressions turning into clicks as metadata settled.
What I got:
| Page | Impressions (90d) | Position |
|---|---|---|
| /tools/roi-calculator/ | 1,483 | 58 |
| /tools/markup-calculator/ | 629 | 56 |
| /tools/pe-ratio-calculator/ | 524 | 29 |
| /tools/cpc-calculator/ | 468 | 48 |
| /tools/hourly-rate-calculator/ | 418 | 59 |
| /tools/churn-rate-calculator/ | 378 | 71 |
| /tools/google-ads-calculator/ | 359 | 74 |
| /tools/conversion-rate-calculator/ | 321 | 53 |
| /tools/burn-rate-calculator/ | 280 | 73 |
| /tools/ctr-calculator/ | 265 | 59 |
Every page with meaningful impressions is sitting somewhere around page 5-8 of Google. Not page 2. Not "almost there." Basement-level.
Finding #1: Google has already judged these pages
Here's the uncomfortable part. Google knows the ROI calculator exists. It matches the page to the query "roi calculator" correctly. It has crawled it, rendered it, looked at it, and decided: rank 58 material.
All the content automation in the world doesn't change that assessment on its own. Over the last three weeks my daily routine has been adding worked examples and verified benchmarks to page after page. Google has seen all of it and the positions haven't moved. Because the problem isn't the content. The problem is authority signals — backlinks, brand mentions, referring traffic from trusted sources. The kind of thing you can't write a prompt for.
I'd somehow convinced myself that "enough good content" would eventually overpower the authority gap. The data says no. When you're a new domain with almost no backlinks, targeting head terms dominated by long-established financial-media sites, no amount of on-page polish moves you from page 6 to page 1.
Finding #2: The quick win that wasn't
This one I'm almost embarrassed about.
I was pulling page-level metrics and saw /tools/loan-payment-calculator/ sitting at an average position of 9.8 with 299 impressions and zero clicks. Position ten. Already on page one. A CTR problem, not a rankings problem. A 30-minute fix.
I got excited. I started drafting the title rewrite in my head before I'd even opened the page.
Then I opened the page. The title was fine:
Loan Payment Calculator: Monthly & Total Interest | Valuefy
58 characters, descriptive, specific, and it mentions "monthly" and "total interest", the two things a searcher actually wants to know. The meta description was also fine — "Free loan payment calculator: enter amount, rate, and term to see monthly payment, total interest paid, and payoff date. Auto, mortgage, and personal loans." Benefit-driven, 155 characters, specific use cases.
So I pulled the query-level breakdown for that page, expecting to see one big query at position 9.8. Here's what GSC actually returned:
| Query | Impressions | Position |
|---|---|---|
| 15000 loan 4.5% 36 months monthly payment | 2 | 2.5 |
| add on interest monthly payment | 8 | 62.9 |
| calculate loan amount from payment and interest rate | 1 | 98 |
| payment calculator with interest paid | 2 | 93 |
Thirteen impressions. Across four ultra-specific queries. I was missing two hundred and eighty-six impressions.
The rest of them were GSC's privacy-threshold graveyard — queries with one or two impressions that don't show up in the query breakdown at all. Dozens of tiny long-tail variations, each getting crumbs of traffic from different positions scattered between 2 and 100.
The "9.8 average position" was the impression-weighted mean of that graveyard. It wasn't a real top-10 ranking at all. It was one query at position 2.5 and one at position 98 averaging out to look like hope.
There was no title fix. There was nothing to fix. The page just has a weirdly diffuse distribution of tiny long-tail queries with no dominant intent target, and GSC's aggregate math lied to me about what that meant.
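Because the number is an impression-weighted mean, a bimodal spread of tiny queries can average out to something that looks like page one. A minimal sketch — the two buckets below are made up to illustrate the shape, not the real hidden queries:

```javascript
// Impression-weighted average position, the way GSC aggregates it.
function avgPosition(rows) {
  const impressions = rows.reduce((sum, r) => sum + r.impressions, 0);
  const weighted = rows.reduce((sum, r) => sum + r.position * r.impressions, 0);
  return weighted / impressions;
}

// Illustrative bimodal distribution: lots of long-tail crumbs near the top,
// plus a tail of deep, irrelevant matches. Not the real hidden query data.
const rows = [
  { impressions: 250, position: 3 },
  { impressions: 49, position: 44.5 },
];

console.log(avgPosition(rows).toFixed(1)); // "9.8" -- looks like page one
```

Neither bucket is anywhere near position 9.8, but the aggregate is.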
That's the thing about SEO metrics. Every aggregate hides a distribution, and every distribution hides a reason.
Finding #3: I don't rank #1 for my own brand name
The query "valuefy" gets 325 impressions per month. Position 9. Five clicks.
Let that sit. My site, literally called valuefy, ranks ninth for its own brand name. Even searches for the misspelling "vacuefy" (269 impressions) put us at position 6 with zero clicks.
Part of this is the authority problem again — a new domain without external mentions isn't obviously "the valuefy" Google should trust. But part of it is something I can actually fix. I curled the homepage to see what Google sees:
<title>Free Business Calculators & AI Generators | Valuefy</title>
The brand name is at the end of the title, after the pipe. From Google's perspective, the primary tokens are "Free," "Business," "Calculators," "AI," and "Generators." "Valuefy" is tacked on like an afterthought. I'm telling Google "this page is about free business calculators that happens to be made by somebody called Valuefy," and then I'm surprised the homepage doesn't rank #1 for "valuefy."
Lead with the brand on the homepage. Organization schema with sameAs links to the social profiles that do exist. That's a 20-minute fix, not a three-month fix.
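Roughly what the fixed homepage head could look like. This is a sketch: the sameAs URLs are placeholders for whatever profiles actually exist, not real accounts.

```html
<title>Valuefy | Free Business Calculators & AI Generators</title>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Valuefy",
  "url": "https://valuefy.app/",
  "sameAs": [
    "https://x.com/valuefy",
    "https://www.linkedin.com/company/valuefy"
  ]
}
</script>
```

Brand token first, so "valuefy" is the primary signal instead of an afterthought after the pipe.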
Finding #4: I'm invisible in the markets I actually care about
Top countries by clicks, 90 days:
| Country | Clicks | Impressions |
|---|---|---|
| India | 12 | 1,177 |
| France | 4 | 448 |
| Malaysia | 3 | 214 |
| Egypt | 2 | 94 |
| Nigeria | 2 | 125 |
| Pakistan | 2 | 189 |
| USA | — | not in top 20 |
I'm running an English-language SaaS tool site and Google is showing it primarily to users in India, France, Malaysia, Nigeria, and Pakistan. The USA — the single biggest English SEO market on earth — doesn't crack the top 20 countries by impression volume.
I don't have a clean answer yet. Possible culprits: missing hreflang, hosting region in Europe adding latency to US crawls, no US-specific content hooks, or simply DR so low we don't surface at all in the world's most competitive SERP. I'll rule these out one at a time.
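If the hreflang hypothesis turns out to be the culprit, the minimal version for a single-language site is a self-referencing annotation plus an x-default on every route. A sketch, using one example route:

```html
<link rel="alternate" hreflang="en" href="https://valuefy.app/tools/roi-calculator/" />
<link rel="alternate" hreflang="x-default" href="https://valuefy.app/tools/roi-calculator/" />
```

That only helps if Google is treating the site as region-targeted; the hosting-region and authority hypotheses need separate checks.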
Finding #5: I found one real technical bug on the way
While curling pages to investigate Finding #3, I noticed something odd in the HTML. Every page has two og:title meta tags:
<meta property="og:title" content="Valuefy - Free Business Calculators & Financial Tools">
...
<meta property="og:title" content="Loan Payment Calculator: Monthly & Total Interest | Valuefy" data-rh="true">
The first one is static, baked into index.html. The second is injected by React Helmet during prerender. Same story for og:description. Two competing tags per page, on every route.
Which one wins depends on which crawler you ask. Most modern crawlers take the last one, but some take the first. For social sharing, this means some platforms are showing a generic site-wide title for every individual page — the exact opposite of what the rest of the SEO work has been trying to achieve.
Fix: strip the static og: tags out of index.html and let React Helmet be the single source. Same for the static <meta name="description"> tag that's also fighting with the hydrated one. About 15 minutes of work, touches one file.
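One way to keep this class of bug from shipping again is a tiny post-build check that counts og: tags per rendered page. A sketch — the sample HTML is inlined here; a real build script would read each prerendered file:

```javascript
// Count occurrences of a given og:* property in rendered HTML.
function countOgTags(html, prop) {
  const re = new RegExp(`property="og:${prop}"`, "g");
  return (html.match(re) ?? []).length;
}

// Inlined sample of the broken output: one static tag from index.html,
// one injected by React Helmet during prerender.
const rendered = `
<meta property="og:title" content="Valuefy - Free Business Calculators & Financial Tools">
<meta property="og:title" content="Loan Payment Calculator: Monthly & Total Interest | Valuefy" data-rh="true">
`;

console.log(countOgTags(rendered, "title")); // 2 -- should be exactly 1 after the fix
```

Fail the build when the count isn't exactly 1 and the regression can't come back silently.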
This wasn't the bug I was looking for. But it's the one I found, and it's the one I can fix tonight.
What I'm going to do about it
None of these is "more content."
- Fix the duplicate og: tags — tonight. Strip the static fallbacks from index.html. One-file change.
- Fix brand SERP — lead the homepage title with "Valuefy," add proper Organization schema with sameAs links. Twenty minutes.
- Investigate the geo problem — verify hreflang, check hosting region, look at what signals Google is using to decide our audience.
- Build authority slowly, honestly — no routine, no script, no prompt fixes DR from zero. Outreach, guest posts, directory submissions, off-site work that compounds over 6-12 months. Including this post, which is the first brick.
- Pause new tool pages and new blog posts — not forever, but until I know which of the existing 160 can actually be rescued. I'd rather run one audit routine that tells me which pages improved week over week than ship more content into a pile Google has already rated.
The uncomfortable lesson
Before the audit, if you'd asked me what my SEO strategy was, I'd have said: "Build valuable tools, write good content, automate the tedium, let time do the work." All of those things have been happening, every day, on autopilot and by hand, for three months.
None of it mattered because none of it changed the one variable that was actually constraining the outcome.
Worse, the data tricked me twice inside one audit. Once at the top level — 45 clicks was lower than I'd been willing to admit. And once at the exception level — Finding #2 looked like hope and turned out to be an averaged fiction. Aggregates hide distributions. Distributions hide reasons. The only way out is to pull the actual query-level data and look at it with both eyes open.
Automation is a force multiplier on a working strategy. It is not a strategy by itself. Running faster in the wrong direction just gets you lost faster.
I don't know yet if the things I'm about to try will work. I'll check back in a month with whatever the data says. If the answer is "still 45 clicks," I'll say that too.
I'm Sampsa, CEO at AImiten. We build AI tooling for companies — and sometimes I run experiments on my own side projects to stress-test the ideas. valuefy.app is one of those experiments. If you're curious what we do at AImiten, have a look.