Before Google Search Console is useful — for indexing requests, coverage errors, or query data — you have to prove you own the domain. For three new custom domains on Astro sites, I did this in one afternoon. The fastest path was DNS TXT verification through Cloudflare, and there were a few non-obvious details worth writing down.
This is a short notes post, not a full tutorial. Google and Cloudflare both have step-by-step documentation. What follows are the things that tripped me up and what the console actually shows on day one.
DNS TXT vs HTML file for Astro SSG
Search Console offers four verification methods. For Astro sites deployed to Cloudflare Pages:
- HTML file: Download a file, add it to public/, commit, push, wait for deploy.
- HTML meta tag: Modify the <head> in your layout component, commit, push, wait for deploy.
- DNS TXT record: Add a record in Cloudflare DNS. No deploy needed.
- Google Analytics or Search Console tag: Requires GA4 already wired up.
The DNS TXT method is clearly faster for static sites. You don't trigger a build, you don't wait for Cloudflare Pages CI, and you don't have to modify any source file. The TXT record propagates in minutes, and Search Console confirms it within a couple of minutes of Cloudflare showing the record as "Active."
One thing that confused me initially: DNS TXT verification proves ownership of the domain root, not of a specific path. A root domain verification covers all subdomains automatically, including www. That's broader coverage than the HTML file approach, which only verifies the exact URL where the file was placed.
Adding the record in Cloudflare
In Cloudflare's DNS dashboard: dash.cloudflare.com → [your domain] → DNS → Records → Add record. Record type is TXT, name is @ (the root), content is the string Search Console provides — something like google-site-verification=<long-string>.
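If you'd rather script this than click through the dashboard, the same record can be created through the Cloudflare API. This is a minimal sketch, assuming an API token with DNS edit permission and the zone ID in environment variables; those names, and the verification string, are placeholders rather than anything from the original setup.

```ts
// Sketch: create the verification TXT record via the Cloudflare API instead of the dashboard.
// CLOUDFLARE_API_TOKEN and CLOUDFLARE_ZONE_ID are assumed placeholders.
const token = process.env.CLOUDFLARE_API_TOKEN!;
const zoneId = process.env.CLOUDFLARE_ZONE_ID!;

const res = await fetch(
  `https://api.cloudflare.com/client/v4/zones/${zoneId}/dns_records`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      type: "TXT",
      name: "@", // zone apex, same as "@" in the dashboard
      content: "google-site-verification=<long-string>",
      ttl: 1, // 1 means "Auto" in the Cloudflare API
    }),
  },
);

const json = await res.json();
if (!json.success) throw new Error(JSON.stringify(json.errors));
console.log("TXT record created:", json.result.id);
```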
Cloudflare defaults to TTL "Auto," which is 300 seconds. The record shows as "Active" in the Cloudflare dashboard almost immediately after saving. I verified each record was live with dig TXT yourdomain.com before clicking Verify in Search Console. Not strictly required, but it prevents wasting a verification attempt on propagation lag.
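If you prefer a scriptable version of that dig check, Node's built-in resolver does the same thing. A minimal sketch, assuming Node 18+ and with yourdomain.com as a placeholder:

```ts
// Sketch: the same pre-flight check as `dig TXT yourdomain.com`, using Node's resolver.
import { resolveTxt } from "node:dns/promises";

const records = await resolveTxt("yourdomain.com");
const flat = records.map((chunks) => chunks.join("")); // TXT values can be split into chunks

const found = flat.some((r) => r.startsWith("google-site-verification="));
console.log(
  found
    ? "TXT record is live, safe to click Verify"
    : "Not visible yet, wait for propagation",
);
```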
This is a per-domain operation. Each of the three sites needed its own Search Console property and its own TXT record. There's no bulk flow; you do it once per domain and move on.
Submitting sitemaps for @astrojs/sitemap output
After verification, submit the sitemap. For Astro sites using @astrojs/sitemap, the output at small site sizes is a /sitemap-0.xml file plus a /sitemap-index.xml that references it.
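For reference, the setup that produces that output is just the integration plus a site URL, which @astrojs/sitemap requires. A minimal sketch, with the site value as a placeholder:

```ts
// astro.config.ts -- minimal sketch of the config that emits
// /sitemap-index.xml and /sitemap-0.xml at build time.
import { defineConfig } from "astro/config";
import sitemap from "@astrojs/sitemap";

export default defineConfig({
  site: "https://yourdomain.com", // placeholder; sitemap generation needs this set
  integrations: [sitemap()],
});
```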
Submit the index file, not the shard: https://yourdomain.com/sitemap-index.xml. Search Console follows the index to discover the shards. Submitting a shard URL directly works, but you'd need to manually add each new shard if the site grows past the single-file limit (50,000 URLs). The index submission handles that automatically.
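The submission can also be scripted through the Search Console API's sitemaps endpoint instead of the UI. This sketch assumes you already have an OAuth2 access token with the webmasters scope (service account or OAuth flow, not covered here) and that the property is a DNS-verified domain property, which the API addresses with an sc-domain: prefix; the token variable and domain are placeholders.

```ts
// Sketch: submit the sitemap index via the Search Console API rather than the UI.
const accessToken = process.env.GSC_ACCESS_TOKEN!; // placeholder

const siteUrl = encodeURIComponent("sc-domain:yourdomain.com");
const feedPath = encodeURIComponent("https://yourdomain.com/sitemap-index.xml");

const res = await fetch(
  `https://www.googleapis.com/webmasters/v3/sites/${siteUrl}/sitemaps/${feedPath}`,
  { method: "PUT", headers: { Authorization: `Bearer ${accessToken}` } },
);

console.log(res.ok ? "Sitemap submitted" : `Failed: ${res.status}`);
```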
If Search Console shows "Couldn't fetch" on the sitemap, check two things: that your custom domain deployment is actually live (not still serving the *.pages.dev URL), and that robots.txt isn't accidentally blocking the /sitemap* path.
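Both of those failure modes are easy to check from a short script before burning another fetch attempt. A rough sketch, with yourdomain.com as a placeholder:

```ts
// Sketch: sanity-check the two common "Couldn't fetch" causes.
const base = "https://yourdomain.com"; // placeholder

// 1. Is the sitemap reachable on the custom domain (not just *.pages.dev)?
const sitemapRes = await fetch(`${base}/sitemap-index.xml`);
console.log("sitemap-index.xml status:", sitemapRes.status);

// 2. Does robots.txt disallow the sitemap path?
const robots = await fetch(`${base}/robots.txt`).then((r) => (r.ok ? r.text() : ""));
const blocksSitemap = robots
  .split("\n")
  .some((line) => /^disallow:\s*\/sitemap/i.test(line.trim()));
console.log("robots.txt blocks /sitemap*:", blocksSitemap);
```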
What the coverage report shows at day one
Almost nothing, which is expected.
Zero indexed pages is normal for a brand-new domain. Googlebot hasn't had time to crawl anything meaningfully — the custom domain records are fresh, and the sitemap was just submitted. All URLs will show as "Discovered — currently not indexed," which means Google knows the pages exist from the sitemap but hasn't crawled them yet.
The URL Inspection tool works immediately. You can paste any URL, click "Request Indexing," and Google queues a crawl for that specific page at elevated priority. I did this for the home page and top two or three category pages on each site. It's not the same as indexing — it just moves those specific URLs to the front of the crawl queue. Actual indexed status takes days to weeks.
Worth noting: IndexNow sends signals to Bing's index, not Google's. For Google, the URL Inspection request-indexing button is the only manual acceleration available. Running both is not redundant; they're independent crawlers.
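For completeness, IndexNow itself is just a POST to a public endpoint. A minimal sketch, assuming the key file is already served at the site root; the key, domain, and URLs below are placeholders:

```ts
// Sketch: ping IndexNow (used by Bing and others, not Google) for a few URLs.
const payload = {
  host: "yourdomain.com",
  key: "your-indexnow-key", // placeholder
  keyLocation: "https://yourdomain.com/your-indexnow-key.txt",
  urlList: [
    "https://yourdomain.com/",
    "https://yourdomain.com/category/example",
  ],
};

const res = await fetch("https://api.indexnow.org/indexnow", {
  method: "POST",
  headers: { "Content-Type": "application/json; charset=utf-8" },
  body: JSON.stringify(payload),
});

// 200 or 202 means the submission was accepted.
console.log("IndexNow response:", res.status);
```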
The coverage report becomes genuinely useful after roughly a week, when errors start accumulating. The errors section is where you find 404s, redirect chain issues, canonical mismatches, and noindex tags that got applied by mistake. Those are all invisible until Googlebot actually visits. I'm planning to check each property weekly for the first month.
If you're running multiple sites, keep the Search Console properties separate rather than trying to look across them in one view. Error patterns for an AI tools directory and an indie games directory look completely different and would obscure each other if combined.
Part of an ongoing 6-month experiment running three AI-curated directory sites. The technical claims here are real; this article was AI-assisted.