Part of the series: WordPress Pre-Launch Technical Checks
One of the most frustrating issues after launching a WordPress site is discovering that search engines are still being told not to index it.
The site is live, the client has approved everything, the domain is public, and yet the indexing signals are still pointing to staging mode.
This usually happens because of leftover meta robots settings from development.
What the meta robots tag does
The meta robots tag is an instruction placed in the page's HTML head that tells search engines how they should treat that page.
A common configuration used during development looks like this:
<meta name="robots" content="noindex, nofollow">
This prevents search engines from indexing the page or following its links.
During staging, that is often exactly what you want.
The problem appears when those instructions remain active after launch.
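As a quick illustration, the directives in a page's meta robots tag can be pulled out with a few lines of Python's standard library. This is a minimal sketch (the function name `robots_directives` is just an example), not a full crawler:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of <meta name="robots" ...> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            # Split "noindex, nofollow" into individual directives
            self.directives |= {d.strip().lower() for d in content.split(",") if d.strip()}

def robots_directives(html: str) -> set:
    """Return the set of robots directives found in the page's HTML."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives
```

Feeding it the staging snippet above, `robots_directives('<meta name="robots" content="noindex, nofollow">')` returns `{'noindex', 'nofollow'}`, which is exactly the signal you do not want to see after launch.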
How staging environments create this issue
Most WordPress projects go through at least one staging phase.
Developers block indexing intentionally so unfinished pages do not appear in search engines.
But once the site moves to production, several things can go wrong:
- the WordPress “Discourage search engines” option remains enabled
- SEO plugins keep a staging configuration
- server rules inject a robots directive
- page templates still output a noindex tag
The result is a perfectly functional site that quietly asks search engines to ignore it.
Common meta robots mistakes before launch
1. The WordPress indexing setting was never switched back
WordPress has a built-in option that discourages search engines from indexing the site.
If this option stays active after launch, it may generate a noindex directive across the entire site.
2. SEO plugins keep staging settings
Many SEO plugins allow you to block indexing while a site is in development.
After migration or domain changes, those settings sometimes remain active without anyone noticing.
3. Template-level robots directives
Some themes or page templates include robots directives conditionally.
If those conditions are not updated for production, pages can unintentionally remain non-indexable.
4. Conflicting signals
Sometimes the problem is not a single noindex tag but a mixture of signals:
- WordPress allows indexing
- but the page outputs noindex
- or the sitemap includes URLs that the page itself blocks
Those contradictions can confuse both developers and search engines.
A quick meta robots launch check
Before delivering a WordPress site, it is worth verifying a few simple things:
- Does the homepage allow indexing?
- Do key pages output a normal robots directive?
- Is the WordPress “Discourage search engines” option turned off on the production site?
- Do SEO plugins reflect the final launch configuration?
This check takes a couple of minutes but can save days of confusion after launch.
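The whole checklist can be automated in a short script. This is a hedged sketch: the paths in `check_site` are placeholders for the site's real key pages, and the regex deliberately only handles the common `name` before `content` attribute order. It inspects both the meta tag and the X-Robots-Tag response header, since server rules can inject a directive that never appears in the HTML:

```python
import re
from urllib.request import urlopen

def blocking_signals(html: str, x_robots_header: str = "") -> list:
    """Reasons (if any) this page asks search engines not to index it."""
    reasons = []
    if "noindex" in (x_robots_header or "").lower():
        reasons.append("X-Robots-Tag response header contains noindex")
    # Crude check for <meta name="robots" content="...noindex...">
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        reasons.append("meta robots tag contains noindex")
    return reasons

def check_site(base_url: str, paths=("/", "/about/", "/contact/")):
    """Fetch each key page and report whether it blocks indexing."""
    for path in paths:
        with urlopen(base_url.rstrip("/") + path) as resp:
            html = resp.read().decode("utf-8", errors="replace")
            header = resp.headers.get("X-Robots-Tag", "")
        reasons = blocking_signals(html, header)
        status = "BLOCKED: " + "; ".join(reasons) if reasons else "indexable"
        print(f"{path}: {status}")
```

Running `check_site("https://example.com")` against the freshly launched site should report every key page as indexable; any BLOCKED line points at a leftover staging signal.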
Why this belongs in a repeatable launch process
Meta robots issues are rarely dramatic during development.
The site works, pages load correctly, and the problem only appears once the project is already public.
That is why it makes sense to include indexing signals in a standard pre-launch checklist.
Where PreFlight fits in
PreFlight focuses on the technical review stage before a WordPress site is delivered or published.
If you want to run a quick check to detect indexing configuration issues before launch, you can start here:
https://preflightstandard.com/
Final thought
Launching a site while it still tells search engines “do not index me” is surprisingly common.
A quick review of meta robots settings before going live helps ensure the site is actually ready to be discovered.