Stop waiting weeks for Google to discover your pages. Learn how to use Google's Indexing API, URL Inspection API, and Search Console API to automate URL submission and track indexing status — with daily rate limits explained.
If your website has hundreds or thousands of pages — product listings, blog posts, tool pages, color palettes — you've probably noticed that Google can take weeks or even months to discover and index them all.
The good news: Google provides three powerful APIs that let you take control of your indexing. In this guide, we'll walk through how to set them up and use them to get your pages indexed in 24-48 hours instead of weeks.
The Three Google APIs You Need
Google offers three separate APIs for managing how your site appears in search:
- Google Search Console API — view search analytics (queries, clicks, impressions, positions) and manage sitemaps
- URL Inspection API — check if a specific URL is indexed, when it was last crawled, and why it might not be indexed
- Indexing API — tell Google to crawl a URL immediately. This is the fastest way to get new pages indexed
Each API has its own daily limits:
| API | Daily Limit | Best For |
| --- | --- | --- |
| Indexing API | 200 requests/day | Submitting new or updated URLs |
| URL Inspection API | 2,000 requests/day | Checking indexing status |
| Search Console API | 25,000 requests/day | Analytics and sitemap management |
Step 1: Set Up a Google Cloud Service Account
All three APIs use service account authentication — no user login required. Here's how to set it up:
- Go to Google Cloud Console and create a project (or use an existing one)
- Navigate to APIs & Services → Library and enable:
- Google Search Console API
- Web Search Indexing API
- Go to APIs & Services → Credentials → Create Credentials → Service Account
- Give it a name (e.g., "seo-indexing") and click Done
- Click on the service account → Keys → Add Key → Create new key → JSON
- Download the JSON file — this is your authentication credential
Important: Now go to Google Search Console → Settings → Users and permissions → Add user. Paste the client_email from your JSON file and set permission to Owner.
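To sanity-check the key you downloaded, you can read the `client_email` straight out of the JSON file before pasting it into Search Console. The snippet below is a small stdlib-only helper (the function name is my own); the `SCOPES` values are the OAuth scopes Google documents for these APIs, which you'd pass to your auth library (e.g. google-auth) when creating credentials.

```python
import json

# OAuth scopes documented by Google for the two API families:
SCOPES = {
    "indexing": "https://www.googleapis.com/auth/indexing",
    "search_console": "https://www.googleapis.com/auth/webmasters",
}

def client_email(key_path: str) -> str:
    """Return the client_email from a downloaded service-account key file.

    This is the address you add as a user (with Owner permission)
    in Search Console -> Settings -> Users and permissions.
    """
    with open(key_path) as f:
        return json.load(f)["client_email"]
```

Adding that email as **Owner** (not just Full user) matters: the Indexing API rejects submissions from accounts without owner-level access to the property.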
Step 2: Get Your Site URL Right (The Trailing Slash Trap)
This trips up almost everyone: the siteUrl parameter in API calls must exactly match the property registered in Search Console.
There are two property types:
- URL prefix: `https://example.com/` (note the trailing slash)
- Domain: `sc-domain:example.com` (no protocol, no slash)
If your property is a URL prefix registered as https://example.com/ but you send https://example.com (no slash), the API will reject your request with "You do not own this site."
Tip: Call the sites.list() endpoint first to see the exact format Google expects for your property.
Step 3: Submit URLs for Indexing
The Indexing API is the fastest way to tell Google about new or updated pages. When you submit a URL, Google typically crawls it within minutes to hours.
How it works:
- Send a POST request to the Indexing API with the URL and type (`URL_UPDATED` or `URL_DELETED`)
- Google queues the URL for crawling
- The page usually appears in search results within 24-48 hours
Daily limit: 200 URLs per day. This means you need to prioritize which URLs to submit first. A good strategy:
- New pages that have never been submitted → highest priority
- Pages that failed inspection (not indexed) → second priority
- Previously submitted pages that need re-crawling → lowest priority
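A sketch of both pieces: building the notification body you POST to the Indexing API's `urlNotifications:publish` endpoint, and a prioritization function implementing the three tiers above. The record fields (`status`, `submit_count`) are assumptions about how you track URLs, not anything Google prescribes.

```python
INDEXING_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, notification_type: str = "URL_UPDATED") -> dict:
    """Build the JSON body for an Indexing API publish request."""
    if notification_type not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError(f"invalid type: {notification_type}")
    return {"url": url, "type": notification_type}

def pick_batch(records: list[dict], limit: int = 200) -> list[dict]:
    """Select today's submission batch, respecting the 200/day quota.

    Priority: never-submitted pages first, then pages that failed
    inspection, then previously submitted pages needing a re-crawl.
    """
    rank = {"pending": 0, "not_indexed": 1, "submitted": 2}
    eligible = [r for r in records if r.get("status") in rank]
    eligible.sort(key=lambda r: (rank[r["status"]], r.get("submit_count", 0)))
    return eligible[:limit]
```

Each notification is one POST with an authorized service-account token; the 200/day quota is counted per project, so `pick_batch` caps the batch before you spend it.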
Step 4: Check Which URLs Are Actually Indexed
Submitting a URL doesn't guarantee it gets indexed. Google might decide the page is low quality, duplicate, or blocked by robots.txt. The URL Inspection API tells you exactly what happened.
For each URL, you get:
- Verdict — PASS (indexed), NEUTRAL (might index later), or FAIL (won't index)
- Coverage state — "Submitted and indexed", "Crawled - currently not indexed", "Discovered - currently not indexed", etc.
- Last crawl time — when Google last visited the page
- Robots.txt status — whether the page is blocked
- Page fetch status — whether Google could actually load the page
Daily limit: 2,000 inspections per day. Inspect your most important pages first — new submissions, pages with errors, or pages that haven't been inspected recently.
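The inspection call itself is a POST to the Search Console API's `urlInspection/index:inspect` method; the request needs both the URL to inspect and the property it belongs to. Here is a minimal sketch of the request body plus a helper (my own naming) that pulls the fields above out of the response:

```python
INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspect_request(inspection_url: str, site_url: str) -> dict:
    """Build the JSON body for a URL Inspection API call.

    site_url must be the exact property string from Search Console
    (trailing slash or sc-domain: format, as discussed in Step 2).
    """
    return {"inspectionUrl": inspection_url, "siteUrl": site_url}

def summarize(response: dict) -> dict:
    """Extract the key indexing signals from an inspection response."""
    idx = response["inspectionResult"]["indexStatusResult"]
    return {
        "verdict": idx.get("verdict"),          # PASS / NEUTRAL / FAIL
        "coverage": idx.get("coverageState"),   # e.g. "Submitted and indexed"
        "last_crawl": idx.get("lastCrawlTime"),
        "robots": idx.get("robotsTxtState"),
        "fetch": idx.get("pageFetchState"),
    }
```

Storing the summarized result per URL (see the next step) lets you decide whether a page needs resubmission or content fixes.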
Step 5: Track Everything in a Database
With thousands of URLs, you need a system to track what's been submitted, what's indexed, and what needs attention. For each URL, store:
- Status: pending → submitted → indexed (or not_indexed / error)
- Last submitted date and submit count
- Last inspection date and verdict
- Error messages if submission or inspection failed
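Those fields map naturally onto a single table. A minimal SQLite sketch (schema and column names are my own, not a standard):

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS urls (
    url            TEXT PRIMARY KEY,
    status         TEXT NOT NULL DEFAULT 'pending',
    last_submitted TEXT,
    submit_count   INTEGER NOT NULL DEFAULT 0,
    last_inspected TEXT,
    verdict        TEXT,
    error          TEXT
);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(SCHEMA)
    return conn

def mark_submitted(conn: sqlite3.Connection, url: str, when: str) -> None:
    """Record a successful Indexing API submission for a URL."""
    conn.execute(
        "UPDATE urls SET status='submitted', last_submitted=?, "
        "submit_count=submit_count+1 WHERE url=?",
        (when, url),
    )
```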
The workflow becomes:
- Sync — pull all URLs from your sitemap into the database
- Submit — send pending URLs to the Indexing API (up to 200/day)
- Inspect — check submitted URLs via URL Inspection API (up to 2,000/day)
- Repeat — run daily until all pages are indexed
You can automate this with a daily cron job. Start with the submit batch, wait a few hours, then run the inspect batch to see which ones made it.
Step 6: Monitor Search Performance
Once your pages are indexed, the Search Console API gives you visibility into how they're performing:
- Clicks — how many users clicked through to your site from Google
- Impressions — how often your pages appeared in search results
- Click-through rate (CTR) — clicks divided by impressions
- Average position — where your pages rank on average
You can break this down by query (what people searched for) or by page (which of your pages appeared). This data is gold for finding optimization opportunities.
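These metrics come from the Search Console API's `searchAnalytics.query` method, which takes a JSON body describing the date range, dimensions, and row limit. A sketch of the request builder (the function is my own wrapper around the documented fields):

```python
def build_analytics_query(
    start_date: str,
    end_date: str,
    dimensions: tuple[str, ...] = ("query",),
    row_limit: int = 1000,
) -> dict:
    """Build the JSON body for a searchAnalytics.query request.

    dimensions can include "query", "page", "country", "device", "date";
    each returned row carries clicks, impressions, ctr, and position.
    """
    return {
        "startDate": start_date,   # YYYY-MM-DD
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }
```

Querying by `("query", "page")` together tells you which page ranked for which search term, which is the raw material for the next step.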
Step 7: Find Quick-Win Keyword Opportunities
The most valuable insight from search analytics is finding high-impression, low-CTR keywords. These are queries where your page appears in search results but users aren't clicking through.
Common patterns:
- Position 8-20 (page 1-2 of results): your page is close to the top. Improving the title tag and meta description can boost CTR significantly
- High impressions, zero clicks: your page title might not match what users expect. Review the content
- Impressions for keywords you didn't target: Google is showing your page for related queries. Create dedicated content for those keywords
This analysis naturally leads to a content plan: which new articles to write, which existing pages to optimize, and which keywords to target.
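The quick-win filter is a simple pass over the analytics rows. The thresholds below are illustrative defaults, not Google guidance; `rows` uses the clicks/impressions/ctr/position shape that `searchAnalytics.query` returns:

```python
def quick_wins(
    rows: list[dict],
    min_impressions: int = 100,
    max_ctr: float = 0.02,
    min_pos: float = 8.0,
    max_pos: float = 20.0,
) -> list[dict]:
    """Return high-impression, low-CTR rows ranked just off the top spots.

    These are the queries where a better title/meta description
    is most likely to pay off.
    """
    return [
        r for r in rows
        if r["impressions"] >= min_impressions
        and r["ctr"] <= max_ctr
        and min_pos <= r["position"] <= max_pos
    ]
```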
Automation Tips & Best Practices
- Run submit batches daily — with 200 URLs/day, a site with 2,000 URLs takes about 10 days to fully submit
- Wait 48-72 hours before inspecting — give Google time to process submitted URLs before checking their status
- Re-submit failed URLs — pages marked "not indexed" can sometimes succeed on a second attempt, especially after content improvements
- Don't waste quota on indexed pages — skip URLs already marked as PASS
- Submit sitemaps too — the Search Console API's sitemaps endpoint lets you notify Google whenever your sitemap updates, which helps with discovery
- Monitor daily quota — always check remaining quota before batch operations to avoid hitting limits mid-process
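Google doesn't expose a "remaining quota" endpoint for these APIs, so the practical approach is to count your own calls per day. A hypothetical tracker:

```python
class QuotaTracker:
    """Client-side daily call counter (Google does not report remaining quota)."""

    def __init__(self, limits: dict[str, int]):
        # e.g. {"indexing": 200, "inspection": 2000}
        self.limits = dict(limits)
        self.used = {api: 0 for api in limits}

    def try_consume(self, api: str, n: int = 1) -> bool:
        """Reserve n calls; return False if that would exceed the daily limit."""
        if self.used[api] + n > self.limits[api]:
            return False
        self.used[api] += n
        return True
```

Reset the counters at midnight Pacific time (Google quotas roll over on the project's quota day) and check `try_consume` before every batch.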
Key Takeaways
- Don't wait for Googlebot — use the Indexing API to proactively submit new pages
- Verify with URL Inspection — submission doesn't guarantee indexing. Always check
- Respect the limits — 200 submits and 2,000 inspections per day. Plan your batches
- Track status in a database — you need to know which URLs are pending, submitted, indexed, or failed
- Use analytics for content strategy — search performance data reveals exactly what to write next
- Automate the cycle — sync → submit → inspect → repeat, daily
With this approach, new pages go from "published" to "appearing in Google search results" in 1-2 days instead of weeks. For sites with large, dynamic page catalogs, this kind of automation is essential.
