A complete step‑by‑step guide for Blogger users – using Google Indexing API to get your posts indexed in 24‑48 hours.
What you'll learn
- What Google Indexing API is and why it works
- How to create a Google Cloud project (free)
- How to generate a service account and JSON key
- How to connect the API to your Google Search Console
- How to run a simple Python script (no coding skills needed) to submit all your URLs
- How to automate future posts
What is Google Indexing API?
Google Indexing API is a free service that lets website owners notify Google directly when pages are added or updated. Instead of waiting weeks for Google to discover your new posts through backlinks or sitemaps, you send a "ping" that tells Google: "Hey, I have fresh content – come crawl it now!"
It was designed for job postings and live streams, but many bloggers (including me) have used it successfully for regular blog posts. In this guide I'll show you exactly how to set it up for your Blogger blog.
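Under the hood, each "ping" is just an HTTPS POST to the API's publish endpoint with a two-field JSON body. A minimal sketch of what gets sent (the post URL below is a made-up example – the authenticated setup comes later in this guide):

```python
import json

# The notification body the Indexing API expects: just the URL and an
# event type ("URL_UPDATED" for new/changed pages, "URL_DELETED" for removed).
notification = {
    "url": "https://www.domebytes.online/2024/01/example-post.html",
    "type": "URL_UPDATED",
}

# This JSON is POSTed (with OAuth credentials) to the publish endpoint:
endpoint = "https://indexing.googleapis.com/v3/urlNotifications:publish"
print(json.dumps(notification, indent=2))
```

That's the entire payload – the steps below are mostly about getting the credentials that let you send it.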
Prerequisites
- A Blogger blog with published posts (the process is the same whether you have ten or a hundred)
- A Google account (you'll use it for Google Cloud and Search Console)
- Your blog already submitted to Google Search Console – if not, read this SEO guide
- Basic ability to copy/paste and follow instructions (no coding experience needed)
Step 1: Create a Google Cloud Project
- Go to Google Cloud Console and sign in with your Google account.
- At the top, click the project drop‑down → New Project.
- Name it something like `Blogger-Indexing` and click Create.
- Make sure the new project is selected (you'll see it in the top bar).
Step 2: Enable the Indexing API
- In the Cloud Console, click the hamburger menu ☰ → APIs & Services → Library.
- Search for "Indexing API" and click on it.
- Click the blue ENABLE button.
Step 3: Create a Service Account
- Go to IAM & Admin → Service Accounts.
- Click + CREATE SERVICE ACCOUNT.
- Name it `blogger-indexer`; for the description, something like "For indexing domebytes posts."
- Click CREATE AND CONTINUE.
- In the role drop‑down, search for `Owner` and select it.
- Click CONTINUE → DONE.
Step 4: Generate JSON Key File
- In the Service Accounts list, click the three dots (⋮) under Actions for your new service account → Manage keys.
- Click ADD KEY → Create new key.
- Choose JSON and click CREATE. A `.json` file will download – keep it safe (you'll need it later).
Step 5: Add Service Account to Google Search Console
- Open Google Search Console with the account that owns your blog.
- Select your blog property (e.g., `domebytes.online` or `domebytes.blogspot.com`).
- Click Settings (gear icon) → Users and permissions → ADD USER.
- Paste the service account email (it looks like `blogger-indexer@your-project-id.iam.gserviceaccount.com`).
- Select Owner permission → click ADD.
Step 6: Run the Indexing Script (No coding required)
We'll use Google Colab – a free online tool that runs Python scripts in your browser. No installation needed.
6.1 Upload your JSON key to Colab
- Go to Google Colab and log in.
- Click File → New notebook.
- On the left sidebar, click the folder icon (Files).
- Click Upload to session storage and select the JSON file you downloaded.
6.2 Run the test script (verify it works)
Copy the code below into the first code cell. Change the filename to match your uploaded JSON file.
```python
# Install the required libraries (Colab runs shell commands prefixed with !)
!pip install google-auth-oauthlib google-auth-httplib2 google-api-python-client

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Use your actual JSON filename
JSON_KEY_FILE = '/content/your-json-filename.json'
SCOPES = ['https://www.googleapis.com/auth/indexing']

credentials = service_account.Credentials.from_service_account_file(JSON_KEY_FILE, scopes=SCOPES)
service = build('indexing', 'v3', credentials=credentials)

# Test with your homepage
url = "https://www.domebytes.online/"
content = {"url": url, "type": "URL_UPDATED"}

try:
    result = service.urlNotifications().publish(body=content).execute()
    print(f"✅ Success: {url}")
except Exception as e:
    print(f"❌ Failed: {e}")
```
6.3 Submit ALL your blog posts at once
Run this script in a new cell – it will automatically fetch all your post URLs from your sitemap and submit them.
```python
import requests
import xml.etree.ElementTree as ET
from google.oauth2 import service_account
from googleapiclient.discovery import build

JSON_KEY_FILE = '/content/your-json-filename.json'
SCOPES = ['https://www.googleapis.com/auth/indexing']

credentials = service_account.Credentials.from_service_account_file(JSON_KEY_FILE, scopes=SCOPES)
service = build('indexing', 'v3', credentials=credentials)

# Fetch URLs from your Blogger sitemap
sitemap_url = "https://www.domebytes.online/sitemap.xml"
response = requests.get(sitemap_url)
root = ET.fromstring(response.content)

# Sitemap entries live in the standard sitemaps.org namespace
namespaces = {'sitemap': 'http://www.sitemaps.org/schemas/sitemap/0.9'}
urls = [loc.text for loc in root.findall('.//sitemap:loc', namespaces)]
print(f"Found {len(urls)} URLs. Submitting...")

success = 0
for url in urls:
    try:
        service.urlNotifications().publish(body={"url": url, "type": "URL_UPDATED"}).execute()
        print(f"✅ {url}")
        success += 1
    except Exception as e:
        print(f"❌ {url} -> {e}")

print(f"\nSubmitted {success} URLs to Google Indexing API.")
```
After running, you should see a ✅ line for each URL and a final summary reporting how many URLs were submitted. That means Google has received the notifications!
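One caveat before bulk-submitting: the Indexing API's default quota is 200 publish requests per day, so a blog with more posts than that needs to spread submissions across days. Here's a small sketch of one way to do that – `submit_fn` stands in for the real publish call from the script above, and the example URLs are made up:

```python
import time

def submit_in_batches(urls, submit_fn, per_day_limit=200, delay_s=1.0):
    """Submit at most `per_day_limit` URLs, pausing between calls.

    `submit_fn` is whatever performs the actual publish request, e.g. a
    wrapper around service.urlNotifications().publish(...).execute().
    Returns the URLs that were NOT submitted (run those again tomorrow).
    """
    for url in urls[:per_day_limit]:
        submit_fn(url)
        time.sleep(delay_s)  # gentle pacing to avoid rate-limit errors
    return urls[per_day_limit:]

# Example with a stand-in submit function (no real API calls):
submitted = []
leftover = submit_in_batches(
    [f"https://www.domebytes.online/p/{i}.html" for i in range(5)],
    submitted.append,
    per_day_limit=3,
    delay_s=0.0,
)
print(len(submitted), len(leftover))  # 3 submitted, 2 left for tomorrow
```

With 90-odd posts you fit comfortably under the default quota, but the leftover list makes the script safe to reuse on larger blogs.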
Step 7: Automate for Future Posts
You don't want to run the script manually every time you publish. Here's a simple workflow:
- Bookmark the Colab notebook – you can re‑run it whenever you add new posts.
- Set a reminder – run the script once a week to submit any new URLs.
- Alternative: wire up an automation tool like IFTTT or Zapier to fire a webhook when you publish – though calling the Indexing API from those tools takes extra setup, since each request must be authenticated with your service account credentials.
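If you re-run the notebook weekly, you'll re-submit old posts and burn quota. One way around that is keeping a small local record of already-submitted URLs and filtering against it. A sketch under a few assumptions: the `submitted_urls.json` filename is one I made up, and since Colab's session storage is wiped between sessions, you'd want to keep this file in Google Drive:

```python
import json
import os

STATE_FILE = "submitted_urls.json"  # hypothetical local record of past submissions

def load_submitted(path=STATE_FILE):
    """Read the set of URLs we've already submitted (empty if no record yet)."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()

def filter_new(all_urls, path=STATE_FILE):
    """Return only the URLs not yet in the local record."""
    seen = load_submitted(path)
    return [u for u in all_urls if u not in seen]

def record_submitted(urls, path=STATE_FILE):
    """Merge freshly submitted URLs into the record and save it."""
    done = load_submitted(path) | set(urls)
    with open(path, "w") as f:
        json.dump(sorted(done), f)

# Usage: after fetching `urls` from the sitemap as in Step 6.3:
#   new_urls = filter_new(urls)   # submit only these
#   record_submitted(new_urls)    # then remember them
```

Each weekly run then pings Google only about posts it hasn't seen before.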
Other helpful indexing methods
If you're interested in other approaches, here are some related tips and resources:
Manual Indexing – The quickest method for a single page is using Google Search Console's URL Inspection tool. Paste your new URL into the search bar, and if it's not indexed, click the "Request Indexing" button.
Sitemap Submission – Ensure your XML sitemap is up-to-date and submit it (or resubmit it) in Google Search Console. This helps Google discover all your pages, especially in bulk.
Internal Linking – Linking from your new post to an already well-indexed page on your site (and vice-versa) helps Google's crawlers find your fresh content more quickly.
Browse the DEV Community – You can also explore the seo and webdev tags on dev.to to see how other developers are tackling indexing challenges and discovering new tools.