Every Monday morning I was manually opening Google Search Console, exporting CSVs, and trying to figure out which keywords were almost ranking but not converting clicks. It took 45 minutes and I kept forgetting to do it.
So I built an agent to do it for me.
## What the SEO Agent Does
The agent runs weekly (via cron or Azure Function), and in under 60 seconds it:
- Fetches the last 28 days of query + page data from the Google Search Console API
- Filters for your target keywords from a custom list you define (any niche — e-commerce, SaaS, compliance, local services)
- Identifies quick wins — keywords sitting at position 5–20 with impressions but zero clicks (these are the easiest to fix)
- Flags pages that are getting impressions but no clicks (title/meta description problems)
- Sends all of this to Azure OpenAI (GPT-4o-mini) with a structured prompt
- Outputs a prioritised weekly report with exact pages to fix, exact keywords to target, and content to create
- Emails the report (optional, via Azure Communication Services) and saves it to a dated `.txt` file
## The Architecture

```
Google Search Console API
        ↓
fetch_gsc_data()
        ↓
filter_keyword_data()
        ↓
analyse_with_azure()   ← Azure OpenAI (GPT-4o-mini)
        ↓
send_email_report()    ← Azure Communication Services (optional)
        ↓
seo_report_YYYY-MM-DD.txt
```
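The pipeline above can be sketched as one orchestration function. The function names come from the diagram; the stub bodies below are placeholders so the flow is runnable, not the repo's actual implementations.

```python
from datetime import date

def fetch_gsc_data():
    # Stub: the real version queries the Search Console API (or falls back to CSV).
    queries = [{"query": "gdpr audit", "position": 12.0, "impressions": 40, "clicks": 0}]
    pages = [{"page": "/compliance", "impressions": 25, "clicks": 0}]
    return queries, pages

def filter_keyword_data(queries, pages):
    # Stub: the real filter is shown later in the post.
    return queries, queries, pages

def analyse_with_azure(data_summary):
    # Stub: the real version calls Azure OpenAI with the structured prompt.
    return f"## WEEKLY SEO REPORT\n\n{data_summary}"

def run_weekly():
    queries, pages = fetch_gsc_data()
    targets, quick_wins, zero_click_pages = filter_keyword_data(queries, pages)
    report = analyse_with_azure(
        f"{len(quick_wins)} quick wins, {len(zero_click_pages)} zero-click pages"
    )
    out_path = f"seo_report_{date.today().isoformat()}.txt"
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(report)
    return out_path
```

Each stage only hands a plain list of dicts to the next one, which is what makes the fallback swaps later in the post painless.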
## Key Code: Filtering Quick Wins
The most useful part of the agent is this simple filter. These are keywords you're almost ranking for — a small content or meta-title fix can push them to page 1:
```python
# Customise this list for your niche
TARGET_KEYWORDS = [
    "hipaa", "gdpr", "soc 2", "compliance", "certification",
    "audit", "security assessment", "data privacy", "regulatory",
    # → swap these for your own: "pricing", "review", "vs", etc.
]

def filter_keyword_data(queries, pages):
    target_queries = [
        q for q in queries
        if any(kw in q["query"].lower() for kw in TARGET_KEYWORDS)
    ]
    # Position 5-20, has impressions, zero clicks = easy win
    quick_wins = [
        q for q in queries
        if 5 <= q["position"] <= 20
        and q["impressions"] >= 5
        and q["clicks"] == 0
    ]
    zero_click_pages = [
        p for p in pages
        if p["clicks"] == 0 and p["impressions"] >= 20
    ]
    return target_queries, quick_wins, zero_click_pages
```
Just replace `TARGET_KEYWORDS` with whatever fits your site — product names, service types, locations, anything.
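A quick sanity check of the filter with toy rows (the same dict shape the GSC client returns: query, position, impressions, clicks) shows each bucket doing its job:

```python
TARGET_KEYWORDS = ["gdpr", "compliance"]

def filter_keyword_data(queries, pages):
    target_queries = [
        q for q in queries
        if any(kw in q["query"].lower() for kw in TARGET_KEYWORDS)
    ]
    quick_wins = [
        q for q in queries
        if 5 <= q["position"] <= 20 and q["impressions"] >= 5 and q["clicks"] == 0
    ]
    zero_click_pages = [p for p in pages if p["clicks"] == 0 and p["impressions"] >= 20]
    return target_queries, quick_wins, zero_click_pages

queries = [
    # Position 12.4, impressions, no clicks → a quick win
    {"query": "GDPR checklist", "position": 12.4, "impressions": 68, "clicks": 0},
    # Already on page 1 and getting clicks → nothing to do
    {"query": "soc 2 audit", "position": 3.1, "impressions": 200, "clicks": 15},
]
pages = [{"page": "/pricing", "impressions": 45, "clicks": 0}]

targets, wins, zero = filter_keyword_data(queries, pages)
```

Note the `.lower()` in the target match: GSC returns queries as users typed them, so "GDPR checklist" still matches the lowercase keyword list.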
## Key Code: The Azure OpenAI Prompt
The prompt asks GPT-4o-mini to output a structured report with specific actions — not vague advice:
```python
response = client.chat.completions.create(
    model=AZURE_DEPLOYMENT,
    messages=[
        {
            "role": "system",
            "content": (
                "You are an expert SEO analyst. "
                "Generate precise, actionable weekly reports. Be specific — name exact pages, "
                "exact keywords, exact meta title text. No fluff."
            ),
        },
        {
            "role": "user",
            "content": f"""Analyse this Search Console data and output exactly this structure:

## WEEKLY SEO REPORT

### TOP 3 ACTIONS THIS WEEK (do these first)
1. [specific action with exact page URL and what to change]

### KEYWORDS TO TARGET THIS WEEK
- [keyword] | position [X] | potential clicks if fixed

### PAGES TO FIX THIS WEEK
| Page | Problem | Fix |

### CONTENT TO CREATE THIS WEEK
- [specific page title and target keyword]

Data: {data_summary}
""",
        },
    ],
    max_tokens=800,
)
```
Asking for a fixed output structure is what makes the report consistently usable — without it, you get a different format every run.
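The prompt interpolates a `data_summary` string whose construction isn't shown above. One plausible way to build it (my sketch, not the repo's code) is to compact the filtered rows into a few pipe-delimited lines, which keeps the request small and cheap:

```python
def build_data_summary(quick_wins, zero_click_pages, limit=20):
    # Compact the filtered rows into a short, model-friendly summary.
    # `limit` caps each section so the prompt stays well under the token budget.
    lines = ["QUICK WINS (pos 5-20, impressions, 0 clicks):"]
    for q in quick_wins[:limit]:
        lines.append(f"- {q['query']} | pos {q['position']:.1f} | {q['impressions']} impressions")
    lines.append("ZERO-CLICK PAGES:")
    for p in zero_click_pages[:limit]:
        lines.append(f"- {p['page']} | {p['impressions']} impressions")
    return "\n".join(lines)

data_summary = build_data_summary(
    [{"query": "gdpr audit", "position": 15.5, "impressions": 68}],
    [{"page": "/compliance", "impressions": 43}],
)
```

Sending pre-aggregated lines instead of raw JSON also makes the model's output more stable, since it sees the data in roughly the same shape as the report it has to write.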
## Fallback Mode (No API Keys Needed to Test)
The agent has graceful fallbacks — if GSC is not connected yet, it uses sample CSV data. If Azure is not configured, it generates a rule-based report without AI:
```python
def fetch_gsc_data():
    try:
        # ... real GSC API call
        ...
    except Exception as e:
        print(f"⚠ GSC not connected: {e}")
        return _load_from_csv_fallback()  # uses sample data
This means you can run it right now without any API keys to see what the output looks like.
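The no-Azure path works the same way. A hedged sketch of what a rule-based report could look like (my illustration, not the repo's exact logic): sort the quick wins by impressions and emit the top few as actions, no model call needed.

```python
def rule_based_report(quick_wins):
    # Most-seen zero-click keywords first: they represent the biggest easy wins.
    ranked = sorted(quick_wins, key=lambda q: q["impressions"], reverse=True)
    lines = ["## WEEKLY SEO REPORT (rule-based, no AI)"]
    for i, q in enumerate(ranked[:3], 1):
        lines.append(
            f"{i}. Improve meta title for '{q['query']}' "
            f"(pos {q['position']:.1f}, {q['impressions']} impressions, 0 clicks)"
        )
    return "\n".join(lines)

report = rule_based_report([
    {"query": "hipaa checklist", "position": 10.0, "impressions": 10},
    {"query": "gdpr audit", "position": 8.0, "impressions": 50},
])
```

It is less polished than the GPT-4o-mini report, but it proves the data plumbing end to end before you spend anything on API keys.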
## Setup in 5 Steps
1. Clone the repo

```bash
git clone https://github.com/vaibhavi-git/SEO-Agent.git
cd SEO-Agent
```
2. Install dependencies

```bash
pip install -r requirements.txt
```
3. Copy and fill in your `.env`

```bash
cp env.template.txt .env
```

```ini
AZURE_OPENAI_ENDPOINT=https://YOUR-RESOURCE.openai.azure.com/
AZURE_OPENAI_KEY=your_key_here
AZURE_DEPLOYMENT=gpt-4o-mini
GSC_SITE_URL=sc-domain:yourdomain.com
GSC_CREDENTIALS_FILE=gsc_credentials.json
```
4. Set up Google Search Console OAuth
- Go to Google Cloud Console
- Create a project → Enable Search Console API
- Credentials → Create OAuth 2.0 Client ID (Desktop app) → Download JSON → save as `gsc_credentials.json`
- First run opens a browser for Google auth — approve it once, token is saved automatically
5. Run it

```bash
python seo_agent.py
```

Schedule it (weekly, Monday 8am):

```bash
0 8 * * 1 cd /path/to/SEO-Agent && python seo_agent.py
```
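If you run it as an Azure Function instead of cron, a timer trigger with the same weekly schedule looks roughly like this (v2 Python programming model; `run_weekly` is assumed to wrap the agent's pipeline, and note Azure uses six-field NCRONTAB, not five-field cron):

```python
import azure.functions as func

app = func.FunctionApp()

# NCRONTAB fields: {second} {minute} {hour} {day} {month} {day-of-week}
@app.timer_trigger(schedule="0 0 8 * * Mon", arg_name="timer")
def weekly_seo_report(timer: func.TimerRequest) -> None:
    run_weekly()
```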
## Sample Output

```
## WEEKLY SEO REPORT
Week of: 2026-04-07
Biggest opportunity: keyword at position 15.5, 68 impressions, 0 clicks

### TOP 3 ACTIONS THIS WEEK
1. Update /your-page meta title → target "your keyword + city/category"
2. Add FAQ section to /another-page — 43 impressions at position 7.3, 0 clicks
3. Create a landing page for "affordable [your service] [location]" — 136 impressions waiting

### KEYWORDS TO TARGET THIS WEEK
- "your keyword" | pos 15.5 | ~8 clicks/week if reaches top 5
- "another keyword" | pos 7.95 | ~15 clicks/week potential
```
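The "~8 clicks/week if fixed" style estimates can be derived by multiplying a keyword's impressions by a typical click-through rate for the target position. The CTR table below is an illustrative assumption (published CTR-by-position studies vary widely), not measured data from the agent:

```python
# Rough organic CTR by position — illustrative values, adjust for your niche.
TYPICAL_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def potential_clicks(impressions, target_position=5):
    # Positions outside the table get a conservative long-tail CTR.
    return round(impressions * TYPICAL_CTR.get(target_position, 0.03))

potential_clicks(160, target_position=5)  # 160 * 0.05 → 8
```

Feeding these estimates into the prompt data gives the model concrete numbers to cite instead of inventing its own.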
## What I Learned
- Quick wins are real — the position 5–20 filter immediately surfaced keywords that just needed a meta title change to start getting clicks
- Structured prompts matter — asking GPT-4o-mini for a fixed output format made reports consistently actionable
- Fallback mode is critical — testing without API keys saved hours of debugging during setup
- The keyword list is everything — spend time on `TARGET_KEYWORDS`; the more specific to your niche, the more focused the AI recommendations
## GitHub Repo
Full source code, requirements, and env template:
👉 github.com/vaibhavi-git/SEO-Agent
The repo includes:
- `seo_agent.py` — complete agent
- `requirements.txt` — pinned dependencies
- `env.template.txt` — all config variables explained
- `README.md` — full setup guide
This pattern works for any niche — swap TARGET_KEYWORDS for your own list.