I do SEO research every Monday at 1 AM. Not manually — a cron job does it for me.
Output:
- Markdown report with keyword gaps
- 7 tasks created in NocoDB with priority
- E-E-A-T analysis (which keywords leverage my exit/experience)
Time invested: 5 minutes reading the report (vs 3 hours/week manually).
Savings: 12 hours/month.
It's not magic. It's a system that automates the tedious stuff (fetching data, comparing lists, detecting gaps) so I can focus on the strategic work (what to write, how to position it).
Here's how it works (with code included).
The Problem: Manual SEO Is Tedious
Traditional SEO research:
- Open Ahrefs/SEMrush/Serpstat
- See keywords you rank for
- See keywords competitors rank for
- Compare lists manually (Excel hell)
- Identify gaps ("they rank for X, I don't")
- Prioritize by volume + difficulty
- Create tasks in task manager
Time: 2-3 hours/week.
Frequency: Weekly (because keywords change, competitors publish).
Result: SEO gets deprioritized because it's boring.
The Solution: Automated SEO with Serpstat API
My setup:
- API: Serpstat (keyword research + competitor analysis)
- Cron: Monday 1 AM (isolated session, no clutter)
- Model: Opus 4.6 (strategic analysis requires deep thinking)
- Output: Markdown report + NocoDB tasks
- Cost: $69/month Serpstat + $1.20/month Opus cron
Workflow:
- Fetch keywords from my 2 domains
- Fetch high-volume keywords in tech/startup niche
- Gap analysis (keywords where I DON'T rank)
- E-E-A-T analysis (which ones leverage my real experience)
- Create NocoDB tasks with priority
- Save report to `content-strategy/keyword-research/weekly-YYYY-MM-DD.md`
The Code (Simplified)
```python
# Pseudo-code for the weekly SEO cron
from datetime import date

def weekly_seo_research():
    # 1. Fetch keywords my domain already ranks for
    cristiantala_kw = serpstat.get_domain_keywords("cristiantala.com")
    # Returns: 98 keywords

    # 2. Fetch target keywords (tech/startup niche)
    target_keywords = serpstat.get_keywords_top(
        query="startup OR pitch OR investment OR fundraising OR AI",
        country="US",
        limit=500
    )

    # 3. Gap analysis: target keywords where I DON'T rank.
    #    Compare by keyword string, but keep the full dicts so
    #    volume/difficulty are still available for filtering.
    my_terms = {kw["keyword"] for kw in cristiantala_kw}
    gaps = [kw for kw in target_keywords if kw["keyword"] not in my_terms]

    # Filter by volume > 100/month AND difficulty < 40
    high_value_gaps = [
        kw for kw in gaps
        if kw["volume"] > 100 and kw["difficulty"] < 40
    ]

    # 4. E-E-A-T analysis with Opus
    prompt = f"""
    Analyze these keyword gaps for a founder who:
    - Sold a fintech startup for $23M
    - Has 30+ angel investments
    - Is a tech mentor

    Keyword gaps: {high_value_gaps}

    For each keyword:
    1. Relevance (1-10)
    2. E-E-A-T advantage (does their real experience give them an edge?)
    3. Content angle (what unique perspective can they offer?)
    4. Priority (P0/P1/P2/P3)

    Return top 10 opportunities.
    """
    analysis = opus_model.generate(prompt)

    # 5. Create NocoDB tasks
    for opportunity in analysis["top_10"]:
        nocodb.create_task({
            "title": f"SEO: Write '{opportunity['keyword']}' post",
            "priority": opportunity["priority"],
            "tags": ["SEO", "Content", "Blog"]
        })

    # 6. Generate and save the report
    report = generate_markdown_report(analysis)
    today = date.today().isoformat()
    save_to_file(f"content-strategy/keyword-research/weekly-{today}.md", report)
```
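The `generate_markdown_report` helper isn't shown above. A minimal sketch, assuming the Opus analysis comes back as a dict with a `top_10` list shaped like the prompt requests (keyword, volume, angle, priority are assumed field names):

```python
from datetime import date

def generate_markdown_report(analysis: dict) -> str:
    """Render the weekly analysis as a Markdown report (minimal sketch)."""
    lines = [f"## SEO Weekly Report — {date.today().isoformat()}", ""]
    for i, opp in enumerate(analysis["top_10"], start=1):
        lines.append(f"#### {i}. \"{opp['keyword']}\" ({opp['volume']}/mo)")
        lines.append(f"**Angle:** {opp['angle']}")
        lines.append(f"**Priority:** {opp['priority']}")
        lines.append("")  # blank line between opportunities
    return "\n".join(lines)

# Hypothetical sample input, mirroring the report shown below
sample = {"top_10": [{"keyword": "pitch deck template", "volume": 3600,
                      "angle": "Share the actual deck from the exit",
                      "priority": "P1"}]}
print(generate_markdown_report(sample))
```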
Sample Report Output (Real)
```markdown
## SEO Weekly Report — 2026-02-16

### Summary
- **Site keywords ranking:** 98
- **Content gaps found:** 47 high-value keywords
- **Tasks created:** 7 (P1: 3, P2: 4)

### Top Opportunities

#### 1. "pitch deck template" (3.6K/mo, Difficulty 35)
**Angle:** Share actual pitch deck from exit + investor feedback
**E-E-A-T:** Real exit + 30+ investments = unfair advantage
**Traffic potential:** 500-800/month
**Priority:** P1

#### 2. "product market fit" (390/mo, Difficulty 28)
**Angle:** PMF lessons from scaling from 0 to exit
**E-E-A-T:** Built product from 0 → exit, knows PMF journey firsthand
**Traffic potential:** 150-250/month
**Priority:** P1
```
E-E-A-T: Your Competitive Advantage
E-E-A-T = Experience, Expertise, Authoritativeness, Trustworthiness (from Google's Search Quality Rater Guidelines).
My case:
- Experience: Sold startup, made 30+ investments
- Expertise: 11 years university professor (programming, algorithms)
- Authority: Forbes, Bloomberg, major business press coverage
- Trust: Real name, verified LinkedIn, public track record
SEO implication:
- Do NOT compete on generic keywords ("how to build a startup")
- DO compete on keywords where experience = advantage ("negotiate term sheets", "investor due diligence")
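That rule can be encoded as a cheap pre-filter before sending gaps to the model. This is a sketch, not part of the real cron, and the `experience_terms` list is purely illustrative:

```python
# Hypothetical pre-filter: keep only gaps that touch real experience.
# The experience_terms set is illustrative, not from the actual cron.
experience_terms = {"pitch", "term sheet", "due diligence",
                    "fundraising", "investor", "exit"}

def has_eeat_edge(keyword: str) -> bool:
    """True if the keyword overlaps a topic I have firsthand experience in."""
    return any(term in keyword.lower() for term in experience_terms)

gaps = ["how to build a startup", "negotiate term sheets",
        "investor due diligence", "best laptops 2026"]
eeat_gaps = [kw for kw in gaps if has_eeat_edge(kw)]
print(eeat_gaps)  # → ['negotiate term sheets', 'investor due diligence']
```

Generic keywords fall through the filter; only the ones where experience is an edge reach the expensive Opus analysis.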
Result: I rank higher with fewer backlinks — because Google values E-E-A-T.
Lessons Learned
1. Use Your Best Model for Analysis
I tried a cheaper model for SEO analysis.
Cheap model output: Mechanically lists keywords, misses thematic connections, loses E-E-A-T context.
Powerful model output: "Your exit gives you credibility for 'pitch deck' that competitors lack." Suggests pillar pages, prioritizes by business impact, not just volume.
Verdict: SEO analysis IS strategic thinking. Use your best model.
2. Content Gaps ≠ Keyword Stuffing
190 keyword gaps does NOT mean "write 190 posts."
Better approach:
- Group keywords by topic ("pitch deck", "investor deck", "startup presentation" = 1 pillar post)
- Create comprehensive content (3,000+ words)
- Naturally cover 10-20 related keywords
Result: 10 pillar posts > 100 thin posts.
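The grouping step can be approximated with simple token overlap. A naive sketch (real topical clustering would use embeddings; the stopword list is an assumption):

```python
from collections import defaultdict

def group_by_head_term(keywords: list[str]) -> dict[str, list[str]]:
    """Group keywords sharing a leading significant token into pillar buckets.
    Naive token sketch; real clustering would use embeddings."""
    stop = {"how", "to", "a", "the", "for", "best"}
    buckets: dict[str, list[str]] = defaultdict(list)
    for kw in keywords:
        tokens = [t for t in kw.lower().split() if t not in stop]
        head = tokens[0] if tokens else kw  # first meaningful word
        buckets[head].append(kw)
    return dict(buckets)

kws = ["pitch deck", "pitch deck template", "pitch deck examples",
       "investor deck", "startup presentation"]
print(group_by_head_term(kws))
```

Three "pitch deck" variants collapse into one bucket: one pillar post instead of three thin ones.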
3. Automate Data, Not Decisions
Serpstat API fetches data → automated.
Which keywords to write about → human decision (AI-assisted).
Anti-pattern: "Just write about top 10 keywords by volume."
Reality: Volume without relevance = wasted effort.
4. E-E-A-T Beats Backlinks (For Personal Brands)
Experiment:
- Generic post ("how to make a pitch deck"): 20 backlinks, position #25
- Post with real exit story ("pitch deck that closed $23M"): 3 backlinks, position #8
Why: Google detects expertise (exit mentioned, real numbers, unique perspective).
Lesson: For founder content, E-E-A-T > backlinks.
OpenClaw Cron Config
```json
{
  "name": "SEO Weekly Report",
  "schedule": {
    "kind": "cron",
    "expr": "0 1 * * 1",
    "tz": "America/New_York"
  },
  "payload": {
    "kind": "agentTurn",
    "message": "Run weekly SEO research. Fetch keywords via Serpstat API. Identify content gaps, analyze E-E-A-T opportunities, create NocoDB tasks. Save report to content-strategy/keyword-research/weekly-YYYY-MM-DD.md.",
    "model": "opus",
    "timeoutSeconds": 600
  },
  "sessionTarget": "isolated",
  "delivery": {
    "mode": "announce",
    "channel": "telegram"
  }
}
```
Why Monday 1 AM:
- Before the work week starts
- SEO tasks ready for Monday morning review
- Doesn't interrupt daily workflow
Results (30 Days)
Stats:
- Keywords analyzed: 680
- Content gaps identified: 190
- High-value opportunities: 47
- Tasks created: 21 (P1: 7, P2: 10, P3: 4)
- Tasks completed: 8 (38% completion in 30 days)
Time saved:
- Manual: 3h/week
- Automated: 5 min review
- Savings: 2h 55min/week = 12 hours/month
ROI Calculation
Cost:
- Serpstat: $69/month
- Model cron: $1.20/month
- Total: $70/month
Value:
- Time saved: 12h/month
- At $100/hour = $1,200 value
- Net ROI: $1,130/month
Payback: Immediate (month 1).
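The arithmetic above in a few lines (the post rounds $70.20 down to $70 and $1,129.80 up to $1,130):

```python
serpstat_cost = 69.00   # $/month
cron_cost = 1.20        # $/month (Opus usage)
hours_saved = 12        # per month
hourly_rate = 100       # $/hour value of my time

total_cost = serpstat_cost + cron_cost   # ≈ 70.20
value = hours_saved * hourly_rate        # 1200
net_roi = value - total_cost             # ≈ 1129.80
print(f"Net ROI: ${net_roi:,.2f}/month")
```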
Conclusion: SEO Doesn't Happen By Accident
Manual SEO = "I should do SEO."
Automated SEO = "Here are 7 tasks with clear ROI."
The system converts good intentions into consistent execution.
Result:
- 12 hours/month saved
- Weekly consistency (not "when I have time")
- E-E-A-T focus (leverages real competitive advantage)
If you do content marketing and have 50+ posts, automating SEO research pays for itself immediately.
Do you automate SEO research? What tools are you using? Share in the comments.
📝 Originally published in Spanish at cristiantala.com