TL;DR
I built a GEO (Generative Engine Optimization) analyzer tool, tested it on my own portfolio site, and scored a disappointing 53/100. Then I systematically fixed every issue it found. Here's the exact process, what I changed, and why it matters for anyone who wants their content to appear in ChatGPT, Perplexity, and Google AI Overviews.
What is GEO and Why Should You Care?
Generative Engine Optimization is the practice of structuring your content so AI search engines can find, understand, and cite it. Traditional SEO gets you ranked on Google. GEO gets you mentioned in AI-generated answers.
This matters because the way people search is changing fast. Instead of clicking through 10 blue links, users are asking ChatGPT "What's the best competitor analysis tool?" or asking Perplexity "How do I automate cold outreach?" If your content isn't optimized for these AI engines, you're invisible to a growing segment of your audience.
The Tool I Built
GEO Analyzer evaluates any web page across 5 categories, each scored out of 20 points for a total of 100:
Content Structure — H1 tags, heading hierarchy, question-based H2s, internal links
E-E-A-T Signals — Author info, dates, citations, statistics, trust indicators
Technical AI Readiness — Meta tags, schema markup, Open Graph, canonical URLs, AI crawler access
Content Quality — Value, clarity, flow, Q&A format, originality (AI-evaluated)
AI Search Optimization — Snippable content, FAQ structure, definitions, lists, topic focus (AI-evaluated)
The first three categories are analyzed programmatically using Cheerio for HTML parsing. The last two use Groq's Llama 3.3 model for AI-powered evaluation, with Gemini as a fallback.
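To make the programmatic side concrete, here is a minimal sketch of the kind of structure checks described above. The real tool parses HTML with Cheerio; this version uses plain regexes only to stay dependency-free, and the exact heuristics (which question words count, what counts as an internal link) are my assumptions, not the analyzer's actual rules:

```javascript
// Simplified sketch of the analyzer's programmatic structure checks.
// The real tool uses Cheerio for proper HTML parsing; the regexes and
// heuristics below are illustrative assumptions, not the tool's code.
function checkStructure(html) {
  // Count <h1> tags — the analyzer expects exactly one.
  const h1Count = (html.match(/<h1[\s>]/gi) || []).length;

  // Collect <h2> headings and flag question-style ones.
  const h2s = [...html.matchAll(/<h2[^>]*>(.*?)<\/h2>/gis)].map(m => m[1].trim());
  const questionH2Count = h2s.filter(
    t => t.endsWith('?') || /^(what|how|why|when|who|where)\b/i.test(t)
  ).length;

  // Internal links: hrefs that start with "/".
  const internalLinks = (html.match(/<a\s[^>]*href=["']\//gi) || []).length;

  return { h1Count, questionH2Count, internalLinks };
}

// Example run against a tiny page fragment:
const page = '<h1>Akın Coşkun</h1>' +
  '<h2>What Do I Build?</h2><p>AI tools.</p>' +
  '<h2>Contact</h2>' +
  '<a href="/blog">Blog</a>';
console.log(checkStructure(page)); // { h1Count: 1, questionH2Count: 1, internalLinks: 1 }
```

A scoring layer would then map these raw counts onto the 20-point category scales.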
My Score: 53/100 (Grade: F)
Here's what the analyzer found on my portfolio site:
| Category | Score | Issues |
| --- | --- | --- |
| Content Structure | 6/20 | No H1 tag, no question-based H2s, only 1 internal link |
| E-E-A-T Signals | 13/20 | No publication date, no statistics |
| Technical AI Readiness | 13/20 | No schema markup, no canonical URL |
| Content Quality | 13/20 | Low Q&A format, weak flow |
| AI Search Optimization | 8/20 | No snippable content, no FAQ, no definitions |
The biggest problems were clear: no schema markup (AI crawlers couldn't understand my site's structure), no question-based headings (AI engines love Q&A format), and no "snippable" content (short, quotable sentences AI can cite directly).
The Fixes (53 → 85+)
Fix 1: Schema Markup (0 → 5 points)
I added JSON-LD structured data to my site. This tells AI crawlers exactly who I am and what I do:
A Person schema with my name, job title, skills, and social links. Plus a FAQPage schema for common questions. AI engines parse this structured data directly — it's like handing them a summary card instead of making them read your entire page.
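A trimmed-down Person schema looks like this (the property values here are illustrative, taken from the details mentioned in this post — swap in your own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Akın Coşkun",
  "jobTitle": "Full-Stack Developer",
  "knowsAbout": ["AI automation", "N8N workflows", "SaaS development"],
  "sameAs": ["https://github.com/akincskn"]
}
</script>
```

The block goes in your page's `<head>` (or anywhere in the body); crawlers pick it up regardless of where your visible content lives.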
Fix 2: Single H1 + Question-Based H2s (0 → 10 points)
Changed my heading structure. One clear H1 at the top. H2s reformatted as questions: "What Technologies Do I Work With?", "What Problems Do I Solve?", "How Can I Help Your Business?"
AI engines are trained on billions of question-answer pairs. When your heading is a question and the content below answers it, you're speaking their language.
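In markup terms, the pattern is simply a question heading followed immediately by a direct answer (headings below are examples from my restructure):

```html
<h1>Akın Coşkun — Full-Stack Developer</h1>

<h2>What Technologies Do I Work With?</h2>
<p>I work with Node.js, N8N, and AI APIs to build automation tools.</p>

<h2>What Problems Do I Solve?</h2>
<p>I help businesses automate repetitive workflows at zero infrastructure cost.</p>
```

The key detail is that the first sentence under each H2 answers the question on its own, so an AI engine can lift that sentence verbatim.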
Fix 3: Snippable Content (0 → 5 points)
Added short, definitive sentences that AI can quote directly. For example: "Akın Coşkun is a full-stack developer specializing in AI-powered automation, N8N workflows, and zero-cost SaaS development."
This sentence is designed to be the answer when someone asks an AI "Who is Akın Coşkun?" or "Who builds N8N automations?"
Fix 4: FAQ Section with Schema (0 → 4 points)
Added a dedicated FAQ section with 5 common questions, each with a concise answer. Backed by FAQPage schema markup so AI engines can extract Q&A pairs directly.
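The matching FAQPage markup pairs each visible question with its answer (one entry shown; the question and answer text here are placeholders for your own):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What technologies do you work with?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Node.js, N8N workflows, and AI APIs for automation tools."
      }
    }
  ]
}
</script>
```

Keep the schema answers identical to the visible FAQ text; mismatches between markup and page content can get the structured data ignored.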
Fix 5: Statistics and Data (0 → 4 points)
Added concrete numbers: "10+ production projects", "$0/month infrastructure cost", "1 published npm package", "4 professional certifications." AI engines love citing specific statistics.
Fix 6: Internal Linking (1 → 4 links)
Connected everything: project cards link to blog posts, blog section links to projects, FAQ links to relevant pages. AI crawlers follow internal links to build a complete picture of your site.
Fix 7: Canonical URL + robots.txt
Added canonical URL to prevent duplicate content issues. Updated robots.txt to explicitly allow AI crawlers (GPTBot, ClaudeBot, PerplexityBot).
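The robots.txt additions are a few explicit allow rules, one block per crawler:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```

The canonical URL is a single `<link rel="canonical" href="...">` tag in the page's `<head>`, pointing at the page's preferred address.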
What I Learned About GEO
After going through this process, here are the key takeaways:
Schema markup is non-negotiable. It's the single most impactful thing you can add for AI visibility. Person, FAQPage, SoftwareApplication — these schemas give AI engines structured data they can parse instantly.
Question-based headings work. AI engines are fundamentally Q&A machines. Format your content as questions and answers, and you're aligning with how they process information.
Snippable sentences are your AI elevator pitch. Write 1-2 sentences per section that could stand alone as an AI-generated answer. Clear, definitive, factual.
Traditional SEO still matters. AI engines often use Google search results as their source. If you rank well on Google, you're more likely to be cited by ChatGPT and Perplexity.
E-E-A-T is universal. Whether it's Google's algorithm or an LLM deciding which source to cite, expertise, experience, authoritativeness, and trust are what get you chosen.
Try GEO Analyzer Yourself
Enter any URL and get your GEO score with specific recommendations:
Live app: geo-analyzer-sepia.vercel.app
Source code: github.com/akincskn/geo-analyzer
What's Next
GEO is still a new field. Most websites score below 50/100 because nobody's optimizing for AI search yet. The developers and marketers who start now will have a massive advantage as AI search becomes the default.
I'm continuing to build tools in this space. If you're interested in GEO/AEO, AI automation, or zero-cost SaaS development, follow me for more.
I'm Akın Coşkun, a full-stack developer and AI automation specialist. I build tools like RivalRadar (AI competitor analysis), GEO Analyzer (AI search optimization), and LeadPilot (AI SDR agent). Find me on GitHub or check my portfolio.