## The Problem Nobody Talks About
I was spending hundreds of dollars on AI API calls. Half of it wasn't buying me useful information — it was paying to process navigation bars, cookie banners, and inline scripts.
A typical e-commerce product page? 47,000 tokens. The useful product information? ~300 tokens.
In many cases, 90–99% of the processed tokens are structural or navigational noise.
I kept thinking: what if the server could just... give the agent what it needs?
## Enter MAKO
MAKO (Markdown Agent Knowledge Optimization) is an open protocol that uses standard HTTP content negotiation to serve AI-optimized content.
The idea is dead simple:
- An AI agent sends `Accept: text/mako+markdown` in its request header
- The server detects this header
- Instead of HTML, it responds with structured markdown + YAML frontmatter
- The agent gets exactly what it needs in ~280 tokens
No new endpoints. No special APIs. Just HTTP, the way it was designed.
MAKO does not replace HTML. Browsers still receive HTML.
Only agents explicitly requesting text/mako+markdown receive the optimized version.
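Detection is nothing more than an Accept-header check. A minimal sketch of that check (`wantsMako` is a hypothetical helper for illustration, not part of the published SDK):

```javascript
const MAKO_TYPE = 'text/mako+markdown';

// Return true if the request's Accept header negotiates the MAKO
// media type. Splits "a, b;q=0.8" into media ranges and compares
// the bare type, ignoring quality parameters.
function wantsMako(acceptHeader = '') {
  return acceptHeader
    .split(',')
    .map((range) => range.split(';')[0].trim().toLowerCase())
    .includes(MAKO_TYPE);
}
```

A browser sending `Accept: text/html` falls through to the normal HTML response; only an explicit `text/mako+markdown` range opts in.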
## What a MAKO Response Looks Like
```markdown
---
mako: "1.0"
type: product
entity: "Nike Air Max 90"
updated: "2026-02-10"
tokens: 280
language: en
summary: "Mid-range running shoe, 47% off"
links:
  - url: /running-shoes
    context: "Browse all running shoes"
actions:
  - name: add_to_cart
    description: "Add to shopping cart"
    endpoint: /api/cart/add
---

# Nike Air Max 90

## Key Facts

- Price: €79.99 (was €149.99, -47%)
- In stock (23 available)
- Sizes: 38-46
- Rating: 4.3/5 (234 reviews)

## Overview

Casual running shoe, mid-range. Leather/mesh upper.
Competitors: Adidas Ultraboost (€129), NB 1080 (€139).
Strong point: price. Weak point: narrow fit per reviews.
```
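On the agent side, separating the frontmatter from the markdown body is a one-regex job. A sketch, assuming the format above (YAML between `---` fences, then markdown); a real implementation would hand the frontmatter string to a YAML parser:

```javascript
// Split a MAKO document into its raw frontmatter string and its
// markdown body. Throws if the '---' fences are missing.
function splitMako(doc) {
  const match = doc.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!match) throw new Error('Not a MAKO document');
  return { frontmatter: match[1], body: match[2] };
}
```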
## Why Not Just Scrape HTML?
Scraping pushes complexity to the client.
MAKO moves structure to the source.
Instead of:
Agent → Parse DOM → Remove noise → Infer structure
You get:
Agent → Receive structured semantic content
Lower token cost. Clear semantic boundaries. Deterministic machine-readable structure.
These numbers come from real-world pages measured using GPT tokenization.
Exact savings depend on site structure, but the pattern is consistent.
## The Numbers
| Page Type | HTML Tokens | MAKO Tokens | Savings |
|---|---|---|---|
| E-commerce product | 47,000 | 280 | 99.4% |
| Blog article | 35,000 | 670 | 98.1% |
| Documentation | 38,000 | 980 | 97.4% |
| Landing page | 110,000 | 640 | 99.4% |
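The savings column is just `(html - mako) / html`, which you can verify against the table yourself:

```javascript
// Percentage of tokens saved by serving MAKO instead of HTML,
// rounded to one decimal place.
function tokenSavings(htmlTokens, makoTokens) {
  return (((htmlTokens - makoTokens) / htmlTokens) * 100).toFixed(1) + '%';
}
```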
You can see a real generated MAKO page here:
👉 https://makospec.vercel.app/en/p/aZo89_kQ
## How to Implement It
### Using the SDK
```shell
npm install @mako-spec/js
```
```javascript
import { MakoGenerator, MakoParser } from '@mako-spec/js';

// Generate a MAKO file
const generator = new MakoGenerator();
const mako = generator.generate({
  type: 'product',
  entity: 'Nike Air Max 90',
  language: 'en',
  body: '# Nike Air Max 90\n\nCasual running shoe...',
  links: [{ url: '/running', context: 'More running shoes' }],
  actions: [{ name: 'add_to_cart', description: 'Add to cart' }]
});

// Parse a MAKO file
const parser = new MakoParser();
const result = parser.parse(makoContent);
console.log(result.frontmatter.entity); // "Nike Air Max 90"
```
### Express Middleware (3 lines)
```javascript
import { makoMiddleware } from '@mako-spec/js/middleware';

app.use(makoMiddleware({
  getContent: async (req) => {
    // Return your MAKO content for this URL
    return await fetchMakoForUrl(req.path);
  }
}));
```
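Under the hood, the middleware amounts to content negotiation plus a content lookup. A framework-agnostic sketch of that logic, assuming an Express-style `(req, res, next)` signature; this is illustrative, not the published middleware's source:

```javascript
// getContent is whatever async lookup you supply (e.g. a DB or file read).
function sketchMakoMiddleware(getContent) {
  return async (req, res, next) => {
    const accept = (req.headers['accept'] || '').toLowerCase();
    // Not a MAKO request: fall through to the normal HTML pipeline.
    if (!accept.includes('text/mako+markdown')) return next();
    const body = await getContent(req);
    res.setHeader('Content-Type', 'text/mako+markdown; charset=utf-8');
    res.end(body);
  };
}
```

Because it only intercepts requests that explicitly negotiate the MAKO media type, browsers and crawlers that send `Accept: text/html` are untouched.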
## Free Tools
- MAKO Score — Audit any site's AI-readiness (0-100). Checks 30 criteria across 4 categories.
- MAKO Analyzer — Paste any URL and see the generated MAKO output + token savings.
- WordPress Plugin — Full WooCommerce support, AI generation with your own API key.
## How MAKO Fits In
| Tool | Scope | MAKO Difference |
|---|---|---|
| llms.txt | 1 file per site | MAKO: 1 file per page |
| Cloudflare MD | Auto-converted HTML | MAKO: Semantically optimized |
| Schema.org | For search engines | MAKO: For AI agents |
| WebMCP | Actions only | MAKO: Content + actions + context |
MAKO is complementary — it works alongside all of these.
## Why This Matters
AI agents are becoming a first-class web audience.
Just as search engines needed crawlable HTML, AI agents need structured representations.
The web evolved once for browsers. It's evolving again for agents.
Visually, the difference looks like this:
### Traditional Web Flow

```
Browser → HTML → Rendered Interface → Human
AI Agent → HTML → DOM Parsing → Layout + Navigation + Script Overhead
```

### With MAKO

```
Browser → HTML → Rendered Interface → Human
AI Agent → MAKO → Structured Semantic Content → Deterministic Extraction
```
## Open Source
Everything is Apache 2.0 licensed:
- Spec: github.com/juanisidoro/mako-spec
- JS SDK: npmjs.com/package/@mako-spec/js
- CLI: npmjs.com/package/@mako-spec/cli
- WordPress: github.com/juanisidoro/mako-wp
- Website: makospec.vercel.app
I'd love your feedback — on the protocol design, the scoring system, or anything else. Star the repo if you find it useful.
How are you handling web content extraction in your AI apps right now? Drop it in the comments.