Building a website with an AI assistant usually ends the same way: you get a wall of HTML in a code block, you paste it somewhere, and then you're on your own for hosting, deployment, and every update after that.
We wanted to fix that. So we built WebsitePublisher.ai — a platform where AI assistants don't just describe websites, they actually build and publish them.
Here's how it works under the hood.
The core idea: AI as a first-class developer
The premise is simple. Instead of AI being a code generator that hands off to a human, we wanted AI to be the developer — with access to a real API it can call directly.
That API needed to cover the full stack of what a website actually needs:
- Pages and assets (HTML, CSS, images)
- Structured data (entities, records)
- Forms and visitor sessions
- Integrations (email, SMS, payments)
- Scheduled tasks
- Vault (credentials management)

So we built it: eight API layers, all exposed through a Model Context Protocol (MCP) server.
What MCP gives us
MCP is an open protocol that lets AI assistants call tools — similar to function calling, but standardized across clients. Claude, ChatGPT, Cursor, Windsurf, GitHub Copilot, and others all support it.
Our MCP server exposes ~55 tools. An AI can call create_page with HTML content and it's live. It can call configure_form and a contact form appears. It can call create_scheduled_task and a nightly content refresh starts running.
The AI doesn't need to know about hosting, DNS, or deployment. It just calls the tools.
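Under the protocol, a tool invocation is just a JSON-RPC 2.0 request with method tools/call. Here's a minimal sketch of what a create_page call might look like on the wire; the argument names ("path", "html") are illustrative, since the real schema comes from the server's tools/list response:

```python
import json

# A tools/call request per the MCP spec (JSON-RPC 2.0).
# Argument names below are assumptions, not the actual create_page schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_page",
        "arguments": {
            "path": "/index.html",
            "html": "<!DOCTYPE html><html><body><h1>Hello</h1></body></html>",
        },
    },
}

print(json.dumps(request, indent=2))
```

Any MCP client (Claude, Cursor, and so on) constructs this for the AI automatically; the point is that there's nothing deployment-specific in it.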
The API layers
We ended up with eight layers, each with a clear responsibility:
PAPI (Pages & Assets) — Create, update, and version HTML pages and static assets. Includes diff-patch for surgical updates, URL fetching, and a content quality warning system.
MAPI (Entities & Data) — A schema-less data layer. The AI defines entities (think: database tables) and creates records. Powers everything from contact lists to leaderboards to inventory.
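"Schema-less" here means the AI names an entity and then sends whatever fields it wants per record. A tiny in-memory sketch of those semantics (not WebsitePublisher's actual implementation, and the method names are ours):

```python
# Minimal sketch of schema-less entity/record semantics.
class EntityStore:
    def __init__(self):
        self.entities = {}

    def define_entity(self, name):
        # An entity is just a named bucket of records, like a table with no schema.
        self.entities.setdefault(name, [])

    def create_record(self, entity, **fields):
        # No schema enforcement: each record is whatever dict the AI sends.
        self.entities[entity].append(fields)
        return fields

store = EntityStore()
store.define_entity("leads")
store.create_record("leads", name="Ada", email="ada@example.com")
store.create_record("leads", name="Bob", phone="+31 6 0000 0000")  # different fields, same entity
```

That flexibility is what lets one layer power contact lists, leaderboards, and inventory alike.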
SAPI (Sessions & Forms) — Anonymous visitor sessions, form submissions, visitor authentication, and analytics. No cookies to configure — it just works.
VAPI (Vault) — Encrypted credential storage. The AI stores API keys that are then used by integrations — never exposed back to the client.
IAPI (Integrations) — A proxy engine that routes calls through stored credentials to external services. Resend, Mailgun, Stripe, Mollie, Twilio — the AI picks the integration, the vault provides the credentials.
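The proxy pattern can be sketched in a few lines: the caller names an integration, the vault resolves the credential server-side, and the raw key never appears in the response. The provider names come from the post; the lookup logic and return values are purely illustrative:

```python
# Sketch of credential-proxying: keys live in the vault, callers only name a provider.
VAULT = {"resend": "re_secret_key", "twilio": "AC_secret_token"}  # encrypted at rest in reality

PROVIDERS = {
    "resend": lambda creds, payload: f"POST api.resend.com/emails (auth: {creds[:3]}***)",
    "twilio": lambda creds, payload: f"POST api.twilio.com/Messages (auth: {creds[:3]}***)",
}

def execute_integration(name, payload):
    creds = VAULT[name]              # resolved server-side, never returned to the client
    return PROVIDERS[name](creds, payload)

print(execute_integration("resend", {"to": "you@example.com"}))
```

The design choice worth noting: because the vault and the proxy live on the same side, revoking or rotating a key never requires touching the AI session.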
AAPI (Agent API) — Scheduled tasks. The AI creates cron jobs that run PHP handlers on a schedule. Daily content refresh, nightly cleanup, automated data sync.
CAPI (Coach API) — A conversational intake system. Ask four questions, generate a complete website. The AI handles the conversation; the platform handles the generation.
A real example
Here's what a Claude session looks like when building a site from scratch:
User: Build me a landing page for my consulting business. Focus on lead generation.
Claude: [calls get_skill to load WebsitePublisher context]
[calls create_page with full HTML/CSS]
[calls configure_form with name, email, message fields]
[calls setup_integration with Resend credentials]
[calls execute_integration to test email delivery]
Done — your page is live at yourproject.websitepublisher.ai.
The contact form sends leads to your inbox via Resend.
Want me to add a thank-you page or set up SMS notifications too?
No copy-paste. No deployment step. The AI did it.
The interesting engineering problems
A few things that weren't obvious until we built them:
Multi-session coordination. When multiple AI sessions work on the same project in parallel, they can overwrite each other's progress. We built TAPI — an append-only task tracking system — specifically to solve this. Each session logs progress via INSERTs only. MAX(completion_pct) from history records prevents any session from accidentally rolling back another's progress.
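The append-only idea can be shown with a few lines of SQLite; the schema here is illustrative, but the invariant is the one described above: sessions only INSERT, and effective progress is MAX(completion_pct) over the history, so a stale session can't drag a task backwards.

```python
import sqlite3

# Append-only progress log: INSERTs only, no UPDATEs.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE task_history (
    task_id TEXT, session_id TEXT, completion_pct INTEGER,
    logged_at TEXT DEFAULT CURRENT_TIMESTAMP)""")

def log_progress(task_id, session_id, pct):
    db.execute(
        "INSERT INTO task_history (task_id, session_id, completion_pct) VALUES (?, ?, ?)",
        (task_id, session_id, pct))

def effective_progress(task_id):
    # Progress is monotonic by construction: the max over all history rows.
    row = db.execute(
        "SELECT MAX(completion_pct) FROM task_history WHERE task_id = ?",
        (task_id,)).fetchone()
    return row[0] or 0

log_progress("homepage", "session-a", 60)
log_progress("homepage", "session-b", 40)  # a stale session reports lower progress
print(effective_progress("homepage"))      # still 60: no rollback possible
```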
Tool count limits. Our MCP server returns 55 tools in tools/list. Some clients have limits on how many they load. Our workaround: the get_skill tool loads a SKILL.md document that gives the AI a map of the full API — so even with five tools loaded, it can use the REST API directly for everything else.
Content quality detection. AIs occasionally send a file path instead of HTML content to create_page. We added a WarningCollector that catches this pattern and returns a structured warning before anything gets saved.
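The real WarningCollector's rules aren't public, but the path-vs-HTML check can be approximated with a simple heuristic: the payload matches a file-path shape and contains no markup. A sketch under those assumptions:

```python
import re

# Heuristic sketch: flag payloads that look like a file path rather than HTML,
# returning a structured warning instead of saving bad content.
def collect_warnings(content):
    warnings = []
    looks_like_path = re.fullmatch(r"[\w./\\-]+\.(html?|css|js|php)", content.strip())
    if looks_like_path and "<" not in content:
        warnings.append({
            "code": "content_looks_like_path",
            "message": "create_page received what looks like a file path, not HTML.",
        })
    return warnings

collect_warnings("/tmp/site/index.html")         # one structured warning
collect_warnings("<!DOCTYPE html><html></html>") # no warnings
```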
Authentication across API layers. Each layer needed a different auth model. Project tokens (wpa_) for AI access. Dashboard sessions (wps_) for humans. Admin tokens (wsa_) for visitor-facing login flows. Getting these to coexist cleanly took a few iterations.
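One reason prefixed tokens coexist cleanly is that a single middleware can dispatch on the prefix alone. The prefixes below are the ones named above; the routing logic itself is an illustrative sketch, not the production code:

```python
# Dispatch on token prefix: each scheme maps to a different auth path.
SCHEMES = {
    "wpa_": "project",    # AI access via project token
    "wps_": "dashboard",  # human dashboard session
    "wsa_": "visitor",    # visitor-facing login flow
}

def classify_token(token):
    for prefix, scheme in SCHEMES.items():
        if token.startswith(prefix):
            return scheme
    raise ValueError("unknown token scheme")

print(classify_token("wpa_abc123"))  # project
```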
What's next
We're currently in the Mistral sprint — working on SSE streaming for the conversational intake (so responses feel instant instead of batched), parallel concept generation, and getting listed in Mistral's connector directory.
If you're building with MCP, or thinking about what "AI-native" infrastructure actually means in practice — we'd love to hear what you think.
The MCP server is at mcp.websitepublisher.ai
Full docs at websitepublisher.ai/docs