I'm building WebsitePublisher.ai — a platform where AI assistants build and publish websites through MCP (Model Context Protocol) tools. This week: first paying customer, team collaboration, and some interesting architecture decisions.
The Stack
Quick context: Laravel/PHP backend, dual-server DigitalOcean cluster, Redis Sentinel, MySQL, S3 for assets. The AI layer is a multi-API stack:
PAPI — pages and assets
MAPI — entities and structured data
VAPI — vault/secrets management
IAPI — third-party integrations (Stripe, Resend, Twilio, etc.)
SAPI — sessions, forms, visitor auth
TAPI — task tracking across AI sessions
AAPI — scheduled automations
All exposed as MCP tools. Currently 56 tools, accessible from Claude, ChatGPT, Cursor, Windsurf, GitHub Copilot, Gemini, Grok, Mistral, and n8n.
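To make the "exposed as MCP tools" part concrete, here is a minimal sketch of the registry-and-dispatch pattern an MCP-style server uses. This is illustrative Python, not the real WebsitePublisher backend (which is PHP); the tool name `papi.create_page` and its parameters are assumptions:

```python
# Minimal sketch of MCP-style tool registration and dispatch.
# Tool names and schemas here are illustrative, not the real API surface.
from typing import Any, Callable

TOOLS: dict[str, Callable[..., Any]] = {}

def tool(name: str):
    """Register a function as a callable tool under the given name."""
    def decorator(fn):
        TOOLS[name] = fn
        return fn
    return decorator

@tool("papi.create_page")
def create_page(slug: str, title: str) -> dict:
    # A real PAPI handler would persist the page; here we just echo.
    return {"slug": slug, "title": title, "status": "created"}

def dispatch(name: str, **kwargs) -> Any:
    """Route an incoming tool call to its handler, as an MCP server would."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

Each of the seven APIs contributes its handlers to one registry like this, which is how 56 tools end up behind a single MCP endpoint.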
What shipped this week
Team Collaboration
Our first paying customer — an agency — asked for multi-user support on day one. We shipped it within 48 hours:
- Owner invites team members via email
- Team members get full access to all projects
- Magic link authentication (no passwords)
- Max 5 members per Agency account
Architecture decision: no per-project granularity in v1. Simpler model, ship fast, iterate based on real usage.
Dashboard Vault — Secrets Without AI Exposure
A security feature I'm particularly proud of: the Vault tab lets users manage API keys (Stripe, Resend, etc.) through the browser UI. The key insight: if you share an API key in an AI chat, it ends up in transcripts, logs, and context windows. The Vault bypasses AI entirely — write-once, never displayed, rotate or delete only.
Backend uses AES-256-GCM encryption keyed per project, so even if someone gains database access, secrets from other projects are unreadable.
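The "keyed per project" part is the interesting bit: one master key, many derived keys. Here is a sketch of that derivation step using an HKDF-style HMAC expansion (stdlib only; the actual AES-256-GCM encryption would sit on top of this via a crypto library, and the salt, label, and master-key handling are assumptions):

```python
# Sketch of per-project key derivation: each project's secrets get their own
# AES-256 key, so a leak of one project's rows doesn't expose another's.
# HKDF-style extract-then-expand via HMAC-SHA256; salt/label are illustrative.
import hashlib
import hmac

MASTER_KEY = bytes(32)  # placeholder; would live in KMS/env, never in the DB

def project_key(project_id: str) -> bytes:
    """Derive a distinct 32-byte AES-256 key for one project."""
    # Extract: mix the master key with a fixed salt.
    prk = hmac.new(b"vault-salt", MASTER_KEY, hashlib.sha256).digest()
    # Expand: bind the key to this project and its intended use.
    info = f"vault:aes-256-gcm:{project_id}".encode()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()
```

Derivation is deterministic, so the per-project keys never need to be stored; compromising the database alone yields only ciphertext.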
Language Refactor — From 4 Hardcoded Lists to Zero
Our AI Coach (a conversational website builder) needed proper i18n. The old code had 4 separate hardcoded language→string mappings scattered across the codebase. We refactored to a single CapiLanguage value object with a Redis → DB → Haiku → fallback chain:
- Check Redis cache
- Check papi_language_meta table (27 seeded languages)
- Ask Claude Haiku for language detection (costs ~$0.001)
- Fall back to English
Result: any valid ISO 639-1 code (roughly 180 two-letter codes) works automatically. Adding a new language = 1 database row.
get_asset MCP Tool — Closing the Read Gap
We had tools to list, upload, patch, and delete assets — but no tool to read them. AI agents were doing blind find/replace via patch_asset without knowing the current file state. get_asset closes that gap: text content returned directly, binary assets as base64.
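The return shape is the only subtle part: text comes back verbatim, binary comes back base64-encoded. A sketch of that branch (the content-type check and response fields are simplified assumptions, not the tool's exact schema):

```python
# Sketch of get_asset's dual return shape: text assets as-is, binary as base64.
# The content-type whitelist and response fields are simplified assumptions.
import base64

TEXT_TYPES = ("text/", "application/json", "application/javascript",
              "image/svg+xml")

def get_asset(path: str, data: bytes, content_type: str) -> dict:
    if content_type.startswith(TEXT_TYPES):  # startswith accepts a tuple
        return {"path": path, "encoding": "utf-8",
                "content": data.decode("utf-8")}
    return {"path": path, "encoding": "base64",
            "content": base64.b64encode(data).decode("ascii")}
```

With this in place an agent can read the current state, compute an exact patch, and stop guessing at find/replace targets.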
The Activation Challenge
67 signups, one paying customer. The gap is real. Our research this week showed that the friction is in the "how do I start?" moment — users sign up, see a dashboard, but don't know which AI platform to connect or how to begin.
Our answer: an embedded AI coach right inside the dashboard. Uses Sonnet (not Opus — cost control), generates one concept, writes directly to the user's first project. From "I just signed up" to "I have a website" in under 2 minutes.
What's next
Friday release agent (AAPI-powered automated changelog + test plans)
patch_asset optimistic concurrency (base version hash to prevent conflicts)
CAPI language refactor retest (waiting on external tester confirmation)
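For the optimistic-concurrency item above, the planned shape is simple: the agent sends back the hash of the version it read (via get_asset), and the patch is rejected if the asset changed underneath it. A sketch under those assumptions (function and field names are hypothetical, since this isn't shipped yet):

```python
# Sketch of a base-version-hash check for patch_asset: reject the write if
# the asset was modified since the caller last read it. Names are hypothetical.
import hashlib

def content_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def patch_asset(current: bytes, base_hash: str, new_content: bytes) -> bytes:
    """Apply a patch only if the caller saw the latest version of the asset."""
    if content_hash(current) != base_hash:
        raise ValueError("conflict: asset changed since base version was read")
    return new_content
```

On conflict the agent re-reads with get_asset, rebases its edit, and retries, which is exactly the loop the two tools together make possible.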
If you're building with MCP or interested in AI-native development tools, I'd love to connect. The MCP ecosystem is moving fast.
GitHub: megberts/websitepublisher-mcp
MCP Server: mcp.websitepublisher.ai