DEV Community

David Hamilton

I deleted the chat feature from my own AI product. Here's what I built instead.

I deleted the chat feature from my AI product.
The irony writes itself. A small Chrome extension whose entire purpose is bringing AI to your saved content. The first thing I cut was the AI-shaped feature.
It was the easiest call I've made on this build.
Three reasons. Each one obvious in hindsight.
API costs. A back-and-forth chat conversation eats £1-2 a month in tokens per active user. My Pro tier is £4 a month. The maths doesn't work the day you launch. It only gets worse with scale.
Product bloat. A chat box is the kind of thing you build because every other AI product has one. It looks like the AI feature. It isn't.
V2. If users actually want to chat with their bookmarks, they'll tell me. I'd rather build it then than guess now.
The product is called ContextBolt. It captures your bookmarks from X, Reddit, and LinkedIn. It auto-tags each one with Claude. And then it does something most products in 2026 still don't.
It exposes itself to Claude.
Pro users get a personal MCP endpoint. They paste one URL into Claude Desktop. From that moment on, Claude can search their entire bookmark library mid-conversation. You ask Claude something. Claude reads your saved content. The answer arrives with the context already in it.
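To make the flow concrete, here's a minimal sketch of what a "search my bookmarks" tool call looks like on the server side: Claude sends a JSON-RPC `tools/call` request to the personal endpoint, the server searches the library, and the matches come back as tool content. Every name here (`searchBookmarks`, the `Bookmark` shape, the tool name) is illustrative, not ContextBolt's actual API.

```typescript
interface Bookmark {
  url: string;
  title: string;
  tags: string[];
}

interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: { query: string } };
}

// Toy search: match the query against titles and tags.
// (The real product uses semantic search via Vectorize.)
function searchBookmarks(library: Bookmark[], query: string): Bookmark[] {
  const q = query.toLowerCase();
  return library.filter(
    (b) =>
      b.title.toLowerCase().includes(q) ||
      b.tags.some((t) => t.toLowerCase().includes(q))
  );
}

// Handle one tools/call request and shape the JSON-RPC response
// that Claude reads mid-conversation.
function handleToolCall(req: ToolCallRequest, library: Bookmark[]) {
  const hits = searchBookmarks(library, req.params.arguments.query);
  return {
    jsonrpc: "2.0" as const,
    id: req.id,
    result: {
      content: hits.map((b) => ({
        type: "text" as const,
        text: `${b.title} (${b.url})`,
      })),
    },
  };
}
```

The point is how little surface area there is: one endpoint, one tool, and the user's existing Claude becomes the interface.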
The dashboard I built first turned out to be the legacy interface. The MCP endpoint is the actual product.
That's the irony resolved. The chat box was never the AI part. The integration was the AI part. I just couldn't see it until I deleted what I thought it was.
Here's the part I find harder to explain.
I'm one person. I have a day job. I get about ten hours a week on this thing, mostly evenings and weekends. The amount of code in this product is not the amount of code one person could write in ten hours a week.
I didn't write most of it.
Claude did.
I use Claude Code. I describe what I want in plain English. Claude writes the service worker. Claude writes the Cloudflare Worker. Claude writes the React dashboard. Claude writes the SQL migrations. I read what comes back, push back where the taste is off, and ship it.
Two days ago Google approved the extension on the Chrome Web Store. I downloaded my own production build. Within an hour I'd found four bugs in real-world testing. The LinkedIn dashboard didn't refresh on save. The X popup showed a scraped count instead of a deduplicated one. The auto-scroll stalled at twenty bookmarks. The licence-key save flow failed silently.
I described each symptom out loud. Claude traced one fault to the wrong file entirely (a content script that was dead code, while the one actually loaded had no listener). Claude wrote each patch. We shipped five extension versions and two worker deploys that evening. I never opened a debugger.
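The deduplicated-count bug is a good example of the shape of these fixes. The scraper can hand back the same post more than once (re-scrolls, reposts), so the popup has to count unique bookmarks, not raw scraped items. A sketch of that fix, with hypothetical names rather than the extension's real ones:

```typescript
interface ScrapedPost {
  postId: string;
  text: string;
}

// Buggy version: the popup showed scraped.length, which counted
// the same post twice whenever the scroller revisited it.
// Fix: dedupe by postId before counting.
function dedupedCount(scraped: ScrapedPost[]): number {
  return new Set(scraped.map((p) => p.postId)).size;
}
```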
That is what working with Claude actually looks like in 2026. It is not autocomplete. It is a colleague who reads the whole codebase faster than you can blink, fixes the thing, and writes the commit message while it's at it.
The infrastructure is the other surprise.
The whole product runs on Cloudflare. Pages for the website. Workers for the API. D1 for the database. KV for the rate limits. Vectorize for the semantic search. There is no AWS bill. There is no Vercel bill. There is no managed database bill.
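The KV piece is simpler than it sounds. A fixed-window rate limiter on KV is one key per user per time window, with a TTL equal to the window length. Here's a runnable sketch where a `Map` stands in for the KV namespace; the limits and key scheme are my assumptions for illustration, not ContextBolt's actual values.

```typescript
const WINDOW_SECONDS = 60;
const MAX_REQUESTS = 30;

// Stand-in for a KV binding like env.RATE_KV, so this runs anywhere.
const store = new Map<string, number>();

// One key per user per window, e.g. "rl:user123:28901234".
function rateLimitKey(userId: string, nowMs: number): string {
  const window = Math.floor(nowMs / 1000 / WINDOW_SECONDS);
  return `rl:${userId}:${window}`;
}

// Returns true if the request is allowed, false if over the limit.
function allowRequest(userId: string, nowMs: number): boolean {
  const key = rateLimitKey(userId, nowMs);
  const count = store.get(key) ?? 0;
  if (count >= MAX_REQUESTS) return false;
  // In a real Worker: env.RATE_KV.put(key, String(count + 1),
  //   { expirationTtl: WINDOW_SECONDS }) so old windows expire themselves.
  store.set(key, count + 1);
  return true;
}
```

Because the key changes every window and KV entries expire on their own TTL, there's nothing to clean up and nothing to pay a database for.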
Including Claude API spend, the entire stack costs me under £5 a month at current usage. I keep waiting for that to be a lie. So far it isn't.
The marketing flywheel is the same recursive shape.
Thirty-three blog posts on the site. Sixty-six programmatic SEO pages. Every post drafted with Claude. Every meta description checked by Claude. Every internal link suggested by Claude. The site's scheduled tasks run with Claude reading my Search Console and writing the next thing to fix.
I review. I edit. I ship.
The asymmetric stack is the only reason a one-person, ten-hour-a-week build can keep up with funded teams. Claude does the parts that scale linearly with effort. Writing. Debugging. Testing. Drafting. I do the parts that don't. Taste. Decisions. Picking what to ship. Picking what to delete.
I keep coming back to that last one.
The chat box was never the AI part. It was the legacy UI we used while we figured out what the AI part actually was. The AI part is the integration. The AI part is your existing tools getting access to context they didn't have before.
In two years, most AI products will look like ContextBolt. A small piece of software that does one thing well and exposes itself to whatever model the user already pays for. The product team doesn't have to ship the chat box. The user already has one.
That's why deleting the chat was easy. Once you see it, you can't unsee it.
ContextBolt is live on the Chrome Web Store today. The site is at contextbolt.com. If you save more than you read, you'll feel it the first time Claude finds something for you that you'd already given up on.
I'd love to know what you'd connect it to.
