The itch
I was doom-scrolling at 1am, and iOS politely informed me I'd spent 4h 12min on Reddit that day. Instead of, you know, changing my life, I built an app that makes fun of me for it.
The Feed Autopsy takes your screen time screenshot, has an AI gossip columnist read you for filth, and then mints a little 3D shame trophy you can spin around in your browser. It's therapy, but worse.
Why a gossip columnist?
Most "digital wellness" apps are insufferable. They chirp things like "You used your phone 23% more this week! Let's set a goal! 🌱" and I want to throw my laptop into the sea.
Shame, delivered with theatrical flair, works better on me. So the prompt doesn't ask for advice — it asks for dirt. The model plays a columnist who has seen your TikTok time and is now whispering about it at brunch. The tone constraint ended up being the hardest part of the prompt engineering: "mean but fun" is a narrow band. Too soft and it's a Hallmark card. Too harsh and it's genuinely upsetting at 2am.
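To make the "mean but fun" band concrete, here's a sketch of what that kind of system prompt looks like. This is a reconstruction for illustration, not the production prompt, and the helper around it is invented:

```typescript
// Hypothetical sketch of the columnist persona -- the real prompt went
// through many iterations to land between Hallmark card and cruelty.
const COLUMNIST_SYSTEM_PROMPT = [
  "You are a gossip columnist reviewing someone's screen time report.",
  "Be catty and theatrical, like you're whispering about it at brunch.",
  "Never give advice. Never be encouraging. Dirt only.",
  "Stay playful: roast the habits, not the person's worth.",
].join("\n");

// Pick the worst offender and hand the columnist their material.
function buildRoastRequest(apps: { name: string; minutes: number }[]) {
  const worst = [...apps].sort((a, b) => b.minutes - a.minutes)[0];
  return {
    system: COLUMNIST_SYSTEM_PROMPT,
    user: `Worst offender: ${worst.name} at ${worst.minutes} minutes. Discuss.`,
  };
}
```

The "never give advice" line is doing most of the work: without it, every model drifts back toward wellness-coach mode.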
The vision pipeline
Screen time reports are images, and the formats vary wildly (iOS, Android, different locales, people cropping weirdly). Instead of OCR + regex hell, I just hand the image to a multimodal model and ask it to extract structured JSON:
```js
const res = await fetch("/api/autopsy", {
  method: "POST",
  body: formData, // the screenshot
});
const { apps, totalMinutes, roast, trophyPrompt } = await res.json();
```
The same call returns both the extracted data and the roast, which keeps latency down to one round trip. A second prompt generates a short trophyPrompt — a physical description of what the shame trophy should look like, based on the worst offender. ("A bronze thumb, calloused, mounted on a marble base engraved INSTAGRAM 2h 47m.")
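Model-extracted JSON does come back malformed sometimes, so it's worth checking the shape before trusting it. A minimal type guard for the response above might look like this (field names are from the snippet; the validation itself is an illustrative sketch, not the app's actual code):

```typescript
interface AutopsyResult {
  apps: { name: string; minutes: number }[];
  totalMinutes: number;
  roast: string;
  trophyPrompt: string;
}

// Narrow an unknown parsed response to AutopsyResult, or reject it.
function isAutopsyResult(data: unknown): data is AutopsyResult {
  if (typeof data !== "object" || data === null) return false;
  const d = data as Record<string, unknown>;
  return (
    Array.isArray(d.apps) &&
    d.apps.every(
      (a) =>
        typeof a === "object" &&
        a !== null &&
        typeof (a as { name?: unknown }).name === "string" &&
        typeof (a as { minutes?: unknown }).minutes === "number"
    ) &&
    typeof d.totalMinutes === "number" &&
    typeof d.roast === "string" &&
    typeof d.trophyPrompt === "string"
  );
}
```

If the guard fails, the sensible fallback is to re-prompt once rather than render a half-broken autopsy.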
The 3D trophy
The trophy renders with react-three-fiber. I'm not generating mesh geometry from scratch — that way lies madness. Instead I have a small library of base trophy shapes (pedestal, plaque, figure slot) and the AI's description drives materials, colors, engraved text, and which figure sits on top. It's basically a Mad Libs for 3D.
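The Mad Libs step is essentially a keyword lookup from the AI's description into the preset part library. A stripped-down sketch of that mapping (the part names and keywords here are invented for illustration; the real library is richer):

```typescript
// Hypothetical trophy config consumed by the react-three-fiber scene.
interface TrophyConfig {
  base: "pedestal" | "plaque";
  figure: "thumb" | "phone" | "eyeball";
  material: "bronze" | "marble" | "gold";
  engraving: string;
}

// Map a free-text trophy description onto preset parts.
function descriptionToConfig(desc: string, engraving: string): TrophyConfig {
  const d = desc.toLowerCase();
  return {
    base: d.includes("plaque") ? "plaque" : "pedestal",
    figure: d.includes("thumb")
      ? "thumb"
      : d.includes("eye")
        ? "eyeball"
        : "phone",
    material: d.includes("bronze")
      ? "bronze"
      : d.includes("gold")
        ? "gold"
        : "marble",
    engraving,
  };
}
```

The resulting config then just selects which meshes to mount and what material props to pass; the geometry itself never changes.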
You can drag to spin it. That's the whole feature. It is deeply unserious and I love it.
Why its own Railway service?
The main site is a Next.js app, but Feed Autopsy runs as an isolated service on Railway with its own subdomain. Two reasons:
- Blast radius. Image uploads + LLM calls + occasional 3D asset generation is a different risk profile than my blog. If someone figures out how to make it expensive, I want to kill that service, not the whole house.
- Cold start honesty. Each experiment gets its own container, its own logs, its own budget alarm. When I inevitably abandon it in six months, I can nuke it without touching anything else.
Each app in my little factory of weird projects is deployed this way. It's more infra than a monorepo on Vercel would be, but it's made me way more willing to ship dumb ideas, because dumb ideas can't hurt the smart ones.
Try it
Upload a screen time screenshot. Get roasted. Keep the trophy.
👉 feed-autopsy.edgecasefactory.com
No signup, no email capture, no "premium tier." Just vibes and mild psychological damage. Tell me what the columnist said about you — I want to know if it's landing or if I need to make it meaner.