How a Supernatural binge session led to a weekend project that solved my first-world problem
The Crime Scene
Picture this: Sunday evening. Supernatural is playing on the monitor. Girlfriend next to me. Both of us have achieved that perfect couch position; you know, the one where your spine curves in a way that's technically concerning but feels like heaven.
Then her phone rings.
Now, any sane person would just... let it play in the background. But no. She has to take the call. Which means I have to get up, walk three whole feet to the laptop, and pause the show.
Three feet. Do you know what happens when you leave the perfect couch spot? It's gone. Forever. You can try to recreate it, but it's never the same. The cushions have betrayed you.
Twenty minutes later, the call ends. I get up again. Play the show. Sit back down. Spend the next five minutes trying to find that position again. Fail.
And don't even get me started on volume. Every time Dean Winchester decides to have an emotional moment with a whisper followed by an explosion, one of us has to become the sacrifice.
There has to be a better way, I thought.
The Search for Solutions
My first instinct was to Google "control laptop from phone." Surely this is a solved problem, right?
KDE Connect: Installed it. Configured it. It worked! ...sometimes. When Mercury was in retrograde and I held my phone at exactly 47 degrees. The rest of the time, it just... didn't.
Other apps: Most of them wanted me to:
- Download an app on my phone
- Download software on my laptop
- Create an account
- Pair the devices
- Sacrifice a goat under the full moon
- Still not work consistently
I just wanted to pause a video, man.
The "Wait, We Have AI Now" Moment
I'm a DevOps guy. I break things for a living and then blame the yaml file. Writing full-stack apps from scratch isn't exactly my weekend hobby.
But then I remembered: it's 2026. We have AI now.
Specifically, I have access to AntiGravity (a code editor with AI superpowers) running Claude Opus. I'd been using it for work stuff, but never for a personal project.
So I opened it up and basically said: "Hey, I want to control my laptop's media and volume from my phone's browser. Make it happen."
And it... did?
What We Built
CouchControl is embarrassingly simple in concept:
- A Python server running on my laptop
- A web page I open on my phone
- They talk to each other over my home WiFi
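Under the hood, a setup like this boils down to mapping a handful of command names to actions. Here's a minimal sketch of that dispatch layer; the command names (`play_pause`, `set_volume`) and handlers are hypothetical, not the repo's actual API, and in the real server they'd call into the media-key and audio libraries behind a FastAPI WebSocket endpoint:

```python
# Sketch of a media-command dispatcher, assuming the phone sends small
# JSON frames like {"cmd": "play_pause"} over a WebSocket.
# Handlers are illustrative stubs; a real version would call into
# keyboard (media keys) and pycaw (volume) instead of returning strings.
import json

state = {"volume": 50}  # pretend system volume, 0-100

def play_pause() -> str:
    # Real version: simulate the play/pause media key
    return "toggled playback"

def set_volume(level: int) -> str:
    # Real version: hand a 0.0-1.0 scalar to the audio API
    state["volume"] = max(0, min(100, level))
    return f"volume set to {state['volume']}"

HANDLERS = {
    "play_pause": lambda msg: play_pause(),
    "set_volume": lambda msg: set_volume(int(msg.get("level", 50))),
}

def handle_message(raw: str) -> str:
    """Decode one WebSocket text frame and dispatch it."""
    msg = json.loads(raw)
    handler = HANDLERS.get(msg.get("cmd"))
    if handler is None:
        return f"unknown command: {msg.get('cmd')}"
    return handler(msg)
```

Each message from the phone becomes one function call; the WebSocket endpoint would just loop over incoming text frames and feed them into `handle_message`.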
That's it. No app downloads. No accounts. No cloud services selling my data to advertisers who think I need more Supernatural merchandise (I don't. I have enough.)
The features:
- Play/Pause: The reason this exists
- Volume slider: For when Dean whispers and then the Impala explodes
- Now Playing display: Shows what's currently playing (works with Spotify, VLC, browser, everything)
- Dark/Light theme: Because I'm civilized
The Part Where AI Did All the Work
I'll be honest: I barely wrote any code.
Here's roughly how the conversation went:
Me: I want a media remote that works over LAN.
Claude: generates FastAPI server, WebSocket handlers, Windows audio integration, mobile-first UI
Me: Can it show what's currently playing?
Claude: creates a whole module using Windows SDK APIs I didn't know existed
Me: Add a dark/light theme toggle.
Claude: done in 30 seconds
The entire project took maybe 4-5 hours spread across a weekend. And most of that was me testing it while watching more Supernatural.
The one actual bug I encountered? Browser caching. My phone was loading the old version of the files. Claude added cache-busting query strings (?v=2) to the CSS and JS includes, and that was that.
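Hardcoding `?v=2` works, but bumping the number by hand gets old. A common alternative, sketched here as a hypothetical helper (not what the repo does), is to derive the version from a hash of the file's contents, so the URL changes exactly when the file does:

```python
# Sketch: build a cache-busting URL from a static file's content hash.
# Hypothetical helper -- CouchControl itself just uses a manual ?v=2.
import hashlib
from pathlib import Path

def busted_url(path: str) -> str:
    """Return 'path?v=<short sha256 of contents>' for a static asset."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()[:8]
    return f"{path}?v={digest}"
```

A template would then render `<script src="...">` through this helper, and the browser re-fetches only when the file actually changes.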
My DevOps brain did kick in at one point: "Wait, is this secure?" The answer is "not really"; there's no authentication, and it binds to all network interfaces. But it's running on my home WiFi, so unless my neighbor wants to pause my shows, I'm fine.
The Tech Stack (For the Curious)
- Backend: Python, FastAPI, Uvicorn
- Media Control: pycaw (Windows audio API), keyboard (media key simulation)
- Now Playing: winsdk (Windows Runtime APIs)
- Frontend: Vanilla HTML, CSS, JavaScript, WebSocket
- AI Assistance: AntiGravity + Claude Opus
No React. No npm install taking 20 minutes. No node_modules folder that weighs more than the sun. Just simple tech that works.
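The volume slider needs one small translation step: the UI naturally works in 0-100, while pycaw's endpoint-volume interface (`SetMasterVolumeLevelScalar`) expects a float from 0.0 to 1.0. A sketch of that mapping, with the actual pycaw call left as a comment since it only runs on Windows:

```python
# Map a 0-100 slider value to the 0.0-1.0 scalar pycaw expects.
# The pycaw call itself is commented out so this sketch stays cross-platform.

def slider_to_scalar(value: int) -> float:
    """Clamp a slider value to 0-100 and scale to 0.0-1.0."""
    return max(0, min(100, value)) / 100.0

def apply_volume(value: int) -> float:
    scalar = slider_to_scalar(value)
    # On Windows, with pycaw set up:
    #   volume.SetMasterVolumeLevelScalar(scalar, None)
    return scalar
```

Clamping matters because the value arrives from a browser over the network; never trust a slider.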
The Result
Now, when we watch shows:
- Phone is always in hand anyway
- One tap to pause
- Slider to adjust volume
- Current track info visible without alt-tabbing
The couch position remains undisturbed. The relationship survives.
Try It Yourself
If you, too, suffer from the unbearable burden of getting up to control your laptop, here's the repo:
GitHub: github.com/TheOneOh1/Couch-Control
Setup takes about 2 minutes:
```
pip install -r server/requirements.txt
python -m uvicorn server.main:app --host 0.0.0.0 --port 8080
```
Then open http://<your-laptop-ip>:8080 on your phone.
Works with Windows 10/11. Probably works with any media app that responds to system media keys.
Final Thoughts
This was my first time vibe-coding with AI for a personal project. And honestly? It's changed how I think about side projects.
I had an idea. An itch to scratch. And instead of spending weeks learning new frameworks or giving up halfway, I just... had a conversation with an AI that turned my rambling into working code.
Is it overengineered for what it does? Maybe.
Did it solve my problem? Absolutely.
Will I use it every time I watch something? Already am.
Now if you'll excuse me, I have a couch to return to. Supernatural isn't going to watch itself.
Built with AntiGravity + Claude Opus. The keyboard was merely a formality.
Tags: python, fastapi, side-project, automation, self-hosted, ai-assisted-development, windows