It started on a random Sunday afternoon at our kitchen table. I was playing around with an M5Stack Core2, "vibe-coding" a simple pixel face that made a sound when you tapped the screen. I thought it was a fun little experiment—until I showed it to my 10-year-old son, Justus.
He looked at it, then at me, and said: "Dad, you can't leave it like that. It needs a life. It needs to eat, it needs to sleep, and it definitely needs to talk to other pets."
That was the spark. We spent the rest of the afternoon together, and by Sunday evening, thanks to an intense pairing session with Claude Opus 4.7, we had the first working prototype of what is now Pixel-Pets.
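For a sense of how small that starting point was, here is a minimal sketch of such a tap-to-beep face. It assumes the M5Unified library; the coordinates and tone are invented for illustration, and this is not the actual project code:

```cpp
#include <M5Unified.h>

void setup() {
  M5.begin();
  M5.Display.fillScreen(TFT_BLACK);
  // A bare-bones "pixel face": two eyes and a mouth.
  M5.Display.fillCircle(110, 90, 10, TFT_WHITE);
  M5.Display.fillCircle(210, 90, 10, TFT_WHITE);
  M5.Display.fillRect(120, 150, 80, 8, TFT_WHITE);
}

void loop() {
  M5.update();                              // refresh touch state
  if (M5.Touch.getDetail().wasPressed()) {  // screen tapped?
    M5.Speaker.tone(880, 120);              // beep: 880 Hz for 120 ms
  }
}
```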
The Team: Ideas vs. Execution
We established a professional workflow that felt like a high-speed dev shop:
Justus (Product Owner & Chief Designer): He provided the user stories and the creative vision. He designed the biological needs (hunger, sleep, mood) and the interactions.
Me (Technical Wingman & Prompter): With a background in software development, I've seen impressive AI-assisted builds before, but my role here was to manage the technical constraints and translate Justus's vision into prompts.
What Actually Surprised Me (As a Dev)
I’ve seen "vibe-coding" before, but what happened during this project reached a level of "AI competence" I honestly didn't expect.
Linux System Administration: The Local LLM module runs Linux under the hood. I didn't expect the AI to navigate that environment so fluently, making precise adjustments to running services and configurations, and getting it right on the very first try.
Vision to C++ Graphics: Analyzing a photo of a stuffed animal is one thing. But writing the specific C++ code to draw a matching, stylized graphic on the Core2's little display? I wouldn't have thought that was possible.
Stable Iterations: The most impressive part was the stability. After the AI "drew" the pet in code, we could iterate on it. We could say, "Make the ears smaller," or "Adjust the snout," and the AI would refactor the C++ drawing logic perfectly without breaking the rest of the UI.
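To make that concrete, here is a hypothetical, heavily simplified version of such a drawing routine (M5Unified again; all names and proportions are invented). Because the shape lives in a few named parameters, a request like "make the ears smaller" only touches one constant instead of the whole UI:

```cpp
#include <M5Unified.h>

// Invented parameters for illustration. "Make the ears smaller" means
// lowering EAR_RADIUS; "adjust the snout" means changing SNOUT_WIDTH.
constexpr int EAR_RADIUS  = 28;
constexpr int SNOUT_WIDTH = 60;

void drawPet(int cx, int cy) {
  auto &d = M5.Display;
  d.fillCircle(cx - 50, cy - 55, EAR_RADIUS, TFT_DARKGREY);    // left ear
  d.fillCircle(cx + 50, cy - 55, EAR_RADIUS, TFT_DARKGREY);    // right ear
  d.fillCircle(cx, cy, 70, TFT_LIGHTGREY);                     // head
  d.fillEllipse(cx, cy + 25, SNOUT_WIDTH / 2, 20, TFT_WHITE);  // snout
  d.fillCircle(cx - 25, cy - 15, 6, TFT_BLACK);                // eyes
  d.fillCircle(cx + 25, cy - 15, 6, TFT_BLACK);
  d.fillCircle(cx, cy + 15, 8, TFT_BLACK);                     // nose
}

void setup() {
  M5.begin();
  M5.Display.fillScreen(TFT_BLACK);
  drawPet(M5.Display.width() / 2, M5.Display.height() / 2);
}

void loop() {}
```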
The Technical Ecosystem
What started as a Sunday afternoon whim evolved into a sophisticated open-source family:
Muffin (CoreS3 + LLM): Our high-end pet with on-device Local LLM capabilities. No cloud, 100% private.
Goo-Goo (Core2) & Visu (CoreS3): Standalone variants with rich interactions.
Pip (M5StickC PLUS2): A companion device that uses ESP-NOW to wirelessly "throw" treats to the bigger pets (see the first sketch after this list).
World-Awareness: The pets use IP geolocation to sync with real-world weather and moon phases (see the second sketch below).
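The ESP-NOW "treat throw" boils down to a connectionless send. This is the sender side only; the message layout and MAC address are placeholders, not the real Pixel-Pets protocol:

```cpp
#include <WiFi.h>
#include <esp_now.h>

// Placeholder payload; the real message format lives in the repo.
struct TreatMsg {
  uint8_t treatType;  // e.g. 0 = cookie, 1 = apple
};

// Placeholder MAC address of the receiving pet.
uint8_t petMac[6] = {0x24, 0x6F, 0x28, 0xAA, 0xBB, 0xCC};

void setup() {
  WiFi.mode(WIFI_STA);                 // ESP-NOW needs station mode
  if (esp_now_init() != ESP_OK) return;

  esp_now_peer_info_t peer = {};
  memcpy(peer.peer_addr, petMac, 6);
  peer.channel = 0;                    // use the current Wi-Fi channel
  peer.encrypt = false;
  esp_now_add_peer(&peer);
}

void loop() {
  TreatMsg msg{0};
  // "Throw" a treat: fire-and-forget, no router or pairing required.
  esp_now_send(petMac, reinterpret_cast<uint8_t *>(&msg), sizeof(msg));
  delay(5000);
}
```

That fire-and-forget quality is exactly why ESP-NOW fits here: two battery-powered toys can talk to each other without any network setup at all.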
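And here is a sketch of the world-awareness idea. The post doesn't name the services the project actually uses, so ip-api.com and Open-Meteo serve as stand-ins (ArduinoJson 7 assumed; Wi-Fi credentials are placeholders):

```cpp
#include <WiFi.h>
#include <HTTPClient.h>
#include <ArduinoJson.h>

// Look up approximate coordinates from the device's public IP.
bool locate(double &lat, double &lon) {
  HTTPClient http;
  http.begin("http://ip-api.com/json/");
  if (http.GET() != 200) { http.end(); return false; }
  JsonDocument doc;
  if (deserializeJson(doc, http.getString())) { http.end(); return false; }
  lat = doc["lat"];
  lon = doc["lon"];
  http.end();
  return true;
}

void setup() {
  Serial.begin(115200);
  WiFi.begin("your-ssid", "your-password");
  while (WiFi.status() != WL_CONNECTED) delay(250);

  double lat, lon;
  if (locate(lat, lon)) {
    // Feed the coordinates into a weather API (e.g. Open-Meteo) and a
    // local moon-phase formula to drive the pet's mood and backdrop.
    Serial.printf("lat=%.4f lon=%.4f\n", lat, lon);
  }
}

void loop() {}
```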
The Shift in Modern Making
This journey showed us that the definition of a "Maker" is changing. In the past, the barrier to entry was years of "suffering" through syntax. Today, what counts is vision, logic, and the ability to guide an AI mentor.
For Justus, the AI didn't take away the work; it removed the friction. It allowed him to focus on user experience and the joy of creating. We aren't replacing developers; we are empowering a whole new generation to start shipping before they even know what a semicolon is.
Open Source & Launch
Pixel-Pets is 100% Open Source. We want to show that the distance between a "what if" on a Sunday afternoon and a shipping product has never been shorter.
Justus is monitoring the GitHub stars today like a seasoned pro. If you believe that the future of making is about vision—not just syntax—we’d love your support!
Find us here: