The first post in this series was about why Omnismith exists. This one is about what happened after I launched it.
The Parts Nobody Sees on the Commit Graph
I've been a software engineer for a long time. What I had never done was build a complete commercial product from zero — alone — with real money and real legal obligations on the line.
Some of what that actually involved:
Two full architectural iterations. The first codebase went in a direction I wasn't satisfied with. I scrapped it and started over with a modular monolith on my own framework, DDD, and hexagonal/use-case architecture. It was the right call — the backend genuinely became enjoyable to work in, not something to fight against.
Opening a legal entity as an immigrant. I'm not an EU citizen. I opened a UK LTD — something I'd never done in my life, in a country that isn't mine, in a second language. It's surprisingly straightforward once you start, but it took mental energy I hadn't budgeted for.
Pricing based on actual economics. Not "what sounds reasonable" but walking through unit cost, comparable tools, where to draw the free-tier line, and why flat pricing over per-seat billing fits the developer audience I'm building for. That analysis took longer than I expected and I'm glad I didn't rush it.
A full DevOps pipeline, built and maintained by me. GitHub CI/CD, my own Kubernetes cluster, ArgoCD for GitOps deployments. I'm the infrastructure team. I'm also the backend. I'm also the frontend. I'm also the support desk.
Nobody helped me build any of this. I'm not complaining — it's genuinely something I'm proud of, even though bragging isn't really my thing. I'm writing it down because I think solo builders often undersell how much ground they're covering.
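For anyone curious what the GitOps half of that pipeline looks like, it mostly boils down to one ArgoCD Application manifest per service: CI pushes an image and bumps the manifest repo, ArgoCD notices and syncs the cluster. A minimal sketch (repo URL, paths, and names are invented for illustration, not my actual setup):

```yaml
# Hypothetical ArgoCD Application: watch a deploy repo, keep the cluster in sync.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: omnismith-backend        # illustrative name
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/omnismith-deploy  # placeholder repo
    targetRevision: main
    path: backend                # directory holding the k8s manifests
  destination:
    server: https://kubernetes.default.svc
    namespace: omnismith
  syncPolicy:
    automated:
      prune: true                # delete resources removed from git
      selfHeal: true             # revert manual drift back to git state
```

With `automated` sync, a merged PR in the deploy repo is the deployment; there's no separate "push to prod" step to forget.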
The First Registrations
After the soft launch, a small group of closed beta testers — people I actually know and trust — tried the early version and gave honest feedback. They caught small things I'd stopped seeing: a missing translation here, a confusing label there. They pointed at the exact moments where they got lost. They told me the onboarding wasn't landing.
That feedback was more valuable than months of me staring at my own code. I'm genuinely grateful for it.
And then real users — people I don't know — started showing up. Each one felt way more significant than it probably should have. After years as a hired engineer building things for other people's products, seeing a real stranger's account appear in my system landed differently. It still does.
The Blank Canvas Problem
Here's what the beta feedback actually taught me.
Omnismith's v1 onboarding worked like this: register → step through a tutorial → land in an empty project: no templates, no attributes, no data.
I thought that was fine. The tutorial explained the concepts. The user had a clean slate to start.
What actually happened: users finished the tutorial and then stared at an empty screen, having memorized a sequence of UI steps but with no immediate sense of what to actually do with them. The mental model wasn't formed yet. And blank canvases are paralyzing — they're for people who already know what they want to paint.
v2 is different. Now when you register, you don't get an empty project. You get a demo project pre-populated with a real dataset: a product catalog.
It has 3 templates — Category, Brand, and Product — with 11 attributes across them. Some attributes are shared across templates (Name, Description). The Product template uses References to link each product to its Category and Brand, and even has a self-referencing "Related Product" link. There are 19 sample entities: 4 categories (Electronics, Furniture, Apparel, Books), 5 brands, and 10 products with real prices, stock status, release dates, and tags.
It's not a lot. But it's enough to instantly understand the shape of the platform. There's something to click on. There's data to edit. The templates tell you "this is how structure works." The entities tell you "this is what your data looks like."
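To make the shape concrete, the demo dataset could be sketched as plain records. This is a trimmed illustration with my own type and field names, not Omnismith's actual schema or API:

```typescript
// Hypothetical shapes for a template-driven catalog; all names are illustrative.
type Attribute = { name: string; kind: "text" | "number" | "date" | "reference" };

type Template = { name: string; attributes: Attribute[] };

type Entity = {
  template: string;
  // Reference values hold the Name of the target entity.
  values: Record<string, string | number>;
};

// Three templates sharing Name/Description, with Product referencing the others.
const templates: Template[] = [
  { name: "Category", attributes: [{ name: "Name", kind: "text" }, { name: "Description", kind: "text" }] },
  { name: "Brand",    attributes: [{ name: "Name", kind: "text" }, { name: "Description", kind: "text" }] },
  {
    name: "Product",
    attributes: [
      { name: "Name", kind: "text" },
      { name: "Price", kind: "number" },
      { name: "Category", kind: "reference" },
      { name: "Brand", kind: "reference" },
      { name: "Related Product", kind: "reference" }, // self-referencing link
    ],
  },
];

// A tiny slice of the sample entities.
const entities: Entity[] = [
  { template: "Category", values: { Name: "Electronics" } },
  { template: "Brand",    values: { Name: "Acme" } },
  { template: "Product",  values: { Name: "Acme Phone", Price: 499, Category: "Electronics", Brand: "Acme" } },
];

// Resolve a reference attribute on an entity to the entity it points at.
function resolve(e: Entity, attr: string): Entity | undefined {
  return entities.find((x) => x.values["Name"] === e.values[attr]);
}

console.log(resolve(entities[2], "Category")?.values["Name"]); // "Electronics"
```

The point of seeding data like this isn't the catalog itself; it's that every concept (template, attribute, reference) has a concrete instance the user can click on and edit.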
On top of that, I added an interactive tour — something in the spirit of Intro.js — that walks you through the actual UI with contextual pointers. Not a wall of text in a help doc, but a live guide that highlights the element you're looking at right now. That combination — real data you can touch plus a tour that shows you where to look — landed much better than v1's tutorial-then-empty-canvas approach.
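Such a tour is, at heart, an ordered list of (selector, message) pairs that the guide highlights in sequence; a library like Intro.js consumes exactly this kind of list. A minimal data-only sketch, with selectors and copy invented for illustration:

```typescript
// Each step points the guide at one element with contextual copy.
type TourStep = { selector: string; intro: string };

// Hypothetical steps; real ones would target the app's actual elements.
const tourSteps: TourStep[] = [
  { selector: "#template-list",   intro: "Templates define the structure of your data." },
  { selector: "#entity-table",    intro: "Entities are the records filling that structure." },
  { selector: "#attribute-panel", intro: "Attributes are the fields each template carries." },
];

// The runner walks the steps in order; in the browser, `show` would
// highlight the element — here it is injected so the flow is testable.
function runTour(steps: TourStep[], show: (s: TourStep) => void): number {
  for (const step of steps) show(step);
  return steps.length;
}

const shown: string[] = [];
runTour(tourSteps, (s) => shown.push(s.selector));
console.log(shown); // ["#template-list", "#entity-table", "#attribute-panel"]
```

Keeping the steps as data means the tour copy can be edited (or translated) without touching the guide logic.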
My beta testers stopped asking "what do I do now?" Which was the only metric that mattered.
The Lesson That Should Have Been Obvious
The blank canvas problem is not new. Every tool with a new user flow has solved or failed to solve it. But when it's your tool, your mental model of it is so complete that you forget how much context you're carrying that your user doesn't have yet.
You see a blank project and feel possibilities. Your user sees nothing.
Show them a story they can edit. Don't make them write the opening chapter.
That was probably the most valuable lesson of my early founder days. Not a technical insight. Not an architecture decision. A UX realization delivered by real people being kind enough to tell me where they got stuck instead of just quietly leaving.
If you're building a tool and your new user flow ends with "and now they have a blank canvas" — reconsider that.
👉 omnismith.io — log in and see the demo project yourself
👉 app.omnismith.io — free tier, no credit card
👉 docs.omnismith.io — now actually worth reading
Part of the "I'm Building Omnismith in Public" series. Previous: I Built This Because I Was Tired of Building Admin Panels.

