Cross-posted from the Unitix Flow Blog
We rebuilt our release dashboard three times before we got it right.
Version 1: The Feature List
"Done" / "In Progress" / "Not Started" for each task. Looked great in demos. Nobody updated it. Always out of date by Thursday.
The problem: it was a manual status board. No integration with the actual code, branches, or pipelines. Team members had to remember to change the status, and they didn't.
Version 2: The GitLab Mirror
Pulled real-time data from GitLab. Branches, pipelines, merge requests — all automated. No manual updates needed.
Massive improvement. But it still missed two critical things:
- QA visibility — CI passes tests, but manual QA sign-off lived in Slack
- Multi-repo coordination — each repo showed up independently with no unified release view
Version 3: The 3-Question Dashboard
Built around one principle: answer 3 questions without clicking anything.
The 3 Questions
1. What's shipping?
Tasks + branches linked to actual tracker issues. Not "branches that were merged recently" — specifically the branches that belong to this release.
2. Is it tested?
QA matrix with pass/fail per test case. Not just "CI is green" — manual QA results embedded directly in the release view.
3. What's blocking?
Unmerged branches, failing pipelines, incomplete tests. The blockers are surfaced automatically, not discovered during a standup.
The 3 Metrics That Actually Get Looked At
After trying 20+ metrics, we narrowed it down to the three that teams consistently check:
Release Completion %
Not based on Jira ticket status (which is often wrong), but on branches merged + tests passing. A task isn't "done" until its branch is merged and its tests pass.
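In code, the rule is a one-liner. A sketch, assuming each task carries `branch_merged` and `tests_passing` flags (an illustrative shape, not a real API):

```python
def release_completion(tasks):
    """Percent of tasks whose branch is merged AND whose tests pass.

    `tasks` is a list of dicts with boolean `branch_merged` and
    `tests_passing` keys -- hypothetical field names for illustration.
    """
    if not tasks:
        return 0.0
    done = sum(1 for t in tasks if t["branch_merged"] and t["tests_passing"])
    return 100.0 * done / len(tasks)
```

A task with a merged branch but failing tests counts as zero, which is exactly the disagreement with ticket status that makes this number trustworthy.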
QA Coverage
How many test cases executed vs. assigned, and the pass rate. This tells you whether testing is keeping up with development or falling behind.
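Both numbers fall out of the same test-case list. A sketch with an assumed dict shape (`executed`/`passed` booleans):

```python
def qa_coverage(test_cases):
    """Executed vs. assigned, plus pass rate of the executed cases."""
    assigned = len(test_cases)
    executed = [t for t in test_cases if t["executed"]]
    passed = [t for t in executed if t["passed"]]
    return {
        "coverage_pct": 100.0 * len(executed) / assigned if assigned else 0.0,
        "pass_rate_pct": 100.0 * len(passed) / len(executed) if executed else 0.0,
    }
```

Low coverage with a high pass rate means QA is behind; high coverage with a low pass rate means the build isn't ready. The two numbers together tell you which problem you have.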
Time in Stage
How long has the release been in its current stage (development, QA, ready)? If it's been in QA for 3x the average, something is stuck and needs attention.
What Most Dashboards Get Wrong
Too many charts. 25 metrics means nobody looks at any of them. A dashboard with 25 charts is a reporting tool, not an operational tool.
Informative but not actionable. Showing that a branch isn't merged is informative. Showing a "Merge" button next to it is actionable. The best dashboards let you act, not just observe.
Optimized for reporting, not doing. Built for last quarter's review, not today's release. If the dashboard is only useful in retrospectives, it won't be used day-to-day.
Separate from the workflow. Another tab means another tool that gets ignored. The dashboard needs to be where the team already works.
Design Decisions That Worked
Real-time WebSocket updates — no refresh, no "let me check." When a branch is merged or a test passes, the dashboard updates instantly.
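The server side of this is a fan-out: one event comes in from GitLab or QA, one JSON message goes out to every open dashboard. A self-contained sketch — in production the `send` callables would be actual WebSocket connections (e.g. the send method of an ASGI WebSocket handler), but the fan-out logic is the same:

```python
import json
from datetime import datetime, timezone

class DashboardBroadcaster:
    """Fan release events out to every connected dashboard client."""

    def __init__(self):
        self._clients = []

    def subscribe(self, send):
        # `send` is any callable taking a text frame; for a real
        # dashboard this would be a WebSocket connection's send.
        self._clients.append(send)

    def publish(self, event_type, payload):
        message = json.dumps({
            "type": event_type,    # e.g. "branch_merged", "test_passed"
            "payload": payload,
            "ts": datetime.now(timezone.utc).isoformat(),
        })
        for send in self._clients:
            send(message)          # every open dashboard updates at once
```

Push beats poll here: the moment a merge webhook fires, every viewer sees it, with no refresh button and no stale cache.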
QA results embedded IN the release view — not a separate section you have to navigate to. Testing status is part of the release, not adjacent to it.
Activity timeline — a chronological feed of everything that happened in the release. Who merged what, when tests ran, when QA signed off.
Inline branch actions — merge from the dashboard without opening GitLab. Reduces context switching and keeps the team in one tool.
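Under the hood this is one call to GitLab's REST API: accepting a merge request is a `PUT` to the documented `/merge` endpoint. A stdlib-only sketch — the instance URL is a placeholder, and error handling is omitted:

```python
import urllib.request

GITLAB_API = "https://gitlab.example.com/api/v4"  # placeholder instance URL

def merge_request_url(project_id, mr_iid):
    """GitLab v4 endpoint for accepting a merge request."""
    return f"{GITLAB_API}/projects/{project_id}/merge_requests/{mr_iid}/merge"

def merge_from_dashboard(project_id, mr_iid, token):
    """Accept a merge request without leaving the dashboard.

    Authenticates with a personal/project access token via the
    PRIVATE-TOKEN header, as GitLab's API documents.
    """
    req = urllib.request.Request(
        merge_request_url(project_id, mr_iid),
        method="PUT",
        headers={"PRIVATE-TOKEN": token},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200
```

The dashboard's "Merge" button is just this call wired to the branch row, which is what turns an informative row into an actionable one.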
What We'd Do Differently
Start with Version 3's principles from day one. The feature list (Version 1) felt like a natural starting point but was a dead end. If you're building a release dashboard, start with the questions it needs to answer and work backward from there.
We built these lessons into Unitix Flow — a release dashboard that connects your branches, pipelines, and QA results in one real-time view.