
toshihiro shishido

Posted on • Originally published at revenuescope.jp

Why I cut my dashboard from 12 metrics to 5 — and decisions got faster

"We built a sales dashboard, but looking at it doesn't tell us what to do next." This is the most common complaint I hear from ecommerce operators. I've felt it myself — staring at a 12-metric dashboard, ranking the movements in my head, and only then starting to think about action. By the time I had a hypothesis, the meeting was over.

The fix wasn't a fancier visualization. It was deletion. I cut the dashboard down to 5 metrics, and decisions started fitting into one minute.

Here's the design I landed on, and the reasoning behind each cut.

TL;DR

  1. A revenue dashboard needs only 5 KPIs: Revenue / CVR / AOV / RPS / ROAS. They fall out of the revenue formula Revenue = Sessions × CVR × AOV. Anything outside that formula needs an explicit justification to earn a spot on the screen.
  2. Information density doesn't speed up decisions — it slows them. GoodData's BI usage research shows the median time-to-decision for dashboards is around 42 seconds. Past that, users default to "I'll look later" — which means never.
  3. Channel-level RPS is the action axis. ROAS mixes "quality of traffic" and "volume of traffic." RPS (Revenue per Session) lets you compare channels apples-to-apples and decide where to push budget next.

Why "More Metrics" Slows Decisions

Tableau's official dashboard design guide is explicit on this: a dashboard is a tool for action, not observation. If all you need is to look up "what was last month's CVR," a report works fine. A dashboard exists so that the next move emerges in seconds, not minutes.

Base layout — 5 KPIs + channel RPS

When I had 12 metrics on screen, the cognitive flow was:

  1. Scan all 12 numbers
  2. Mentally rank the largest movements
  3. Try to reason about why the largest movers moved
  4. Then decide on action

Steps 1–3 were eating five minutes per review session. Step 4 — the only step that mattered — never got the airtime it needed. By cutting to 5 KPIs, steps 1–3 collapse into one glance, and step 4 actually becomes the meeting.

The 5 KPIs — Derived from the Revenue Formula

The set falls out of decomposing revenue:

Revenue = Sessions × CVR × AOV

That gives you three factors. Add RPS (CVR × AOV in one number, for traffic-quality comparison) and ROAS (the efficiency check on paid spend), and you have five.
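The decomposition above can be checked with a few lines of arithmetic. A minimal sketch, with illustrative numbers (the session, order, revenue, and spend figures are made up for the example):

```python
# Worked example: decompose revenue into its three factors, then
# derive the two composite KPIs. All numbers are illustrative.
sessions = 40_000
orders = 800
revenue = 64_000.0   # dollars
ad_spend = 16_000.0  # dollars

cvr = orders / sessions    # conversion rate: fraction of visitors who bought
aov = revenue / orders     # average order value: spend per buyer
rps = revenue / sessions   # revenue per session: CVR x AOV in one number
roas = revenue / ad_spend  # return on ad spend

# Sanity checks: the revenue formula holds by construction,
# and RPS really is CVR x AOV compressed into one metric.
assert abs(revenue - sessions * cvr * aov) < 1e-6
assert abs(rps - cvr * aov) < 1e-9
```

The two assertions are the point: RPS is not a sixth independent metric, it is the product of two of the factors, which is why it can stand in for both when comparing traffic quality.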

5 KPI cards in left-to-right reading order

What each one is for:

  • Revenue — the outcome. The thermometer of the business.
  • CVR — "of the visitors who showed up, what fraction bought?"
  • AOV — "of those who bought, how much did they spend?"
  • RPS — CVR × AOV compressed into one number. The efficiency of the site, independent of how loud you were marketing.
  • ROAS — "for every dollar of ad spend, how much revenue came back?" The sanity check on paid acquisition.

The test for any new metric trying to earn a spot on the dashboard: "if this metric moves by 0.1, will next week's actions change?" Bounce rate, time-on-page, scroll depth — all useful in their place, but they fail this test on a strategic dashboard. They go on a separate diagnostic tab, not on the main view.

Channel-Level RPS — The Action Axis

This is the second pillar that took me a while to internalize, and it's the one that produced the largest jump in decision speed.

The default move for budget allocation across paid channels is to compare ROAS. But ROAS = Revenue ÷ Spend. The "Revenue" in there bundles together traffic quality and traffic volume. A channel that drives lots of cheap, low-intent visitors can show the same ROAS as a channel that drives a few expensive, high-intent visitors. ROAS doesn't separate them.

RPS — Revenue ÷ Sessions — does. It's the per-visitor revenue, which is directly comparable across channels because the denominator is the same kind of unit (a session) for all of them.

In practice: place an RPS-by-channel table at the bottom of the dashboard. If Google Ads RPS is $2.50 and Meta Ads RPS is $0.80, the next budget decision is no longer a 30-minute debate. Meta is bringing visitors who don't convert at the same value — push the next dollar to Google Ads, dig into why Meta visitors aren't converting, and revisit in two weeks.
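The channel table is simple enough to sketch directly. This is a minimal illustration, not a real integration; the channel names and figures are hypothetical, chosen to reproduce the $2.50 vs $0.80 example above:

```python
# Channel-level RPS: revenue per session, comparable across channels
# because the denominator (a session) is the same unit everywhere.
# Data is illustrative.
channels = {
    "Google Ads": {"revenue": 25_000.0, "sessions": 10_000},
    "Meta Ads":   {"revenue": 12_000.0, "sessions": 15_000},
}

def rps_by_channel(data):
    """Return {channel: revenue / sessions} for each channel."""
    return {name: d["revenue"] / d["sessions"] for name, d in data.items()}

# Sort descending so the next dollar's destination is the first row.
for name, rps in sorted(rps_by_channel(channels).items(),
                        key=lambda kv: kv[1], reverse=True):
    print(f"{name:<12} RPS ${rps:.2f}")
```

Sorting by RPS rather than ROAS is the design choice: the top row is the channel whose marginal visitor is worth the most, which is the budget question, not the efficiency-audit question.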

The 30-minute weekly debate became a 2-minute conversation once we built this view. ROAS still has a place — it's the bottom-line check after RPS-driven allocation — but it stops being the first thing the team looks at.

The 4 Most Common Dashboard Pitfalls

Designing the right dashboard is half the work. Avoiding the wrong design is the other half. The four most common pitfalls I see:

4 common pitfalls and their fixes

  1. Too many metrics. 20–30 KPIs and 10 chart types crammed onto one screen. Fix: 5 KPIs on the main view, helpers on a separate tab.
  2. GA4 metrics as-is. Bounce rate and dwell time as primary KPIs — both have weak revenue correlation. Fix: demote GA4-style metrics to a helper tab. Main view stays RPS-centric.
  3. Last-click attribution bias. Direct and Organic Search end up over-credited because they're the last touch before a converting visit, not the channel that drove acquisition. Fix: show last-click and data-driven side by side.
  4. Decoration over clarity. 3D pies, ornate gauges, animated transitions. They hurt readability without adding signal. Fix: limit visual elements to four types — number cards, bars, lines, and tables. Anything fancier needs a justification.

Layout Templates by Business Model

The 5 KPIs stay constant. The arrangement changes by business model:

Templates by business model — same 5 KPIs, different layout

  • D2C / own-EC: Revenue / CVR / AOV / RPS / ROAS in the top KPI cards, last-12-week revenue + RPS in the time series, RPS / ROAS by channel in the bottom table. Bonus: show ROAS by both last-click and data-driven side by side.
  • Subscription: re-prioritize CVR / AOV / RPS / ROAS / Revenue, swap the time series for new signups + churn rate, and add an LTV column to the channel table. Cohort views become essential.
  • One-off / seasonal: lead with sessions volatility — the time series becomes Revenue + Sessions with seasonal context, and YoY comparison should always be visible (last-month-only views are misleading on seasonal categories).

What This Doesn't Mean

To pre-empt the obvious counterargument: I'm not saying "throw away your 12-metric dashboard." I'm saying that the primary, daily-use dashboard should be 5 KPIs. The other 7+ metrics belong on diagnostic tabs that you reach for when the primary dashboard surfaces a problem. They're not deleted; they're demoted to "lookup" status.

The mental shift is from "the dashboard contains what I need to know" to "the dashboard triggers the next investigation, and the diagnostic tabs answer it."

Once that distinction is internalized, the design choices that follow — what to show, what to hide, what to demote — become almost mechanical.


What's on your team's primary dashboard? Curious whether anyone has hit the same "12 metrics, no decisions" ceiling and what cuts you ended up making.
