Alex Koval
I replaced 140 custom analytics events with 5 lines of code. Here's what I learned

Last year I was consulting for a B2B SaaS product. They had 140 custom Mixpanel events. One hundred and forty.
Nobody knew what half of them tracked. The engineer who set most of them up had left. The product manager was building dashboards for two weeks before making any decisions. And after all that work? The dashboards told them what happened. Not why. Not what to do next.
That experience broke something in my brain. I couldn't stop thinking about it.
So I quit my job and built the thing I wished existed.
The problem nobody talks about
Every analytics tool on the market works the same way:

You define events manually
You send them to a dashboard
You stare at charts
You try to figure out what they mean
You build another dashboard
Repeat

This is insane. We're in 2026. AI can generate photorealistic videos and write code. But product analytics still works like it's 2015.
The dirty secret of Mixpanel, Amplitude, and every other analytics tool: they show you data, not decisions.
You still need a data analyst. You still need weeks to get answers. You still need engineering time to instrument events. And by the time you act on an insight, your users have already churned.
What 4,000 events per week taught me
I'm building Xora Analytics. Instead of asking teams to define 140 events, we ask them to care about 5 core metrics. The AI does the rest.
Here's the architecture in plain English:

You install a lightweight JS snippet (~2 min)
It captures user behavior automatically
AI analyzes patterns and finds where users drop off
You get recommendations, not charts

Not "your retention dropped 12% at week 2." Instead: "Users who skip the workspace setup step churn 3x more. Here's a suggested onboarding change with predicted impact."
That's the difference. One gives you homework. The other gives you answers.
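To make that concrete, here's what a recommendation could look like as data. This is purely illustrative; the field names are my own sketch, not a documented Xora payload:

```javascript
// Illustrative only: these field names are a sketch, not a documented Xora payload.
const recommendation = {
  finding: 'Users who skip the workspace setup step churn 3x more',
  segment: { skippedStep: 'workspace_setup' },
  suggestedChange: 'Make workspace setup a guided, required part of onboarding',
  predictedImpact: { metric: 'week_2_retention', direction: 'up' },
};
```

The point of the shape is that it names a segment, a change, and an expected outcome, rather than handing you a chart to interpret.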
The 5 patterns that actually matter
After analyzing user behavior across our beta products, I keep seeing the same 5 churn patterns. I wrote about them in detail on Medium, but here's the quick version:

  1. The 24-Hour Ghost: If a user doesn't complete a meaningful action within 24 hours, they're 3x more likely to churn in the first month. Your time-to-first-value needs to be under 10 minutes.
  2. The Feature Desert: Single-feature users have 34% retention at 90 days. Users with 3+ features: 89%. If someone only uses one thing, they will leave.
  3. The Week 2 Cliff: Days 8-14 are where retention dies. Week 1 is exploration. Week 2 needs a habit trigger. If you don't create one, they're gone.
  4. The Champion Exit: When the one person who loves your product changes roles or leaves, the whole account goes silent. Multi-user adoption is your insurance.
  5. The Quiet Downgrade: Usage drops 60-70% over 3 months but they stay subscribed. They're not loyal. They just forgot to cancel. A 50%+ drop in weekly usage over 4 weeks predicts cancellation within 60 days.

These aren't theoretical. These are real patterns from real products. And none of them require 140 custom events to detect.

The technical bit

For the devs who want to know how the integration looks:

```javascript
// That's it. That's the setup.
import { Xora } from '@xora/sdk';

Xora.init({
  apiKey: 'your-key',
  autocapture: true
});

Xora.identify(userId, { plan: 'pro', role: 'admin' });
```
Five lines. No event taxonomy to design. No engineering sprint to instrument. No 40-page tracking plan.
The AI figures out what matters based on actual user behavior, not your assumptions about what they should be doing.
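As an aside: the Quiet Downgrade pattern from earlier is simple enough to express as a plain heuristic, which is part of why autocaptured usage counts are sufficient. Here's a sketch (my own illustration, not the Xora implementation): compare the last 4 weeks of usage against the 4 weeks before that, and flag the account if the drop is 50% or more.

```javascript
// Sketch of a "Quiet Downgrade" detector (illustrative, not the Xora implementation).
// weeklyUsage: array of weekly event counts, oldest first.
function isQuietDowngrade(weeklyUsage, dropThreshold = 0.5) {
  if (weeklyUsage.length < 8) return false; // need two 4-week windows to compare
  const sum = (arr) => arr.reduce((a, b) => a + b, 0);
  const prior = sum(weeklyUsage.slice(-8, -4));  // weeks 5-8 ago
  const recent = sum(weeklyUsage.slice(-4));     // last 4 weeks
  if (prior === 0) return false; // already inactive, nothing to downgrade from
  return (prior - recent) / prior >= dropThreshold;
}
```

For example, an account that went from ~100 events/week to ~40 events/week gets flagged; one hovering around 90 does not.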
What I'd do differently if starting analytics from scratch
If you're an early-stage SaaS and you're about to set up analytics, here's my honest advice regardless of what tool you use:
Don't start with events. Start with questions.
Write down the 3 questions you need answered this month:

Where do users drop off in onboarding?
Which feature correlates with retention?
What's the activation threshold?

Then instrument ONLY what answers those questions. Everything else is noise.
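Concretely, "instrument only what answers the questions" means three events, not 140. Here's a sketch using a generic `track()` stand-in (not any specific SDK's API; the event names are examples):

```javascript
// Generic stand-in for any analytics SDK's track call; it records
// events locally here so the sketch is self-contained.
const events = [];
function track(event, props) {
  events.push({ event, props });
}

// Q1: Where do users drop off in onboarding?
track('onboarding_step_completed', { step: 'workspace_setup' });

// Q2: Which feature correlates with retention?
track('feature_used', { feature: 'weekly_report' });

// Q3: What's the activation threshold?
track('meaningful_action', { action: 'first_report_created' });
```

Three events, three questions. Anything that doesn't map to a question you're asking this month doesn't get tracked.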
Don't build dashboards nobody checks.
If your team doesn't look at a dashboard weekly, delete it. Dashboard proliferation is a disease. I've seen companies with 200+ dashboards and zero data-driven decisions.
Don't separate analytics from action.
Knowing that "Step 3 has a 42% drop-off" is useless if it takes 2 weeks to get engineering to fix it. The insight and the action need to be close together.
Where I am now
Xora is in early beta. We have a handful of products sending us around 4,000 events per week. It's not a lot. We're early.
But the feedback is real. One team replaced their entire Mixpanel setup with our 5-minute integration and got their first actionable recommendation within 24 hours. No dashboards built. No data analyst hired. No engineering time wasted.
The free tier handles up to 500K events/month: analytics.xora.es
If you're building a SaaS product and drowning in analytics tools that give you charts but not answers — I'd love to hear about your setup. What's working? What's broken? Drop a comment or hit me up.

I'm Alex, founder of Xora Analytics. I write about product analytics, churn patterns, and building SaaS products. Follow me here on DEV or connect on LinkedIn if you want to nerd out about this stuff.
