
// case study · montreal e-commerce electronics

ROAS > 10× Hiding in the Data — While the Dashboard Said “Cut It”

Actual ROAS: >10× after proper measurement
Prior approach: linear attribution on observed revenue only
New approach: cohort LTV projected via survival analysis

The Situation

A Montreal-based e-commerce electronics retailer came to us convinced they needed to kill their best-performing paid channel. Costs were climbing month after month with no additional spend on their side, and the dashboard they had was saying the same thing every morning: this channel is losing money.

The verdict looked straightforward. The channel with the highest spend was also the one showing the lowest apparent return. Standard advice: cut it, reallocate, move on. Except the business itself wasn't collapsing. Revenue held steady. Customers weren't disappearing. Something in the numbers didn't reconcile with what the business was actually doing.

What We Built

Rather than rework the dashboard, we rebuilt the measurement underneath it. The existing setup used linear attribution on observed revenue only, which guarantees recent acquisition spend looks unprofitable — most of the lifetime value those customers will generate hasn't happened yet. Extending the window backward wouldn't have fixed it; it would have added more historical spend to the comparison without unlocking any of the future revenue that spend was actually purchasing.
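To make the truncation concrete, here is a minimal sketch with hypothetical numbers (none of these figures come from the engagement): a cohort whose acquisition spend is fully booked up front, but whose revenue stream has only two months on the board.

```python
# Illustrative only: why ROAS computed on observed revenue alone
# penalises recently acquired cohorts. All numbers are made up.

def roas(revenue: float, spend: float) -> float:
    return revenue / spend

# A cohort acquired 2 months ago: spend is already fully counted,
# but only 2 of an expected ~24 months of revenue have been observed.
spend = 50_000.0
monthly_revenue = 9_000.0          # flat per-month cohort revenue, for simplicity
observed_months = 2
expected_lifetime_months = 24      # what a survival model would estimate

observed = roas(monthly_revenue * observed_months, spend)
lifetime = roas(monthly_revenue * expected_lifetime_months, spend)

print(f"observed-revenue ROAS:    {observed:.2f}x")   # 0.36x: looks like a loss
print(f"expected-lifetime ROAS:   {lifetime:.2f}x")   # 4.32x: actually profitable
```

Same spend, same customers; the only difference is whether the denominator's counterpart, revenue, is cut off at today or projected over the cohort's expected life.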

  • Cohort-based revenue curves — each acquisition cohort tracked as a unit over time, not blended into aggregate monthly totals where newer customers always drag down the average
  • Kaplan-Meier survival curves for retention — so cohort revenue could be projected forward past what had already been observed, producing an expected-lifetime number rather than just a revenue-so-far number
  • Channel-level ROAS against expected lifetime revenue — with confidence intervals, so decisions could be made against expected values and risk ranges instead of a single point estimate on a truncated horizon
  • Separation of acquisition economics from retention — so comparisons across channels weren't distorted by cohort-age mix, which was quietly penalising any channel that had been ramping up
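The projection step above can be sketched end to end. Everything here is hypothetical — the data, the helper names, the cohort sizes — and the real model is not shown; but the mechanics of a Kaplan-Meier estimate over customer tenure, used to extend a cohort's revenue past what has been observed, look like this:

```python
# Sketch (hypothetical data and helper names): Kaplan-Meier survival over
# customer tenure, then expected-lifetime cohort revenue and channel ROAS.

def kaplan_meier(durations, churned):
    """Return the survival curve S(t) for t = 1..max(durations).
    durations[i]: months customer i has been observed.
    churned[i]:   True if they churned at that month,
                  False if still active (right-censored)."""
    horizon = max(durations)
    surv, s = [], 1.0
    for t in range(1, horizon + 1):
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, c in zip(durations, churned) if d == t and c)
        if at_risk > 0:
            s *= 1.0 - deaths / at_risk     # KM product-limit step
        surv.append(s)
    return surv

def expected_lifetime_revenue(n_customers, monthly_rev_per_active, surv, months):
    """Project cohort revenue over `months` using survival probabilities.
    The last observed S(t) is carried flat as a simple (optimistic) tail;
    a real model would fit a parametric tail instead."""
    total = 0.0
    for t in range(months):
        s = surv[t] if t < len(surv) else surv[-1]
        total += n_customers * s * monthly_rev_per_active
    return total

# Hypothetical cohort: 200 customers, a mix of churned and still-active
# (censored) observations, repeated for scale.
durations = [1, 2, 2, 3, 3, 3, 4, 4, 5, 6] * 20
churned = [True, True, False, True, False, False,
           False, True, False, False] * 20

surv = kaplan_meier(durations, churned)
ltv = expected_lifetime_revenue(200, 45.0, surv, months=24)
spend = 50_000.0
print(f"expected-lifetime ROAS: {ltv / spend:.2f}x")
```

The key property is the handling of censoring: a customer who is still active is evidence of survival up to their current tenure, not a churn event — which is exactly why revenue-so-far numbers understate young cohorts. Confidence intervals (e.g. Greenwood's formula on the survival curve, or bootstrapping cohorts) would be layered on top of this point estimate.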

The Result

Once each cohort's expected total revenue was projected forward via the survival model, the picture inverted. The “expensive” channel everyone wanted to cut was acquiring customers whose expected lifetime value was 10–20× their acquisition cost. Actual return on ad spend: above 10×.

The channel stayed on. The company kept acquiring customers through it. More importantly, the business now had a measurement system that reflected how their customers actually behaved — not one that treated every recent acquisition as a sunk cost.

The old dashboard was right about what it was measuring. What it was measuring wasn't what the business needed to know. Extending the window backward would only have made the near term look worse; the fix was to project the future forward, cohort by cohort. This is what happens when BI gets treated as a templating exercise: the numbers look confident, and the decision gets made on a false premise.

Not a visualization problem. A measurement problem.

Wondering what your dashboard might be hiding? Tell us what you're trying to measure →