← home

// case study

100M-Row Data Warehouse in 4 Weeks

  • Project cost: $12,500 (vs. $200k+ quoted)
  • Delivery: 4 weeks (vs. 18 months quoted)
  • Annual savings: $24,000 (unexpected benefit)
  • SR&ED ITC: $1,225 (35% of $3,500 eligible)

// timeline comparison

[Bar chart: big consulting firm quote, 18 months vs. iaminter.net delivery, 4 weeks]

The Brief

A mid-market company's reporting infrastructure had outgrown its original architecture. Their PostgreSQL transactional database had grown to over 100 million rows — a scale it was never designed for. Analytical queries took 15–20 minutes. The finance team ran reports at 6 AM to avoid impacting production. Marketing had no access to data at all.

They contacted a large consulting firm. The quote came back: $200,000+ and 18 months. They called us instead. We scoped the project in a day and started the following week.

What Was Delivered

  • 11 cross-department dashboards in production: Finance, Marketing, Operations, each with its own lens on the same underlying data
  • 3 mathematical models, including a Bayesian predictive pricing model, deployed as production software, not a spreadsheet
  • 100M-row BigQuery warehouse with automated ETL pipelines refreshing every 5 minutes from PostgreSQL, HubSpot, Google Analytics, Shopify, and internal APIs
  • Marketing analytics platform that replaced a $2,000/month external consultant: $24,000 in unexpected annual savings

The Architecture

A three-layer deployment built on standard, open technologies:

  • Layer 1 — Ingestion. Automated pipelines from PostgreSQL (change-data-capture), HubSpot, Google Analytics, Shopify, and internal APIs. 5-minute refresh cycle. No manual exports, no waiting on IT.
  • Layer 2 — Warehouse. Google BigQuery as the analytical engine. Denormalized fact tables, dimensional models, pre-computed aggregates. 100M+ rows, sub-3-second query times on analytical workloads.
  • Layer 3 — Presentation. Purpose-built dashboards per department. Role-based access. Unlimited users at zero marginal cost. Everything written in SQL and TypeScript — readable and extensible by any developer.

The entire stack deploys to the client's cloud account. Infrastructure cost: ~$150/month in BigQuery and compute. No per-seat fees. No vendor licenses.
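The Layer 1 refresh cycle described above can be sketched in a few lines of TypeScript. This is an illustrative sketch, not the client's production code: the names (`incrementalQuery`, `applyBatch`, `SourceRow`) are invented here, and the in-memory `Map` stands in for the BigQuery MERGE upsert. The core pattern is real, though: track a high-water mark on the source's `updated_at` column, pull only rows changed since the last cycle, and upsert them by key.

```typescript
// Illustrative sketch of watermark-based incremental ingestion
// (the pattern behind the 5-minute refresh cycle).

interface SourceRow {
  id: number;
  updatedAt: string; // ISO-8601 timestamp from the source system
  [field: string]: unknown;
}

// Build the extraction query for one sync cycle: only rows changed
// since the last high-water mark, oldest first.
function incrementalQuery(table: string, watermark: string): string {
  return `SELECT * FROM ${table} WHERE updated_at > '${watermark}' ORDER BY updated_at`;
}

// Upsert a batch of changed rows into a keyed store (MERGE semantics:
// new ids insert, existing ids overwrite) and advance the watermark.
function applyBatch(
  store: Map<number, SourceRow>,
  batch: SourceRow[],
  watermark: string
): string {
  let next = watermark;
  for (const row of batch) {
    store.set(row.id, row); // insert or overwrite by primary key
    if (row.updatedAt > next) next = row.updatedAt;
  }
  return next; // new high-water mark for the next cycle
}
```

In production the watermark would be persisted between runs and the upsert would be a BigQuery `MERGE` statement; the point of the pattern is that each 5-minute cycle moves only the delta, which is why a 100M-row table stays cheap to keep fresh.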

The Financial Picture

The $12,500 build cost is capitalizable as an intangible asset under IFRS (IAS 38) and ASPE (Section 3064) — it goes on the balance sheet and amortizes over 3–7 years rather than hitting the income statement in full. Compare that to the consulting firm's $200k+ quote, which would have been a pure operating expense with the same outcome.

A portion of the modeling work qualified for SR&ED tax credits. The client received a $1,225 refundable federal ITC (35% of $3,500 in qualifying expenditures). Combined with the $24,000 in annual savings from replacing the external marketing consultant, the effective payback period was under 7 months.
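The payback arithmetic follows directly from the figures above, and it is worth making explicit: the ITC reduces the net build cost, and the consultant savings amortize it monthly.

```typescript
// Payback arithmetic using the figures from this case study.
const buildCost = 12_500;       // project cost
const sredItc = 0.35 * 3_500;   // refundable federal ITC: $1,225
const annualSavings = 24_000;   // replaced external marketing consultant

const netCost = buildCost - sredItc;                  // $11,275
const paybackMonths = netCost / (annualSavings / 12); // ≈ 5.6 months
```

Even ignoring the ITC entirely, $12,500 against $2,000/month in savings pays back in just over 6 months, so "under 7 months" holds either way.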

Timeline

Week 1

Data audit, pipeline setup, BigQuery warehouse live with automated ingestion from all source systems.

Week 2

Warehouse modeling: dimensional model, fact tables, transformation logic. Business-critical reports running.

Week 3

Dashboard development: department-specific views with drill-downs, iterative stakeholder feedback.

Week 4

QA, cross-validation against known financials, end-user training, pipeline monitoring deployed. Production.

Want a result like this? Tell us what you need →