Case studies
Engagements, anonymized.
How the decision framework plays out in specific environments. Each case study walks through the decision sequence: the constraints the framework filtered against, the architecture the engagement landed on, the trade-offs accepted, and the outcomes after deployment. Identifying detail is anonymized; the architecture math is real.
Outcome summaries are public. The full case-study narratives — decision sequence, architecture diagrams, cost math, post-deployment metrics — are shared with prospective clients during scoping.
-
Mid-size financial services
From $4M Splunk to a federated lakehouse, with the math.
An eight-TB/day regional bank whose Splunk per-GB licensing was unsustainable. The framework drove a hot/cold split: a ClickHouse hot tier, an Iceberg cold tier on S3, a Polaris catalog, and Trino for cross-source query. Implementation took seven months. Year-one savings were ~$2.6M against the licensing baseline, with secondary savings from compute-tier rightsizing.
- ~$2.6M Y1 savings
- 8 TB/day
- 7-month implementation
Request the full case study →
-
MSSP serving regulated customers
Multi-tenant lakehouse with Unity Catalog tenant isolation.
A twelve-customer MSSP across mixed regulated industries. Existing stack: Splunk plus a brittle Kafka mesh. The framework's F0 isolation question classified the environment as a multi-tenant MSSP, making Unity Catalog mandatory for tenant-boundary enforcement. Outcome: a 5× improvement in detection latency, capacity headroom for 30+ tenants, and a nine-month implementation.
- 5× detection-latency win
- 12 → 30+ tenant headroom
- ~35% per-tenant cost reduction
Request the full case study →
-
Regional healthcare provider
Foundation engagement, eight months from a HIPAA audit.
A four-TB/day Splunk environment with no migration appetite and a HIPAA audit eight months out. The foundation engagement ran as a standalone deliverable: per-source health checks plus a cross-tool gap analysis. Twenty-three silent data-quality issues surfaced, including a six-month EDR event-loss regression. The audit closed with no findings on the data-quality track, and the cross-tool gap analysis was called out as exemplary documentation.
- 23 silent data-quality issues surfaced
- HIPAA audit cleared with no findings
- No platform migration
Request the full case study →
-
600-employee SaaS
DetectFlow on Splunk, no platform migration.
A healthy 1.8-TB/day Splunk install with a detection program carrying three years of maintenance debt. 420 rules were consolidated to 365, with DetectFlow practices applied in Splunk-native form. The engagement ran six weeks; a fifty-rule backlog cleared in four months post-closure, the false-positive rate dropped 40%, and coverage improved on previously weak ATT&CK techniques. The platform decision didn't move, and didn't need to.
- 40% false-positive reduction
- 420 → 365 rules
- 6-week engagement
Request the full case study →
The full four-phase decision framework these case studies apply lives in the matrix client materials; the component-criteria page backs the architecture choices each study landed on.