The fraud detection platform existed. The problem was upstream.
Each business unit (cards, payments, retail, digital) ran its own Kafka clusters with inconsistent event schemas and incomplete data fields. Fraud models received dirty inputs, false-positive rates were elevated, and onboarding a new event source took weeks of bespoke engineering. The bank needed a data layer it could trust and scale.
Kablamo designed a configuration-driven data streaming and transformation layer that aggregated banking events across domains and delivered clean, normalized events into the fraud engine.
The solution includes a canonical fraud-ready event schema, Avro-based contract enforcement, dead-letter topics, idempotent processing, and immutable audit logging, with Temporal-orchestrated workflows ensuring reliability under load. It was built for Fraud team independence and efficiency: onboarding a new event source becomes a configuration task, not an engineering project.
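To make the per-event flow concrete, here is a minimal Python sketch of the pipeline pattern described above: contract enforcement, dead-letter routing, idempotent deduplication, and normalization into a canonical shape. The field names, `REQUIRED_FIELDS`, and the in-memory lists are hypothetical stand-ins for the bank's actual Avro contracts, Kafka topics, and durable state store — not the production implementation.

```python
# Illustrative pipeline sketch. REQUIRED_FIELDS stands in for an Avro
# contract; the lists/set stand in for Kafka topics and a dedup store.
REQUIRED_FIELDS = {"event_id", "event_type", "amount", "timestamp"}  # assumed contract

dead_letter_topic: list[dict] = []   # stand-in for a Kafka dead-letter topic
seen_event_ids: set[str] = set()     # stand-in for a durable dedup store
scored_events: list[dict] = []       # stand-in for the fraud engine's input topic


def process(event: dict) -> None:
    """Validate, dedupe, normalize, and forward a single banking event."""
    # Contract enforcement: malformed events never reach the fraud models.
    if not REQUIRED_FIELDS <= event.keys():
        dead_letter_topic.append(event)
        return
    # Idempotent processing: replays and duplicates are dropped by event id.
    if event["event_id"] in seen_event_ids:
        return
    seen_event_ids.add(event["event_id"])
    # Normalization into a canonical, fraud-ready shape.
    scored_events.append({
        "event_id": event["event_id"],
        "event_type": event["event_type"].lower(),
        "amount_cents": int(round(float(event["amount"]) * 100)),
        "timestamp": event["timestamp"],
    })


for e in [
    {"event_id": "a1", "event_type": "CardAuth", "amount": "12.50", "timestamp": 1},
    {"event_id": "a1", "event_type": "CardAuth", "amount": "12.50", "timestamp": 1},  # duplicate
    {"event_id": "b2", "event_type": "Transfer"},  # missing fields -> dead letter
]:
    process(e)
```

The key property is that every malformed or duplicate event is handled deterministically before scoring, so a replayed Kafka partition cannot double-score a transaction.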
The bank can now confidently tackle millions in fraud risk: the platform scores 50,000+ transaction events per second in real time, with sub-second fraud-scoring latency.
With accuracy dramatically improved by cleaner, enriched event data, fraud analysts and data scientists now spend less time correcting malformed inputs and more time refining detection strategies. The Fraud team can also expand its capabilities quickly and independently, onboarding new fraud event sources through low-code configuration instead of waiting weeks on engineering.
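A hedged sketch of what "onboarding as configuration" can look like: a declarative mapping drives normalization, so adding a source means adding a config entry rather than writing new code. The config shape, source name `retail_pos`, and field names here are hypothetical illustrations, not the bank's actual configuration format.

```python
# Hypothetical per-source configuration: each entry declares where a source's
# events come from and how their fields map onto the canonical schema.
SOURCE_CONFIGS = {
    "retail_pos": {  # hypothetical new event source
        "topic": "retail.pos.v1",
        "field_map": {"txn_id": "event_id", "value": "amount", "ts": "timestamp"},
        "defaults": {"event_type": "pos_purchase"},
    },
}


def normalize(source: str, raw: dict) -> dict:
    """Map a raw source event into the canonical schema using config alone."""
    cfg = SOURCE_CONFIGS[source]
    canonical = dict(cfg["defaults"])          # start from declared defaults
    for src_field, dst_field in cfg["field_map"].items():
        canonical[dst_field] = raw[src_field]  # rename fields per the mapping
    return canonical


event = normalize("retail_pos", {"txn_id": "r-9", "value": 42.0, "ts": 1700000000})
```

Because `normalize` is driven entirely by data, a new source needs no code review or deployment cycle — only a validated config change.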


