A major insurance operator had no established data pipeline to feed its AML platform covering Suspicious Activity Monitoring, Watch-List Filtering, and Customer Due Diligence. CALIGO built the full pipeline from scratch, unifying customers across four source systems and running 1,900 daily data quality checks.
A major insurance operator running across multiple business lines had no established data pipeline to feed its Anti-Money Laundering (AML) platform, which covers Suspicious Activity Monitoring, Watch-List Filtering, and Customer Due Diligence. The company operated four distinct source systems serving different product lines and branches. The data the AML application required was scattered across these systems, with no unified customer view, no historical record of entity states, and no mechanism to identify and transmit only what had changed between processing cycles.
CALIGO designed and delivered the end-to-end AML data pipeline, covering all 45 entities in the target data model, including customer, policy, transaction, relationship, and dimensional tables, in a daily batch architecture built on Oracle Exadata, PL/SQL, and Informatica PowerCenter. Customer unification was a core component: cross-system identity matching on national ID, tax ID, and passport number produced a single record per real-world customer across all four source systems. A full historical snapshot mechanism was introduced, with a change-data layer that compares each daily batch against the prior snapshot and produces incremental feeds flagged as inserts, updates, or deletions. A dedicated data quality framework embeds 1,900 individually authored PL/SQL queries that run at the end of each batch and distribute findings to stakeholders daily.
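The cross-system identity matching described above can be sketched as a transitive grouping problem: any two source records sharing a strong identifier (national ID, tax ID, or passport number) belong to the same real-world customer, and chains of shared identifiers merge into one golden record. The following is a minimal Python sketch of that logic (the production implementation was PL/SQL on Exadata; all record shapes and field names here are illustrative):

```python
from collections import defaultdict

def unify_customers(records):
    """Group source-system records that share any strong identifier
    into one customer. Union-find over record indices; each
    identifier value acts as a match key."""
    parent = list(range(len(records)))

    def find(i):
        # Find the group root, compressing the path as we go.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    # Index records by each non-null identifier value.
    by_key = defaultdict(list)
    for idx, rec in enumerate(records):
        for field in ("national_id", "tax_id", "passport_no"):
            if rec.get(field):
                by_key[(field, rec[field])].append(idx)

    # Records sharing an identifier are merged into one group.
    for idxs in by_key.values():
        for other in idxs[1:]:
            union(idxs[0], other)

    groups = defaultdict(list)
    for idx in range(len(records)):
        groups[find(idx)].append(records[idx])
    return list(groups.values())

# Toy records from hypothetical source systems:
records = [
    {"source": "life",   "cust_id": "L-1", "national_id": "12345"},
    {"source": "motor",  "cust_id": "M-7", "national_id": "12345", "passport_no": "P99"},
    {"source": "health", "cust_id": "H-3", "passport_no": "P99"},
    {"source": "motor",  "cust_id": "M-9", "national_id": "67890"},
]
golden = unify_customers(records)  # first three chain into one customer
```

Note that matching is transitive: the life and health records share no identifier directly, yet both match the motor record, so all three collapse into a single customer view.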
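The change-data layer works by comparing today's full snapshot against yesterday's and transmitting only the delta, flagged by operation type. A minimal Python sketch of that comparison, assuming dict rows keyed by a hypothetical `cust_id` column:

```python
def diff_snapshots(prev, curr, key="cust_id"):
    """Compare the current snapshot to the prior one and emit an
    incremental feed with each row flagged INSERT, UPDATE, or DELETE."""
    prev_by_key = {row[key]: row for row in prev}
    curr_by_key = {row[key]: row for row in curr}

    feed = []
    for k, row in curr_by_key.items():
        if k not in prev_by_key:
            feed.append({"op": "INSERT", **row})
        elif row != prev_by_key[k]:
            feed.append({"op": "UPDATE", **row})
        # Unchanged rows are not transmitted.
    for k, row in prev_by_key.items():
        if k not in curr_by_key:
            feed.append({"op": "DELETE", **row})
    return feed

yesterday = [
    {"cust_id": "C1", "status": "active"},
    {"cust_id": "C2", "status": "active"},
]
today = [
    {"cust_id": "C1", "status": "dormant"},  # changed -> UPDATE
    {"cust_id": "C3", "status": "active"},   # new     -> INSERT
]                                            # C2 gone -> DELETE
feed = diff_snapshots(yesterday, today)
```

Keeping the full prior snapshot, rather than only the delta, is what makes retroactive investigation possible: any historical state can be reconstructed, while downstream consumers still receive only what changed.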
The organisation went from having no stable AML data pipeline to operating a fully functional daily batch architecture covering 100+ source tables across 45 target entities, with all active customers, policies, and transactions unified, cleansed, and delivered to the AML platform. Customer unification resolved fragmented multi-system identities into a single coherent view. The historical snapshot and change-data layer gave the compliance team full retroactive investigation capability. The 1,900-query data quality framework provided daily, systematic visibility into completeness, consistency, and conformity, replacing ad hoc data investigation with a structured, scalable process.
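The post-batch data quality framework amounts to a runner that executes each authored query and treats any returned rows as a finding to distribute. A miniature Python sketch of that pattern, using an in-memory SQLite database as a stand-in for Exadata (check names, tables, and queries are all hypothetical; the real framework runs 1,900 PL/SQL queries):

```python
import sqlite3

# Each check is a named SQL query that returns offending rows;
# a non-empty result set is a finding.
checks = [
    ("CUST_001_missing_national_id",
     "SELECT cust_id FROM customer WHERE national_id IS NULL"),
    ("POL_014_orphan_policy",
     "SELECT policy_id FROM policy p "
     "WHERE NOT EXISTS (SELECT 1 FROM customer c WHERE c.cust_id = p.cust_id)"),
]

def run_checks(conn, checks):
    """Run every check after the batch; collect non-empty results."""
    findings = {}
    for name, sql in checks:
        rows = conn.execute(sql).fetchall()
        if rows:
            findings[name] = rows  # these would be reported to stakeholders
    return findings

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (cust_id TEXT, national_id TEXT);
    CREATE TABLE policy   (policy_id TEXT, cust_id TEXT);
    INSERT INTO customer VALUES ('C1', '12345'), ('C2', NULL);
    INSERT INTO policy   VALUES ('P1', 'C1'), ('P2', 'C9');
""")
findings = run_checks(conn, checks)
```

Authoring each check as a standalone query is what makes the framework scale to 1,900 rules: adding a new completeness or conformity rule means adding one query, not changing the runner.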