In May 2024, the Monetary Authority of Singapore (MAS) issued a landmark information paper on data governance and management practices, setting detailed supervisory expectations for banks and finance companies based on thematic inspections of domestic systemically important banks (D-SIBs). That guidance reinforced the Basel Committee’s Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS 239), called for board-level accountability, and exposed persistent gaps in data lineage, data quality controls, and critical data element (CDE) governance across Singapore’s financial sector.
Since then, MAS has continued to raise its expectations. In November 2025, it released a consultation paper on Guidelines for Artificial Intelligence Risk Management, explicitly naming data governance — data quality, lineage, and provenance — as a foundational AI risk domain. Separately, revised Outsourcing Guidelines for both banks and non-bank financial institutions took effect in December 2024, tightening expectations around third-party data management, cloud services, and supply chain risk.
This blog walks through the full arc of MAS’s evolving requirements — what they ask for, where institutions commonly fall short, and what a practical compliance response looks like.
MAS’s 2024 guidance requires board oversight, defined governance roles, robust data quality controls, and audit-ready CDE management aligned with BCBS 239.
MAS’s November 2025 AI risk management guidelines make data governance — quality, lineage, provenance, and bias controls — a standalone AI risk domain with formal supervisory expectations.
Revised MAS Outsourcing Guidelines (effective December 2024) require stronger due diligence, outsourcing registers, and security controls covering cloud and third-party data flows.
Most institutions still manage critical data elements manually, in spreadsheets and documents — creating the inconsistency, audit gaps, and scalability failures MAS’s inspections identified.
Automation is the only viable path to sustainable CDE governance at regulatory scale.
MAS’s May 2024 information paper was grounded in direct supervisory observation. During thematic inspections of D-SIBs in 2022 and 2023, MAS identified four recurring failure patterns: inconsistent and incomplete risk data, particularly in the aggregation used for regulatory reporting; weak board oversight, with boards receiving insufficiently granular updates to exercise genuine accountability; siloed data systems that undermined cross-enterprise risk aggregation in direct tension with BCBS 239; and generic data quality thresholds that failed to reflect the varying criticality of individual data elements.
MAS’s response was to codify four supervisory expectations: boards must actively oversee data governance and receive structured, element-level reporting; quality controls must be automated and calibrated to element criticality, not applied uniformly; governance structures must include clearly defined roles, a data management office, and a stewardship framework with escalation paths; and processes must exist to quickly identify, escalate, and correct data issues affecting reporting.
Running through all four expectations is a consistent thread: institutions must be able to identify their most critical data elements, document them rigorously, trace their provenance, demonstrate their quality, and provide audit-ready evidence of controls — for specific, named data elements. This is the discipline of CDE governance, and it remains the area where MAS found the most significant gaps.
MAS’s guidance closely mirrors BCBS 239 — the 14 principles the Basel Committee issued to ensure banks can produce accurate, timely, and comprehensive risk data, particularly under stress. In Singapore, the seven D-SIBs are formally required to comply with BCBS 239, and MAS’s 2024 paper makes clear that it views CDE governance as the practical mechanism through which that compliance is demonstrated.
CDEs are the small subset of data points that matter most: the fields that drive regulatory reports, risk calculations, customer outcomes, and financial disclosures. Governing them well requires identifying which elements are truly critical, defining them consistently across business lines, mapping them to the physical systems where they reside, applying appropriate quality standards, and maintaining that governance as systems and regulations evolve — all in a way that produces a defensible, auditable record.
Most institutions still attempt this manually. Large banks estimate the fully loaded cost of governing a single CDE manually — stewardship, lineage validation, quality checks, regulatory documentation, annual attestation — at around $10,000 per CDE per year in people time. For an institution with 200 CDEs, that is a $2 million annual governance overhead, before accounting for the risk of getting it wrong.
The bottom-up identification trap: institutions that start by scanning their data assets bottom-up — tagging columns as “critical” — often find themselves with thousands of flagged fields and no manageable path forward. One institution identified 20,000 columns as potentially critical across 380,000 total fields. The solution is to work top-down: start with regulatory obligations and business outcomes, identify the logical CDEs they depend on, then map those CDEs to their physical representations in the data estate.
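The top-down approach described above can be sketched in a few lines. This is a minimal, hypothetical illustration — every report name, CDE name, and column path below is invented for the example, not drawn from any real institution's data estate:

```python
# Hypothetical sketch of top-down CDE identification: start from
# regulatory obligations, derive the logical CDEs they depend on,
# then map those to physical fields. All names are illustrative.

# Step 1: regulatory obligations and the logical CDEs they depend on
obligations = {
    "MAS 610 Regulatory Return": ["Counterparty Exposure", "Customer ID"],
    "Board Credit Risk Report":  ["Counterparty Exposure", "Probability of Default"],
}

# Step 2: logical CDE -> physical representations in the data estate
physical_map = {
    "Customer ID":            ["core_banking.customers.cust_id",
                               "crm.accounts.customer_ref"],
    "Counterparty Exposure":  ["risk_dw.exposures.ead_amount"],
    "Probability of Default": ["risk_dw.ratings.pd_score"],
}

# The CDE inventory is the union of elements the obligations depend on --
# typically hundreds of fields, not the tens of thousands a bottom-up
# column scan would flag.
cdes = sorted({cde for deps in obligations.values() for cde in deps})
inventory = {cde: physical_map.get(cde, []) for cde in cdes}

print(cdes)  # ['Counterparty Exposure', 'Customer ID', 'Probability of Default']
```

The point of the structure is direction: the obligations drive the inventory, so every CDE in scope can be traced back to a reason it matters.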
In November 2025, MAS released a consultation paper proposing formal Guidelines for Artificial Intelligence Risk Management for all financial institutions — covering generative AI, predictive models, and AI agents. A 12-month transition period will follow official promulgation, making near-term readiness a genuine priority.
The Guidelines identify nine key risk domains. Data governance appears as a standalone domain, defined to encompass data provenance and quality across the full AI lifecycle, lineage and traceability from source to model input, protections for personal and sensitive data used in training and inference, and bias detection rooted in data representativeness and completeness.
The implication for institutions is significant: you cannot demonstrate AI risk management compliance without first demonstrating that the data feeding your models is governed. That means knowing what your critical data is, where it lives, how it flows, whether it meets quality standards, and whether it has been appropriately validated and documented — in other words, exactly the CDE governance capability MAS’s 2024 paper identified as widely absent.
Institutions that build robust CDE governance frameworks now — in response to the 2024 data governance guidance — will be materially better positioned when the AI risk management guidelines are formally enacted. Those that do not will face a compounding compliance gap: undocumented data feeding undocumented models, with no audit trail for either.
Effective December 11, 2024, MAS replaced its longstanding outsourcing notices with a strengthened regulatory framework covering both banks and non-bank financial institutions. The revised guidelines significantly raise expectations for how institutions govern data that flows through third parties, cloud service providers, and complex supply chains.
Key requirements include comprehensive due diligence on service providers and subcontractors, maintenance and submission of an outsourcing register documenting all Material Ongoing Outsourced Relevant Services (MOORS), enhanced protections for customer information disclosed to outsourced parties, and contractual requirements ensuring that security and governance standards flow through every link in the supply chain.
The data governance dimension is direct: a significant proportion of the data institutions rely on for regulatory reporting — including CDE-level data — passes through or resides in outsourced environments. Cloud data warehouses, third-party processors, and API-connected platforms all represent extension points for governance. The revised guidelines make clear that institutions are accountable for data quality and security regardless of where data physically resides.
MAS’s Technology Risk Management (TRM) guidelines complement the data governance framework by requiring robust security controls — vulnerability management, access controls, threat detection — as integral components of how institutions protect the data assets their governance programs depend on.
MAS’s perspective is worth underscoring: data quality alone is insufficient if the systems holding critical data are not resilient to cyber threats. In 2024, 54% of global financial institutions experienced cyber-attacks that destroyed data — up 12.5% from 2023. The average cost of a financial sector breach rose to USD 6.08 million, significantly above the cross-industry average. Institutions operating in Singapore must treat cybersecurity and data governance as complementary disciplines, not separate functions.
Taken together, MAS’s evolving guidance — the 2024 data governance paper, the 2025 AI risk management guidelines, and the revised outsourcing rules — describes a single, coherent expectation: financial institutions must know what their critical data is, demonstrate that it is accurate and governed, trace it across the systems and third parties where it lives, and produce an audit-ready record of all of the above.
This is not a documentation exercise. It is an operational capability. Institutions that rely on spreadsheets, SharePoint sites, and manual stewardship processes to govern their CDEs will consistently fall short of what MAS inspectors now look for — and will face compounding exposure as AI governance requirements add a new layer of demand on the same underlying data infrastructure.
The institutions that respond most effectively share a common approach: they work top-down from regulatory obligations, automate the heaviest manual work, apply governance effort proportionally to the criticality of each data element, and maintain a living record that holds up under scrutiny.
There is no single playbook, but the institutions that fare best in MAS examinations tend to share several practices.
Rather than scanning the data estate bottom-up for critical fields, begin with the regulatory reports, risk calculations, and business outcomes that matter most. Which data elements, if wrong, would compromise a regulatory filing or a board-level risk decision? Those are your CDEs. Define them at the logical level first — Customer ID, Counterparty Exposure, Date of Birth — before mapping them to physical systems. Alation’s guide to critical data elements covers this methodology in more depth.
MAS found that generic, uniform quality thresholds fail to reflect the importance of individual fields. High-risk CDEs — those driving regulatory submissions or board reporting — warrant tighter controls, more frequent monitoring, and clearer escalation paths than supporting data elements.
Alation's data quality monitoring surfaces rule violations at the element level — with severity classifications and timestamps that make it clear when a threshold was breached and what the exposure is. Building this tiering into your quality framework is both a MAS expectation and a practical way to focus governance effort.
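Criticality-tiered thresholds can be expressed very simply. The sketch below is a generic, hypothetical illustration of the idea — the tier names, threshold values, and escalation paths are assumptions for the example, not MAS-prescribed figures or any vendor's schema:

```python
# Hypothetical sketch of criticality-tiered data quality thresholds.
# High-risk CDEs get a tighter completeness bar, more frequent checks,
# and a faster escalation path than supporting elements, rather than
# one uniform threshold for every field. All values are illustrative.

TIERS = {
    "high":   {"min_completeness": 0.999, "check_frequency_hours": 1,
               "escalate_to": "data_management_office"},
    "medium": {"min_completeness": 0.99,  "check_frequency_hours": 24,
               "escalate_to": "data_steward"},
    "low":    {"min_completeness": 0.95,  "check_frequency_hours": 168,
               "escalate_to": "backlog"},
}

def evaluate(cde_name, tier, completeness):
    """Return an element-level finding with breach flag and escalation path."""
    rule = TIERS[tier]
    breached = completeness < rule["min_completeness"]
    return {
        "cde": cde_name,
        "tier": tier,
        "breached": breached,
        "escalate_to": rule["escalate_to"] if breached else None,
    }

# The same 98.5% completeness passes for a low-tier element
# but breaches the threshold for a high-tier one.
print(evaluate("Counterparty Exposure", "high", 0.985)["breached"])  # True
print(evaluate("Marketing Segment", "low", 0.985)["breached"])       # False
```

The design choice is that the threshold lives with the tier, not the field, so recalibrating governance effort means editing one table rather than hundreds of individual rules.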
For each CDE, you should be able to trace the data from the regulatory report back to its source system, through every transformation, join, and aggregation in between. This data lineage must be current — not a point-in-time snapshot — and must be accessible to governance and audit teams without requiring deep technical knowledge.
Alation's business lineage view translates technical data flows into a form that governance and audit teams can actually use — tracing each CDE from its source system through every transformation, all the way to the regulatory report.
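Conceptually, report-to-source tracing is a walk over a lineage graph. The sketch below shows the idea in generic terms — the asset names and the graph itself are hypothetical, and real lineage tooling derives these edges automatically rather than hand-coding them:

```python
# Hypothetical sketch: walk a lineage graph upstream from a regulatory
# report field to its source systems. Each edge points from an asset to
# the assets it is derived from. All asset names are illustrative.

from collections import deque

upstream = {
    "mas610_report.total_exposure":  ["risk_dw.exposures.ead_amount"],
    "risk_dw.exposures.ead_amount":  ["staging.trades.notional", "staging.fx.rate"],
    "staging.trades.notional":       ["core_banking.trades.notional"],
    "staging.fx.rate":               ["market_data.fx_rates.mid"],
}

def trace_to_source(asset):
    """Breadth-first walk upstream; return every asset on the lineage path."""
    seen, queue = [], deque([asset])
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.append(node)
        queue.extend(upstream.get(node, []))
    return seen

path = trace_to_source("mas610_report.total_exposure")
# Source systems are the nodes with no further upstream edges.
sources = [a for a in path if a not in upstream]
print(sources)  # ['core_banking.trades.notional', 'market_data.fx_rates.mid']
```

Keeping the graph current — rebuilt as pipelines change, not captured once — is what turns this from a snapshot into the living lineage MAS expects.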
Each CDE needs a named business owner and steward — with clearly documented responsibilities. MAS’s inspections found that accountability was frequently diffuse or absent. In Alation, teams can tie identified risks directly to the data elements involved, with likelihood and impact scores, linked documents, and clear ownership, replacing the informal accountability that MAS consistently flags as a gap.
Governance frameworks that rely on the goodwill of business stakeholders rather than structured workflows and automated prompts will not produce the consistency that regulators require.
Every governance action — a CDE definition updated, a quality threshold revised, an attestation completed — should be captured in a persistent, auditable record. Regulators do not just want to see that CDEs are governed today; they want evidence of a sustained, systematic governance process over time.
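The mechanics behind a persistent record are an append-only event log. The sketch below is a deliberately minimal, hypothetical illustration — field names and actor IDs are invented, and a production system would write to durable, tamper-evident storage rather than an in-memory list:

```python
# Hypothetical sketch of an append-only audit trail for governance actions.
# Each event records who did what to which CDE and when; the log is only
# ever appended to, so the full history can be reconstructed on demand.

from datetime import datetime, timezone

audit_log = []  # in practice: durable, append-only storage

def record(cde, action, actor, detail):
    """Append one immutable governance event to the audit trail."""
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "cde": cde,
        "action": action,   # e.g. "definition_updated", "attested"
        "actor": actor,
        "detail": detail,
    })

record("Customer ID", "definition_updated", "j.tan",
       "Aligned definition with group glossary")
record("Customer ID", "attested", "s.lim",
       "Annual attestation completed")

# History for one CDE, oldest first -- the sustained, systematic
# evidence of governance over time that regulators ask for.
history = [e for e in audit_log if e["cde"] == "Customer ID"]
print([e["action"] for e in history])  # ['definition_updated', 'attested']
```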
In Alation, governance actions are captured as a structured, auditable record. Controls such as data quality checks document what was validated, which risks are mitigated, and how effective the control was — creating the sustained evidence trail MAS inspectors seek.
Alation’s Agentic Data Intelligence Platform was designed for exactly the governance challenges MAS has described. Its data catalog provides the metadata foundation — centralized lineage, data quality monitoring, stewardship workflows, and policy management — that underpins compliance with the 2024 data governance guidance, the revised outsourcing rules, and the forthcoming AI risk management guidelines.
For CDE governance specifically, Alation recently launched CDE Manager — the first agent-powered solution for governing critical data elements at scale. Rather than replacing the judgment of data stewards and business owners, CDE Manager automates the parts of the CDE lifecycle that have historically consumed most of their time: translating policy documents into structured standards, drafting CDE definitions for review, and mapping logical CDEs to their physical representations across the catalog.
Here’s what that looks like in practice: a single CDE record for Customer ID, showing its technical lineage, linked controls, quality scores, ownership, and approval status — everything a regulator would need to see, in one place.
By providing a comprehensive view into critical data elements, Alation helps businesses comply with these regulations.
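A consolidated CDE record of this kind can be pictured as a single structured object. The sketch below is purely illustrative — the field names and values are assumptions for the example, not Alation's actual schema:

```python
# Hypothetical sketch of what a consolidated CDE record might contain:
# definition, risk level, ownership, quality, approval status, and the
# physical assets and controls it links to. All values are illustrative.

from dataclasses import dataclass, field

@dataclass
class CDERecord:
    name: str
    definition: str
    risk_level: str                 # e.g. "high" for regulatory-reporting CDEs
    owner: str
    steward: str
    quality_score: float            # latest element-level DQ result, 0-1
    approval_status: str            # e.g. "approved", "in_review"
    physical_assets: list = field(default_factory=list)
    linked_controls: list = field(default_factory=list)

customer_id = CDERecord(
    name="Customer ID",
    definition="Unique identifier assigned to a customer at onboarding.",
    risk_level="high",
    owner="Head of Retail Banking Data",
    steward="j.tan",
    quality_score=0.998,
    approval_status="approved",
    physical_assets=["core_banking.customers.cust_id",
                     "crm.accounts.customer_ref"],
    linked_controls=["DQ-014 completeness check",
                     "LIN-002 lineage validation"],
)

print(customer_id.approval_status)  # approved
```

Whatever the tooling, the test is the same: can one record answer a regulator's questions about a named data element without a scramble across spreadsheets?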
In practice, this means:
A Policy-as-Code Agent reads MAS guidelines or internal policy documents and generates structured standards and controls — eliminating the manual interpretation bottleneck that produces inconsistency across teams.
A Drafting Agent auto-generates CDE definitions, rationale, and documentation for steward review, removing the blank-page problem that stalls most governance programs.
A Mapping Agent links each logical CDE to its physical catalog assets — columns, BI fields, dashboards — achieving in seconds what previously took days, with 80–90% accuracy on first pass.
A CDE Registry consolidates definitions, risk levels, ownership, data quality scores, and mapped assets into a single source of truth, providing the board-level visibility and audit trail MAS requires.
Early customers of CDE Manager report saving 3–10 days of effort per CDE annually, with the compounding benefit that governance stays current rather than drifting out of date between regulatory cycles:
TPG Telecom: “5–10 days per CDE saved annually, and that’s probably conservative once automation is complete.” — Robert Uge, Senior Data Governance Specialist
HBF (Health Insurance): “One of the standout capabilities is how effortlessly it extracts and formats the information we need. It’s streamlined our workflows and saved a huge amount of time.” — Reece Offer, Data Governance Analyst
Brambles (Global Logistics): “I can see huge potential for CDE Manager as a powerful, intuitive tool that helps us govern the data that drives our business — bringing key indicators for our most important data into one place.” — Alistair Griffin, Global Data Governance Lead
MAS’s regulatory trajectory is clear: data governance is moving from a voluntary best practice to a formally examined compliance obligation, with direct connections to AI governance, outsourcing risk, and technology resilience. The institutions that will navigate this environment most confidently are those that treat CDE governance not as a one-time project but as an operational discipline — automated where possible, audit-ready by design, and proportionate to the actual risk each data element represents.
For financial services leaders in Singapore and across the Asia-Pacific region, the window to build this capability proactively is open. The 2024 data governance inspection cycle has already revealed where the gaps are. The AI risk management guidelines will add a new layer of scrutiny within the next 12 months. The time to act is now, not at the next examination.
To learn more about how Alation supports MAS compliance, visit our financial services page or request a demo.
The May 2024 information paper requires enhanced board oversight, clearly defined governance roles, data quality controls calibrated to element criticality, and audit-ready CDE management — all grounded in BCBS 239 principles and based on MAS’s own inspections of D-SIBs.
CDEs are the small subset of data points most critical to regulatory reporting, risk decisions, and customer outcomes. MAS expects institutions to identify them formally, document them consistently, map them to physical systems, apply appropriate quality standards, and maintain a defensible, auditable record of their governance. See Alation’s full guide to CDEs for more.
MAS’s AI risk management consultation paper explicitly names data governance as a core AI risk domain. Institutions must demonstrate that the data feeding their AI models is governed, traceable, and of documented quality — making CDE governance a prerequisite for AI compliance.
Effective December 11, 2024, the revised guidelines require stronger due diligence on service providers, maintenance of an outsourcing register, enhanced customer information protections, and contractual security controls that extend through the full supply chain — including cloud and API-connected platforms.
By building governance on automated, catalog-backed infrastructure that can be recalibrated as MAS frameworks change — rather than on manual processes that require periodic rebuilding. Alation’s data catalog and CDE Manager are designed specifically for this continuous compliance model.