What Canada’s Data Collection Modernization Project Could Mean for Financial Institutions

Key Takeaways

  • Canada’s Data Collection Modernization (DCM) project changes how supervisory data is submitted and lays the groundwork for more structured, higher‑quality regulatory data in the long term.

  • The project’s long‑term direction signals greater expectations for data quality, consistency and lineage across regulatory filings.

  • Institutions that treat regulatory reporting as an enterprise data capability—integrated with risk and finance—will be better positioned to reduce manual remediation and turn regulatory data into management insights. 

The Office of the Superintendent of Financial Institutions (OSFI), in partnership with the Bank of Canada (BoC) and the Canada Deposit Insurance Corporation (CDIC), is undertaking a major transformation in how supervisory data is collected from banks and financial institutions.

The project—the multi-year Data Collection Modernization (DCM) initiative, running from May 2023 to April 2028—aims to better manage regulatory reporting burden by modernizing the data collection platform and advancing data initiatives that enhance data quality.

With industry engagements and forums nearly complete, the project is now in its implementation phase. 

What is the DCM’s new regulatory data collection platform?

The technology implementation will support change management through streamlined data migration, phased industry onboarding and dedicated filer support.

The new data collection platform is expected to go live in autumn 2026. Once operational, the platform will create a single submission channel for OSFI-regulated data, processing high-volume filings across multiple formats and structures. 

With phased industry onboarding, Canadian banks are expected to begin transitioning to the new platform in summer 2027. The transition will include a training and practice period, after which banks will submit existing regulatory files through the new platform. During this period, banks can continue using their current submission formats. 
 

How will the DCM project change data requirements in the long term?


The DCM project supports a shift toward more structured, granular regulatory data collection for selected data sets. While some regulatory reporting already requires granular or record‑level data, the project aims to further improve consistency, quality and standardization of how detailed information is collected across priority areas. 
 

In September 2026, the DCM project will begin implementing its prioritized data sets. 

To support “risk-intelligent decision-making” in banking, the DCM project is prioritizing:

  1. Transitioning real estate secured lending (RESL) and securities holdings to structured data collection returns

  2. Refining the collection and quality of non-retail (corporate and commercial real estate) credit risk data 



Adaptable Regulatory Reporting by Design: 
Advancing Confidently With the Pace of Regulation 

Learn about the adaptable, multi-layered approach to regulatory reporting, why it scales efficiently through continuous regulatory change and the five indicators that signal your reporting processes aren’t scaling efficiently. 


 Download the Whitepaper


What are the DCM project’s possible long-term implications for financial institutions?

Here are three key possible implications of the DCM project and how financial institutions can prepare: 

1. Higher expectations for data quality, consistency and lineage

With automated validations, cross-checks and trend analysis at scale, regulators will gain sharper visibility into internal inconsistencies across returns, data breaks between domains and unexplained volatility or outliers.

Potential Impact: Inconsistencies between similar data reported to different authorities will become more visible and harder to defend. Regulatory scrutiny of data lineage, controls and reconciliation logic could increase, with reduced tolerance for manual overrides and opaque adjustments.

Recommendation: Formalize data governance, issue management and remediation workflows. Shift focus from report assembly to upstream data quality and lineage, and assign direct accountability for regulatory submissions to data owners in finance, risk, treasury and credit. 
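The kind of cross-return consistency check regulators could automate at scale can be sketched in a few lines. This is a purely illustrative example: the return names, field names and tolerance below are hypothetical, not actual OSFI schemas or thresholds.

```python
# Hypothetical cross-return consistency check. Return names and field
# names are illustrative only, not actual OSFI filing schemas.

def check_cross_return_consistency(returns: dict, tolerance: float = 0.01) -> list:
    """Flag metrics that disagree across regulatory returns.

    `returns` maps a return name to a {metric: value} dict; any metric
    reported in more than one return is compared within a relative
    tolerance, and breaks are returned for remediation.
    """
    observations: dict = {}
    for return_name, fields in returns.items():
        for metric, value in fields.items():
            observations.setdefault(metric, []).append((return_name, value))

    issues = []
    for metric, reported in observations.items():
        if len(reported) < 2:
            continue  # metric appears in only one return; nothing to compare
        baseline_name, baseline = reported[0]
        for other_name, value in reported[1:]:
            if baseline and abs(value - baseline) / abs(baseline) > tolerance:
                issues.append((metric, baseline_name, other_name, baseline, value))
    return issues

# The same RESL exposure reported inconsistently in two hypothetical returns:
filings = {
    "capital_return": {"total_resl_exposure": 1_250_000_000},
    "credit_risk_return": {"total_resl_exposure": 1_190_000_000},
}
breaks = check_cross_return_consistency(filings)  # the ~4.8% gap is flagged
```

An institution running checks like this internally, before submission, surfaces the same breaks a supervisor's automated validation would, while they can still be remediated upstream.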

2. Improved availability of timely, decision‑ready data

While reporting frequency isn't anticipated to change, the new platform improves OSFI’s ability to request, receive and validate targeted data during supervisory or stress events.

Potential Impact: The new platform is expected to accelerate turnaround times on ad-hoc and targeted requests, reducing reliance on bespoke, one-off submission channels. Lead time for regulatory data preparation during supervisory events may compress.

Recommendation: Maintain "always-ready" regulatory data sets. Automation and straight-through processing can accelerate turnaround; brittle or heavily manual reporting processes will struggle to keep up. 
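One way to think about an "always-ready" data set is that quality gates run at refresh time, not at request time. The following is a minimal sketch under assumed, hypothetical field names (`loan_id`, `balance`) and an assumed one-day freshness window; real readiness criteria would come from the institution's own data governance standards.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RegulatoryDataset:
    """Hypothetical 'always-ready' data set: validated on every refresh,
    so an ad-hoc supervisory request can be served without remediation."""
    name: str
    records: list = field(default_factory=list)
    last_validated: date = None

    def refresh(self, records: list, today: date) -> None:
        # Quality gates run on ingestion, not when a request arrives.
        if not all("loan_id" in r and r.get("balance", -1) >= 0 for r in records):
            raise ValueError(f"{self.name}: records failed quality gates")
        self.records = records
        self.last_validated = today

    def is_ready(self, today: date, max_age_days: int = 1) -> bool:
        # "Ready" means validated recently enough to submit as-is.
        return (self.last_validated is not None
                and (today - self.last_validated).days <= max_age_days)

resl = RegulatoryDataset("resl_exposures")
resl.refresh([{"loan_id": "L1", "balance": 250_000.0}], today=date(2027, 9, 1))
```

The design choice is that a stale or never-validated data set simply reports itself as not ready, rather than being discovered as unfit in the middle of a supervisory event.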

3. Greater alignment of regulatory, risk and finance data

The DCM project’s standardized data definitions and reuse model are designed to drive alignment between regulatory reporting, risk management and finance.

Potential Impact: Parallel data pipelines for similar metrics—such as exposures, collateral and classifications—would grow more costly and inefficient.

Recommendation: Approach regulatory reporting as an integrated core data capability that delivers management insight, not merely a compliance function. Standardize data at its source into a single, reusable governed data layer that can serve regulatory filings, risk analytics and executive decision‑making. 

From Compliance Output to Upstream Design

Canada’s DCM project could alter the economics and expectations of compliance. With platform onboarding for Canadian banks scheduled for summer 2027, the project is anticipated, over the long term, to raise the standard for data quality and defensibility.

This increased standard by regulators would shift reporting from a downstream task to an upstream design choice. Institutions that view regulatory data as an enterprise asset and integrate regulatory reporting with their enterprise-wide data will stand to reduce manual remediation efforts and gain management insights.

 


Adaptable Regulatory Reporting by Design: 
Advancing Confidently With the Pace of Regulation 

 

Basel III reforms, the Integrated Reporting Framework (IReF) and similar multi-year modernization efforts worldwide are increasing standards for granular data, transparency and explainability.

Incremental workarounds in rigid reporting architectures aren’t sustainable; the operational inefficiencies and risks they incur accelerate an institution’s path to its architectural tipping point.

Discover how an adaptable, multi-layered approach preserves control—and strengthens scalability—while absorbing the volume and complexity of change in today's regulatory landscape. 
 

Download Whitepaper Now
