Mark Dangelo: Mortgage Data Core: Why Returns—Not Tools—Must Drive the Next Industry Curve
Longtime MBA NewsLink contributor Mark Dangelo says tomorrow’s requirements are not about technologies—they are about the data. “Without a clear purpose, framework and repeatable diagnostics, success over data will be claimed—not won.”

In today’s mortgage environment, the conversation around data has become saturated with terminology—platforms, AI, fabrics, meshes, and modernization programs. However, for all the investment and activity, a more fundamental question remains insufficiently addressed: why are so few organizations realizing measurable, repeatable returns from their data initiatives?
This is not a tooling issue, an AI mandate, or a one-off talent gap. It is, more precisely, a failure of leadership and architectural discipline, one that becomes most visible under pressure, particularly where traditional ideas about data and operational systems still dominate. The mortgage industry is now operating under sustained pressure and material uncertainty.
Additionally, margins remain compressed across industry segments. Interest rates and inflation are once again driving consumer sentiment. Regulatory expectations continue to expand in both scope and scrutiny. At the same time, executives are being asked to justify AI investments, accelerate integration timelines, and improve data traceability (for regulatory compliance and auditing), all while reducing operational cost.
These are not independent, isolated challenges. They are symptoms of a deeper structural constraint: an underdeveloped data core that was never designed to operate as an enterprise asset, let alone serve as the connective layer for interoperable AI capabilities. Intuitively, we know this is a challenge, yet we keep applying traditional, system-centric approaches that turn into multifaceted, parasitic financial sinkholes.
The organizations that separate themselves in the next cycle will not be those that adopt the most (AI) technology. They will be those whose leaders rethink the data core as an economic engine, governed by architectural leadership and measured by objective returns (see Table 1 below).
The Mortgage Industry’s Structural Reality
Mortgage institutions have historically evolved through product expansion, channel diversification, and acquisition. Data environments followed that same path—incrementally, functionally, and often reactively.
The result is familiar:
• Definitions of borrower, loan, and risk vary across origination, servicing, and secondary markets,
• Data lineage exists in fragments, often reconstructed for audits rather than embedded in operations,
• Integration efforts reset with each acquisition or platform change, and
• Reporting processes rely on coordination rather than structural assurance.
These conditions were manageable when cycles were longer and tolerance for delay was higher. That is no longer the case. Today, value is created in real time from federated data and solutions:
• AI initiatives stall because inputs cannot be trusted consistently,
• M&A synergies are delayed by prolonged data reconciliation, and
• Regulatory responses require disproportionate efforts to validate and explain.
Each of these carries a direct economic cost, both as a discrete financial drain and as a missing layer of interoperable competency. Nonetheless, most organizations continue to treat them as operational inconveniences and siloed integration problems rather than architectural failures.
Reframing the Data Core as an Economic Construct
To move forward, leaders must shift the framing of data from an operational, process dependency (i.e., system ideation) to an economic asset (i.e., data ideation). This requires answering three questions with precision:
• Where is data creating measurable return today?
• Where is it introducing friction, delay, or risk?
• What architectural conditions are required to systematically improve both?
This is where emerging architectural frameworks designed for data ideation (e.g., data meshes, fabrics, and foundries), such as AXTent, become relevant: not as a technology model, but as a leadership construct leveraging cross-platform and cloud offerings.
AXTent reframes the data core around a set of non-negotiable principles:
• Accountability for meaning is explicitly assigned,
• Data is managed as reusable products, not one-off pipelines,
• Governance is embedded into execution, not applied after the fact, and
• Architecture is designed to compound value, not accumulate (data) debt.
Within a mortgage context, this shifts the conversation from “how do we integrate systems?” to “how do we create repeatable economic outcomes from data?”
Pressure Point 1: AI in Mortgage—Confidence Is the Currency
AI adoption in mortgage is accelerating, particularly in underwriting, servicing optimization, and borrower engagement. However, most initiatives struggle to move beyond controlled pilots.
The common explanation is that “the data isn’t ready.” That diagnosis is directionally correct—but incomplete. The real issue is that AI exposes inconsistencies that already exist:
• Borrower attributes are defined differently across systems (i.e., semantics),
• Loan performance data lacks consistent lineage (e.g., standards, SOR), and
• Policy enforcement varies by process and channel (i.e., system ideations).
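As a hypothetical illustration of how these semantic gaps surface before a model ever runs, the sketch below maps two source systems' borrower records onto one canonical definition and flags every mismatch. All system names, field names, and values here are invented for illustration, not drawn from any real LOS or servicing platform:

```python
# Hypothetical sketch: surfacing cross-system semantic drift before AI sees it.
# System names, field mappings, and values are illustrative assumptions.

CANONICAL_BORROWER = {
    "fico_score": int,        # one agreed meaning: mid-score at application
    "monthly_income": float,  # gross, pre-tax, USD
    "self_employed": bool,
}

# Each source system calls the same concept something different.
SOURCE_MAPPINGS = {
    "los":       {"fico": "fico_score", "gross_income": "monthly_income",
                  "se_flag": "self_employed"},
    "servicing": {"credit_score": "fico_score", "income_mo": "monthly_income",
                  "self_emp": "self_employed"},
}

def to_canonical(system, record):
    """Map a raw record to the canonical borrower definition, flagging gaps."""
    mapping = SOURCE_MAPPINGS[system]
    canonical, issues = {}, []
    for src_field, canon_field in mapping.items():
        if src_field not in record:
            issues.append(f"{system}: missing {src_field} -> {canon_field}")
            continue
        value = record[src_field]
        expected = CANONICAL_BORROWER[canon_field]
        if not isinstance(value, expected):
            issues.append(f"{system}: {canon_field} is "
                          f"{type(value).__name__}, expected {expected.__name__}")
        canonical[canon_field] = value
    return canonical, issues

los_rec = {"fico": 742, "gross_income": 8500.0, "se_flag": False}
svc_rec = {"credit_score": "742", "income_mo": 8500.0}  # string score, no flag

print(to_canonical("los", los_rec))        # clean record, no issues
print(to_canonical("servicing", svc_rec))  # two issues surfaced immediately
```

The point of the sketch is not the code itself but where the check sits: disagreements about meaning are caught at the boundary, with a named owner, rather than appearing later as unexplained model drift.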
AI does not create these problems. It operationalizes them. From a leadership perspective, the question is not whether AI models are accurate. It is whether the conditions under which they operate are stable and defensible.
This is where architectural discipline directly translates into return:
• When data inputs are consistent, model performance stabilizes,
• When lineage is embedded, auditability becomes continuous rather than episodic, and
• When ownership is explicit, issue resolution accelerates.
The outcome is not theoretical. It is measurable (see Table 1):
• Reduced model drift and rework,
• Faster deployment cycles, and
• Increased executive confidence in AI-driven decisions.
Leadership question: Are your AI investments generating scalable outcomes—or are they constrained by conditions that were never architecturally addressed?
Pressure Point 2: M&A Integration—From Cost Center to Value Multiplier
Mortgage institutions continue to rely on acquisition as a growth strategy. Yet post-merger integration remains one of the most persistent sources of unrealized value.
The traditional approach treats each integration as a discrete event:
• Extract and normalize data,
• Reconcile definitions manually,
• Build custom mappings, and
• Repeat for the next acquisition.
This model guarantees two outcomes: time delay and cumulative data debt. Architectural leadership reframes integration as a reusable capability. Instead of starting with systems, it starts with shared meaning:
• Core data domains—borrower, loan, collateral, risk—are defined independently of source systems,
• Acquired data is aligned to these definitions, not to downstream reports, and
• Variances are identified immediately, not after integration is complete.
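The "shared meaning first" approach above can be sketched as a variance check that runs against canonical domain rules before any system mapping begins. The field names, rules, and legacy codings below are illustrative assumptions, not an actual boarding standard:

```python
# Hypothetical sketch of variance-first loan boarding: acquired records are
# checked against canonical domain rules before integration work starts.
# Field names and rules are illustrative assumptions, not a real standard.

CANONICAL_LOAN_RULES = {
    "loan_amount": lambda v: isinstance(v, (int, float)) and v > 0,
    "ltv": lambda v: isinstance(v, (int, float)) and 0 < v <= 1.25,  # ratio
    "occupancy": lambda v: v in {"primary", "second_home", "investment"},
}

def boarding_variances(portfolio):
    """Return {field: count of acquired records failing the canonical rule}."""
    variances = {field: 0 for field in CANONICAL_LOAN_RULES}
    for record in portfolio:
        for field, rule in CANONICAL_LOAN_RULES.items():
            if field not in record or not rule(record[field]):
                variances[field] += 1
    return variances

acquired = [
    {"loan_amount": 315000, "ltv": 0.80, "occupancy": "primary"},
    {"loan_amount": 0, "ltv": 0.95, "occupancy": "Owner-Occupied"},   # legacy code
    {"loan_amount": 128500, "ltv": 80.0, "occupancy": "investment"},  # percent vs ratio
]

print(boarding_variances(acquired))  # variances surface before mapping begins
```

Because the rules belong to the canonical domain rather than to any one acquisition, the same check is reused on the next deal instead of being rebuilt, which is the compounding effect the article describes.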
This shift has direct economic implications:
• Integration timelines compress,
• Redundant transformation work is eliminated, and
• Each acquisition strengthens the architecture rather than fragmenting it.
Over time, the organization builds a compounding advantage—one where integration becomes faster, less costly, and more predictable.
Leadership question: Does each acquisition make your data environment stronger—or does it increase the cost and complexity of the next one?
Pressure Point 3: Regulatory Reporting—From Effort to Assurance
Few areas in mortgage carry as much reputational and financial risk as regulatory reporting. Requirements continue to evolve, with increasing emphasis on transparency, traceability, and control. However, many organizations still rely on a familiar model:
• Assemble data from multiple sources,
• Reconcile inconsistencies,
• Document lineage manually, and
• Validate outputs under tight timelines.
This is not a reporting process. It is a reconstruction exercise. The underlying issue is that most data environments lack what can be described as “structural memory”—the ability to inherently answer:
• Where did this data originate (e.g., system-of-record)?
• How was it transformed (e.g., manipulations, lift-and-shifts)?
• Who is accountable for its meaning (e.g., security, privacy, lineage)?
• What controls were applied (e.g., internal, external, auditability)?
Without this, every reporting cycle becomes a new effort. Architectural leadership changes the model by embedding these answers into daily operations:
• Regulatory outputs are treated as extensions of governed data products,
• Lineage is generated automatically as part of execution, and
• Policies are enforced continuously rather than validated periodically.
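As a hedged illustration of lineage generated as part of execution, the sketch below wraps pipeline steps so that origin, accountable owner, and transformation are recorded automatically as a by-product of running the work. The step names, owners, and fields are invented for illustration:

```python
# Hypothetical sketch: lineage captured as a by-product of execution rather
# than reconstructed for audits. Step names, owners, and fields are invented.
import datetime
import functools

LINEAGE_LOG = []  # in practice this would be a durable, queryable store

def traced(step_name, owner):
    """Decorator recording origin, transformation, and accountability."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(payload):
            result = fn(payload)
            LINEAGE_LOG.append({
                "step": step_name,
                "owner": owner,                    # accountable for meaning
                "inputs": sorted(payload.keys()),  # what the step consumed
                "outputs": sorted(result.keys()),  # what it produced
                "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            })
            return result
        return inner
    return wrap

@traced("normalize_rate", owner="servicing-data")
def normalize_rate(loan):
    return {**loan, "note_rate": round(loan["note_rate_pct"] / 100, 4)}

@traced("derive_dti", owner="underwriting-data")
def derive_dti(loan):
    return {**loan, "dti": round(loan["monthly_debt"] / loan["monthly_income"], 2)}

loan = {"note_rate_pct": 6.875, "monthly_debt": 2100, "monthly_income": 7000}
result = derive_dti(normalize_rate(loan))
for entry in LINEAGE_LOG:
    print(entry["step"], entry["owner"], entry["outputs"])
```

When a regulator or auditor asks where a figure came from, the answer already exists as structured records written at execution time, which is the "structural memory" the article calls for.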
The impact is both operational and economic:
• Reduced reporting cycle time,
• Lower compliance costs, and
• Increased confidence in regulatory interactions.
Most importantly, reporting shifts from being a periodic burden to a continuous capability.
Leadership question: Is your organization proving compliance after the fact—or demonstrating it as a natural outcome of how data operates?
The AXTent Lens: Connecting Architecture to Return
What distinguishes the AXTent architectural framework in this context is not its structure, but its intent. It aligns architectural decisions directly with economic outcomes by enforcing a set of interconnected disciplines:
• Data products ensure reuse and consistency,
• Embedded governance reduces downstream remediation,
• Continuous feedback loops allow adaptation without destabilization, and
• Modular design supports incremental adoption without fragmentation.
For mortgage executives, this provides a practical pathway: you do not need to transform everything at once. However, you do need to ensure that every step contributes to a coherent, layered architecture. This is where many initiatives fail—not because they lack ambition, but because they lack alignment and measured stepped improvements.
Leadership Implications: A Shift in Accountability
The most significant change required is not technical. It is organizational. Architectural leadership demands that executives move beyond sponsorship into ownership of outcomes. This includes:
• Defining what “good” looks like in terms of data-driven return,
• Holding domains accountable for the meaning and quality of their data,
• Prioritizing investments that reduce structural friction, not just immediate cost, and
• Measuring success through reuse, speed, and confidence—not activity.
In practical terms, this means shifting conversations from “What platform should we invest in?” to “What architectural conditions are required to improve return, and how do we enforce them?”
The Path Forward: Measured, Not Incremental
The mortgage industry does not have the luxury of pursuing transformation as a long-term aspiration—this is not a next-gen standards discussion. The pressures are immediate, and the cost of inaction is cumulative. However, the path forward is not about moving faster indiscriminately. It is about moving with precision.
Organizations that succeed will:
• Anchor initiatives in clearly defined economic outcomes,
• Build around governed, reusable data constructs,
• Treat integration and compliance as architectural capabilities, and
• Apply architectural frameworks like AXTent to ensure consistency across efforts.
These high-level steps will not eliminate complexity, but they will control how complexity behaves.
The next phase of survival in mortgage will not be defined by who has the most data or the most advanced tools. It will be defined by who can consistently convert data into trusted, explainable, and repeatable outcomes.
That is an architectural challenge. It is a future state reality. And ultimately, it is a leadership decision. Are you investing in data initiatives—or are you building a data core that produces measurable return under pressure?
TABLE 1: Ensuring your data core is accelerating results, not eroding them
| AXTent Discipline | Diagnostic Questions | KPIs Impacted |
| Data Product Accountability (Economic Ownership of Meaning) | Are borrower and loan definitions consistent across LOS, POS, and servicing systems? Do underwriting, processing, and closing teams operate from the same data assumptions? When a loan stalls, can the root cause be traced to a specific data owner? | Pull-Through Rate (%): Inconsistent data definitions lead to fallout during underwriting and closing. Loan Cycle Time (Days): Rework driven by data discrepancies extends processing timelines. Cost-to-Originate ($ per loan): Manual reconciliation and exception handling increase per-loan cost. |
| Embedded Governance | Are compliance rules (e.g., HMDA, ATR/QM) enforced during data creation, or validated later? How frequently are defects identified post-closing or post-sale? Is governance visible in operations—or only during audits? | Defect Rate (%): Late-stage governance increases underwriting and documentation errors. Repurchase / Indemnification Rate (%): Weak control environments elevate investor risk exposure. Compliance Cost ($ / Loan or % of Ops Expense): Manual reviews and audit remediation inflate costs. |
| Lineage and Transparency | Can you trace servicing metrics or loan performance data end-to-end without delay? How often are reports revalidated due to conflicting data sources? Are audit findings driven by data gaps or process gaps? | Reporting Cycle Time (Days): Manual lineage reconstruction delays internal and external reporting. Audit Findings (# / Severity): Lack of traceability increases frequency and severity of findings. Investor Confidence / Pricing Adjustments (bps): Poor transparency can lead to pricing penalties in secondary markets. |
| Integration as a Capability | How long does it take to onboard acquired loans into servicing with trusted data? Are integrations repeatable—or rebuilt each time? Do data issues surface early—or after systems are connected? | Time-to-Synergy (Months): Delayed integration postpones revenue and cost benefits. Loan Boarding Time (Days): Data inconsistencies slow servicing transfer and readiness. Data Conversion Cost ($ per Loan / Portfolio): Custom mappings and remediation increase integration expense. |
| AI Readiness | Are AI-driven underwriting or servicing tools reducing manual effort? How often do AI outputs require human override due to data concerns? Is AI improving borrower experience—or introducing inconsistency? | Loans Processed per FTE: Stable data inputs enable automation and productivity gains. Cycle Time Reduction (%): AI accelerates decisions only when inputs are consistent. Customer Satisfaction / NPS: Inconsistent outputs degrade borrower trust and experience. |
| Reuse and Compounding Value | How often are the same data transformations rebuilt across teams? Are new products or channels leveraging existing data assets? Does delivery speed improve over time—or remain constant? | IT Spend as % of Revenue: Redundant work inflates technology and data costs. Project Delivery Time (Weeks/Months): Lack of reuse slows time-to-market for new initiatives. Cost of Change ($ per Initiative): Fragmented architecture increases the cost of adaptation. |
| Time-to-Confidence | How quickly can executives act on production, pipeline, or servicing insights? How often are decisions delayed due to data validation efforts? Is confidence in data consistent across capital markets, servicing, and origination? | Decision Velocity (Time to Action): Delays reduce the ability to respond to rate changes and pipeline risk. Lock-to-Close Conversion (%): Slow or uncertain decisions increase fallout risk. Pipeline Hedging Effectiveness (Variance / bps): Inaccurate or delayed data impacts hedge performance. |
(Views expressed in this article do not necessarily reflect policies of the Mortgage Bankers Association, nor do they connote an MBA endorsement of a specific company, product or service. MBA NewsLink welcomes submissions from member firms. Inquiries can be sent to Editor Michael Tucker or Editorial Manager Anneliese Mahoney.)
