AI Hallucinations: A Mortgage Lender's Bad Dream, an MBA Premier Member Editorial

By Andrew Liput, CEO of Secure Insight, Hamilton Square, N.J.

Mortgage lenders face regulatory and compliance concerns daily as they navigate the mortgage loan process and balance decisions against ever-changing rules and regulations. When they operate in multiple jurisdictions, the requirement to constantly track rule updates and changes can be a nightmare.


Now that AI-driven technology platforms are entering the industry to make mortgage lending faster, more efficient and less costly (so it is advertised), lenders need to fully understand the risks, rewards, limitations and ROI of artificial intelligence and machine learning in their operations workflow. This includes marketing, sales, processing, underwriting, document production, closing table coordination, post-closing file management, and secondary sales efforts. New terms like “agentic AI,” “human in the loop,” “AI psychosis” (more on that in another article), and “AI hallucination” are entering the industry lexicon. We are addressing AI hallucination here.

AI hallucination, also known as artificial hallucination or confabulation, occurs when an AI technology platform receives a query and produces a response that is factually incorrect, fabricated, or even nonsensical. It is good data in, garbage out. It happens more often than we would like.

We have to avoid thinking of AI systems as having human thought processes. Machine learning systems and large language models can and do take in a query and process it using access to millions or even billions of data points to produce a response designed to answer it. They are not organic, living things. They are programmed and have limited independent learning capabilities. If they have a “brain,” it is an infant brain (at least currently, and from what we know), not a full-grown adult brain.

Data exists in the infospace that is intentionally or negligently false, misleading, creative rather than factual (i.e., fictional), outdated, or fluid. Definitions change, case studies change, and rules change. Even historical context changes: in some factual assessments, what was true 10 years ago is not true today.

Hallucinations can be intrinsic, meaning the response contradicts the query prompt (i.e., it gets the ask wrong), or extrinsic, meaning the response is on topic but contains false or misleading information.

There are scores of real-world examples of AI churning out hallucinated responses, including cases where lawyers (who should know better) submitted legal briefs with citations found through ChatGPT that were entirely fabricated, resulting in those attorneys receiving professional licensing discipline. My favorite example is Google’s AI recommending adding non-toxic glue to pizza sauce to help the cheese stick, which was drawn from a forum where someone had responded to a question with a joke. Tragically, some people who read that might actually do it, which is a legitimate concern.

For lenders, AI hallucination carries the risk that lending decisions, document preparation, consumer-facing communications, and reporting could become skewed or carry false calculations, dates and deadlines. It could also result in HMDA and Fair Lending issues where, for example, a series of decisions relying on the same type of data sets produces unintended but disparate impact discrimination.

Hallucinations can be mitigated with “human in the loop” structures, in which staff oversight monitors and re-verifies black-box decision making before it is locked, loaded and communicated externally.
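To make the idea concrete, here is a minimal sketch, in Python, of what such a gate might look like in software. All names, fields, and thresholds here are hypothetical illustrations, not a real lending system's API: AI-generated output is held for staff review whenever it touches a regulated field or the model's own confidence falls below a floor.

```python
from dataclasses import dataclass

@dataclass
class AIOutput:
    field: str          # e.g. "apr" or "closing_date" (illustrative names)
    value: str
    confidence: float   # model-reported confidence, 0.0 to 1.0

# Assumed examples of fields too sensitive to auto-release
REGULATED_FIELDS = {"apr", "loan_amount", "closing_date"}
CONFIDENCE_FLOOR = 0.95  # arbitrary threshold for illustration

def needs_human_review(output: AIOutput) -> bool:
    """Route the output to a reviewer before it leaves the system."""
    return output.field in REGULATED_FIELDS or output.confidence < CONFIDENCE_FLOOR

def release(outputs, reviewer_approves):
    """Auto-release only outputs that pass the gate; queue the rest for staff."""
    auto, queued = [], []
    for o in outputs:
        (queued if needs_human_review(o) else auto).append(o)
    # Queued items wait for an explicit human sign-off
    approved = [o for o in queued if reviewer_approves(o)]
    return auto + approved
```

The design choice is the point: nothing in a regulated field reaches a borrower or a regulator without a person signing off, regardless of how confident the model claims to be.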

AI is the present and the future. It offers tremendous opportunities for operational efficiencies, more accurate data evaluation and decision making, and faster risk assessment. Yet the ROI is still being examined amid the overarching regulatory and compliance obligations that remain ingrained in the mortgage industry, obligations lenders cannot escape through lack of oversight or overreliance on third-party technology solutions.

(Views expressed in this article do not necessarily reflect policies of the Mortgage Bankers Association, nor do they connote an MBA endorsement of a specific company, product or service. MBA NewsLink welcomes submissions from member firms. Inquiries can be sent to Editor Michael Tucker or Editorial Manager Anneliese Mahoney.)