
First American’s Sarah Frano on AI-Driven Fraud: The Hidden Threat in Real Estate

Sarah Frano is a Vice President and Real Estate Fraud Expert at First American Title Insurance Co.
Tools and technologies powered by artificial intelligence continue to evolve rapidly. While the real estate industry is harnessing AI to automate everything from property valuations and predictive analytics to customer relationship management and fraud prevention, scammers are using the same technology to identify targets, rapidly scale their schemes and avoid detection. A Deloitte study, for example, estimated that generative AI could drive U.S. fraud losses to grow by 32% annually, reaching $40 billion by 2027.
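For a sense of what that projection implies, here is a minimal back-of-the-envelope check in Python. The roughly $12.3 billion 2023 baseline is an assumption drawn from Deloitte's published estimate, not a figure stated in this article.

```python
# Rough compounding check of the Deloitte projection cited above.
# Assumption: a 2023 baseline of ~$12.3 billion (Deloitte's estimate, not stated in this article).
baseline_2023 = 12.3e9   # estimated U.S. fraud losses enabled by generative AI, 2023
annual_growth = 0.32     # 32% compound annual growth rate
years = 2027 - 2023

projected_2027 = baseline_2023 * (1 + annual_growth) ** years
print(f"Projected 2027 losses: ${projected_2027 / 1e9:.1f} billion")
# Prints roughly $37 billion, broadly consistent with the ~$40 billion figure cited above.
```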
What does this mean for the U.S. real estate and mortgage finance industries and the customers they serve? Sarah Frano, Vice President and Real Estate Fraud Expert at First American Title, shared her thoughts on the distinct threat AI-driven fraud poses to home buyers and sellers, real estate professionals and lenders, and offered concrete advice to help protect against fraudsters.
MBA NewsLink: How are scammers using AI to commit real estate fraud?
Sarah Frano: AI tools make it easy to quickly fabricate correspondence, identification, deeds, mortgages, video and voices that can be indistinguishable from the real thing. Given the intrinsic value of real estate, property transactions and mortgages are attractive targets for scammers.
Armed with AI, scammers can commit broader and increasingly complex types of fraud. Deepfakes, for example, are created by gathering a large dataset of images or videos of a target person and using it to train a deep learning model to mimic the person’s voice, facial features, expressions and movements. In real estate transactions, scammers can use deepfake audio or video to impersonate real estate agents or other professionals involved in the deal, sending fraudulent communications that provide false information or instructions.
Scammers can also use deepfakes to impersonate home sellers. Recently, a Florida title company scheduled a video call to confirm the identity of a woman attempting to sell a vacant lot. They were shocked when they encountered an AI-generated person. The fraudsters likely used AI and face-swapping technology to create the alleged seller, but the face was actually that of a woman who had disappeared in 2018.
MBA NewsLink: What are the most common targets?
Sarah Frano: Properties without an owner-occupant, such as a vacant lot, a second home or a rental property, are common targets for scammers since it’s less likely the owner will discover the fraud. Conversely, while owner-occupied properties are less susceptible to seller impersonation fraud, they may be at a greater risk of scammers taking out fraudulent loans and stripping the property’s equity. Scammers are after money, not property, so high-equity and mortgage-free properties are attractive targets.
MBA NewsLink: How can I spot AI-driven fraud?
Sarah Frano: Detecting deepfakes can be challenging, but there are several techniques and tools that can help identify them, including visual and behavior analysis. Look for inconsistencies in blinking patterns, lip movements, reflections, shadows, skin texture and hair. Deepfakes often struggle to replicate natural blinking patterns and lip movements and may show overly smooth skin or inconsistencies in hair.
There are also specialized software and tools designed to detect deepfakes. These tools analyze videos for subtle artifacts and inconsistencies that are hard for humans to spot. Advanced machine learning models can also be trained to detect deepfakes by identifying patterns and anomalies in the data.
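To make that concrete, below is a minimal, illustrative sketch of the kind of frame-level classifier such tools are built on. It assumes a folder of video frames already labeled real or fake (data/train/real and data/train/fake are hypothetical paths) and uses a standard pretrained image model; it is not any specific vendor's detection product.

```python
# Illustrative sketch: training a frame-level "real vs. deepfake" classifier
# on labeled video frames. Assumes hypothetical folders data/train/real and
# data/train/fake; this is not any specific detection tool's implementation.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder maps subdirectory names (real/, fake/) to class labels.
train_data = datasets.ImageFolder("data/train", transform=preprocess)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace its head with a 2-class output.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for frames, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(frames), labels)
        loss.backward()
        optimizer.step()

# At inference time, frames scored as "fake" with high confidence would be
# flagged for human review; the model assists verification, it does not replace it.
```

In practice, commercial detectors combine many such signals, such as frame artifacts, audio analysis and metadata checks, rather than relying on a single model.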
For a quick practical tip, perform a reverse image search to check if the image or video has been used elsewhere on the web.
MBA NewsLink: How can people protect themselves from AI-driven fraud?
Sarah Frano: To protect yourself from deepfake scams during the home buying and selling process, consider the following precautions, and tell those you work with to do the same.
• Verify identities. Always verify the identity of the person you are dealing with through multiple sources. Whenever possible, meet in person to confirm details and verify identities.
• Use trusted platforms. Conduct transactions through trusted escrow services and their secure platforms. Be cautious of emails and content from unknown or unverified sources.
• Protect your title. Where available, purchase a title insurance policy that covers fraud that occurs after you purchase your home.
• Stay informed. Keep up to date with the latest fraud schemes and how to detect them. Educate yourself and others about the signs of deepfakes and encourage a critical approach to consuming digital content.
By combining these techniques and staying vigilant, you can improve your ability to detect deepfakes and reduce the risk of real estate fraud powered by AI.
For a deeper dive into the risk of AI-driven fraud, register today for the April 16 webinar “Unreal Deals: AI, Deepfakes and How Your Closings are at Risk,” in which Sarah and other First American subject matter experts unravel the growing threat of AI-driven fraud in real estate and explain how to stay ahead of it.
(Views expressed in this article do not necessarily reflect policies of the Mortgage Bankers Association, nor do they connote an MBA endorsement of a specific company, product or service. MBA NewsLink welcomes your submissions. Inquiries can be sent to Editor Michael Tucker or Editorial Manager Anneliese Mahoney.)