The latest housing scam: Using AI to impersonate your agent or lender
The closing date was approaching on Raegan Bartlo’s new home in West Virginia, so she wasn’t surprised to receive an email on Aug. 23 that appeared to be from the title company lawyer with instructions to wire a $255,000 down payment.
She sent the money to the account listed in the email, assuming all was well with her planned move from Alexandria, Virginia. Two days later, she learned she’d actually wired the funds to a scammer.
The scam email, she believes, was written by a computer program powered by artificial intelligence.
“It was terrifying. You feel violated. I thought we lost everything,” said Bartlo, who formerly worked on cybersecurity issues at the Analysis and Resilience Center for Systemic Risk, where she learned to recognize the hallmarks of a scammer’s email.
This one, she said, was different, with a conversational yet professional tone and no grammatical red flags - characteristics that cybersecurity experts say are typical of AI-generated text.
Across the country, cybersecurity officials say, scammers are using generative AI programs to impersonate real estate agents, lenders or other parties in a home sale - mimicking their writing style in emails or their voice in voicemails to direct unsuspecting recipients to wire funds to accounts controlled by the scammers.
These schemes are “almost a perfect crime,” said Tom Cronkright, co-founder and executive chairman of CertifID, a real estate wire fraud prevention company. Huge sums of money are at stake, and buyers often want to move quickly in a tight housing market.
Because most buyers and sellers don’t have much experience with these transactions, he said, they’ll generally take direction from their agents, lenders and title company - or people posing as them.
“You slap generative AI on top of this, and you’re like, ‘My agent called me, I got this voicemail, I read this email, and, yeah, I wired it, because I’m so fatigued by this process that I’ll do anything to make sure it closes,’” Cronkright said.
During a phone interview, Cronkright recorded 13 seconds of a reporter’s voice, then used an AI program to have a voice clone pose as a real estate agent and instruct a client to make a wire transfer. He also typed up a quick prompt to create an AI-written email with wiring instructions, in a mostly formal tone but with a golf joke included for good measure.
Matt O’Neill, the former head of the U.S. Secret Service unit that investigates cybercrimes, said he first encountered this type of scam “almost the day after ChatGPT was released” in late 2022.
Scammers can learn the details they need to pull off these schemes by hacking into email accounts or phishing real estate professionals for transaction information, O’Neill said. They can also draw on publicly available information from real estate databases and listings to add persuasive details to their scam emails - say, the paint color of the master bedroom.
Free AI programs such as ChatGPT, O’Neill said, allow scammers to write better phishing emails - messages that trick recipients into clicking a link or attachment. “It lowers the bar for bad actors who had been in the game before, because they don’t need any advanced skills,” he said.
Two FBI officials with expertise in complex financial and cyber-enabled fraud crimes said they’ve seen an increase in housing fraud cases involving AI.
Previously, one of the officials said, scam emails and text messages were often written by people who lacked a strong grasp of English, resulting in a tone that was overly formal or overly casual, or in sentences with grammatical errors and awkward wording.
Now, the official said, “asking a generative AI program to come up with a business-toned email that can convince a person that there was an issue with their closing and they need to switch their bank account is really easy to generate, and that can be written out by the software in seconds.”
The FBI doesn’t have data on AI-involved housing scams, the official said, because the use of AI is not itself illegal and because it’s difficult to prove that AI was used in a specific case.
But earlier this month, the FBI issued a public service announcement warning of the increasing use of AI in fraud schemes to generate persuasive text, audio, video and other communications. That followed an alert last month from the Treasury Department’s Financial Crimes Enforcement Network (FinCEN) about a recent increase in AI-generated “deepfakes” targeting financial institutions.
In 2023, the FBI received 21,489 complaints of “business email compromise” - email scams involving fraudulent transfers of funds - totaling more than $2.9 billion, according to the agency’s annual internet crime report. For overall cybercrime, D.C. had more complaints and losses per capita than any state. Maryland and Virginia were both in the top half of states. (The report does not list state-level data for business email compromise schemes specifically.)
It’s impossible for Bartlo and other scam victims to know for sure whether their scammers used AI. But her cybersecurity background taught her to recognize a human-written phishing email. The email from her scammer, she said, didn’t read like one.
“Quite frankly, it never crossed my mind,” she said of the possibility that the sender wasn’t really the title company lawyer. “The way the person did it - it was really, really good.” Later, she discovered one digit was wrong in the email signature’s fax number.
Bartlo and her husband, Michael, were eventually able to get more than half of their money back with cooperation from the banks on both sides of the transaction. But the couple still lost more than $112,000, forcing them to dip into their retirement savings before they were eligible to do so without tax penalties.
“I keep paying for this criminal, and the criminal gets away with anything they want,” Bartlo, 49, said.
But the incident did not stop them from buying the house in West Virginia’s Bunker Hill community, close to the Virginia border, where the mortgage payments and cost of living are far lower than what they faced in Alexandria.
“It’s beautiful,” Bartlo said of her new home.
Three days before they finally closed on the house, Bartlo received an email almost identical to the first one.
Again, it was courteous and professional, reading as if it were written by the title company lawyer. And, again, the message asked her to deposit money into a bank account.
This time, Bartlo said, she didn’t fall for it.